# ingestion
b
Hi guys, I am creating a custom data platform (looking at this)
pydantic.error_wrappers.ValidationError: 1 validation error for PipelineConfig
source -> filename:
  extra fields not permitted (type=value_error.extra)
I'm getting this error, and I can't tell which part is wrong. My Python code is:
def add_quicksight_platform():
    pipeline = Pipeline.create(
        # This configuration is analogous to a recipe configuration.
        {
            "source": {
                "type": "file",
                "filename:":"/opt/airflow/dags/datahub/quicksight.json"
            },
            "sink": {
                "type": "datahub-rest",
                "config": {"server": "{datahub-gms-ip}:8080"},
            }
        }
    )
    pipeline.run()
    pipeline.raise_from_status()
quicksight.json
{
  "auditHeader": null,
  "proposedSnapshot": {
    "com.linkedin.pegasus2avro.metadata.snapshot.DataPlatformSnapshot": {
      "urn": "urn:li:dataPlatform:quicksight",
      "aspects": [
        {
          "com.linkedin.pegasus2avro.dataplatform.DataPlatformInfo": {
            "datasetNameDelimiter": "/",
            "name": "quicksight",
            "type": "OTHERS",
            "logoUrl": "https://play-lh.googleusercontent.com/dbiOAXowepd9qC69dUnCJWEk8gg8dsQburLUyC1sux9ovnyoyH5MsoLf0OQcBqRZILB0=w240-h480-rw"
          }
        }
      ]
    }
  },
  "proposedDelta": null
}
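As a quick sanity check, the MCE file itself is well-formed JSON with the fields a DataPlatformSnapshot needs, so the validation error must come from the recipe rather than the file. A minimal stdlib check (the JSON is inlined here as a string, with the logo URL shortened for brevity):

```python
import json

# Inline copy of quicksight.json (logo URL shortened for brevity).
raw = """
{
  "auditHeader": null,
  "proposedSnapshot": {
    "com.linkedin.pegasus2avro.metadata.snapshot.DataPlatformSnapshot": {
      "urn": "urn:li:dataPlatform:quicksight",
      "aspects": [
        {
          "com.linkedin.pegasus2avro.dataplatform.DataPlatformInfo": {
            "datasetNameDelimiter": "/",
            "name": "quicksight",
            "type": "OTHERS",
            "logoUrl": "https://play-lh.googleusercontent.com/..."
          }
        }
      ]
    }
  },
  "proposedDelta": null
}
"""

# If this parses and the urn is present, the file side is fine.
mce = json.loads(raw)
snapshot = mce["proposedSnapshot"][
    "com.linkedin.pegasus2avro.metadata.snapshot.DataPlatformSnapshot"
]
print(snapshot["urn"])  # urn:li:dataPlatform:quicksight
```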
b
source:
  type: "file"
  config:
    filename: "./examples/mce_files/single_mce.json"
you're missing the config key
b
If I add config, I get this error instead, so I removed it.
pydantic.error_wrappers.ValidationError: 2 validation errors for FileSourceConfig
filename
  field required (type=value_error.missing)
filename:
  extra fields not permitted (type=value_error.extra)
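These two errors together are the clue: the source config schema accepts exactly one field named filename, so the mistyped key "filename:" is rejected as "extra" while the real filename field is reported as "missing". A stdlib sketch that mimics the check (the helper and ALLOWED_KEYS are hypothetical; DataHub's actual FileSourceConfig is a pydantic model):

```python
# Hypothetical re-implementation of the validation that produces the
# two errors above; ALLOWED_KEYS stands in for FileSourceConfig's schema.
ALLOWED_KEYS = {"filename"}

def validate_file_source(config):
    errors = []
    # Required fields that are absent from the supplied config.
    for key in sorted(ALLOWED_KEYS - config.keys()):
        errors.append(f"{key}: field required")
    # Supplied keys the schema does not know about.
    for key in sorted(config.keys() - ALLOWED_KEYS):
        errors.append(f"{key}: extra fields not permitted")
    return errors

print(validate_file_source(
    {"filename:": "/opt/airflow/dags/datahub/quicksight.json"}
))
# ['filename: field required', 'filename:: extra fields not permitted']
```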
def add_quicksight_platform():
    pipeline = Pipeline.create(
        # This configuration is analogous to a recipe configuration.
        {
            "source": {
                "type": "file",
                "config":{"filename:":"/opt/airflow/dags/datahub/quicksight.json"}
            },
            "sink": {
                "type": "datahub-rest",
                "config": {"server": "{datahub-gms-ip}:8080"},
            }
        }
    )
    pipeline.run()
    pipeline.raise_from_status()
@better-orange-49102 Do you know any other way?
b
i just did a programmatic pipeline and it works correctly for file 🤨 and i can't see the diff
"filename:"
that key has a stray colon at the end. try:
from datahub.ingestion.run.pipeline import Pipeline

def add_quicksight_platform():
    pipeline = Pipeline.create(
        # This configuration is analogous to a recipe configuration.
        {
            "source": {
                "type": "file",
                "config":{"filename":"/opt/airflow/dags/datahub/quicksight.json"}
            },
            "sink": {
                "type": "datahub-rest",
                "config": {"server": "{datahub-gms-ip}:8080"},
            }
        }
    )
    pipeline.run()
    pipeline.raise_from_status()
b
@better-orange-49102 oh, you are a genius! The typo was bothering me.
Thanks!!! 👍 It runs normally now.