bright-cpu-56427
06/09/2022, 7:32 AM
pydantic.error_wrappers.ValidationError: 1 validation error for PipelineConfig
source -> filename:
extra fields not permitted (type=value_error.extra)
I'm getting this error and I can't tell which part of the config is wrong.
My Python code is:

def add_quicksight_platform():
    pipeline = Pipeline.create(
        # This configuration is analogous to a recipe configuration.
        {
            "source": {
                "type": "file",
                "filename:": "/opt/airflow/dags/datahub/quicksight.json"
            },
            "sink": {
                "type": "datahub-rest",
                "config": {"server": "{datahub-gms-ip}:8080"},
            },
        }
    )
    pipeline.run()
    pipeline.raise_from_status()
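For context on the error itself: pydantic models that forbid extra fields reject any key they don't declare, so a key with a stray trailing colon ("filename:") reads as an unknown field. A minimal, self-contained sketch (my own stand-in model, not DataHub's actual FileSourceConfig) that reproduces the same kind of errors:

```python
from pydantic import BaseModel, ValidationError


class FileSourceConfig(BaseModel):
    # Stand-in for DataHub's file-source config model; for illustration only.
    filename: str

    class Config:
        extra = "forbid"  # unknown keys raise "extra fields not permitted"


try:
    # The key "filename:" (note the trailing colon) is not a declared field,
    # so pydantic reports it as extra -- and the real "filename" as missing.
    FileSourceConfig(**{"filename:": "/opt/airflow/dags/datahub/quicksight.json"})
except ValidationError as err:
    print(err)
```

With the colon removed from the key, the same model validates cleanly.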
quicksight.json
{
    "auditHeader": null,
    "proposedSnapshot": {
        "com.linkedin.pegasus2avro.metadata.snapshot.DataPlatformSnapshot": {
            "urn": "urn:li:dataPlatform:quicksight",
            "aspects": [
                {
                    "com.linkedin.pegasus2avro.dataplatform.DataPlatformInfo": {
                        "datasetNameDelimiter": "/",
                        "name": "quicksight",
                        "type": "OTHERS",
                        "logoUrl": "https://play-lh.googleusercontent.com/dbiOAXowepd9qC69dUnCJWEk8gg8dsQburLUyC1sux9ovnyoyH5MsoLf0OQcBqRZILB0=w240-h480-rw"
                    }
                }
            ]
        }
    },
    "proposedDelta": null
}
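As a side note, one cheap way to rule out the MCE file itself is a stdlib-only check that it parses and has the expected snapshot shape. A sketch with the JSON above inlined so it is self-contained (a real check would read the file from its path on disk):

```python
import json

# quicksight.json inlined for a self-contained check.
raw = """
{
  "auditHeader": null,
  "proposedSnapshot": {
    "com.linkedin.pegasus2avro.metadata.snapshot.DataPlatformSnapshot": {
      "urn": "urn:li:dataPlatform:quicksight",
      "aspects": [
        {
          "com.linkedin.pegasus2avro.dataplatform.DataPlatformInfo": {
            "datasetNameDelimiter": "/",
            "name": "quicksight",
            "type": "OTHERS",
            "logoUrl": "https://play-lh.googleusercontent.com/dbiOAXowepd9qC69dUnCJWEk8gg8dsQburLUyC1sux9ovnyoyH5MsoLf0OQcBqRZILB0=w240-h480-rw"
          }
        }
      ]
    }
  },
  "proposedDelta": null
}
"""
mce = json.loads(raw)  # raises ValueError if the file is not valid JSON
snapshot = mce["proposedSnapshot"][
    "com.linkedin.pegasus2avro.metadata.snapshot.DataPlatformSnapshot"
]
assert snapshot["urn"] == "urn:li:dataPlatform:quicksight"
assert snapshot["aspects"], "expected at least one aspect"
```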
better-orange-49102
06/09/2022, 7:33 AM
source:
  type: "file"
  config:
    filename: "./examples/mce_files/single_mce.json"
bright-cpu-56427
06/09/2022, 7:55 AM
pydantic.error_wrappers.ValidationError: 2 validation errors for FileSourceConfig
filename
field required (type=value_error.missing)
filename:
extra fields not permitted (type=value_error.extra)
bright-cpu-56427
06/09/2022, 7:56 AM
def add_quicksight_platform():
    pipeline = Pipeline.create(
        # This configuration is analogous to a recipe configuration.
        {
            "source": {
                "type": "file",
                "config": {"filename:": "/opt/airflow/dags/datahub/quicksight.json"}
            },
            "sink": {
                "type": "datahub-rest",
                "config": {"server": "{datahub-gms-ip}:8080"},
            },
        }
    )
    pipeline.run()
    pipeline.raise_from_status()
better-orange-49102
06/09/2022, 8:01 AM
"filename:"
better-orange-49102
06/09/2022, 8:01 AM
def add_quicksight_platform():
    pipeline = Pipeline.create(
        # This configuration is analogous to a recipe configuration.
        {
            "source": {
                "type": "file",
                "config": {"filename": "/opt/airflow/dags/datahub/quicksight.json"}
            },
            "sink": {
                "type": "datahub-rest",
                "config": {"server": "{datahub-gms-ip}:8080"},
            },
        }
    )
    pipeline.run()
    pipeline.raise_from_status()
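To catch this class of typo earlier next time, a small recursive check over the recipe dict can flag keys with a trailing colon before Pipeline.create ever sees them. check_recipe_keys below is a hypothetical helper sketched for this thread, not part of DataHub:

```python
def check_recipe_keys(cfg, path="recipe"):
    """Recursively collect dict keys that end with ':' (hypothetical helper).

    A trailing colon in a key usually means a YAML line was pasted into a
    Python dict, e.g. "filename:" instead of "filename".
    """
    bad = []
    if isinstance(cfg, dict):
        for key, value in cfg.items():
            if isinstance(key, str) and key.endswith(":"):
                bad.append(f"{path}.{key}")
            bad.extend(check_recipe_keys(value, f"{path}.{key}"))
    return bad


# The broken recipe from earlier in the thread.
recipe = {
    "source": {
        "type": "file",
        "config": {"filename:": "/opt/airflow/dags/datahub/quicksight.json"},
    },
    "sink": {"type": "datahub-rest", "config": {"server": "{datahub-gms-ip}:8080"}},
}
print(check_recipe_keys(recipe))  # -> ['recipe.source.config.filename:']
```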