cold-autumn-7250
07/31/2022, 5:46 PM
s3_sensor = S3KeySensor(
    task_id="s3_file_check",
    aws_conn_id="aws_prod",
    bucket_key=bronze_path,
    bucket_name=bronze_bucket,
    poke_interval=60,
    mode="reschedule",
    timeout=60 * 60 * 8,
    dag=dag,
    inlets=[Dataset("s3", "test/{{ ds }}")],
)
The reason is that I would like to connect the actual file on S3 with the Airflow run.
Thanks a lot for any suggestions 🙂
PS: I am using the following versions:
apache-airflow-providers-amazon==4.1.0
acryl-datahub-airflow-plugin==0.8.41.2
dazzling-judge-80093
08/01/2022, 1:32 PM
cold-autumn-7250
08/01/2022, 3:07 PM
class CustomS3KeySensor(S3KeySensor):
    def poke(self, context: "Context"):
        # Define inlets inside the sensor/operator so they are set at runtime
        self.inlets = [Dataset("s3", f"{self.bucket_name}/{self.bucket_key[0]}")]
        return super().poke(context)
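For illustration, here is a minimal sketch (outside Airflow, with a hypothetical helper name) of how the inlet dataset path is assembled in that `poke()` override: the bucket name joined with the first `bucket_key` entry, since `S3KeySensor` stores `bucket_key` as a list.

```python
def build_inlet_path(bucket_name: str, bucket_key: list) -> str:
    # Mirrors the f-string used in CustomS3KeySensor.poke:
    # "<bucket_name>/<first bucket key>"
    return f"{bucket_name}/{bucket_key[0]}"

# Example with made-up values for the bucket and key
print(build_inlet_path("bronze-bucket", ["test/2022-07-31"]))
# → bronze-bucket/test/2022-07-31
```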
dazzling-judge-80093
08/01/2022, 3:09 PM
wide-florist-83539
05/04/2023, 5:52 PM