# ask-community-for-troubleshooting
p
Hi there! I am new to Airbyte and I am trying to load 36 different sets of CSV files from S3 to Snowflake. I am able to do this directly in Snowflake but wanted to test Airbyte. Is it possible to define a single S3 source that will understand the S3 "folder" structure and create separate tables in the DB? The structure is: s3://<bucket>/<table name>/yyyy/mm/dd/<table name>.csv.gz and I'd like to have a table created in the DB for each <table name>. Thanks!
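For context, a minimal sketch of how the top-level "table name" prefixes in that layout could be enumerated with boto3; the bucket name my-example-bucket is an assumption:

```python
import boto3

s3 = boto3.client("s3")

# List the top-level "folders" in the bucket; each prefix corresponds to one
# table in the layout s3://<bucket>/<table name>/yyyy/mm/dd/<table name>.csv.gz.
resp = s3.list_objects_v2(Bucket="my-example-bucket", Delimiter="/")
table_names = [p["Prefix"].rstrip("/") for p in resp.get("CommonPrefixes", [])]
print(table_names)  # e.g. ['customers', 'orders', ...]
```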
m
p
Hi Marcos, I can see how I can select multiple files to be loaded into the same table regardless of where they are, but I am missing how I could extract the desired table name from the S3 path.
In other words, I was hoping I could do something like this to tell Airbyte where to get the table name from:
```
/{$AIRBYTE_TABLE_NAME}/**/*.csv.gz
```
m
Unfortunately it’s not possible 😞
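Since such a placeholder isn't supported, one possible workaround (a sketch, not a confirmed Airbyte feature) is to configure a separate stream or source per table, each with its own glob pattern, so the destination table name comes from the stream name rather than the path. The table names below are assumed examples:

```python
# Assumed examples of the 36 table names (in practice, use the prefixes
# enumerated from the bucket earlier in the thread).
table_names = ["customers", "orders", "line_items"]

# One glob pattern per table; each would be pasted into its own configured
# stream in the Airbyte S3 source, so each table loads separately.
for table in table_names:
    pattern = f"{table}/**/*.csv.gz"  # matches <table>/yyyy/mm/dd/<table>.csv.gz
    print(f"{table}: {pattern}")
```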
p
Got it. Is this doable architecturally? If so, I'll formalize it as a product suggestion.
m
@Pedro Machado Did you create an issue for this already?
p
Hi Melker. I did not. We went in a different direction.
👍 1