# Troubleshooting

Kemp Po

03/08/2022, 4:15 PM
Is this your first time deploying Airbyte: Yes
OS Version / Instance: GKE n1-standard-4
Memory / Disk: 4 GB
Deployment: Kubernetes
Airbyte Version: 0.35.46-alpha
Source name/version: Zendesk Support 0.2.0
Destination name/version: Google Cloud Storage (GCS) 0.1.24
Step: On initial sync
Description: Having issues with the GCS destination's Parquet files. I can write CSVs fine but not Parquet, and I have a hunch it might be the URL it's using: … instead of …? All default settings except compression codec = SNAPPY.

Augustin Lafanechere (Airbyte)

03/08/2022, 4:24 PM
Hi @Kemp Po, I think the URL might be a red herring here. GCS works with the S3 API, which is why we use an S3 client to upload files to GCS. Could you please share the error log you have?
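For illustration (a sketch, not the Airbyte connector's actual configuration): a Hadoop S3A client can be pointed at GCS's S3-interoperability (XML API) endpoint with standard `fs.s3a.*` properties; the credential values below are placeholders for GCS HMAC keys, not AWS credentials.

```xml
<!-- Hedged sketch: pointing a Hadoop S3A client at the GCS
     interoperability endpoint. All values are placeholders. -->
<configuration>
  <property>
    <name>fs.s3a.endpoint</name>
    <value>storage.googleapis.com</value>
  </property>
  <property>
    <name>fs.s3a.access.key</name>
    <value>YOUR_GCS_HMAC_KEY_ID</value>
  </property>
  <property>
    <name>fs.s3a.secret.key</name>
    <value>YOUR_GCS_HMAC_SECRET</value>
  </property>
  <property>
    <name>fs.s3a.path.style.access</name>
    <value>true</value>
  </property>
</configuration>
```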

Kemp Po

03/08/2022, 4:30 PM
Hey @Augustin Lafanechere (Airbyte), this is what I'm seeing at the tail end of the log. Could it be that I just have it set up wrong?
```
2022-03-08 15:52:45 destination > 2022-03-08 15:52:45 WARN o.a.h.f.FileSystem(createFileSystem):3418 - Failed to initialize fileystem <s3a://bucket/data/2022_03_08_1646754764563_0.parquet>: java.lang.IllegalArgumentException: bucket
2022-03-08 15:52:45 destination > 2022-03-08 15:52:45 ERROR i.a.i.b.FailureTrackingAirbyteMessageConsumer(start):39 - Exception while starting consumer
2022-03-08 15:52:45 destination > java.lang.IllegalArgumentException: bucket
2022-03-08 15:52:45 destination > 	at ~[guava-31.0.1-jre.jar:?]
2022-03-08 15:52:45 destination > 	at org.apache.hadoop.fs.s3a.S3AUtils.propagateBucketOptions( ~[hadoop-aws-3.3.0.jar:?]
2022-03-08 15:52:45 destination > 	at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize( ~[hadoop-aws-3.3.0.jar:?]
2022-03-08 15:52:45 destination > 	at org.apache.hadoop.fs.FileSystem.createFileSystem( ~[hadoop-common-3.3.0.jar:?]
2022-03-08 15:52:45 destination > 	at org.apache.hadoop.fs.FileSystem.access$200( ~[hadoop-common-3.3.0.jar:?]
2022-03-08 15:52:45 destination > 	at org.apache.hadoop.fs.FileSystem$Cache.getInternal( ~[hadoop-common-3.3.0.jar:?]
2022-03-08 15:52:45 destination > 	at org.apache.hadoop.fs.FileSystem$Cache.get( ~[hadoop-common-3.3.0.jar:?]
2022-03-08 15:52:45 destination > 	at org.apache.hadoop.fs.FileSystem.get( ~[hadoop-common-3.3.0.jar:?]
2022-03-08 15:52:45 destination > 	at org.apache.hadoop.fs.Path.getFileSystem( ~[hadoop-common-3.3.0.jar:?]
2022-03-08 15:52:45 destination > 	at org.apache.parquet.hadoop.util.HadoopOutputFile.fromPath( ~[parquet-hadoop-1.12.0.jar:1.12.0]
2022-03-08 15:52:45 destination > 	at io.airbyte.integrations.destination.gcs.parquet.GcsParquetWriter.<init>( ~[io.airbyte.airbyte-integrations.connectors-destination-gcs-0.35.36-alpha.jar:?]
2022-03-08 15:52:45 destination > 	at io.airbyte.integrations.destination.gcs.writer.ProductionWriterFactory.create( ~[io.airbyte.airbyte-integrations.connectors-destination-gcs-0.35.36-alpha.jar:?]
2022-03-08 15:52:45 destination > 	at io.airbyte.integrations.destination.gcs.GcsConsumer.startTracked( ~[io.airbyte.airbyte-integrations.connectors-destination-gcs-0.35.36-alpha.jar:?]
```
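For context (a hedged reading of the trace, not Airbyte or Hadoop code): the `IllegalArgumentException: bucket` comes from a Guava precondition inside `S3AUtils.propagateBucketOptions`, which rejects an empty or invalid bucket name parsed from the `s3a://` path. A minimal Python sketch of that kind of check, using a hypothetical `check_bucket` helper:

```python
from urllib.parse import urlparse

def check_bucket(uri: str) -> str:
    """Mimic a Guava-style precondition that rejects an empty bucket name."""
    bucket = urlparse(uri).netloc  # host part of s3a://<bucket>/<key>
    if not bucket:
        # Hadoop raises IllegalArgumentException("bucket") in this situation
        raise ValueError("bucket")
    return bucket

# A well-formed s3a path yields the bucket; an empty host fails the check.
print(check_bucket("s3a://my-bucket/data/file.parquet"))  # -> my-bucket
```

This suggests checking that the bucket name configured in the destination is non-empty and valid on its own, without any path segments folded into it.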

Marcos Marx (Airbyte)

03/08/2022, 10:40 PM
Can you share the complete logs and edit your post using the Slack Issue Template?
Did you deploy Airbyte for the first time, or did you upgrade the connector on an old connection?

Kemp Po

03/10/2022, 4:13 PM
Hey @Marcos Marx (Airbyte), I edited the post above.
And yes, it's the first time I'm using it.

Harshith (Airbyte)

03/14/2022, 8:55 AM
Hey, these logs surely look different. Can you create an issue around this so the team can track it?

Kemp Po

03/15/2022, 10:32 AM