# troubleshooting
**Is this your first time deploying Airbyte**: Yes
**OS Version / Instance**: Amazon Linux 2, AWS
**Memory / Disk**: 4.01476 GB / 50 GB
**Deployment**: Docker
**Airbyte Version**: 0.35.14-alpha
**Source name/version**: Twilio
**Destination name/version**: Amazon S3
**Description**: I'm trying to sync using the Parquet output format, but it gives me this error:
```
Exception in thread "main" java.nio.file.AccessDeniedException: s3-bucket: org.apache.hadoop.fs.s3a.auth.NoAwsCredentialsException: SimpleAWSCredentialsProvider: No AWS credentials in the Hadoop configuration
```
When I try the JSON/CSV formats, it works. Note that I attached an IAM role with read/write access to the S3 bucket to the instance that runs the Airbyte Docker deployment.
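For context: the stack trace comes from Hadoop's S3A filesystem, which the Parquet writer uses under the hood. S3A resolves credentials through the `fs.s3a.aws.credentials.provider` setting, and `SimpleAWSCredentialsProvider` only reads static keys from the Hadoop configuration; it never consults the EC2 instance profile, which is why the attached IAM role is ignored. A minimal sketch of the two configurations (illustrative only, not Airbyte's actual connector code):

```java
import org.apache.hadoop.conf.Configuration;

public class S3aCredentialsSketch {

    // SimpleAWSCredentialsProvider resolves credentials *only* from these two
    // Hadoop keys; if they are missing it throws NoAwsCredentialsException,
    // which is exactly the error shown above.
    static Configuration staticKeyConfig(String accessKey, String secretKey) {
        Configuration conf = new Configuration();
        conf.set("fs.s3a.aws.credentials.provider",
                "org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider");
        conf.set("fs.s3a.access.key", accessKey);
        conf.set("fs.s3a.secret.key", secretKey);
        return conf;
    }

    // InstanceProfileCredentialsProvider instead asks the EC2 instance
    // metadata service for the attached IAM role's temporary credentials,
    // so no keys need to appear in the Hadoop configuration at all.
    static Configuration instanceProfileConfig() {
        Configuration conf = new Configuration();
        conf.set("fs.s3a.aws.credentials.provider",
                "com.amazonaws.auth.InstanceProfileCredentialsProvider");
        return conf;
    }
}
```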
We are experiencing the same problem as @Gavriel Oren and @Dejan Antonic. It looks like the Parquet output connector ignores the "global" AWS connection settings, overrides them, and doesn't support IAM roles. See https://airbytehq.slack.com/archives/C01MFR03D5W/p1645517651153029 and https://airbytehq.slack.com/archives/C01MFR03D5W/p1645196299746269, as well as the comments in https://github.com/airbytehq/airbyte/issues/5942
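This also explains why the JSON/CSV formats work with a role: those writers upload through the AWS SDK, whose default provider chain falls back to the EC2 instance profile when no static keys are supplied. A hedged sketch of that behavior (the region and bucket name are placeholders, not taken from the connector code):

```java
import com.amazonaws.auth.DefaultAWSCredentialsProviderChain;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class SdkChainSketch {
    public static void main(String[] args) {
        // The default chain tries environment variables, system properties,
        // profile files, and finally the EC2 instance profile, so an attached
        // IAM role "just works" here, unlike the pinned S3A provider above.
        AmazonS3 s3 = AmazonS3ClientBuilder.standard()
                .withCredentials(new DefaultAWSCredentialsProviderChain())
                .withRegion("us-east-1") // placeholder region
                .build();

        // List the bucket to confirm the role's credentials are picked up.
        s3.listObjectsV2("s3-bucket").getObjectSummaries()
                .forEach(o -> System.out.println(o.getKey()));
    }
}
```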
@Liren Tu, I hope you don't mind me pinging you directly; you appear as an author on https://github.com/airbytehq/airbyte/pull/3908/files
Hey @Pawel, can you create a GitHub issue and tag @Liren Tu (Airbyte) there so that he can comment, or so the team can also look into it?
Hey @Harshith (Airbyte), thanks for the reply. I commented on an existing issue: https://github.com/airbytehq/airbyte/issues/5942#issuecomment-1043271349. Would it be good enough if I tag him there?
That's ok if you think it makes sense to get a response from him.
@Pawel, thanks for pinging us about this issue. I added the Parquet and Avro formats for S3, but I did not work on the IAM role part; IAM role support for S3 was added recently, if I remember correctly. The Parquet format probably fails under an IAM role because it constructs its writer object differently from the JSONL and CSV formats. I will also comment on the issue.
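One plausible shape for a fix, sketched under the assumption that the Parquet writer currently pins `SimpleAWSCredentialsProvider`: S3A accepts a comma-separated list of providers and tries each in turn, so appending an instance-profile provider would let static keys and IAM roles coexist. Whether the eventual Airbyte fix took this form is not confirmed by this thread.

```java
import org.apache.hadoop.conf.Configuration;

public class ProviderChainSketch {

    // Hedged sketch of one possible fix: let S3A fall through a chain of
    // providers instead of requiring static keys. S3A skips a provider that
    // cannot supply credentials and tries the next one, so a sync with no
    // configured keys ends up on the instance profile.
    static Configuration roleAwareConfig() {
        Configuration conf = new Configuration();
        conf.set("fs.s3a.aws.credentials.provider", String.join(",",
                "org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider",
                "com.amazonaws.auth.EnvironmentVariableCredentialsProvider",
                "com.amazonaws.auth.InstanceProfileCredentialsProvider"));
        return conf;
    }
}
```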
Thanks for taking the time to look into this, @Liren Tu!