
Lennert Rienau

02/18/2022, 2:58 PM
Is this your first time deploying Airbyte: No
Airbyte Version: 0.35.30-alpha
Source: Salesforce (0.1.23)
Destination: S3 (0.2.7)
Deployment: Kubernetes

Hi team, when using the S3 destination with format type Parquet, it seems not to support InstanceProfile. Do I get it right that S3ParquetWriter uses SimpleAWSCredentialsProvider in every case?
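For context, the provider in question is selected through Hadoop's S3A configuration. A minimal sketch of the two relevant settings, using standard Hadoop and AWS SDK v1 class names (this is illustrative, not Airbyte code):

```java
import org.apache.hadoop.conf.Configuration;

public class S3aProviderSketch {
    public static void main(String[] args) {
        Configuration conf = new Configuration();

        // SimpleAWSCredentialsProvider reads only static keys
        // (fs.s3a.access.key / fs.s3a.secret.key) from the Hadoop
        // configuration; it never falls back to the instance profile.
        conf.set("fs.s3a.aws.credentials.provider",
                "org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider");

        // Pointing the same key at the AWS SDK's instance-profile provider
        // (overwriting the value above) makes S3A use the machine's IAM role.
        conf.set("fs.s3a.aws.credentials.provider",
                "com.amazonaws.auth.InstanceProfileCredentialsProvider");

        System.out.println(conf.get("fs.s3a.aws.credentials.provider"));
    }
}
```

So if S3ParquetWriter pins the first value, instance-profile credentials would never be consulted on the Parquet path.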

Madhu Prabhakara

02/18/2022, 11:45 PM
Hello @Dejan Antonic, the implementation of InstanceProfile shouldn’t be exclusive to a format type. Check the PR implementing this: https://github.com/airbytehq/airbyte/pull/9399
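The idea there, roughly, is that the choice of credentials provider is driven by the connector config rather than by the output format. A hedged sketch of that decision with the AWS SDK v1 (the helper name providerFor is hypothetical, not the PR's code):

```java
import com.amazonaws.auth.AWSCredentialsProvider;
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.auth.InstanceProfileCredentialsProvider;

public class CredentialsChoiceSketch {
    // Hypothetical helper: fall back to the EC2 instance profile whenever
    // no static keys are present in the connector configuration.
    static AWSCredentialsProvider providerFor(String accessKey, String secretKey) {
        if (accessKey == null || accessKey.isBlank()) {
            return InstanceProfileCredentialsProvider.getInstance();
        }
        return new AWSStaticCredentialsProvider(
                new BasicAWSCredentials(accessKey, secretKey));
    }
}
```

Nothing format-specific there, which is the point above.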

Muhammad Imtiaz

02/21/2022, 8:45 AM
Thanks Marcos!
Couldn't a specific writer still override the connection settings, though? I'm not sure I'm reading it right, but isn't that what this bit does: https://github.com/airbytehq/airbyte/blob/a094142825b53692f0c15db505608968fcebc153[…]irbyte/integrations/destination/s3/parquet/S3ParquetWriter.java
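The linked bit does look like the writer building its own Hadoop Configuration. A hypothetical reconstruction of that pattern (the constants are Hadoop's real org.apache.hadoop.fs.s3a.Constants; the method itself is illustrative, not the connector's actual code):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.s3a.Constants;

public class ParquetWriterConfigSketch {
    // If a writer injects static keys like this, then in instance-profile
    // mode the keys arrive empty, and SimpleAWSCredentialsProvider fails
    // with "No AWS credentials in the Hadoop configuration" no matter how
    // the base S3 client authenticates.
    static Configuration hadoopConfig(String accessKey, String secretKey) {
        Configuration conf = new Configuration();
        conf.set(Constants.ACCESS_KEY, accessKey);   // fs.s3a.access.key
        conf.set(Constants.SECRET_KEY, secretKey);   // fs.s3a.secret.key
        conf.set(Constants.AWS_CREDENTIALS_PROVIDER, // fs.s3a.aws.credentials.provider
                "org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider");
        return conf;
    }
}
```

A per-writer override of the connection settings would be exactly the failure mode being asked about here.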

Ofek Katriel

02/21/2022, 2:11 PM
@Marcos Marx (Airbyte) We get
```
java.nio.file.AccessDeniedException: my-bucket-name: org.apache.hadoop.fs.s3a.auth.NoAwsCredentialsException: SimpleAWSCredentialsProvider: No AWS credentials in the Hadoop configuration
```
when we try to sync into the Parquet S3 destination using InstanceProfile, while for JSONL and Avro it works fine. These are the versions of Airbyte and the S3 destination we use:
```
Airbyte Version: 0.35.30-alpha
Source: Salesforce (0.1.23)
Destination: S3 (0.2.7)
```
Is there any special environment variable or any other setting we should define in order to make it work?
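From the error, the provider appears to be chosen inside the connector rather than via an environment variable. To show the shape of a fix, a hedged sketch of what the Parquet writer's Hadoop config would need when InstanceProfile is selected (illustrative only, not the actual connector code):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.s3a.Constants;

public class InstanceProfileFixSketch {
    // Hypothetical: branch on the auth mode instead of always relying on
    // static keys plus SimpleAWSCredentialsProvider.
    static Configuration hadoopConfig(boolean useInstanceProfile,
                                      String accessKey, String secretKey) {
        Configuration conf = new Configuration();
        if (useInstanceProfile) {
            // Let S3A fetch credentials from the instance metadata service.
            conf.set(Constants.AWS_CREDENTIALS_PROVIDER,
                    "com.amazonaws.auth.InstanceProfileCredentialsProvider");
        } else {
            conf.set(Constants.ACCESS_KEY, accessKey);
            conf.set(Constants.SECRET_KEY, secretKey);
        }
        return conf;
    }
}
```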