# troubleshooting
y
Hi team, I'm trying to sink the data into S3 but I keep getting this error message. Here is my code snippet and the error I got:
```java
stockStream.addSink(StreamingFileSink
        .forRowFormat(new Path("s3a://aws-s3-path/"),
                new SimpleStringEncoder<Stock>("UTF-8"))
        .withBucketAssigner(new DateTimeBucketAssigner<>(partitionFormat))
        .build()).name("S3 Sink");
```
```
Caused by: org.apache.hadoop.fs.s3a.auth.NoAuthWithAWSException: No AWS Credentials provided by DynamicTemporaryAWSCredentialsProvider TemporaryAWSCredentialsProvider SimpleAWSCredentialsProvider EnvironmentVariableCredentialsProvider IAMInstanceCredentialsProvider : com.amazonaws.SdkClientException: Unable to load AWS credentials from environment variables (AWS_ACCESS_KEY_ID (or AWS_ACCESS_KEY) and AWS_SECRET_KEY (or AWS_SECRET_ACCESS_KEY))
```
My pom.xml has these dependencies too:
```xml
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-aws</artifactId>
    <version>3.2.4</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-s3-fs-hadoop</artifactId>
    <version>1.19.1</version>
</dependency>
```
s
Try setting the following in the Flink configuration:
'fs.s3a.aws.credentials.provider': 'com.amazonaws.auth.WebIdentityTokenCredentialsProvider'
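In flink-conf.yaml form that would look roughly like the sketch below. It assumes the fs.s3a.* keys in the Flink configuration are forwarded to the S3A filesystem by flink-s3-fs-hadoop, and that the web-identity environment variables (AWS_WEB_IDENTITY_TOKEN_FILE, AWS_ROLE_ARN) are already present on the JobManager and TaskManager processes, e.g. via IRSA on EKS:
```yaml
# Sketch of a flink-conf.yaml entry (assumes fs.s3a.* options are forwarded
# to the underlying Hadoop S3A filesystem by flink-s3-fs-hadoop).
fs.s3a.aws.credentials.provider: com.amazonaws.auth.WebIdentityTokenCredentialsProvider
# WebIdentityTokenCredentialsProvider reads AWS_WEB_IDENTITY_TOKEN_FILE and
# AWS_ROLE_ARN from the environment, so those must be set where Flink runs.
```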
y
Thanks @Sricharan. Is this the Configuration in the code or the resources section? I'm unclear, please advise.
s
This is the Flink configuration you provide during deployment, not something you set in the application code.
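Concretely, that means the cluster-level configuration rather than anything inside the project. As a sketch, assuming a Flink Kubernetes Operator deployment (the names below are illustrative), it would sit in the flinkConfiguration map; for a standalone or session cluster the same key goes into conf/flink-conf.yaml, and it can typically also be passed as a -D dynamic property when submitting the job:
```yaml
# Illustrative excerpt of a FlinkDeployment resource (Flink Kubernetes Operator);
# only the configuration relevant to this thread is shown.
apiVersion: flink.apache.org/v1beta1
kind: FlinkDeployment
metadata:
  name: stock-pipeline   # hypothetical name
spec:
  flinkConfiguration:
    fs.s3a.aws.credentials.provider: com.amazonaws.auth.WebIdentityTokenCredentialsProvider
```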