# random
r
Hello, does anyone know how to pass AWS configs to Flink's flink-s3-fs-hadoop plugin during local testing? I made this Stack Overflow post with more information: https://stackoverflow.com/questions/77540049/how-can-i-pass-aws-configs-to-flinks-flink-s3-fs-hadoop-plugin-in-local-tests
a
Based on your problem description, it should be:
```yaml
fs.s3a.access.key: <access-key>
fs.s3a.secret.key: <secret-key>
fs.s3a.endpoint.region: <region>
fs.s3a.endpoint: <endpoint>
fs.s3a.path.style.access: <true|false>
```
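For context, the flink-s3-fs-hadoop plugin doesn't read these keys from Hadoop's own config files; it mirrors s3.*, s3a.*, and fs.s3a.* keys out of Flink's Configuration. A minimal sketch of wiring them up programmatically in a test (the "test" credentials and localhost endpoint are placeholders for a LocalStack-style S3 stand-in, and it assumes the flink-s3-fs-hadoop jar is on the test classpath):

```java
import org.apache.flink.configuration.Configuration;
import org.apache.flink.core.fs.FileSystem;

// Hypothetical local-test values; substitute your own stand-in endpoint.
Configuration conf = new Configuration();
conf.setString("fs.s3a.access.key", "test");
conf.setString("fs.s3a.secret.key", "test");
conf.setString("fs.s3a.endpoint", "http://localhost:4566");
conf.setString("fs.s3a.endpoint.region", "us-east-1");
conf.setString("fs.s3a.path.style.access", "true");

// Hand the config to Flink's FileSystem layer so the S3 plugin can see it.
FileSystem.initialize(conf, null);
```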
r
Yup, I've tried specifying that in many different ways; the problem is that the plugin doesn't seem to be receiving those configs. Note: full details of what I've tried (incl. the above) are in the Stack Overflow post.
j
You could try this: 1- Create your test config file (flink-conf.yaml; GlobalConfiguration.loadConfiguration looks for that exact file name in the directory you point it at) with
```yaml
## S3 config
s3.access-key: test
s3.secret-key: test
s3.endpoint: http://awslocal:4566
s3.endpoint.region: us-east-1
s3.path.style.access: true
```
2- Load this configuration with
```java
import org.apache.flink.configuration.Configuration;
import org.apache.flink.configuration.GlobalConfiguration;
import org.apache.flink.core.fs.FileSystem;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

// loadConfiguration takes the *directory* containing flink-conf.yaml
Configuration loadedConfig = GlobalConfiguration.loadConfiguration("directory/where/is/flinkconfig");
// Register the config with the FileSystem layer before any s3:// path is resolved
FileSystem.initialize(loadedConfig, null);
StreamExecutionEnvironment environment = StreamExecutionEnvironment.createLocalEnvironmentWithWebUI(loadedConfig);
```
The important part is FileSystem.initialize(loadedConfig, null): it hands the loaded configuration to Flink's filesystem factories before any s3:// path is resolved, which is how the plugin actually receives the settings in a local test.
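A quick way to sanity-check that the plugin picked the settings up is to resolve an s3:// path directly; the bucket and key names below are hypothetical:

```java
import org.apache.flink.core.fs.FileSystem;
import org.apache.flink.core.fs.Path;

// If initialize() worked, this resolves through flink-s3-fs-hadoop instead of
// throwing an UnsupportedFileSystemSchemeException.
// (getFileSystem() throws IOException; handle or declare it in your test.)
FileSystem fs = new Path("s3://test-bucket/some-key").getFileSystem();
System.out.println(fs.getClass().getName());
```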
r
Thank you kindly; it worked!