Travis Carter — 09/18/2023, 10:47 PM
Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Could not find a file system implementation for scheme 'gs'. The scheme is directly supported by Flink through the following plugin(s): flink-gs-fs-hadoop. Please ensure that each plugin resides within its own subfolder within the plugins directory. See <https://nightlies.apache.org/flink/flink-docs-stable/docs/deployment/filesystems/plugins/> for more information. If you want to use a Hadoop file system for that scheme, please add the scheme to the configuration fs.allowed-fallback-filesystems. For a full list of supported file systems, please see <https://nightlies.apache.org/flink/flink-docs-stable/ops/filesystems/>.
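The error message is about directory layout: Flink expects each filesystem plugin in its own subfolder under a plugins root, e.g. plugins/gs-fs-hadoop/ containing the flink-gs-fs-hadoop JAR. A minimal stdlib-only sketch of that layout (PluginsLayout and pluginFolder are illustrative names, not Flink APIs; the JAR name/version is a placeholder):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Sketch (not Flink code) of the layout the error message asks for:
//   plugins/
//     gs-fs-hadoop/
//       flink-gs-fs-hadoop-<version>.jar   <- placeholder jar name
public class PluginsLayout {
    // Creates plugins/<pluginName>/ under the given root and returns the folder.
    static Path pluginFolder(Path root, String pluginName) throws IOException {
        Path folder = root.resolve("plugins").resolve(pluginName);
        Files.createDirectories(folder);
        return folder;
    }

    public static void main(String[] args) throws IOException {
        Path folder = pluginFolder(Path.of("."), "gs-fs-hadoop");
        System.out.println("Drop the flink-gs-fs-hadoop jar into: " + folder);
    }
}
```

Dropping the JAR directly into plugins/ (without its own subfolder) reproduces the exception above.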
Just unsure where the JAR should go for local runs like this.

Travis Carter — 09/19/2023, 7:30 PM
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
if (config.env() == EnvEnum.DEV) {
    env = StreamExecutionEnvironment.createLocalEnvironmentWithWebUI(new Configuration());
}
So I was able to fix this by simply calling PluginUtils and including a plugins/gs-fs-hadoop/xxx.jar alongside my main/java.
The updated conditional is:
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
if (config.env() == EnvEnum.DEV) {
    env = StreamExecutionEnvironment.createLocalEnvironmentWithWebUI(new Configuration());
    var pluginManager = PluginUtils.createPluginManagerFromRootFolder(new Configuration());
    FileSystem.initialize(new Configuration(), pluginManager);
}
This will, at minimum, load the plugins at startup when running from the IDE.
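The reason a plugins/ folder next to the project root gets picked up: as I understand it, PluginUtils.createPluginManagerFromRootFolder resolves the plugins root from the FLINK_PLUGINS_DIR environment variable, falling back to a "plugins" directory relative to the working directory. A hedged, stdlib-only sketch of that resolution (PluginsRootResolution and resolvePluginsRoot are illustrative names, not the actual Flink source):

```java
// Sketch of the assumed lookup order: FLINK_PLUGINS_DIR env var first,
// then a "plugins" directory relative to the working directory — which is
// why plugins/gs-fs-hadoop/ in the project root works when launching
// from the IDE.
public class PluginsRootResolution {
    // envValue stands in for System.getenv("FLINK_PLUGINS_DIR")
    static String resolvePluginsRoot(String envValue) {
        return (envValue != null && !envValue.isEmpty()) ? envValue : "plugins";
    }

    public static void main(String[] args) {
        System.out.println(resolvePluginsRoot(System.getenv("FLINK_PLUGINS_DIR")));
    }
}
```

So an alternative to bundling the folder with the sources would be pointing FLINK_PLUGINS_DIR at a shared plugins directory in the IDE run configuration.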