# troubleshooting
Hello, I'm using the Hive catalog and want to create a permanent table. According to the documentation:

> Permanent tables require a catalog (such as Hive Metastore) to maintain metadata about the table. Once a permanent table is created, it is visible to any Flink session that is connected to the catalog and will continue to exist until the table is explicitly dropped.
So, when I try to create a table:

```sql
CREATE TABLE foo (id INT);
```
Flink gives an error:

```
org.apache.flink.table.api.ValidationException: Table options do not contain an option key 'connector' for discovering a connector. Therefore, Flink assumes a managed table. However, a managed table factory that implements org.apache.flink.table.factories.ManagedTableFactory is not in the classpath.
```
My understanding is that creating such a table uses the Hive Metastore to store the DDL and Flink's built-in RocksDB as the store for the data. Is this correct? If so, how can I fix the error and create a permanent table? Using Paimon is currently not an option because we are on GCP, and Paimon does not seem to support GCS (Google Cloud Storage).
> if so, how can I fix the error and create a permanent table?
You need to tell Flink which connector to use. Depending on the type of system you want to connect to, you provide the right configuration options in a `WITH` clause. For example, Kafka requires something like:
```sql
CREATE TABLE KafkaTable (
  `user_id` BIGINT,
  `item_id` BIGINT,
  `behavior` STRING,
  `ts` TIMESTAMP_LTZ(3) METADATA FROM 'timestamp'
) WITH (
  'connector' = 'kafka',
  'topic' = 'user_behavior',
  'properties.bootstrap.servers' = 'localhost:9092',
  'properties.group.id' = 'testGroup',
  'scan.startup.mode' = 'earliest-offset',
  'format' = 'csv'
);
```
Because you're not specifying a `WITH` clause, Flink assumes you want a managed table, which isn't the case. Also note that the Hive catalog only persists the table's metadata (the DDL); the data itself lives in whatever external system the connector points at. RocksDB is used in Flink as a state backend for streaming state, not as table storage.
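Since you're on GCP, a permanent table whose data lives on GCS can be created with the plain filesystem connector. A minimal sketch, assuming Flink's GCS filesystem plugin (`flink-gs-fs-hadoop`) is installed and with `my-bucket` as a placeholder bucket name:

```sql
-- The DDL is persisted in the Hive catalog; rows are written to GCS as Parquet files.
-- 'gs://my-bucket/warehouse/foo' is a placeholder path, not a real location.
CREATE TABLE foo (
  id INT
) WITH (
  'connector' = 'filesystem',
  'path' = 'gs://my-bucket/warehouse/foo',
  'format' = 'parquet'
);
```

Once created, the table stays visible to every session connected to the same Hive Metastore until you drop it.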