# troubleshooting
a
Hello! I have a local (non-dockerized) test setup working nicely with Flink (1.19), high availability through ZooKeeper, a Hive catalog on PostgreSQL for the SQL Client, Kafka integrations, etc. Now I'm moving to Docker Compose, and everything works fine except one thing: on startup, the SQL Client doesn't find any custom catalog. But if I look for them in Hive, or recreate them from the SQL Client with

```sql
CREATE CATALOG hive_catalog WITH ('type' = 'hive', 'hive-conf-dir' = '/opt/flink/conf');
```

then all the tables created during previous sessions are available. So the catalog is persisted in Hive/Postgres, but it seems it's just not available on startup. The catalog store is configured as a file catalog store and persisted on a mounted volume in Docker. From the Flink config.yaml:
```yaml
table:
  catalog-store:
    kind: file
    file:
      path: file:///opt/flink/catalogs/
```
From the compose file:

```yaml
jobmanager:
    image: flink:1.19
    volumes:
      - ./jobmanager/:/tmp/
      - ./jobmanager/:/opt/flink/flink-web
      - ./conf/config.yaml:/opt/flink/conf/config.yaml
      - ./catalogs/:/opt/flink/catalogs/
      - ./conf/hive-site.xml:/opt/flink/conf/hive-site.xml
```
I also copied a hive_catalog.yaml file, retrieved from the non-dockerized setup, into the catalogs directory, but nothing changed:

```yaml
type: "hive"
hive-conf-dir: "/opt/flink/conf"
```
Any hints? Thanks!
r
What you're doing with the catalog store config sounds about right (based on what I found and wrote about here: https://www.decodable.co/blog/catalogs-in-flink-sql-a-primer#catalog-stores-its-metadata-all-the-way-down). I'd try a few things:
1. Does the `file.path` need the `file://` prefix?
2. Get a shell within the Docker container and verify the contents of `/opt/flink/catalogs/` to make sure the volume mount is working.
3. With the catalog-store config in place, create a new catalog and verify that it's written to the path.
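For steps 2 and 3, a quick way to check from the host might look like this (a sketch, assuming the compose service is named `jobmanager` as in the snippet above):

```shell
# Check that the host directory is actually mounted into the container:
docker compose exec jobmanager ls -l /opt/flink/catalogs/

# After creating a catalog in the SQL Client, re-run the ls above;
# if the file catalog store is wired up, a <catalog-name>.yaml file
# should appear in that directory.
```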
a
Hello! Just fixed it. All your hints make sense, but the root problem was that I had lost a piece of configuration in the docker-compose file:
```yaml
sql-client:
    image: ultrafab_flink_kafka:1.19
    #command: bin/sql-gateway.sh start-foreground -Dsql-gateway.endpoint.type=rest -Dsql-gateway.endpoint.rest.address=jobmanager
    command: bin/sql-client.sh
    ports:
      - "8983:8083"
      - "10000:10000"
    depends_on:
      - jobmanager
    environment:
      - |
        FLINK_PROPERTIES=
        jobmanager.rpc.address: jobmanager
        rest.address: jobmanager
        table.catalog-store.kind: file
        table.catalog-store.file.path: /opt/flink/catalogs/
    volumes:
      - ./catalogs/:/opt/flink/catalogs/ # <- this one
      - ./conf/hive-site.xml:/opt/flink/conf/hive-site.xml # <- and this one
      - ./highavailabilty:/tmp
```
r
glad you got it working 🙂
a
Now when the SQL Client starts it shows all the previous catalogs, and when I create a new catalog it appears in the catalogs folder.
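For anyone following along, a quick sanity check from a fresh SQL Client session (a sketch; the catalog name is the one from this thread):

```sql
-- With the catalog store configured and its volume mounted, previously
-- created catalogs should already be listed on startup:
SHOW CATALOGS;

-- Switching into one should expose the tables persisted in Hive/Postgres
-- from earlier sessions, without recreating the catalog:
USE CATALOG hive_catalog;
SHOW TABLES;
```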