# troubleshoot
w
Hi Everyone, I have 2 queries (DataHub & Client version 10.0.1): 1. Has anyone been able to automatically update lineage between Kafka topics and their upstream & downstream datasets? a. If yes, did you manage to do it without Kafka connectors? 2. I am also trying to ingest Kafka connectors, and the only data it ingests is the connector name. It does not even import the Kafka connector properties. a. The message I get is:
Detected undefined connector <Kafka Connector Name>, which is not in the customized connector list
. The DataHub documentation does not explain how to make the connector ingest the properties. b. The connectors I am trying to import are Snowflake-Sink and generic connectors. Thank you in advance.
📖 1
🔍 1
l
Hey there 👋 I'm The DataHub Community Support bot. I'm here to help make sure the community can best support you with your request. Let's double check a few things first: ✅ There's a lot of good information on our docs site: www.datahubproject.io/docs. Have you searched there for a solution? ✅ It's not uncommon that someone has run into your exact problem before in the community. Have you searched Slack for similar issues? Did you find a solution to your issue? ❌ Sorry you weren't able to find a solution. I'm sending you some tips on info you can provide to help the community troubleshoot. Whenever you feel your issue is solved, please react ✅ to your original message to let us know!
w
1. DataHub version = 10.0.1 2. Deployment: Kubernetes 3. Error logs ===>
Detected undefined connector <my kafka connector name>, which is not in the customized connector list. Please refer to Kafka Connect ingestion recipe to define this customized connector.
but I cannot find any documentation on how to ingest connectors that DataHub does not support. 4. I am using the plain Kafka Connect ingestion as mentioned, along with the MSKC connector APIs.
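For context, a minimal sketch of the kind of kafka-connect ingestion recipe being described here (the URIs and cluster name are placeholders, not values from the thread):

```yaml
# Minimal DataHub kafka-connect ingestion recipe (sketch; endpoints are assumed placeholders)
source:
  type: kafka-connect
  config:
    connect_uri: "http://localhost:8083"   # Kafka Connect REST endpoint (placeholder)
    cluster_name: "connect-cluster"        # placeholder cluster name
sink:
  type: datahub-rest
  config:
    server: "http://localhost:8080"        # DataHub GMS endpoint (placeholder)
```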
@delightful-ram-75848 Would you be able to help me ingest the Kafka connectors' custom properties?
@dazzling-judge-80093 Would you be able to help me figure out a solution/workaround for this?
d
@wide-afternoon-79955 I think you should define a generic connector for your custom connector, something like this: https://github.com/datahub-project/datahub/blob/87d32d7377d2bd05635ae71cec2e4a58a9[…]tests/integration/kafka-connect/kafka_connect_mongo_to_file.yml
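Building on that suggestion, here is a hedged sketch of what a `generic_connectors` entry in the kafka-connect recipe might look like for a connector DataHub does not natively support (the connector name, dataset, and platform below are placeholders, not taken from the thread):

```yaml
# Sketch of a kafka-connect recipe declaring a customized connector
# via generic_connectors (all names/URIs are illustrative placeholders)
source:
  type: kafka-connect
  config:
    connect_uri: "http://localhost:8083"
    generic_connectors:
      - connector_name: my-custom-connector   # the connector named in the warning
        source_dataset: my_source_dataset     # dataset the connector reads from
        source_platform: postgres             # platform of that dataset
```

Declaring the connector this way addresses the "not in the customized connector list" warning by telling the ingestion source how to map the otherwise-unknown connector to a dataset and platform.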