# help-connector-development
m
Hi, team. I'm one of the `clickhouse destination` maintainers. One of the `clickhouse destination` customers has the following error:

`Missing columns: '_airbyte_unique_key' while processing query: 'SELECT number, _deleted, _creationTime, _ab_cdc_updated_at, _ab_cdc_deleted_at, _ab_cdc_lsn, _id, _ts, _airbyte_ab_id, _airbyte_emitted_at, _airbyte_no`

on a subsequent run with the `Incremental | Deduped + history` strategy. What could be the reason, considering that Airbyte itself creates the table? How can we debug it?
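For context, a minimal first check, with placeholder database/table names (the raw table name here assumes Airbyte's usual `_airbyte_raw_<stream>` convention), is to compare the columns that actually exist in ClickHouse against what the failing query expects:

```sql
-- Placeholder names: replace 'default' and 'my_stream' with your destination database
-- and the normalized table Airbyte created for the failing stream.
SELECT name, type
FROM system.columns
WHERE database = 'default'
  AND table = 'my_stream'
ORDER BY position;

-- The raw table Airbyte writes to before normalization, for comparison:
DESCRIBE TABLE default._airbyte_raw_my_stream;
```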
👀 1
k
A message from kapa.ai
Hmm, I don't know enough to give you a confident answer yet. However, I can provide you with some information on ClickHouse destination features and output schema from the [documentation](https://docs.airbyte.com/integrations/destinations/clickhouse). Each stream will be output into its own table in ClickHouse, and each table will contain 3 columns:

- `_airbyte_ab_id`: a uuid assigned by Airbyte to each event that is processed. The column type in ClickHouse is `String`.
- `_airbyte_emitted_at`: a timestamp representing when the event was pulled from the data source. The column type in ClickHouse is `DateTime64`.
- `_airbyte_data`: a json blob representing the event data. The column type in ClickHouse is `String`.

For more information on incremental data synchronization and the deduped + history strategy, you can refer to this [tutorial](https://airbyte.com/tutorials/incremental-data-synchronization). If you still need assistance, I recommend reaching out to the Airbyte community on their [forum](https://discuss.airbyte.io) or [Slack](https://airbytehq.slack.com) for further support.
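As a rough sketch of the raw-table layout described above (the table name, engine, and ordering key are illustrative assumptions, not necessarily what the connector generates):

```sql
-- Illustrative only: raw table shape matching the documented columns.
CREATE TABLE IF NOT EXISTS default._airbyte_raw_my_stream
(
    _airbyte_ab_id      String,         -- uuid assigned by Airbyte to each event
    _airbyte_emitted_at DateTime64(3),  -- when the event was pulled from the source
    _airbyte_data       String          -- the event data as a JSON blob
)
ENGINE = MergeTree
ORDER BY _airbyte_emitted_at;
```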
c
@Mikhail Shustov have you solved it? I am facing the same problem
👍 1
m
@Mikhail Shustov, I am also facing the same issue. How did you solve it?
👍 1
m
I didn't. We need assistance from someone at Airbyte to debug it
👍 2
m
@Mikhail Shustov, could you point me to the person that helped you?
m
It used to be Greg S. He is not with Airbyte anymore
m
Hmm... What do you suggest I do? I'm kinda stumped on this one.
m
What does your pipeline look like?
m
MySQL (5.7) to ClickHouse (23.5.3.24). Incremental | Deduped + history. No customizations or anything crazy. What other detail(s) would be helpful?
m
How do you run ClickHouse? Can you use experimental features?
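In case it helps, a quick way to see which experimental settings are enabled on a given instance (just a convenience query against `system.settings`, nothing connector-specific):

```sql
-- Lists experimental toggles and whether they were changed from their defaults.
SELECT name, value, changed
FROM system.settings
WHERE name LIKE 'allow_experimental%'
ORDER BY name;
```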
m
I have deployed it using Docker to a server that I control
I'll start investigating that now!
Is this what you are using to solve your issue?
a
Were you all able to resolve this? I am still facing this issue with ClickHouse as a destination and MySQL or PGSQL as sources
j
Hi all, does Airbyte support CDC to ClickHouse? I have PostgreSQL (source) and ClickHouse (dest), and I am unable to create a destination connection to ClickHouse. I'm using Airbyte on the Cloud. I am able to CDC from PostgreSQL to SQL Server with no issue.