I am trying to use kafka source and redshift destination
# ask-community-for-troubleshooting
m
I am trying to use a Kafka source and a Redshift destination. I have messages in the form of JSON, but the JSON is just inserted into the raw table as a blob. I want the JSON message as a proper table, with the fields as columns. I have turned "Normalisation" on, but it is still not working. Help appreciated
h
Hey, can you share the connection catalog which you have set for the connection?
m
Hey, I am new to Airbyte. Can you please guide me to where I can find the connection catalog?
h
This is basically a step where you are creating a connection: you would have chosen certain streams to sync, and each stream will have columns. Can you help with the column names and their types?
m
So, in that list I can see only one stream, and in that stream there is only one column, with type String.
h
Got it. That stream has to be of type object, with the respective properties it has. You can edit it with the API here.
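Roughly, the catalog can be inspected like this sketch against a local OSS deployment (the host, port, and connection ID below are placeholder assumptions):
```python
import requests

# Sketch: inspect a connection's configured catalog via the Airbyte config API.
# The host/port assume a default local OSS deployment; the ID is a placeholder.
AIRBYTE_URL = "http://localhost:8000/api/v1"
CONNECTION_ID = "<your-connection-id>"  # visible in the connection's URL in the UI

resp = requests.post(
    f"{AIRBYTE_URL}/web_backend/connections/get",
    json={"connectionId": CONNECTION_ID},
)
resp.raise_for_status()
connection = resp.json()

# Each stream carries a JSON schema; normalization uses it to derive columns.
for entry in connection["syncCatalog"]["streams"]:
    stream = entry["stream"]
    print(stream["name"], stream["jsonSchema"])
```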
m
[Screenshots attached: Screenshot 2022-03-01 at 4.22.36 PM.png, Screenshot 2022-03-01 at 4.22.54 PM.png]
I am using the Airbyte UI; can this be done from the UI? Otherwise, I am OK doing it with the API.
h
We don't support that in the UI right now.
m
Not even in the cloud subscription?
h
m
This is not the issue I am facing… I am OK with getting all the columns, but my problem is that none are showing up.
Let me post a sample JSON. Let's say below is the JSON payload in my Kafka topic:
```json
{
  "ID": "1",
  "Artist": "Rick Astley",
  "Song": "Never Gonna Give You Up"
}
```
The columns I should get are ID, Artist, and Song, but all I can see is a single string column containing the whole object; I want these fields as columns in my destination. Basic Normalisation claims that it will map the fields in the JSON object to columns in the destination, but that is not happening, and I have no control over manual mapping like we have in other tools (Kafka Connect and so on).
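For a payload like the one above, the stream's schema would need to look roughly like this sketch for normalization to expand the fields into columns (the types here are guesses from the sample):
```python
# Sketch: the kind of stream schema basic normalization needs in order to
# emit ID, Artist, and Song as separate destination columns. Types are
# guessed from the sample payload above.
desired_schema = {
    "type": "object",
    "properties": {
        "ID": {"type": "string"},
        "Artist": {"type": "string"},
        "Song": {"type": "string"},
    },
}
```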
h
If you do a GET call on that connection, you can check the streams: the one that is of type string, you can change to object and add properties to it. Then hit the update call and run the sync again.
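Put together, the whole flow could look something like this sketch (endpoints per the Airbyte config API on a local OSS deployment; the connection ID and the set of fields echoed back on update are assumptions and may vary by version):
```python
import requests

AIRBYTE_URL = "http://localhost:8000/api/v1"  # assumes a default local OSS deployment
CONNECTION_ID = "<your-connection-id>"        # placeholder

# 1. GET the connection and its configured catalog.
connection = requests.post(
    f"{AIRBYTE_URL}/web_backend/connections/get",
    json={"connectionId": CONNECTION_ID},
).json()

# 2. Where a stream is typed as a plain string, change it to an object
#    with explicit properties (field names/types from the sample payload).
object_schema = {
    "type": "object",
    "properties": {
        "ID": {"type": "string"},
        "Artist": {"type": "string"},
        "Song": {"type": "string"},
    },
}
for entry in connection["syncCatalog"]["streams"]:
    if entry["stream"]["jsonSchema"].get("type") == "string":
        entry["stream"]["jsonSchema"] = object_schema

# 3. Hit the update call with the patched catalog. The update endpoint
#    generally expects the rest of the connection settings echoed back;
#    the exact required fields may differ by Airbyte version.
update_body = {
    "connectionId": CONNECTION_ID,
    "syncCatalog": connection["syncCatalog"],
    "status": connection["status"],
    "namespaceDefinition": connection.get("namespaceDefinition"),
    "prefix": connection.get("prefix"),
    "schedule": connection.get("schedule"),
    "operations": connection.get("operations", []),
}
requests.post(
    f"{AIRBYTE_URL}/web_backend/connections/update", json=update_body
).raise_for_status()

# 4. Run the sync again so normalization picks up the new schema.
requests.post(
    f"{AIRBYTE_URL}/connections/sync", json={"connectionId": CONNECTION_ID}
).raise_for_status()
```
After the sync, normalization should create the fields as columns in the destination table.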
m
Using the API? I am using the Airbyte UI, and it does not have any such option.
h
Yeah, you have to use the API right now, as we don't support it in the UI yet.
m
@Harshith (Airbyte), thank you, the API did work. We should mention in the docs to use the API for this instead of the UI; that may save time 😇
s
Hi, I am facing the same situation and haven't used the API yet. Could you please share some hints or code on how to do that? What I need is to normalize simple JSON from a Kafka topic into a Postgres table with several attributes.