# troubleshooting
s
Can someone please help me here? I am facing an issue in modelling the table, as the input event is highly nested and I am not able to access the nested elements from the code snippet. https://apache-flink.slack.com/archives/C03G7LJTS2G/p1684852955178819
m
@Shiv Desai As mentioned in the channel description, one of the few rules we have in this Slack is not to directly @ people
s
Sorry for that. Could you please help me, as I am stuck with the above-mentioned issue?
m
I can’t. I have very little experience with switching from the DataStream API to the Table API. Not sure why you can’t read from the Schema Registry (confluent-avro) in the Table API to avoid the conversion
s
I tried the code below to avoid the conversion, and it works fine for the root element ("a") in the JSON, but I am not sure what datatype I should give for "b". Furthermore, how can I access "c" using a select statement? Example:
{
  "a": "a",
  "b": {
    "c": "c"
  }
}
Code:
tEnv.executeSql("CREATE TABLE rules (\n" +
         "    a VARCHAR(3)\n" +  // trailing comma removed; a dangling comma before ')' is a SQL syntax error
         ") WITH (\n" +
         "    'connector' = 'kafka',\n" +
         "    'properties.bootstrap.servers' = '',\n" +
         "    'scan.startup.mode' = 'earliest-offset',\n" +
         "    'topic'     = '',\n" +
         "    'value.format'    = 'avro-confluent',\n" +
         "    'value.avro-confluent.url' = ''\n" +
         ")");
m
s
Thank you very much for your help. Furthermore, I was curious whether there is a way to create the table with all fields using the schema string directly?
m
There isn't in open source Flink. There is https://issues.apache.org/jira/browse/FLINK-18777, but that has been stalled for quite some time