# Troubleshooting
Hey, I am writing a Flink job that writes data from Kafka to PostgreSQL using this approach:
```java
// Obtain the stream
DataStream<PUsageRecord> records =
        env.fromSource(source, WatermarkStrategy.noWatermarks(), "Kafka Source");

Table table = tEnv.fromDataStream(records);

final String s3Sink =
        "CREATE TABLE usagedata (" +
        "  ts TIMESTAMP(3)," +
        "  account STRING," +
        "  bytesused BIGINT" +
        ") WITH (" +
        "  'connector' = 'jdbc'," +
        "  'url' = 'jdbc:postgresql://********/testdb'," +
        "  'table-name' = 'tablename'," +
        "  'username' = 'postgres'," +
        "  'password' = '*****'" +
        ")";
tEnv.executeSql(s3Sink);
```
However, I am getting this error:

```
Query schema: [account: STRING, bytesUsed: BIGINT NOT NULL, ts: TIMESTAMP(9)]
Sink schema:  [ts: TIMESTAMP(3), account: STRING, bytesused: BIGINT]
```

The `ts` field in `PUsageRecord` is a `java.sql.Timestamp`, but it seems to be extracted as `TIMESTAMP(9)`. How do I change the conversion from `java.sql.Timestamp` when going from DataStream to Table, so that it no longer defaults to `TIMESTAMP(9)` and instead matches the sink's `TIMESTAMP(3)`?
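Is something like this the right direction? A sketch of what I'm imagining, assuming Flink 1.13+ where `fromDataStream` accepts an explicit `Schema` to override the reflectively extracted types (the column names below are taken from the query schema in the error message):

```java
import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.api.Schema;
import org.apache.flink.table.api.Table;

// Declare the physical columns explicitly so reflection does not
// default java.sql.Timestamp to TIMESTAMP(9); ts is declared with
// precision 3 to match the sink table.
Table table = tEnv.fromDataStream(
        records,
        Schema.newBuilder()
                .column("account", DataTypes.STRING())
                .column("bytesUsed", DataTypes.BIGINT())
                .column("ts", DataTypes.TIMESTAMP(3))
                .build());
```

I also notice the error shows the column name case differing between query and sink (`bytesUsed` vs `bytesused`), so I suspect that may need aligning as well.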