1) There are 3 tables: a master table (employee) and two child tables (employeeDept, employeestatus).
2) Data is stored in master Kafka topics (here the Kafka topics are used more like a database than for streaming; the data of all employees is kept in them), so there are 3 master Kafka topics, one per table.
3) There are 3 Change Data Capture (CDC) Kafka topics, one per table.
4) There is 1 Kafka sink topic.
Requirement: when data changes, the CDC Kafka topic (point 3) triggers a message to Flink, and Flink needs to send the consolidated information for that particular row to the Kafka sink topic (point 4).
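Roughly, the wiring I have in mind looks like the minimal Java DataStream sketch below. The topic names, bootstrap servers and plain-string payloads are just placeholders I made up; the enrichment itself (the part I'm asking about) is only a comment here:

    import org.apache.flink.api.common.eventtime.WatermarkStrategy;
    import org.apache.flink.api.common.serialization.SimpleStringSchema;
    import org.apache.flink.connector.kafka.sink.KafkaRecordSerializationSchema;
    import org.apache.flink.connector.kafka.sink.KafkaSink;
    import org.apache.flink.connector.kafka.source.KafkaSource;
    import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    public class CdcConsolidationSkeleton {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            // CDC topic from point 3 (employee changes); topic/server names are placeholders.
            KafkaSource<String> cdcSource = KafkaSource.<String>builder()
                    .setBootstrapServers("localhost:9092")
                    .setTopics("employee-cdc")
                    .setGroupId("employee-consolidator")
                    .setStartingOffsets(OffsetsInitializer.latest())
                    .setValueOnlyDeserializer(new SimpleStringSchema())
                    .build();

            // Sink topic from point 4 that should receive the consolidated record.
            KafkaSink<String> sink = KafkaSink.<String>builder()
                    .setBootstrapServers("localhost:9092")
                    .setRecordSerializer(KafkaRecordSerializationSchema.builder()
                            .setTopic("employee-consolidated")
                            .setValueSerializationSchema(new SimpleStringSchema())
                            .build())
                    .build();

            DataStream<String> cdcEvents = env.fromSource(
                    cdcSource, WatermarkStrategy.noWatermarks(), "employee-cdc");

            // The enrichment step (pulling dept/status data for the changed employee)
            // would go here; right now this just forwards the raw CDC payload.
            cdcEvents.sinkTo(sink);

            env.execute("employee-cdc-consolidation");
        }
    }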
Employee table: Name, ID, Address, DeptId, StatusID columns
employeeDept: DeptId, DeptName columns
employeestatus: StatusID, StatusName columns
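One option I'm considering is reading the 3 master topics (point 2) as changelog tables with Flink's upsert-kafka connector, along the lines of the sketch below. The topic names, bootstrap servers, formats and the STRING column types are all assumptions on my part; the real topics would need to be keyed on the primary key columns:

    import org.apache.flink.table.api.EnvironmentSettings;
    import org.apache.flink.table.api.TableEnvironment;

    public class MasterTables {
        public static void main(String[] args) {
            TableEnvironment tEnv = TableEnvironment.create(
                    EnvironmentSettings.newInstance().inStreamingMode().build());

            // employee master topic (point 2) read as a changelog table keyed by ID.
            tEnv.executeSql(
                "CREATE TABLE employee ("
                + "  ID STRING, Name STRING, Address STRING, DeptId STRING, StatusID STRING,"
                + "  PRIMARY KEY (ID) NOT ENFORCED"
                + ") WITH ("
                + "  'connector' = 'upsert-kafka',"
                + "  'topic' = 'employee-master',"
                + "  'properties.bootstrap.servers' = 'localhost:9092',"
                + "  'key.format' = 'json', 'value.format' = 'json')");

            // employeeDept master topic, keyed by DeptId.
            tEnv.executeSql(
                "CREATE TABLE employeeDept ("
                + "  DeptId STRING, DeptName STRING, PRIMARY KEY (DeptId) NOT ENFORCED"
                + ") WITH ("
                + "  'connector' = 'upsert-kafka',"
                + "  'topic' = 'employee-dept-master',"
                + "  'properties.bootstrap.servers' = 'localhost:9092',"
                + "  'key.format' = 'json', 'value.format' = 'json')");

            // employeestatus master topic, keyed by StatusID.
            tEnv.executeSql(
                "CREATE TABLE employeestatus ("
                + "  StatusID STRING, StatusName STRING, PRIMARY KEY (StatusID) NOT ENFORCED"
                + ") WITH ("
                + "  'connector' = 'upsert-kafka',"
                + "  'topic' = 'employee-status-master',"
                + "  'properties.bootstrap.servers' = 'localhost:9092',"
                + "  'key.format' = 'json', 'value.format' = 'json')");
        }
    }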
Example: John (employee table) has an address change. The employee CDC Kafka topic fires (point 3), Flink listens to that CDC topic, and the Flink program should then fetch the data belonging to John from all three tables (employee, employeeDept, employeestatus) and send his consolidated record (Name, ID, Address, DeptId, DeptName, StatusID, StatusName) with the updated address to the Kafka sink topic (point 4).
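Continuing the idea above, the consolidation itself could maybe be a single join that keeps writing the merged row to the sink topic whenever any side changes. This sketch assumes the three tables are registered as in the previous snippet; employee_consolidated and all connector options are placeholders of mine:

    import org.apache.flink.table.api.TableEnvironment;

    public class ConsolidationJob {

        // Assumes employee, employeeDept and employeestatus are already registered
        // as upsert-kafka tables, e.g. as in the previous sketch.
        public static void run(TableEnvironment tEnv) {
            // Sink topic (point 4), keyed by employee ID; names/options are placeholders.
            tEnv.executeSql(
                "CREATE TABLE employee_consolidated ("
                + "  ID STRING, Name STRING, Address STRING,"
                + "  DeptId STRING, DeptName STRING, StatusID STRING, StatusName STRING,"
                + "  PRIMARY KEY (ID) NOT ENFORCED"
                + ") WITH ("
                + "  'connector' = 'upsert-kafka',"
                + "  'topic' = 'employee-consolidated',"
                + "  'properties.bootstrap.servers' = 'localhost:9092',"
                + "  'key.format' = 'json', 'value.format' = 'json')");

            // A change on any of the three tables re-emits the consolidated row(s)
            // for the affected employee key(s), e.g. John's address update.
            tEnv.executeSql(
                "INSERT INTO employee_consolidated "
                + "SELECT e.ID, e.Name, e.Address, e.DeptId, d.DeptName, e.StatusID, s.StatusName "
                + "FROM employee e "
                + "LEFT JOIN employeeDept d ON e.DeptId = d.DeptId "
                + "LEFT JOIN employeestatus s ON e.StatusID = s.StatusID");
        }
    }

Note that this sketch is driven by the master topics directly rather than by the CDC topics from point 3, and I'm not sure whether that is better than listening to the CDC topics and doing explicit lookups, hence the question below.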
What is the best approach for this use case?