# troubleshoot
c
Hi guys! I’m trying to do SQL profiling for Redshift. First I tried it with limit: 10 and it was OK. But then I tried it without limit: 10 and got an error for a specific table which has 25M rows and 370 columns. The error is:
```
KafkaException: KafkaError{code=MSG_SIZE_TOO_LARGE,val=10,str="Unable to produce message: Broker: Message size too large"}
```
I did some research about it, and people say I need to increase properties like message.max.bytes, max.request.size, etc. on the broker, producer and consumer sides. I updated the server.properties, consumer.properties and producer.properties files inside the k8s Kafka pod, but I couldn’t solve the issue. Can anybody help me with Kafka and k8s? Note: I think I need to restart the Kafka broker somehow to apply the server.properties changes, but I don’t know how.
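(Editor’s note for anyone hitting this later: files edited directly inside a pod are ephemeral, and the broker only reads server.properties at startup, so changes should go through your chart’s values/env vars and the broker should then be restarted. A rough sketch of the usual knobs, assuming Kafka runs as a StatefulSet named `kafka`; the topic name and sizes are placeholders:)

```sh
# Broker side: raise the per-message cap and the replica fetch size
# together (in server.properties, or your Helm chart's equivalent values):
#   message.max.bytes=10485760
#   replica.fetch.max.bytes=10485760

# Restart the broker so server.properties changes take effect
# (assumes Kafka is deployed as a StatefulSet named "kafka"):
kubectl rollout restart statefulset/kafka -n <your-namespace>

# Alternative: raise the limit per topic at runtime, which needs no
# broker restart ("my-ingest-topic" is a placeholder):
kafka-configs.sh --bootstrap-server localhost:9092 --alter \
  --entity-type topics --entity-name my-ingest-topic \
  --add-config max.message.bytes=10485760
```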
e
Quick q: are you using kafka-ingest or rest-ingest?
c
@early-lamp-41924 kafka-ingest
Do you have any idea?
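(Editor’s note: since this is the kafka sink, the producer-side limit also matters. A minimal sketch of how that might look in the recipe, assuming your DataHub version’s datahub-kafka sink passes producer_config through to the underlying confluent-kafka producer — the broker address and size are placeholders, and note that librdkafka also calls the producer-side cap message.max.bytes:)

```yaml
sink:
  type: datahub-kafka
  config:
    connection:
      bootstrap: "kafka-broker:9092"   # placeholder address
      producer_config:
        # librdkafka's producer-side message size cap (example: 10 MB);
        # the broker's message.max.bytes must be raised to match.
        message.max.bytes: 10485760
```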
h
@curved-jordan-15657, could you dump the MCPs to a file (use a file sink in your recipe instead of the kafka sink) and share the file with us?
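(Editor’s note: for reference, swapping in a file sink looks roughly like this — the filename is just an example, and the source section stays as it was:)

```yaml
source:
  type: redshift
  config:
    # ... keep your existing Redshift connection and profiling settings ...
sink:
  type: file
  config:
    filename: ./mcps.json   # the dumped MCPs end up here
```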