# troubleshooting
j
New to Pinot. We're using the Pinot Docker images. We've created offline tables successfully, but can't create a realtime table: the segment status is 'bad'. There are no error messages in the broker, controller, or server logs, so I'm stuck on how to debug this.
n
Not sure what your setup is, but I was getting some bad segments too, and giving some of the components more resources helped. Try reloading the segment (in the UI, click on the segment, then reload) and see if it reloads successfully.
How are you ingesting/creating the segments?
j
I've tried reloading with no success. Data will eventually come from a Kafka topic (different server). Kafka authentication is a bit complex: SASL_SSL with a user id/password and a corporate self-signed CA for the SSL cert. I've added the certificate authority to the Docker images' cacerts file (/usr/local/openjdk-8/jre/lib/security/cacerts), and I've specified the id/pw in the streamConfigs via stream.kafka.username and stream.kafka.password.
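For reference, the standard Kafka client way to pass SASL credentials is `sasl.jaas.config` (together with `security.protocol` and `sasl.mechanism`) rather than separate username/password keys. A sketch of what the table's streamConfigs might look like, assuming Pinot's Kafka 2.x consumer plugin passes Kafka client properties through; the topic, broker host, mechanism, and credentials here are placeholders:

```json
{
  "streamConfigs": {
    "streamType": "kafka",
    "stream.kafka.topic.name": "my-topic",
    "stream.kafka.broker.list": "kafka.example.com:9093",
    "stream.kafka.consumer.type": "lowlevel",
    "stream.kafka.consumer.factory.class.name": "org.apache.pinot.plugin.stream.kafka20.KafkaConsumerFactory",
    "stream.kafka.decoder.class.name": "org.apache.pinot.plugin.stream.kafka.KafkaJSONMessageDecoder",
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",
    "sasl.jaas.config": "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"myuser\" password=\"mypassword\";"
  }
}
```

The SSL trust side (the corporate CA) still has to be visible to the JVM, e.g. via the cacerts file mentioned above or `ssl.truststore.location`/`ssl.truststore.password` properties.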
I would expect a connection error to be in the logs somewhere though
c
@Xiang Fu @Jackie any pointers here ^^ ?
x
how do you connect to the kafka cluster? does the connection open?
also, if you deploy the pinot controller/server separately, then you need to configure kafka auth in both the controller and the server
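Building on that point: since both the controller and the server open Kafka connections for a realtime table, the CA has to be trusted in both containers. A sketch of how that could look in a docker-compose setup, assuming a local `cacerts` file with the corporate CA already imported (image names and the cacerts path are taken from this thread / placeholders):

```yaml
services:
  pinot-controller:
    image: apachepinot/pinot:latest
    volumes:
      # mount the patched trust store over the JVM default in the controller
      - ./cacerts:/usr/local/openjdk-8/jre/lib/security/cacerts:ro
  pinot-server:
    image: apachepinot/pinot:latest
    volumes:
      # same trust store for the server, which also consumes from Kafka
      - ./cacerts:/usr/local/openjdk-8/jre/lib/security/cacerts:ro
```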
k
@Josh Highley FWIW, I’ve been fooled into thinking there was no logging output before with Pinot, due to
`<RandomAccessFile name="controllerLog" fileName="pinotController.log" immediateFlush="false">`
in the log configuration file(s). Since `immediateFlush` is false, a connection error wouldn’t show up right away in the logs.
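If buffering is suspected, flipping that attribute in the log4j2 config makes errors appear in the file immediately, at some throughput cost. A sketch of the one-attribute change, keeping the appender and file names from the snippet above:

```xml
<!-- flush every log event to disk immediately so connection errors are visible right away -->
<RandomAccessFile name="controllerLog" fileName="pinotController.log" immediateFlush="true">
```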
j
I've managed to find an error message in the Controller log: "org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata". I had added the certificates to the Controller container's cacerts and restarted it, but still no success. However, I was able to start the Kafka container described in the Pinot docs' Manual Cluster Setup and successfully connect the realtime table to that Kafka instance. So, this seems to be some kind of connectivity or authentication issue with the other Kafka server.
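A "Timeout expired while fetching topic metadata" error often means the TCP/TLS connection to the broker never fully succeeds. Two diagnostics that could be run from inside the controller/server container to narrow it down; the broker hostname/port and the "corp" grep pattern are placeholders, and the cacerts path and default `changeit` store password are assumptions:

```shell
# 1) Is the broker's SASL_SSL listener reachable, and does the TLS handshake
#    complete? (openssl uses its own trust store, so a verify error here may
#    still be fine for the JVM if the CA was imported into cacerts)
openssl s_client -connect kafka.example.com:9093 -servername kafka.example.com </dev/null || true

# 2) Was the corporate CA actually imported into the cacerts file this JVM uses?
keytool -list \
  -keystore /usr/local/openjdk-8/jre/lib/security/cacerts \
  -storepass changeit | grep -i corp || true

# (|| true keeps the script going so both checks run; inspect the output)
command -v openssl >/dev/null
```

If the handshake fails only against the corporate broker, the issue is likely the trust store or the SASL properties rather than Pinot itself.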