Bruno Mendes
10/17/2022, 5:38 PM
Kishore G
Bruno Mendes
10/17/2022, 5:41 PM
Bruno Mendes
10/17/2022, 5:45 PM
Kishore G
Xiang Fu
Xiang Fu
Bruno Mendes
10/17/2022, 8:03 PM
Bruno Mendes
10/17/2022, 8:06 PM
I mounted the jaas.conf as a secret and set the -Djava.security.auth.login.config=/etc/kafka/secrets/jaas.conf JVM option (via JAVA_OPTS) so Pinot could read the file...
The manifest was something like:
apiVersion: v1
kind: ConfigMap
metadata:
  name: cfmp
  namespace: pinot
data:
  realtime.json: |-
    {
      ...
      # REALTIME table
    }
  schema.json: |-
    {
      # schema
    }
---
apiVersion: batch/v1
kind: Job
metadata:
  name: job
  namespace: pinot
spec:
  template:
    spec:
      containers:
        - name: pinot-job
          image: apachepinot/pinot:latest
          args: [ "AddTable", "-schemaFile", "/var/pinot/cfmp/schema.json", "-tableConfigFile", "/var/pinot/cfmp/realtime.json", "-controllerHost", "pinot-controller", "-controllerPort", "9000", "-exec" ]
          env:
            - name: JAVA_OPTS
              value: "-Xms4G -Xmx4G -Dpinot.admin.system.exit=true -Djava.security.auth.login.config=/etc/kafka/secrets/jaas.conf"
          volumeMounts:
            - name: cfmp
              mountPath: /var/pinot/cfmp  # must match the paths used in args
            - name: kafka-secret
              # mount the secret key as a single file at the path the JVM option points to
              mountPath: /etc/kafka/secrets/jaas.conf
              subPath: rest_jaas.conf
      restartPolicy: Never
      volumes:
        - name: cfmp
          configMap:
            name: cfmp
        - name: kafka-secret
          secret:
            secretName: kafka-secret
  backoffLimit: 0
Xiang Fu
sed "s|JAAS_CONFIG_PATH|${JAAS_CONFIG_PATH}|g" /var/pinot/cfmp/realtime.json
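A runnable sketch of that substitution, using a throwaway file and a hypothetical placeholder key (double quotes so the shell expands the variable, and `|` as the sed delimiter because the value contains slashes):

```shell
# Sketch: replace a JAAS_CONFIG_PATH placeholder in a table config with the
# real path before submitting it. File location and placeholder are assumed.
JAAS_CONFIG_PATH=/etc/kafka/secrets/jaas.conf
cat > /tmp/realtime.json <<'EOF'
{"jvmOpt": "-Djava.security.auth.login.config=JAAS_CONFIG_PATH"}
EOF
sed -i "s|JAAS_CONFIG_PATH|${JAAS_CONFIG_PATH}|g" /tmp/realtime.json
cat /tmp/realtime.json
```

Note that with single quotes around the sed expression, `${JAAS_CONFIG_PATH}` would be written literally instead of being expanded.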
Xiang Fu
-Djava.security.auth.login.config=/etc/kafka/secrets/jaas.conf"
this kind of JVM opt has to be set at startup time
Xiang Fu
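Since the JVM only reads `-Djava.security.auth.login.config` at process start, it would have to be baked into the JVM options of every Pinot component that talks to Kafka. A sketch of what that could look like as Helm values (field names assumed from the Apache Pinot chart, heap sizes illustrative):

```yaml
# Hypothetical Helm values fragment: set the JAAS path at startup on the
# components that open Kafka connections (controller and servers).
controller:
  jvmOpts: "-Xms1G -Xmx1G -Djava.security.auth.login.config=/etc/kafka/secrets/jaas.conf"
server:
  jvmOpts: "-Xms4G -Xmx4G -Djava.security.auth.login.config=/etc/kafka/secrets/jaas.conf"
```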
Bruno Mendes
10/17/2022, 9:05 PM
Bruno Mendes
09/11/2023, 9:03 PM
KafkaClient {
org.apache.kafka.common.security.plain.PlainLoginModule required
username="foo"
password="bar";
};
• on the controller, passed the path of the file: "-Djava.security.auth.login.config=/etc/kafka/secrets/jaas.conf"
• set the table config as follows:
[...]
"tableIndexConfig": {
  "loadMode": "MMAP",
  "streamConfigs": {
    "streamType": "kafka",
    "stream.kafka.topic.name": "test",
    "stream.kafka.consumer.type": "lowlevel",
    "stream.kafka.consumer.prop.auto.offset.reset": "7d",
    "stream.kafka.consumer.factory.class.name": "org.apache.pinot.plugin.stream.kafka20.KafkaConsumerFactory",
    "stream.kafka.broker.list": "my-confluent-address:9092",
    "stream.kafka.decoder.class.name": "org.apache.pinot.plugin.stream.kafka.KafkaJSONMessageDecoder",
    "stream.kafka.decoder.prop.basic.auth.credentials.source": "USER_INFO",
    "sasl.mechanism": "PLAIN",
    "security.protocol": "SASL_PLAINTEXT",
    "sasl.enabled.mechanisms": "PLAIN",
    "realtime.segment.flush.threshold.rows": "50000",
    "realtime.segment.flush.threshold.time": "3h",
    "realtime.segment.flush.threshold.segment.size": "150M"
  }
}
[...]
Now when I add the table I receive a timeout error:
{
  "code": 500,
  "error": "org.apache.pinot.spi.stream.TransientConsumerException: org.apache.pinot.shaded.org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata"
}
Do you have any clue what I am missing?
Xiang Fu
Does putting
"sasl.jaas.config": "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"foo\" password=\"bar\";",
into the table config work for you?
I don't think Pinot honors "-Djava.security.auth.login.config=/etc/kafka/secrets/jaas.conf"
as it may have many Kafka connections.
Xiang Fu
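For reference, a sketch of flattening the jaas.conf credentials into the single-line form that `sasl.jaas.config` expects (placeholder username/password; `PlainLoginModule` to match `"sasl.mechanism": "PLAIN"` in the table config):

```shell
# Sketch: build the single-line "sasl.jaas.config" value from the same
# placeholder credentials as the jaas.conf file shown above.
MODULE=org.apache.kafka.common.security.plain.PlainLoginModule
JAAS_LINE=$(printf '%s required username="%s" password="%s";' "$MODULE" foo bar)
echo "$JAAS_LINE"
```

The resulting string then goes under streamConfigs as the value of "sasl.jaas.config", with the inner quotes escaped for JSON.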
Xiang Fu
Bruno Mendes
09/12/2023, 3:03 AM
Bruno Mendes
09/12/2023, 3:04 AM
Xiang Fu
Bruno Mendes
09/22/2023, 7:22 PM
Bruno Mendes
09/22/2023, 7:23 PM
Bruno Mendes
09/22/2023, 7:23 PM
Xiang Fu
Bruno Mendes
09/26/2023, 3:17 AM