I am facing issues in setting up kafka for datahub...
# all-things-deployment
h
I am facing issues in setting up Kafka for DataHub. I have used the following configuration to connect my AWS MSK with SASL/SCRAM in my values.yaml file:
kafka:
  bootstrap:
    server: "bootstrap1:9096,bootstrap2:9096,bootstrap3:9096"
  zookeeper:
    server: "zk1:2181,zk2:2181,zk3:2181"
global:
credentialsAndCertsSecrets:
  name: sasl-jass-config
  secureEnv:
    sasl.jaas.config: sasl_jaas_config

springKafkaConfigurationOverrides:
  security.protocol: SASL_SSL
  sasl.mechanism: SCRAM-SHA-512
My secrets file contains the JAAS config content as follows.
org.apache.kafka.common.security.scram.ScramLoginModule required username="xxxxxxxxxx" password="xxxxxxxxxxxx";
and I have verified that my MSK cluster is already configured with SASL/SCRAM, but datahub-kafka-setup-job is still failing with the following error:
[main] ERROR io.confluent.admin.utils.cli.KafkaReadyCommand - Error while running kafka-ready.
org.apache.kafka.common.KafkaException: Failed to create new KafkaAdminClient
	at org.apache.kafka.clients.admin.KafkaAdminClient.createInternal(KafkaAdminClient.java:535)
	at org.apache.kafka.clients.admin.Admin.create(Admin.java:75)
	at org.apache.kafka.clients.admin.AdminClient.create(AdminClient.java:49)
	at io.confluent.admin.utils.ClusterStatus.isKafkaReady(ClusterStatus.java:138)
	at io.confluent.admin.utils.cli.KafkaReadyCommand.main(KafkaReadyCommand.java:150)
Caused by: java.lang.IllegalArgumentException: Could not find a 'KafkaClient' entry in the JAAS configuration. System property 'java.security.auth.login.config' is not set
	at org.apache.kafka.common.security.JaasContext.defaultContext(JaasContext.java:131)
	at org.apache.kafka.common.security.JaasContext.load(JaasContext.java:96)
	at org.apache.kafka.common.security.JaasContext.loadClientContext(JaasContext.java:82)
	at org.apache.kafka.common.network.ChannelBuilders.create(ChannelBuilders.java:134)
	at org.apache.kafka.common.network.ChannelBuilders.clientChannelBuilder(ChannelBuilders.java:73)
	at org.apache.kafka.clients.ClientUtils.createChannelBuilder(ClientUtils.java:105)
	at org.apache.kafka.clients.admin.KafkaAdminClient.createInternal(KafkaAdminClient.java:508)
Can someone please help? Is there anything specific I am missing in the configuration?
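For reference, the secret's data value must be the base64-encoded JAAS line itself, not a path to a file. A sketch of producing and checking that value, with the placeholder credentials from above:

```shell
# Hypothetical sketch: a Kubernetes Secret's data fields are base64-encoded,
# so the value stored under sasl_jaas_config should decode back to the
# ScramLoginModule line verbatim. Username/password are placeholders.
jaas='org.apache.kafka.common.security.scram.ScramLoginModule required username="xxxxxxxxxx" password="xxxxxxxxxxxx";'

# Encode for the Secret's data section (equivalent to what
# `kubectl create secret generic sasl-jass-config --from-literal=sasl_jaas_config=...` stores).
encoded=$(printf '%s' "$jaas" | base64 | tr -d '\n')
echo "$encoded"

# Decoding must round-trip to the original line.
decoded=$(printf '%s' "$encoded" | base64 -d)
[ "$decoded" = "$jaas" ] && echo "round trip ok"
```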
l
@bumpy-needle-3184 ^
b
Hi Kishore, could you modify your values.yaml to the below and try?
kafka:
  bootstrap:
    server: "bootstrap1:9096,bootstrap2:9096,bootstrap3:9096"
  zookeeper:
    server: "zk1:2181,zk2:2181,zk3:2181"

global:
  springKafkaConfigurationOverrides:
    security.protocol: SASL_SSL
    sasl.mechanism: SCRAM-SHA-512
  credentialsAndCertsSecrets:
    name: sasl-jass-config
    secureEnv:
      sasl.jaas.config: sasl_jaas_config
"credentialsAndCertsSecrets" key entry shoud be under "global" key
h
@bumpy-needle-3184 ah, that was a typo in my message. It is under global. But I am still seeing the same error.
b
From the error, it looks like it is not able to read the JAAS config. Could you check whether the secret is created with the name "sasl-jass-config" and the key "sasl_jaas_config" under data, as in the entry below?
apiVersion: v1
data:
  sasl_jaas_config: xxxxxxxxx
kind: Secret
metadata:
  name: sasl-jass-config
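One way to run that check (a sketch; assumes kubectl access to the cluster and that the secret name and key match the values.yaml above):

```shell
# Hypothetical check: pull the secret's sasl_jaas_config value and decode it.
# It should print the ScramLoginModule line itself, not a file path.
kubectl get secret sasl-jass-config -o jsonpath='{.data.sasl_jaas_config}' | base64 -d; echo
```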
h
apiVersion: v1
data:
  sasl_jaas_config: xxxxxxxxxxxxxxxxxxxxxxxxx
kind: Secret
metadata:
  creationTimestamp: "2022-06-17T17:28:45Z"
  name: sasl-jass-config
  namespace: default
My secret looks the same.
b
@early-lamp-41924 is SASL_SCRAM authentication supported for datahub-kafka-setup-job?
l
@incalculable-ocean-74010 ^
i
We have never configured SASL_SCRAM as far as I’m aware. I’m unfamiliar with it, but I can try to help. Could we schedule a call, @helpful-processor-71693?
h
@incalculable-ocean-74010 This issue has been resolved. We modified the kafka-setup.sh script to read the SASL/SCRAM connection properties from its environment variables, and it successfully created the required Kafka topics. Code snippet for reference:
if [[ $KAFKA_PROPERTIES_SECURITY_PROTOCOL == "SSL" ]]; then
    if [[ -n $KAFKA_PROPERTIES_SSL_KEYSTORE_LOCATION ]]; then
        echo "ssl.keystore.location=$KAFKA_PROPERTIES_SSL_KEYSTORE_LOCATION" >> $CONNECTION_PROPERTIES_PATH
        echo "ssl.keystore.password=$KAFKA_PROPERTIES_SSL_KEYSTORE_PASSWORD" >> $CONNECTION_PROPERTIES_PATH
        echo "ssl.key.password=$KAFKA_PROPERTIES_SSL_KEY_PASSWORD" >> $CONNECTION_PROPERTIES_PATH
        if [[ -n $KAFKA_PROPERTIES_SSL_KEYSTORE_TYPE ]]; then
            echo "ssl.keystore.type=$KAFKA_PROPERTIES_SSL_KEYSTORE_TYPE" >> $CONNECTION_PROPERTIES_PATH
        fi
    fi
    if [[ -n $KAFKA_PROPERTIES_SSL_TRUSTSTORE_LOCATION ]]; then
        echo "ssl.truststore.location=$KAFKA_PROPERTIES_SSL_TRUSTSTORE_LOCATION" >> $CONNECTION_PROPERTIES_PATH
        echo "ssl.truststore.password=$KAFKA_PROPERTIES_SSL_TRUSTSTORE_PASSWORD" >> $CONNECTION_PROPERTIES_PATH
        if [[ -n $KAFKA_PROPERTIES_SSL_TRUSTSTORE_TYPE ]]; then
            echo "ssl.truststore.type=$KAFKA_PROPERTIES_SSL_TRUSTSTORE_TYPE" >> $CONNECTION_PROPERTIES_PATH
        fi
    fi
    echo "ssl.endpoint.identification.algorithm=$KAFKA_PROPERTIES_SSL_ENDPOINT_IDENTIFICATION_ALGORITHM" >> $CONNECTION_PROPERTIES_PATH
fi
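The snippet above shows the existing SSL branch; the SASL/SCRAM handling we added is not quoted verbatim in this thread, so the following is only a hedged sketch of what an analogous branch could look like. The `KAFKA_PROPERTIES_SASL_*` names follow the same naming convention as above but are assumptions, and the example values stand in for what the chart would inject from the secret:

```shell
# Hypothetical sketch of an analogous SASL_SSL branch, NOT the exact change
# made in this thread. Example values simulate the env vars the Helm chart
# would inject; CONNECTION_PROPERTIES_PATH points at a temp file here.
KAFKA_PROPERTIES_SECURITY_PROTOCOL="SASL_SSL"
KAFKA_PROPERTIES_SASL_MECHANISM="SCRAM-SHA-512"
KAFKA_PROPERTIES_SASL_JAAS_CONFIG='org.apache.kafka.common.security.scram.ScramLoginModule required username="x" password="y";'
CONNECTION_PROPERTIES_PATH=$(mktemp)

if [[ $KAFKA_PROPERTIES_SECURITY_PROTOCOL == "SASL_SSL" ]]; then
    echo "security.protocol=$KAFKA_PROPERTIES_SECURITY_PROTOCOL" >> "$CONNECTION_PROPERTIES_PATH"
    echo "sasl.mechanism=$KAFKA_PROPERTIES_SASL_MECHANISM" >> "$CONNECTION_PROPERTIES_PATH"
    # The JAAS line itself, as read from the sasl-jass-config secret.
    echo "sasl.jaas.config=$KAFKA_PROPERTIES_SASL_JAAS_CONFIG" >> "$CONNECTION_PROPERTIES_PATH"
fi

cat "$CONNECTION_PROPERTIES_PATH"
```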
Currently we are facing this issue in the datahub-gms pod: https://datahubspace.slack.com/archives/CV2UVAPPG/p1655742739959599