chilly-elephant-51826 06/04/2022, 12:00 PM
datahub-actions runs some ingestion over a Kafka stream, but since my Kafka cluster is protected and uses SASL rather than a plain SSL connection, it is not passing the correct parameters required to connect.
This is the Kafka config that I am using:
security.protocol: SASL_SSL
sasl.mechanism: SCRAM-SHA-512
client.sasl.mechanism: SCRAM-SHA-512
kafkastore.security.protocol: SSL
ssl.endpoint.identification.algorithm: https
ssl.keystore.type: JKS
ssl.protocol: TLSv1.2
ssl.truststore.type: JKS
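As I understand it, the actions container is supposed to pick up `KAFKA_PROPERTIES_*` environment variables and turn them into consumer config keys (underscores mapped to dots). A minimal sketch of that convention, assuming the actual implementation follows it; the function name and sample values here are illustrative, not taken from the datahub-actions source:

```python
def kafka_props_from_env(environ):
    """Collect KAFKA_PROPERTIES_* env vars into a consumer config dict,
    e.g. KAFKA_PROPERTIES_SASL_MECHANISM -> sasl.mechanism.
    Only keys are lowercased; values pass through unchanged."""
    prefix = "KAFKA_PROPERTIES_"
    return {
        key[len(prefix):].lower().replace("_", "."): value
        for key, value in environ.items()
        if key.startswith(prefix)
    }

# Illustrative env, matching the settings above:
env = {
    "KAFKA_PROPERTIES_SECURITY_PROTOCOL": "SASL_SSL",
    "KAFKA_PROPERTIES_SASL_MECHANISM": "SCRAM-SHA-512",
}
print(kafka_props_from_env(env))
# {'security.protocol': 'SASL_SSL', 'sasl.mechanism': 'SCRAM-SHA-512'}
```

If the container honors this convention, all of the SASL/SSL properties above should land in `consumer_config`, which is exactly what the generated config below is missing.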
These values are passed correctly to the container as environment variables, but they are not populated in the config file that gets executed. Here is the config that was generated (found in the container logs):
{'source':
{ 'type': 'datahub-stream',
'config': {
'auto_offset_reset': 'latest',
'connection': {
'bootstrap': 'XXXXXXXXXXXXXX',
'schema_registry_url': 'XXXXXXXXXXXXX',
'consumer_config': {'security.protocol': 'SASL_SSL'}
},
'actions': [
{ 'type': 'executor',
'config': {
'local_executor_enabled': True,
'remote_executor_enabled': 'False',
'remote_executor_type': 'acryl.executor.sqs.producer.sqs_producer.SqsRemoteExecutor',
'remote_executor_config': {
'id': 'remote',
'aws_access_key_id': '""',
'aws_secret_access_key': '""',
'aws_session_token': '""',
'aws_command_queue_url': '""',
'aws_region': '""'
}
}
}
],
'topic_routes': {
'mae': 'MetadataAuditEvent_v4',
'mcl': 'MetadataChangeLog_Versioned_v1'
}
}
},
'sink': {'type': 'console'},
'datahub_api': {'server': '<http://datahub-datahub-gms:8080>', 'extra_headers': {'Authorization': 'Basic __datahub_system:NOTPASSING'}}
}
As the above config shows, the other required configuration options are not getting passed. I have raised a bug; any help is appreciated.
little-megabyte-1074
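For comparison, here is a sketch of what a fully populated `consumer_config` would need to carry for a SASL_SSL/SCRAM-protected cluster, instead of the lone `security.protocol` entry above. This is an assumption based on standard librdkafka property names; the username and password are placeholders:

```python
# Hypothetical, correctly populated consumer_config for SASL_SSL + SCRAM.
# <user>/<password> are placeholders, not real credentials.
expected_consumer_config = {
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "SCRAM-SHA-512",
    "sasl.username": "<user>",
    "sasl.password": "<password>",
    "ssl.endpoint.identification.algorithm": "https",
}
print(sorted(expected_consumer_config))
```

The generated config contains only the first of these keys, which is why the consumer cannot authenticate.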
chilly-elephant-51826 06/15/2022, 4:37 AM
big-carpet-38439 06/17/2022, 3:14 PM