white-guitar-82227
06/02/2023, 1:59 PM
{'exec_id': '7529c81e-9fd3-4038-9702-9aa7035b2e78',
'infos': ['2023-06-02 12:40:09.819865 INFO: Starting execution for task with name=RUN_INGEST',
'2023-06-02 12:40:09.828182 INFO: Caught exception EXECUTING task_id=7529c81e-9fd3-4038-9702-9aa7035b2e78, name=RUN_INGEST, '
'stacktrace=Traceback (most recent call last):\n'
' File "/usr/local/lib/python3.10/site-packages/acryl/executor/execution/default_executor.py", line 122, in execute_task\n'
' task_event_loop.run_until_complete(task_future)\n'
' File "/usr/local/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete\n'
' return future.result()\n'
' File "/usr/local/lib/python3.10/site-packages/acryl/executor/execution/sub_process_ingestion_task.py", line 69, in execute\n'
' recipe: dict = SubProcessTaskUtil._resolve_recipe(validated_args.recipe, ctx, self.ctx)\n'
' File "/usr/local/lib/python3.10/site-packages/acryl/executor/execution/sub_process_task_common.py", line 100, in _resolve_recipe\n'
' raise TaskError(f"Failed to resolve secret with name {match}. Aborting recipe execution.")\n'
'acryl.executor.execution.task.TaskError: Failed to resolve secret with name SINK_TOKEN. Aborting recipe execution.\n'],
'errors': []}
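The TaskError above is raised before ingestion even starts, while the executor substitutes `${...}` placeholders in the recipe with secret values fetched from GMS. Conceptually, the resolution step looks like this (a simplified sketch, not the actual acryl-executor code):

```python
import re

class TaskError(Exception):
    """Raised when a recipe references a secret that cannot be resolved."""

def resolve_recipe(recipe_text: str, secrets: dict) -> str:
    # Replace every ${NAME} placeholder with the secret's value;
    # abort the whole run if any referenced secret is missing,
    # which mirrors the "Failed to resolve secret" error above.
    def substitute(match: re.Match) -> str:
        name = match.group(1)
        if name not in secrets:
            raise TaskError(
                f"Failed to resolve secret with name {name}. Aborting recipe execution."
            )
        return secrets[name]

    return re.sub(r"\$\{(\w+)\}", substitute, recipe_text)

print(resolve_recipe("token: '${SINK_TOKEN}'", {"SINK_TOKEN": "abc123"}))
```

So the error means GMS could not hand back a usable value for SINK_TOKEN, not that the recipe syntax is wrong.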
white-guitar-82227
06/27/2023, 1:39 PM
source:
  type: mongodb
  config:
    connect_uri: 'xxxxxxx'
    username: '${USER}'
    password: '${PASSWORD}'
    enableSchemaInference: true
    useRandomSampling: true
    maxSchemaSize: 500
    database_pattern:
      allow:
        - xxxxxx
      deny:
        - admin|local|config|system
      ignoreCase: true
    collection_pattern:
      allow:
        - xxxxxx
      deny:
        - 'system.*'
      ignoreCase: true
sink:
  type: datahub-rest
  config:
    server: 'http://datahub-app-datahub-gms:8080'
    token: '${SINK_TOKEN}'
This is our ingestion. The error:
~~~~ Execution Summary - RUN_INGEST ~~~~
Execution finished with errors.
{'exec_id': 'e4f45ba9-675a-4080-83ca-dd96948d247f',
'infos': ['2023-06-27 13:15:26.499701 INFO: Starting execution for task with name=RUN_INGEST',
'2023-06-27 13:15:26.508139 INFO: Caught exception EXECUTING task_id=e4f45ba9-675a-4080-83ca-dd96948d247f, name=RUN_INGEST, '
'stacktrace=Traceback (most recent call last):\n'
' File "/usr/local/lib/python3.10/site-packages/acryl/executor/execution/default_executor.py", line 122, in execute_task\n'
' task_event_loop.run_until_complete(task_future)\n'
' File "/usr/local/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete\n'
' return future.result()\n'
' File "/usr/local/lib/python3.10/site-packages/acryl/executor/execution/sub_process_ingestion_task.py", line 69, in execute\n'
' recipe: dict = SubProcessTaskUtil._resolve_recipe(validated_args.recipe, ctx, self.ctx)\n'
' File "/usr/local/lib/python3.10/site-packages/acryl/executor/execution/sub_process_task_common.py", line 100, in _resolve_recipe\n'
' raise TaskError(f"Failed to resolve secret with name {match}. Aborting recipe execution.")\n'
'acryl.executor.execution.task.TaskError: Failed to resolve secret with name SINK_TOKEN. Aborting recipe execution.\n'],
'errors': []}
~~~~ Ingestion Logs ~~~~
white-guitar-82227
06/27/2023, 1:42 PM
2023-06-27 13:15:26,505 [ForkJoinPool.commonPool-worker-23]
ERROR c.datahub.graphql.GraphQLController:107 -
Errors while executing graphQL query: \"query getSecretValues($input: GetSecretValuesInput!) {
getSecretValues(input: $input) { name value } }\",
result:
{errors=[{message=An unknown error occurred.,
locations=[{line=3,column=17}],
path=[getSecretValues],
extensions={code=500,
type=SERVER_ERROR,
classification=DataFetchingException}}],
data={getSecretValues=null},
extensions={tracing={version=1,
startTime=2023-06-27T13:15:26.501925Z,
endTime=2023-06-27T13:15:26.505901Z,
duration=3978201,
parsing={startOffset=244000,
duration=211824},
validation={startOffset=425286, duration=162932},
execution={resolvers=[{path=[getSecretValues], parentType=Query, returnType=[SecretValue!],
fieldName=getSecretValues,
startOffset=507388,
duration=2968027}]}}}},
errors: [DataHubGraphQLError{path=[getSecretValues], code=SERVER_ERROR, locations=[SourceLocation{line=3, column=17}]}]",
white-guitar-82227
06/27/2023, 2:07 PM
2023-06-27 14:04:13,805 [I/O dispatcher 1] INFO c.l.m.s.e.update.BulkListener:47 - Successfully fed bulk request. Number of events: 3 Took time ms: -1
aloof-gpu-11378
06/27/2023, 2:09 PM
urn:li:dataHubSecret:?
aloof-gpu-11378
06/27/2023, 2:23 PM
listSecrets
docs here: https://datahubproject.io/docs/graphql/queries/#listsecrets
aloof-gpu-11378
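The listSecrets query from the linked docs is a quick way to confirm the secret actually exists under the expected name. A sketch of the request body (the GMS endpoint and bearer token are deployment-specific placeholders, and the selected fields follow the documented ListSecretsResult shape):

```python
import json

# GraphQL query per the listSecrets docs; start/count paginate the results.
LIST_SECRETS_QUERY = """
query listSecrets($input: ListSecretsInput!) {
  listSecrets(input: $input) {
    start
    count
    total
    secrets { urn name description }
  }
}
"""

def build_list_secrets_request(start: int = 0, count: int = 20) -> str:
    # Body to POST to http://<gms-host>:8080/api/graphql with an
    # "Authorization: Bearer <token>" header (both placeholders).
    return json.dumps({
        "query": LIST_SECRETS_QUERY,
        "variables": {"input": {"start": start, "count": count}},
    })

print(build_list_secrets_request())
```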
06/27/2023, 2:24 PM
getSecretValues?
aloof-gpu-11378
06/27/2023, 2:37 PM
getSecretValues?
white-guitar-82227
06/27/2023, 2:42 PM
2023-06-27 13:22:10,352 [ForkJoinPool.commonPool-worker-27] ERROR c.l.d.g.e.DataHubDataFetcherExceptionHandler:21 - Failed to execute DataFetcher
java.util.concurrent.CompletionException: java.lang.RuntimeException: Failed to perform update against input com.linkedin.datahub.graphql.generated.GetSecretValuesInput@7125083c
at java.base/java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:314)
at java.base/java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:319)
at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1702)
at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.exec(CompletableFuture.java:1692)
at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: java.lang.RuntimeException: Failed to perform update against input com.linkedin.datahub.graphql.generated.GetSecretValuesInput@7125083c
at com.linkedin.datahub.graphql.resolvers.ingest.secret.GetSecretValuesResolver.lambda$get$2(GetSecretValuesResolver.java:87)
at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1700)
... 6 common frames omitted
Caused by: java.lang.RuntimeException: Failed to decrypt value using provided secret!
at com.linkedin.metadata.secret.SecretService.decrypt(SecretService.java:80)
at com.linkedin.datahub.graphql.resolvers.ingest.secret.GetSecretValuesResolver.decryptSecret(GetSecretValuesResolver.java:95)
at com.linkedin.datahub.graphql.resolvers.ingest.secret.GetSecretValuesResolver.lambda$get$1(GetSecretValuesResolver.java:77)
at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:195)
at java.base/java.util.HashMap$ValueSpliterator.forEachRemaining(HashMap.java:1693)
at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484)
at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474)
at java.base/java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:913)
at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:578)
at com.linkedin.datahub.graphql.resolvers.ingest.secret.GetSecretValuesResolver.lambda$get$2(GetSecretValuesResolver.java:85)
... 7 common frames omitted
Caused by: javax.crypto.BadPaddingException: Given final block not properly padded. Such issues can arise if a bad key is used during decryption.
at java.base/com.sun.crypto.provider.CipherCore.unpad(CipherCore.java:975)
at java.base/com.sun.crypto.provider.CipherCore.fillOutputBuffer(CipherCore.java:1056)
at java.base/com.sun.crypto.provider.CipherCore.doFinal(CipherCore.java:853)
at java.base/com.sun.crypto.provider.AESCipher.engineDoFinal(AESCipher.java:446)
at java.base/javax.crypto.Cipher.doFinal(Cipher.java:2202)
at com.linkedin.metadata.secret.SecretService.decrypt(SecretService.java:78)
... 17 common frames omitted
2023-06-27 13:22:10,352 [ForkJoinPool.commonPool-worker-1] ERROR c.datahub.graphql.GraphQLController:107 - Errors while executing graphQL query: "query getSecretValues($input: GetSecretValuesInput!) {\n\n getSecretValues(input: $input) {\n\n name\n\n value\n\n }\n\n }", result: {errors=[{message=An unknown error occurred., locations=[{line=3, column=17}], path=[getSecretValues], extensions={code=500, type=SERVER_ERROR, classification=DataFetchingException}}], data={getSecretValues=null}, extensions={tracing={version=1, startTime=2023-06-27T13:22:10.349460Z, endTime=2023-06-27T13:22:10.352648Z, duration=3190479, parsing={startOffset=185674, duration=154469}, validation={startOffset=400365, duration=195731}, execution={resolvers=[{path=[getSecretValues], parentType=Query, returnType=[SecretValue!], fieldName=getSecretValues, startOffset=483188, duration=2147348}]}}}}, errors: [DataHubGraphQLError{path=[getSecretValues], code=SERVER_ERROR, locations=[SourceLocation{line=3, column=17}]}]
2023-06-27 13:22:10,356 [qtp1645547422-520] INFO c.l.m.r.entity.AspectResource:171 - INGEST PROPOSAL proposal: {aspectName=dataHubExecutionRequestResult, entityKeyAspect={contentType=application/json, value=ByteString(length=46,bytes=7b226964...6237227d)}, entityType=dataHubExecutionRequest, aspect={contentType=application/json, value=ByteString(length=1582,bytes=7b227374...2032327d)}, changeType=UPSERT}
2023-06-27 13:22:10,369 [pool-13-thread-13] INFO c.l.m.filter.RestliLoggingFilter:55 - POST /aspects?action=ingestProposal - ingestProposal - 200 - 13ms
2023-06-27 13:22:10,774 [I/O dispatcher 1] INFO c.l.m.s.e.update.BulkListener:47 - Successfully fed bulk request. Number of events: 8 Took time ms: -1
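The BadPaddingException in the trace above means GMS is decrypting stored secret values with a different encryption key than the one they were encrypted with (typically because the auto-generated key changed across deploys). Once a stable key is in place, secrets stored under the old key have to be deleted and re-created. A sketch of the createSecret GraphQL request body that re-creates one (endpoint and auth token are placeholders; the mutation returns the new secret's urn as a string):

```python
import json

# createSecret mutation from the DataHub GraphQL API; the return value
# is a plain String (the secret urn), so no subfields are selected.
CREATE_SECRET_MUTATION = """
mutation createSecret($input: CreateSecretInput!) {
  createSecret(input: $input)
}
"""

def build_create_secret_request(name: str, value: str) -> str:
    # Body to POST to the GMS /api/graphql endpoint.
    return json.dumps({
        "query": CREATE_SECRET_MUTATION,
        "variables": {"input": {"name": name, "value": value}},
    })

print(build_create_secret_request("SINK_TOKEN", "<new token value>"))
```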
white-guitar-82227
07/06/2023, 8:26 AM
elasticsearch:
  enabled: false
neo4j:
  enabled: false
neo4j-community:
  enabled: false
mysql:
  enabled: false
postgresql:
  enabled: false
cp-helm-charts:
  enabled: true
  cp-schema-registry:
    enabled: true
    resources:
      requests:
        cpu: "100m"
        memory: "512Mi"
      limits:
        memory: "512Mi"
    kafka:
      bootstrapServers: "b-3.datahubcluster.xxxxxx.c3.kafka.eu-central-1.amazonaws.com:9092,b-2.datahubcluster.xxxxxx.c3.kafka.eu-central-1.amazonaws.com:9092,b-1.datahubcluster.xxxxxx.c3.kafka.eu-central-1.amazonaws.com:9092"
  cp-kafka:
    enabled: false
  cp-zookeeper:
    enabled: false
  cp-kafka-rest:
    enabled: false
  cp-kafka-connect:
    enabled: false
  cp-ksql-server:
    enabled: false
  cp-control-center:
    enabled: false
kafka:
  enabled: false
white-guitar-82227
07/06/2023, 8:26 AM
peter@MacBookPro datahub % cat datahub-app-values.yaml
datahub-gms:
  enabled: true
  service:
    type: ClusterIP
  image:
    repository: linkedin/datahub-gms
  extraEnvs:
    - name: METADATA_SERVICE_AUTH_ENABLED
      value: 'true'
  resources:
    limits:
      memory: 4Gi
    requests:
      cpu: 100m
      memory: 4Gi
datahub-frontend:
  enabled: true
  image:
    repository: linkedin/datahub-frontend-react
  extraEnvs:
    - name: METADATA_SERVICE_AUTH_ENABLED
      value: 'true'
  resources:
    limits:
      memory: 1400Mi
    requests:
      cpu: 100m
      memory: 1400Mi
  ingress:
    enabled: true
    annotations:
      alb.ingress.kubernetes.io/ssl-redirect: '443'
      alb.ingress.kubernetes.io/certificate-arn: 'arn:aws:acm:eu-central-1:xxxxxx'
      alb.ingress.kubernetes.io/group.name: infrastructure
      alb.ingress.kubernetes.io/healthcheck-path: '/admin'
      alb.ingress.kubernetes.io/listen-ports: '[{"HTTP": 80}, {"HTTPS":443}]'
      alb.ingress.kubernetes.io/scheme: 'internal'
      alb.ingress.kubernetes.io/target-type: 'ip'
      kubernetes.io/ingress.class: 'alb'
    hosts:
      - host: datahub.example.local
        paths:
          - '/*'
  extraVolumes:
    - name: datahub-users
      secret:
        defaultMode: 0444
        secretName: datahub-users-secret
  extraVolumeMounts:
    - name: datahub-users
      mountPath: /datahub-frontend/conf/user.props
      subPath: user.props
  service:
    type: "ClusterIP"
acryl-datahub-actions:
  enabled: true
  image:
    repository: acryldata/datahub-actions
    tag: "v0.0.11"
  resources:
    limits:
      memory: 2Gi
    requests:
      cpu: 300m
      memory: 2Gi
datahub-mae-consumer:
  image:
    repository: linkedin/datahub-mae-consumer
  resources:
    limits:
      memory: 1536Mi
    requests:
      cpu: 100m
      memory: 1536Mi
datahub-mce-consumer:
  image:
    repository: linkedin/datahub-mce-consumer
  resources:
    limits:
      memory: 1536Mi
    requests:
      cpu: 100m
      memory: 1536Mi
datahub-ingestion-cron:
  enabled: false
  image:
    repository: acryldata/datahub-ingestion
elasticsearchSetupJob:
  enabled: true
  image:
    repository: linkedin/datahub-elasticsearch-setup
  extraEnvs:
    - name: USE_AWS_ELASTICSEARCH
      value: "true"
  resources:
    limits:
      cpu: 500m
      memory: 512Mi
    requests:
      cpu: 300m
      memory: 512Mi
  podSecurityContext:
    fsGroup: 1000
  securityContext:
    runAsUser: 1000
  podAnnotations: {}
kafkaSetupJob:
  enabled: true
  image:
    repository: linkedin/datahub-kafka-setup
  resources:
    limits:
      cpu: 500m
      memory: 1024Mi
    requests:
      cpu: 300m
      memory: 1024Mi
  podSecurityContext:
    fsGroup: 1000
  securityContext:
    runAsUser: 1000
  podAnnotations: {}
mysqlSetupJob:
  enabled: false
  image:
    repository: acryldata/datahub-mysql-setup
  resources:
    limits:
      cpu: 500m
      memory: 512Mi
    requests:
      cpu: 300m
      memory: 512Mi
  podSecurityContext:
    fsGroup: 1000
  securityContext:
    runAsUser: 1000
  podAnnotations: {}
postgresqlSetupJob:
  enabled: true
  image:
    repository: acryldata/datahub-postgres-setup
  resources:
    limits:
      cpu: 500m
      memory: 512Mi
    requests:
      cpu: 300m
      memory: 512Mi
  podSecurityContext:
    fsGroup: 1000
  securityContext:
    runAsUser: 1000
  podAnnotations: {}
datahubUpgrade:
  enabled: true
  image:
    repository: acryldata/datahub-upgrade
  batchSize: 1000
  batchDelayMs: 100
  noCodeDataMigration:
    sqlDbType: "POSTGRES"
  podSecurityContext: {}
  securityContext: {}
  podAnnotations: {}
  restoreIndices:
    resources:
      limits:
        cpu: 500m
        memory: 512Mi
      requests:
        cpu: 300m
        memory: 512Mi
global:
  strict_mode: true
  graph_service_impl: elasticsearch
  datahub_analytics_enabled: false
  datahub_standalone_consumers_enabled: false
  elasticsearch:
    host: "vpc-datahub-xxxxxxxxxxxxxxxxxxxxxxxxxx.eu-central-1.es.amazonaws.com"
    port: "443"
    skipcheck: "false"
    insecure: "false"
    useSSL: "true"
    region: eu-central-1
    auth:
      username: datahub
      password:
        secretRef: opensearch-secret
        secretKey: opensearch-password
    index:
      enableMappingsReindex: true
      enableSettingsReindex: true
      upgrade:
        cloneIndices: true
        allowDocCountMismatch: false
    search:
      maxTermBucketSize: 20
      exactMatch:
        exclusive: false
        withPrefix: true
        exactFactor: 2.0
        prefixFactor: 1.6
        caseSensitivityFactor: 0.7
        enableStructured: true
      graph:
        timeoutSeconds: 50
        batchSize: 1000
        maxResult: 10000
  kafka:
    bootstrap:
      server: "b-2.datahubcluster.xxxxxx.c3.kafka.eu-central-1.amazonaws.com:9092,b-3.datahubcluster.xxxxxx.c3.kafka.eu-central-1.amazonaws.com:9092,b-1.datahubcluster.xxxxxx.c3.kafka.eu-central-1.amazonaws.com:9092"
    zookeeper:
      server: "z-3.datahubcluster.xxxxxx.c3.kafka.eu-central-1.amazonaws.com:2181,z-2.datahubcluster.xxxxxx.c3.kafka.eu-central-1.amazonaws.com:2181,z-1.datahubcluster.xxxxxx.c3.kafka.eu-central-1.amazonaws.com:2181"
    topics:
      metadata_change_event_name: "MetadataChangeEvent_v4"
      failed_metadata_change_event_name: "FailedMetadataChangeEvent_v4"
      metadata_audit_event_name: "MetadataAuditEvent_v4"
      datahub_usage_event_name: "DataHubUsageEvent_v1"
      metadata_change_proposal_topic_name: "MetadataChangeProposal_v1"
      failed_metadata_change_proposal_topic_name: "FailedMetadataChangeProposal_v1"
      metadata_change_log_versioned_topic_name: "MetadataChangeLog_Versioned_v1"
      metadata_change_log_timeseries_topic_name: "MetadataChangeLog_Timeseries_v1"
      platform_event_topic_name: "PlatformEvent_v1"
      datahub_upgrade_history_topic_name: "DataHubUpgradeHistory_v1"
    schemaregistry:
      url: "http://datahub-prerequisites-cp-schema-registry:8081"
      type: KAFKA
    partitions: 3
    replicationFactor: 3
  neo4j:
    host: "prerequisites-neo4j-community:7474"
    uri: "bolt://prerequisites-neo4j-community"
    username: "neo4j"
    password:
      secretRef: neo4j-secrets
      secretKey: neo4j-password
  sql:
    datasource:
      host: "datahubdb.example.local:5432"
      hostForpostgresqlClient: "datahubdb.example.local"
      port: "5432"
      url: "jdbc:postgresql://datahubdb.example.local:5432/datahub"
      driver: "org.postgresql.Driver"
      username: "svc_datahub"
      password:
        secretRef: postgres-secret
        secretKey: postgres-root-password
  datahub:
    gms:
      port: "8080"
      nodePort: "30001"
    monitoring:
      enablePrometheus: true
    mae_consumer:
      port: "9091"
      nodePort: "30002"
    encryptionKey:
      secretRef: "datahub-encryption-secrets"
      secretKey: "encryption_key_secret"
      provisionSecret:
        enabled: true
        autoGenerate: true
    managed_ingestion:
      enabled: true
    metadata_service_authentication:
      enabled: false
      systemClientId: "__datahub_system"
      systemClientSecret:
        secretRef: "datahub-auth-secrets"
        secretKey: "token_service_signing_key"
      tokenService:
        signingKey:
          secretRef: "datahub-auth-secrets"
          secretKey: "token_service_signing_key"
        salt:
          secretRef: "datahub-auth-secrets"
          secretKey: "token_service_salt"
      provisionSecrets:
        enabled: true
        autoGenerate: true
  alwaysEmitChangeLog: true
  enableGraphDiffMode: true
incalculable-ocean-74010
07/10/2023, 9:30 AM
Secret auto-generation relies on Helm's lookup function, which requires direct access to the underlying Kubernetes cluster. If that access is somehow not direct, it will not work.
To fix this issue I would recommend setting global.datahub.encryptionKey.provisionSecret.autoGenerate and global.datahub.metadata_service_authentication.provisionSecrets.autoGenerate to false. This forces you to either specify the secret values in the `values.yaml` file OR provision the secrets yourself and reference them like: https://github.com/acryldata/datahub-helm/blob/d56333b25996172ae68c01b4aa2f3d0d7de51b05/charts/datahub/values.yaml#L576
For more information on the Helm lookup function, see: https://helm.sh/docs/chart_template_guide/functions_and_pipelines/#using-the-lookup-function
white-guitar-82227
white-guitar-82227
07/11/2023, 1:23 PM
2023-07-11 13:18:31,250 [qtp1645547422-279] INFO c.l.m.r.entity.AspectResource:171 - INGEST PROPOSAL proposal: {aspectName=dataHubExecutionRequestResult, entityKeyAspect={contentType=application/json, value=ByteString(length=46,bytes=7b226964...6138227d)}, entityType=dataHubExecutionRequest, aspect={contentType=application/json, value=ByteString(length=51,bytes=7b227374...3234367d)}, changeType=UPSERT}
2023-07-11 13:18:31,261 [pool-13-thread-5] INFO c.l.m.filter.RestliLoggingFilter:55 - POST /aspects?action=ingestProposal - ingestProposal - 200 - 11ms
2023-07-11 13:18:31,269 [ForkJoinPool.commonPool-worker-19] ERROR c.l.d.g.e.DataHubDataFetcherExceptionHandler:21 - Failed to execute DataFetcher
java.util.concurrent.CompletionException: java.lang.RuntimeException: Failed to perform update against input com.linkedin.datahub.graphql.generated.GetSecretValuesInput@6e4557d4
at java.base/java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:314)
at java.base/java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:319)
at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1702)
at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.exec(CompletableFuture.java:1692)
at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: java.lang.RuntimeException: Failed to perform update against input com.linkedin.datahub.graphql.generated.GetSecretValuesInput@6e4557d4
at com.linkedin.datahub.graphql.resolvers.ingest.secret.GetSecretValuesResolver.lambda$get$2(GetSecretValuesResolver.java:87)
at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1700)
... 6 common frames omitted
Caused by: java.lang.RuntimeException: Failed to decrypt value using provided secret!
at com.linkedin.metadata.secret.SecretService.decrypt(SecretService.java:80)
at com.linkedin.datahub.graphql.resolvers.ingest.secret.GetSecretValuesResolver.decryptSecret(GetSecretValuesResolver.java:95)
at com.linkedin.datahub.graphql.resolvers.ingest.secret.GetSecretValuesResolver.lambda$get$1(GetSecretValuesResolver.java:77)
at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:195)
at java.base/java.util.HashMap$ValueSpliterator.forEachRemaining(HashMap.java:1693)
at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484)
at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474)
at java.base/java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:913)
at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:578)
at com.linkedin.datahub.graphql.resolvers.ingest.secret.GetSecretValuesResolver.lambda$get$2(GetSecretValuesResolver.java:85)
... 7 common frames omitted
Caused by: javax.crypto.BadPaddingException: Given final block not properly padded. Such issues can arise if a bad key is used during decryption.
at java.base/com.sun.crypto.provider.CipherCore.unpad(CipherCore.java:975)
at java.base/com.sun.crypto.provider.CipherCore.fillOutputBuffer(CipherCore.java:1056)
at java.base/com.sun.crypto.provider.CipherCore.doFinal(CipherCore.java:853)
at java.base/com.sun.crypto.provider.AESCipher.engineDoFinal(AESCipher.java:446)
at java.base/javax.crypto.Cipher.doFinal(Cipher.java:2202)
at com.linkedin.metadata.secret.SecretService.decrypt(SecretService.java:78)
... 17 common frames omitted
2023-07-11 13:18:31,269 [ForkJoinPool.commonPool-worker-23] ERROR c.datahub.graphql.GraphQLController:107 - Errors while executing graphQL query: "query getSecretValues($input: GetSecretValuesInput!) {\n\n getSecretValues(input: $input) {\n\n name\n\n value\n\n }\n\n }", result: {errors=[{message=An unknown error occurred., locations=[{line=3, column=17}], path=[getSecretValues], extensions={code=500, type=SERVER_ERROR, classification=DataFetchingException}}], data={getSecretValues=null}, extensions={tracing={version=1, startTime=2023-07-11T13:18:31.265782Z, endTime=2023-07-11T13:18:31.269727Z, duration=3947655, parsing={startOffset=330873, duration=292201}, validation={startOffset=572721, duration=220571}, execution={resolvers=[{path=[getSecretValues], parentType=Query, returnType=[SecretValue!], fieldName=getSecretValues, startOffset=670785, duration=2645164}]}}}}, errors: [DataHubGraphQLError{path=[getSecretValues], code=SERVER_ERROR, locations=[SourceLocation{line=3, column=17}]}]
2023-07-11 13:18:31,274 [qtp1645547422-279] INFO c.l.m.r.entity.AspectResource:171 - INGEST PROPOSAL proposal: {aspectName=dataHubExecutionRequestResult, entityKeyAspect={contentType=application/json, value=ByteString(length=46,bytes=7b226964...6138227d)}, entityType=dataHubExecutionRequest, aspect={contentType=application/json, value=ByteString(length=1568,bytes=7b227374...2032377d)}, changeType=UPSERT}
2023-07-11 13:18:31,290 [pool-13-thread-4] INFO c.l.m.filter.RestliLoggingFilter:55 - POST /aspects?action=ingestProposal - ingestProposal - 200 - 16ms
2023-07-11 13:18:31,691 [I/O dispatcher 1] INFO c.l.m.s.e.update.BulkListener:47 - Successfully fed bulk request. Number of events: 7 Took time ms: -1
helpful-tent-87247
07/11/2023, 1:54 PM
encryptionKey:
  secretRef: "datahub-encryption-secrets"
  secretKey: "encryption_key_secret"
  # Set to false if you'd like to provide your own secret.
  provisionSecret:
    enabled: false
    autoGenerate: false
    annotations: {}
    # Only specify if autoGenerate set to false
    # secretValues:
    #   encryptionKey: <encryption key value>
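When providing your own secret as in the snippet above, the referenced Kubernetes Secret must exist with a stable key before the chart is installed. A hypothetical helper that generates a random key and prints the matching kubectl command (the secret and key names are the ones from the encryptionKey block):

```python
import secrets

def kubectl_create_secret_cmd(secret_name: str, secret_key: str) -> str:
    # Generate a random 32-byte URL-safe key and emit the kubectl command
    # that stores it under the name/key the chart references. As long as
    # this Secret is never regenerated, GMS keeps decrypting old values.
    key = secrets.token_urlsafe(32)
    return (
        f"kubectl create secret generic {secret_name} "
        f"--from-literal={secret_key}={key}"
    )

print(kubectl_create_secret_cmd("datahub-encryption-secrets", "encryption_key_secret"))
```

Note that any DataHub secrets encrypted under a previous key still have to be deleted and re-created after switching keys.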