elegant-nightfall-29115
05/23/2023, 10:50 PM
21:56:03.265 [ForkJoinPool.commonPool-worker-5] ERROR c.d.a.a.AuthServiceController:314 - Failed to verify credentials for native user urn:li:corpuser:jbolesjc
java.lang.RuntimeException: Failed to decrypt value using provided secret!
at com.linkedin.metadata.secret.SecretService.decrypt(SecretService.java:80)
at com.datahub.authentication.user.NativeUserService.doesPasswordMatch(NativeUserService.java:200)
at com.datahub.auth.authentication.AuthServiceController.lambda$verifyNativeUserCredentials$3(AuthServiceController.java:310)
at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1700)
at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.exec(CompletableFuture.java:1692)
at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: javax.crypto.BadPaddingException: Given final block not properly padded. Such issues can arise if a bad key is used during decryption.
at java.base/com.sun.crypto.provider.CipherCore.unpad(CipherCore.java:975)
at java.base/com.sun.crypto.provider.CipherCore.fillOutputBuffer(CipherCore.java:1056)
at java.base/com.sun.crypto.provider.CipherCore.doFinal(CipherCore.java:853)
at java.base/com.sun.crypto.provider.AESCipher.engineDoFinal(AESCipher.java:446)
at java.base/javax.crypto.Cipher.doFinal(Cipher.java:2202)
at com.linkedin.metadata.secret.SecretService.decrypt(SecretService.java:78)
... 9 common frames omitted
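The BadPaddingException above is the classic symptom of decrypting with a different AES key than the one used at encryption time; for native-user credentials this typically means the DataHub encryption secret changed, for example an auto-generated key that was regenerated on a reinstall. A hedged values.yaml sketch for pinning the key, assuming a Helm deployment and the chart's global.datahub.encryptionKey block (key names are unverified; check your chart version):

global:
  datahub:
    encryptionKey:
      secretRef: "datahub-encryption-secrets"   # pre-created k8s Secret holding the key
      secretKey: "encryption_key_secret"        # data key inside that Secret
      provisionSecret:
        enabled: false   # supply the key yourself so it survives reinstalls:
                         # a regenerated key cannot decrypt previously stored values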
green-soccer-37145
05/24/2023, 7:23 AM
- path: /datahub
  pathType: Prefix
  backend:
    service:
      name: datahub-service
      port:
        number: 9002
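For context, a minimal sketch of the full Ingress manifest such a path block sits in; the service name and port come from the snippet above, while the metadata name and host are hypothetical:

apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: datahub-ingress           # hypothetical name
spec:
  rules:
    - host: datahub.example.com   # hypothetical host
      http:
        paths:
          - path: /datahub
            pathType: Prefix
            backend:
              service:
                name: datahub-service
                port:
                  number: 9002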
powerful-battery-5070
05/24/2023, 9:47 PM
Unable to run quickstart - the following issues were detected:
- quickstart.sh or dev.sh is not running
If you think something went wrong, please file an issue at https://github.com/datahub-project/datahub/issues
or send a message in our Slack https://slack.datahubproject.io/
Be sure to attach the logs from /tmp/tmpbsyeu5qv.log
apuranik@cvia1dct001:~$ less /tmp/tmpbsyeu5qv.log
apuranik@cvia1dct001:~$ less /tmp/tmpbsyeu5qv.log
apuranik@cvia1dct001:~$ ls -l /tmp/tmpbsyeu5qv.log
-rw------- 1 apuranik docker 0 May 24 16:33 /tmp/tmpbsyeu5qv.log
apuranik@cvia1dct001:~$
Unfortunately, nothing is captured in the log, and I was not able to find anything online that could help with this issue. I have tried this twice (rebuilt the VM) and landed in the same spot. Any help with this will be greatly appreciated!
lemon-scooter-69730
05/25/2023, 11:04 AM
envFromSecrets:
  BIGQUERY_PRIVATE_KEY:      # <--- environment variable
    key: some-privatekey     # <--- the key in the k8s Secret's data
    secret: k8s-secret       # <--- the name of the k8s Secret
  BIGQUERY_PRIVATE_KEY_ID:
    key: ...
    secret: k8s-secret
We also use a ConfigMap for the recipe, where we then specify:
source:
  type: bigquery
  config:
    include_table_lineage: true
    include_usage_statistics: true
    include_tables: true
    include_views: true
    profiling:
      enabled: true
      profile_table_level_only: false
    stateful_ingestion:
      enabled: true
    credential:
      project_id: project_id
      private_key: ${BIGQUERY_PRIVATE_KEY}
      private_key_id: ${BIGQUERY_PRIVATE_KEY_ID}
    ...
When the pod runs we get UnboundVariable: 'BIGQUERY_PRIVATE_KEY: unbound variable'
Can someone who knows more about this than I do advise? If you have set something like this up before, any advice is welcome.
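One hedged reading of the error: the ${...} lookups in the recipe are resolved from the environment of the pod that actually executes the ingestion, so UnboundVariable suggests the variables were never injected into that pod. A sketch of wiring them onto the actions/executor deployment, assuming the chart exposes extraEnvs there (key names vary by chart version, and the second secret key below is hypothetical):

acryl-datahub-actions:
  extraEnvs:
    - name: BIGQUERY_PRIVATE_KEY
      valueFrom:
        secretKeyRef:
          name: k8s-secret
          key: some-privatekey
    - name: BIGQUERY_PRIVATE_KEY_ID
      valueFrom:
        secretKeyRef:
          name: k8s-secret
          key: some-privatekey-id   # hypothetical key name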
handsome-park-80602
05/25/2023, 4:11 PM
Data name must: be all lowercase, start and end with a letter or number, and may contain letters, numbers, dashes, underscores, and dots.
I was wondering: is it okay to create all the topics specified in the documentation in lowercase, and can DataHub be configured to refer to the lowercased topics in the integration?
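For what it's worth, DataHub reads its topic names from environment variables, so pointing it at lowercased topics should be workable. A sketch under that assumption; the variable names below come from the DataHub Kafka configuration docs, but verify them against your version:

# Illustrative overrides for the GMS / consumer containers: create the topics
# in lowercase, then tell DataHub to use those names.
extraEnvs:
  - name: METADATA_CHANGE_PROPOSAL_TOPIC_NAME
    value: metadatachangeproposal_v1
  - name: METADATA_CHANGE_LOG_VERSIONED_TOPIC_NAME
    value: metadatachangelog_versioned_v1
  - name: METADATA_CHANGE_LOG_TIMESERIES_TOPIC_NAME
    value: metadatachangelog_timeseries_v1
  - name: PLATFORM_EVENT_TOPIC_NAME
    value: platformevent_v1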
witty-motorcycle-52108
05/25/2023, 4:11 PM
We're seeing OSError: [Errno 24] Too many open files errors in the actions container once it has been online for a while ingesting from a Glue data source. There seems to be a connection/file-descriptor leak somewhere; any thoughts?
bitter-waitress-17567
05/25/2023, 7:03 PM
PyPi package potentially vulnerable to dependency confusion attack | acryl-datahub-actions
Anyone received this warning before?
boundless-piano-94348
05/29/2023, 8:09 PM
A question about the neo4j.uri config in values.yaml: I see that, using Kubernetes, it can be specified as bolt://prerequisites-neo4j-community without specifying port 7687. How about using a host on a VM, which is typically an IP address? Do we need to explicitly specify the port, something like bolt://172.32.31.18:7687?
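For reference, bolt:// URIs default to port 7687 when none is given, which is why the Kubernetes example works without one; the same default applies to an IP-based host, but writing the port out is the unambiguous choice. A sketch, assuming the chart's global.neo4j keys (verify against your values.yaml):

global:
  neo4j:
    host: "172.32.31.18:7474"        # HTTP port, if your chart uses a separate host field
    uri: "bolt://172.32.31.18:7687"  # 7687 is the Bolt default, but explicit is safest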
chilly-boots-22585
05/30/2023, 12:34 PM
source:
  type: starburst-trino-usage
  config: null
  host_port: 'datamesh.conci**usquest.com:443'
  database: tpch
  username: ds-starburst
  password: 'lnBs****6Up'
  email_domain: ankit.rawat@co***rrus.com
  audit_catalog: tiny
  audit_schema: customer
sink:
  type: datahub-rest
  config:
    server: 'http://localhost:8080'
Now I am receiving this error:
~~~~ Execution Summary - RUN_INGEST ~~~~
Execution finished with errors.
{'exec_id': '85e65980-bd26-416d-9fbd-bb15840a12d3',
'infos': ['2023-05-30 12:32:00.237740 INFO: Starting execution for task with name=RUN_INGEST',
"2023-05-30 12:32:04.279953 INFO: Failed to execute 'datahub ingest'",
'2023-05-30 12:32:04.280103 INFO: Caught exception EXECUTING task_id=85e65980-bd26-416d-9fbd-bb15840a12d3, name=RUN_INGEST, '
'stacktrace=Traceback (most recent call last):\n'
' File "/usr/local/lib/python3.10/site-packages/acryl/executor/execution/default_executor.py", line 122, in execute_task\n'
' task_event_loop.run_until_complete(task_future)\n'
' File "/usr/local/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete\n'
' return future.result()\n'
' File "/usr/local/lib/python3.10/site-packages/acryl/executor/execution/sub_process_ingestion_task.py", line 231, in execute\n'
' raise TaskError("Failed to execute \'datahub ingest\'")\n'
"acryl.executor.execution.task.TaskError: Failed to execute 'datahub ingest'\n"],
'errors': []}
~~~~ Ingestion Logs ~~~~
Obtaining venv creation lock...
Acquired venv creation lock
venv setup time = 0
This version of datahub supports report-to functionality
datahub ingest run -c /tmp/datahub/ingest/85e65980-bd26-416d-9fbd-bb15840a12d3/recipe.yml --report-to /tmp/datahub/ingest/85e65980-bd26-416d-9fbd-bb15840a12d3/ingestion_report.json
[2023-05-30 12:32:02,233] INFO {datahub.cli.ingest_cli:165} - DataHub CLI version: 0.10.0
7 validation errors for PipelineConfig
source -> audit_catalog
  extra fields not permitted (type=value_error.extra)
source -> audit_schema
  extra fields not permitted (type=value_error.extra)
source -> database
  extra fields not permitted (type=value_error.extra)
source -> email_domain
  extra fields not permitted (type=value_error.extra)
source -> host_port
  extra fields not permitted (type=value_error.extra)
source -> password
  extra fields not permitted (type=value_error.extra)
source -> username
  extra fields not permitted (type=value_error.extra)
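The seven rejected fields match the keys sitting directly under source in the recipe above: config is null, so pydantic treats host_port, database, and the rest as unexpected extras. A sketch of the corrected nesting (same redacted values, just moved under config):

source:
  type: starburst-trino-usage
  config:
    host_port: 'datamesh.conci**usquest.com:443'
    database: tpch
    username: ds-starburst
    password: 'lnBs****6Up'
    email_domain: ankit.rawat@co***rrus.com
    audit_catalog: tiny
    audit_schema: customer
sink:
  type: datahub-rest
  config:
    server: 'http://localhost:8080'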