icy-train-40359 (05/24/2023, 4:05 PM):
[2023-05-24 16:52:58,950] INFO {datahub.cli.ingest_cli:173} - DataHub CLI version: 0.10.2.2
[2023-05-24 16:52:59,141] INFO {datahub.ingestion.run.pipeline:204} - Sink configured successfully. DataHubRestEmitter: configured to talk to http://localhost:8080 with token: eyJh**********O9bU
[2023-05-24 16:53:03,611] ERROR {datahub.entrypoints:195} - Command failed: Failed to find a registered source for type snowflake: Updating forward references for asset model PandasCSVAsset raised TypeError: issubclass() arg 1 must be a class
And here's the recipe:
source:
  type: "snowflake"
  config:
    username: "${SNOWFLAKE_USERID}"
    password: "${SNOWFLAKE_PASSWORD}"
    account_id: "real_account_name"
    warehouse: "bi_transforming_development"
    database_pattern:
      allow:
        - "^ANALYTICS_DBT_PROD$"
    schema_pattern:
      allow:
        - "^PROD$"
    role: "analyst"
sink:
  type: "datahub-rest"
  config:
    token: "${TOKEN}"
Using the UI with the same credentials works, and the environment variables appear to be substituted correctly, at least in the case of the token (hard-coding the values in the recipe does not fix it either). Thank you for any help!
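
For what it's worth, the traceback above is raised while the snowflake plugin is being imported, not while the recipe is parsed: PandasCSVAsset is a great_expectations model, so this looks like an incompatible great_expectations/pydantic combination already installed in the same Python environment. A minimal way to test that theory is a clean virtualenv; the recipe path and the version pin below are placeholders:

python3 -m venv datahub-venv && source datahub-venv/bin/activate
pip install --upgrade 'acryl-datahub[snowflake]==0.10.2.2'
datahub check plugins    # snowflake should now be listed without an error
datahub ingest -c recipe.yaml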

adamant-musician-90219 (05/25/2023, 7:07 AM):
monitoring:
  enablePrometheus: true
Like this, but are there any docs on how to set up the whole monitoring process this enables in DataHub?
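
For reference, that flag lives under global.datahub.monitoring in the official datahub helm chart values; turning it on exposes the metrics endpoints on the DataHub pods, but a Prometheus instance still has to be pointed at them. A minimal sketch of the values side, assuming that chart (the scrape configuration depends on your Prometheus setup and is not shown):

global:
  datahub:
    monitoring:
      enablePrometheus: true

The "Monitoring DataHub" page in the project docs (under docs/advanced) walks through the metrics and tracing setup in more detail.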

adventurous-pillow-74569 (05/25/2023, 3:18 PM):
datahub.ingestion.run.pipeline.PipelineInitError: Failed to find a registered source for type bigquery: 'str' object is not callable
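
As with the snowflake case above, "Failed to find a registered source" usually wraps an import-time failure inside the plugin rather than a problem with the recipe itself. Assuming a recent CLI (the recipe path is a placeholder), the underlying traceback can usually be surfaced with:

datahub check plugins                    # shows which plugins failed to load
datahub --debug ingest -c recipe.yaml    # --debug prints the full stack trace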

elegant-minister-82709 (05/25/2023, 6:52 PM):
{
"type": "server",
"timestamp": "2023-05-25T18:01:45,859Z",
"level": "ERROR",
"component": "o.e.i.g.DatabaseNodeService",
"cluster.name": "elasticsearch",
"node.name": "elasticsearch-master-0",
"message": "failed to retrieve database [GeoLite2-Country.mmdb]",
"stacktrace": [
"org.elasticsearch.cluster.block.ClusterBlockException: blocked by: [SERVICE_UNAVAILABLE/1/state not recovered / initialized];",
"at org.elasticsearch.cluster.block.ClusterBlocks.globalBlockedException(ClusterBlocks.java:179) ~[elasticsearch-7.17.3.jar:7.17.3]",
"at org.elasticsearch.cluster.block.ClusterBlocks.globalBlockedRaiseException(ClusterBlocks.java:165) ~[elasticsearch-7.17.3.jar:7.17.3]",
"at org.elasticsearch.action.search.TransportSearchAction.executeSearch(TransportSearchAction.java:929) ~[elasticsearch-7.17.3.jar:7.17.3]",
"at org.elasticsearch.action.search.TransportSearchAction.executeLocalSearch(TransportSearchAction.java:763) ~[elasticsearch-7.17.3.jar:7.17.3]",
"at org.elasticsearch.action.search.TransportSearchAction.lambda$executeRequest$6(TransportSearchAction.java:399) ~[elasticsearch-7.17.3.jar:7.17.3]",
"at org.elasticsearch.action.ActionListener$1.onResponse(ActionListener.java:136) ~[elasticsearch-7.17.3.jar:7.17.3]",
"at org.elasticsearch.index.query.Rewriteable.rewriteAndFetch(Rewriteable.java:112) ~[elasticsearch-7.17.3.jar:7.17.3]",
"at org.elasticsearch.index.query.Rewriteable.rewriteAndFetch(Rewriteable.java:77) ~[elasticsearch-7.17.3.jar:7.17.3]",
"at org.elasticsearch.action.search.TransportSearchAction.executeRequest(TransportSearchAction.java:487) ~[elasticsearch-7.17.3.jar:7.17.3]",
"at org.elasticsearch.action.search.TransportSearchAction.doExecute(TransportSearchAction.java:285) ~[elasticsearch-7.17.3.jar:7.17.3]",
"at org.elasticsearch.action.search.TransportSearchAction.doExecute(TransportSearchAction.java:101) ~[elasticsearch-7.17.3.jar:7.17.3]",
"at org.elasticsearch.action.support.TransportAction$RequestFilterChain.proceed(TransportAction.java:179) ~[elasticsearch-7.17.3.jar:7.17.3]",
"at org.elasticsearch.action.support.ActionFilter$Simple.apply(ActionFilter.java:53) ~[elasticsearch-7.17.3.jar:7.17.3]",
"at org.elasticsearch.action.support.TransportAction$RequestFilterChain.proceed(TransportAction.java:177) ~[elasticsearch-7.17.3.jar:7.17.3]",
"at org.elasticsearch.xpack.security.action.filter.SecurityActionFilter.apply(SecurityActionFilter.java:145) ~[?:?]",
"at org.elasticsearch.action.support.TransportAction$RequestFilterChain.proceed(TransportAction.java:177) ~[elasticsearch-7.17.3.jar:7.17.3]",
"at org.elasticsearch.action.support.TransportAction.execute(TransportAction.java:154) ~[elasticsearch-7.17.3.jar:7.17.3]",
"at org.elasticsearch.action.support.TransportAction.execute(TransportAction.java:82) ~[elasticsearch-7.17.3.jar:7.17.3]",
"at org.elasticsearch.client.node.NodeClient.executeLocally(NodeClient.java:95) ~[elasticsearch-7.17.3.jar:7.17.3]",
"at org.elasticsearch.client.node.NodeClient.doExecute(NodeClient.java:73) ~[elasticsearch-7.17.3.jar:7.17.3]",
"at org.elasticsearch.client.support.AbstractClient.execute(AbstractClient.java:407) ~[elasticsearch-7.17.3.jar:7.17.3]",
"at org.elasticsearch.client.FilterClient.doExecute(FilterClient.java:57) ~[elasticsearch-7.17.3.jar:7.17.3]",
"at org.elasticsearch.client.OriginSettingClient.doExecute(OriginSettingClient.java:51) ~[elasticsearch-7.17.3.jar:7.17.3]",
"at org.elasticsearch.client.support.AbstractClient.execute(AbstractClient.java:407) ~[elasticsearch-7.17.3.jar:7.17.3]",
"at org.elasticsearch.client.support.AbstractClient.execute(AbstractClient.java:392) ~[elasticsearch-7.17.3.jar:7.17.3]",
"at org.elasticsearch.client.support.AbstractClient.search(AbstractClient.java:542) ~[elasticsearch-7.17.3.jar:7.17.3]",
"at org.elasticsearch.ingest.geoip.DatabaseNodeService.lambda$retrieveDatabase$11(DatabaseNodeService.java:367) [ingest-geoip-7.17.3.jar:7.17.3]",
"at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingRunnable.run(ThreadContext.java:718) [elasticsearch-7.17.3.jar:7.17.3]",
"at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]",
"at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]",
"at java.lang.Thread.run(Thread.java:833) [?:?]"
]
}
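
The ClusterBlockException ("state not recovered / initialized") means the Elasticsearch node has not finished cluster state recovery, so every search, including the GeoIP database fetch above, is rejected; the GeoLite2 message is a symptom, not the cause. Some first checks, assuming kubectl access and the service/label names implied by the log (adjust to your deployment):

kubectl get pods -l app=elasticsearch-master
kubectl port-forward svc/elasticsearch-master 9200:9200 &
curl -s 'http://localhost:9200/_cluster/health?pretty'
curl -s 'http://localhost:9200/_cat/indices?v'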

bitter-waitress-17567 (05/25/2023, 7:02 PM):
PyPi package potentially vulnerable to dependency confusion attack | acryl-datahub-actions datahub-prod-acryl-datahub-actions

brainy-balloon-97302 (05/25/2023, 9:40 PM):
'failures': {'s3://aws-glue-assets-XXXXXX-us-west-2/scripts/Untitled job.py': ['Unable to download DAG for Glue job from s3://aws-glue-assets-XXXXXX-us-west-2/scripts/Untitled job.py, so job subtasks and lineage will be missing: An error occurred (NoSuchKey) when calling the GetObject operation: The specified key does not exist.']}
I don't have that file in S3, nor a Glue job called "Untitled job.py", so I am trying to see what I can do to resolve it. The rest of the metadata is being pulled over, but it's annoying that it gets marked as a failure.
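
If the script DAG is not actually needed, one possible workaround is disabling transform/DAG extraction in the glue recipe so the connector stops fetching job scripts from S3 altogether; extract_transforms is the option I would verify against the Glue source docs for your CLI version (region and flag shown as assumptions):

source:
  type: glue
  config:
    aws_region: "us-west-2"
    extract_transforms: false   # skip downloading job scripts / DAGs from S3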