great-motherboard-71467
08/16/2022, 11:18 AM
thousands-solstice-2498
08/16/2022, 11:58 AM
dry-hair-98162
08/16/2022, 1:30 PM
eager-florist-67924
08/16/2022, 3:35 PM
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  annotations:
    kubernetes.io/ingress.class: azure/application-gateway
  name: datahub
spec:
  rules:
  - host: datahub.d.foo-bar.net
    http:
      paths:
      - backend:
          service:
            name: datahub-frontend
            port:
              name: http
        path: /
        pathType: Prefix
  tls:
  - hosts:
    - datahub.d.foo-bar.net
    secretName: agic-tls
but the UI seems to be redirecting to https://datahub.d.foo-bar.net/authenticate?redirect_uri=%2F
and in the logs I get:
! @7ojk5i2mk - Internal server error, for (GET) [/authenticate?redirect_uri=%2F] ->
play.api.UnexpectedException: Unexpected exception[TechnicalException: java.net.ConnectException: Connection refused (Connection refused)]
at play.api.http.HttpErrorHandlerExceptions$.throwableToUsefulException(HttpErrorHandler.scala:247)
at play.api.http.DefaultHttpErrorHandler.onServerError(HttpErrorHandler.scala:176)
at play.core.server.AkkaHttpServer$$anonfun$2.applyOrElse(AkkaHttpServer.scala:363)
at play.core.server.AkkaHttpServer$$anonfun$2.applyOrElse(AkkaHttpServer.scala:361)
at scala.concurrent.Future$$anonfun$recoverWith$1.apply(Future.scala:346)
at scala.concurrent.Future$$anonfun$recoverWith$1.apply(Future.scala:345)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:36)
at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:92)
at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply(BatchingExecutor.scala:92)
at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply(BatchingExecutor.scala:92)
at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:91)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:49)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: org.pac4j.core.exception.TechnicalException: java.net.ConnectException: Connection refused (Connection refused)
at org.pac4j.oidc.config.OidcConfiguration.internalInit(OidcConfiguration.java:136)
at org.pac4j.core.util.InitializableObject.init(InitializableObject.java:20)
at auth.sso.oidc.custom.CustomOidcClient.clientInit(CustomOidcClient.java:22)
at org.pac4j.core.client.IndirectClient.internalInit(IndirectClient.java:58)
at org.pac4j.core.util.InitializableObject.init(InitializableObject.java:20)
at org.pac4j.core.client.IndirectClient.getRedirectAction(IndirectClient.java:93)
at org.pac4j.core.client.IndirectClient.redirect(IndirectClient.java:79)
at controllers.AuthenticationController.redirectToIdentityProvider(AuthenticationController.java:151)
at controllers.AuthenticationController.authenticate(AuthenticationController.java:85)
at router.Routes$$anonfun$routes$1$$anonfun$applyOrElse$4$$anonfun$apply$4.apply(Routes.scala:374)
at router.Routes$$anonfun$routes$1$$anonfun$applyOrElse$4$$anonfun$apply$4.apply(Routes.scala:374)
at play.core.routing.HandlerInvokerFactory$$anon$3.resultCall(HandlerInvoker.scala:134)
at play.core.routing.HandlerInvokerFactory$$anon$3.resultCall(HandlerInvoker.scala:133)
at play.core.routing.HandlerInvokerFactory$JavaActionInvokerFactory$$anon$8$$anon$2$$anon$1.invocation(HandlerInvoker.scala:108)
at play.core.j.JavaAction$$anon$1.call(JavaAction.scala:88)
at play.http.DefaultActionCreator$1.call(DefaultActionCreator.java:31)
at play.core.j.JavaAction$$anonfun$9.apply(JavaAction.scala:138)
at play.core.j.JavaAction$$anonfun$9.apply(JavaAction.scala:138)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
at play.core.j.HttpExecutionContext$$anon$2.run(HttpExecutionContext.scala:56)
at play.api.libs.streams.Execution$trampoline$.execute(Execution.scala:70)
at play.core.j.HttpExecutionContext.execute(HttpExecutionContext.scala:48)
at scala.concurrent.impl.Future$.apply(Future.scala:31)
at scala.concurrent.Future$.apply(Future.scala:494)
at play.core.j.JavaAction.apply(JavaAction.scala:138)
at play.api.mvc.Action$$anonfun$apply$2.apply(Action.scala:96)
at play.api.mvc.Action$$anonfun$apply$2.apply(Action.scala:89)
at play.api.libs.streams.StrictAccumulator$$anonfun$mapFuture$2$$anonfun$1.apply(Accumulator.scala:174)
at play.api.libs.streams.StrictAccumulator$$anonfun$mapFuture$2$$anonfun$1.apply(Accumulator.scala:174)
at scala.util.Try$.apply(Try.scala:192)
at play.api.libs.streams.StrictAccumulator$$anonfun$mapFuture$2.apply(Accumulator.scala:174)
at play.api.libs.streams.StrictAccumulator$$anonfun$mapFuture$2.apply(Accumulator.scala:170)
at scala.Function1$$anonfun$andThen$1.apply(Function1.scala:52)
at play.api.libs.streams.StrictAccumulator.run(Accumulator.scala:207)
at play.core.server.AkkaHttpServer$$anonfun$14.apply(AkkaHttpServer.scala:357)
at play.core.server.AkkaHttpServer$$anonfun$14.apply(AkkaHttpServer.scala:355)
at akka.http.scaladsl.util.FastFuture$.akka$http$scaladsl$util$FastFuture$$strictTransform$1(FastFuture.scala:41)
at akka.http.scaladsl.util.FastFuture$$anonfun$transformWith$extension1$1.apply(FastFuture.scala:51)
at akka.http.scaladsl.util.FastFuture$$anonfun$transformWith$extension1$1.apply(FastFuture.scala:50)
... 13 common frames omitted
Caused by: java.net.ConnectException: Connection refused (Connection refused)
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:607)
at sun.security.ssl.SSLSocketImpl.connect(SSLSocketImpl.java:288)
at sun.net.NetworkClient.doConnect(NetworkClient.java:175)
at sun.net.www.http.HttpClient.openServer(HttpClient.java:463)
at sun.net.www.http.HttpClient.openServer(HttpClient.java:558)
at sun.net.www.protocol.https.HttpsClient.<init>(HttpsClient.java:264)
at sun.net.www.protocol.https.HttpsClient.New(HttpsClient.java:367)
at sun.net.www.protocol.https.AbstractDelegateHttpsURLConnection.getNewHttpClient(AbstractDelegateHttpsURLConnection.java:203)
at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1162)
at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:1056)
at sun.net.www.protocol.https.AbstractDelegateHttpsURLConnection.connect(AbstractDelegateHttpsURLConnection.java:189)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1570)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1498)
at sun.net.www.protocol.https.HttpsURLConnectionImpl.getInputStream(HttpsURLConnectionImpl.java:268)
at com.nimbusds.jose.util.DefaultResourceRetriever.getInputStream(DefaultResourceRetriever.java:249)
at com.nimbusds.jose.util.DefaultResourceRetriever.retrieveResource(DefaultResourceRetriever.java:201)
at org.pac4j.oidc.config.OidcConfiguration.internalInit(OidcConfiguration.java:133)
... 52 common frames omitted
Looking at it, I don't fully understand the issue. Can you please point me in the right direction? What am I missing?
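Editor's note: the bottom of the stack trace shows pac4j fetching the OIDC discovery document during client initialization (OidcConfiguration.internalInit → DefaultResourceRetriever), so "Connection refused" means the frontend container could not open a TCP connection to the host:port of its configured discovery URI (in DataHub's frontend this is typically the AUTH_OIDC_DISCOVERY_URI setting). A minimal sketch, assuming you run it from inside the frontend pod against your own discovery URL (the function name is illustrative, not part of DataHub):

```python
import urllib.error
import urllib.request


def probe_discovery(discovery_uri: str, timeout: float = 5.0) -> str:
    """Try to fetch an OIDC discovery document the way pac4j does at client
    init, and translate the common failure modes into plain English."""
    try:
        with urllib.request.urlopen(discovery_uri, timeout=timeout) as resp:
            return f"reachable (HTTP {resp.status})"
    except urllib.error.URLError as exc:
        if isinstance(exc.reason, ConnectionRefusedError):
            # Matches the java.net.ConnectException in the trace above: the TCP
            # handshake was rejected, so nothing is listening at that host:port.
            return "connection refused: nothing is listening on that host:port"
        return f"unreachable: {exc.reason}"
```

Running this from inside the datahub-frontend pod (e.g. via kubectl exec) against the configured discovery URL helps distinguish a DNS/egress/proxy problem from a wrong issuer URL or a dead identity provider.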
thx

busy-dusk-4970
08/16/2022, 5:42 PM
rapid-house-76230
08/16/2022, 7:49 PM
datahub-datahub-upgrade-job
(I ran helm upgrade --install --namespace datahub -f values.yaml datahub datahub/datahub --debug). I appreciate any help here.

nutritious-bird-77396
08/16/2022, 9:51 PM
datahub-frontend image using the tag
Here is the error 🧵
Any help on this would be great. Thanks!

white-hydrogen-24531
08/16/2022, 10:49 PM
clean-monkey-7245
08/17/2022, 7:34 AM
clean-monkey-7245
08/17/2022, 7:36 AM
bland-orange-13353
08/17/2022, 8:03 AM
bitter-insurance-49151
08/17/2022, 8:39 AM
white-hydrogen-24531
08/17/2022, 6:52 PM
steep-finland-24780
08/17/2022, 8:09 PM
METADATA_SERVICE_AUTH_ENABLED=true

full-toddler-4661
08/17/2022, 10:49 PM
late-rocket-94535
08/18/2022, 7:36 AM
datahub delete --env TEST --entity_type=datajob --platform=airflow --hard
returns "0 entities with 0 rows", but I can delete a specific urn or other platforms such as "kafka" and "postgres". How can I do a mass delete for airflow?

colossal-king-55688
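Editor's note, one possible explanation (an assumption, not confirmed in this thread): dataset urns embed a full `urn:li:dataPlatform:...` urn, which is what a platform filter can match, while dataJob urns reference their orchestrator only as a bare name inside the parent dataFlow urn. The urns below are illustrative, following DataHub's documented urn shapes:

```python
# Illustrative urns (hypothetical names, following DataHub's documented shapes).
dataset_urn = "urn:li:dataset:(urn:li:dataPlatform:kafka,SampleTopic,TEST)"
datajob_urn = "urn:li:dataJob:(urn:li:dataFlow:(airflow,example_dag,prod),example_task)"


def embeds_platform_urn(entity_urn: str, platform: str) -> bool:
    """True if the urn literally contains a dataPlatform urn for `platform`."""
    return f"urn:li:dataPlatform:{platform}" in entity_urn
```

If that holds, a filter keyed on urn:li:dataPlatform:airflow would match no dataJob entities, which would be consistent with the "0 entities with 0 rows" result; deleting by individual dag/task urns, as the poster already found, sidesteps it.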
08/18/2022, 12:22 PM
datahub docker quickstart --backup
But what I get is this:
Error: No such option: --backu
I'm running in a Windows environment.

busy-dusk-4970
08/18/2022, 1:45 PM
./gradlew build
on an M1 mac? 🧵

handsome-football-66174
08/18/2022, 4:13 PM
datahub.configuration.common.PipelineExecutionError: ('Source reported errors', RedshiftReport(workunits_produced=0, workunit_ids=[], warnings={}, failures={'version': ["Error: invalid literal for int() with base 10: 'redshift:'"]}, cli_version='0.8.41', cli_entry_location='/root/.venvs/airflow/lib/python3.7/site-packages/datahub/__init__.py', py_version='3.7.10 (default, Jun 3 2021, 00:02:01) \n[GCC 7.3.1 20180712 (Red Hat 7.3.1-13)]', py_exec_path='/root/.venvs/airflow/bin/python', os_details='Linux-4.14.287-215.504.amzn2.x86_64-x86_64-with-glibc2.2.5', tables_scanned=0, views_scanned=0, entities_profiled=0, filtered=[], soft_deleted_stale_entities=[], query_combiner=None, saas_version='', upstream_lineage={}))
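Editor's note: the failure buried in that report is an ordinary int() parse error — the source apparently split a version string and handed the token 'redshift:' to int(). The sketch below only reproduces the exception text to show nothing deeper is going on; a CLI newer than 0.8.41 may carry a more tolerant version parser (an assumption, not verified here):

```python
def try_int(token: str):
    """Return int(token), or the ValueError message CPython produces."""
    try:
        return int(token)
    except ValueError as exc:
        return str(exc)

# Feeding the token from the ingestion report reproduces the reported text.
print(try_int("redshift:"))
```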
cuddly-butcher-39945
08/18/2022, 5:07 PM
ancient-apartment-23316
08/18/2022, 5:42 PM
steep-finland-24780
08/18/2022, 5:59 PM
incalculable-branch-51967
08/18/2022, 8:11 PM
redshift-usage source. We've set up a pipeline with the following configuration:
...
table_pattern:
  deny:
  - 'analytics.*.*requests*'
  - 'analytics.public.requests_raw_stg'
...
we triggered ingestion and in the gms logs we observed entries like the following:
16:07:58.319 [qtp1830908236-16] INFO c.l.m.r.entity.AspectResource:126 - INGEST PROPOSAL proposal: {aspectName=datasetUsageStatistics, systemMetadata={lastObserved=1660832494657, runId=redshift-usage-2022_08_18-14_10_28}, entityUrn=urn:li:dataset:(urn:li:dataPlatform:redshift,analytics.public.requests_current_year_old,PROD), entityType=dataset, aspect={contentType=application/json, value=ByteString(length=1336,bytes=7b227469...205b5d7d)}, changeType=UPSERT}
...
16:08:35.622 [qtp1830908236-1878] INFO c.l.m.r.entity.AspectResource:126 - INGEST PROPOSAL proposal: {aspectName=datasetUsageStatistics, systemMetadata={lastObserved=1660832496738, runId=redshift-usage-2022_08_18-14_10_28}, entityUrn=urn:li:dataset:(urn:li:dataPlatform:redshift,analytics.analytics_sources.potential_signup_requests,PROD), entityType=dataset, aspect={contentType=application/json, value=ByteString(length=1526,bytes=7b227469...205b5d7d)}, changeType=UPSERT}
...
There are no records for analytics.public.requests_raw_stg. Could it be that only the last regex is being considered?

jolly-traffic-67085
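Editor's note: DataHub's allow/deny patterns are regular expressions, applied with re.match-style anchoring at the start of the name (assuming AllowDenyPattern semantics). That makes the question directly checkable against the table names from the gms log:

```python
import re

# First deny pattern from the recipe above, checked against the table names
# that appeared in the gms log (re.match anchors at the start of the string).
deny = "analytics.*.*requests*"

names = [
    "analytics.public.requests_raw_stg",             # correctly absent from gms
    "analytics.public.requests_current_year_old",    # ingested anyway
    "analytics.analytics_sources.potential_signup_requests",  # ingested anyway
]

matches = {name: bool(re.match(deny, name)) for name in names}
# All three names match the deny regex, so the pattern itself is not the issue.
```

Since the first regex alone already denies all three names, "only the last regex is considered" doesn't fit the evidence; a likelier explanation (worth checking against the source's docs for that CLI version) is that the redshift-usage source was not applying table_pattern to usage events at all.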
08/17/2022, 6:59 AM
clean-monkey-7245
08/17/2022, 7:36 AM
thankful-vr-12699
08/19/2022, 9:43 AM
ERROR: for datahub-actions failed to register layer: error creating overlay mount to /var/lib/docker/overlay2/8e4a978b78ea7e210f5feb6cc5d864ca03ed1cc652bde6a0a6e0772fb1ab71b2/merged: too many levels of symbolic links
ERROR: failed to register layer: error creating overlay mount to /var/lib/docker/overlay2/8e4a978b78ea7e210f5feb6cc5d864ca03ed1cc652bde6a0a6e0772fb1ab71b2/merged: too many levels of symbolic links
Error while pulling images. Going to attempt to move on to docker-compose up assuming the images have been built locally
I've tried deleting all my containers, images, and volumes, but I still get this error.
Thank you for your help!

limited-forest-73733
08/19/2022, 10:04 AM
limited-forest-73733
08/19/2022, 10:04 AM
billions-horse-96717
08/19/2022, 3:51 PM
volumes:
  - /home/adm_avs/dwh_prod:/home/dbt
and in the DataHub UI under Ingest --> Source, I added this config:
source:
  type: dbt
  config:
    manifest_path: /home/dbt/target/manifest_file.json
    test_results_path: /home/dbt/target/run_results.json
    load_schemas: /home/dbt/target/sources_file.json
    target_platform: my_target_platform_id
    catalog_path: /home/dbt/target/catalog_file.json
When I run docker volume ls
I don't see the volume.
Does anybody have an idea?

nutritious-bird-77396
08/19/2022, 4:15 PM
0.8.43
I am facing issues with Okta ingestion.
Looks like an additional parameter report-to has been added in this version:
--report-to TEXT Provide an output file to produce a\n'
'This version of datahub supports report-to functionality\n'
'datahub ingest run -c /tmp/datahub/ingest/6ff6e569-5133-4711-accb-2a321ade586a/recipe.yml --report-to
Any thoughts on the resolution?