bitter-dusk-52400
05/05/2022, 8:45 AM
gentle-camera-33498
05/06/2022, 12:52 AM
fresh-monitor-41243
05/06/2022, 4:53 PM
16:47:06 [application-akka.actor.default-dispatcher-5] INFO o.a.kafka.common.utils.AppInfoParser - Kafka version: 2.3.0
16:47:06 [application-akka.actor.default-dispatcher-5] INFO o.a.kafka.common.utils.AppInfoParser - Kafka commitId: fc1aaa116b661c8a
16:47:06 [application-akka.actor.default-dispatcher-5] INFO o.a.kafka.common.utils.AppInfoParser - Kafka startTimeMs: 1651855626390
16:47:07 [kafka-producer-network-thread | datahub-frontend] INFO org.apache.kafka.clients.Metadata - [Producer clientId=datahub-frontend] Cluster ID: 9g92g2gkQ0CAzW82CbxqTA
16:48:49 [application-akka.actor.default-dispatcher-9] ERROR application -
! @7nh6pkgij - Internal server error, for (GET) [/] ->
play.api.UnexpectedException: Unexpected exception[NullPointerException: Null stream]
at play.api.http.HttpErrorHandlerExceptions$.throwableToUsefulException(HttpErrorHandler.scala:340)
at play.api.http.DefaultHttpErrorHandler.onServerError(HttpErrorHandler.scala:263)
at play.core.server.AkkaHttpServer$$anonfun$1.applyOrElse(AkkaHttpServer.scala:443)
at play.core.server.AkkaHttpServer$$anonfun$1.applyOrElse(AkkaHttpServer.scala:441)
at scala.concurrent.Future.$anonfun$recoverWith$1(Future.scala:417)
at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
at akka.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:92)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85)
at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:92)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:49)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
I nuked it and restarted the containers: same thing. Pruned my Docker state: same thing. Since I'm JUST starting to look at this, I'm not really sure why it worked yesterday but not today! haha. Any thoughts?
handsome-football-66174
05/06/2022, 4:55 PM
fresh-monitor-41243
05/06/2022, 7:44 PM
best-umbrella-24804
05/09/2022, 3:16 AM
quick-family-76114
05/09/2022, 5:56 AM
brash-sundown-77702
05/09/2022, 5:23 PM
Exception in thread "main" java.lang.IllegalArgumentException: 1,93: "00100030" is an invalid field name.
at com.linkedin.data.template.DataTemplateUtil.parseSchema(DataTemplateUtil.java:313)
at com.linkedin.data.template.DataTemplateUtil.parseSchema(DataTemplateUtil.java:291)
The Java code snippet used is:
SchemaToPdlEncoder schemaToPdlEncoder = new SchemaToPdlEncoder(fileWriter);
RecordDataSchema recordDataSchema = (RecordDataSchema) DataTemplateUtil.parseSchema(
    "{\"type\":\"record\",\"name\":\"Adjusteddicompdl\",\"namespace\":\"resources.practice.dicom\",\"fields\":[{\"name\":\"00100030\",\"type\":{\"type\":\"array\",\"items\":{\"type\":\"record\",\"name\":\"Value\",\"fields\":[{\"name\":\"dummyKey\",\"type\":\"string\"}]}}}]}");
schemaToPdlEncoder.encode(recordDataSchema);
Here the key being used is "00100030", as a String.
If the same key starts with a letter, say "d00100030", it works fine.
Issue 2: We need a PDL (array) schema for the following DICOM JSON data, which contains an array of strings for the key "Key":
{
"d00091002": {
"Key": [
"z0x9c8v7",
"z0x9c8v8"
]
}
}
As per the PDL schema documentation (https://linkedin.github.io/rest.li/pdl_schema), we can write a PDL schema only if "Key" holds an array of key/value pairs, like the below:
{
"d00091002": {
"Key": [{"dummyKey":"z0x9c8v7"},{"dummyKey":"z0x9c8v"}]
}
}
whose corresponding PDL schema is:
record dicomInfo {
d00091002: array[record Value {
dummyKey: string
}]
}
Please let us know if you need more details.
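A possible pre-processing workaround for Issue 1, consistent with the finding above: since Pegasus field names must be valid identifiers (they cannot start with a digit), the DICOM JSON can be rewritten before schema generation by prefixing digit-leading keys with a letter and wrapping bare strings into single-field records, which also covers Issue 2. A minimal Python sketch; the helper name and the "d" prefix are illustrative, not part of rest.li:

# Sketch: rewrite DICOM JSON into a PDL-compatible shape. Digit-leading tags
# get a letter prefix (mirroring the working "d00100030" example), and bare
# string arrays are wrapped as {"dummyKey": ...} records per the schema above.
def sanitize_dicom(obj, prefix="d"):
    if isinstance(obj, dict):
        return {
            (prefix + k if k[:1].isdigit() else k): sanitize_dicom(v, prefix)
            for k, v in obj.items()
        }
    if isinstance(obj, list):
        return [
            {"dummyKey": v} if isinstance(v, str) else sanitize_dicom(v, prefix)
            for v in obj
        ]
    return obj

print(sanitize_dicom({"00091002": {"Key": ["z0x9c8v7", "z0x9c8v8"]}}))
# {'d00091002': {'Key': [{'dummyKey': 'z0x9c8v7'}, {'dummyKey': 'z0x9c8v8'}]}}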
gentle-camera-33498
05/09/2022, 7:29 PM
crooked-furniture-44524
05/09/2022, 9:13 PM
SELECT tbl1.id
FROM tbl1
JOIN tbl2 on tbl1.join_id=tbl2.join_id
return:
• tables_used = [tbl1, tbl2]
• columns_used = [tbl1.id, tbl1.join_id, tbl2.join_id]
A trickier example would be when I do SELECT *, where columns_used would instead need to be all the columns of tbl1 and all of tbl2.
Is this something that is possible with the datahub project?
Thanks in advance for any tips!
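One starting point outside DataHub itself is the third-party sqllineage package, which can pull out the tables a query reads; column-level extraction depends on the sqllineage version and is not shown here. A minimal sketch, assuming sqllineage is installed:

# Sketch: extract tables_used from the example query with sqllineage
# (a third-party SQL parser, not a DataHub API).
from sqllineage.runner import LineageRunner

sql = """
SELECT tbl1.id
FROM tbl1
JOIN tbl2 ON tbl1.join_id = tbl2.join_id
"""

tables_used = [str(t) for t in LineageRunner(sql).source_tables()]
print(tables_used)  # e.g. ['<default>.tbl1', '<default>.tbl2']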
salmon-rose-54694
05/10/2022, 2:37 AM
sparse-raincoat-42898
05/10/2022, 3:04 AM
astonishing-dusk-99990
05/10/2022, 2:24 PM
wonderful-smartphone-35332
05/10/2022, 3:27 PM
Failed to create ingestion source!: Unauthorized to perform this action. Please contact your DataHub administrator.
I also don't have permission to view my user page?
Thanks in advance :)
adorable-receptionist-20059
05/10/2022, 10:44 PM
salmon-rose-54694
05/11/2022, 8:05 AM
CorpGroupInfoClass(
email=email,
admins=owners,
members=members,
groups=[],
)
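For context, an aspect like this is typically wrapped in a MetadataChangeProposal and sent to GMS with the Python emitter. A minimal sketch, assuming a local GMS at http://localhost:8080; the group URN, user URNs, and email are placeholders:

# Sketch: emit a corpGroupInfo aspect via the DataHub Python SDK.
# The GMS address and all URNs/values below are placeholder assumptions.
from datahub.emitter.mcp import MetadataChangeProposalWrapper
from datahub.emitter.rest_emitter import DatahubRestEmitter
from datahub.metadata.schema_classes import ChangeTypeClass, CorpGroupInfoClass

group_info = CorpGroupInfoClass(
    email="data-team@example.com",
    admins=["urn:li:corpuser:alice"],
    members=["urn:li:corpuser:bob"],
    groups=[],
)

mcp = MetadataChangeProposalWrapper(
    entityType="corpGroup",
    changeType=ChangeTypeClass.UPSERT,
    entityUrn="urn:li:corpGroup:data-team",
    aspectName="corpGroupInfo",
    aspect=group_info,
)

DatahubRestEmitter("http://localhost:8080").emit(mcp)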
handsome-stone-44066
05/11/2022, 11:09 AM
python3 -m pip install 'acryl-datahub[mysql]'
it doesn't work.
wonderful-egg-79350
05/12/2022, 4:46 AM
great-nest-9369
05/12/2022, 7:04 AM
hallowed-analyst-96384
05/12/2022, 7:08 AM
astonishing-dusk-99990
05/12/2022, 9:42 AM
datahub docker quickstart --quickstart-compose-file=docker-compose.quickstart.yml
Can I change the MySQL credentials in that .yml to point at a MySQL RDS instance? Thank you
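For reference, the quickstart compose file feeds the MySQL connection to datahub-gms through EBEAN_DATASOURCE_* environment variables, so pointing at RDS is mostly a matter of editing those (and dropping the bundled mysql service). A hedged sketch of the relevant fragment; the RDS endpoint and credentials are placeholders, and exact variable names can vary by DataHub version:

# Sketch: datahub-gms in docker-compose.quickstart.yml pointed at an
# external MySQL such as RDS; endpoint and credentials are placeholders.
datahub-gms:
  environment:
    - EBEAN_DATASOURCE_HOST=my-rds.abc123.us-east-1.rds.amazonaws.com:3306
    - EBEAN_DATASOURCE_URL=jdbc:mysql://my-rds.abc123.us-east-1.rds.amazonaws.com:3306/datahub?verifyServerCertificate=false&useSSL=true
    - EBEAN_DATASOURCE_USERNAME=datahub
    - EBEAN_DATASOURCE_PASSWORD=my-rds-password
    - EBEAN_DATASOURCE_DRIVER=com.mysql.jdbc.Driver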
most-plumber-32123
05/12/2022, 10:04 AM
chilly-gpu-46080
05/12/2022, 11:35 AM
most-plumber-32123
05/12/2022, 12:22 PM
chilly-gpu-46080
05/13/2022, 5:20 AM
astonishing-dusk-99990
05/13/2022, 6:38 AM
docker/datahub-frontend/env/docker.env
AUTH_OIDC_ENABLED=true
AUTH_OIDC_CLIENT_ID=your-client-id
AUTH_OIDC_CLIENT_SECRET=your-client-secret
AUTH_OIDC_DISCOVERY_URI=https://accounts.google.com/.well-known/openid-configuration
AUTH_OIDC_BASE_URL=your-datahub-url
AUTH_OIDC_SCOPE="openid profile email"
AUTH_OIDC_USER_NAME_CLAIM=email
AUTH_OIDC_USER_NAME_CLAIM_REGEX=([^@]+)
Question: Can I put this into the docker-compose file? If so, which container should I put it in? Thank you
Note: I'm using datahub docker quickstart --quickstart-compose-file=docker-compose.quickstart.yml to run DataHub
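OIDC is handled by the frontend container, so in compose form the same settings from docker.env would sit under the datahub-frontend-react service's environment block. A sketch; client ID, secret, and base URL are placeholders as above:

# Sketch: the AUTH_OIDC_* settings above moved into the frontend service of
# docker-compose.quickstart.yml; values remain placeholders.
datahub-frontend-react:
  environment:
    - AUTH_OIDC_ENABLED=true
    - AUTH_OIDC_CLIENT_ID=your-client-id
    - AUTH_OIDC_CLIENT_SECRET=your-client-secret
    - AUTH_OIDC_DISCOVERY_URI=https://accounts.google.com/.well-known/openid-configuration
    - AUTH_OIDC_BASE_URL=your-datahub-url
    - AUTH_OIDC_SCOPE=openid profile email
    - AUTH_OIDC_USER_NAME_CLAIM=email
    - AUTH_OIDC_USER_NAME_CLAIM_REGEX=([^@]+)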
great-cpu-72376
05/13/2022, 1:51 PM
ambitious-lizard-47888
05/13/2022, 6:04 PM
sticky-dawn-95000
05/14/2022, 11:36 PM
great-cpu-72376
05/16/2022, 12:13 PM