# troubleshoot
k
Hello, I deployed DataHub using this helm chart https://github.com/acryldata/datahub-helm. I am getting several errors like this one:
datahub/datahub-datahub-gms-654d4f8457-467qj[datahub-gms]: 10:13:31.450 [Thread-2114] ERROR c.d.m.graphql.GraphQLController - Errors while executing graphQL query: "query getSearchResultsForMultiple($input: SearchAcrossEntitiesInput!) {\n  searchAcrossEntities(input: $input)
when accessing pages like /search?page=1&query=name. Have you seen this type of error? I am running this version: linkedin/datahub-gms:v0.8.16
e
Hey! Can you post the full stack trace that comes before this error msg?
b
@kind-psychiatrist-76973 Hey there! We'd love to help you debug this.... Is there anything additional in the trace?
k
This is all I get; the URL called was /search?page=1&query=total_x_gross_profit. Is there a way to make the logging more verbose? The error does not look very informative, at least to me.
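One way to surface the full error, rather than the truncated log line, is to replay the failing searchAcrossEntities query against GMS directly and look at the errors array in the response. The sketch below is only an illustration: it assumes GMS is port-forwarded to localhost:8080 (e.g. kubectl port-forward svc/datahub-datahub-gms 8080:8080), that GMS serves GraphQL at /api/graphql without metadata-service authentication, and that the SearchAcrossEntitiesInput fields used here (query, start, count) match the query text from the log.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Illustrative only: replay the failing query straight against GMS so the
// full "errors" payload is visible instead of the truncated log line.
// Assumptions: GMS port-forwarded to localhost:8080, GraphQL at /api/graphql,
// no metadata-service authentication, and SearchAcrossEntitiesInput fields
// (query/start/count) matching this DataHub version's schema.
public class ReplaySearchQuery {
    public static void main(String[] args) throws Exception {
        String body = "{\"query\":\"query q($input: SearchAcrossEntitiesInput!) "
                + "{ searchAcrossEntities(input: $input) { total } }\","
                + "\"variables\":{\"input\":{\"query\":\"total_x_gross_profit\",\"start\":0,\"count\":10}}}";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8080/api/graphql"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // The interesting part is the "errors" array in the JSON body.
        System.out.println(response.statusCode());
        System.out.println(response.body());
    }
}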
I also get this one
09:15:00.889 [Thread-8770] ERROR c.l.d.g.e.DataHubDataFetcherExceptionHandler - Failed to execute DataFetcher
java.util.concurrent.CompletionException: java.lang.RuntimeException: Failed to retrieve entities of type Dataset
    at java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:273)
    at java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:280)
    at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1606)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: Failed to retrieve entities of type Dataset
    at com.linkedin.datahub.graphql.GmsGraphQLEngine.lambda$null$116(GmsGraphQLEngine.java:889)
    at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1604)
    ... 1 common frames omitted
Caused by: java.lang.RuntimeException: Failed to batch load Datasets
    at com.linkedin.datahub.graphql.types.dataset.DatasetType.batchLoad(DatasetType.java:108)
    at com.linkedin.datahub.graphql.GmsGraphQLEngine.lambda$null$116(GmsGraphQLEngine.java:886)
    ... 2 common frames omitted
Caused by: java.lang.IllegalStateException: Duplicate key com.linkedin.metadata.entity.ebean.EbeanAspectV2@dd26e011
    at java.util.stream.Collectors.lambda$throwingMerger$0(Collectors.java:133)
    at java.util.HashMap.merge(HashMap.java:1254)
    at java.util.stream.Collectors.lambda$toMap$58(Collectors.java:1320)
    at java.util.stream.ReduceOps$3ReducingSink.accept(ReduceOps.java:169)
    at java.util.ArrayList$Itr.forEachRemaining(ArrayList.java:901)
    at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
    at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
    at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
    at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
    at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
    at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:566)
    at com.linkedin.metadata.entity.ebean.EbeanAspectDao.batchGet(EbeanAspectDao.java:234)
    at com.linkedin.metadata.entity.ebean.EbeanEntityService.getLatestAspects(EbeanEntityService.java:105)
    at com.linkedin.metadata.entity.EntityService.getLatestAspectUnions(EntityService.java:306)
    at com.linkedin.metadata.entity.EntityService.getSnapshotRecords(EntityService.java:298)
    at com.linkedin.metadata.entity.EntityService.getSnapshotUnions(EntityService.java:290)
    at com.linkedin.metadata.entity.EntityService.getEntities(EntityService.java:215)
    at com.linkedin.entity.client.JavaEntityClient.batchGet(JavaEntityClient.java:55)
    at com.linkedin.datahub.graphql.types.dataset.DatasetType.batchLoad(DatasetType.java:89)
    ... 3 common frames omitted
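The bottom of that trace points at Collectors.toMap inside EbeanAspectDao.batchGet: toMap without a merge function throws IllegalStateException as soon as two elements map to the same key, so the "Duplicate key ...EbeanAspectV2" message suggests the aspect query returned two rows that collapse to the same map key. A standalone sketch of that mechanism, with hypothetical data and not DataHub's actual code:

import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;

// Standalone illustration of the failure mode at the bottom of the trace:
// Collectors.toMap(keyFn, valueFn) uses a "throwing merger" by default, so
// the second element with an already-seen key raises
//   java.lang.IllegalStateException: Duplicate key ...
public class DuplicateKeyDemo {
    public static void main(String[] args) {
        // Two rows collapsing to the same key, as would happen if the aspect
        // table held duplicate rows for one dataset aspect (hypothetical data).
        List<String> rows = List.of(
                "urn:li:dataset:(urn:li:dataPlatform:hive,foo,PROD)|datasetProperties|0",
                "urn:li:dataset:(urn:li:dataPlatform:hive,foo,PROD)|datasetProperties|0");

        // Same shape as the failing call: no merge function supplied, so the
        // duplicate key throws instead of merging.
        Map<String, String> byKey = rows.stream()
                .collect(Collectors.toMap(Function.identity(), Function.identity()));

        System.out.println(byKey); // never reached with the duplicate rows above
    }
}

Supplying a merge function, e.g. Collectors.toMap(keyFn, valueFn, (a, b) -> b), makes toMap keep one of the colliding rows instead of throwing.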
b
Hi there! We’ve seen this error before - have you been running DataHub for a while?
I think this will help fix it. Future releases of DataHub will have this issue fixed by default.
k
b
Wonderful to hear - thanks for confirming!