# troubleshoot
f
Hi, I am trying to extend the metadata model. I followed the example project on GitHub (https://github.com/datahub-project/datahub/tree/master/metadata-models-custom). It worked in v0.8.20, but after recently updating to v0.8.29 it reports an error. The error log from the datahub-gms container is below; how should I fix this?
09:54:08 [ForkJoinPool.commonPool-worker-5] ERROR c.l.d.g.e.DataHubDataFetcherExceptionHandler - Failed to execute DataFetcher
java.util.concurrent.CompletionException: java.lang.ClassCastException: com.linkedin.data.DataList cannot be cast to com.linkedin.data.DataMap
        at java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:273)
        at java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:280)
        at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1606)
        at java.util.concurrent.CompletableFuture$AsyncSupply.exec(CompletableFuture.java:1596)
        at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
        at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
        at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
        at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.lang.ClassCastException: com.linkedin.data.DataList cannot be cast to com.linkedin.data.DataMap
        at com.linkedin.data.DataMap.getDataMap(DataMap.java:286)
        at com.linkedin.datahub.graphql.WeaklyTypedAspectsResolver.lambda$null$1(WeaklyTypedAspectsResolver.java:72)
        at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
        at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175)
        at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1384)
        at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
        at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
        at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
        at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
        at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
        at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:485)
        at com.linkedin.datahub.graphql.WeaklyTypedAspectsResolver.lambda$get$2(WeaklyTypedAspectsResolver.java:56)
        at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1604)
        ... 5 common frames omitted
09:54:08 [qtp544724190-13] INFO  c.l.m.resources.usage.UsageStats - Attempting to query usage stats
09:54:08 [pool-9-thread-1] INFO  c.l.m.filter.RestliLoggingFilter - POST /usageStats?action=queryRange - queryRange - 200 - 13ms
g
Hey @fierce-author-36990 - I’m looking into this now. Can you share your custom metadata aspect?
Hey @fierce-author-36990 - I am able to reproduce; working on a fix.
f
namespace com.threatbook.ps

record PartitionStatsRecord {
  partition: string
  recordCount: optional long
  fileCount: optional long
  totalSize: optional long
  data: optional string
}
namespace com.threatbook.ps

@Aspect = {
  "name": "partitionStats",
  "autoRender": true,
  "renderSpec": {
    "displayType": "tabular", // or properties, markdown, syntax
    "key": "partitionStats",
    "displayName": "Partition Stats"
  }
}
record PartitionStats {
  partitionStats: array[PartitionStatsRecord]
}
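(For context on the cast failure in the log above: the aspect's top-level field, `partitionStats`, is an array, so after deserialization its value is a `com.linkedin.data.DataList`, while `WeaklyTypedAspectsResolver` at line 72 of the trace fetched it via `DataMap.getDataMap`, which only works for map-valued entries. A minimal, self-contained sketch using plain `java.util` collections, with no DataHub dependency and purely illustrative names, reproduces the same failure mode:)

```java
import java.util.List;
import java.util.Map;

public class CastSketch {
    public static void main(String[] args) {
        // An aspect whose top-level field is an array deserializes to a List,
        // analogous to com.linkedin.data.DataList holding the record entries.
        Map<String, Object> aspectData =
                Map.of("partitionStats", List.of(Map.of("partition", "p0")));

        Object value = aspectData.get("partitionStats");
        try {
            // Mirrors DataMap.getDataMap(...): an unconditional cast to a map,
            // which fails at runtime because the value is actually a List.
            @SuppressWarnings("unchecked")
            Map<String, Object> asMap = (Map<String, Object>) value; // throws
            System.out.println(asMap);
        } catch (ClassCastException e) {
            // Same failure mode as the gms log:
            // "DataList cannot be cast to DataMap"
            System.out.println("ClassCastException: " + e.getMessage());
        }
    }
}
```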
g
Thanks -- the fix was merged. If you pull master, the error should be gone!
f
okay, thank you