# troubleshoot
c
Errors I am getting while loading datasets:
```
Caused by: com.linkedin.data.template.TemplateOutputCastException: Output <https://app.powerbi.com/groups/a4998c5d-fab6-46b0-99cc-490d3df28981/datasets/81c2b25c-e08c-4d89-a09c-6f790efcd483/details> has type java.lang.String, but does not have a registered coercer and cannot be coerced to type java.net.URI
        at com.linkedin.data.template.DataTemplateUtil.coerceOutput(DataTemplateUtil.java:950)
        at com.linkedin.data.template.RecordTemplate.obtainCustomType(RecordTemplate.java:365)
        at com.linkedin.dataset.DatasetProperties.getUri(DatasetProperties.java:282)
        at com.linkedin.datahub.graphql.types.dataset.mappers.DatasetSnapshotMapper.lambda$apply$0(DatasetSnapshotMapper.java:70)
        at java.util.ArrayList.forEach(ArrayList.java:1259)
        at com.linkedin.datahub.graphql.types.dataset.mappers.DatasetSnapshotMapper.apply(DatasetSnapshotMapper.java:55)
        at com.linkedin.datahub.graphql.types.dataset.mappers.DatasetSnapshotMapper.map(DatasetSnapshotMapper.java:40)
        at com.linkedin.datahub.graphql.types.dataset.DatasetType.lambda$batchLoad$0(DatasetType.java:102)
        at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
        at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1384)
        at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
        at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
        at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
        at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
        at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:566)
        at com.linkedin.datahub.graphql.types.dataset.DatasetType.batchLoad(DatasetType.java:106)
```
m
Hi @curved-magazine-23582 I encountered this recently. We have a bug filed on our side to fix it. You should fill out the “externalUrl” field instead of the uri (set uri to null).
c
thanks for the info! so DatasetProperties.ExternalReference.externalUrl field?
m
yes @curved-magazine-23582
c
thanks again! have a nice weekend
externalUrl is just a top-level field in DatasetProperties
in PDL: the “include ExternalReference” just inlines the field definitions
@curved-magazine-23582
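For reference, a minimal sketch of the aspect shape being suggested: `externalUrl` populated, `uri` left out entirely. This builds the payload as a plain dict rather than using the DataHub SDK classes; the field names follow the DatasetProperties PDL discussed above, while the URL value is an illustrative placeholder, not taken from this thread.

```python
import json

# DatasetProperties aspect with "externalUrl" populated instead of the
# deprecated "uri" field (which triggers the URI coercion error above).
# The URL here is a placeholder, not a real dataset link.
dataset_properties = {
    "externalUrl": "https://app.powerbi.com/groups/<group-id>/datasets/<dataset-id>/details",
    "description": "Power BI dataset",
    "customProperties": {},
    # "uri" is intentionally omitted (i.e. left null).
}

payload = json.dumps(dataset_properties, indent=2)
print(payload)
```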
c
thanks! will do that
m
@curved-magazine-23582 there was a bug in an older release about this. If you upgrade to latest, this should go away.
c
ah thanks! will do
b
Hi @curved-magazine-23582 - we have not yet heard of this one
Do you mind sending over the GMS container logs?
c
thanks. will do
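For anyone following along, a sketch of how those GMS container logs can be collected in the default quickstart compose deployment (the container name `datahub-gms` is the quickstart default; adjust it if your compose file names the service differently):

```shell
# Dump the most recent GMS container logs to a file to share.
# "datahub-gms" is the default container name in the DataHub
# quickstart docker-compose setup.
docker logs datahub-gms --tail 1000 > gms.log 2>&1
```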
Good morning, John. Here are the log files
b
Thanks! Based on gms.log it seems that the dataset index has changed and we’ve attempted to reindex existing documents into the new form, but that has failed. @early-lamp-41924 have you seen cases where reindexing fails? Ming, which version were you on previously and which are you upgrading to?
c
thanks. I always upgrade to latest; the last upgrade was done about a month ago. I think I saw that index error at the time, but GMS continued working until this upgrade. would the index error cause GMS to be unhealthy?
b
Yeah it’s a good question
My suspicion is that it could, since it’s a critical index
And it’s failing to create the consumer object that is critical to the system
e
Wow, this error is not helpful. Do you know if reindexing happened before? Do you have RBAC on your Elasticsearch cluster by any chance?
c
hmm, not sure about reindexing, but my impression is I saw similar errors after the previous upgrade a month ago. the installation is just the default local docker compose deployment, so I assume no RBAC with ES.
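A quick way to confirm that, as a sketch: the quickstart compose deployment exposes Elasticsearch on `localhost:9200` with security disabled, so unauthenticated requests to the standard cluster APIs should succeed. If RBAC/security were enabled, these calls would be rejected.

```shell
# Check cluster health and list indices on the default quickstart
# Elasticsearch. Successful unauthenticated responses indicate that
# security/RBAC is disabled; a 401 would indicate it is enabled.
curl -s "http://localhost:9200/_cluster/health?pretty"
curl -s "http://localhost:9200/_cat/indices?v"
```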
b
We may want to hop into some debugging here
c
cool thanks. just let me know how I can participate