# troubleshoot
  • rapid-sundown-8805

    09/24/2021, 8:46 AM
    How can one log in as the datahub superuser after AzureAD has been configured? I'd like to promote more admins (e.g. myself), and I think I have to log in as the datahub user to do so?
  • wooden-arm-26381

    09/27/2021, 3:01 PM
    Hi, I’m currently trying to replace schema field descriptions in the UI with a transformer. I want to selectively replace fields with the source content and override user input. I’m using EditableSchemaFieldInfoClass for the fields which should be overwritten. However, after ingestion all descriptions have been set back to the source values, instead of the selection defined in the recipe for my transformer. Are the Editable classes the wrong approach here? Thanks in advance for any help!
  • curved-sandwich-81699

    09/27/2021, 4:12 PM
    Hi! I am getting an "An unknown error occurred. (code 500)" error message from the UI after ingesting Looker dashboards in DataHub v0.8.14 using acryl-datahub 0.8.14.2. Dashboard elements are not displaying either, with the same error message.
  • stale-jewelry-2440

    09/28/2021, 12:02 PM
    Hello, can I set the location of the policies.json file with a variable, e.g. in the docker-compose.yml?
  • wooden-article-28111

    09/28/2021, 7:04 PM
    Hello, can the DataHub project (https://datahubproject.io/docs/metadata-ingestion/) connect to Azure SQL? Is it possible to make a connection?
  • curved-magazine-23582

    09/29/2021, 3:17 AM
    Hello team, I am trying to upgrade a local docker installation to the latest release by following the No Code Upgrade guide with this command:
    docker-compose down --remove-orphans && docker-compose pull && docker-compose -p datahub up --force-recreate
    but I am seeing GMS errors on startup, and the UI is not working.
  • curved-magazine-23582

    09/29/2021, 3:17 AM
    datahub-gms               | 2021-09-29 03:08:50.361:WARN:oejs.HttpChannel:qtp544724190-15: /health
    datahub-gms               | javax.servlet.ServletException: javax.servlet.UnavailableException: Servlet Not Initialized
    datahub-gms               |     at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:162)
    datahub-gms               |     at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    datahub-gms               |     at org.eclipse.jetty.server.Server.handle(Server.java:494)
    datahub-gms               |     at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:374)
    datahub-gms               |     at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:268)
    datahub-gms               |     at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    datahub-gms               |     at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
    datahub-gms               |     at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
    datahub-gms               |     at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:336)
    datahub-gms               |     at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:313)
    datahub-gms               |     at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:171)
    datahub-gms               |     at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:129)
    datahub-gms               |     at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:367)
    datahub-gms               |     at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:782)
    datahub-gms               |     at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:918)
    datahub-gms               |     at java.lang.Thread.run(Thread.java:748)
    datahub-gms               | Caused by:
    datahub-gms               | javax.servlet.UnavailableException: Servlet Not Initialized
    datahub-gms               |     at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:822)
    datahub-gms               |     at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:544)
    datahub-gms               |     at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    datahub-gms               |     at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:536)
    datahub-gms               |     at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    datahub-gms               |     at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    datahub-gms               |     at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1581)
    datahub-gms               |     at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    datahub-gms               |     at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1307)
    datahub-gms               |     at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    datahub-gms               |     at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:482)
    datahub-gms               |     at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1549)
    datahub-gms               |     at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    datahub-gms               |     at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1204)
    datahub-gms               |     at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    datahub-gms               |     at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:221)
    datahub-gms               |     at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    datahub-gms               |     at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    datahub-gms               |     at org.eclipse.jetty.server.Server.handle(Server.java:494)
    datahub-gms               |     at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:374)
    datahub-gms               |     at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:268)
    datahub-gms               |     at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    datahub-gms               |     at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
    datahub-gms               |     at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
    datahub-gms               |     at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:336)
    datahub-gms               |     at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:313)
    datahub-gms               |     at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:171)
    datahub-gms               |     at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:129)
    datahub-gms               |     at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:367)
    datahub-gms               |     at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:782)
    datahub-gms               |     at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:918)
    datahub-gms               |     at java.lang.Thread.run(Thread.java:748)
  • lemon-receptionist-88902

    09/29/2021, 9:48 AM
    I failed to override the Airflow config and add the lineage backend in Google Cloud Composer (composer-1.17.1-airflow-2.1.2). It says there is no module datahub_provider, but I already installed the PyPI package. How can I solve this?
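A quick stdlib-only diagnostic (a sketch, not a fix; the module name comes from the error above) can confirm whether datahub_provider is importable by the interpreter Airflow actually runs:

```python
import importlib.util
import sys


def module_available(name: str) -> bool:
    """Return True if `name` can be imported by this interpreter."""
    return importlib.util.find_spec(name) is not None


if __name__ == "__main__":
    # Run this with the same interpreter Composer's Airflow workers use,
    # to confirm the PyPI package landed in that environment.
    print("interpreter:", sys.executable)
    print("datahub_provider importable:", module_available("datahub_provider"))
```

In Composer, a common cause is installing the package locally (or into a different environment) instead of through the environment's PyPI packages setting, so the workers' interpreter never sees it.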
  • handsome-belgium-11927

    09/29/2021, 3:03 PM
    Hello! I am trying to create a lineage for charts and dashboards, and getting some strange behavior: Dashboard is shown in the 'Dashboards' tab of the chart and chart is displayed in the 'Charts' tab of the dashboard, but switching to graph view of lineage shows empty screen. Any ideas what's wrong?
  • hallowed-pager-87903

    09/30/2021, 6:02 AM
    Hi team, while declaring relationships as per the no-code model using the @Relationship annotation, we specify the entityTypes field. Is there any validation on the corresponding aspect at the time of ingestion? I ask this since I was trying out ingestion via the REST API, and in the ownership aspect I provided dummy urns (instead of legitimate corpuser/corpgroup urns), but it still got indexed into neo4j and the previously non-existent nodes (which weren't valid entities in the system) got created as the destination of the 'OwnedBy' relationship. Is this expected, or is there something I might be missing?
  • brief-lizard-77958

    09/30/2021, 7:41 AM
    EDIT: turns out it was a network issue on my side. Running docker-compose -p datahub up from freshly pulled DataHub code, elasticsearch-setup fails to build. I'm also having trouble building images that I could build yesterday. Any idea what could be wrong? Using Ubuntu 21.04. I tried on my Windows machine as well, and kafka-setup and elasticsearch-setup fail to build there too.
  • handsome-belgium-11927

    09/30/2021, 11:53 AM
    Hi team! I'm still struggling with Tableau. Any idea on how to ingest Workbooks? This is a kind of entity that consists of several dashboards, and it seems that at the moment dashboards are top-level entities in DataHub. Also, datasources are included in a Workbook, so it becomes even more complicated. Is there any tutorial on how to create interdependent entities?
  • powerful-telephone-71997

    10/04/2021, 4:50 AM
    Hi folks, has anyone seen this while ingesting from Superset? I am following the procedure as per the documentation, but I am getting this:
    ---- (full traceback above) ----
    File "/usr/local/lib/python3.8/site-packages/datahub/entrypoints.py", line 91, in main
        sys.exit(datahub(standalone_mode=False, **kwargs))
    File "/usr/local/lib/python3.8/site-packages/click/core.py", line 829, in __call__
        return self.main(*args, **kwargs)
    File "/usr/local/lib/python3.8/site-packages/click/core.py", line 782, in main
        rv = self.invoke(ctx)
    File "/usr/local/lib/python3.8/site-packages/click/core.py", line 1259, in invoke
        return _process_result(sub_ctx.command.invoke(sub_ctx))
    File "/usr/local/lib/python3.8/site-packages/click/core.py", line 1259, in invoke
        return _process_result(sub_ctx.command.invoke(sub_ctx))
    File "/usr/local/lib/python3.8/site-packages/click/core.py", line 1066, in invoke
        return ctx.invoke(self.callback, **ctx.params)
    File "/usr/local/lib/python3.8/site-packages/click/core.py", line 610, in invoke
        return callback(*args, **kwargs)
    File "/usr/local/lib/python3.8/site-packages/datahub/cli/ingest_cli.py", line 52, in run
        pipeline = Pipeline.create(pipeline_config)
    File "/usr/local/lib/python3.8/site-packages/datahub/ingestion/run/pipeline.py", line 120, in create
        return cls(config)
    File "/usr/local/lib/python3.8/site-packages/datahub/ingestion/run/pipeline.py", line 88, in __init__
        self.source: Source = source_class.create(
    File "/usr/local/lib/python3.8/site-packages/datahub/ingestion/source/superset.py", line 156, in create
        return cls(ctx, config)
    File "/usr/local/lib/python3.8/site-packages/datahub/ingestion/source/superset.py", line 136, in __init__
        self.access_token = login_response.json()["access_token"]
    
    KeyError: 'access_token'
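The KeyError suggests the Superset login call returned a body without access_token, which typically means the login itself failed and the body holds an error message instead. A minimal defensive sketch (the helper name is mine, not part of acryl-datahub) that surfaces the real error:

```python
def extract_access_token(payload: dict) -> str:
    """Return the access token from a Superset login response body.

    When login fails, the response body has no "access_token" key;
    raise a descriptive error instead of a bare KeyError so the
    actual response content is visible in the traceback.
    """
    if "access_token" not in payload:
        raise RuntimeError(
            f"Superset login did not return an access_token; response was: {payload!r}"
        )
    return payload["access_token"]
```

Checking the raw login response (status code and body) against the credentials and provider configured in the recipe is usually the fastest way to find the cause.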
  • gentle-father-80172

    10/04/2021, 8:21 PM
    Hi Team, I am getting a KeyError: 'PartitionKeys' when ingesting Glue metadata. The table this is failing on does not have any partition keys in Glue. I'm wondering if line 559 of glue.py should be updated to make this key conditional. Thoughts? cc: @square-activity-64562 @witty-actor-87329
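The conditional lookup proposed above can be sketched with dict.get (the table dict shape mirrors Glue's get_table response; this is an illustration, not the actual glue.py patch):

```python
def get_partition_keys(table: dict) -> list:
    """Return a Glue table's partition keys, tolerating tables without any.

    Unpartitioned tables can omit "PartitionKeys" entirely from the
    get_table response, so direct indexing raises KeyError; .get with
    a default makes the lookup conditional.
    """
    return table.get("PartitionKeys", [])
```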
  • millions-soccer-98440

    10/06/2021, 8:44 AM
    Hi, please help. I am trying to build a custom BI source ingestion. Running the ingest script works fine, but afterwards the web frontend shows "unknown error 500" and the GMS log shows "Failed to load Entities of type: Chart, keys". How can I fix this? I am building ChartInfoClass with a fixed type = ChartTypeClass.TABLE:
    chart_info = ChartInfoClass(
                type=ChartTypeClass.TABLE,
                description="",
                title=chart_title,
                lastModified=last_modified,
                chartUrl="",
            )
    GMS error message:
    08:18:59.210 [qtp544724190-56058] INFO  c.l.m.r.entity.EntityResource - BATCH GET [urn:li:chart:(grafana,2sX8mD7Mk_12)]
    08:18:59.211 [pool-9-thread-1] INFO  c.l.m.filter.RestliLoggingFilter - GET /entities?ids=List(urn%3Ali%3Achart%3A%28grafana%2C2sX8mD7Mk_12%29) - batchGet - 200 - 1ms
    08:18:59.212 [Thread-42099] ERROR c.l.datahub.graphql.GmsGraphQLEngine - Failed to load Entities of type: Chart, keys: [urn:li:chart:(grafana,2sX8mD7Mk_12)] Failed to batch load Charts
    08:18:59.212 [Thread-42099] ERROR c.l.d.g.e.DataHubDataFetcherExceptionHandler - Failed to execute DataFetcher
    java.util.concurrent.CompletionException: java.lang.RuntimeException: Failed to retrieve entities of type Chart
    	at java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:273)
    	at java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:280)
    	at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1606)
    	at java.lang.Thread.run(Thread.java:748)
    Caused by: java.lang.RuntimeException: Failed to retrieve entities of type Chart
    	at com.linkedin.datahub.graphql.GmsGraphQLEngine.lambda$null$112(GmsGraphQLEngine.java:788)
    	at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1604)
    	... 1 common frames omitted
    Caused by: java.lang.RuntimeException: Failed to batch load Charts
    	at com.linkedin.datahub.graphql.types.chart.ChartType.batchLoad(ChartType.java:105)
    	at com.linkedin.datahub.graphql.GmsGraphQLEngine.lambda$null$112(GmsGraphQLEngine.java:785)
    	... 2 common frames omitted
    Caused by: java.lang.NullPointerException: null
    	at com.linkedin.datahub.graphql.types.chart.mappers.ChartSnapshotMapper.mapChartInfo(ChartSnapshotMapper.java:70)
    	at com.linkedin.datahub.graphql.types.chart.mappers.ChartSnapshotMapper.apply(ChartSnapshotMapper.java:44)
    	at com.linkedin.datahub.graphql.types.chart.mappers.ChartSnapshotMapper.map(ChartSnapshotMapper.java:29)
    	at com.linkedin.datahub.graphql.types.chart.ChartType.lambda$batchLoad$0(ChartType.java:100)
    	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
    	at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1384)
    	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
    	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
    	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
    	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
    	at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:566)
    	at com.linkedin.datahub.graphql.types.chart.ChartType.batchLoad(ChartType.java:103)
    	... 3 common frames omitted
    08:18:59.212 [Thread-42098] ERROR c.d.m.graphql.GraphQLController - Errors while executing graphQL query: "query getChart($urn: String!) ....
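The NullPointerException in ChartSnapshotMapper.mapChartInfo points at a ChartInfo field arriving as null. A small pre-flight check on the aspect's fields before emitting can narrow down which one; note the required-field list here is an assumption drawn from the constructor call above, not the authoritative ChartInfo schema:

```python
def missing_chart_info_fields(info: dict) -> list:
    """Return names of fields that are absent or None in a ChartInfo-like dict.

    The field list is a guess based on the snippet above; adjust it to
    match the actual ChartInfo aspect schema before relying on it.
    """
    required = ["title", "description", "lastModified"]
    return [field for field in required if info.get(field) is None]
```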
  • clean-crayon-15379

    10/06/2021, 2:45 PM
    [Resolved] Hi Team, quick question regarding the add_dataset_tags transformer: what kind of output must my user-defined function provide? A str or a list of strings leads to errors. Thank you. Example:
    import logging
    
    import datahub.emitter.mce_builder as builder
    from datahub.metadata.schema_classes import (
        DatasetSnapshotClass,
        TagAssociationClass,
    )
    
    def custom_tags(current: DatasetSnapshotClass):
        """ Returns tags to associate to a dataset depending on custom logic
    
        This function receives a DatasetSnapshotClass, performs custom logic and returns
        a list of TagAssociationClass-wrapped tags.
    
        Args:
            current (DatasetSnapshotClass): Single DatasetSnapshotClass object
    
        Returns:
            List of TagAssociationClass objects.
        """
    
        tag_strings = []
    
        ### Add custom logic here
        tag_strings.append('custom1')
        tag_strings.append('custom2')
        
        tag_strings = [builder.make_tag_urn(tag=n) for n in tag_strings]
        tags = [TagAssociationClass(tag=tag) for tag in tag_strings]
        
    logging.info(f"Tagging dataset {current.urn} with {tag_strings}.")
        return tags
  • astonishing-lunch-91223

    10/07/2021, 12:58 AM
    Hi everyone. Would you happen to know how to configure Kafka timeouts via environment variables? Schema Registry has SCHEMA_REGISTRY_KAFKASTORE_TIMEOUT_MS and SCHEMA_REGISTRY_KAFKASTORE_INIT_TIMEOUT_MS, but it’s not clear what they’re called for the GMS and upgrade-job pieces that are deployed via https://github.com/acryldata/datahub-helm. I think it relies on Spring Boot to load these, but I got a bit lost in the docs.
  • faint-painting-38451

    10/07/2021, 1:23 PM
    I have been trying to build the containers with COMPOSE_DOCKER_CLI_BUILD=1 DOCKER_BUILDKIT=1 docker-compose -p datahub build, but I keep getting the error UNABLE_TO_GET_ISSUER_CERT_LOCALLY when it tries to build the datahub-frontend container. I tried changing the registry to the proxy that we need to use at my company through .npmrc and .yarnrc files, but I am still seeing https://registry.npmjs.org/yarn in the error log. How can I change that to my proxy so that it will build?
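For reference, registry overrides usually look like the following (the proxy URL is a placeholder, not a real endpoint); note that the frontend build runs inside Docker, so these files may need to be present in the image's build context rather than only on the host:

```
# .npmrc (placeholder proxy URL)
registry=https://npm-proxy.example.company.com/
strict-ssl=false

# .yarnrc (placeholder proxy URL)
registry "https://npm-proxy.example.company.com/"
```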
  • witty-butcher-82399

    10/07/2021, 7:06 PM
    Screenshot 2021-10-07 at 20.55.13.png
  • gentle-father-80172

    10/07/2021, 9:00 PM
    Hi, I'm seeing this error when looking at various things in DataHub. How can I figure out what's erroring out?
  • stocky-pencil-78684

    10/08/2021, 1:57 AM
    Hi there, I'm having trouble getting the quickstart to work on Windows 10. I keep getting the error message 'Docker doesn't seem to be running. Did you start it?', and I am pretty sure Docker Desktop is running. Please kindly point me in a direction for debugging. Many thanks in advance.
  • curved-magazine-23582

    10/08/2021, 3:08 PM
    Errors I am getting while loading datasets:
    Caused by: com.linkedin.data.template.TemplateOutputCastException: Output https://app.powerbi.com/groups/a4998c5d-fab6-46b0-99cc-490d3df28981/datasets/81c2b25c-e08c-4d89-a09c-6f790efcd483/details has type java.lang.String, but does not have a registered coercer and cannot be coerced to type java.net.URI
            at com.linkedin.data.template.DataTemplateUtil.coerceOutput(DataTemplateUtil.java:950)
            at com.linkedin.data.template.RecordTemplate.obtainCustomType(RecordTemplate.java:365)
            at com.linkedin.dataset.DatasetProperties.getUri(DatasetProperties.java:282)
            at com.linkedin.datahub.graphql.types.dataset.mappers.DatasetSnapshotMapper.lambda$apply$0(DatasetSnapshotMapper.java:70)
            at java.util.ArrayList.forEach(ArrayList.java:1259)
            at com.linkedin.datahub.graphql.types.dataset.mappers.DatasetSnapshotMapper.apply(DatasetSnapshotMapper.java:55)
            at com.linkedin.datahub.graphql.types.dataset.mappers.DatasetSnapshotMapper.map(DatasetSnapshotMapper.java:40)
            at com.linkedin.datahub.graphql.types.dataset.DatasetType.lambda$batchLoad$0(DatasetType.java:102)
            at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
            at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1384)
            at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
            at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
            at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
            at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
            at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:566)
            at com.linkedin.datahub.graphql.types.dataset.DatasetType.batchLoad(DatasetType.java:106)
  • wide-cricket-79512

    10/08/2021, 6:57 PM
    Hello everyone! I am new to DataHub. I am trying to understand the basics, and so far it looks to me like there is nothing to use except pushing and pulling data and adding tags. Could someone please tell me if there are other helpful features?
  • blue-table-15403

    10/08/2021, 9:55 PM
    Hello! I am new to DataHub. I am trying to get the dataset over graphiql, but the problem is with the parameter "schemaMetadata". If I make a request without this field, everything is fine. If I specify it, I get an error.
  • quiet-kilobyte-82304

    10/08/2021, 11:14 PM
    Hi! What IntelliJ plugin does the community use for pdl files?
  • stale-jewelry-2440

    10/11/2021, 12:09 PM
    Hi guys, I noticed the class aliases in datahub/metadata-ingestion/src/datahub/metadata/com/linkedin/pegasus2avro/metadata/key have been removed, but one of them is still in use in datahub/metadata-ingestion/src/datahub/emitter/mce_builder.py (from datahub.metadata.com.linkedin.pegasus2avro.metadata.key import DatasetKey). Is it ok if I open a pull request to point that import to the original class?
  • fresh-carpet-31048

    10/11/2021, 3:05 PM
    Hello! I'm currently getting an error when trying to generate a React hook for a new mutation I'm adding. I've only added to entity.graphql and mutations.graphql, and the error I'm getting upon running yarn generate from datahub-web-react reads:
    Found 2 errors
    ✖ src/types.generated.ts
    AggregateError:
    GraphQLDocumentError: Cannot query field "addSurveyResponse" on type "Mutation".
    at /Users/melindacardenas/Documents/GitHub/datahub/datahub-web-react/src/graphql/mutations.graphql:38:5
    at new AggregateError (/Users/melindacardenas/Documents/GitHub/datahub/datahub-web-react/node_modules/@ardatan/aggregate-error/index.cjs.js:141:24)
    at Object.checkValidationErrors (/Users/melindacardenas/Documents/GitHub/datahub/datahub-web-react/node_modules/@graphql-codegen/core/node_modules/@graphql-tools/utils/index.cjs.js:517:15)
    at Object.codegen (/Users/melindacardenas/Documents/GitHub/datahub/datahub-web-react/node_modules/@graphql-codegen/core/index.cjs.js:102:15)
    AggregateError:
    GraphQLDocumentError: Cannot query field "addSurveyResponse" on type "Mutation".
    at /Users/melindacardenas/Documents/GitHub/datahub/datahub-web-react/src/graphql/mutations.graphql:38:5
    at new AggregateError (/Users/melindacardenas/Documents/GitHub/datahub/datahub-web-react/node_modules/@ardatan/aggregate-error/index.cjs.js:141:24)
    at Object.checkValidationErrors (/Users/melindacardenas/Documents/GitHub/datahub/datahub-web-react/node_modules/@graphql-codegen/core/node_modules/@graphql-tools/utils/index.cjs.js:517:15)
    at Object.codegen (/Users/melindacardenas/Documents/GitHub/datahub/datahub-web-react/node_modules/@graphql-codegen/core/index.cjs.js:102:15)
    ✖ src/
    File /Users/melindacardenas/Documents/GitHub/datahub/datahub-web-react/src/graphql/mutations.graphql caused error:
    Unable to find field "addSurveyResponse" on type "Mutation"!
    Error: Unable to validate GraphQL document!
    at documents.map.documentFile (/Users/melindacardenas/Documents/GitHub/datahub/datahub-web-react/node_modules/@graphql-codegen/near-operation-file-preset/index.cjs.js:204:19)
    at Array.map (<anonymous>)
    at resolveDocumentImports (/Users/melindacardenas/Documents/GitHub/datahub/datahub-web-react/node_modules/@graphql-codegen/near-operation-file-preset/index.cjs.js:180:22)
    at Object.buildGeneratesSection (/Users/melindacardenas/Documents/GitHub/datahub/datahub-web-react/node_modules/@graphql-codegen/near-operation-file-preset/index.cjs.js:232:25)
    at Listr.task.wrapTask (/Users/melindacardenas/Documents/GitHub/datahub/datahub-web-react/node_modules/@graphql-codegen/cli/bin.js:913:64)
    Error: Unable to validate GraphQL document!
    at documents.map.documentFile (/Users/melindacardenas/Documents/GitHub/datahub/datahub-web-react/node_modules/@graphql-codegen/near-operation-file-preset/index.cjs.js:204:19)
    at Array.map (<anonymous>)
    at resolveDocumentImports (/Users/melindacardenas/Documents/GitHub/datahub/datahub-web-react/node_modules/@graphql-codegen/near-operation-file-preset/index.cjs.js:180:22)
    at Object.buildGeneratesSection (/Users/melindacardenas/Documents/GitHub/datahub/datahub-web-react/node_modules/@graphql-codegen/near-operation-file-preset/index.cjs.js:232:25)
    at Listr.task.wrapTask (/Users/melindacardenas/Documents/GitHub/datahub/datahub-web-react/node_modules/@graphql-codegen/cli/bin.js:913:64)
    Would appreciate any help at all, thank you!
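For context, the codegen validates documents against the combined GraphQL schema, so the mutation also has to be declared on the Mutation type in the schema files the generator loads, not only used in mutations.graphql. A hypothetical declaration (the field name comes from the error; the argument and return types are assumptions):

```graphql
extend type Mutation {
  # Hypothetical signature - adjust to the real input and return types.
  addSurveyResponse(input: String!): Boolean
}
```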
  • clean-crayon-15379

    10/11/2021, 4:39 PM
    [Resolved] Hi all, I want to patch the description of my tags via the graphql interface (using a mutate command that works via Postman). Calling it programmatically leads to an error 405 with message="Request method 'POST' not supported". Any ideas? Maybe I am missing an additional header (I have headers={"X-DataHub-Actor": "urn:li:corpuser:datahub"}).
  • better-orange-49102

    10/12/2021, 5:36 AM
    I have an instance of DataHub that uses Keycloak OIDC, and there is no "datahub" user in the user list. I am trying to set some specific users as platform admins. How do I go about adding users as platform admins? Do I disable OIDC first, then set them? But if I disable OIDC, I can't specify the user, can I?
  • mammoth-lawyer-49919

    10/12/2021, 6:17 AM
    Hi Team - when I click on a dataset, I get this error. Checking the GMS, I see these logs. What could be causing this issue? ... 1 common frames omitted Caused by: io.netty.handler.codec.TooLongFrameException: Response entity too large: HttpObjectAggregator$AggregatedFullHttpResponse(decodeResult: success, version: HTTP/1.1, content: CompositeByteBuf(ridx: 0, widx: 2097152, cap: 2097152, components=291)) HTTP/1.1 200 OK Date: Tue, 12 Oct 2021 06:10:34 GMT Content-Type: application/json X-RestLi-Protocol-Version: 2.0.0 Server: Jetty(9.4.20.v20190813) at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) ... 1 common frames omitted 06:10:35.337 [Thread-14119] ERROR c.d.m.graphql.GraphQLController - Errors while executing graphQL query: "query getChart($urn: String!) {\n chart(urn: $urn) {\n  urn\n  type\n  tool\n  chartId\n  info {\n   name\n   description\n   inputs {\n    urn\n    name\n    origin\n    description\n    platform {\n     name\n     info {\n      logoUrl\n      __typename\n     }\n     __typename\n    }\n    platformNativeType\n    tags\n    ownership {\n     ...ownershipFields\n     __typename\n    }\n    downstreamLineage {\n     ...downstreamRelationshipFields\