# troubleshoot
  • quaint-branch-37931 (01/20/2022, 5:23 PM)
    Hey! I'm trying to install the latest version of datahub on AWS MWAA, but I'm hitting an issue with dependencies:
    ERROR: Cannot install acryl-datahub[datahub-rest,dbt,glue,metabase,postgres]==0.8.23.1 and apache-airflow[package-extra]==2.0.2 because these package versions have conflicting dependencies.
    The conflict is caused by:
        apache-airflow[package-extra] 2.0.2 depends on typing-extensions>=3.7.4; python_version < "3.8"
        acryl-datahub[datahub-rest,dbt,glue,metabase,postgres] 0.8.23.1 depends on typing-extensions<4 and >=3.10.0.2
        The user requested (constraint) typing-extensions==3.7.4.3
    The constraint is required for MWAA compatibility (refer to https://docs.aws.amazon.com/mwaa/latest/userguide/working-dags-dependencies.html#working-dags-dependencies-syntax-create). Would it be possible to lower this bound, or is there a workaround?

  • curved-thailand-48451 (01/20/2022, 7:03 PM)
    Hello everyone. I'm trying to implement the astro example and I'm running into this:

  • red-napkin-59945 (01/20/2022, 7:23 PM)
    Hey everyone, I'm getting a build error: the :metadata-ingestion:testQuick task failed. It complains:
    >   from _lzma import *
    E   ModuleNotFoundError: No module named '_lzma'
    I am using a Mac with an M1 chip. Could the CPU be the issue?
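    For context: _lzma is part of CPython's standard library, but it is only compiled when the xz (liblzma) headers are available at the time Python itself is built, so a pyenv- or source-built Python without xz installed will lack it; the M1 CPU is unlikely to be the cause. A quick check of an interpreter, as a sketch:
    import lzma  # raises ModuleNotFoundError if this Python was built without liblzma

    # Round-trip a small payload to confirm the module is functional.
    assert lzma.decompress(lzma.compress(b"datahub")) == b"datahub"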

  • clean-crayon-15379 (01/20/2022, 7:46 PM)
    Hi guys! I use a programmatic pipeline to access multiple Oracle databases. Today I tried to add a new one and hit an issue with the cursor buffer size. I can increase it locally and confirm that this solves the issue, but I could not find a way to pass the array size option to SQLAlchemy through the Pipeline class. Any ideas or pointers on how to do it? Thank you in advance!
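    A possible direction, sketched rather than confirmed: the SQLAlchemy-based sources accept an options dict that is passed through to sqlalchemy.create_engine(), and the cx_Oracle dialect accepts arraysize there to size the cursor fetch buffer. The coordinates and the arraysize value below are hypothetical:
    from datahub.ingestion.run.pipeline import Pipeline

    pipeline = Pipeline.create(
        {
            "source": {
                "type": "oracle",
                "config": {
                    "host_port": "oracle.example.com:1521",  # hypothetical coordinates
                    "username": "datahub",
                    "password": "***",
                    # Forwarded as create_engine(**options); arraysize sizes the
                    # cx_Oracle cursor fetch buffer.
                    "options": {"arraysize": 5000},
                },
            },
            "sink": {
                "type": "datahub-rest",
                "config": {"server": "http://localhost:8080"},
            },
        }
    )
    pipeline.run()
    pipeline.raise_from_status()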

  • quick-pizza-8906 (01/20/2022, 8:33 PM)
    Hello, I updated my DataHub deployment from 0.8.19 to 0.8.23 and the gms pod refuses to start, with an exception:
    20:21:33.820 [main] INFO  c.l.m.s.e.i.ESIndexBuilder:86 - There's diff between new mappings (left) and old mappings (right): not equal: value differences={properties=({urn={type=keyword}, @timestamp={type=date}, timestampMillis={type=date}, isExploded={type=boolean}, messageId={type=keyword}, eventGranularity={type=keyword}, event={type=object, enabled=false}, systemMetadata={type=object, enabled=false}}, {urn={type=keyword}, @timestamp={type=date}, isExploded={type=boolean}, timestampMillis={type=date}, eventGranularity={type=keyword}, event={type=object, enabled=false}, systemMetadata={type=object, enabled=false}})}
    20:21:33.821 [main] INFO  c.l.m.s.e.i.ESIndexBuilder:182 - Index dataset_datasetprofileaspect_v1_1642710093821 does not exist. Creating
    20:21:33.961 [main] INFO  c.l.m.s.e.i.ESIndexBuilder:187 - Created index dataset_datasetprofileaspect_v1_1642710093821
    20:21:33.989 [main] INFO  c.l.m.s.e.i.ESIndexBuilder:104 - Reindexing from dataset_datasetprofileaspect_v1 to dataset_datasetprofileaspect_v1_1642710093821 in progress...
    20:21:38.999 [main] INFO  c.l.m.s.e.i.ESIndexBuilder:104 - Reindexing from dataset_datasetprofileaspect_v1 to dataset_datasetprofileaspect_v1_1642710093821 in progress...
    20:21:39.004 [main] INFO  c.l.m.s.e.i.ESIndexBuilder:108 - Reindexing dataset_datasetprofileaspect_v1 to dataset_datasetprofileaspect_v1_1642710093821 task has completed, will now check if reindex was successful
    20:21:42.069 [main] INFO  c.l.m.s.e.i.ESIndexBuilder:146 - Post-reindex document count is different, source_doc_count: 5879 reindex_doc_count: 5000 
    20:21:42.146 [main] WARN  o.s.w.c.s.XmlWebApplicationContext:558 - Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'metadataChangeLogProcessor' defined in URL [jar:file:/tmp/jetty-0_0_0_0-8080-war_war-_-any-2317031366994056070.dir/webapp/WEB-INF/lib/mae-consumer.jar!/com/linkedin/metadata/kafka/MetadataChangeLogProcessor.class]: Bean instantiation via constructor failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [com.linkedin.metadata.kafka.MetadataChangeLogProcessor]: Constructor threw exception; nested exception is java.lang.RuntimeException: Reindex from dataset_datasetprofileaspect_v1 to dataset_datasetprofileaspect_v1_1642710093821 failed
    Any hints on what I can do (besides removing the index...)?

  • red-napkin-59945 (01/21/2022, 12:03 AM)
    After pulling in the latest code, the above issue looks resolved, but another one appeared: the :metadata-io:test task failed with a Docker container error.

  • red-napkin-59945 (01/21/2022, 12:05 AM)
    Here are the failed tests:
    ElasticSearchGraphServiceTest.setup
    SearchServiceTest.setup
    ElasticSearchTimeseriesAspectServiceTest.testUpsertProfiles
    ElasticSearchTimeseriesAspectServiceTest.testUpsertProfilesWithUniqueMessageIds
    All of them failed for the same reason: connection refused.

  • red-napkin-59945 (01/21/2022, 3:11 AM)
    Hey team, I am using IntelliJ to open the project, but it looks like some classes cannot be found by IntelliJ, for example in the elastic_search.py file.

  • billowy-jewelry-4209 (01/21/2022, 9:58 AM)
    Hello guys, I have a problem with the port for datahub-gms: I want to use port 8083 instead of the default 8080. I changed 8080 to 8083 in docker-compose-without-neo4j.quickstart.yml and started DataHub with "datahub docker quickstart --quickstart-compose-file docker-compose-without-neo4j.quickstart.yml", but when I try to log in at "server_ip:9002" I get an error (you can see it on the screenshot). When I use the default config with port 8080, everything works perfectly. How can I fix it? PS:
    - "datahub docker check" says "No issues detected".
    - In the datahub-gms logs I found: "2022-01-21 09:27:24.617 INFO oejs.AbstractConnector main Started ServerConnector@5fdef03a{HTTP/1.1,[http/1.1]}{0.0.0.0:8080}".
    - "docker container ls" outputs: "d1c3140a4d40 linkedin/datahub-gms:head "/bin/sh -c /datahub…" 30 minutes ago Up 30 minutes (healthy) 8080/tcp, 0.0.0.0:8083->8083/tcp, :::8083->8083/tcp datahub-gms".

  • some-microphone-33485 (01/21/2022, 3:28 PM)
    Hello team, good day. We have a DataHub instance installed in EKS. Can I send a nuke command from the API? I do not have much access to the infrastructure; we only have access to the UI. Thanks!

  • modern-monitor-81461 (01/21/2022, 4:15 PM)
    Hi all, I have configured my Airflow instance to emit DataHub events, and while configuring an inlet I made the mistake of writing prod instead of PROD for the Dataset env:
    inlets={
            "datasets": [
                Dataset("iceberg","my.iceberg.table", "prod"),
            ],
        },
    That broke the UI with "An unknown error occurred (code 500)" and I see a large stack trace in datahub-gms. I know I should have used PROD, but now it is broken and I'd like to know: 1. Why did the REST API accept the event in the first place? Is there no validation of the enum value? 2. How do I fix my DataHub? By deleting the offending DataJob?
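    For reference, the corrected inlet from the snippet above would look like this (a sketch assuming the datahub_provider Airflow integration; the third argument is the fabric enum and must be upper-case):
    from datahub_provider.entities import Dataset

    inlets = {
        "datasets": [
            # env must be the upper-case fabric value, e.g. "PROD", not "prod"
            Dataset("iceberg", "my.iceberg.table", "PROD"),
        ],
    }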

  • witty-actor-87329 (01/21/2022, 5:53 PM)
    Hi team, I'm trying to integrate Okta with DataHub and it's throwing the below error in the datahub-frontend-react logs:
    Caused by: com.nimbusds.jwt.proc.BadJWTException: JWT issue time ahead of current time
    We have checked the Okta logs and can see a successful authentication on the Okta end. Can anyone please help us with this error? Thanks!

  • busy-dusk-4970 (01/21/2022, 7:04 PM)
    Does anyone know how to solve this issue I'm getting while trying to build our Docker image with GitHub Actions?
    #14 [linux/amd64 prod-build 4/4] RUN cd datahub-src && ./gradlew :datahub-frontend:dist -PenableEmber=false -PuseSystemNode=true -x test -x yarnTest -x yarnLint     && cp datahub-frontend/build/distributions/datahub-frontend.zip ../datahub-frontend.zip     && cd .. && rm -rf datahub-src && unzip datahub-frontend.zip
    #14 9.507 Configuration on demand is an incubating feature.
    #14 61.01 
    #14 61.01 FAILURE: Build failed with an exception.
    #14 61.01 
    #14 61.01 * Where:
    #14 61.01 Build file '/datahub-src/metadata-integration/java/datahub-client/build.gradle' line: 7
    #14 61.01 
    #14 61.01 * What went wrong:
    #14 61.01 A problem occurred evaluating project ':metadata-integration:java:datahub-client'.
    #14 61.01 > Failed to apply plugin [id 'com.palantir.git-version']
    #14 61.01    > Cannot find '.git' directory
    #14 61.01 
    #14 61.01 * Try:
    #14 61.01 Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
    #14 61.01 
    #14 61.01 Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
    #14 61.01 
    #14 61.01 * Get more help at https://help.gradle.org
    #14 61.01 Use '--warning-mode all' to show the individual deprecation warnings.
    #14 61.01 See https://docs.gradle.org/5.6.4/userguide/command_line_interface.html#sec:command_line_warnings
    #14 61.01 
    #14 61.01 BUILD FAILED in 1m 0s
    #14 ERROR: process "/bin/sh -c cd datahub-src && ./gradlew :datahub-frontend:dist -PenableEmber=${ENABLE_EMBER} -PuseSystemNode=${USE_SYSTEM_NODE} -x test -x yarnTest -x yarnLint     && cp datahub-frontend/build/distributions/datahub-frontend.zip ../datahub-frontend.zip     && cd .. && rm -rf datahub-src && unzip datahub-frontend.zip" did not complete successfully: exit code: 1
    ------
     > [linux/amd64 prod-build 4/4] RUN cd datahub-src && ./gradlew :datahub-frontend:dist -PenableEmber=false -PuseSystemNode=true -x test -x yarnTest -x yarnLint     && cp datahub-frontend/build/distributions/datahub-frontend.zip ../datahub-frontend.zip     && cd .. && rm -rf datahub-src && unzip datahub-frontend.zip:
    #14 61.01 * Try:
    #14 61.01 Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
    #14 61.01 
    #14 61.01 Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
    #14 61.01 
    #14 61.01 * Get more help at https://help.gradle.org
    #14 61.01 Use '--warning-mode all' to show the individual deprecation warnings.
    #14 61.01 See https://docs.gradle.org/5.6.4/userguide/command_line_interface.html#sec:command_line_warnings
    #14 61.01 
    #14 61.01 BUILD FAILED in 1m 0s
    ------
    ./docker/datahub-frontend/Dockerfile:23
    --------------------
      22 |     COPY . datahub-src
      23 | >>> RUN cd datahub-src && ./gradlew :datahub-frontend:dist -PenableEmber=${ENABLE_EMBER} -PuseSystemNode=${USE_SYSTEM_NODE} -x test -x yarnTest -x yarnLint \
      24 | >>>     && cp datahub-frontend/build/distributions/datahub-frontend.zip ../datahub-frontend.zip \
      25 | >>>     && cd .. && rm -rf datahub-src && unzip datahub-frontend.zip
      26 |     
    --------------------
    error: failed to solve: process "/bin/sh -c cd datahub-src && ./gradlew :datahub-frontend:dist -PenableEmber=${ENABLE_EMBER} -PuseSystemNode=${USE_SYSTEM_NODE} -x test -x yarnTest -x yarnLint     && cp datahub-frontend/build/distributions/datahub-frontend.zip ../datahub-frontend.zip     && cd .. && rm -rf datahub-src && unzip datahub-frontend.zip" did not complete successfully: exit code: 1
    Error: buildx failed with: error: failed to solve: process "/bin/sh -c cd datahub-src && ./gradlew :datahub-frontend:dist -PenableEmber=${ENABLE_EMBER} -PuseSystemNode=${USE_SYSTEM_NODE} -x test -x yarnTest -x yarnLint     && cp datahub-frontend/build/distributions/datahub-frontend.zip ../datahub-frontend.zip     && cd .. && rm -rf datahub-src && unzip datahub-frontend.zip" did not complete successfully: exit code: 1

  • breezy-controller-54597 (01/24/2022, 7:39 AM)
    When I add Glossary Terms from the Web UI, I can only add Glossary Terms with more than 3 characters. Is there any way to add 2-character terms like "AR" or "VR" from the Web UI?

  • miniature-television-17996 (01/24/2022, 10:52 AM)
    Hello! I'm getting: "The field at path '/browse/entities[0]/name' was declared as a non null type, but the code involved in retrieving data has wrongly returned a null value. The graphql specification requires that the parent field be set to null, or if that is non nullable that it bubble up null to its parent and so on. The non-nullable type is 'String' within parent type 'Dataset' (code undefined)" Has anybody else seen this?

  • gorgeous-napkin-73659 (01/24/2022, 4:55 PM)
    Hello! I have a question about capturing lineage from an Airflow job vs from the data source itself. So for example, if we run BigQuery SQL scripts from an Airflow job, we could collect lineage information via the BQ ingestion script that reads the Audit Logs, but we could also collect lineage from the Airflow lineage backend that would have information about the Airflow job. Both would provide somewhat different results. How should we decide which to use, or does it make sense to use both? Thanks!

  • plain-lion-38626 (01/25/2022, 9:51 AM)
    Hi guys, not sure if this is a bug or a feature, but:
    After ingesting the groups (and users) from Azure AD we have the following situation:

    Good:
    - Users are imported;
    - Groups are imported;
    - The user count in each group is shown;
    - The groups of a user are shown.

    Bad:
    - The users of each group are not listed/shown;
    - Users don't inherit the privileges of their groups.

    Other details:
    - For a split second the users (members) of the group I'm opening are shown, but then they are visually overwritten;
    - When inspecting the page, one graphql call lists the group and its users, but after it there are multiple search results that apparently overwrite the initial result.
    Has anyone encountered this error before?

  • fast-caravan-10124 (01/25/2022, 10:10 AM)
    Hello! I have a question about lineage: why does lineage in DataHub only support expanding, and not collapsing?

  • high-hospital-85984 (01/25/2022, 10:39 AM)
    I'm having problems setting up a new "machine user" for DataHub. I've overridden the /datahub-frontend/conf/user.props file with something like:
    datahub:<password 1>
    my_service:<password 2>
    but the frontend fails to start with this error:
    09:15:48,267 |-INFO in ch.qos.logback.core.joran.action.AppenderRefAction - Attaching appender named [FILE] to Logger[ROOT]
    09:15:48,267 |-INFO in ch.qos.logback.classic.joran.action.ConfigurationAction - End of configuration.
    09:15:48,268 |-INFO in ch.qos.logback.classic.joran.JoranConfigurator@202b0582 - Registering current configuration as safe fallback point
    
    Oops, cannot start the server.
    com.google.inject.CreationException: Unable to create injector, see the following errors:
    
    1) No implementation for com.linkedin.entity.client.EntityClient was bound.
      while locating com.linkedin.entity.client.EntityClient
        for the 3rd parameter of controllers.SsoCallbackController.<init>(SsoCallbackController.java:39)
      while locating controllers.SsoCallbackController
        for the 4th parameter of router.Routes.<init>(Routes.scala:45)
      at play.api.inject.RoutesProvider$.bindingsFromConfiguration(BuiltinModule.scala:121):
    Binding(class router.Routes to self) (via modules: com.google.inject.util.Modules$OverrideModule -> play.api.inject.guice.GuiceableModuleConversions$$anon$1)
    
    2) Could not find a suitable constructor in client.AuthServiceClient. Classes must have either one (and only one) constructor annotated with @Inject or a zero-argument constructor that is not private.
      at client.AuthServiceClient.class(AuthServiceClient.java:22)
      while locating client.AuthServiceClient
        for field at controllers.AuthenticationController._authClient(AuthenticationController.java:43)
      while locating controllers.AuthenticationController
        for the 3rd parameter of router.Routes.<init>(Routes.scala:45)
      at play.api.inject.RoutesProvider$.bindingsFromConfiguration(BuiltinModule.scala:121):
    Binding(class router.Routes to self) (via modules: com.google.inject.util.Modules$OverrideModule -> play.api.inject.guice.GuiceableModuleConversions$$anon$1)
    
    3) Could not find a suitable constructor in client.AuthServiceClient. Classes must have either one (and only one) constructor annotated with @Inject or a zero-argument constructor that is not private.
      at client.AuthServiceClient.class(AuthServiceClient.java:22)
      while locating client.AuthServiceClient
        for the 4th parameter of controllers.SsoCallbackController.<init>(SsoCallbackController.java:39)
      while locating controllers.SsoCallbackController
        for the 4th parameter of router.Routes.<init>(Routes.scala:45)
      at play.api.inject.RoutesProvider$.bindingsFromConfiguration(BuiltinModule.scala:121):
    Binding(class router.Routes to self) (via modules: com.google.inject.util.Modules$OverrideModule -> play.api.inject.guice.GuiceableModuleConversions$$anon$1)
    
    4) Could not find a suitable constructor in com.datahub.authentication.Authentication. Classes must have either one (and only one) constructor annotated with @Inject or a zero-argument constructor that is not private.
      at com.datahub.authentication.Authentication.class(Authentication.java:21)
      while locating com.datahub.authentication.Authentication
        for the 2nd parameter of controllers.SsoCallbackController.<init>(SsoCallbackController.java:39)
      while locating controllers.SsoCallbackController
        for the 4th parameter of router.Routes.<init>(Routes.scala:45)
      at play.api.inject.RoutesProvider$.bindingsFromConfiguration(BuiltinModule.scala:121):
    Binding(class router.Routes to self) (via modules: com.google.inject.util.Modules$OverrideModule -> play.api.inject.guice.GuiceableModuleConversions$$anon$1)
    
    5) No implementation for org.pac4j.core.context.session.SessionStore was bound.
      while locating org.pac4j.core.context.session.SessionStore
        for field at controllers.AuthenticationController._playSessionStore(AuthenticationController.java:43)
      while locating controllers.AuthenticationController
        for the 3rd parameter of router.Routes.<init>(Routes.scala:45)
      at play.api.inject.RoutesProvider$.bindingsFromConfiguration(BuiltinModule.scala:121):
    Binding(class router.Routes to self) (via modules: com.google.inject.util.Modules$OverrideModule -> play.api.inject.guice.GuiceableModuleConversions$$anon$1)
    
    6) No implementation for org.pac4j.play.store.PlaySessionStore was bound.
      while locating org.pac4j.play.store.PlaySessionStore
        for field at org.pac4j.play.LogoutController.playSessionStore(LogoutController.java:28)
      while locating controllers.CentralLogoutController
        for the 5th parameter of router.Routes.<init>(Routes.scala:45)
      at play.api.inject.RoutesProvider$.bindingsFromConfiguration(BuiltinModule.scala:121):
    Binding(class router.Routes to self) (via modules: com.google.inject.util.Modules$OverrideModule -> play.api.inject.guice.GuiceableModuleConversions$$anon$1)
    
    6 errors
            at com.google.inject.internal.Errors.throwCreationExceptionIfErrorsExist(Errors.java:543)
            at com.google.inject.internal.InternalInjectorCreator.initializeStatically(InternalInjectorCreator.java:159)
            at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:106)
            at com.google.inject.Guice.createInjector(Guice.java:87)
            at com.google.inject.Guice.createInjector(Guice.java:78)
            at play.api.inject.guice.GuiceBuilder.injector(GuiceInjectorBuilder.scala:185)
            at play.api.inject.guice.GuiceApplicationBuilder.build(GuiceApplicationBuilder.scala:137)
            at play.api.inject.guice.GuiceApplicationLoader.load(GuiceApplicationLoader.scala:21)
            at play.core.server.ProdServerStart$.start(ProdServerStart.scala:51)
            at play.core.server.ProdServerStart$.main(ProdServerStart.scala:25)
            at play.core.server.ProdServerStart.main(ProdServerStart.scala)
    It seems to be related to the formatting of the file, but I can't figure out the problem. Any ideas?

  • few-air-56117 (01/25/2022, 11:50 AM)
    Hi guys, I think I found a problem with BigQuery ingestion: if I create a view via the UI (not with CREATE OR REPLACE VIEW AS SELECT ...), the lineage doesn't work 😞

  • strong-iron-17184 (01/25/2022, 6:41 PM)
    image.png

  • numerous-eve-42142 (01/25/2022, 7:10 PM)
    Hi everyone! I've started to test some table stats, and not all the information appeared, e.g. Min, Max, and Mean. Here is my ingestion file; can anyone spot what could be wrong?
    source:
      type: postgres
      config:
        # Coordinates
        host_port: localhost:5432
        database: postgres
        # Credentials
        username: postgres
        password: 1234

        # Options
        database_alias: PostgresTest
        schema_pattern:
          deny:
            - "information_schema"
        profiling:
          enabled: true
          include_field_null_count: true
          include_field_min_value: true
          include_field_max_value: true
          include_field_mean_value: true
          include_field_median_value: false

    sink:
      # sink configs
      type: "datahub-rest"
      config:
        server: "http://localhost:8080"

  • miniature-television-17996 (01/25/2022, 10:34 PM)
    Hello! Maybe someone else has the same problem: "The field at path '/browse/entities[0]/name' was declared as a non null type, but the code involved in retrieving data has wrongly returned a null value. The graphql specification requires that the parent field be set to null, or if that is non nullable that it bubble up null to its parent and so on. The non-nullable type is 'String' within parent type 'Dataset' (code undefined)"

  • acoustic-wolf-70583 (01/26/2022, 1:44 AM)
    Hi, I am trying to build lineage with multiple downstreams, using the lineage_emitter_mce example with Kafka. Is there a way to specify multiple downstreams?
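    Lineage is stored on the downstream side (each downstream dataset carries an UpstreamLineage aspect), so the usual pattern is one MCE per downstream. A minimal sketch, assuming the same Kafka emitter the lineage examples use; the URNs and broker addresses are placeholders:
    import datahub.emitter.mce_builder as builder
    from datahub.emitter.kafka_emitter import DatahubKafkaEmitter, KafkaEmitterConfig

    upstream = builder.make_dataset_urn("hive", "db.source_table")
    downstreams = [
        builder.make_dataset_urn("hive", "db.derived_a"),
        builder.make_dataset_urn("hive", "db.derived_b"),
    ]

    emitter = DatahubKafkaEmitter(
        KafkaEmitterConfig.parse_obj(
            {
                "connection": {
                    "bootstrap": "localhost:9092",
                    "schema_registry_url": "http://localhost:8081",
                }
            }
        )
    )

    def report(err, msg):
        if err:
            print(f"emit failed: {err}")

    # One MCE per downstream: each MCE lists all upstreams of that downstream.
    for downstream in downstreams:
        emitter.emit_mce_async(builder.make_lineage_mce([upstream], downstream), report)
    emitter.flush()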

  • damp-minister-31834 (01/26/2022, 2:10 AM)
    Hi, all! Can a tag be changed? For example, I want to change the tag 'urn:li:tag:tag1' to 'urn:li:tag:tag2'. Is there a way?
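    As far as I know there is no in-place rename for a tag URN (the name is part of the identity), so the usual route is to re-emit the tags of each affected entity pointing at the new URN. A sketch, assuming the REST emitter and a hypothetical dataset; note this overwrites that dataset's globalTags aspect:
    import datahub.emitter.mce_builder as builder
    from datahub.emitter.mcp import MetadataChangeProposalWrapper
    from datahub.emitter.rest_emitter import DatahubRestEmitter
    from datahub.metadata.schema_classes import (
        ChangeTypeClass,
        GlobalTagsClass,
        TagAssociationClass,
    )

    emitter = DatahubRestEmitter("http://localhost:8080")
    dataset_urn = builder.make_dataset_urn("hive", "db.table")  # hypothetical target

    emitter.emit_mcp(
        MetadataChangeProposalWrapper(
            entityType="dataset",
            changeType=ChangeTypeClass.UPSERT,
            entityUrn=dataset_urn,
            aspectName="globalTags",
            # Full replacement: list every tag the dataset should keep, with
            # tag2 standing in where tag1 used to be.
            aspect=GlobalTagsClass(
                tags=[TagAssociationClass(tag=builder.make_tag_urn("tag2"))]
            ),
        )
    )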

  • brief-church-81973 (01/26/2022, 8:07 AM)
    Hey there, I was wondering whether it is possible to do a partial update on an aspect. As an example, we have a Data Quality Status aspect with the following fields: dmstatus, last_used, last_updated. If I send a POST request to aspect?ingestProposal that does not contain, let's say, dmstatus, DataHub deletes the missing properties from the dataset. For us this means that the system responsible for filling the aspect must have full control over all of the aspect's properties and must be able to send all of them at once. What do you think about that?
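    One workaround, sketched under the assumption that the aspect is readable through the GMS /aspects endpoint ("dataQualityStatus" is a stand-in name): since ingestProposal upserts the whole aspect, read the latest stored version, merge only the fields you want to change, and re-post the complete aspect so nothing is dropped.
    import urllib.parse

    import requests

    GMS = "http://localhost:8080"
    urn = "urn:li:dataset:(urn:li:dataPlatform:hive,db.table,PROD)"  # hypothetical

    # 1. Fetch the latest stored version (version=0) of the aspect.
    current = requests.get(
        f"{GMS}/aspects/{urllib.parse.quote(urn, safe='')}",
        params={"aspect": "dataQualityStatus", "version": 0},
    ).json()

    # 2. Merge only the fields you intend to change into the stored values,
    #    e.g. update last_used while keeping dmstatus and last_updated intact.

    # 3. POST the merged, complete aspect back via ingestProposal exactly as
    #    you do today, so no property is deleted.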

  • happy-island-35913 (01/26/2022, 10:43 AM)
    I'm new to DataHub and installed it locally, but I got this error message: "Validation error of type FieldUndefined: Field 'platform' in type 'Dashboard' is undefined @ 'browse/entities/platform' (code undefined)"

  • most-boots-68766 (01/26/2022, 11:57 AM)
    I wonder whether DataHub supports Spark 3.0 on AWS EMR.

  • nice-country-99675 (01/26/2022, 1:41 PM)
    👋 Hi team! I don't know if you are already aware of it, but the demo environment is showing some errors: https://demo.datahubproject.io/

  • witty-butcher-82399 (01/26/2022, 3:34 PM)
    Quick question: using the Redshift ingestor with profiling enabled, I noticed that the min/max/mean/median/stddev stats are missing. According to the config they are enabled by default, so they should exist for numeric columns. I have limit: 1000; could it be that those stats are skipped in that case? Thanks!