# troubleshoot
  • m

    millions-notebook-72121

    01/04/2022, 3:29 PM
    image.png
  • g

    gentle-nest-904

    01/05/2022, 9:45 AM
Hi guys, today I tried to load a .json file as an ingestion: `datahub ingest -c ./datahub/fileloads-api.yml`. The fileloads-api is: source:
  • g

    gentle-nest-904

    01/05/2022, 9:46 AM
type: file
config:
  # coordinates
  <filepath/filename.json>
sink:
  type: "datahub-rest"
  config:
    # specs for localhost:8080
  • g

    gentle-nest-904

    01/05/2022, 9:46 AM
What I got as an error returned is that the ingestion `-c` for .json files expects a STRING as input and not a filename, so it basically can't process it.
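For reference, a minimal file-source recipe of that era might look like the sketch below. The config key name (`filename`), the example path, and the sink URL are assumptions to check against the docs for the CLI version in use.

```yaml
# Sketch of a file-source recipe: the path to the metadata .json goes under
# the source config, while the recipe .yml itself (not the .json) is what
# gets passed to `datahub ingest -c`.
source:
  type: file
  config:
    filename: ./path/to/metadata.json  # assumed key name; check your version's docs
sink:
  type: datahub-rest
  config:
    server: http://localhost:8080
```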
  • g

    gentle-nest-904

    01/05/2022, 9:55 AM
    jsonError.txt
  • n

    nutritious-bird-77396

    01/12/2022, 10:08 PM
Unfortunately I couldn't use the IP address of `gms` either... Looking to debug this issue further...
  • n

    nutritious-bird-77396

    01/19/2022, 2:55 PM
`DATAHUB_ANALYTICS_ENABLED` has been set to `false` in the frontend config, as the integration of the frontend with MSK IAM Auth is still not in place. In the browser Developer Tools, I get `DataFetchingException`. Does any of this explain the `An unknown error occurred. (code 500)` error?
  • c

    curved-thailand-48451

    01/20/2022, 7:03 PM
Error checking feature flag: no context set, have you authenticated to a cluster?
buildkit not supported by daemon
Error: command 'docker build -t datahub-test_048890/airflow:latest' failed: failed to execute cmd: exit status 1
  • c

    curved-thailand-48451

    01/20/2022, 7:03 PM
    any idea
  • c

    curved-thailand-48451

    01/20/2022, 7:04 PM
    ?
  • r

    red-napkin-59945

    01/20/2022, 10:11 PM
Hey team, I hit exactly the same issue as in this thread. I want to check if it got fixed.
  • r

    red-napkin-59945

    01/21/2022, 2:35 AM
Looks like it might be related to the Elasticsearch I was running on Docker. After shutting down the existing container, `./gradlew build` finished successfully!
  • s

    strong-iron-17184

    01/25/2022, 6:40 PM
Hello, I am trying to start DataHub in Docker. When I run "datahub docker quickstart" in the terminal it appears that it is already running on port 9002, but when I look in the browser it cannot be accessed.
  • q

    quaint-whale-60966

    01/26/2022, 6:19 AM
Hello! I just set up Airflow to use DataHub as the lineage backend; it's cool. Now I have a problem: I use many custom operators, and I think some of them do not show up in DataHub. Where should I debug first? (Some custom operator tasks are okay, which is why I am confused.)
  • d

    delightful-orange-22738

    01/28/2022, 12:28 PM
Hello! I don't understand what's going wrong with my metadata inlets and outlets. When I push them to DataHub it returns a 500 error code. Has anyone seen this?
bash_operator = BashOperator(
        dag=dag,
        **create_dag_params(dag_conf=DAG_CONF, task_id='task'),
        # working
        inlets={
            "datasets": [
                Dataset("snowflake", "mydb.schema.tableA"),
                Dataset("snowflake", "mydb.schema.tableB"),
            ],
        },
        outlets={"datasets": [Dataset("snowflake", "mydb.schema.tableC")]},
        # My sample, not working. Note: the third Dataset argument is the
        # environment (fabric type), which likely needs to be uppercase,
        # e.g. "PROD"; a lowercase "prod" may be what triggers the 500.
        # inlets={
        #     "datasets": [
        #         Dataset("hive", "db.read_1", "prod"),
        #         Dataset("hive", "db.read_2", "prod"),
        #     ],
        # },
        # outlets={
        #     "datasets": [
        #         Dataset("hive", "db.read_3", "prod"),
        #     ],
        # },
)
  • l

    loud-musician-49912

    01/28/2022, 2:27 PM
Even for the PySpark word count above it doesn't seem to work.
  • b

    billions-receptionist-60247

    02/01/2022, 5:00 AM
Can you help me debug this?
  • r

    red-napkin-59945

    02/01/2022, 6:20 PM
Also want to check what the recommended language is if we want to interact with DataHub.
  • b

    billions-receptionist-60247

    02/02/2022, 9:53 AM
Hi, I'm getting this error:
org.apache.kafka.common.errors.TimeoutException: Expiring 3 record(s) for MetadataAuditEvent_v4-1:120000 ms has passed since batch creation
A solution found on the internet and shared by @orange-night-91387 is to increase `request.timeout.ms` on the producer. I'm deploying DataHub using the Helm chart on Kubernetes. Can someone suggest how to change this through the Helm chart, or is there another reason why this occurs?
  • s

    strong-iron-17184

    02/03/2022, 2:37 PM
Hi, I'm trying to get Airflow up on port 58080, but it doesn't show me anything. Any help?
  • s

    strong-iron-17184

    02/03/2022, 2:37 PM
It appears to me that they are unhealthy.
  • f

    few-air-56117

    02/03/2022, 3:33 PM
Hi guys, I opened this GitHub issue 😄 https://github.com/linkedin/datahub/issues/4049
  • s

    strong-iron-17184

    02/04/2022, 3:38 PM
This occurs on port 8080/tcp. Any solution?
  • s

    strong-iron-17184

    02/04/2022, 4:45 PM
Is the documentation outdated?
  • s

    strong-iron-17184

    02/08/2022, 2:06 PM
My ports are open.
  • w

    wooden-football-7175

    02/08/2022, 3:31 PM
Any idea why I'm getting connection refused ☝️ on publishing lineage with the Airflow backend?
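A connection refused on lineage publish usually means the `datahub_rest_default` Airflow connection points at a host/port that isn't reachable from the worker. A sketch of re-registering it, where the GMS host below is a placeholder to replace with your own:

```shell
# Point Airflow's DataHub REST connection at a reachable GMS endpoint
# (host/port here are placeholders for your deployment).
airflow connections add --conn-type 'datahub_rest' 'datahub_rest_default' \
  --conn-host 'http://localhost:8080'
```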
  • f

    future-dusk-77156

    02/09/2022, 9:18 AM
Hi all. Has anyone come across the error `The retention policy of the schema topic _schemas is incorrect. Expected cleanup.policy to be 'compact' but it is delete`? We're using the Helm chart and the pod `prerequisites-cp-schema-registry` is in a crash loop due to this error. Is there a way to modify the cleanup policy in the chart?
  • s

    strong-iron-17184

    02/10/2022, 6:46 PM
    is airflow-datahub
  • n

    nutritious-bird-77396

    02/11/2022, 5:01 PM
    Bumping this thread to hear a response.
  • a

    acoustic-raincoat-46544

    02/12/2022, 1:43 AM
Hi everyone! We are evaluating DataHub these days and I wonder whether DataHub supports Kerberos authentication, because we rely deeply on Kerberos for Kafka, Hive, HDFS, and so on. But I can't find any DataHub documentation or code about Kerberos. Thanks for all your help!