# troubleshoot
  • a

    agreeable-table-54007

    04/05/2023, 9:06 AM
    Hello all! Hope you are fine. Can someone please help me with the installation?
    👀 1
    ✅ 1
    a
    • 2
    • 2
  • w

    wonderful-wall-76801

    04/05/2023, 10:29 AM
    Hello everyone! We are trying to upgrade from 0.9.6 to 0.10.1, but we are hitting a 500 error in the UI. Logs from GMS (similar to the datahub-datahub-system-update-job logs):
    Copy code
    Suppressed: org.elasticsearch.client.ResponseException: method [POST], host [<http://123.123.123.123:9200>], URI [/datahubpolicyindex_v2/_search?typed_keys=true&max_concurrent_shard_requests=5&ignore_unavailable=false&expand_wildcards=open&allow_no_indices=true&ignore_throttled=true&search_type=query_then_fetch&batched_reduce_size=512&ccs_minimize_roundtrips=true], status line [HTTP/1.1 400 Bad Request]
    {"error":{"root_cause":[{"type":"parsing_exception","reason":"[term] query does not support [case_insensitive]","line":1,"col":942}],"type":"x_content_parse_exception","reason":"[1:942] [bool] failed to parse field [must]","caused_by":{"type":"x_content_parse_exception","reason":"[1:942] [bool] failed to parse field [should]","caused_by":{"type":"x_content_parse_exception","reason":"[1:942] [bool] failed to parse field [should]","caused_by":{"type":"parsing_exception","reason":"[term] query does not support [case_insensitive]","line":1,"col":942}}}},"status":400}
    How can we fix this? Thanks a lot! PS: Elasticsearch version 7.9.3, Helm chart version datahub-0.2.161.
    b
    a
    • 3
    • 23
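    A likely cause worth checking: the `case_insensitive` option on `term` queries only exists in Elasticsearch 7.10 and later, so a 7.9.3 cluster rejects the queries that DataHub 0.10.x generates. A minimal sketch to confirm the version the cluster is actually running, reusing the host from the log above:
    ```bash
    # Print the Elasticsearch version the cluster reports; anything below 7.10
    # will not understand the "case_insensitive" term-query option.
    curl -s http://123.123.123.123:9200 | grep '"number"'
    ```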
  • b

    bland-orange-13353

    04/05/2023, 5:58 PM
    This message was deleted.
    ✅ 1
    l
    • 2
    • 1
  • s

    shy-alarm-27631

    04/05/2023, 6:47 PM
    When I try running
    ./gradlew quickstart
    I get the following error:
    Copy code
    ModuleNotFoundError: No module named 'confluent_kafka'
    I'm not sure why this is happening, since importing it in a Python shell works fine. Can I get some help with this?
    πŸ” 1
    πŸ“– 1
    βœ… 1
    l
    a
    • 3
    • 6
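    A common cause of this symptom is that the Gradle build resolves a different Python interpreter (often its own virtualenv) than the shell where the import succeeds. A minimal sketch for checking both, assuming the build's virtualenv lives at metadata-ingestion/venv (that path is an assumption, not confirmed in this thread):
    ```bash
    # Interpreter and package visible to your interactive shell.
    which python3
    python3 -c "import confluent_kafka; print(confluent_kafka.version())"

    # If the build uses its own virtualenv (assumed location), install the package there too.
    source metadata-ingestion/venv/bin/activate
    pip install confluent-kafka
    ```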
  • c

    cuddly-beach-83988

    04/05/2023, 7:14 PM
    EDIT - Found a solution, detailed write-up in the thread :) Hi, good afternoon. We are looking into utilizing DataHub for our org, and I am trying to put together a demo using our own data and integrations. I have followed the quickstart guide to spin up a dockerized version of DataHub; however, when trying to set up a source for Microsoft SQL Server using ODBC, it requires both the pyodbc module and the physical driver. I have connected to the docker image
    acryldata/datahub-actions:head
    found the mssql-venv, and physically installed pyodbc, but I still need the driver file, and unfortunately the local
    datahub
    user in the container does not have sudo permissions, so I can't just install it via apt-get. sigh Can anyone please point me in the right direction?
    📖 1
    🔍 1
    💯 1
    ✅ 1
    l
    a
    +3
    • 6
    • 7
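    The write-up mentioned above lives in the thread and is not reproduced here. As a generic workaround for a container whose default user lacks sudo, you can exec in as root (or bake a derived image) and install the ODBC pieces there. A rough sketch, assuming a Debian-based image and a container named datahub-actions (both assumptions):
    ```bash
    # Open a root shell in the running actions container, bypassing the sudo-less "datahub" user.
    docker exec -u root -it datahub-actions bash

    # Then, inside that shell: install unixODBC plus Microsoft's ODBC driver
    # (msodbcsql17 comes from Microsoft's apt repository, which has to be added first).
    apt-get update && apt-get install -y curl gnupg unixodbc-dev
    curl -fsSL https://packages.microsoft.com/keys/microsoft.asc | apt-key add -
    curl -fsSL https://packages.microsoft.com/config/debian/11/prod.list > /etc/apt/sources.list.d/mssql-release.list
    apt-get update && ACCEPT_EULA=Y apt-get install -y msodbcsql17
    ```
    Changes made via docker exec disappear when the container is recreated, so a small derived image built FROM acryldata/datahub-actions:head is the more durable variant of the same idea.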
  • b

    bumpy-musician-39948

    04/06/2023, 3:06 AM
    How can I use datahub docker quickstart with mirrors in China?
    📖 1
    🔍 1
    ✅ 1
    l
    b
    • 3
    • 3
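    One generic approach (not DataHub-specific, and the mirror URL below is only a placeholder) is to point the Docker daemon at a registry mirror that is reachable from inside China before rerunning the quickstart:
    ```bash
    # "registry-mirrors" is a standard dockerd option; substitute a mirror you can actually reach.
    echo '{ "registry-mirrors": ["https://your-mirror.example.com"] }' | sudo tee /etc/docker/daemon.json
    sudo systemctl restart docker
    python3 -m datahub docker quickstart
    ```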
  • b

    bumpy-musician-39948

    04/06/2023, 3:06 AM
    read tcp 192.168.100.60:53804->104.18.125.25:443: read: connection timed out . Unable to run quickstart - the following issues were detected: - quickstart.sh or dev.sh is not running
  • b

    bland-orange-13353

    04/06/2023, 6:28 AM
    This message was deleted.
    ✅ 1
    l
    • 2
    • 1
  • b

    bland-orange-13353

    04/06/2023, 7:36 AM
    This message was deleted.
    📖 1
    ✅ 1
    🔍 1
    l
    • 2
    • 1
  • i

    icy-dentist-82336

    04/06/2023, 8:13 AM
    I have a question: is it not possible to deactivate the 'readers view all assets' policy on the Manage Permissions page in DataHub?
    📖 1
    🔍 1
    l
    a
    • 3
    • 2
  • r

    rich-dusk-60426

    04/06/2023, 9:03 AM
    Hi Team, I was following this document to deploy DataHub, but I am getting the following errors with the Kafka pods. Can someone please help me with this?
    l
    a
    • 3
    • 2
  • w

    wide-afternoon-79955

    04/06/2023, 9:47 AM
    Hi All. Problem: DataHub searches are getting very slow. Current setup: we are using DataHub in prod, with around 10k datasets loaded and 27 active users. The DataHub pods run with 1 CPU and 4 GB of memory, and I am seeing memory usage of 90%, so we will have to scale up the number of GMS pods along with giving them more memory. Has anyone here faced this search slowness, identified which components to scale, and by how much they should be scaled?
    ✅ 1
    l
    a
    +2
    • 5
    • 13
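    For reference, with the DataHub Helm chart the GMS replica count and resources are usually adjusted through chart values. The exact value paths below follow the chart's subchart layout but should be verified against your values.yaml (treat them as assumptions):
    ```bash
    # Bump GMS replicas and memory via Helm values; verify the value names against the chart first.
    helm upgrade datahub datahub/datahub \
      --set datahub-gms.replicaCount=2 \
      --set datahub-gms.resources.requests.memory=6Gi \
      --set datahub-gms.resources.limits.memory=8Gi
    ```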
  • f

    fancy-crayon-39356

    04/06/2023, 11:00 AM
    Hello guys! 😄 Hope you are well. I have a question about GraphQL searches, specifically about filtering datasets by URN. I'm using the following query to filter datasets by a list of URNs:
    Copy code
    {
      search(
        input: {
          type: DATASET, 
          query: "*", 
          start: 0, 
          count: 100,
          orFilters: {
            and: [
              {
                field: "urn",
                values: ["urn:li:dataset:(...)", "urn:li:dataset:(...)"],
                condition: EQUAL,
              },
            ]
          }
        }
      ) {
        start
        count
        total
        searchResults {
          entity {
            ... on Dataset {
              urn
              name
              platform {
                urn
              }
              dataPlatformInstance {
                urn
              }
              domain {
                domain {
                  urn
                }
              }
              properties {
                description
              }
              ownership {
                owners {
                  associatedUrn
                }
              }
              tags {
                tags {
                  associatedUrn
                }
              }
              glossaryTerms {
                terms {
                  associatedUrn
                }
              }
              schemaMetadata {
                fields {
                  description
                }
              }
            }
          }
        }
      }
    }
    The problem is that it always returns empty results. The same does not happen when I filter by tags, platform, etc.; everything works except filtering by URN. Am I doing something wrong here? My intention is to fetch information about ALL of the datasets we have; however, that is not possible due to the ES limitation on queries returning more than 10k results (https://github.com/datahub-project/datahub/issues/4575), so my plan is to build batches of URNs and filter on one batch at a time. Appreciate your help! 😃
    📖 1
    🔍 1
    ✅ 1
    l
    a
    +7
    • 10
    • 47
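    For what it's worth, the same batched-URN query can also be driven from the command line against the /api/graphql endpoint, which makes it easier to test variations of the filter. A minimal sketch, assuming a local quickstart (frontend on port 9002) and a personal access token in $DATAHUB_TOKEN:
    ```bash
    # POST one batch of URNs as a GraphQL variable; swap in real dataset URNs.
    curl -s http://localhost:9002/api/graphql \
      -H "Authorization: Bearer $DATAHUB_TOKEN" \
      -H "Content-Type: application/json" \
      -d '{
        "query": "query byUrn($urns: [String!]!) { search(input: {type: DATASET, query: \"*\", start: 0, count: 100, orFilters: [{and: [{field: \"urn\", values: $urns, condition: EQUAL}]}]}) { total searchResults { entity { urn } } } }",
        "variables": { "urns": ["urn:li:dataset:(...)", "urn:li:dataset:(...)"] }
      }'
    ```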
  • a

    astonishing-animal-7168

    04/06/2023, 12:49 PM
    Hi, I've noticed that this GitHub action has failed: https://github.com/acryldata/datahub-helm/actions/runs/4622804511/jobs/8175947119 This means the latest versions of the Helm charts are unavailable. Would it be possible to retry the job?
    l
    a
    • 3
    • 3
  • b

    bland-orange-13353

    04/06/2023, 1:39 PM
    This message was deleted.
    📖 1
    ✅ 1
    🔍 1
    l
    • 2
    • 1
  • n

    numerous-eve-42142

    04/06/2023, 6:32 PM
    Hi team! After upgrading Airflow from v2.2.4 to v2.5.1, I'm getting this error:
    Failed to execute job 3031 for task pipeline_ingest (No module named 'great_expectations.datasource.sqlalchemy_datasource'; 2234)
    even when using
    Copy code
    pip install 'acryl-datahub[great-expectations]'==0.9.3
    (this is the DataHub version I'm running right now). PS: I'm not using DataHubValidationAction, but without
    Copy code
    from datahub.integrations.great_expectations.action import DataHubValidationAction
    it does not work. I'm just profiling tables. Should simply upgrading the platform and dependencies make this work?
    l
    a
    d
    • 4
    • 4
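    The missing great_expectations.datasource.sqlalchemy_datasource module points at a Great Expectations release newer than what the 0.9.3 integration was written against; the exact compatible range is not confirmed here, so the upper bound below is an assumption to check against the acryl-datahub release notes:
    ```bash
    # See which Great Expectations version the Airflow workers actually resolve.
    pip show great-expectations

    # If it is too new for the plugin, pin it alongside the integration
    # (the "<0.16" bound is an assumption -- verify before applying).
    pip install 'acryl-datahub[great-expectations]==0.9.3' 'great-expectations<0.16'
    ```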
  • c

    cuddly-butcher-39945

    04/06/2023, 7:57 PM
    Hey Team, I am now getting the following error when kicking off the following command:
    ./gradlew quickstartDebug --stacktrace -x yarnTest -x yarnLint
    error: Error response from daemon: image with reference linkedin/datahub-kafka-setup:debug was found but does not match the specified platform: wanted linux/amd64, actual: linux/arm64/v8
    Here is my environment:
    Apple M1
    openjdk 11.0.18 2023-01-17 LTS
    OpenJDK Runtime Environment Zulu11.62+17-CA (build 11.0.18+10-LTS)
    OpenJDK 64-Bit Server VM Zulu11.62+17-CA (build 11.0.18+10-LTS, mixed mode)
    Java Compiler: javac 11.0.18
    joshuagarza@Joshuas-Mini datahub % ./gradlew --version
    ------------------------------------------------------------
    Gradle 6.9.2
    ------------------------------------------------------------
    Build time: 2021-12-21 20:18:38 UTC
    Revision: 5d94aa68c0fdbe443838bb977080e3b9f273e889
    Kotlin: 1.4.20
    Groovy: 2.5.12
    Ant: Apache Ant(TM) version 1.10.9 compiled on September 27 2020
    JVM: 11.0.18 (Azul Systems, Inc. 11.0.18+10-LTS)
    OS: Mac OS X 13.2.1 x86_64
    Thanks for any help on this!
    πŸ” 1
    πŸ“– 1
    l
    l
    a
    • 4
    • 5
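    Two generic Docker-level workarounds for the amd64-vs-arm64 mismatch on Apple Silicon (neither is DataHub-specific advice): remove the locally built image that carries the conflicting architecture, and/or force a default platform before rebuilding. Note that emulated amd64 builds are noticeably slower on M1.
    ```bash
    # Drop the image whose architecture conflicts with what the compose file requests.
    docker rmi linkedin/datahub-kafka-setup:debug

    # Optionally force amd64 for subsequent builds/runs (runs under emulation on M1).
    export DOCKER_DEFAULT_PLATFORM=linux/amd64
    ./gradlew quickstartDebug --stacktrace -x yarnTest -x yarnLint
    ```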
  • a

    astonishing-australia-72492

    04/06/2023, 8:45 PM
    Hey all. I am just getting started installing DataHub on AWS via Kubernetes. I deployed the cluster with no errors. Now I've installed the prerequisites via 'helm install prerequisites datahub/datahub-prerequisites', but when I check the status of the pods, the only pod that will start is the schema registry, and it keeps restarting in a loop. I checked the logs of the schema registry pod, but they aren't helpful; they just say:
    Defaulted container "prometheus-jmx-exporter" out of: prometheus-jmx-exporter, cp-schema-registry-server
    VM settings:
    Max. Heap Size (Estimated): 6.71G
    Ergonomics Machine Class: server
    Using VM: OpenJDK 64-Bit Server VM
    📖 1
    🔍 1
    l
    a
    s
    • 4
    • 6
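    The "Defaulted container" line means kubectl showed the JMX-exporter sidecar rather than the schema registry itself, so the real crash reason is probably in the other container or in the pod events. A generic way to pull both (the pod name is a placeholder):
    ```bash
    # Logs from the schema-registry container itself, including the previous crashed run.
    kubectl logs <schema-registry-pod> -c cp-schema-registry-server --previous

    # Pod events often explain restart loops (failed probes, OOMKilled, image pulls, ...).
    kubectl describe pod <schema-registry-pod>
    ```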
  • w

    white-knife-12883

    04/06/2023, 8:49 PM
    I'm having trouble with the latest helm chart
    Copy code
    helm --debug pull datahub --repo $'https://helm.datahubproject.io' --version 0.2.162 --destination __downloads --untar
    Error: chart "datahub" version "0.2.162" not found in https://helm.datahubproject.io repository
    But I see that version right over: https://github.com/acryldata/datahub-helm/releases/tag/datahub-0.2.162
    πŸ” 1
    πŸ“– 1
    l
    a
    • 3
    • 3
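    Chart releases can lag behind the GitHub tag, and a locally cached repo index can also be stale, so it's worth refreshing the index and listing what the repo actually serves before concluding that the version is missing:
    ```bash
    # Refresh the local index for the DataHub chart repo and list the versions it serves.
    helm repo add datahub https://helm.datahubproject.io
    helm repo update
    helm search repo datahub/datahub --versions | head
    ```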
  • n

    numerous-byte-87938

    04/06/2023, 10:08 PM
    Hopefully a quick question. I saw we recently updated ES from 7.9 to 7.10 in v0.10.0, which introduced some issues (thread 1 and thread 2) for ES clusters still on 7.9. Is there any way to get around that, and would upgrading the ES version directly to 7.16 cause more issues than it fixes? Thanks! 🐻
    l
    a
    o
    • 4
    • 4
  • a

    astonishing-knife-25309

    04/07/2023, 3:23 AM
    hello! hopefully someone can help. I am trying to run through the quickstart and am getting an error when trying to run
    Copy code
    python3 -m datahub docker quickstart
    the error that I am getting is this on a loop
    Copy code
    C:\Windows\System32>python3 -m datahub docker quickstart
    [2023-04-06 22:19:52,724] INFO     {datahub.cli.quickstart_versioning:144} - Saved quickstart config to C:\Users\baelf/.datahub/quickstart/quickstart_version_mapping.yaml.
    [2023-04-06 22:19:52,725] INFO     {datahub.cli.docker_cli:638} - Using quickstart plan: composefile_git_ref='master' docker_tag='head'
    [2023-04-06 22:19:52,735] INFO     {datahub.cli.docker_cli:656} - compose file name C:\Users\baelf\.datahub\quickstart/docker-compose.yml
    [2023-04-06 22:19:52,740] INFO     {datahub.cli.docker_cli:840} - Fetching docker-compose file https://raw.githubusercontent.com/datahub-project/datahub/master/docker/quickstart/docker-compose-without-neo4j.quickstart.yml from GitHub
    
    Pulling docker images...
    This may take a while depending on your network bandwidth.
    time="2023-04-06T22:19:53-05:00" level=warning msg="The \"HOME\" variable is not set. Defaulting to a blank string."
    time="2023-04-06T22:19:53-05:00" level=warning msg="The \"HOME\" variable is not set. Defaulting to a blank string."
    Error response from daemon: readlink /var/lib/docker/overlay2: invalid argument
    Error while pulling images. Going to attempt to move on to docker compose up assuming the images have been built locally
    
    Starting up DataHub...
    .time="2023-04-06T22:19:55-05:00" level=warning msg="The \"HOME\" variable is not set. Defaulting to a blank string."
    time="2023-04-06T22:19:55-05:00" level=warning msg="The \"HOME\" variable is not set. Defaulting to a blank string."
    Error response from daemon: readlink /var/lib/docker/overlay2: invalid argument
    πŸ” 1
    πŸ“– 1
    l
    a
    • 3
    • 2
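    The "readlink /var/lib/docker/overlay2: invalid argument" errors come from Docker's storage layer rather than from DataHub. On Docker Desktop for Windows this is often cleared by restarting the WSL2 backend or pruning corrupted image data; the commands below are generic Docker maintenance (run from a Windows terminal), and the prune removes unused images, so review before running:
    ```bash
    # Restart the Docker Desktop / WSL2 backend, then start Docker Desktop again.
    wsl --shutdown

    # Remove unused images and layers (destructive for anything not in use), then retry.
    docker system prune -a
    python3 -m datahub docker quickstart
    ```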
  • m

    microscopic-room-90690

    04/07/2023, 6:27 AM
    Hello team, I want to remove a schema from DataHub and get this error:
    Failed to execute operation
    java.lang.UnsupportedOperationException: Only upsert operation is supported
    I found that others have also hit this issue but did not find a workaround. Can anyone help?
    📖 1
    🔍 1
    l
    a
    a
    • 4
    • 7
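    If the goal is simply to remove the dataset's metadata, the DataHub CLI's delete command is usually an easier path than issuing the change through the API directly (the URN below is a placeholder):
    ```bash
    # Soft delete hides the entity but keeps its rows; --hard purges it from the backends.
    datahub delete --urn "urn:li:dataset:(urn:li:dataPlatform:mysql,my_db.my_table,PROD)" --soft
    datahub delete --urn "urn:li:dataset:(urn:li:dataPlatform:mysql,my_db.my_table,PROD)" --hard
    ```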
  • b

    brave-room-48783

    04/07/2023, 9:15 AM
    👋 Hello, team! I'm getting what appears to be an infinite run, with the message -
    Copy code
    yaml: unmarshal errors:
     line 123: mapping key "labels" already defined at line 98
     line 157: mapping key "labels" already defined at line 146
     line 172: mapping key "labels" already defined at line 160
     line 203: mapping key "labels" already defined at line 190
    After running the command -
    python3 -m datahub docker quickstart
    Versions I am running -
    DataHub CLI version: 0.10.1.1
    Python version: 3.9.6 (default, Mar 10 2023, 20:16:38)
    [Clang 14.0.3 (clang-1403.0.22.14.1)]
    Deployment method - Docker
    πŸ” 1
    πŸ“– 1
    βœ… 1
    l
    a
    +3
    • 6
    • 10
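    The duplicate "labels" keys live in the quickstart compose file itself, which the CLI caches between runs (the cache path appears in the quickstart log earlier on this page). A generic way to rule out a stale or mangled cached copy is to refresh both the CLI and the cached file:
    ```bash
    # Upgrade the CLI, drop the cached compose file, and let quickstart fetch a fresh one.
    pip install --upgrade acryl-datahub
    rm -f ~/.datahub/quickstart/docker-compose.yml
    python3 -m datahub docker quickstart
    ```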
  • w

    wide-ghost-47822

    04/07/2023, 10:35 AM
    Hi, I am trying to integrate DataHub with Great Expectations. I have a script which does some validation and sends its results to DataHub, but I am having issues with the integration. According to the output printed by my script,
    action_list
    contains the DataHub config. I also inspected the traffic on port 8080 (the GMS port), and I can see that every request is answered with HTTP status code 200. You can see some examples of the output:
    Copy code
    {"proposal": {"entityType": "assertion", "entityUrn": "urn:li:assertion:81f38ff71a20153a0291b5f087e75f55", "changeType": "UPSERT", "aspectName": "assertionInfo", "aspect": {"value": "{\"customProperties\": {\"expectation_suite_name\": \"test_suite_3\"}, \"type\": \"DATASET\", \"datasetAssertion\": {\"dataset\": \"urn:li:dataset:(urn:li:dataPlatform:mysql,<my_table>)\", \"scope\": \"DATASET_COLUMN\", \"fields\": [\"urn:li:schemaField:(urn:li:dataset:(urn:li:dataPlatform:mysql,london.logging_bid,PROD),id)\"], \"aggregation\": \"IDENTITY\", \"operator\": \"NOT_NULL\", \"nativeType\": \"expect_column_values_to_not_be_null\", \"nativeParameters\": {\"column\": \"id\", \"mostly\": \"1.0\"}}}", "contentType": "application/json"}}}
    
    HTTP/1.1 200 OK.
    Date: Fri, 07 Apr 2023 07:39:55 GMT.
    Content-Type: application/json.
    X-RestLi-Protocol-Version: 2.0.0.
    Content-Length: 61.
    Server: Jetty(9.4.46.v20220331).
    .
    {"value":"urn:li:assertion:81f38ff71a20153a0291b5f087e75f55"}
    
    {"proposal": {"entityType": "assertion", "entityUrn": "urn:li:assertion:81f38ff71a20153a0291b5f087e75f55", "changeType": "UPSERT", "aspectName": "dataPlatformInstance", "aspect": {"value": "{\"platform\": \"urn:li:dataPlatform:great-expectations\"}", "contentType": "application/json"}}}
    
    POST /aspects?action=ingestProposal HTTP/1.1.
    Host: <my-host>:8080.
    User-Agent: python-requests/2.26.0.
    Accept-Encoding: gzip, deflate.
    Accept: */*.
    Connection: keep-alive.
    X-RestLi-Protocol-Version: 2.0.0.
    Content-Type: application/json.
    Content-Length: 287.
    What I expect now is to see the expectation results in the DataHub UI, in the tab called
    Validation
    . Yet it looks disabled. Maybe you can guide me on what the reason for that could be.
    📖 1
    🔍 1
    l
    a
    • 3
    • 8
  • a

    average-alligator-6750

    04/07/2023, 12:27 PM
    Hello, I'm receiving 403 errors from Elasticsearch in the datahub-elasticsearch-setup-job. Are there details or a template anywhere for the policy I should apply to the datahub user? I granted all rights on a prefix, but it doesn't seem to work.
    👍 1
    l
    a
    +2
    • 5
    • 5
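    For reference, the setup job needs more than per-document index access: it also creates index templates and ILM/ISM policies. A rough role sketch for a security-enabled Elasticsearch, with index patterns taken from the index listing further down this page; the exact privilege set is an assumption to be tightened for your cluster:
    ```bash
    # Grant the datahub user a role covering its index patterns plus template/ILM management.
    curl -u elastic -X PUT 'https://<es-host>:9200/_security/role/datahub_role' \
      -H 'Content-Type: application/json' -d '{
        "cluster": ["monitor", "manage_index_templates", "manage_ilm"],
        "indices": [
          {
            "names": ["*index_v2*", "*aspect_v1", "graph_service_v1", "system_metadata_service_v1", "datahub_usage_event*"],
            "privileges": ["all"]
          }
        ]
      }'
    ```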
  • a

    astonishing-printer-13992

    04/07/2023, 1:11 PM
    Hi all! I'm trying to test the Great Expectations tool for data quality. I observed the following error: ('Unable to emit metadata to DataHub GMS', {'message': '404 Client Error: Not Found for url: https://<datahub-dns-name>/aspects?action=ingestProposal'}). Do you have any advice for this error?
    ✅ 1
    l
    • 2
    • 2
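    A 404 on /aspects often means the configured server URL points at something other than GMS (for example the frontend) or at the wrong path prefix. GMS exposes a /config endpoint that makes this quick to check (the hostname is a placeholder):
    ```bash
    # If this URL really is GMS, /config returns a small JSON document; a 404 here
    # means the emitter's server URL needs to point somewhere else.
    curl -s -w '\nHTTP %{http_code}\n' 'https://<datahub-dns-name>/config'
    ```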
  • i

    important-night-50346

    04/08/2023, 1:25 AM
    Hi. I'm running an update from 0.9.5 to 0.10.1 (deployed in EKS with Helm). It looks like the reindex job does not close the ELK indices. I'm trying to complete the update in DEV, and this reindex is never ending because there is always a mismatch in document counts due to the open indices. It ended up spawning 1000 shards on the cluster, and I gave up on restoring the indices from a snapshot. Please find more details below (this is a live log; the update job is running, but the indices are open). Am I doing something wrong? Job datahub-datahub-system-update-job:
    Copy code
    2023-04-08 01:15:49.352  INFO 1 --- [           main] c.l.m.s.e.indexbuilder.ESIndexBuilder    : Created index dataprocessinstanceindex_v2_1680916549224
    2023-04-08 01:16:49.405  INFO 1 --- [           main] c.l.m.s.e.indexbuilder.ESIndexBuilder    : Task: yLvZIFxaQ8Km_Dg8_0vdHg:464294584 - Reindexing from dataprocessinstanceindex_v2 to dataprocessinstanceindex_v2_1680916549224 in progress...
    2023-04-08 01:17:49.441  WARN 1 --- [           main] c.l.m.s.e.indexbuilder.ESIndexBuilder    : Task: yLvZIFxaQ8Km_Dg8_0vdHg:464294584 - Document counts do not match 3036994 != 79135. Complete: 2.6057014%
    2023-04-08 01:17:50.441  INFO 1 --- [           main] c.l.m.s.e.indexbuilder.ESIndexBuilder    : Task: yLvZIFxaQ8Km_Dg8_0vdHg:464294584 - Reindexing from dataprocessinstanceindex_v2 to dataprocessinstanceindex_v2_1680916549224 in progress...
    2023-04-08 01:18:50.489  WARN 1 --- [           main] c.l.m.s.e.indexbuilder.ESIndexBuilder    : Task: yLvZIFxaQ8Km_Dg8_0vdHg:464294584 - Document counts do not match 3037042 != 123981. Complete: 4.0822945%
    2023-04-08 01:18:52.492  INFO 1 --- [           main] c.l.m.s.e.indexbuilder.ESIndexBuilder    : Task: yLvZIFxaQ8Km_Dg8_0vdHg:464294584 - Reindexing from dataprocessinstanceindex_v2 to dataprocessinstanceindex_v2_1680916549224 in progress...
    indices:
    Copy code
    $ curl -XGET <https://redacted:443/_cat/indices?s=index:asc>
    green  open .kibana_1                                                _v7DnFkWTfiNG9bsgVBmbg 1 0       4     3   22.9kb   22.9kb
    yellow open .opendistro-job-scheduler-lock                           gXdKjsz8TmyVxQOZO7JcVA 5 1      58   896    405kb    405kb
    green  open .opensearch-observability                                ne6d_vqlS7azmrr7rs8MmA 1 0       0     0     208b     208b
    green  open .tasks                                                   JmDYsKzQSByAOe9_4FnXdw 1 0       2     0   13.8kb   13.8kb
    yellow open assertion_assertionruneventaspect_v1                     E4Oy5K5LSMuw9sCoBdG3wg 1 1       0     0     208b     208b
    yellow open assertionindex_v2_1680916341451                          MeSGfdE2TDCT3P8NzLJVow 1 1       0     0     208b     208b
    yellow open assertionindex_v2_clone_1680916209740                    SZ5x1BmoRb-XQog-ldzVLg 1 1       0     0     208b     208b
    yellow open chart_chartusagestatisticsaspect_v1                      6sK_wp9qTf6YfbTNHU3cvA 1 1       0     0     208b     208b
    yellow open chartindex_v2_1680916508743                              FX-Q3t6VQgel_29jwYQwcA 1 1       0     0     208b     208b
    yellow open chartindex_v2_clone_1680916215081                        EPWNO6eORKmlyvvU4LdzpA 1 1       0     0     208b     208b
    yellow open containerindex_v2_1680916217544                          GdHa6mUWQJCnNchf8kc5pg 1 1     295     0  445.4kb  445.4kb
    yellow open containerindex_v2_clone_1680916205327                    aZ9Y6FpuTLq-C0nHUPQLgw 1 1     295     1  240.2kb  240.2kb
    yellow open corpgroupindex_v2_1680916298952                          c0k6PZlLThGxolcV2jcCuw 1 1    9039     0   33.5mb   33.5mb
    yellow open corpgroupindex_v2_clone_1680916207464                    lOwXCdCrQba0R52yqQ1UoQ 1 1    9039     3      7mb      7mb
    yellow open corpuserindex_v2_1680916385514                           J7Pm9vamT7ChRHZH58-szQ 1 1   54570     0  108.2mb  108.2mb
    yellow open corpuserindex_v2_clone_1680916212508                     sbBpUSPfTPOXGhlIgwSXbA 1 1   54570     1   30.1mb   30.1mb
    yellow open dashboard_dashboardusagestatisticsaspect_v1              j4zayejpSwSxbwa_zT7umw 1 1       0     0     208b     208b
    yellow open dashboardindex_v2_1680916383282                          IUviBU6qSz2p-ixrJ3_a5Q 1 1       0     0     208b     208b
    yellow open dashboardindex_v2_clone_1680916211075                    6fPmGrUHQKmD3IVaGaNjOw 1 1       0     0     208b     208b
    yellow open dataflowindex_v2_1680916447029                           moxh5o89Ruugr0Jp4snkpA 1 1     351     0  880.8kb  880.8kb
    yellow open dataflowindex_v2_clone_1680916213195                     lU3789SoR5q6El2ILp6vew 1 1     351     8  469.6kb  469.6kb
    yellow open datahubaccesstokenindex_v2_1680916258241                 icRmUSt-TXqbUj1DS6dDEg 1 1       2     0   16.3kb   16.3kb
    yellow open datahubaccesstokenindex_v2_clone_1680916206271           6j4VQAnMQ0CVwGoJJYWT1g 1 1       2     0   14.8kb   14.8kb
    yellow open datahubexecutionrequestindex_v2_1680916320699            soLegwj1Q4Oi4YeDgcWfVQ 1 1       0     0     208b     208b
    yellow open datahubexecutionrequestindex_v2_clone_1680916209016      TPcHVGCORfyxspzgJwlbzw 1 1       0     0     208b     208b
    yellow open datahubingestionsourceindex_v2_1680916487702             VSCr-ptJQhOz8at3xv7Hyg 1 1       0     0     208b     208b
    yellow open datahubingestionsourceindex_v2_clone_1680916213912       gweGm5jMTIqpVDCihNV3Yg 1 1       0     0     208b     208b
    yellow open datahubpolicyindex_v2_1680916238023                      BuJtg-s_Sjq-1fp5wAhxRA 1 1      15     0     43kb     43kb
    yellow open datahubpolicyindex_v2_clone_1680916205715                8ms_cGW_R8KYrUrcaA1UkQ 1 1      15     2   50.3kb   50.3kb
    yellow open datahubretentionindex_v2_1680916385221                   -g4aYQilSOiRwLWnJOyhEA 1 1       0     0     208b     208b
    yellow open datahubretentionindex_v2_clone_1680916212282             qkmMdoB_T8OojE-KEXwHwQ 1 1       0     0     208b     208b
    yellow open datahubroleindex_v2_1680916278714                        wPxRPENxQdS46Y-L_eW2AQ 1 1       3     0   11.6kb   11.6kb
    yellow open datahubroleindex_v2_clone_1680916207180                  3F9I5cySTtaJQcRaGbRdyA 1 1       3     0    7.7kb    7.7kb
    yellow open datajob_datahubingestioncheckpointaspect_v1              CHuumlxhTqW2BJbiDk5xhQ 1 1     356     0    6.2mb    6.2mb
    yellow open datajob_datahubingestionrunsummaryaspect_v1              pzAyIz42Rbif9dZOnOXmCQ 1 1       0     0     208b     208b
    yellow open datajobindex_v2_1680916321205                            cmO1uwNJR3O967kXcD_67Q 1 1    3547    26    6.3mb    6.3mb
    yellow open datajobindex_v2_clone_1680916209481                      j-V2kltTQw-a1Cbssg_Ztw 1 1    3547   469    3.5mb    3.5mb
    yellow open dataprocessinstance_dataprocessinstanceruneventaspect_v1 iT2WND21QaWcaBm6DDOA5Q 1 1 6076340     0      1gb      1gb
    yellow open dataprocessinstanceindex_v2                              MuN9IKTLQgS-YvD2YiheoQ 1 1 3037068 18854    2.5gb    2.5gb
    yellow open dataprocessinstanceindex_v2_1680916549224                6jd2Ykf5RzG3tjSGijOyAA 1 1  176529     0  296.5mb  296.5mb
    yellow open dataprocessinstanceindex_v2_clone_1680916215556          ZAKuf8iwT7m8t8lysazAyQ 1 1 3036957 19447    2.5gb    2.5gb
    yellow open dataset_datasetprofileaspect_v1                          pzQ-y-U0TO-ZR70ZNIJ6Pg 1 1       0     0     208b     208b
    yellow open dataset_datasetusagestatisticsaspect_v1                  UWg2cP78T6uVPNr6ThRloQ 1 1  124827  7465  136.8mb  136.8mb
    yellow open dataset_operationaspect_v1                               S5Dw0GKlTbGECeGcKqvzlQ 1 1 6758731  1416 1018.3mb 1018.3mb
    yellow open datasetindex_v2_1680916508969                            S-F_HiBgTXaqQAi8mfK67Q 1 1    6560     0   22.4mb   22.4mb
    yellow open datasetindex_v2_clone_1680916215296                      nrzWuXqjSoeN0HCMZ5RFgw 1 1    6560   100   17.1mb   17.1mb
    yellow open domainindex_v2_1680916487907                             4izOUkg8TeiGOvp7ggb8Gg 1 1      20     0   30.8kb   30.8kb
    yellow open domainindex_v2_clone_1680916214148                       Fj5-lqmZRn62u4y879PAwQ 1 1      20     8   46.3kb   46.3kb
    yellow open globalsettingsindex_v2_1680916319428                     7svGEVJmT6-n5XvGP-UfOw 1 1       0     0     208b     208b
    yellow open globalsettingsindex_v2_clone_1680916207901               -SFPy2AGRMyTBIFY7DI1wQ 1 1       0     0     208b     208b
    yellow open glossarynodeindex_v2_1680916467488                       O-4-C2RHRCCgz9EBYiIaDA 1 1       1     0   38.4kb   38.4kb
    yellow open glossarynodeindex_v2_clone_1680916213672                 zx8Yb_ZJSQGzPpW3uzwKLw 1 1       1     0   32.6kb   32.6kb
    yellow open glossarytermindex_v2_1680916362427                       D2bBNT6FT9-2IrSNssfw0w 1 1       1     0   40.4kb   40.4kb
    yellow open glossarytermindex_v2_clone_1680916210607                 VrgvdzxvTtmktHCTGq1_6A 1 1       1     0   34.7kb   34.7kb
    yellow open graph_service_v1                                         W_Bcz6tuTwW6nH2NWSZ1uQ 1 1 3079242 15035  604.1mb  604.1mb
    yellow open system_metadata_service_v1                               -ljiiLMTSeidyP0yqTZ7Bw 1 1 9408521 73010    1.1gb    1.1gb
    yellow open tagindex_v2_1680916341913                                raqGV84UQAqJ0P3GdkLYtg 1 1       5     0   25.7kb   25.7kb
    yellow open tagindex_v2_clone_1680916210179                          LdLr2zOWQzy8LnlClzlqtw 1 1       5     0   12.3kb   12.3kb
    yellow open telemetryindex_v2_1680916508365                          6Nh947jPRLW4RqQcuk_JlA 1 1       0     0     208b     208b
    yellow open telemetryindex_v2_clone_1680916214654                    yjhPnNPNQuOFjUUBz9jbSA 1 1       0     0     208b     208b
    yellow open testindex_v2_1680916445849                               BP18OC6lTZezTEjNPQm68Q 1 1       0     0     208b     208b
    yellow open testindex_v2_clone_1680916212733                         UY6leBs9QDur6ENpOwg1Rg 1 1       0     0     208b     208b
    📖 1
    🔍 1
    l
    a
    +2
    • 5
    • 6
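    The log shows the source index still growing between progress checks, so one generic way to let the counts converge is to make sure nothing keeps writing to Elasticsearch while the system-update job runs, and to watch the reindex task directly. The deployment names below follow the chart's defaults and are assumptions:
    ```bash
    # Pause the components that write to the indices while the upgrade job runs.
    kubectl scale deployment datahub-datahub-gms --replicas=0
    kubectl scale deployment datahub-acryl-datahub-actions --replicas=0

    # Follow the reindex task reported in the job log instead of comparing raw counts.
    curl -s 'https://<es-host>:443/_tasks/yLvZIFxaQ8Km_Dg8_0vdHg:464294584?pretty'
    ```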
  • m

    most-room-32003

    04/09/2023, 1:52 AM
    reposting this question from several months ago, is externalUrl for assertions live?
    πŸ” 1
    βœ… 1
    πŸ“– 1
    l
    a
    • 3
    • 2
  • n

    nice-helmet-40615

    04/09/2023, 5:43 PM
    Hey guys, I'm trying to use the OpenAPI ingestion feature, but I can only get it to work for the example pointed out here in Slack:
    Copy code
    source:
      type: openapi
      config:
        name: petstore
        url: https://petstore.swagger.io/
        swagger_file: v2/swagger.json
    Ingestion from DataHub's own OpenAPI endpoint runs with no errors, but nothing is ingested: https://datahubproject.io/docs/api/openapi/openapi-usage-guide/
    Copy code
    source:
      type: openapi
      config:
        name: datahub
        url: http://localhost:8080/
        swagger_file: openapi/v3/api-docs
    The same behavior occurs for the OpenAPI endpoint examples from here: https://developer.imis.com/docs/imis-rest-api-data-models-and-swagger-json-files Did I do something wrong, or is this an ingestion bug that should be reported?
    πŸ” 1
    πŸ“– 1
    🩺 1
    l
    a
    • 3
    • 3
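    A quick sanity check before filing a bug is to confirm that the recipe's url and swagger_file (which the source joins into one address) really point at a reachable spec document:
    ```bash
    # Both of these should return the beginning of an OpenAPI/Swagger JSON document.
    curl -s http://localhost:8080/openapi/v3/api-docs | head -c 400
    curl -s https://petstore.swagger.io/v2/swagger.json | head -c 400
    ```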
  • b

    billions-journalist-13819

    04/10/2023, 7:18 AM
    Hi, Team. I'm trying to configure DataHub in a private environment using the provided Docker images. However, the following security vulnerabilities were pointed out. Please let me know if you have any plans or workarounds to resolve these issues.
    1. org.springframework:spring-beans (acryldata/datahub-upgrade)
    https://security.snyk.io/vuln/SNYK-JAVA-ORGSPRINGFRAMEWORK-2436751
    2. org.postgresql:postgresql (acryldata/datahub-upgrade)
    https://security.snyk.io/vuln/SNYK-JAVA-ORGPOSTGRESQL-2390459
    https://security.snyk.io/vuln/SNYK-JAVA-ORGPOSTGRESQL-2401816
    3. sqlite (linkedin/datahub-gms, linkedin/datahub-frontend-react, linkedin/datahub-mae-consumer, linkedin/datahub-mce-consumer)
    https://security.snyk.io/vuln/SNYK-ALPINE39-SQLITE-449762
    l
    a
    +4
    • 7
    • 13
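    When a deployment has to document its own exposure, it can help to re-scan the exact image tags being deployed rather than relying on the advisory links alone. Any OSS scanner works; trivy is shown here, and the tags are placeholders:
    ```bash
    # Re-scan the images you actually run to see which CVEs apply to those tags.
    trivy image acryldata/datahub-upgrade:v0.10.1
    trivy image linkedin/datahub-gms:v0.10.1
    ```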