# troubleshoot
  • most-market-73884

    11/10/2022, 3:35 PM
    Hi, I am having trouble with the `datahub_action` that pushes checkpoint results into DataHub. In DataHub I am using `database_alias` to give a Postgres schema a different name, but the URNs generated by Great Expectations use the original name, so the results never show up in DataHub. There is `platform_instance_map` for platforms; is there something similar for the database name?
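    A minimal sketch of how this action is typically wired into a Great Expectations checkpoint, assuming the documented `DataHubValidationAction` options; `platform_instance_map` maps a platform to an instance name, and whether an equivalent mapping exists for the database name is exactly the open question here. The GMS endpoint and instance name are placeholders:
    Copy code
    action_list:
      - name: datahub_action
        action:
          module_name: datahub.integrations.great_expectations.action
          class_name: DataHubValidationAction
          server_url: http://localhost:8080      # placeholder GMS endpoint
          platform_instance_map:
            postgres: my_postgres_instance       # per-platform, not per-database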
  • bland-orange-13353

    11/10/2022, 5:00 PM
    This message was deleted.
  • worried-flower-88750

    11/10/2022, 5:06 PM
    Hi, we have Google-based SSO enabled for our DataHub. When we try to visit a specific DataHub link (say, to a certain dataset) while not logged in, we are redirected to the main DataHub front page after logging in, rather than back to the dataset we originally wanted to see. Is there a fix for this?
  • dazzling-insurance-83303

    11/10/2022, 6:43 PM
    Hello, I need help with table/column `COMMENT` ingestion. I am working on making Postgres table/column `COMMENT`s available in DataHub documentation, but for some reason the `COMMENT`s are not getting picked up. The CLI and DataHub version we are using is 0.9.1. The table DDL is as follows:
    Copy code
    CREATE TABLE IF NOT EXISTS public.accounts
    (
        id bigint NOT NULL DEFAULT nextval('accounts_id_seq'::regclass),
        account_uuid character varying COLLATE pg_catalog."default",
        status character varying COLLATE pg_catalog."default",
        created_at timestamp(6) without time zone NOT NULL,
        updated_at timestamp(6) without time zone NOT NULL,
        CONSTRAINT accounts_pkey PRIMARY KEY (id)
    );
    
    ALTER TABLE IF EXISTS public.accounts OWNER to mse_accounting_qa_user;
    
    COMMENT ON TABLE public.accounts IS 'Representation of a user account.';
    COMMENT ON COLUMN public.accounts.account_uuid IS 'Unique identifier for the account across all services';
    COMMENT ON COLUMN public.accounts.status IS 'The current status of the account, default("created")';
    The recipe file (redacted) is as follows:
    Copy code
    # accounts
    
    source:
      type: postgres
      config:
        # Coordinates
        host_port: xxxx:65432
        database: accounts_db
    
        # Credentials
        username: datahub_user
        password: ${DATAHUB_USER_DB_PWD}
        env: 'QA'
    
        # allow or deny tables for ingestion
        table_pattern:
          allow:
            - .*
          deny: []
    
        # allow or deny schemas for ingestion
        schema_pattern:
          allow:
            - .*
          deny:
            - "information_schema"
    
        # allow or deny views for ingestion - 'schema_name.view_name'
        view_pattern:
          allow:
            - .*
          deny: []
    
        # PostgreSQL DataHub profiler settings
        # See README.md for details
        profile_pattern:
          allow:
            - .*
          deny: []
    
        profiling:
          enabled: true # default false
          profile_table_level_only: False # default false
          include_field_sample_values: False # default is True. 
    
    transformers:
      - type: "simple_add_dataset_ownership"
        config:
          owner_urns:
            - "urn:li:corpGroup:d94f1f51-xxxx-4cbc-xxxx-3197b0d9862d" # Team accounts
            - "urn:li:corpGroup:ccbf944a-xxxx-4b39-xxxx-65d19ae967d6" # Data Dictionary
    
      - type: "simple_add_dataset_domain"
        config:
          domains:
            - "urn:li:domain:xxxxxxx-51bc-4f87-bc2f-b44dfb8b977d" # Domain
    
    sink:
      type: "datahub-kafka"
      config:
        connection:
          bootstrap: "xxxx:9999"
          producer_config:
            security.protocol: "ssl"
            ssl.ca.location: "/secrets/vault_ca_chain.pem"
            ssl.certificate.location: "/secrets/vault_cert.pem"
            ssl.key.location: "/secrets/vault_key.pem"
          schema_registry_url: "https://schema-registryxxx"
          schema_registry_config:
            ssl.ca.location: "/secrets/vault_ca_chain.pem"
            ssl.certificate.location: "/secrets/vault_cert.pem"
            ssl.key.location: "/secrets/vault_key.pem"
    
    # for `- type: "simple_add_dataset_domain"` to work
    datahub_api:
      server: "https://datahub-gms.xxxx:443"
    Could someone please advise if anything is amiss? TIA! 🙏
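    One sanity check worth running before blaming the recipe: confirm the `COMMENT`s are visible in the Postgres catalog to the `datahub_user` the recipe connects with. A minimal sketch using the standard catalog functions, with names taken from the DDL above:
    Copy code
    -- table-level comment
    SELECT obj_description('public.accounts'::regclass, 'pg_class');
    -- column-level comment; 2 = ordinal position of account_uuid in this DDL
    SELECT col_description('public.accounts'::regclass, 2);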
  • handsome-football-66174

    11/10/2022, 8:23 PM
    Hi Team, I am facing some issues with a package dependency mismatch (due to the application of a constraints file during installation). Will the acryl-datahub[redshift] package work with sqlalchemy==1.4.9?
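    The supported SQLAlchemy range varies by acryl-datahub release, so the quickest way to check a specific pin is to let pip's resolver try it; a sketch using the version from the question:
    Copy code
    pip install 'acryl-datahub[redshift]' 'sqlalchemy==1.4.9'
    pip check   # reports any dependencies left in a broken state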
  • better-spoon-77762

    11/11/2022, 3:05 AM
    Hello everyone, I see some new services added under `metadata-io/src/main/java/com/linkedin/metadata/service/`, e.g. DomainService, TagService, OwnerService, etc. As of now these are only called from the unit tests. Can someone share the long-term plan for using these?
  • average-dinner-25106

    11/11/2022, 4:20 AM
    Does anyone know how to add a bind DN and its password to a customized `jaas.conf` file? The JAAS authentication example on the DataHub guide homepage doesn't show how to configure them. What I found for LdapLoginModule is `javax.security.auth.login.name` and `javax.security.auth.login.password`; do these map to the bind DN and password? I want LDAP-based authentication rather than OIDC, due to our company's policy.
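    For reference, a rough sketch of an LDAP section in the frontend's `jaas.conf`, using the JDK's `com.sun.security.auth.module.LdapLoginModule` (`WHZ-Authentication` is the section name DataHub's frontend loads; the host, base DN, and filter are placeholders, and this is an assumption rather than a tested config). Note that `javax.security.auth.login.name` and `javax.security.auth.login.password` are shared-state keys used with `useFirstPass`/`tryFirstPass`, not bind-DN options:
    Copy code
    WHZ-Authentication {
        com.sun.security.auth.module.LdapLoginModule sufficient
            userProvider="ldap://ldap.example.com:389/ou=people,dc=example,dc=com"
            userFilter="(&(uid={USERNAME})(objectClass=inetOrgPerson))"
            debug="true";
    };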
  • lively-jackal-83760

    11/11/2022, 8:34 AM
    Hi guys, I use DataHub hosted on Kubernetes. For some reason some of the entities/indexes became broken and I can't delete them properly. Now I want to force-delete everything. Do we have any CLI command for this? Or which index or table in the DB should I drop?
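    A sketch of the hard-delete path via the CLI, assuming the filter flags available on `datahub delete` around 0.9.x; `--hard` removes the rows from the backing store and the search index rather than just soft-deleting (the URN shown is a placeholder):
    Copy code
    # one broken entity by URN
    datahub delete --urn "urn:li:dataset:(urn:li:dataPlatform:postgres,mydb.myschema.mytable,PROD)" --hard
    # or everything for a platform/entity type (use with care)
    datahub delete --entity_type dataset --platform postgres --hard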
  • mysterious-advantage-78411

    11/11/2022, 1:54 PM
    Could somebody help me with this bug (or missing feature)? In Tableau there is a flag to publish or unpublish each sheet. Currently DataHub ignores this flag, but there should be an option to skip unpublished sheets during ingestion.
  • better-spoon-77762

    11/12/2022, 6:02 AM
    Hello everyone, I am using DataHub CLI version 0.9.2 to ingest dbt lineage, using a recipe file like this:
    Copy code
    source:
      type: "dbt"
      config:
        # Coordinates
        manifest_path: "/Users/asif/dbt_data/manifest.json"
        catalog_path: "/Users/asif/dbt_data/catalog.json"
        sources_path: "/Users/asif/dbt_data/sources.json"

        # Options
        target_platform: "snowflake" # e.g. bigquery/postgres/etc.
        platform_instance: "snowflake-1" # The instance of the platform that all assets produced by this recipe belong to

    sink:
      type: datahub-rest # default datahub-rest
      config:
        server: "https://localhost:9002/api/gms"
        extra_headers:
        token: xxxxx

    transformers:
      - type: "simple_add_dataset_properties"
        config:
          semantics: OVERWRITE
          properties:
            prop1: value1
            prop2: value2
    But I keep getting this error:
    Copy code
    File "/usr/local/lib/python3.9/site-packages/datahub/cli/ingest_cli.py", line 142, in run_pipeline_async
        return await loop.run_in_executor(
      File "/usr/local/Cellar/python@3.9/3.9.15/Frameworks/Python.framework/Versions/3.9/lib/python3.9/concurrent/futures/thread.py", line 58, in run
        result = self.fn(*self.args, **self.kwargs)
      File "/usr/local/lib/python3.9/site-packages/datahub/cli/ingest_cli.py", line 133, in run_pipeline_to_completion
        raise e
      File "/usr/local/lib/python3.9/site-packages/datahub/cli/ingest_cli.py", line 125, in run_pipeline_to_completion
        pipeline.run()
      File "/usr/local/lib/python3.9/site-packages/datahub/ingestion/run/pipeline.py", line 376, in run
        for record_envelope in self.transform(
      File "/usr/local/lib/python3.9/site-packages/datahub/ingestion/transformer/base_transformer.py", line 217, in transform
        transformed_aspect = self.transform_aspect(
      File "/usr/local/lib/python3.9/site-packages/datahub/ingestion/transformer/add_dataset_properties.py", line 95, in transform_aspect
        assert in_dataset_properties_aspect
    AssertionError
    Can someone help me figure out what could be causing this?
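    One way to narrow this down without touching GMS is to re-run the recipe in preview mode and see which aspects the transformer actually receives; a sketch using the standard `datahub ingest` flags (the recipe filename is a placeholder):
    Copy code
    datahub ingest -c dbt_recipe.yml --dry-run
    datahub ingest -c dbt_recipe.yml --preview --preview-workunits 20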
  • bulky-salesclerk-62223

    11/12/2022, 9:10 PM
    Hi all. I'm having an issue when using the dbt ingestion CLI to create glossary terms from the meta_mappings (using version v0.9.2 of both the CLI and DataHub).
    • When created, I can see the glossary term on the dbt dataset in DataHub (screenshot); it has been automatically assigned using `meta_mapping`.
    • If I click on the glossary term, it says it exists and I can see all the related entities (screenshot).
    • However, if I go to the glossary, the glossary term isn't displayed (screenshot).
    • If, while on the phantom glossary term's page where I can see the entities etc., I click on the three dots and try to move it into a term group, it says "Unknown Error Occurred" (screenshot).
    I've noticed that I can type absolutely anything into the URL urn (`/glossaryTerm/urn:li:glossaryTerm:<ANYTHING>/Related%20Entities?is_lineage_mode=false`): any string in place of `<ANYTHING>` gives me a glossary view of that term. When terms are created in the UI, however, they are given a long UUID, which you can see in the URL.
    • Terms created in the UI persist in the glossary menu and can be moved into groups.
    • Terms created via the DataHub ingestion CLI (the API) can do neither of the above.
    • Creating them in the UI first, then syncing the terms up via the ingestion CLI, doesn't link the term created in the UI to the term assigned to the datasets via `meta_mapping`, because they end up with different `urn:li<BLABLA>` values: the UI one is a UUID, the dbt ingestion one is a friendly name.
    Any ideas? Edit: I believe this issue is related to, but not a duplicate of, https://datahubspace.slack.com/archives/C029A3M079U/p1666343681646089 (cc @bulky-soccer-26729 @gifted-bird-57147)
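    For context, a minimal sketch of the kind of dbt `meta_mapping` that produces these friendly-name term URNs, following the documented `add_term` operation (the meta key and term name are placeholders):
    Copy code
    source:
      type: "dbt"
      config:
        # ...coordinates as usual...
        meta_mapping:
          has_pii:                # placeholder meta key
            match: true
            operation: "add_term"
            config:
              term: "PII"         # becomes urn:li:glossaryTerm:PII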
  • great-computer-16446

    11/14/2022, 8:04 AM
    Hi team, I think I have encountered a problem related to the mae-consumer. The general symptom is that when ingesting a large amount of data at once (in our actual case, tens of thousands of records), the offset of the mae-consumer stops being updated. Both the standalone consumer and the consumer integrated into the metadata service hit this problem. The version I am currently using is v0.9.2; if I roll back to v0.9.0, the problem does not occur. I am not sure what to do next and hope to get guidance.
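    One way to confirm the stall from the Kafka side is to watch the consumer group's lag directly; a sketch with standard Kafka tooling (the broker address is a placeholder, and the group id shown is an assumption; check your deployment for the actual MAE consumer group name):
    Copy code
    kafka-consumer-groups.sh --bootstrap-server broker:9092 \
      --describe --group generic-mae-consumer-job-client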
  • ripe-belgium-29225

    11/14/2022, 9:31 AM
    Hi guys, please advise me on how to configure DataHub with default authentication for anonymous users. I need to implement the same behavior as on demo.datahubproject.io: when I open DataHub, I get logged in as some default user with read-only permissions, but there must also be an option to log in as a different user (with full permissions, for example).
  • breezy-portugal-43538

    11/14/2022, 9:55 AM
    Hello, is there a tutorial or example on how to use the SDK to ingest an MLModel into DataHub?
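    A minimal sketch with the Python emitter, assuming a local GMS endpoint and made-up platform/model names; it emits an `mlModelProperties` aspect for an MLModel URN:
    Copy code
    from datahub.emitter.mce_builder import make_ml_model_urn
    from datahub.emitter.mcp import MetadataChangeProposalWrapper
    from datahub.emitter.rest_emitter import DatahubRestEmitter
    from datahub.metadata.schema_classes import ChangeTypeClass, MLModelPropertiesClass

    # placeholder endpoint; point at your GMS
    emitter = DatahubRestEmitter(gms_server="http://localhost:8080")

    # placeholder platform and model name
    model_urn = make_ml_model_urn(platform="mlflow", model_name="my-model", env="PROD")

    mcp = MetadataChangeProposalWrapper(
        entityType="mlModel",
        changeType=ChangeTypeClass.UPSERT,
        entityUrn=model_urn,
        aspectName="mlModelProperties",
        aspect=MLModelPropertiesClass(description="Example model emitted via the SDK"),
    )
    emitter.emit_mcp(mcp)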
  • green-hamburger-3800

    11/14/2022, 1:48 PM
    Hello folks, how are you? I've been searching for a while for how to configure the logout redirect when using OIDC (I'm using OneLogin). I've tested a few different redirect URIs, but I always end up with an error. It seems I need to add the URI somewhere in the DataHub configuration so that OneLogin actually allows the redirect, but I couldn't find where. Can someone point it out? Thanks a lot o/
  • green-hamburger-3800

    11/14/2022, 3:19 PM
    On another topic, I keep getting the following error in the GMS logs:
    Copy code
    15:18:28.491 [qtp1830908236-10] WARN  c.d.a.a.AuthenticatorChain:70 - Authentication chain failed to resolve a valid authentication. Errors: [(com.datahub.authentication.authenticator.DataHubSystemAuthenticator,Failed to authenticate inbound request: Authorization header is missing Authorization header.), (com.datahub.authentication.authenticator.DataHubTokenAuthenticator,Failed to authenticate inbound request: Request is missing 'Authorization' header.)]
    and I'm not sure where it's coming from, since no ingestion/usage exists at the moment (I'd guess it's an internal thing and I might be missing some configuration?). Thanks (=
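    For comparison, a request that satisfies the token authenticator looks roughly like this; a sketch against the default GMS port with a placeholder personal access token, useful for spotting which caller is hitting GMS without the header:
    Copy code
    curl -s -H 'Authorization: Bearer <personal-access-token>' \
      'http://localhost:8080/entities/urn%3Ali%3Acorpuser%3Adatahub'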
  • bright-motherboard-35257

    11/14/2022, 3:28 PM
    S3 ingestion error below (please assist):
    Copy code
    '[2022-11-14 15:21:02,184] ERROR    {logger:26} - Please set env variable SPARK_VERSION\n'
               'JAVA_HOME is not set\n'
    I have JAVA_HOME set...
    Copy code
    $ echo $JAVA_HOME
    /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.322.b06-11.el8.x86_64/
    My SPARK_HOME is set...
    Copy code
    $ echo $SPARK_HOME
    /opt/spark
    My pyspark version == 3.0.3
    Copy code
    $ pyspark --version
    22/11/14 09:23:51 WARN Utils: Your hostname, sa1x-eam-p1 resolves to a loopback address: 127.0.0.1; using 172.30.230.254 instead (on interface ens3)
    22/11/14 09:23:51 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
    Welcome to
          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/  '_/
       /___/ .__/\_,_/_/ /_/\_\   version 3.0.3
          /_/
                            
    Using Scala version 2.12.10, OpenJDK 64-Bit Server VM, 1.8.0_322
    Branch HEAD
    Compiled by user ubuntu on 2021-06-17T04:08:22Z
    Revision 65ac1e75dc468f53fc778cd2ce1ba3f21067aab8
    Url https://github.com/apache/spark
    Type --help for more information.
    My SPARK_VERSION is set...
    Copy code
    $ echo $SPARK_VERSION
    3.0.3
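    Since the complaint comes from the ingestion process itself, the variables have to be visible in the exact environment where `datahub ingest` runs; a login shell's exports are not inherited by cron, sudo, or a UI-based executor. A sketch, assuming a shell-run ingestion (the recipe filename is a placeholder):
    Copy code
    export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.322.b06-11.el8.x86_64/
    export SPARK_VERSION=3.0.3
    datahub ingest -c s3_recipe.yml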
  • ancient-apartment-23316

    11/14/2022, 6:58 PM
    Hi, I'm using the Share Invite Link for new users, and I've noticed that one link can be used multiple times to register multiple different users. I was expecting the link to work only once, for a single user registration, and then expire.
  • acceptable-terabyte-34789

    11/15/2022, 7:57 AM
    Hi! We deployed DataHub on the AWS ecosystem with Kafka brokers on the MSK service. We want to consume from the topic PlatformEvent_v1, because we want to know when a dataset property has changed, so we used Confluent's S3 sink connector to land the records on S3. We got it connected successfully, but the records are not written properly; they contain some strange characters, as you can see:
    Copy code
    {
      "_1": "\u0003\u0000*�?Y�&CZ�=9�+�%��݊��a\"entityChangeEvent�\u0005{\"auditStamp\":{\"actor\":\"urn:li:corpuser:datahub\",\"time\":1668075837259},\"entityUrn\":\"urn:li:domain:xxxxxxx\",\"entityType\":\"domain\",\"modifier\":\"urn:li:corpuser:datahub\",\"category\":\"OWNER\",\"operation\":\"ADD\",\"version\":0,\"parameters\":{\"ownerType\":\"TECHNICAL_OWNER\",\"ownerUrn\":\"urn:li:corpuser:datahub\"}} application/json"
    }
    These are the properties we used with StringConverter as key.converter and value.converter.
    Copy code
    connector.class=io.confluent.connect.s3.S3SinkConnector
    behavior.on.null.values=ignore
    s3.region=eu-west-1
    flush.size=1
    schema.compatibility=NONE
    tasks.max=2
    topics=PlatformEvent_v1
    key.converter.schemas.enable=false
    format.class=io.confluent.connect.s3.format.json.JsonFormat
    partitioner.class=io.confluent.connect.storage.partitioner.DefaultPartitioner
    value.converter.schemas.enable=false
    value.converter=org.apache.kafka.connect.json.JsonConverter
    storage.class=io.confluent.connect.s3.storage.S3Storage
    s3.bucket.name=xxxxxxxxx
    key.converter=org.apache.kafka.connect.json.JsonConverter
    Then we tried changing to JsonConverter but it throws the following error:
    Copy code
    [Worker-073fad87bf643ddc8] Caused by: org.apache.kafka.common.errors.SerializationException: com.fasterxml.jackson.core.JsonParseException: Unrecognized token 'entityChangeEvent': was expecting (JSON String, Number, Array, Object or token 'null', 'true' or 'false')
    How can we consume the records properly? Thank you!
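    Those leading control characters look like the Confluent wire-format header: DataHub's event topics are Avro-serialized against the schema registry, so neither StringConverter nor JsonConverter can decode the payload. A sketch of the converter settings one would try instead, assuming Confluent's Avro converter is available on the Connect worker (the registry URL is a placeholder):
    Copy code
    value.converter=io.confluent.connect.avro.AvroConverter
    # placeholder registry endpoint
    value.converter.schema.registry.url=http://schema-registry:8081
    key.converter=org.apache.kafka.connect.storage.StringConverter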
  • acceptable-terabyte-34789

    11/15/2022, 9:49 AM
    And another doubt, regarding uploading downstreams and upstreams: we have developed a custom Spark listener using the Spline library, and we can emit downstreams at column/field level, but the downstreams are not present in DataHub. There isn't any error. Where can I see logs that would help me debug this?
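    On a Kubernetes deployment the emit path terminates in GMS, so its logs are the first place to look; a sketch with a placeholder deployment name (check `kubectl get deploy` for yours):
    Copy code
    kubectl logs deployment/datahub-datahub-gms --tail=500 -f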
  • gifted-rocket-7960

    11/15/2022, 2:46 PM
    Hi Team, could someone please help me with this? When I try to ingest upstreamLineage I get the error below, even though the field is present in DataHub, as the curl response underneath shows:
    Copy code
    Provided urn "urn:li:datasetField:(urn:li:dataset:(urn:li:dataPlatform:postgres,bar4,DEV),c1)" is invalid: Failed to find entity with name datasetField in EntityRegistry
    Copy code
    curl 'http://localhost:8080/entities/urn:li:dataset:(urn:li:dataPlatform:postgres,bar4,DEV)'
    {"value":{"com.linkedin.metadata.snapshot.DatasetSnapshot":{"urn":"urn:li:dataset:(urn:li:dataPlatform:postgres,bar4,DEV)","aspects":[{"com.linkedin.metadata.key.DatasetKey":{"origin":"DEV","name":"bar4","platform":"urn:li:dataPlatform:postgres"}},{"com.linkedin.common.BrowsePaths":{"paths":["/dev/postgres"]}},{"com.linkedin.schema.SchemaMetadata":{"fields":[{"fieldPath":"c1","description":"test fine grained lineage","type":{"type":{"com.linkedin.schema.StringType":{}}},"nativeDataType":"VARCHAR(50)"}],"schemaName":"customer","version":0,"platformSchema":{"com.linkedin.schema.MySqlDDL":{"tableSchema":"col1"}},"platform":"urn:li:dataPlatform:postgres","hash":"hash"}},{"com.linkedin.dataset.DatasetProperties":{"description":"bar2 DataSet"}},{"com.linkedin.common.DataPlatformInstance":{"platform":"urn:li:dataPlatform:postgres"}}]}}}
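    For what it's worth, the entity registry has no `datasetField` entity; fine-grained lineage references fields through `schemaField` URNs. A sketch using the dataset and field from the error, assuming the documented builder helpers:
    Copy code
    from datahub.emitter.mce_builder import make_dataset_urn, make_schema_field_urn

    dataset_urn = make_dataset_urn(platform="postgres", name="bar4", env="DEV")
    field_urn = make_schema_field_urn(parent_urn=dataset_urn, field_path="c1")
    print(field_urn)
    # urn:li:schemaField:(urn:li:dataset:(urn:li:dataPlatform:postgres,bar4,DEV),c1)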
  • cuddly-dream-15899

    11/15/2022, 4:48 PM
    Hi Team, could someone point me to, or give me guidance on, how to download later community versions of the "datahub-prerequisites" chart? I'm trying to install this on an M1, and as I understand it, later versions of Neo4j such as 4.4.14 support that platform. Thank you 🙂
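    A sketch of listing and pinning chart versions from the official Helm repo (repo URL as in the DataHub Helm docs; the version flag value is a placeholder):
    Copy code
    helm repo add datahub https://helm.datahubproject.io/
    helm repo update
    helm search repo datahub/datahub-prerequisites --versions
    helm install prerequisites datahub/datahub-prerequisites --version <chart-version>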
  • worried-branch-76677

    11/15/2022, 4:49 PM
    Hi all, has anyone experienced this when sending a very, very long MCP? Any solution for it?
    Copy code
    [{'error': 'Unable to emit metadata to DataHub GMS',
                   'info': {'message': "HTTPSConnectionPool(host='datahub-gms.net', port=443): Max retries exceeded "
                                       "with url: /aspects?action=ingestProposal (Caused by SSLError(SSLEOFError(8, 'EOF occurred in violation of "
                                       "protocol (_ssl.c:2384)')))"}}],
  • witty-television-74309

    11/15/2022, 4:51 PM
    Does anybody know if the DataHub integration with Great Expectations is supported for v2, i.e. PandasDataset (PandasDatasource)?
  • gentle-tailor-78929

    11/15/2022, 5:17 PM
    Hello, I'm trying to run DataHub locally for the first time. When running `./gradlew build`, I get the following error:
    Copy code
    /datahub/metadata-models/build.gradle': 1: unable to resolve class io.datahubproject.GenerateJsonSchemaTask
       @ line 1, column 1.
         import io.datahubproject.GenerateJsonSchemaTask
    Any ideas on what the issue may be? Thanks!
  • gentle-tailor-78929

    11/15/2022, 5:21 PM
    Is it possible to use `datahub docker quickstart --build-locally` with `podman-compose` instead of `docker-compose`?
  • miniature-plastic-43224

    11/15/2022, 5:45 PM
    All, while looking at the DataHub documentation I found a small article called "Monitoring DataHub", which is all about tracing and metrics. I am curious: is there any other documentation, or are there specific examples, cases, suggestions, or tools available (especially for UI performance testing)?
  • little-breakfast-38102

    11/15/2022, 10:16 PM
    Hi @gray-shoe-75895, I am trying to use add_dataset_terms in our transformer to add custom terms to our assets at the point of ingestion. I have attached screenshots of the error message, the custom function passed in the transformer step, and the transformer step in the recipe. I'd appreciate your help in understanding what is missing.
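    For reference, a minimal sketch of the documented simple variant of this transformer, which takes fully qualified term URNs directly (the URNs shown are placeholders):
    Copy code
    transformers:
      - type: "simple_add_dataset_terms"
        config:
          term_urns:
            - "urn:li:glossaryTerm:Email"
            - "urn:li:glossaryTerm:Address"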
  • best-napkin-60434

    11/16/2022, 1:21 AM
    Hi Team,
  • best-napkin-60434

    11/16/2022, 1:21 AM
    An error occurs when building the datahub-frontend Docker image on EC2. I'm building the image with the command below:
    Copy code
    COMPOSE_DOCKER_CLI_BUILD=1 DOCKER_BUILDKIT=1 docker-compose -p datahub build --no-cache datahub-frontend-react
    The build fails with the errors below. Does anyone have the same problem? The strange thing is that if I run the build commands from the Dockerfile directly, without building the Docker image, everything builds without any problem:
    Copy code
    ./gradlew :datahub-web-react:build -x test -x yarnTest -x yarnLint
    ./gradlew :datahub-frontend:dist -PuseSystemNode=${USE_SYSTEM_NODE} -x test -x yarnTest -x yarnLint
    Thanks. Errors:
    Copy code
    => [prod-build 16/16] RUN cd datahub-src && ./gradlew :datahub-web-react:build -x test -x yarnTest -x yarnLint --debug --refresh-dependencies && ./gradlew :datahub-frontend 711.9s
    #0 101.2 2022-11-16T00:30:15.491+0000 [ERROR] [system.err] Caused by: java.lang.IllegalArgumentException: Cannot find '.git' directory
    #0 161.9 2022-11-16T00:31:16.334+0000 [ERROR] [system.err] warning " > @cypress/webpack-preprocessor@5.8.0" has unmet peer dependency "@babel/core@^7.0.1".
    #0 161.9 2022-11-16T00:31:16.334+0000 [ERROR] [system.err] warning " > @cypress/webpack-preprocessor@5.8.0" has unmet peer dependency "@babel/preset-env@^7.0.0".
    #0 161.9 2022-11-16T00:31:16.334+0000 [ERROR] [system.err] warning " > @cypress/webpack-preprocessor@5.8.0" has unmet peer dependency "webpack@^4.18.1".
    #0 161.9 2022-11-16T00:31:16.334+0000 [ERROR] [system.err] warning " > @data-ui/xy-chart@0.0.84" has incorrect peer dependency "react@^15.0.0-0 || ^16.0.0-0".
    #0 161.9 2022-11-16T00:31:16.334+0000 [ERROR] [system.err] warning " > @data-ui/xy-chart@0.0.84" has incorrect peer dependency "react-dom@^15.0.0-0 || ^16.0.0-0".
    #0 161.9 2022-11-16T00:31:16.335+0000 [ERROR] [system.err] warning "@data-ui/xy-chart > @data-ui/shared@0.0.84" has incorrect peer dependency "react@^15.0.0-0 || ^16.0.0-0".
    #0 161.9 2022-11-16T00:31:16.335+0000 [ERROR] [system.err] warning "@data-ui/xy-chart > @data-ui/shared@0.0.84" has incorrect peer dependency "react-dom@^15.0.0-0 || ^16.0.0-0".
    #0 161.9 2022-11-16T00:31:16.335+0000 [ERROR] [system.err] warning "@data-ui/xy-chart > @vx/axis@0.0.175" has incorrect peer dependency "react@^15.0.0-0 || ^16.0.0-0".
    #0 161.9 2022-11-16T00:31:16.335+0000 [ERROR] [system.err] warning "@data-ui/xy-chart > @vx/glyph@0.0.165" has incorrect peer dependency "react@^15.0.0-0 || ^16.0.0-0".
    #0 161.9 2022-11-16T00:31:16.335+0000 [ERROR] [system.err] warning "@data-ui/xy-chart > @vx/gradient@0.0.165" has incorrect peer dependency "react@^15.0.0-0 || ^16.0.0-0".
    #0 161.9 2022-11-16T00:31:16.336+0000 [ERROR] [system.err] warning "@data-ui/xy-chart > @vx/grid@0.0.180" has incorrect peer dependency "react@^15.0.0-0 || ^16.0.0-0".
    #0 161.9 2022-11-16T00:31:16.337+0000 [ERROR] [system.err] warning "@data-ui/xy-chart > @vx/group@0.0.165" has incorrect peer dependency "react@^15.0.0-0 || ^16.0.0-0".
    #0 161.9 2022-11-16T00:31:16.337+0000 [ERROR] [system.err] warning "@data-ui/xy-chart > @vx/pattern@0.0.165" has incorrect peer dependency "react@^15.0.0-0 || ^16.0.0-0".
    #0 161.9 2022-11-16T00:31:16.337+0000 [ERROR] [system.err] warning "@data-ui/xy-chart > @vx/responsive@0.0.192" has incorrect peer dependency "react@^15.0.0-0 || ^16.0.0-0".
    #0 161.9 2022-11-16T00:31:16.337+0000 [ERROR] [system.err] warning "@data-ui/xy-chart > @vx/shape@0.0.165" has incorrect peer dependency "react@^15.0.0-0 || ^16.0.0-0".
    #0 161.9 2022-11-16T00:31:16.337+0000 [ERROR] [system.err] warning "@data-ui/xy-chart > @vx/stats@0.0.165" has incorrect peer dependency "react@^15.0.0-0 || ^16.0.0-0".
    #0 161.9 2022-11-16T00:31:16.337+0000 [ERROR] [system.err] warning "@data-ui/xy-chart > @vx/text@0.0.192" has incorrect peer dependency "react@^16.3.0-0".
    #0 161.9 2022-11-16T00:31:16.337+0000 [ERROR] [system.err] warning "@data-ui/xy-chart > @vx/threshold@0.0.170" has incorrect peer dependency "react@^15.0.0-0 || ^16.0.0-0".
    #0 161.9 2022-11-16T00:31:16.337+0000 [ERROR] [system.err] warning "@data-ui/xy-chart > @vx/tooltip@0.0.165" has incorrect peer dependency "react@^15.0.0-0 || ^16.0.0-0".
    #0 161.9 2022-11-16T00:31:16.337+0000 [ERROR] [system.err] warning "@data-ui/xy-chart > @vx/voronoi@0.0.165" has incorrect peer dependency "react@^15.0.0-0 || ^16.0.0-0".
    #0 161.9 2022-11-16T00:31:16.337+0000 [ERROR] [system.err] warning "@data-ui/xy-chart > @data-ui/shared > @vx/shape@0.0.168" has incorrect peer dependency "react@^15.0.0-0 || ^16.0.0-0".
    #0 161.9 2022-11-16T00:31:16.337+0000 [ERROR] [system.err] warning "@data-ui/xy-chart > @vx/axis > @vx/group@0.0.170" has incorrect peer dependency "react@^15.0.0-0 || ^16.0.0-0".
    #0 161.9 2022-11-16T00:31:16.338+0000 [ERROR] [system.err] warning "@data-ui/xy-chart > @vx/axis > @vx/shape@0.0.175" has incorrect peer dependency "react@^15.0.0-0 || ^16.0.0-0".
    #0 161.9 2022-11-16T00:31:16.338+0000 [ERROR] [system.err] warning "@data-ui/xy-chart > @vx/axis > @vx/text@0.0.175" has incorrect peer dependency "react@^15.0.0-0 || ^16.0.0-0".
    #0 161.9 2022-11-16T00:31:16.338+0000 [ERROR] [system.err] warning "@data-ui/xy-chart > @vx/grid > @vx/shape@0.0.179" has incorrect peer dependency "react@^15.0.0-0 || ^16.0.0-0".
    #0 161.9 2022-11-16T00:31:16.339+0000 [ERROR] [system.err] warning "@data-ui/xy-chart > @vx/threshold > @vx/clip-path@0.0.165" has incorrect peer dependency "react@^15.0.0-0 || ^16.0.0-0".
    #0 161.9 2022-11-16T00:31:16.339+0000 [ERROR] [system.err] warning "@data-ui/xy-chart > @vx/threshold > @vx/shape@0.0.170" has incorrect peer dependency "react@^15.0.0-0 || ^16.0.0-0".
    #0 161.9 2022-11-16T00:31:16.339+0000 [ERROR] [system.err] warning "@data-ui/xy-chart > @vx/tooltip > @vx/bounds@0.0.165" has incorrect peer dependency "react@^15.0.0-0 || ^16.0.0-0".
    #0 161.9 2022-11-16T00:31:16.339+0000 [ERROR] [system.err] warning "@data-ui/xy-chart > @vx/tooltip > @vx/bounds@0.0.165" has incorrect peer dependency "react-dom@^15.0.0-0 || ^16.0.0-0".
    #0 161.9 2022-11-16T00:31:16.340+0000 [ERROR] [system.err] warning " > @testing-library/user-event@12.8.3" has unmet peer dependency "@testing-library/dom@>=7.21.4".
    #0 161.9 2022-11-16T00:31:16.343+0000 [ERROR] [system.err] warning " > @visx/drag@1.7.4" has incorrect peer dependency "react@^16.8.0-0".
    #0 161.9 2022-11-16T00:31:16.343+0000 [ERROR] [system.err] warning " > @visx/responsive@1.10.1" has incorrect peer dependency "react@^15.0.0-0 || ^16.0.0-0".
    #0 161.9 2022-11-16T00:31:16.343+0000 [ERROR] [system.err] warning " > @vx/axis@0.0.199" has incorrect peer dependency "react@^16.3.0-0".
    #0 161.9 2022-11-16T00:31:16.343+0000 [ERROR] [system.err] warning " > @vx/group@0.0.199" has incorrect peer dependency "react@^15.0.0-0 || ^16.0.0-0".
    #0 161.9 2022-11-16T00:31:16.344+0000 [ERROR] [system.err] warning " > @vx/shape@0.0.199" has incorrect peer dependency "react@^16.3.0-0".
    #0 161.9 2022-11-16T00:31:16.344+0000 [ERROR] [system.err] warning "@vx/axis > @vx/text@0.0.199" has incorrect peer dependency "react@^16.3.0-0".
    #0 161.9 2022-11-16T00:31:16.344+0000 [ERROR] [system.err] warning " > @vx/gradient@0.0.199" has incorrect peer dependency "react@^15.0.0-0 || ^16.0.0-0".
    #0 161.9 2022-11-16T00:31:16.344+0000 [ERROR] [system.err] warning " > @vx/grid@0.0.199" has incorrect peer dependency "react@^15.0.0-0 || ^16.0.0-0".
    #0 161.9 2022-11-16T00:31:16.344+0000 [ERROR] [system.err] warning " > @vx/hierarchy@0.0.199" has incorrect peer dependency "react@^16.3.0-0".
    #0 161.9 2022-11-16T00:31:16.344+0000 [ERROR] [system.err] warning " > @vx/legend@0.0.199" has incorrect peer dependency "react@^16.3.0-0".
    #0 161.9 2022-11-16T00:31:16.344+0000 [ERROR] [system.err] warning " > @vx/zoom@0.0.199" has incorrect peer dependency "react@^15.0.0-0 || ^16.0.0-0".
    #0 161.9 2022-11-16T00:31:16.344+0000 [ERROR] [system.err] warning "analytics > @analytics/core > analytics-utils@0.4.4" has unmet peer dependency "@types/dlv@^1.0.0".
    #0 161.9 2022-11-16T00:31:16.344+0000 [ERROR] [system.err] warning "antd > rc-picker@2.5.10" has unmet peer dependency "dayjs@^1.8.30".
    #0 161.9 2022-11-16T00:31:16.346+0000 [ERROR] [system.err] warning " > craco-antd@1.19.0" has incorrect peer dependency "@craco/craco@^5.5.0".
    #0 161.9 2022-11-16T00:31:16.346+0000 [ERROR] [system.err] warning " > craco-antd@1.19.0" has incorrect peer dependency "react-scripts@^3.4.3".
    #0 161.9 2022-11-16T00:31:16.346+0000 [ERROR] [system.err] warning "craco-antd > craco-less@1.17.0" has incorrect peer dependency "@craco/craco@^5.5.0".
    #0 161.9 2022-11-16T00:31:16.346+0000 [ERROR] [system.err] warning "craco-antd > craco-less@1.17.0" has incorrect peer dependency "react-scripts@^3.3.0".
    #0 161.9 2022-11-16T00:31:16.346+0000 [ERROR] [system.err] warning "craco-antd > craco-less > less-loader@6.2.0" has unmet peer dependency "webpack@^4.0.0 || ^5.0.0".
    #0 161.9 2022-11-16T00:31:16.347+0000 [ERROR] [system.err] warning " > graphql-tag@2.10.3" has incorrect peer dependency "graphql@^0.9.0 || ^0.10.0 || ^0.11.0 || ^0.12.0 || ^0.13.0 || ^14.0.0".
    #0 161.9 2022-11-16T00:31:16.347+0000 [ERROR] [system.err] warning "graphql.macro > babel-literal-to-ast@2.1.0" has unmet peer dependency "@babel/core@^7.1.2".
    #0 161.9 2022-11-16T00:31:16.348+0000 [ERROR] [system.err] warning " > react-highlighter@0.4.3" has incorrect peer dependency "react@^0.14.0 || ^15.0.0 || ^16.0.0".
    #0 162.0 2022-11-16T00:31:16.364+0000 [ERROR] [system.err] warning " > styled-components@5.3.0" has unmet peer dependency "react-is@>= 16.8.0".
    #0 162.0 2022-11-16T00:31:16.370+0000 [ERROR] [system.err] warning " > babel-loader@8.2.2" has unmet peer dependency "@babel/core@^7.0.0".
    #0 162.0 2022-11-16T00:31:16.370+0000 [ERROR] [system.err] warning " > babel-loader@8.2.2" has unmet peer dependency "webpack@>=2".
    #0 162.0 2022-11-16T00:31:16.370+0000 [ERROR] [system.err] warning " > copy-webpack-plugin@6.4.1" has unmet peer dependency "webpack@^4.37.0 || ^5.0.0".
    #0 162.0 2022-11-16T00:31:16.370+0000 [ERROR] [system.err] warning "eslint-config-airbnb-typescript > eslint-config-airbnb@18.2.1" has unmet peer dependency "eslint-plugin-import@^2.22.1".
    #0 162.0 2022-11-16T00:31:16.370+0000 [ERROR] [system.err] warning "eslint-config-airbnb-typescript > eslint-config-airbnb@18.2.1" has unmet peer dependency "eslint-plugin-jsx-a11y@^6.4.1".
    #0 162.0 2022-11-16T00:31:16.370+0000 [ERROR] [system.err] warning "eslint-config-airbnb-typescript > eslint-config-airbnb@18.2.1" has unmet peer dependency "eslint-plugin-react-hooks@^4 || ^3 || ^2.3.0 || ^1.7.0".
    #0 162.0 2022-11-16T00:31:16.370+0000 [ERROR] [system.err] warning "eslint-config-airbnb-typescript > eslint-config-airbnb-base@14.2.1" has unmet peer dependency "eslint-plugin-import@^2.22.1".