# troubleshoot

    billions-tent-29367

    12/15/2021, 8:49 PM
What's the difference between https://github.com/linkedin/datahub and https://github.com/acryldata/datahub? Which one should we be mirroring for our fork? Do they actually have different commits?

    chilly-barista-6524

    12/16/2021, 10:01 AM
Hey guys, we need to update our use cases with the latest DataHub APIs since we are on the latest DataHub version now and will be shutting down our older deployment soon. One of our use cases is to modify properties of our lake tables via a DAG scheduled every 10 minutes. It basically follows the
    read, modify and write
    approach as suggested by @mammoth-bear-12532 (https://datahubspace.slack.com/archives/CUMUWQU66/p1631031449173600?thread_ts=1631002886.169200&cid=CUMUWQU66) In v0.6 we had
    <http://localhost:8080/datasets>
    API available through which we were able to bulk read the datasets in chunks of 100 with params like
    {"q": "search", "input": "*", "start": 0, "count": 100}
    , perform modifications on them and write them back to datahub, but in latest version we can’t seem to find any such API which can return the dataset details in bulk. In v0.8.18 we found the following API to fetch the list of datasets:
    <http://localhost:8080/entities?action=search>
but it only gives the dataset name; other aspect details are not available. Fetching the details for each dataset separately would result in >6000 API calls to GMS every time just to fetch the details, which we definitely want to avoid. Is there any API in the newer version which can provide bulk details (can't seem to find one in the documentation), or are there any updates on the
    generic patch support
which can help us avoid reading the datasets every time and just update what is required? cc: @able-ghost-99534 @quiet-winter-87538
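The chunked read-modify-write loop described above can be kept on newer versions by paginating whatever search action is available. A minimal sketch of the pagination half, with a fake fetch standing in for the GMS call (the `/entities?action=search` request shape in the comment is taken from this thread and should be verified against your version):

```python
def paginate_search(fetch_page, page_size=100):
    """Yield every result from a paged search endpoint.

    fetch_page(start, count) must return a list of results; a short
    (or empty) page signals the end of the result set.
    """
    start = 0
    while True:
        page = fetch_page(start, page_size)
        yield from page
        if len(page) < page_size:
            break
        start += page_size

# Demo with a fake backend standing in for GMS (no network needed).
FAKE_URNS = [f"urn:li:dataset:{i}" for i in range(250)]

def fake_fetch(start, count):
    # In a real run this would be something like:
    # requests.post(f"{gms_host}/entities?action=search",
    #               json={"input": "*", "entity": "dataset",
    #                     "start": start, "count": count})
    return FAKE_URNS[start:start + count]

all_urns = list(paginate_search(fake_fetch, page_size=100))
```

The same loop works regardless of which search action your GMS version exposes, as long as it accepts start/count parameters.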

    nice-planet-17111

    12/16/2021, 10:05 AM
Hi team, I'm deploying DataHub via Helm chart. But whenever I try to helm-lint my chart, it automatically runs
    helm deps update
on my chart. 😢 I'm completely lost as to why this is happening. Has anybody experienced similar problems?

    billions-tent-29367

    12/16/2021, 3:44 PM
I'm trying to build v0.8.19.1 (which is the first release that includes log4j 2.16) and I'm getting a build error relating to mypy and pydantic:
    setup.cfg:24: error: Error importing plugin "pydantic.mypy": cannot import name TypeVarDef
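mypy 0.920, released in mid-December 2021, removed the internal TypeVarDef name that older pydantic mypy plugins import, which matches this error. Pinning mypy below 0.920 (or upgrading pydantic once a fixed release is available) is one plausible workaround; the version boundary here is an assumption worth verifying against the release notes:

```
# constraints.txt (hypothetical workaround):
# keep mypy on a release that still exports TypeVarDef
mypy<0.920
```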

    rich-policeman-92383

    12/17/2021, 10:53 AM
I added a new policy granting all TAG asset privileges to all users, but none of the users are able to edit/add tags on datasets. Error: Unauthorized to perform this action. Please contact datahub administrator.

    numerous-helicopter-87149

    12/17/2021, 1:19 PM
    Hi everyone! I am trying to set up DataHub, following these instructions: https://datahubproject.io/docs/quickstart, however when I run
    datahub docker quickstart
I get the following error: "Unable to run quickstart: - Docker doesn't seem to be running. Did you start it?", despite the fact that my Docker Desktop app is running. Has anybody experienced similar problems?

    sparse-planet-56664

    12/17/2021, 2:50 PM
Hi, is it possible to fetch “all the lineage” (down to the leaves) for a dataset without nesting GraphQL queries and such? I would like to get it all the way out to dashboards. REST/GraphQL doesn’t matter.
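One workaround for "all the lineage, down to the leaves" without ever-deeper nested GraphQL queries is to make one single-hop lineage request per node and walk the graph iteratively. A minimal sketch, assuming a `get_downstreams(urn)` callable you would back with the REST or GraphQL lineage endpoint of your choice:

```python
from collections import deque

def full_lineage(root, get_downstreams):
    """Breadth-first walk of downstream lineage, one hop per call.

    get_downstreams(urn) -> list of directly downstream urns.
    Returns every urn reachable from root (excluding root itself).
    The visited set guards against cycles and duplicate fetches.
    """
    seen, queue, result = {root}, deque([root]), []
    while queue:
        node = queue.popleft()
        for child in get_downstreams(node):
            if child not in seen:
                seen.add(child)
                result.append(child)
                queue.append(child)
    return result

# Toy graph standing in for the lineage API: dataset -> chart -> dashboard.
GRAPH = {
    "urn:dataset:a": ["urn:dataset:b", "urn:chart:c"],
    "urn:dataset:b": ["urn:chart:c"],
    "urn:chart:c": ["urn:dashboard:d"],
}
reachable = full_lineage("urn:dataset:a", lambda u: GRAPH.get(u, []))
```

This trades one deep query for N shallow ones, but each call stays flat and the client controls the traversal depth.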

    handsome-football-66174

    12/17/2021, 3:51 PM
General / @early-lamp-41924 / @big-carpet-38439 - Just did an upgrade to v0.8.19. Getting the following error when logging in (the admin account is working fine). Is there any new config that we need to add for OIDC?
    15:44:33 [application-akka.actor.default-dispatcher-172] WARN p.api.mvc.LegacySessionCookieBaker - Cookie failed message authentication check
    15:44:57 [application-akka.actor.default-dispatcher-138] ERROR auth.sso.oidc.OidcCallbackLogic - Unable to renew the session. The session store may not support this feature
    15:44:57 [application-akka.actor.default-dispatcher-138] WARN auth.sso.oidc.OidcCallbackLogic - Failed to extract groups: No OIDC claim with name groups found
    15:45:07 [application-akka.actor.default-dispatcher-138] ERROR auth.sso.oidc.OidcCallbackLogic - Failed to perform post authentication steps. Redirecting to error page.
    com.linkedin.r2.RemoteInvocationException: com.linkedin.r2.RemoteInvocationException: Failed to get response from server for URI <http://datahub-datahub-gms:8080/aspects>
    	at com.linkedin.restli.internal.client.ExceptionUtil.wrapThrowable(ExceptionUtil.java:135)
    	at com.linkedin.restli.internal.client.ResponseFutureImpl.getResponseImpl(ResponseFutureImpl.java:130)
    	at com.linkedin.restli.internal.client.ResponseFutureImpl.getResponse(ResponseFutureImpl.java:94)
    	at com.linkedin.common.client.BaseClient.sendClientRequest(BaseClient.java:36)
    	at com.linkedin.entity.client.RestliEntityClient.ingestProposal(RestliEntityClient.java:525)
    	at auth.sso.oidc.OidcCallbackLogic.setUserStatus(OidcCallbackLogic.java:404)
    	at auth.sso.oidc.OidcCallbackLogic.handleOidcCallback(OidcCallbackLogic.java:145)
    	at auth.sso.oidc.OidcCallbackLogic.perform(OidcCallbackLogic.java:103)
    	at controllers.SsoCallbackController$SsoCallbackLogic.perform(SsoCallbackController.java:70)
    	at controllers.SsoCallbackController$SsoCallbackLogic.perform(SsoCallbackController.java:57)
    	at org.pac4j.play.CallbackController.lambda$callback$0(CallbackController.java:56)
    	at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1604)
    	at play.core.j.HttpExecutionContext$$anon$2.run(HttpExecutionContext.scala:56)
    	at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
    	at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:49)
    	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)

    gentle-sundown-2310

    12/17/2021, 7:15 PM
Hello, I started to use DataHub with Docker, and when I run the command datahub docker quickstart, I get the following error: Unable to run quickstart - the following issues were detected: - datahub-gms is running but not healthy - mysql-setup did not exit cleanly

    gentle-sundown-2310

    12/17/2021, 7:22 PM
The command prompt says the log file is "C:\Users\sparan\AppData\Local\Temp\tmpyhuogmbs.log", but in my directory I don't see an AppData folder.

    cool-painting-92220

    12/17/2021, 8:09 PM
    Hey everyone! For Snowflake metadata ingestion jobs, if I were to create a Snowflake user to access data through, what would be the bare minimum access privileges that I would need to grant the user (without any need for query stats or table lineage)?

    miniature-eve-89383

    12/17/2021, 8:41 PM
    Is it normal that I can't create terms?

    miniature-eve-89383

    12/17/2021, 9:05 PM
Is there a known issue with multi-word tags? When you search for a multi-word tag, it searches for each word separately. If we have "Corporate info" and "Financial info", both will come out no matter which of the two we search for.
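The behaviour described is what you would expect if the search layer tokenizes the tag name and ORs the terms, so any tag sharing a token (here "info") matches. A toy illustration of OR-token matching versus exact-phrase matching (not DataHub's actual analyzer):

```python
TAGS = ["Corporate info", "Financial info"]

def token_match(query, tags):
    """Match if ANY query token appears in the tag (OR semantics)."""
    tokens = query.lower().split()
    return [t for t in tags if any(tok in t.lower().split() for tok in tokens)]

def phrase_match(query, tags):
    """Match only the exact phrase."""
    return [t for t in tags if query.lower() == t.lower()]

both = token_match("Corporate info", TAGS)    # both tags share "info"
exact = phrase_match("Corporate info", TAGS)  # only the exact tag
```

If the backend supports it, quoting the query as an exact phrase is the usual way to get the second behaviour instead of the first.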

    miniature-eve-89383

    12/18/2021, 3:16 AM
It looks like the ingest connectors don't support the PostgreSQL json data type:
    'db.public.table': ['Profiling exception (psycopg2.errors.UndefinedFunction) operator does not exist: json = '
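Until json columns are handled by the profiler, one stopgap is to turn profiling off (or scope it away from the affected tables) in the recipe. The block below is a sketch; field names should be checked against the postgres source docs for your version:

```yaml
source:
  type: postgres
  config:
    # ...connection settings...
    profiling:
      enabled: false   # skip profiling until json columns are supported
```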

    kind-engineer-69109

    12/20/2021, 7:54 AM
Hello everyone, I just wanted to know: can we integrate Airflow's DAG run status into DataHub, so that we know whether it is succeeding or failing?

    bumpy-activity-74405

    12/20/2021, 8:03 AM
    It seems that you cannot have search terms with commas. Is this a known issue? Running
    0.8.19
    .

    kind-engineer-69109

    12/20/2021, 8:08 AM
Hello all, I tried upgrading using datahub docker upgrade but got this error. Please help troubleshoot:
    Error starting ApplicationContext. To display the conditions report re-run your application with 'debug' enabled.
    2021-12-20 08:00:20.382 ERROR 1 --- [      main] o.s.boot.SpringApplication        : Application run failed
    
    org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'upgradeCli': Unsatisfied dependency expressed through field 'noCodeUpgrade'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'ebeanServer' defined in class path resource [com/linkedin/gms/factory/entity/EbeanServerFactory.class]: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [io.ebean.EbeanServer]: Factory method 'createServer' threw exception; nested exception is java.lang.NullPointerException
    	at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredFieldElement.inject(AutowiredAnnotationBeanPostProcessor.java:643) ~[spring-beans-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	at org.springframework.beans.factory.annotation.InjectionMetadata.inject(InjectionMetadata.java:116) ~[spring-beans-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor.postProcessProperties(AutowiredAnnotationBeanPostProcessor.java:399) ~[spring-beans-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1422) ~[spring-beans-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:594) ~[spring-beans-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:517) ~[spring-beans-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:323) ~[spring-beans-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222) ~[spring-beans-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:321) ~[spring-beans-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:202) ~[spring-beans-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:879) ~[spring-beans-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:878) ~[spring-context-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:550) ~[spring-context-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:775) ~[spring-boot-2.1.4.RELEASE.jar!/:2.1.4.RELEASE]
    	at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:397) ~[spring-boot-2.1.4.RELEASE.jar!/:2.1.4.RELEASE]
    	at org.springframework.boot.SpringApplication.run(SpringApplication.java:316) ~[spring-boot-2.1.4.RELEASE.jar!/:2.1.4.RELEASE]
    	at org.springframework.boot.builder.SpringApplicationBuilder.run(SpringApplicationBuilder.java:139) [spring-boot-2.1.4.RELEASE.jar!/:2.1.4.RELEASE]
    	at com.linkedin.datahub.upgrade.UpgradeCliApplication.main(UpgradeCliApplication.java:14) [classes!/:na]
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_312]
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_312]
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_312]
    	at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_312]
    	at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:48) [datahub-upgrade.jar:na]
    	at org.springframework.boot.loader.Launcher.launch(Launcher.java:87) [datahub-upgrade.jar:na]
    	at org.springframework.boot.loader.Launcher.launch(Launcher.java:50) [datahub-upgrade.jar:na]
    	at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:51) [datahub-upgrade.jar:na]
    Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'ebeanServer' defined in class path resource [com/linkedin/gms/factory/entity/EbeanServerFactory.class]: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [io.ebean.EbeanServer]: Factory method 'createServer' threw exception; nested exception is java.lang.NullPointerException
    	at org.springframework.beans.factory.support.ConstructorResolver.instantiate(ConstructorResolver.java:656) ~[spring-beans-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:484) ~[spring-beans-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1338) ~[spring-beans-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1177) ~[spring-beans-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:557) ~[spring-beans-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:517) ~[spring-beans-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:323) ~[spring-beans-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222) ~[spring-beans-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:321) ~[spring-beans-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:202) ~[spring-beans-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:310) ~[spring-beans-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:202) ~[spring-beans-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:276) ~[spring-beans-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1287) ~[spring-beans-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1207) ~[spring-beans-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredFieldElement.inject(AutowiredAnnotationBeanPostProcessor.java:640) ~[spring-beans-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	... 25 common frames omitted
    Caused by: org.springframework.beans.BeanInstantiationException: Failed to instantiate [io.ebean.EbeanServer]: Factory method 'createServer' threw exception; nested exception is java.lang.NullPointerException
    	at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:185) ~[spring-beans-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	at org.springframework.beans.factory.support.ConstructorResolver.instantiate(ConstructorResolver.java:651) ~[spring-beans-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	... 40 common frames omitted
    Caused by: java.lang.NullPointerException: null
    	at io.ebean.datasource.pool.ConnectionPool.notifyDataSourceIsDown(ConnectionPool.java:406) ~[ebean-datasource-4.3.3.jar!/:na]
    	at io.ebean.datasource.pool.ConnectionPool.createUnpooledConnection(ConnectionPool.java:535) ~[ebean-datasource-4.3.3.jar!/:na]
    	at io.ebean.datasource.pool.ConnectionPool.createUnpooledConnection(ConnectionPool.java:524) ~[ebean-datasource-4.3.3.jar!/:na]
    	at io.ebean.datasource.pool.ConnectionPool.createConnectionForQueue(ConnectionPool.java:766) ~[ebean-datasource-4.3.3.jar!/:na]
    	at io.ebean.datasource.pool.PooledConnectionQueue.ensureMinimumConnections(PooledConnectionQueue.java:227) ~[ebean-datasource-4.3.3.jar!/:na]
    	at io.ebean.datasource.pool.ConnectionPool.initialise(ConnectionPool.java:301) ~[ebean-datasource-4.3.3.jar!/:na]
    	at io.ebean.datasource.pool.ConnectionPool.<init>(ConnectionPool.java:246) ~[ebean-datasource-4.3.3.jar!/:na]
    	at io.ebean.datasource.core.Factory.createPool(Factory.java:15) ~[ebean-datasource-4.3.3.jar!/:na]
    	at io.ebeaninternal.server.core.DefaultContainer.getDataSourceFromConfig(DefaultContainer.java:273) ~[ebean-11.33.3.jar!/:na]
    	at io.ebeaninternal.server.core.DefaultContainer.setDataSource(DefaultContainer.java:217) ~[ebean-11.33.3.jar!/:na]
    	at io.ebeaninternal.server.core.DefaultContainer.createServer(DefaultContainer.java:103) ~[ebean-11.33.3.jar!/:na]
    	at io.ebeaninternal.server.core.DefaultContainer.createServer(DefaultContainer.java:35) ~[ebean-11.33.3.jar!/:na]
    	at io.ebean.EbeanServerFactory.createInternal(EbeanServerFactory.java:109) ~[ebean-11.33.3.jar!/:na]
    	at io.ebean.EbeanServerFactory.create(EbeanServerFactory.java:70) ~[ebean-11.33.3.jar!/:na]
    	at com.linkedin.gms.factory.entity.EbeanServerFactory.createServer(EbeanServerFactory.java:31) ~[factories.jar!/:na]
    	at com.linkedin.gms.factory.entity.EbeanServerFactory$$EnhancerBySpringCGLIB$$bebb7a06.CGLIB$createServer$0(<generated>) ~[factories.jar!/:na]
    	at com.linkedin.gms.factory.entity.EbeanServerFactory$$EnhancerBySpringCGLIB$$bebb7a06$$FastClassBySpringCGLIB$$875b2117.invoke(<generated>) ~[factories.jar!/:na]
    	at org.springframework.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:244) ~[spring-core-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.intercept(ConfigurationClassEnhancer.java:363) ~[spring-context-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	at com.linkedin.gms.factory.entity.EbeanServerFactory$$EnhancerBySpringCGLIB$$bebb7a06.createServer(<generated>) ~[factories.jar!/:na]
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_312]
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_312]
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_312]
    	at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_312]
    	at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:154) ~[spring-beans-5.2.3.RELEASE.jar!/:5.2.3.RELEASE]
    	... 41 common frames omitted

    busy-zebra-64439

    12/20/2021, 8:57 AM
Hi team, I have an issue while setting up the ingestor. I prepared the connection YAML and ran the ingestion command datahub ingest -c mysql_ingestor.yml, but it throws the error mysql is disabled; try running: pip install 'acryl-datahub[mysql]'. When I run pip install 'acryl-datahub[mysql]', I get: ERROR: Could not find a version that satisfies the requirement acryl-datahub[mysql] (from versions: none) ERROR: No matching distribution found for acryl-datahub[mysql]. How do I activate the mysql source for ingestion?

    wonderful-quill-11255

    12/20/2021, 12:36 PM
    Hi. In the
    datahub-web-react
    module, it seems that the generated
    .ts
    files do not get cleaned up when running
    gradlew clean
and are not regenerated if they already exist when running
gradlew build
. This causes builds to fail when jumping between branches. Am I missing something?

    busy-zebra-64439

    12/21/2021, 4:13 AM
Hi team, I have already set up DataHub locally and am able to ingest the sample data; I used the command datahub docker quickstart --build-locally to run it. I need to set up the same on one of our organisation's Linux boxes, but that box does not have any public internet connection, so I cannot install it using python3 -m pip install --upgrade acryl-datahub. Is there any other way to do it, like installing acryl-datahub as a docker container?

    green-football-48146

    12/21/2021, 7:25 AM
Hi all, whenever I haven’t interacted with datahub-frontend for a period of time (about 30 minutes) and then click on the page or refresh, there is an
    ERROR 500
message as shown in the picture. Only when I go back to the homepage and refresh does the error disappear. The error message from
    datahub-gms
    shows
    Connection reset by peer
. Can anyone help? Thanks.
    Caused by: com.linkedin.metadata.dao.exception.ESQueryException: Search query failed:
    	at com.linkedin.metadata.search.elasticsearch.query.ESSearchDAO.executeAndExtract(ESSearchDAO.java:62)
    	at com.linkedin.metadata.search.elasticsearch.query.ESSearchDAO.search(ESSearchDAO.java:89)
    	at com.linkedin.metadata.search.elasticsearch.ElasticSearchService.search(ElasticSearchService.java:66)
    	at com.linkedin.entity.client.JavaEntityClient.search(JavaEntityClient.java:239)
    	at com.linkedin.datahub.graphql.resolvers.search.SearchResolver.lambda$get$0(SearchResolver.java:47)
    	... 2 common frames omitted
    Caused by: java.io.IOException: Connection reset by peer
    	at org.elasticsearch.client.RestClient.extractAndWrapCause(RestClient.java:854)
    	at org.elasticsearch.client.RestClient.performRequest(RestClient.java:259)
    	at org.elasticsearch.client.RestClient.performRequest(RestClient.java:246)
    	at org.elasticsearch.client.RestHighLevelClient.internalPerformRequest(RestHighLevelClient.java:1613)
    	at org.elasticsearch.client.RestHighLevelClient.performRequest(RestHighLevelClient.java:1583)
    	at org.elasticsearch.client.RestHighLevelClient.performRequestAndParseEntity(RestHighLevelClient.java:1553)
    	at org.elasticsearch.client.RestHighLevelClient.search(RestHighLevelClient.java:1069)
    	at com.linkedin.metadata.search.elasticsearch.query.ESSearchDAO.executeAndExtract(ESSearchDAO.java:57)
    	... 6 common frames omitted

    kind-engineer-69109

    12/21/2021, 7:27 AM
Hi everyone, I just upgraded DataHub to a newer version. Data ingested properly, but I cannot see it in the frontend; I see an error log in the backend. Thanks in advance.

    future-petabyte-5942

    12/21/2021, 7:56 AM
Hi guys, I'm new here and to DataHub. I deployed DataHub in Docker (Windows). It works well, but the mysql plugin is not enabled even after running
    pip install acryl-datahub[mysql]
After installing this, I even tried restarting Docker, but no use. Any idea? Thanks.

    microscopic-elephant-47912

    12/21/2021, 12:59 PM
Hi all, I'm trying to ingest LookML files but I get an error. I looked around but could not find a solution or a bug report. Could you please check?

    handsome-belgium-11927

    12/21/2021, 2:54 PM
Hi, guys. I'm a little stuck with LDAP; has anybody managed to make it work? I was trying to follow this chapter: https://datahubproject.io/docs/datahub-frontend/#authentication But I don't understand whether I have to rebuild the image or just edit the conf file and start compose.

    red-window-75368

    12/21/2021, 4:53 PM
Hi, I am using mysql ingestion, and in the recipe I am specifying the database from which I want to ingest metadata. But instead of ingesting only the tables from the database I chose, DataHub ingested all the default tables present in every database inside my mysql. Any thoughts on how I can ingest only the tables from a specific database?
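A recipe-level filter is the usual way to scope a SQL ingestion run. In the mysql source, MySQL databases surface as schemas, so an allow pattern on the schema (plus a deny for the built-in system schemas) should restrict the run; the field names and regexes below are a sketch to verify against the mysql source docs for your version, and the credentials and database name are placeholders:

```yaml
source:
  type: mysql
  config:
    host_port: "localhost:3306"
    username: user        # placeholder credentials
    password: pass
    schema_pattern:
      allow:
        - "^my_database$"   # hypothetical target database
      deny:
        - "^(mysql|sys|information_schema|performance_schema)$"
sink:
  type: datahub-rest
  config:
    server: "http://localhost:8080"
```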

    red-pizza-28006

    12/22/2021, 12:59 PM
Started noticing this exception with the snowflake-usage ingestion:
    [2021-12-22 12:43:15,525] {{cursor.py:705}} INFO - query: [SELECT CAST('test plain returns' AS VARCHAR(60)) AS anon_1]
    [2021-12-22 12:43:15,630] {{cursor.py:729}} INFO - query execution done
    [2021-12-22 12:43:15,651] {{cursor.py:705}} INFO - query: [SELECT CAST('test unicode returns' AS VARCHAR(60)) AS anon_1]
    [2021-12-22 12:43:15,729] {{cursor.py:729}} INFO - query execution done
    [2021-12-22 12:43:15,746] {{cursor.py:705}} INFO - query: [ROLLBACK]
    [2021-12-22 12:43:15,817] {{cursor.py:729}} INFO - query execution done
    [2021-12-22 12:43:15,843] {{cursor.py:705}} INFO - query: [SELECT -- access_history.query_id, -- only for debugging purposes access_history...]
    [2021-12-22 12:43:49,057] {{cursor.py:729}} INFO - query execution done
    [2021-12-22 12:43:49,718] {{logging_mixin.py:104}} WARNING - Traceback (most recent call last):
    [2021-12-22 12:43:49,761] {{logging_mixin.py:104}} WARNING -   File "/usr/local/airflow/.local/lib/python3.7/site-packages/datahub/ingestion/source/usage/snowflake_usage.py", line 339, in _get_snowflake_history
        event = SnowflakeJoinedAccessEvent(**event_dict)
    [2021-12-22 12:43:49,775] {{logging_mixin.py:104}} WARNING -   File "pydantic/main.py", line 406, in pydantic.main.BaseModel.__init__
    [2021-12-22 12:43:49,799] {{logging_mixin.py:104}} WARNING - pydantic.error_wrappers.ValidationError: 1 validation error for SnowflakeJoinedAccessEvent
    email
      none is not an allowed value (type=type_error.none.not_allowed)

    gray-carpet-60705

    12/22/2021, 2:41 PM
Hello, I’m trying to upgrade from v0.8.17 to the latest v0.8.20 release and running into an error for mae-consumer. The ES-setup job completed successfully, and gms seems to run okay. Currently, DataHub is deployed on EKS using the Helm chart with AWS MSK, ES, and Postgres resources. Any insight would be appreciated. Thank you!

    gentle-florist-49869

    12/22/2021, 5:50 PM
    https://github.com/linkedin/datahub/blob/master/docker/docker-compose.consumers.yml

    abundant-receptionist-6114

    12/22/2021, 8:02 PM
Hello, I'm trying to write some metadata into DataHub using the datahub-rest sink type. I see many datahub-rest messages in the logs, and I see logs in gms, but I don't see anything in the UI 😢 Previously it worked. I have a fresh installation (deleted the MySQL, Elastic, and neo4j PVs in K8s, and the topics from the Dockerfile).