# getting-started

    glamorous-easter-15952

    06/13/2022, 3:34 PM
How do I consume messages from Confluent Cloud? Everything went smoothly with UI ingestion for Kafka in terms of clusters and topics, except for the messages produced there. Is there a guide? Thanks.

    handsome-stone-44066

    06/14/2022, 7:23 AM
Hi, all. I want to validate my dataframe with Great Expectations in in-memory mode. When I get the GE result and try to send it to DataHub via the acryl-datahub[great-expectations] API, I get this warning:
DataHubValidationAction does not recognize this GE data asset type - <class 'great_expectations.validator.validator.Validator'>. This is either using v2-api or execution engine other than sqlalchemy.
What is the right way to use it? Or can I format the GE result and send it to DataHub, bypassing the API?
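For reference, DataHub's GE integration is normally wired in as a validation action in a v3-API checkpoint that runs against a SqlAlchemy-backed datasource (the in-memory Validator path is exactly what triggers the warning above). A minimal sketch of the checkpoint fragment, with the server URL as a placeholder:

```yaml
# Sketch of a GE v3 checkpoint action_list routing validation results to DataHub.
# Assumes a SqlAlchemy execution engine; adjust server_url to your GMS endpoint.
action_list:
  - name: datahub_action
    action:
      module_name: datahub.integrations.great_expectations.action
      class_name: DataHubValidationAction
      server_url: http://localhost:8080
```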

    gentle-camera-33498

    06/14/2022, 2:58 PM
Hello, can someone help me? I'm getting an error on the GMS server. Error details in the thread.

    glamorous-easter-15952

    06/14/2022, 5:24 PM
Is it possible to connect to Aiven Kafka rather than Confluent Cloud?
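Aiven Kafka is still plain Kafka, typically fronted by SASL_SSL, so the standard kafka recipe with librdkafka-style consumer settings should apply. A hedged sketch (host, port, SASL mechanism, and credentials are placeholders to check against your Aiven service):

```yaml
# Hypothetical kafka ingestion recipe for an Aiven-hosted cluster.
source:
  type: kafka
  config:
    connection:
      bootstrap: "my-service.aivencloud.com:12345"   # placeholder Aiven host:port
      schema_registry_url: "https://my-service.aivencloud.com:12346"  # if a registry is enabled
      consumer_config:
        security.protocol: "SASL_SSL"
        sasl.mechanism: "SCRAM-SHA-256"              # verify the mechanism your service uses
        sasl.username: "${KAFKA_USERNAME}"
        sasl.password: "${KAFKA_PASSWORD}"
sink:
  type: datahub-rest
  config:
    server: "http://localhost:8080"
```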

    adorable-addition-82413

    06/14/2022, 7:04 PM
Hey guys, I'm trying to understand the DataHub component architecture. What is the role of the internal MySQL database listed in the prerequisites?

    busy-analyst-8258

    06/14/2022, 8:02 PM
    Hello All,

    busy-analyst-8258

    06/14/2022, 8:04 PM
Hello all, there is a discrepancy causing the counts for tables and DBs on the MDH home page to differ between Datasets and Platforms. How can I identify what causes the difference?

    late-zoo-31017

    06/15/2022, 1:55 AM
    Hi all. I have 2 questions

    late-zoo-31017

    06/15/2022, 1:58 AM
1. I noticed that I cannot associate a domain with a user or a group. Also, I cannot add browse paths to, say, platforms or data process instances. The reason is that some aspects (e.g. 'browsePaths') are not compatible with certain *entityType*s. Is there anywhere a list of '_what-entity-is-compatible-with-what-aspectName_'?

    late-zoo-31017

    06/15/2022, 2:00 AM
2. I can add dataplatforms + dataprocess instances, but they are NOT being displayed (in the catalogue/UI; I can see them in the db). Also, when I associate 2 datasets with a dataprocess instance (as input + output), these appear to have lineage (i.e. the tab is enabled), but if you go to see what is there... it is empty. Is that to be expected?

    late-zoo-31017

    06/15/2022, 2:03 AM
(The bigger problem: I am trying to model some experiments that input and output files (think datasets). What hierarchy can I "build" to show dependencies? So far I have only managed to create 'orchestrator->dataflow->datajob->dataset' (in the navigation path). Is there any other way? Mind you, I also want the stuff displayable in the catalogue.)

    late-zoo-31017

    06/15/2022, 2:06 AM
Any idea/hint/... re the above would be much appreciated.
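One hedged way to wire experiment inputs and outputs together without the full orchestrator->dataflow->datajob chain is to emit upstream lineage between the datasets themselves. The sketch below is dependency-free: it hand-builds DataHub-style dataset URNs (the layout follows the documented URN convention) and an upstreamLineage-shaped payload whose field names are assumptions to verify against the metadata model; in practice the acryl-datahub SDK's lineage helpers do this for you.

```python
# Sketch: construct DataHub dataset URNs and an upstreamLineage-style aspect
# body by hand. Field names in the aspect dict are assumptions, not a
# verified wire format.

def make_dataset_urn(platform, name, env="PROD"):
    """Build a dataset URN per DataHub's documented convention."""
    return f"urn:li:dataset:(urn:li:dataPlatform:{platform},{name},{env})"

def upstream_lineage_aspect(upstream_urns):
    """Shape an upstreamLineage-like aspect linking upstreams to a dataset."""
    return {
        "upstreams": [
            {"dataset": urn, "type": "TRANSFORMED"} for urn in upstream_urns
        ]
    }

# Example: one experiment output depends on one input file.
inp = make_dataset_urn("file", "experiments/absorptions/input.csv")
out = make_dataset_urn("file", "experiments/absorptions/output.csv")
aspect = upstream_lineage_aspect([inp])
```

Emitting such an aspect against the output dataset is what makes the Lineage tab show the input directly, with no datajob in between.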

    great-nest-9369

    06/15/2022, 8:43 AM
Hi, team! Could you tell me the difference between a Test and an Assertion? And what are their usage scenarios?

    swift-state-17795

    06/15/2022, 9:37 AM
Hi team, are there other existing config options when using file-based lineage? • https://datahubproject.io/docs/generated/ingestion/sources/file-based-lineage In this documentation, I can only see EntityNodeConfig and EntityConfig.
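For reference, the lineage file that source consumes is fairly small; a hedged sketch of one (entity names and platforms here are made up):

```yaml
# Hypothetical file-based lineage file: topic_a is downstream of table_b.
version: 1
lineage:
  - entity:
      name: topic_a        # downstream entity
      type: dataset
      env: PROD
      platform: kafka
    upstream:
      - entity:
          name: table_b    # upstream entity
          type: dataset
          env: PROD
          platform: hive
```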

    millions-sundown-65420

    06/15/2022, 10:45 AM
Hi team, a question around users: what's the best way to add new users via the UI and associate them with specific datasets so that they cannot access other datasets?

    mysterious-parrot-80195

    06/16/2022, 6:41 AM
Hi team, I am wondering if there is a better way to add documentation for a table. Must I first upload my PNG to some path on my server and then insert it with a URL, instead of Ctrl-C and Ctrl-V?

    tall-butcher-30509

    06/16/2022, 8:28 AM
Hi, I'm trying to build for the first time, but I'm stuck on this Gradle failure:
    Execution failed for task ':metadata-ingestion:codegen'.
    > Process 'command 'bash'' finished with non-zero exit value 1

    acoustic-carpenter-87261

    06/16/2022, 4:04 PM
Hello team! I'm trying to set up DataHub and send our inlets/outlets from Airflow to DataHub. We have an edge case and I'd be curious if somebody has had this use case before. We have a dag, and we run it with different data sources and different inlets based on type. Example: dag run 1: type A, dag run 2: type B, dag run 3: type A. Is there a way to see these two logical groups individually? Right now, because we have the same dag id, the lineage is always updated by the latest dag run. It would be interesting to have them grouped by type, or at least to see both pipelines somehow.

    gray-jelly-64425

    06/17/2022, 3:38 AM
Hi, is getting the fine-grained data lineage implementation into the UI still a focus for Q2?

    wide-apple-53149

    06/17/2022, 8:15 AM
./gradlew :metadata-ingestion:installDev
failing with the following error:
ERROR: Exception: Traceback (most recent call last): ... socket.timeout: The read operation timed out
During handling of the above exception, another exception occurred: [pip dependency-resolver and download stack frames elided] pip._vendor.urllib3.exceptions.ReadTimeoutError: HTTPSConnectionPool(host='files.pythonhosted.org', port=443): Read timed out.
> Task :metadata-ingestion:installDev FAILED
FAILURE: Build failed with an exception.
* What went wrong: Execution failed for task ':metadata-ingestion:installDev'.
> Process 'command 'bash'' finished with non-zero exit value 2
* Try: Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0. Use '--warning-mode all' to show the individual deprecation warnings. See https://docs.gradle.org/6.9.2/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 9m 26s
26 actionable tasks: 9 executed, 17 up-to-date
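The root cause above is a PyPI read timeout, not a code problem, so one hedged workaround is simply to raise pip's timeout and retry (pip honors the PIP_DEFAULT_TIMEOUT environment variable even when invoked from the Gradle task):

```shell
# Retry the install with a longer pip download timeout; the failure was a
# read timeout fetching a wheel from files.pythonhosted.org.
export PIP_DEFAULT_TIMEOUT=100
./gradlew :metadata-ingestion:installDev
```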

    powerful-account-82153

    06/18/2022, 2:38 AM
Hi, friends, is there any way to deploy DataHub offline? Many friends in China encounter the same network problem when fetching containers. Any advice or answer will be appreciated.

    silly-judge-96653

    06/20/2022, 3:34 AM
Hi, at the Airflow Summit, John & Tamas gave information about DataHub operations, Assertions & Incidents. Is the feature already released? Is it only for Acryl Data customers (not the community)?

    incalculable-lizard-71180

    06/20/2022, 9:43 AM
Hey, we have deployed DataHub using a Helm chart. We want to enable access tokens and create new users. Can someone help me here?
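In case it helps others reading along: access tokens require metadata service authentication to be switched on in the chart's values. A hedged sketch of the values.yaml fragment (key path as documented for the datahub Helm chart; verify against your chart version):

```yaml
# values.yaml fragment: enables metadata service authentication so personal
# access tokens can be generated from the UI (Settings > Access Tokens).
global:
  datahub:
    metadata_service_authentication:
      enabled: true
```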

    microscopic-helicopter-87069

    06/20/2022, 11:15 AM
Hi team, I am trying to install the Hive plugin like this: `pip install 'acryl-datahub[hive]==0.8.35'`. But I am getting an error:
Running setup.py install for sasl3 did not run successfully.
This error originates from a subprocess, and is likely not a problem with pip.
error: legacy-install-failure
Can someone help me here?
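The sasl3 build failure is usually missing system SASL headers rather than a pip problem. A hedged sketch of the usual fix on Debian/Ubuntu (package names differ on other distros, and the assumption here is that the legacy-install-failure comes from the missing C headers):

```shell
# Install the C toolchain and libsasl2 headers the sasl3 wheel build
# compiles against, then retry the plugin install.
sudo apt-get update
sudo apt-get install -y build-essential python3-dev libsasl2-dev
pip install 'acryl-datahub[hive]==0.8.35'
```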

    gentle-camera-33498

    06/20/2022, 1:47 PM
Hello guys, is there a way to clean the persistent volumes on a K8s deployment (similar to the 'datahub docker nuke' command)?
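There is no built-in nuke for Kubernetes, but the moral equivalent is deleting the release's PersistentVolumeClaims after uninstalling. A hedged sketch (release names and namespace are assumptions about a typical datahub + prerequisites install; review before deleting, as this destroys stored metadata):

```shell
# Uninstall the releases, then remove the PVCs left behind by the stateful
# dependencies (MySQL, Elasticsearch, Kafka, ...).
helm uninstall datahub --namespace datahub
helm uninstall prerequisites --namespace datahub
kubectl get pvc --namespace datahub            # review what would be deleted
kubectl delete pvc --namespace datahub --all   # or delete selectively by name
```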

    late-zoo-31017

    06/20/2022, 10:00 PM
I have a need to create "custom" browse paths for entities (datasets, but also datajobs, and other entities I will create myself). I notice that dataset browse paths "behave well", i.e. you can see them on the landing page (or almost). Datajob browse paths, unfortunately, I can only see once I reach the datajob via the standard path (clicking on Tasks, etc.). Am I doing something wrong? Also, is there any way to make browse paths "first-class citizens" so they appear on the landing page (or almost)? The reason is that my users may have their own "lingo" about things, and I want to enable them to navigate the DataHub world using that lingo (so I would create paths like '/Experiments/data/absorptions/nickel...'), and I would like to give users an easy way to get to the start of that navigation-as-per-their-worldview page.

    quaint-keyboard-51916

    06/21/2022, 1:21 AM
Hi team, I want to find the acryldata/acryl-datahub-actions source code. Can someone help me here?
- issue: https://github.com/datahub-project/datahub/issues/5202
- docker image: https://hub.docker.com/r/acryldata/acryl-datahub-actions

    powerful-iron-62101

    06/21/2022, 9:52 AM
Getting this error when I'm running datahub-gms:
Failed startup of context o.e.j.w.WebAppContext@2d209079{Open source GMS,/,[file:///tmp/jetty-0_0_0_0-8080-war_war-_-any-2046710323418561835/webapp/, jar:file:///tmp/jetty-0_0_0_0-8080-war_war-_-any-2046710323418561835/webapp/WEB-INF/lib/swagger-ui-4.10.3.jar!/META-INF/resources],UNAVAILABLE}{file:///datahub/datahub-gms/bin/war.war}
org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'deleteEntityServiceFactory': Unsatisfied dependency expressed through field '_entityService'; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'entityAspectDao' defined in com.linkedin.gms.factory.entity.EntityAspectDaoFactory: Unsatisfied dependency expressed through method 'createEbeanInstance' parameter 0; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'ebeanServer' defined in com.linkedin.gms.factory.entity.EbeanServerFactory: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [io.ebean.EbeanServer]: Factory method 'createServer' threw exception; nested exception is java.lang.NullPointerException
[Spring and Jetty stack frames elided]

    bumpy-portugal-86782

    06/21/2022, 11:48 AM
Hi, I'm trying to set up the Spark agent on an Azure Databricks cluster. Has anyone gotten that to work? I'm getting a NullPointerException when I run my notebook. I've tried various Spark runtime versions (2.4.5, 3.2.1 and 3.3.0), all with the same result. I'm using spark.jars.packages io.acryl:datahub-spark-lineage:0.8.36. Any ideas? EDIT: Never mind, found a specific Databricks jar.