# getting-started
  • s

    some-crayon-90964

    10/09/2020, 7:07 PM
Which piece of code should I be looking at if I want to implement a new API or change an existing API to add new users? CorpUserViewDao seems to have all sorts of getter methods, but I am having a hard time finding a method that creates a new user. Thanks in advance!
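    For reference, creating users at this point typically goes through the ingestion path (a MetadataChangeEvent carrying a CorpUserSnapshot, as the LDAP ETL does) rather than through the DAO directly. A minimal, hypothetical sketch of the payload shape, assuming the aspect and field names from com.linkedin.identity.CorpUserInfo; the user urn is made up:
    # Sketch of an MCE that creates/updates a corp user.
    # Assumed field names (active, displayName, email, fullName) mirror the
    # bootstrap sample data; adapt to the schema version in your checkout.
    new_user_mce = {
        "proposedSnapshot": {
            "com.linkedin.metadata.snapshot.CorpUserSnapshot": {
                "urn": "urn:li:corpuser:jdoe",  # hypothetical user
                "aspects": [
                    {
                        "com.linkedin.identity.CorpUserInfo": {
                            "active": True,
                            "displayName": "Jane Doe",
                            "email": "jdoe@example.com",
                            "fullName": "Jane Doe",
                        }
                    }
                ],
            }
        }
    }
    # Serialize this with the MetadataChangeEvent Avro schema and produce it to
    # the MetadataChangeEvent_v4 topic so the MCE consumer writes it to GMS.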
  • a

    able-garden-99963

    10/14/2020, 10:55 AM
    Hi DataHub team! I wanted to double-check whether the ETA for exposing the GraphQL endpoint for frontend queries is Jan 2021, as mentioned in the roadmap doc. Thanks!
  • s

    square-greece-86505

    10/20/2020, 4:28 AM
    Hi all, I am trying to do a mini POC in my environment. I am using OpenLDAP as a test environment and have already created a user named testuser. The problem:
    • I log in via the login page
    • I am redirected to Browse (successfully logged in)
    • After a second, the UI redirects back to the Login page
    • It seems that (GET) [/api/v1/user/me] has a problem on my machine
    • I already onboarded the LDAP users via the LDAP ETL
    • I already checked that those users exist in the MySQL table metadata_aspect
    Am I missing something? I checked the datahub frontend logs:
  • s

    strong-pharmacist-65336

    10/22/2020, 7:17 AM
    How can I perform a column search in the DataHub frontend? For example, I have an empno column in table emp in database public that I want to search, but the current search only permits dataorigin, name, owners, and platform.
  • s

    strong-pharmacist-65336

    10/22/2020, 7:52 PM
    Hello all, I am getting this error. I deleted all Docker containers as described in https://github.com/linkedin/datahub/blob/master/docs/debugging.md#ive-messed-up-my-docker-setup-how-do-i-start-from-scratch and am starting Docker again as given in step 4 of https://github.com/linkedin/datahub/blob/master/docs/quickstart.md
  • b

    breezy-guitar-97226

    10/26/2020, 5:57 PM
    Hello everyone, I’ve been working with DataHub for a few days with the intention of evaluating it as a possible solution for metadata sharing and data lineage in my company. I’ve been following the docs to try to onboard a new entity in a local Docker environment. The entity is quite simple (copied from corpGroup but with a different urn and just a single property, the name), as I’m trying to understand all the steps involved in the process. Nonetheless, when I try to add an entity instance via the REST API (curl) I get the following error:
    datahub-mae-consumer    | 17:55:48.772 [mae-consumer-job-client-0-C-1] INFO  c.l.m.k.MetadataAuditEventsProcessor - {com.linkedin.metadata.snapshot.AdevintaOrganisationSnapshot={urn=urn:li:adevintaOrganisation:finance, aspects=[{com.linkedin.identity.AdevintaOrganisationInfo={name=finance}}]}}
    datahub-mae-consumer    | 17:55:48.774 [mae-consumer-job-client-0-C-1] ERROR c.l.m.k.MetadataAuditEventsProcessor - java.util.NoSuchElementException: No value present [java.util.Optional.get(Optional.java:135), com.linkedin.metadata.kafka.MetadataAuditEventsProcessor.updateNeo4j(MetadataAuditEventsProcessor.java:83), com.linkedin.metadata.kafka.MetadataAuditEventsProcessor.consume(MetadataAuditEventsProcessor.java:68), sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method), sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62), sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43), java.lang.reflect.Method.invoke(Method.java:498), org.springframework.messaging.handler.invocation.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:171), org.springframework.messaging.handler.invocation.InvocableHandlerMethod.invoke(InvocableHandlerMethod.java:120), org.springframework.kafka.listener.adapter.HandlerAdapter.invoke(HandlerAdapter.java:48), org.springframework.kafka.listener.adapter.MessagingMessageListenerAdapter.invokeHandler(MessagingMessageListenerAdapter.java:283), org.springframework.kafka.listener.adapter.RecordMessagingMessageListenerAdapter.onMessage(RecordMessagingMessageListenerAdapter.java:79), org.springframework.kafka.listener.adapter.RecordMessagingMessageListenerAdapter.onMessage(RecordMessagingMessageListenerAdapter.java:50), org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeOnMessage(KafkaMessageListenerContainer.java:1327), org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeOnMessage(KafkaMessageListenerContainer.java:1307), org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeRecordListener(KafkaMessageListenerContainer.java:1267), org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeWithRecords(KafkaMessageListenerContainer.java:1248), org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeRecordListener(KafkaMessageListenerContainer.java:1162), org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeListener(KafkaMessageListenerContainer.java:971), org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.pollAndInvoke(KafkaMessageListenerContainer.java:775), org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:708), java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511), java.util.concurrent.FutureTask.run(FutureTask.java:266), java.lang.Thread.run(Thread.java:748)]
    Could you help me debug this, or point me in the right direction? Thank you!
  • s

    shy-lizard-17779

    10/27/2020, 1:39 AM
    Hi, I'm new to DataHub and I'm trying to understand the DB model and the source of truth. I saw that this file creates a table: https://github.com/linkedin/datahub/blob/b8e18b0b5d56b4fa69b4bc35e8055176f9577dee/docker/postgres/init.sql However, I don't fully understand whether all of the data (even if there are 10B+ records) will live in that one table, whether you have seen that approach scale, or whether other tables are generated on the fly. Is it also an append-only table?
  • c

    chilly-barista-6524

    10/27/2020, 9:35 AM
    Hey, I am trying to deploy DataHub using the Helm charts contributed by the community, following this documentation: https://github.com/linkedin/datahub/tree/master/contrib/kubernetes For testing I am using quickstart.sh for Kafka, ES, MySQL, and Neo4j, and the Helm charts for deploying DataHub. Here are the changes I made for this: https://github.com/grofers/datahub/commit/c5f99bdf183b47a9d13a7fd32da5b7f72b1fa2d5 The UI and login are working, but there seems to be some issue with the MCE consumer. When I check the logs I get this for the MCE service:
  • c

    chilly-barista-6524

    10/27/2020, 9:35 AM
    09:21:19.423 [mce-consumer-job-client-0-C-1] ERROR o.s.k.listener.LoggingErrorHandler - Error while processing: null
    org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition MetadataChangeEvent_v4-0 at offset 0. If needed, please seek past the record to continue consumption.
    Caused by: org.apache.kafka.common.errors.SerializationException: Error retrieving Avro unknown schema for id 1
    Caused by: java.net.UnknownHostException: schema-registry
    	at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:184)
    	at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
    	at java.net.Socket.connect(Socket.java:589)
    	at sun.net.NetworkClient.doConnect(NetworkClient.java:175)
    	at sun.net.www.http.HttpClient.openServer(HttpClient.java:463)
    	at sun.net.www.http.HttpClient.openServer(HttpClient.java:558)
    	at sun.net.www.http.HttpClient.<init>(HttpClient.java:242)
    	at sun.net.www.http.HttpClient.New(HttpClient.java:339)
    	at sun.net.www.http.HttpClient.New(HttpClient.java:357)
    	at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:1220)
    	at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1156)
    	at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:1050)
    	at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:984)
    	at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1564)
    	at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1492)
    	at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:480)
    	at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:272)
    	at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:351)
    	at io.confluent.kafka.schemaregistry.client.rest.RestService.getId(RestService.java:659)
    	at io.confluent.kafka.schemaregistry.client.rest.RestService.getId(RestService.java:641)
    	at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.getSchemaByIdFromRegistry(CachedSchemaRegistryClient.java:217)
    	at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.getSchemaBySubjectAndId(CachedSchemaRegistryClient.java:291)
    	at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.getSchemaById(CachedSchemaRegistryClient.java:276)
    	at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer$DeserializationContext.schemaFromRegistry(AbstractKafkaAvroDeserializer.java:273)
    	at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer.deserialize(AbstractKafkaAvroDeserializer.java:97)
    	at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer.deserialize(AbstractKafkaAvroDeserializer.java:76)
    	at io.confluent.kafka.serializers.KafkaAvroDeserializer.deserialize(KafkaAvroDeserializer.java:55)
    	at org.apache.kafka.common.serialization.Deserializer.deserialize(Deserializer.java:60)
    	at org.apache.kafka.clients.consumer.internals.Fetcher.parseRecord(Fetcher.java:1324)
    	at org.apache.kafka.clients.consumer.internals.Fetcher.access$3400(Fetcher.java:129)
    	at org.apache.kafka.clients.consumer.internals.Fetcher$CompletedFetch.fetchRecords(Fetcher.java:1555)
    	at org.apache.kafka.clients.consumer.internals.Fetcher$CompletedFetch.access$1700(Fetcher.java:1391)
    	at org.apache.kafka.clients.consumer.internals.Fetcher.fetchRecords(Fetcher.java:683)
    	at org.apache.kafka.clients.consumer.internals.Fetcher.fetchedRecords(Fetcher.java:634)
    	at org.apache.kafka.clients.consumer.KafkaConsumer.pollForFetches(KafkaConsumer.java:1290)
    	at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1248)
    	at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1216)
    	at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.pollAndInvoke(KafkaMessageListenerContainer.java:757)
    	at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:708)
    	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    	at java.lang.Thread.run(Thread.java:748)
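    The java.net.UnknownHostException: schema-registry above means the MCE consumer pod cannot resolve the schema-registry hostname it was configured with (the quickstart compose network alias). A minimal connectivity check, under the assumption that the consumer image takes its registry address from the KAFKA_SCHEMAREGISTRY_URL environment variable, which would then need to point at a host reachable from the Kubernetes pods:
    # Run from the same network context as the consumer to confirm whether the
    # configured registry host resolves. "schema-registry" is the quickstart
    # alias; swap it for the externally reachable registry host/port.
    import socket
    import urllib.request

    host = "schema-registry"  # value assumed to come from KAFKA_SCHEMAREGISTRY_URL
    try:
        print(socket.gethostbyname(host))
        # /subjects is a standard Confluent Schema Registry endpoint
        print(urllib.request.urlopen(f"http://{host}:8081/subjects").read())
    except OSError as e:
        print(f"cannot reach {host}: {e}")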
  • h

    hallowed-dinner-34937

    10/27/2020, 7:16 PM
    Hello, I'm trying to ingest some documentation links by adding them to the "InstitutionalMemory" aspect. Unfortunately, I'm getting this error: "Message serialization failed no value and no default for url". Does anyone have an example of how to ingest documentation links using Python so that they show up under the "Docs" tab of a Dataset entity?
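    A minimal sketch of what the aspect payload can look like, assuming the field names from com.linkedin.common.InstitutionalMemory; the "no value and no default for url" serialization error usually means one of the elements is missing its url field. The dataset and wiki URL here are hypothetical:
    import time

    # InstitutionalMemory aspect for a dataset MCE: every element needs a "url"
    # (the Avro field has no default), and createStamp needs "time" (millis)
    # plus an "actor" urn.
    institutional_memory = {
        "com.linkedin.common.InstitutionalMemory": {
            "elements": [
                {
                    "url": "https://wiki.example.com/my-dataset",
                    "description": "Runbook and schema docs",
                    "createStamp": {
                        "time": int(time.time() * 1000),
                        "actor": "urn:li:corpuser:datahub",
                    },
                }
            ]
        }
    }
    # Append this dict to the "aspects" list of the DatasetSnapshot in your MCE
    # before serializing it with the MetadataChangeEvent Avro schema.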
  • c

    chilly-barista-6524

    10/30/2020, 6:09 AM
    Hey everyone, does anyone know whether we can scale our Neo4j cluster to multiple nodes without buying the license if we are using it only internally?
  • h

    hallowed-dinner-34937

    11/03/2020, 4:45 PM
    Hi all, we're looking to build a CI/CD pipeline for our DataHub POC to support future dev work, and were wondering whether there are docs on the current DataHub testing process, or whether someone has set up such a pipeline and has suggestions on how to proceed.
  • n

    nutritious-bird-77396

    11/03/2020, 8:59 PM
    Hi LinkedIn team… can you help us understand the reasoning behind renaming the schema namespaces from com.linkedin.mxe to com.linkedin.pegasus2avro.mxe? For example: https://github.com/linkedin/datahub/blob/54fef777ac7079e6a44aad40af55e868a5844f57/metadata-events/mxe-utils-avro-1.7/src/main/java/com/linkedin/metadata/EventUtils.java#L84
  • c

    chilly-barista-6524

    11/04/2020, 6:26 PM
    https://github.com/linkedin/datahub/issues/1983 ^ Anyone else facing this issue?
  • l

    little-megabyte-1074

    11/04/2020, 7:10 PM
    Hi folks! I’ll be presenting at a Data Quality Meetup hosted by Datafold on November 19th. I’ll be talking about SpotHero’s data discovery challenges and why we are rolling out DataHub. Interested? Join me! https://app.livestorm.co/datafold/data-quality-meetup
  • s

    some-crayon-90964

    11/04/2020, 9:36 PM
    Are there any recommended strategies for merging code from the LinkedIn team? My team is having difficulty merging recent code due to the large structural changes.
  • h

    high-hospital-85984

    11/05/2020, 7:08 AM
    We’re trying to set up DataHub with the contributed Helm charts and self-hosted Elasticsearch. I’ve tried looking through the code and can’t find any reference to authentication for ES. The only thing I could find was this file, where we set xpack.security.enabled=false, which to my understanding disables authentication altogether. Is it really the case that there is no support for authenticating against ES?
  • n

    nutritious-bird-77396

    11/10/2020, 11:21 PM
    Team… `Issue`: We have been seeing some scalability issues in MAE. MCE is able to process events without lag, but MAE has been much slower. `Inference`: We found that the bottleneck is in GMS, which spends 50% of its time in MySQL. `Actions taken`: These are some of the steps taken to mitigate the issue:
    • Increased partitions in the MAE topic to 16x
    • Increased MAE and GMS instances proportionally to the topic partitions
    • Increased EBEAN_MAX_CONNECTIONS in GMS to 150 to increase the MySQL threads
    All these steps reduced the lag on the MAE topic a little, but still not satisfactorily. Are there any other tweaks or suggestions you would recommend?
  • d

    damp-telephone-61279

    11/12/2020, 4:38 PM
    Hi all, greetings from Portugal. I'm evaluating DataHub to see whether it is a good fit for my company.
  • d

    damp-telephone-61279

    11/17/2020, 4:21 PM
    Hi, I’m trying to check the rest.li interface of GMS but I can’t do it locally. It throws an exception when accessing (e.g.) http://localhost:8080/restli/restli/docs/data/com.linkedin.dataset.Dataset
  • d

    damp-telephone-61279

    11/17/2020, 4:42 PM
    How can I check the models of this API? I’m interested in using GMS.
  • d

    damp-telephone-61279

    11/19/2020, 9:41 AM
    How can I create a new DataPlatform? According to the rest.li API, only GETs are possible?
  • d

    damp-telephone-61279

    11/19/2020, 4:24 PM
    Hi! I’m having a hard time dealing with this GMS API. For example: I have some datasets created, and I want to add (in this case I’m trying create) an upstreamLineage to one of them. So I created the following request (I want to add the upstream dataset entitlement-server to the dataset bill-invoice):
    curl --location --request CREATE 'http://localhost:8080/datasets/($params:(),name:bill-invoice,origin:PROD,platform:billing)/upstreamLineage' \
    --header 'X-RestLi-Protocol-Version: 2.0.0' \
    --header 'Content-Type: application/json' \
    --data-raw '{
        "upstreams": [
            {
                "auditStamp": {
                    "actor": "urn:li:corpuser:ordering.product-offering",
                    "impersonator": "urn:li:corpuser:ordering.product-offering",
                    "time": 7
                },
                "dataset": "urn:li:dataset:(urn:li:dataPlatform:entitlement-server,es-go,DEV)",
                "type": "TRANSFORMED"
            }
        ]
    }'
    But it gives me an error:
    java.lang.IllegalArgumentException: No enum constant com.linkedin.restli.common.HttpMethod.CREATE
    Using POST results in an exception from the server. I’m basing this on the following: http://localhost:8080/restli/docs/rest/datasets.upstreamLineage
  • d

    damp-telephone-61279

    11/19/2020, 5:07 PM
    Hello again! How can I define an urn for a dataset field (used in primary keys and in DatasetFieldMapping)? Thank you for the help.
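    A hedged sketch of how such a field urn is commonly composed, assuming DatasetFieldUrn nests the parent dataset urn plus the field path (worth verifying against the urn definition in your checkout):
    # Assumption: urn:li:datasetField:(<dataset urn>,<fieldPath>)
    dataset_urn = "urn:li:dataset:(urn:li:dataPlatform:billing,bill-invoice,PROD)"
    field_path = "empno"  # hypothetical field
    dataset_field_urn = f"urn:li:datasetField:({dataset_urn},{field_path})"
    print(dataset_field_urn)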
  • d

    damp-telephone-61279

    11/24/2020, 3:24 PM
    How can I add an upstreamLineage to an already existing dataset? According to the documentation here: http://localhost:8080/restli/docs/rest/datasets.upstreamLineage the format should be something like this:
    Documentation
    Rest.li entry point: /datasets/{datasetKey}/upstreamLineage generated from: com.linkedin.metadata.resources.dataset.UpstreamLineageResource
    ...
    
    curl -v -X POST /datasets/($params:(),name:Doe,origin:DEV,platform:bar)/upstreamLineage?action=deltaUpdate -d @request_body           
    
    request_body file:
    {
      "delta" : {
        "upstreamsToUpdate" : [ ]
      }
    }
    What is a datasetKey? I want to create the upstream lineage for the following (already created) dataset:
    urn:li:dataset:(urn:li:dataPlatform:billing,bill-invoice,PROD)
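    Going by the generated documentation quoted above, the {datasetKey} appears to be the (name, origin, platform) triple from the dataset urn written as a Rest.li complex key in the URL path. A hedged sketch of the deltaUpdate call for that urn, assuming the key's platform field is the full data platform urn (percent-encoded) and reusing the upstream record shape from the earlier payload:
    import json
    import urllib.parse
    import urllib.request

    # Build the Rest.li complex key from the parts of
    # urn:li:dataset:(urn:li:dataPlatform:billing,bill-invoice,PROD).
    # Assumption: "platform" takes the full data platform urn; if the server
    # rejects it, try the bare platform name ("billing") instead.
    platform = urllib.parse.quote("urn:li:dataPlatform:billing", safe="")
    key = f"($params:(),name:bill-invoice,origin:PROD,platform:{platform})"

    body = {
        "delta": {
            "upstreamsToUpdate": [
                {
                    "auditStamp": {"time": 0, "actor": "urn:li:corpuser:datahub"},
                    "dataset": "urn:li:dataset:(urn:li:dataPlatform:entitlement-server,es-go,DEV)",
                    "type": "TRANSFORMED",
                }
            ]
        }
    }

    req = urllib.request.Request(
        f"http://localhost:8080/datasets/{key}/upstreamLineage?action=deltaUpdate",
        data=json.dumps(body).encode(),
        headers={
            "X-RestLi-Protocol-Version": "2.0.0",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    print(urllib.request.urlopen(req).read().decode())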
  • d

    damp-telephone-61279

    11/24/2020, 3:30 PM
    {
      "exceptionClass": "com.linkedin.restli.server.RestLiServiceException",
      "stackTrace": "com.linkedin.restli.server.RestLiServiceException [HTTP Status:400]: GET operation not supported for URI: '/datasets?($params:(),name:billing,origin:PROD,platform:billing)?aspects=List(bar,bar,bar)' with X-RestLi-Method: ''\n\tat com.linkedin.restli.server.RestLiServiceException.fromThrowable(RestLiServiceException.java:315)\n\tat com.linkedin.restli.server.BaseRestLiServer.buildPreRoutingError(BaseRestLiServer.java:158)\n\tat com.linkedin.restli.server.RestRestLiServer.buildPreRoutingRestException(RestRestLiServer.java:203)\n\tat com.linkedin.restli.server.RestRestLiServer.handleResourceRequest(RestRestLiServer.java:177)\n\tat com.linkedin.restli.server.RestRestLiServer.doHandleRequest(RestRestLiServer.java:164)\n\tat com.linkedin.restli.server.RestRestLiServer.handleRequest(RestRestLiServer.java:120)\n\tat com.linkedin.restli.server.RestLiServer.handleRequest(RestLiServer.java:132)\n\tat com.linkedin.restli.server.DelegatingTransportDispatcher.handleRestRequest(DelegatingTransportDispatcher.java:70)\n\tat com.linkedin.r2.filter.transport.DispatcherRequestFilter.onRestRequest(DispatcherRequestFilter.java:70)\n\tat com.linkedin.r2.filter.TimedRestFilter.onRestRequest(TimedRestFilter.java:72)\n\tat com.linkedin.r2.filter.FilterChainIterator$FilterChainRestIterator.doOnRequest(FilterChainIterator.java:146)\n\tat com.linkedin.r2.filter.FilterChainIterator$FilterChainRestIterator.doOnRequest(FilterChainIterator.java:132)\n\tat com.linkedin.r2.filter.FilterChainIterator.onRequest(FilterChainIterator.java:62)\n\tat com.linkedin.r2.filter.TimedNextFilter.onRequest(TimedNextFilter.java:55)\n\tat com.linkedin.r2.filter.transport.ServerQueryTunnelFilter.onRestRequest(ServerQueryTunnelFilter.java:58)\n\tat com.linkedin.r2.filter.TimedRestFilter.onRestRequest(TimedRestFilter.java:72)\n\tat com.linkedin.r2.filter.FilterChainIterator$FilterChainRestIterator.doOnRequest(FilterChainIterator.java:146)\n\tat com.linkedin.r2.filter.FilterChainIterator$FilterChainRestIterator.doOnRequest(FilterChainIterator.java:132)\n\tat com.linkedin.r2.filter.FilterChainIterator.onRequest(FilterChainIterator.java:62)\n\tat com.linkedin.r2.filter.TimedNextFilter.onRequest(TimedNextFilter.java:55)\n\tat com.linkedin.r2.filter.message.rest.RestFilter.onRestRequest(RestFilter.java:50)\n\tat com.linkedin.r2.filter.TimedRestFilter.onRestRequest(TimedRestFilter.java:72)\n\tat com.linkedin.r2.filter.FilterChainIterator$FilterChainRestIterator.doOnRequest(FilterChainIterator.java:146)\n\tat com.linkedin.r2.filter.FilterChainIterator$FilterChainRestIterator.doOnRequest(FilterChainIterator.java:132)\n\tat com.linkedin.r2.filter.FilterChainIterator.onRequest(FilterChainIterator.java:62)\n\tat com.linkedin.r2.filter.FilterChainImpl.onRestRequest(FilterChainImpl.java:96)\n\tat com.linkedin.r2.filter.transport.FilterChainDispatcher.handleRestRequest(FilterChainDispatcher.java:75)\n\tat com.linkedin.r2.util.finalizer.RequestFinalizerDispatcher.handleRestRequest(RequestFinalizerDispatcher.java:61)\n\tat com.linkedin.r2.transport.http.server.HttpDispatcher.handleRequest(HttpDispatcher.java:101)\n\tat com.linkedin.r2.transport.http.server.AbstractR2Servlet.service(AbstractR2Servlet.java:105)\n\tat javax.servlet.http.HttpServlet.service(HttpServlet.java:790)\n\tat com.linkedin.restli.server.spring.ParallelRestliHttpRequestHandler.handleRequest(ParallelRestliHttpRequestHandler.java:61)\n\tat org.springframework.web.context.support.HttpRequestHandlerServlet.service(HttpRequestHandlerServlet.java:73)\n\tat 
javax.servlet.http.HttpServlet.service(HttpServlet.java:790)\n\tat org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:852)\n\tat org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:544)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)\n\tat org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:536)\n\tat org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)\n\tat org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1581)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)\n\tat org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1307)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)\n\tat org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:482)\n\tat org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1549)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)\n\tat org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1204)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)\n\tat org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:221)\n\tat org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)\n\tat org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\n\tat org.eclipse.jetty.server.Server.handle(Server.java:494)\n\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:374)\n\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:268)\n\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\n\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)\n\tat org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)\n\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:336)\n\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:313)\n\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:171)\n\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:129)\n\tat org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:367)\n\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:782)\n\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:918)\n\tat java.lang.Thread.run(Thread.java:748)\nCaused by: com.linkedin.restli.server.RoutingException: GET operation not supported for URI: '/datasets?($params:(),name:billing,origin:PROD,platform:billing)?aspects=List(bar,bar,bar)' with X-RestLi-Method: ''\n\tat com.linkedin.restli.internal.server.RestLiRouter.findMethodDescriptor(RestLiRouter.java:273)\n\tat com.linkedin.restli.internal.server.RestLiRouter.processResourceTree(RestLiRouter.java:219)\n\tat com.linkedin.restli.internal.server.RestLiRouter.process(RestLiRouter.java:142)\n\tat com.linkedin.restli.server.BaseRestLiServer.getRoutingResult(BaseRestLiServer.java:139)\n\tat com.linkedin.restli.server.RestRestLiServer.handleResourceRequest(RestRestLiServer.java:173)\n\t... 62 more\n",
      "message": "GET operation not supported for URI: '/datasets?($params:(),name:billing,origin:PROD,platform:billing)?aspects=List(bar,bar,bar)' with X-RestLi-Method: ''",
      "status": 400
    }
  • d

    damp-telephone-61279

    11/24/2020, 3:30 PM
    I also can’t get a dataset:
    curl --location --request GET 'http://localhost:8080/datasets?($params:(),name:billing,origin:PROD,platform:billing)'
    Gives me the following:
  • d

    damp-telephone-61279

    11/24/2020, 4:33 PM
    What are groups for? I created a group and added some users to it, but it does not show up in the UI.
  • s

    swift-lighter-33366

    11/24/2020, 11:43 PM
    Hello, I am running into issues installing DataHub by following the quickstart guide. It appears that for some reason not all of the Elasticsearch indices are being populated. I have attached a screenshot of all the indices that I am seeing. I have also looked into the Docker container logs for Elasticsearch, and it appears it is not even creating the datasetdocument index; I am not sure why. I am looking for some guidance on how to troubleshoot this. Thanks in advance.
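    One quick way to see which indices actually exist is the Elasticsearch _cat API; a minimal sketch, assuming the quickstart's default mapping of Elasticsearch to localhost:9200:
    import urllib.request

    # List all Elasticsearch indices with doc counts; if "datasetdocument" is
    # missing, check the logs of the container/job that creates the indices.
    print(urllib.request.urlopen("http://localhost:9200/_cat/indices?v").read().decode())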
  • f

    fancy-advantage-41244

    11/26/2020, 1:21 PM
    Two questions on lineage aspects: 1. What's the difference between the DatasetUpstreamLineage and UpstreamLineage aspects? 2. Why is the DownstreamLineage aspect not part of DatasetAspect (link)?