• strong-pharmacist-65336

    1 year ago
    I am not able to start a datahub-frontend container. Could you please help me start it?
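    A minimal first step, assuming the quickstart compose files in the repo's docker/ directory (the datahub-frontend service name is an assumption; adjust to your checkout):
    cd docker/quickstart
    docker-compose pull                    # fetch the current images
    docker-compose up datahub-frontend     # start only the frontend service
    docker logs datahub-frontend           # check why the container fails to start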
    19 replies
  • some-crayon-90964

    1 year ago
    Question: does DataHub currently have an audit log that tracks user activities, such as user A accessing data 123? Or is the LinkedIn team planning to add that?
    14 replies
  • chilly-barista-6524

    1 year ago
    ./gradlew build
    is failing with the following error:
    > Task :datahub-web:emberWorkspaceTest FAILED
    
    FAILURE: Build failed with an exception.
    
    * What went wrong:
    Execution failed for task ':datahub-web:emberWorkspaceTest'.
    > Process 'command '/home/shubham.gupta2/datahub/datahub-web/build/yarn/yarn-v1.13.0/bin/yarn'' finished with non-zero exit value 1
    Can someone help with this? This is hosted on an EC2 instance.
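    A quick way to dig further, using standard Gradle flags (the task path is taken from the error above):
    # rerun only the failing task with full diagnostics
    ./gradlew :datahub-web:emberWorkspaceTest --stacktrace --info
    # or, if the web build isn't needed right now, skip that task
    ./gradlew build -x :datahub-web:emberWorkspaceTest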
    7 replies
  • strong-pharmacist-65336

    1 year ago
    Hello <!here>, I am getting the error "unable to find valid certification path to requested target" while executing this command from https://github.com/linkedin/datahub/tree/master/metadata-ingestion:
    ./gradlew :metadata-events:mxe-schemas:build
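    That PKIX error usually means the JVM does not trust the certificate it is being presented, which is common behind a corporate proxy. A sketch of importing the intercepting CA into the JVM truststore; the alias and certificate file names are placeholders:
    # Java 11+: $JAVA_HOME/lib/security/cacerts; Java 8: $JAVA_HOME/jre/lib/security/cacerts
    keytool -importcert -alias corp-proxy-ca -file corp-proxy-ca.crt \
      -keystore "$JAVA_HOME/lib/security/cacerts" -storepass changeit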
    2 replies
  • hallowed-dinner-34937

    1 year ago
    Hello LinkedIn Team, running a Docker container for Elasticsearch today gave me this in the logs: "# License [will expire] on [Saturday, October 31, 2020]. If you have a new license, please update it. Otherwise, please reach out to your support contact." Wondering if this is something that needs to be looked into (if not already being looked into).
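    That log line comes from the X-Pack trial license bundled with the default Elasticsearch image, not from DataHub itself. Two possible workarounds (the image tags are examples; match them to the version DataHub pins):
    # pin the free "basic" license in the default image (available from 6.3+)
    docker run -e "xpack.license.self_generated.type=basic" -e "discovery.type=single-node" \
      docker.elastic.co/elasticsearch/elasticsearch:7.9.3
    # or use the OSS image, which ships without X-Pack licensing at all
    docker run -e "discovery.type=single-node" \
      docker.elastic.co/elasticsearch/elasticsearch-oss:7.9.3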

    4 replies
  • hallowed-dinner-34937

    1 year ago
    Hello Team, I was wondering what the LinkedIn GitHub release cycle is like. This is to see if we can regularly update the code on our side with changes made by LinkedIn. Thank you
    1 reply
  • nutritious-bird-77396

    1 year ago
    Dear Team… I am working on exposing an API endpoint to populate DatasetSnapshot metadata. I am having some issues when deserializing the fields -> type -> type within the SchemaMetadata aspect: https://github.com/linkedin/datahub/blob/master/metadata-models/src/main/pegasus/com/linkedin/schema/SchemaFieldDataType.pdl I guess I should set the type of the data as a value in my Jackson deserialization in order to set the corresponding type, but I am having challenges with that. If LinkedIn or anyone in the community has handled such a case with Jackson deserialization, kindly help out. Details of input/error in the thread.
    21 replies
  • high-hospital-85984

    1 year ago
    We’re trying to get an understanding of the storage volume needs of ES and Neo4j. Is anyone willing to share some numbers, for example GB in Neo4j and ES versus the number of elements in DataHub?
    7 replies
  • high-hospital-85984

    1 year ago
    We’re thinking about the level of persistence needed, especially for Neo4j, as we might end up hosting it ourselves. In case we need to regenerate the data in Neo4j, what’s the best approach? I guess we could keep a full history in the Kafka topic and reset the consumer offset manually, but that feels suboptimal from a scaling perspective. Is there a way to tell GMS to resend the MAE messages based on what’s in MySQL? Backups are of course nice, but we see a risk of the ES and Neo4j backups getting out of sync in case of a disaster. Therefore, it would be nice to have a way to repopulate the DBs as a fallback. Or maybe we’re just overthinking this 😅
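    For the Kafka-replay route, resetting the MAE consumer group’s offset is a standard Kafka operation; a sketch, where the group and topic names are assumptions to verify against your deployment (the consumer must be stopped while resetting):
    kafka-consumer-groups.sh --bootstrap-server localhost:9092 \
      --group mae-consumer-job-client --topic MetadataAuditEvent \
      --reset-offsets --to-earliest --execute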
    5 replies
  • hallowed-dinner-34937

    1 year ago
    Hello, this may be a question that could be easily answered if I just ventured into the GitHub repo and looked around, but I decided to take this route instead! aha!.. I'm wondering if there is any documentation, or if someone could point me towards the code, that would show me how the DataHub API works. We're currently looking to ingest data into DataHub from external processes. For example, an external process will create some documentation in Google Drive when a new table/entity is created; after this, another separate process will push the link to this Google Doc into DataHub along with the new table.
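    A sketch of one write path, assuming GMS is listening on localhost:8080 and exposes a rest.li ingest action; the URN and the institutionalMemory payload (a natural fit for attaching a Google Doc link) are illustrative, so check the .pdl models in the repo for the exact shape:
    curl -X POST 'http://localhost:8080/datasets?action=ingest' \
      -H 'X-RestLi-Protocol-Version: 2.0.0' -H 'Content-Type: application/json' \
      --data '{"snapshot": {"urn": "urn:li:dataset:(urn:li:dataPlatform:hive,my_new_table,PROD)", "aspects": [{"com.linkedin.common.InstitutionalMemory": {"elements": [{"url": "https://docs.google.com/document/d/example", "description": "Auto-generated docs", "createStamp": {"time": 0, "actor": "urn:li:corpuser:datahub"}}]}}]}}'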
    13 replies