  • many-accountant-26574 · 2 years ago
    Oh great, just scrolled up and read that it hasn't been implemented yet.
    2 replies
  • many-accountant-26574 · 2 years ago
    It occurs to me that there's no Pegasus schema type for timestamps or date(time)s.
    4 replies
  • many-accountant-26574 · 2 years ago
    A question about the consumers: how can I configure them to listen to an external Kafka cluster, and more importantly to use HTTPS and basic-auth headers for API keys? I'm currently using Confluent Cloud as a test environment, and I couldn't find a descriptive environment variable for the specific security settings.
    39 replies
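For reference, a standard Kafka client connects to Confluent Cloud over TLS with SASL/PLAIN, using the API key and secret as the SASL username and password. A minimal client-properties sketch (placeholder endpoint and credentials; how these properties map onto the consumers' environment variables is an assumption, not confirmed in the thread):

```properties
# Confluent Cloud bootstrap endpoint (placeholder value)
bootstrap.servers=pkc-xxxxx.region.provider.confluent.cloud:9092
# Confluent Cloud requires TLS plus SASL/PLAIN authentication
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
# The Confluent Cloud API key/secret act as the SASL username/password
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="<API_KEY>" password="<API_SECRET>";
```

These are plain Kafka consumer properties; whether a given deployment exposes them directly or through prefixed environment variables depends on how the consumer containers are configured.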
  • nutritious-bird-77396 · 2 years ago
    Ahh… got it… So you are saying that this is not an issue. In that case, if I try to use the generated avsc in another project to generate Java objects, it throws an error because it doesn't understand "com.linkedin.common.Ownership". Does that mean I will have to manually change the avsc in this case?
    6 replies
  • many-accountant-26574 · 2 years ago
    🙂 Got Confluent Cloud working.
    1 reply
  • many-accountant-26574 · 2 years ago
    Now I want to set up a few tests, but I'm a bit of a newbie to GraphQL, especially now with Avro/Pegasus combined, so please help me out haha. I've run the ms-cli tasks to ingest the sample datasets, but: 1. I can't figure out how to delete them.
    1 reply
  • many-accountant-26574 · 2 years ago
    But why is the value of that key a tuple instead of yet another dict?
    1 reply
  • acceptable-architect-70237 · 2 years ago
    I didn't see the status, but removed: false just showed up.
    24 replies
  • plain-arm-6774 · 2 years ago
    Hi! I see that we have a model for DeploymentInfo, but I can't find the association to MetadataChangeEvent. Is there no way to define it via a Kafka stream?
    11 replies
  • fancy-analyst-83222 · 2 years ago
    Hello everyone, I am new to DataHub and was exploring a little. Is there an option for a minimal setup? Can I bypass Kafka and the schema registry and just use the rest, or is there a hard dependency on these parts?
    15 replies