# help-connector-development

    Damar Adi

    04/28/2023, 7:57 AM
    Hey everyone! I have a simple question. So, imagine I have a connection set up from A to B. Now suppose I create a new connection from A to B again, but this time with a different transfer and sync method. Is it alright to do that? Or will there be any issues if both connections run simultaneously? Would love to hear your thoughts on this. Thanks!

    Mayur Choubey

    04/28/2023, 8:11 AM
hi team, I am new to Airbyte and was trying to modify an existing connector. I just made changes in the spec file to see if they are reflected in the UI. I ran the commands below:
    ./gradlew :airbyte-integrations:connectors:destination-iceberg:build
    ./gradlew :airbyte-integrations:connectors:destination-iceberg:airbyteDocker
    docker run --rm airbyte/destination-iceberg:dev spec
In the output of the last command I can see the test changes I made (just changed titles).

    Mayur Choubey

    04/28/2023, 8:11 AM
But when I log into the UI, these changes are not reflected.

    Mayur Choubey

    04/28/2023, 8:11 AM
Any idea which step I am missing?

    Dhanji Mahto

    04/28/2023, 12:57 PM
Hi Team, I am getting an access-denied error while creating a PR. Can you please help?

    Slackbot

    04/28/2023, 6:44 PM
    This message was deleted.

    aidan

    04/28/2023, 7:49 PM
Hi, I am using the Connector Builder UI and I am trying to build an incrementally syncing API whose lowest cursor granularity is a date. I have the incremental sync working. However, while using the cursor it syncs data including the cursor date, meaning that every time it syncs, data on the cursor date is duplicated. Is there a combination of the macros to get around this, or some other setting?

    Slackbot

    04/29/2023, 11:39 AM
    This message was deleted.

    MF

    04/30/2023, 3:44 PM
Hi. I need some help. I'm new to Python and Airbyte, and I'm trying to create a custom connector with the SDK. I'm working with an endpoint whose authentication process requires one POST request with a plain username and password, which returns a JSON with a token. On subsequent requests I need to add the header 'authorization: JWT {token}'. What is the best way to do this? Do I need to create new classes for this auth workflow?
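The auth flow described above can be sketched in plain Python. This is a sketch only: the login URL, the `token` response field, and the `JwtAuthenticator` class name are made-up assumptions, not the actual API; the CDK also ships token authenticators that may already fit.

```python
# Sketch: exchange username/password for a JWT once, then attach it
# as 'Authorization: JWT <token>' on every request. Endpoint and field
# names are placeholders.
import json
import urllib.request


class JwtAuthenticator:
    """Fetches a token via one POST, then builds the auth header."""

    def __init__(self, username: str, password: str,
                 login_url: str = "https://example.com/api/login"):
        self._username = username
        self._password = password
        self._login_url = login_url
        self._token = None  # cached after the first fetch

    def _fetch_token(self) -> str:
        # One POST with plain credentials; assumes response {"token": "..."}
        body = json.dumps({"username": self._username,
                           "password": self._password}).encode("utf-8")
        req = urllib.request.Request(
            self._login_url,
            data=body,
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)["token"]

    def get_auth_header(self) -> dict:
        if self._token is None:
            self._token = self._fetch_token()
        return {"Authorization": f"JWT {self._token}"}
```

In a CDK stream, the equivalent logic would live in a custom authenticator class whose `get_auth_header` result is merged into each outgoing request's headers.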

    MF

    04/30/2023, 4:09 PM
Hi, could you share a piece of code showing how to use the NoAuth class to make a POST request?
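Since NoAuth contributes no auth headers, an unauthenticated POST is just a plain HTTP POST. A minimal stdlib sketch (URL and payload are placeholders, not a real endpoint):

```python
import json
import urllib.request


def build_post(url: str, payload: dict) -> urllib.request.Request:
    """Build a POST request with no Authorization header
    (which is all that NoAuth amounts to)."""
    data = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(
        url,
        data=data,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Build (but do not send) a sample request:
req = build_post("https://example.com/api/items", {"name": "test"})
```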

    MF

    04/30/2023, 4:48 PM
When using a custom authenticator in my custom connector, in what class/function should I instantiate the custom authenticator class?
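The usual place is the source's `streams()` method: build one authenticator there and pass it to each stream's constructor. A schematic sketch with stand-in classes (not the real CDK base classes, which would be `AbstractSource` and `HttpStream`):

```python
class MyAuthenticator:
    """Stand-in for a custom authenticator."""
    def __init__(self, token: str):
        self._token = token

    def get_auth_header(self) -> dict:
        return {"Authorization": f"JWT {self._token}"}


class MyStream:
    """Stand-in for an HttpStream subclass; the CDK ones accept authenticator=."""
    def __init__(self, authenticator):
        self.authenticator = authenticator


class SourceMyApi:
    """Stand-in for an AbstractSource; streams() is where the wiring happens."""
    def streams(self, config: dict):
        auth = MyAuthenticator(token=config["token"])  # one shared instance
        return [MyStream(authenticator=auth)]
```

That way every stream shares a single authenticator instance, along with any token cache it holds.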

    Rutger Weemhoff

    05/01/2023, 12:38 PM
    Hey, I am building a connector using the low code connection builder. I am using "Cursor Pagination" and understand how to configure the cursor value like
    {{ last_records[-1]['id'] }}
    . Now I would like to use the actual cursor value in a custom Request Body parameter with key "query". The value of this request body parameter would be something like:
    select id, ..., ... from table where id > {{ cursor_value }} order by id
    I am not sure in which variable the actual cursor value would be stored or how I can find out. Can you please point me in the right direction? For incremental syncs I am already using
    {{ stream_slice['start_date'] }}
    and
    {{ stream_slice['end_date'] }}
successfully in the same way.
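One possible workaround, as an unverified sketch (the table and field names are placeholders, and this has not been checked against the builder): the low-code framework exposes the current page token to interpolation as `next_page_token['next_page_token']`, so the same value the paginator produces can be repeated inside a request body parameter:

```yaml
requester:
  request_body_data:
    # assumes the pagination token holds the last id; 0 seeds the first page
    query: >-
      select id from my_table
      where id > {{ next_page_token['next_page_token'] or 0 }}
      order by id
paginator:
  type: DefaultPaginator
  pagination_strategy:
    type: CursorPagination
    cursor_value: "{{ last_records[-1]['id'] }}"
```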

    Adam Roderick

    05/01/2023, 3:57 PM
Hi, what is the process for getting started? I've made some PRs but they just sit there. 1. Where can I ask for guidance? 2. Where can I get an overview of the process of submitting a PR, getting a new Docker image version, and getting merged?

    MF

    05/01/2023, 5:28 PM
Hi, could you help me with this error? ImportError: cannot import name 'Authenticator' from 'airbyte_cdk.sources.streams.http.auth' (/home/felipe/.local/lib/python3.10/site-packages/airbyte_cdk/sources/streams/http/auth/__init__.py)

    Glauber Costa

    05/01/2023, 11:14 PM
Hey folks - very new to Airbyte, and to be honest, not much of a Python or Java guy either, so forgive me for any stupid thing in advance. I just opened this: https://github.com/airbytehq/airbyte/issues/25738 - no pressing reason for writing this connector myself, but definitely willing to help! Here's the issue:

    Glauber Costa

    05/02/2023, 1:07 AM
If I want to set up basic normalization for my destination connector, do I have to shape the tables myself (as per https://docs.airbyte.com/understanding-airbyte/basic-normalization/)? Also, any reason why the SQLite connector doesn't support that?

    Konstantin Shamko

    05/02/2023, 6:03 AM
    Hey guys. I am currently attempting to create a custom source for CircleCI using a low-code YAML-based feature. I have encountered a specific use case that requires the following steps: 1. First, I make a request to the project stats endpoint to receive some data. In the response, along with other fields, there is a list of available workflows, which I refer to as "all_workflows". This list is structured in JSON format as follows:
    {
           ......
    	"all_workflows": ["workflow_1", "workflow_2", ..., "workflow_N"]
    }
2. Next, I need to fetch some workflow stats from a different endpoint, which is structured as follows: "https://circleci.com/api/v2/insights/time-series/{project-slug}/workflows/{workflow-name}/jobs". In this URL, "project-slug" is defined as a variable, while "workflow-name" should be taken from the previous step's response. To accomplish this, I created a separate stream with partitioning settings defined (refer to pic1.png). In these partitioning settings, I refer to the stream/field from the previous step and define an alias on that field.
3. When I test this stream, I do not receive any response, which is expected because the workflow names in the request are concatenated with commas. Unfortunately, the endpoint does not support multiple workflow names in a single request, so I need to send N requests to fetch the required stats.
My questions are as follows:
1. Is it possible to iterate over fields from another stream to make multiple requests?
2. If not, what workaround can I use with the low-code approach to implement this use case?
3. (optional) What is the correct Airbyte way to implement such an integration?
Thank you in advance
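For question 1, one low-code pattern worth trying, as an unverified sketch (the stream names, `parent_key`, and record shape here are assumptions): expose each workflow as its own record in a helper stream, then let a `SubstreamPartitionRouter` issue one request per workflow:

```yaml
workflow_stats_stream:
  retriever:
    requester:
      # stream_partition.workflow carries one workflow name per request
      path: "/insights/time-series/{{ config['project_slug'] }}/workflows/{{ stream_partition.workflow }}/jobs"
    partition_router:
      type: SubstreamPartitionRouter
      parent_stream_configs:
        - type: ParentStreamConfig
          # helper stream whose record selector flattens all_workflows
          # into one record per workflow, e.g. {"name": "workflow_1"}
          stream: "#/definitions/workflows_stream"
          parent_key: name
          partition_field: workflow
```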

    Matheus Barbosa

    05/02/2023, 1:39 PM
    I’M HIRING someone to make a custom connector for me, the connector is for Google Ad Manager and it has to use the Google Ad Manager API. Contact me if you want or know someone who does.

    Dion Duran

    05/02/2023, 7:09 PM
Hello all - I am working on creating a connector for Shipbob through the connector UI, but when I go to test my stream it seems to give me a dummy output like the following. Thank you for any help in advance!
    [
      {
        "name": "Joel Miller"
      },
      {
        "name": "Ellie Williams"
      }
    ]

    Quazi Hoque

    05/02/2023, 9:09 PM
    Hey team, I'm running into an issue when attempting to test some connector changes on our instance of Airbyte. I am building the connector with gradle and then using docker to tag and push the build up to our mitodl/destination-s3-glue. When I try to upgrade the connector in the UI, I get an error message
    Sorry. Something went wrong...
    Looking at our airbyte-worker logs, I see a more specific error message:
    Error while getting spec from image mitodl/destination-s3-glue:0.1.7-d

    Slackbot

    05/02/2023, 9:14 PM
    This message was deleted.

    Randal Boyle

    05/02/2023, 9:14 PM
    image.png

    Randal Boyle

    05/02/2023, 9:16 PM
I'm trying to understand what causes the message
    Non-breaking schema updates detected
when I click "Review changes", no changes are observed?

    Balaji Seetharaman

    05/03/2023, 5:58 AM
Hi Team, I am going to work on the SEM Rush connector. Does the Airbyte team have an API key that I can use for connector development?

    Aazam Thakur

    05/03/2023, 11:49 AM
How do we implement nested substream partitions whose parent itself uses a substream partition to call its parent? I am trying to apply this to the Twilio API for its dependent phone number stream https://github.com/airbytehq/airbyte/pull/25705/commits
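One way to express that nesting in low-code, as a sketch only (the stream names, keys, and path here are guesses at the Twilio shapes, not verified): each level declares its own `SubstreamPartitionRouter` pointing one level up:

```yaml
phone_numbers_stream:
  retriever:
    partition_router:
      type: SubstreamPartitionRouter
      parent_stream_configs:
        - type: ParentStreamConfig
          stream: "#/definitions/accounts_stream"
          parent_key: sid
          partition_field: account_sid
dependent_phone_numbers_stream:
  retriever:
    requester:
      # placeholder path; interpolates this level's partition value
      path: "/DependentPhoneNumbers/{{ stream_partition.phone_sid }}"
    partition_router:
      type: SubstreamPartitionRouter
      parent_stream_configs:
        - type: ParentStreamConfig
          stream: "#/definitions/phone_numbers_stream"
          parent_key: sid
          partition_field: phone_sid
```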

    Slackbot

    05/03/2023, 12:48 PM
    This message was deleted.

    Nicolas Jullien

    05/03/2023, 12:51 PM
Hi everyone, I am currently using Airbyte, deployed on GKE, with some custom connectors. My issue is automatically providing documentation for my connectors, which are separate from the Airbyte project since I want them available on a Docker registry. Is there any way to attach documentation to a custom connector without having control over the Airbyte project? Thanks a lot in advance for your help! Have a nice day 🙂

    Slackbot

    05/03/2023, 3:17 PM
    This message was deleted.

    Vivek Jain

    05/03/2023, 3:25 PM
Hello everyone, I am attempting to create an Airbyte connector using Java, and while executing the standard test cases in our local environment, some of them are failing. The tests look for the files "destination_config.json" and "destination_catalog.json", which do not exist in the Airbyte repo. Checking the logs, I found that these files are searched for under the directory "/data/job", which is generated during test case execution by the command below:
{"type":"LOG","log":{"level":"INFO","message":"INFO i.a.w.p.DockerProcessFactory(create):176 Preparing command: docker run --rm --init -i -w /data/job --log-driver none --name destination-vertica-check-0-0-zjnzg --network host -v /tmp/airbyte_tests/test14836968802016406834:/data -v /tmp/airbyte_tests/output4991573947145016338:/local -e USE_STREAM_CAPABLE_STATE=false -e FIELD_SELECTION_WORKSPACES= -e APPLY_FIELD_SELECTION=false -e WORKER_CONNECTOR_IMAGE=airbyte/destination-vertica:dev -e WORKER_JOB_ATTEMPT=0 -e AUTO_DETECT_SCHEMA=true -e WORKER_JOB_ID=0 airbyte/destination-vertica:dev check --config source_config.json"}}
Also, our PR needs review. Kindly suggest whether it can be reviewed by someone from the Airbyte team: https://github.com/airbytehq/airbyte/pull/25682. Many thanks in advance. Kind Regards/Vivek

    Y L

    05/03/2023, 4:48 PM
Does Airbyte have a Workday connector in development?