# help-connector-development
  • Anton Marini
    03/23/2023, 8:14 PM
    Thanks @Alexandre Girard (Airbyte)
  • Matheus Barbosa
    03/23/2023, 8:37 PM
    Hi guys! How are y'all doing? Let me explain my question: I'm using the UI CDK builder to build a very simple connector. It's an API with just one stream, but it's paginated. As you can see in the screenshot I uploaded below, the next_page_url is returned in the API response, and I just need to keep fetching data until there is no more next_page_url value. Can someone help me with this? It may sound like a very silly doubt, but I'm really struggling with it!
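    For reference, a cursor-based paginator in the low-code YAML that the UI builder generates might look roughly like the sketch below; the next_page_url field name comes from the screenshot above, and the stop_condition expression is an assumption about how this API signals the last page:
      paginator:
        type: DefaultPaginator
        page_token_option:
          type: RequestPath           # the cursor is a full URL, so use it as the request path
        pagination_strategy:
          type: CursorPagination
          cursor_value: "{{ response.next_page_url }}"         # taken from the API response
          stop_condition: "{{ not response.next_page_url }}"   # assumed: last page has no next_page_url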
  • Anton Marini
    03/24/2023, 1:16 AM
    I think I have Weaviate fixed. I've added manual schema insertion, but I'm hoping I haven't messed anything up too much lol
  • Anton Marini
    03/24/2023, 4:34 PM
    Stupid question: I'm trying to put a PR together for my Weaviate destination connector fix to generate the schema via the Airbyte Configured Catalog / Stream. I've forked airbyte and made a branch, but a push to the branch is erroring with:
    v0.44.1-helm -> v0.44.1-helm (refusing to allow a Personal Access Token to create or update workflow .github/workflows/build-report.yml without workflow scope)
    I'm assuming I'm missing a step in my config.
  • viritha vanama
    03/24/2023, 6:37 PM
    Hi everyone, I'm using the UI to build a Google DCM source connector. When using the path URL "https://dfareporting.googleapis.com/dfareporting/v4/userprofiles/{id}/creatives" I get a valid response, so the connection is successful, but does anyone know how to get all the dimensions and metrics?
  • Ryan (Airbyte)
    03/24/2023, 7:05 PM
    @viritha vanama is working on building out a low-code version of the Google Campaign Manager source. Viritha, can you share the error message you were getting? From what I saw, it appears to be an OAuth issue. I haven't tested OAuth with our Connector Builder, so I'm not sure exactly what the problem is and was hoping for more 👀
  • viritha vanama
    03/24/2023, 7:16 PM
    It's not an OAuth issue; I'm receiving the correct response.
  • viritha vanama
    03/24/2023, 7:16 PM
    My question is: what should the path URL be to get all the metrics and dimensions? (I only tried to get creatives above.)
  • Anton Marini
    03/24/2023, 7:33 PM
    Hi friends. I've submitted a PR to fix a bug and improve performance in the Weaviate connector here: https://github.com/airbytehq/airbyte/pull/24524
  • Anton Marini
    03/24/2023, 7:33 PM
    This is my first contribution to Airbyte and I'm barely familiar with it; apologies for any missteps.
  • Aazam Thakur
    03/25/2023, 8:04 AM
    Hello everyone, does anyone know of any connectors with nested streams implemented in them?
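    As context for what a nested (parent/child) stream looks like in the declarative YAML, here is a rough sketch; the stream names, parent_key, and URL path are made-up placeholders, and the exact component names may differ between CDK versions:
      children_stream:
        type: DeclarativeStream
        retriever:
          type: SimpleRetriever
          requester:
            type: HttpRequester
            url_base: https://api.example.com        # placeholder
            path: "/parents/{{ stream_slice.parent_id }}/children"
          partition_router:
            type: SubstreamPartitionRouter
            parent_stream_configs:
              - type: ParentStreamConfig
                parent_key: id                       # field on each parent record
                partition_field: parent_id           # key exposed as stream_slice.parent_id
                stream:
                  $ref: "#/definitions/parents_stream"   # assumed parent stream definition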
  • Eduardo lopes
    03/28/2023, 4:02 PM
    Hello guys, I'm trying to use the connector builder to create a new source that returns data from an API. I'm having some difficulty because for this API I need to follow a few steps before accessing the data, and the connector builder doesn't provide an authentication method that can return the session id. These are the steps I need to follow in order to access the data:
    • Get an access token
    • Get a session id (cookie)
    • List campaign data using the session cookie
    With Python, I do a POST and authenticate by passing the login info in the request body. However, in Airbyte I can't find any example that would let me build my connector. Can someone help me with a case similar to this one, where I always need to pass a cookie to access certain data?
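    One possibility to look into (an assumption, not a confirmed answer, and it may be newer than the builder UI at that time): the declarative CDK has a SessionTokenAuthenticator that performs a login request first and then injects the returned session id into subsequent requests. A rough sketch, with all field names, paths, and the Cookie header to be checked against your CDK version:
      authenticator:
        type: SessionTokenAuthenticator
        login_requester:
          type: HttpRequester
          url_base: https://api.example.com      # placeholder
          path: /login
          http_method: POST
          request_body_json:
            username: "{{ config['username'] }}"
            password: "{{ config['password'] }}"
        session_token_path: ["session_id"]       # assumed path to the id in the login response
        request_authentication:
          type: ApiKey
          inject_into:
            type: RequestOption
            inject_into: header
            field_name: Cookie                   # assumed: the API expects the id in a Cookie header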
  • Shreepad Khandve
    03/28/2023, 6:41 PM
    @Andy Yeo (Airbyte) @Alexandre Girard (Airbyte) @Conor Barber (Airbyte) @Erica Struthers (Airbyte) Do we have any example of a low-code connector that implements OAuth2 authentication, a paginator, and nested streams? I have tried everything in the documentation but it's not working. I'm stuck at the point where I can fetch all the data, but the paginator and nested streams don't work. I have raised many tickets and sent questions on Slack as well. Let me know if there is an example of a low-code connector that uses all of the above.
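    For the OAuth2 and paginator pieces (nested streams are sketched a few messages above), a declarative configuration might look roughly like this; the token endpoint, page-parameter names, and page size are placeholders, not values from any real connector:
      requester:
        type: HttpRequester
        url_base: https://api.example.com            # placeholder
        authenticator:
          type: OAuthAuthenticator
          token_refresh_endpoint: https://api.example.com/oauth/token   # placeholder
          client_id: "{{ config['client_id'] }}"
          client_secret: "{{ config['client_secret'] }}"
          refresh_token: "{{ config['refresh_token'] }}"
      paginator:
        type: DefaultPaginator
        page_size_option:
          type: RequestOption
          inject_into: request_parameter
          field_name: per_page                       # assumed query parameter name
        page_token_option:
          type: RequestOption
          inject_into: request_parameter
          field_name: page                           # assumed query parameter name
        pagination_strategy:
          type: PageIncrement
          page_size: 100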
  • Sudarshana T
    03/31/2023, 4:28 AM
    Hello guys, I've started doing CDC from Postgres (a VM instance in GCP) to BigQuery using Airbyte. I have seen two pieces of documentation for CDC, and in both of them the loading method used is GCS staging. Is it necessary to use that for CDC, or can I use standard inserts?
  • Josh Douch
    04/02/2023, 5:31 PM
    I'm trying to develop a connector for Unleashed. The latest hurdle I'm trying to overcome is that the dates are given as timestamps in the following format: "/Date(1600300800000)/". What is the best way of going about this? The idea of a custom transformation is scary, and I'm not even sure how I would go about extracting the timestamp from this string and converting it in SQL. Any help would be greatly appreciated! Thank you!
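    If the connector ends up being low-code, one hedged option is to add a derived field with an AddFields transformation and let Jinja string slicing strip the "/Date(" and ")/" wrapper; ModifiedOn is a hypothetical field name here, and the slice indices assume the wrapper is exactly that shape:
      transformations:
        - type: AddFields
          fields:
            - path: ["modified_epoch_ms"]                       # new field holding the raw epoch milliseconds
              value: "{{ record['ModifiedOn'][6:-2] | int }}"   # "/Date(1600300800000)/" -> 1600300800000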
  • Hans Peter Hagblom
    04/03/2023, 7:59 AM
    I'm developing a source connector for Fortnox and am using the OAuth Authorization Code grant type for authentication. I need to supply a ClientId, ClientSecret and RefreshToken to configure this connector. My problem is the following: whenever the refresh token is used to get a new access token, a new refresh token is issued and the old one is invalidated (see https://datatracker.ietf.org/doc/html/draft-ietf-oauth-security-topics#section-4.14.2 under Refresh Token Rotation). My question is whether there is any possibility of storing state in the source connector, or of somehow updating the config to always store the most current refresh token in use. I'm thinking that storing the current refresh token on the connection between source and destination is not a good solution, in case the same configured source is used in two different connections. I also have some concerns around locking when updating this value, to make sure that the refresh-token chain never gets two concurrent updates, which could potentially invalidate it.
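    One thing worth looking into (hedged; availability and field names depend on the CDK version): the declarative OAuthAuthenticator has a refresh_token_updater option intended for single-use refresh tokens, which pushes the rotated token back into the connector config via a control message. A minimal sketch with a placeholder token endpoint:
      authenticator:
        type: OAuthAuthenticator
        token_refresh_endpoint: https://example.com/oauth/token   # placeholder, not the real Fortnox endpoint
        client_id: "{{ config['client_id'] }}"
        client_secret: "{{ config['client_secret'] }}"
        refresh_token: "{{ config['refresh_token'] }}"
        refresh_token_updater:
          refresh_token_name: refresh_token   # assumed: key of the rotated token in the token response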
  • Josh Douch
    04/03/2023, 4:55 PM
    Is anyone able to help with my query? Thanks!
  • Kevin Conseil
    04/04/2023, 12:58 PM
    Hi everyone, I've just started the journey of building my first custom connector 🙂 I'm struggling to connect to http://localhost:8000 ("this site can't be reached"). Any ideas?
  • Vladimir Stankovic
    04/06/2023, 1:47 AM
    I'm creating a custom REST API connector using the UI builder tool. I have a question regarding incremental sync support: is it possible to generate a more sophisticated JSON payload using the incremental sync form? This is a sample of the filter criteria that needs to be passed to the API, and the last sync date needs to be injected here as well:
    {
      "filterCriteria": [
        {
          "field": "lastModifiedDate",
          "operator": "AFTER",
          "value": "2023-04-02"
        }
      ],
      "sortingCriteria": {
        "field": "lastModifiedDate",
        "order": "ASC"
      }
    }
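    In the YAML that the builder generates, one hedged way to express this is a DatetimeBasedCursor combined with an interpolated request_body_json; the date format, step, and start_date config key are assumptions, and it is worth verifying that your CDK version interpolates nested body values:
      incremental_sync:
        type: DatetimeBasedCursor
        cursor_field: lastModifiedDate
        datetime_format: "%Y-%m-%d"
        start_datetime: "{{ config['start_date'] }}"          # assumed config field
        end_datetime: "{{ now_utc().strftime('%Y-%m-%d') }}"
        step: P30D
        cursor_granularity: P1D
      requester:
        type: HttpRequester
        url_base: https://api.example.com                     # placeholder
        path: /search                                         # placeholder
        http_method: POST
        request_body_json:
          filterCriteria:
            - field: lastModifiedDate
              operator: AFTER
              value: "{{ stream_slice['start_time'] }}"       # injected slice start / last sync date
          sortingCriteria:
            field: lastModifiedDate
            order: ASC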
  • Danish Raza
    04/06/2023, 5:55 PM
    BigQuery denormalized is not creating event_data with additionalProperties; instead it's creating two columns, event_data.type and event_data.additionalProperties. Is this the expected behavior? Can someone help me with this? When I tried the same on Snowflake, it produced event_data as JSON in a single column, and I'm looking for the same behavior on the BigQuery denormalized connector. This is the relevant part of the schema:
    "event_data": {
      "type": ["null", "object"],
      "additionalProperties": true
    },
    PS: I'm using a legacy Airbyte instance and can't upgrade it at the moment.
  • Josh Douch
    04/06/2023, 9:47 PM
    Hi, I have created a custom transformation; however, I am receiving the error 'No module named dbt.adapters.mysql'. Any ideas? Thank you!
  • Eric Schrock
    04/07/2023, 12:36 PM
    I posted this a few weeks ago on discuss.airbyte.io but didn't get any responses: https://discuss.airbyte.io/t/how-to-manage-internal-id-for-dynamic-streams/4153 I have a connector where I'd like to refer to a stream by name ("foo"), but the underlying API is all based on IDs ("1234"). During discovery I can iterate over all objects and display the stream name. At that point I have the ID in hand ("1234"), which would be used for all future API calls. But it looks like there is no way to store the ID, so all I have when someone selects the stream is the name ("foo"). This means that every time, I have to iterate over all objects just to find the ID before I can make the API calls I want. It's not the end of the world, but am I missing something? Is there any way to take information found during discovery and store it alongside the name for later use? There is a "namespace" field, but the documentation states that all streams returned from a source should share a namespace, so I'm not sure it is meant for general use.
  • Jon Erik Kemi Warghed
    04/07/2023, 4:18 PM
    Hey, I've got the low-code YAML file working nicely, except for one problem: I want to add a transformation and I'm not sure where I'm going wrong. I added it as shown below, with the intention of each record also getting the stream_slice.job_id it was collected from, but when I test it in the UI, no records get this field added. What am I missing?
    record_selector:
      type: RecordSelector
      extractor:
        type: DpathExtractor
        field_path:
          - data
    transformations:
      - type: AddFields
        fields:
          - path: job_id
            value: "{{ stream_slice.job_id }}"
      - type: RemoveFields
        field_pointers:
          - ["attributes", "first-name"]
          - ["attributes", "last-name"]
    paginator:
      type: DefaultPaginator
      page_token_option:
        type: RequestPath
      pagination_strategy:
        type: CursorPagination
        cursor_value: "{{ response['links']['next'] }}"
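    One thing worth double-checking (an assumption about the cause, not a confirmed fix): in the declarative schema, transformations is a property of the stream itself, a sibling of retriever, rather than something nested inside the retriever next to record_selector and paginator. Roughly:
      jobs_stream:
        type: DeclarativeStream
        retriever:
          type: SimpleRetriever
          # requester, record_selector and paginator go here, as in the snippet above
        transformations:          # sibling of retriever, not inside it
          - type: AddFields
            fields:
              - path: ["job_id"]
                value: "{{ stream_slice.job_id }}"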
  • Akash
    04/10/2023, 6:31 AM
    I have a custom data warehouse and a JDBC driver for it. I want users of my source connector to be able to run SQL operations through the source connector. How exactly can I build this using the Airbyte CDK?
  • Akash
    04/10/2023, 6:31 AM
    I'd appreciate any sort of help I can get; I'm completely stuck on this.
  • Carolina Buckler
    04/10/2023, 3:24 PM
    Getting this error using the Salesforce connector 2.0.9:
    Error: Cannot receive data for stream 'EmailMessage', error message: 'exceeded 100000 distinct ids'
    Any suggestions on how to fix this, or how to backfill in chunks?
  • Eric Schrock
    04/10/2023, 3:34 PM
    Does anyone know how to test the Jira connector? I'd like to fix this issue: https://github.com/airbytehq/airbyte/issues/24981 The source code change is a one-liner, but I need help with tests. The unit tests are extremely basic, just asserting that streams exist with no content checks at all, so there's nothing to change there (though it seems like they could/should be expanded). The bulk of the testing seems to be through integration tests that I think connect to some shared Jira instance. The README states:
    **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/jira)
    to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_jira/spec.json` file.
    Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
    See `integration_tests/sample_config.json` for a sample config file.
    However, that documentation contains nothing about generating credentials.
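    For reference, a secrets/config.json for the Jira source typically needs an Atlassian API token, the site domain, and the email tied to the token; this is a hedged sketch, and the exact field names should be checked against source_jira/spec.json and integration_tests/sample_config.json:
    {
      "api_token": "<Atlassian API token created at id.atlassian.com>",
      "domain": "your-site.atlassian.net",
      "email": "you@example.com"
    }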
  • Akash
    04/10/2023, 4:53 PM
    Could I get some guidance on building a source connector for my use case? I have a custom data warehouse and a JDBC driver for it, and I want users of my source connector to run SQL operations against the warehouse. How can I get started building this with the Airbyte CDK?
  • Akash
    04/10/2023, 4:54 PM
    I'm completely confused about this step and don't know where to start; I'd love some guidance.
  • Shreepad Khandve
    04/13/2023, 12:33 PM
    Hello team, I have created a custom connector and uploaded the Docker image to the instance to get the new connector. The connector works fine locally and in the interface as well, and I can see emitted records. But when I look at the schema, the main table is blank. What could be the reason? Is this a version issue, or am I missing something?