# help-connector-development
  • Josh Jeffries

    04/17/2023, 1:22 PM
    Hey everyone, loving what you have done with the connector builder in 0.44! Adding the publish/release-new-version button makes the whole process of using the builder a lot slicker. With the transformation section you have also added, is it possible to add dynamic values from a field on the record? For example, I have a record selector which outputs a nested array, but I also want to include one of the top-level values from the response (in this instance I want to include the top-level candidate ID while running the record selector on a nested skills table). Thanks, Josh
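    A minimal Python sketch of the flattening Josh describes, with hypothetical field names ("candidates", "id", "skills") standing in for the real response shape — whether the builder's transformation section can reference fields above the record selector's path is exactly the open question here; the sketch only shows the target output:

    def extract_skills(response: dict):
        # walk the top-level records and their nested skills arrays
        for candidate in response.get("candidates", []):
            for skill in candidate.get("skills", []):
                # copy the top-level candidate id onto every nested skill record
                yield {**skill, "candidate_id": candidate["id"]}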
  • Tristan Crudge

    04/18/2023, 3:34 PM
    Hey, I wanted some advice on a custom REST connector using Python. Can someone tell me at what point state is updated during instantiation? Thanks
  • Benjamin Edwards

    04/20/2023, 9:44 AM
    Hi everyone, soon I will be looking to explore Airbyte's Braze connector. I can see it is currently in its alpha stage of development; are there any updates on when it will be developed further to the beta stage?
  • Kazim Raza

    04/20/2023, 3:50 PM
    Hello, I have written an Airbyte custom destination in Python. I have implemented an incremental sync dedup operation using this query:
    MERGE INTO {self.schema_name}.{table_name} AS target
    USING {self.schema_name}.{table_name}_temp AS source
    ON target.data_{primary_keys[0][0]}=source.data_{primary_keys[0][0]}
    WHEN MATCHED THEN
            {query_placeholder_refined}
    WHEN NOT MATCHED THEN
            INSERT *
    I am using GitHub as the source with my custom Python destination, but some streams cause the sync to fail, for instance with this error:
    pyspark.sql.utils.AnalysisException: Updates are in conflict for these columns: data_user_id
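    One plausible reading of that AnalysisException, offered as an assumption rather than a diagnosis: Spark raises "Updates are in conflict" when the generated WHEN MATCHED clause assigns the same column more than once. A minimal Python sketch that de-duplicates the generated SET assignments (column names hypothetical):

    def build_update_clause(columns: list[str]) -> str:
        seen: set[str] = set()
        assignments = []
        for col in columns:
            if col not in seen:  # keep only the first assignment per column
                seen.add(col)
                assignments.append(f"target.{col} = source.{col}")
        return "UPDATE SET " + ", ".join(assignments)

    # e.g. a duplicated data_user_id in the generated column list:
    print(build_update_clause(["data_user_id", "data_login", "data_user_id"]))
    # UPDATE SET target.data_user_id = source.data_user_id, target.data_login = source.data_login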
  • Dieter De Beuckeleer

    04/21/2023, 9:03 AM
    I struggle sometimes with the error message:
    Internal message: Empty string keys not allowed without dpath.options.ALLOW_EMPTY_STRING_KEYS=True: ('__injected_declarative_manifest', 'streams', 1, 'retriever', 'requester', 'request_headers', '')
    I managed to fix this once after a lot of clicking around, but I did not really understand what I was doing. Could someone explain what I am doing wrong? (FYI: I am using the connector builder.)
  • Gergely Imreh

    04/25/2023, 5:34 AM
    Hey, I’m using the low-code/config-based connector builder and have an API schema that seems a bit annoying to match, so I wonder if there’s any advice on how best to approach it. The schema returns a bunch of records, but the IDs of the records are the keys of the object…
    {
      "status_reports": {
          "6509036": {
            "color": "green",
            "description": "All OK",
            "id": "6509036"
          },
          "6509037": {
            "color": "yellow",
            "description": "Watch out",
            "id": "6509037"
          },
    ....
    So here, 6509036 is the ID of a record, and so on. I can set the first record selector to status_reports, but then it’s not clear whether I can flip the properties to become a list themselves. I don’t need to keep the key, as the ID is repeated inside each object. (Or will I need to fall back to the Python HTTP-API-based connector just to do this transformation?)
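    A minimal sketch of the flip, assuming the response shape shown above — the keyed object becomes a plain list, and the key is dropped because the id is repeated inside each object:

    def flip_keyed_records(response: dict) -> list[dict]:
        # {"6509036": {...}, "6509037": {...}} -> [{...}, {...}]
        return list(response.get("status_reports", {}).values())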
  • [DEPRECATED] Marcos Marx

    04/26/2023, 2:51 PM
    has renamed the channel from "using-the-cdk" to "help-connector-development"
  • Marcos Marx (Airbyte)

    04/26/2023, 2:55 PM
    has renamed the channel from "public-cdk" to "public-help-connector-development"
  • Andreas Stylianou

    04/26/2023, 4:55 PM
    Hi guys, I have a quick question. I am using a pipeline where the source is Salesforce and the target is Kafka. Is it possible to restructure the message before it goes into the Kafka topic? For example, bringing the columns nested inside _airbyte_data up to the same level as _airbyte_data itself.
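    A minimal sketch of that restructuring, assuming the standard Airbyte envelope with an _airbyte_data object; it would have to run somewhere between Airbyte and the topic (e.g. a small producer-side processor), and the field names here are illustrative:

    def flatten_airbyte_message(message: dict) -> dict:
        # lift the columns nested under _airbyte_data to the top level
        data = message.pop("_airbyte_data", {})
        return {**message, **data}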
  • Ba Thien Tran

    04/26/2023, 6:15 PM
    Hi everyone, it seems like every connector is implemented in a somewhat different way – is there any gold-standard example? It makes me wary that there is nothing guard-railing implementations, which makes connectors more prone to failure. I understand that each API is different (from plain API keys to existing client libraries, etc.), but I’d have thought there would be at least some consistency.
  • Todd Vernick

    04/26/2023, 6:59 PM
    Hi there! It seems the MongoDB connector is broken when trying to read the system.profile collection (errMsg: "FieldPath field names may not start with '$'."). Wondering if anyone has a workaround.
  • Ethan Veres

    04/26/2023, 8:01 PM
    Is there a way to have child streams be incremental based off of the parent? I have an Items API (/items), and for each item there are approvals (/items/<id>/approvals), archives, and members. None of those sub-resources is incremental by itself, but I can leverage the parent’s incremental sync and only call the sub-resources for the items it returns. How can I do that?
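    A minimal sketch of that pattern, with hypothetical endpoints and parameters: keep the incremental cursor on the parent /items stream and fetch the non-incremental sub-resources only for items the parent reports as changed (in the Python CDK, parent/child relationships like this are generally expressed through stream slices derived from a parent stream):

    import requests

    BASE = "https://api.example.com"  # hypothetical

    def sync_children(cursor: str):
        # incremental read of the parent stream only
        items = requests.get(f"{BASE}/items", params={"updated_since": cursor}).json()
        for item in items:
            # child streams are fetched per changed parent, not incrementally
            for sub in ("approvals", "archives", "members"):
                for record in requests.get(f"{BASE}/items/{item['id']}/{sub}").json():
                    yield sub, record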
  • Johannes Müller

    04/27/2023, 6:26 AM
    How can I build the entire project without running any tests or checks?
  • Steven UZAN

    04/27/2023, 6:32 AM
    Hello there! Not sure this is the right channel, but Airbyte does not support the DELETE statement for the BigQuery -> Postgres connection (probably related to the BigQuery source). Is there any plan to add it?
  • Johannes Müller

    04/27/2023, 6:42 AM
    I am running into an issue with compiling a connector I have not even changed 😄 How do I avoid that?
  • Shreepad Khandve

    04/27/2023, 7:57 AM
    Hi team, I was working on a low-code connector and am facing an issue with stopping the loop for cursor-based pagination using the next link; any help would be appreciated.
    paginator:
            type: DefaultPaginator
            pagination_strategy:
              type: "CursorPagination"
              cursor_value: "{{ response.links.next }}"
              stop_condition: "{{ 'links' not in response }}"
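    A guess at why that stop_condition might never fire, sketched in plain Python against a hypothetical API: many APIs keep the links object on the last page and merely omit (or null) links.next, so 'links' not in response stays false forever. Checking the nested value instead stops the loop:

    import requests

    url = "https://api.example.com/records"  # hypothetical
    while url:
        page = requests.get(url).json()
        for record in page.get("records", []):
            print(record)  # stand-in for real record handling
        # stop when links.next is missing, null, or empty —
        # not merely when the whole "links" key disappears
        url = page.get("links", {}).get("next")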
  • Dieter De Beuckeleer

    04/27/2023, 9:20 AM
    I would like to see what kind of requests Airbyte is sending. Usually I can see them in the testing module, but sometimes I can’t. Is there an alternative way to see the requests? I am getting "400 Client Error: Bad Request for url" messages and would like to investigate what is going wrong.
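    One stdlib-level way to see the raw traffic from a Python connector that uses requests/urllib3 under the hood, as a debugging sketch rather than an Airbyte feature: turn on http.client's wire dump and DEBUG logging, rerun the failing call, and compare it against a request that works.

    import http.client
    import logging

    import requests

    http.client.HTTPConnection.debuglevel = 1            # dump request/response lines
    logging.basicConfig(level=logging.DEBUG)             # surface urllib3's log output
    logging.getLogger("urllib3").setLevel(logging.DEBUG)

    requests.get("https://httpbin.org/get")              # example call; headers now print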
  • Benjamin Edwards

    04/27/2023, 10:39 AM
    Hi all, I have recently set up an Airbyte connection between Postgres and Snowflake. I set the destination namespace to 'mirror source structure', and this successfully loads data into a schema with the same name as the schema in the source Postgres database. However, it also creates an empty table with the same name in the default destination schema, as well as an _AIRBYTE_RAW_* table in the default database. This seems like unnecessary duplication of data. Are there configuration changes that would copy source data only into the matching schema name in the destination, or is it recommended to change the default destination schema to the required schema for each source schema?
  • Varadharaj

    04/27/2023, 10:41 AM
    Hi team, I currently have a Python HTTP-API-based source -> Local JSON connection configured with a single stream. The response data is written to a single jsonl file; is there any way to write the response data to multiple files within a single stream? My response data is huge, so a single file is difficult to process. Please advise whether this can be done in a custom connector.
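    A minimal sketch of rolling one stream's output across multiple jsonl part files in a custom destination, with hypothetical file naming and a record-count threshold:

    import json

    def write_records(records, prefix="stream_part", max_records=100_000):
        part, count = 0, 0
        out = open(f"{prefix}_{part:05d}.jsonl", "w")
        for record in records:
            if count == max_records:      # threshold reached: roll to the next file
                out.close()
                part, count = part + 1, 0
                out = open(f"{prefix}_{part:05d}.jsonl", "w")
            out.write(json.dumps(record) + "\n")
            count += 1
        out.close()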
  • Dhanji Mahto

    04/27/2023, 11:37 AM
    Hi team, we are facing an issue: when we run our integration tests while the database container is starting, the test cases begin executing before the container is up. As a result, 9 test cases fail. Any idea how to fix this?
  • Gergely Imreh

    04/27/2023, 12:46 PM
    Hi, I’m looking at the connector builder and trying to set up an incremental sync. The API I’m talking to returns a time zone, but somewhere in the process that value seems to be lost. The API returns times like 2023-04-27T06:29:50-07:00, which I parse with the %Y-%m-%dT%H:%M:%S%z format. If I put that example value in as a start time and then look at the request sent by Airbyte, the time zone gets zeroed out (i.e. the value is parsed as the wrong time), appearing in the request as 2023-04-27T06:29:50+0000 (screenshot of the request info attached, the first datetime value). Am I missing something? Otherwise I would worry that the cursor will keep propagating the wrong times. Cheers!
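    For reference, the Python side of that parse does retain the offset, so the loss has to happen after parsing. A sketch showing the difference between relabelling the timezone (which reproduces the wrong value above) and correctly converting it:

    from datetime import datetime, timezone

    parsed = datetime.strptime("2023-04-27T06:29:50-07:00", "%Y-%m-%dT%H:%M:%S%z")
    print(parsed.isoformat())                       # 2023-04-27T06:29:50-07:00

    # relabelling as UTC silently shifts the instant by 7 hours:
    print(parsed.replace(tzinfo=timezone.utc).strftime("%Y-%m-%dT%H:%M:%S%z"))
    # 2023-04-27T06:29:50+0000  <- the "zeroed out" value seen in the request

    # converting preserves the instant:
    print(parsed.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S%z"))
    # 2023-04-27T13:29:50+0000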
  • Victor Babichev

    04/27/2023, 2:59 PM
    I get a really strange error when I try the Add field transformation; Remove field works perfectly. Maybe I’m doing something wrong?
  • Viswadeep Veguru

    04/27/2023, 3:50 PM
    Is there any API to get a source's schema, like a table definition in Snowflake?
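    Assuming the question is about the Airbyte OSS HTTP API, there is a discover-schema endpoint that returns the source catalog (per-stream JSON schemas, roughly a table definition). The host, port, and exact response shape below are from memory and should be checked against the API docs:

    import requests

    resp = requests.post(
        "http://localhost:8000/api/v1/sources/discover_schema",  # local OSS deployment
        json={"sourceId": "<your-source-id>"},
    )
    for entry in resp.json()["catalog"]["streams"]:
        # each stream carries its name and a JSON Schema for its records
        print(entry["stream"]["name"], entry["stream"]["jsonSchema"])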
  • Anchit

    04/27/2023, 6:23 PM
    Hey, I have a use-case to download ZIP files (without extracting them) from an HTTPS website. The current Files source connector does not support syncing ZIP files, so I guess the next step would be to create a custom connector. I've gone through the Python CDK docs and tutorial but would like to get some ideas from the community: • Can we modify the current Files connector to also ingest ZIP files? If not, what's the reason? • How would we define a schema for this use case?
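    A minimal custom-connector sketch, separate from the existing Files source: fetch the ZIP over HTTPS and read its table of contents in memory, without extracting to disk. One record per archive entry is one plausible schema; the URL is a placeholder:

    import io
    import zipfile

    import requests

    def zip_entries(url: str):
        payload = requests.get(url).content
        with zipfile.ZipFile(io.BytesIO(payload)) as zf:   # opened in memory only
            for info in zf.infolist():
                yield {
                    "file_name": info.filename,
                    "uncompressed_size": info.file_size,
                    "compressed_size": info.compress_size,
                }

    for record in zip_entries("https://example.com/archive.zip"):
        print(record)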
  • Disha

    04/27/2023, 8:31 PM
    Hi, I do not see any option to enter the GitHub repository URL when using open-source Airbyte locally; I only see a list of repositories but no option to enter a URL.
  • Andrew Nessin

    04/28/2023, 5:57 AM
    I have a situation! I am pulling data from a source DB and storing it in a destination DB. Now I want to transform the data before storing it. I understand we can do this with dbt; however, in my case I want to call some external URLs to fetch additional data as part of the transformation, and the input to these URLs depends on the data I pull from the MySQL DB. How can I set up this flow in Airbyte?
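    Since dbt transformations run SQL inside the warehouse, calling external URLs typically needs a separate step outside Airbyte's normalization — for example a small post-sync job. A sketch with hypothetical names throughout:

    import requests

    def enrich(rows):
        for row in rows:
            # the lookup key comes from the data pulled out of MySQL
            extra = requests.get(
                "https://enrichment.example.com/lookup",   # hypothetical endpoint
                params={"key": row["some_column"]},
            ).json()
            yield {**row, **extra}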
  • Dhanji Mahto

    04/28/2023, 7:14 AM
    Hi team, I am running an integration test and getting the error below. Where does the testing module read source_config.json from? Please help.
    Optional[io.airbyte.protocol.models.AirbyteConnectionStatus@759d44a3[status=FAILED,message=Could not connect with provided configuration. Error: java.nio.file.NoSuchFileException: source_config.json,additionalProperties={}]]