# help-connector-development

    Jens Mostaert

    07/17/2025, 11:28 AM
    I'm getting 500 errors on Airbyte Cloud from the "Test and save" button on Sources. I already created a support ticket yesterday, but no response so far. Does anyone know what's going on?
    POST /api/v1/sources/update
    
    {
        "message": "Internal Server Error: io.grpc.StatusRuntimeException: ALREADY_EXISTS: Secret [projects/332657581931/secrets/airbyte_workspace_67db3dcd-eb80-424c-9ae2-e42d654f28bc_secret_78eadfdf-dde2-45e9-b287-d3f37b8c5ee3_v7084] already exists.",
        "exceptionClassName": "com.google.api.gax.rpc.AlreadyExistsException",
        "exceptionStack": [],
        "rootCauseExceptionStack": []
    }

    Martin Andonov

    07/17/2025, 6:54 PM
    Hi all, can you please help me set up the Builder configuration correctly to achieve the pull below?
    Goal: I have a response that contains a nested array in a column called vendorItems. I want to pull only the values in the nested array, without the remaining data.
    Approach: I'm attempting to use the Record Selector in the Builder UI to drill into the nested array. Below is my current setup:

        type: RecordSelector
        extractor:
          type: DpathExtractor
          field_path:
            - '*'
            - result
            - '*'
            - results

    Outcome: I get the error "No records could be extracted from the response. Check the Response tab to see the raw response. If this looks correct, make sure the Record Selector is configured correctly."
    Response: Below is a sample of the API response for reference (sample truncated):

        { "status": 200, "body": [
          { "vendorId": "928b864a-9656-4dea-822f-d7cc4e94430f", "contactName": "", "currencyId": "88285281-a77e-4e96-ad51-6cfa2d0bbd31", "customFields": { "custom1": "", "custom2": "", "custom3": "", "custom4": "", "custom5": "", "custom6": "", "custom7": "", "custom8": "", "custom9": "", "custom10": "" }, "defaultAddressId": null, "defaultCarrier": "", "defaultPaymentMethod": "", "defaultPaymentTermsId": null, "discount": "0.00", "email": "", "fax": "", "isActive": true, "isTaxInclusivePricing": false, "lastModifiedById": "89aebc56-db26-4ee8-8325-0b9518519e94", "lastModifiedDttm": "2024-02-12T215104.2057167+00:00", "leadTimeDays": null, "name": "1000Bulbs", "phone": "", "remarks": "", "taxingSchemeId": null, "timestamp": "0000000004CA71B7", "website": "", "vendorItems": [] },
          { "vendorId": "6677fe2e-081f-485b-baf0-e0de68787cd6", "contactName": "", "currencyId": "88285281-a77e-4e96-ad51-6cfa2d0bbd31", "customFields": { "custom1": "", "custom2": "", "custom3": "", "custom4": "", "custom5": "", "custom6": "", "custom7": "", "custom8": "", "custom9": "", "custom10": "" }, "defaultAddressId": null, "defaultCarrier": "", "defaultPaymentMethod": "", "defaultPaymentTermsId": null, "discount": "0.00", "email": "", "fax": "", "isActive": true, "isTaxInclusivePricing": false, "lastModifiedById": "89aebc56-db26-4ee8-8325-0b9518519e94", "lastModifiedDttm": "2023-02-28T195126.5667782+00:00", "leadTimeDays": null, "name": "220 Electronics", "phone": "", "remarks": "", "taxingSchemeId": null, "timestamp": "0000000003EF04F6", "website": "", "vendorItems": [] },
          { "vendorId": "07c9f0e3-d7f5-41cc-93c8-aee3a5913d2b", "contactName": "", "currencyId": "88285281-a77e-4e96-ad51-6cfa2d0bbd31", "customFields": { "custom1": "", "custom2": "", "custom3": "", "custom4": "", "custom5": "", "custom6": "", "custom7": "", "custom8": "", "custom9": "", "custom10": "" }, "defaultAddressId": null, "defaultCarrier": "", "defaultPaymentMethod": "", "defaultPaymentTermsId": null, "discount": "0.00", "email": "", "fax": "", "isActive": true, "isTaxInclusivePricing": false, "lastModifiedById": "d5da1d41-953c-4896-a41c-2e133e3dc6d0", "lastModifiedDttm": "2016-07-21T134201.62+00:00", "leadTimeDays": null, "name": "220togo", "phone": "", "remarks": "", "taxingSchemeId": null, "timestamp": "000000000027A931", "website": "", "vendorItems": [
            { "vendorItemId": "a322e7d1-fe27-4926-95e9-5ec342f6a4a1", "cost": "0.00000", "leadTimeDays": null, "lineNum": null, "productId": "1d2d5fa8-3e05-4113-8b3e-1612880dae53", "timestamp": "0000000001520AC6", "vendorId": "07c9f0e3-d7f5-41cc-93c8-aee3a5913d2b", "vendorItemCode": "" },
            { "vendorItemId": "039ea565-cd6e-4c8c-9b02-3e12e77b6d65", "cost": "69.95000", "leadTimeDays": null, "lineNum": null, "productId": "450b5f66-a3d6-4ba4-bbe5-8acd02bfda7f", "timestamp": "000000000027D277", "vendorId": "07c9f0e3-d7f5-41cc-93c8-aee3a5913d2b", "vendorItemCode": "" },
            { "vendorItemId": "a4df002e-e812-4423-8a41-637d62c241d3", "cost": "0.00000", "leadTimeDays": null, "lineNum": null, "productId": "7bdd5145-e6df-42dc-8be9-e9513cad9942", "timestamp": "0000000001520613", "vendorId": "07c9f0e3-d7f5-41cc-93c8-aee3a5913d2b", "vendorItemCode": "" },
            { "vendorItemId": "d78e43ec-2365-4409-be10-da34006c5ae9", "cost": "0.00000", "leadTimeDays": null, "lineNum": null, "productId": "1515cb1b-9bf8-4f91-932d-ed903402bee4", "timestamp": "0000000001520643", "vendorId": "07c9f0e3-d7f5-41cc-93c8-aee3a5913d2b", "vendorItemCode": "" },
            { "vendorItemId": "40664f08-56f2-4b8a-a843-e4df9d23d637", "cost": "129.00000", "leadTimeDays": null, "lineNum": 1, "productId": "a2dd5938-9421-435f-b1ff-34711c2180aa", "timestamp": "00000000034DCB12", "vendorId": "07c9f0e3-d7f5-41cc-93c8-aee3a5913d2b", "vendorItemCode": "Whirlpool 25 Liter microwave with Grill #WMGO9SDE" }
          ] },

    Lorenzo

    07/17/2025, 10:21 PM
    I'm using the SFTP Bulk connector. How do I add a new stream to an existing connector?

    Paul Houghton

    07/18/2025, 12:52 PM
    is anyone building a Cloudflare R2 connector?

    Mukesh Kumar

    07/19/2025, 5:18 AM
    Hello everyone, we just installed open-source Airbyte on our machine. We're encountering an issue with the BigQuery connector: the table is created in the airbyte_internal dataset, but it is not appearing in the final dataset/table.

    Bilal Mubasher

    07/21/2025, 8:48 AM
    Hi everyone, I am trying to connect using the S3 connector with a Parquet file as a source, but it's showing a "please wait a bit more" progress bar. Below is my JSON configuration; can anyone please help me figure out what the issue is? Is it a configuration issue?

        {
          "name": "S3",
          "workspaceId": "21897ffe-476a-4a13-a808-587f7134ef47",
          "definitionId": "69589781-7828-43c5-9f63-8925b1c1ccc2",
          "configuration": {
            "format": {
              "encoding": "utf8",
              "filetype": "csv",
              "delimiter": ",",
              "block_size": 10000,
              "quote_char": "\"",
              "double_quote": true,
              "infer_datatypes": true,
              "newlines_in_values": false
            },
            "schema": "{}",
            "streams": [
              {
                "globs": ["**/*.parquet"],
                "format": { "filetype": "parquet", "decimal_as_float": false },
                "schemaless": false,
                "validation_policy": "Emit Record",
                "days_to_sync_if_history_is_full": 3,
                "name": "s3-stream"
              }
            ],
            "provider": {},
            "delivery_method": { "delivery_type": "use_records_transfer" },
            "bucket": "costsage-ai",
            "aws_access_key_id": "*****",
            "aws_secret_access_key": "*****"
          }
        }

    Bilal Mubasher

    07/21/2025, 8:57 AM
    I'm getting the error below after 9 minutes. Is this an Airbyte error?

        Internal message: Failed to find output files from connector within timeout of 9 minute(s). Is the connector still running?

    Bilal Mubasher

    07/21/2025, 1:33 PM
    Hi everyone, what's the maximum file size that Airbyte can successfully process as a source?

    Rafael Capelo

    07/21/2025, 5:59 PM
    Could someone help me implement the OAuth 2.0 authentication process for the Bling API? For some reason I'm having issues during the redirection to the Airbyte callback endpoint. I don't know if it's due to some configuration in the connector or if it has something to do with the application not being HTTPS. If someone has the YAML or could guide me, it would be very helpful. Here's the link to the authentication flow: https://developer.bling.com.br/aplicativos#fluxo-de-autoriza%C3%A7%C3%A3o
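    (For the token-refresh side of the flow, a minimal declarative sketch; the endpoint URL and config field names below are assumptions based on Bling's v3 docs, not taken from an existing connector. The initial consent/redirect leg is handled by the platform's OAuth configuration rather than by the stream authenticator:)

        authenticator:
          type: OAuthAuthenticator
          # assumed Bling v3 token endpoint -- verify against their docs
          token_refresh_endpoint: https://www.bling.com.br/Api/v3/oauth/token
          client_id: "{{ config['client_id'] }}"
          client_secret: "{{ config['client_secret'] }}"
          refresh_token: "{{ config['refresh_token'] }}"
          grant_type: refresh_token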

    Lorenzo

    07/23/2025, 4:02 AM
    Has anyone had issues adding streams to the SFTP Bulk connector? I continue to run into this error

    Gergely Imreh

    07/23/2025, 7:33 AM
    Heyhey, I am developing against an API that I need to query for each calendar date to get the right data, but it doesn't provide any "cursor-like" object that would let me use incremental sync directly. In particular, I have a start date, say 2025-06-01, and from there I have to run requests with the query parameter from_date set to 2025-06-01 and to_date set to 2025-06-02; after this I have to query again with 2025-06-02 and 2025-06-03, and so on up until today.
    • Is this possible with incremental sync, if there's no cursor on the API but the connector keeps its own "up to this date" value? (This would probably be best, meaning the fastest sync.)
    • Or is it possible through pagination?
    Or would I need to build a more custom connector for this? Cheers!
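    (This is the pattern DatetimeBasedCursor in the low-code CDK is built for: the connector keeps its own date cursor even though the API has none. A minimal sketch, with parameter names matching the API described above:)

        incremental_sync:
          type: DatetimeBasedCursor
          # the cursor field must appear on each record; if the API doesn't
          # return one, it can be stamped on with an AddFields transformation
          cursor_field: from_date
          datetime_format: "%Y-%m-%d"
          start_datetime: "2025-06-01"
          end_datetime: "{{ now_utc().strftime('%Y-%m-%d') }}"
          step: P1D                # one-day windows: from_date/to_date pairs
          cursor_granularity: P1D
          start_time_option:
            type: RequestOption
            inject_into: request_parameter
            field_name: from_date
          end_time_option:
            type: RequestOption
            inject_into: request_parameter
            field_name: to_date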

    Rubén Laguna

    07/24/2025, 7:34 AM
    Is there any way in the connector to drop all fields except X and Y? From the YAML reference for RemoveFields it seems like I need to list all fields explicitly. My use case is that I just want to get a single field from an API, and I don't want to pick up new fields if they update the API.
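    (For reference, RemoveFields does accept a glob pointer plus a condition, though the documented condition is evaluated against each property's value rather than its name; an allow-list by field name doesn't appear to be supported. The documented pattern for dropping, e.g., all empty-string fields:)

        transformations:
          - type: RemoveFields
            field_pointers:
              - ["**"]
            condition: "{{ property|string == '' }}"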

    Ander Aranburu

    07/25/2025, 7:59 AM
    Hi everyone! I’m working on a custom Salesforce destination using airbyte-cdk==6.60.0 and Airbyte Platform v7.2.0. I posted this message in this merged GitHub PR, and I’d love to better understand the context behind the fix that helped me avoid this error:
    State message does not contain id
    If anyone is familiar with how/why this issue arises, I'd really appreciate your insight. A few specific questions I'm trying to figure out:
    • Was this issue introduced by a recent change in the Airbyte Protocol or CDK?
    • Is it related to the source connector at all, or only to how the destination handles state messages?
    • Why don't other destinations using CDK 6.x (e.g. destination-sqlite) seem to be affected?
    Thanks in advance for any help you can share! 🙏

    Alexei Kozhushkov

    07/25/2025, 9:05 AM
    Hello, how are you? I have a local airbyte install and it works great with MySQL, MSSQL and Postgres. But I can't make the connector builder work: even the simplest unauthenticated request to https://restcountries.com/v3.1/all?fields=name fails with a "Draft can't be saved" error. Please advise 🙏

    Michael Hernandez

    07/25/2025, 5:28 PM
    Hi team, does anyone have experience customizing connectors to feed output from one stream into another? As a concrete example, the Jira connector stream for issue_remote_links only supports full refresh, which makes sense since Atlassian didn't include a date or ID column that could be used for incremental syncing. If I could use the keys from the issues stream sync, I could achieve something similar by fetching remote links only for particular issues.
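    (In the low-code CDK this is what SubstreamPartitionRouter does: each parent record becomes a partition of the child stream. A sketch assuming a custom declarative stream rather than the shipped Jira connector; the stream reference and field names are illustrative:)

        retriever:
          type: SimpleRetriever
          requester:
            type: HttpRequester
            url_base: https://your-domain.atlassian.net
            # one request per parent issue, using the injected partition value
            path: "/rest/api/3/issue/{{ stream_partition.issue_key }}/remotelink"
          partition_router:
            type: SubstreamPartitionRouter
            parent_stream_configs:
              - type: ParentStreamConfig
                stream: "#/definitions/issues_stream"
                parent_key: key
                partition_field: issue_key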

    Alex Johnson

    07/26/2025, 12:55 AM
    Hi guys, I'm trying to create a new connector in the Connector Builder but I'm having trouble with the authentication. The authentication appears to complete successfully, but the access_token field is not being set (screenshot of the error below). I have attached the associated YAML file. The access token response is a simple JSON object as follows:

        {
          "access_token": "eyJhbGciOiJSUzI1NiIsInR5cCI1IkpXVCJ9.ZRtjDHtsyK6c25YJEj_vxxx",
          "refresh_token": "7c9234163d5b5a2e480a9263fb789f027c2a8e4306c7f9c098c8xxx"
        }

    I have tried adding details for the output specification, but this just returns an HTTP 500 error, so I have removed it:

        complete_oauth_output_specification:
          required:
            - access_token
            - refresh_token
          properties:
            access_token:
              type: string
              path_in_connector_config:
                - access_token
            refresh_token:
              type: string
              path_in_connector_config:
                - access_token

    Can anyone help guide me in the right direction to resolve this?
    test_podium.yaml
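    (One thing that stands out in the spec above: the refresh_token property's path_in_connector_config points at access_token. A guess at the intended version, assuming that's a typo rather than deliberate:)

        complete_oauth_output_specification:
          required:
            - access_token
            - refresh_token
          properties:
            access_token:
              type: string
              path_in_connector_config:
                - access_token
            refresh_token:
              type: string
              path_in_connector_config:
                - refresh_token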

    Alexei Kozhushkov

    07/29/2025, 9:41 AM
    Hi there, how are you? I'm trying to get my head around the DpathExtractor YAML config.
    • Given the following array as input:
    [{
        "name": {
          "common": "Comoros",
          "official": "Union of the Comoros",
          "nativeName": {
            "ara": {
              "official": "الاتحاد القمري",
              "common": "القمر‎"
            },
            "fra": {
              "official": "Union des Comores",
              "common": "Comores"
            },
            "zdj": {
              "official": "Udzima wa Komori",
              "common": "Komori"
            }
          }
        },
        "cca2": "KM",
        "cca3": "COM"
      }]
    • how would one extract the following?
    [{
      "name": { "common": "Comoros" },
      "common_name": "Comoros",
      "cca2": "KM",
      "cca3": "COM"
    }]
    Please advise 🙏
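    (A sketch of one way to get there, assuming the restcountries response above is the whole body: an empty field_path treats each element of the top-level array as a record, then an AddFields transformation copies the nested value up and a RemoveFields drops the siblings you don't want:)

        record_selector:
          type: RecordSelector
          extractor:
            type: DpathExtractor
            field_path: []          # body is already the array of records
        transformations:
          - type: AddFields
            fields:
              - path: ["common_name"]
                value: "{{ record['name']['common'] }}"
          - type: RemoveFields
            field_pointers:
              - ["name", "official"]
              - ["name", "nativeName"]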

    Olivia Natasha

    07/29/2025, 3:51 PM
    Hello Airbyte team, I need help with my current setup. I was told to upgrade my connection for Stripe to BigQuery. I have upgraded, but now I am seeing that my tables are not populating anymore. When I check the error, this is what I get: > Warning from source: Workload failed, source: unknown > Failed to find output files from connector within timeout of 9 minute(s). Is the connector still running? Can you direct me to the right documentation to troubleshoot this?

    Mohith

    07/31/2025, 9:46 AM
    Hi Airbyte Community and Team, we're encountering an issue where columns with the ARRAY data type in our PostgreSQL source are being read as strings when synced to our Snowflake destination using the PostgreSQL connector. Is this expected behavior? If not, could anyone familiar with the PostgreSQL connector share how they've handled array-type columns in similar setups? Any insights or suggestions would be greatly appreciated. Thanks in advance!

    Prajjval Mishra

    07/31/2025, 6:11 PM
    @everyone I am having an issue: the Amazon SP API connects, but the reports data is not visible in my schema. Can anyone help me out with this? @channel

    Alexei Kozhushkov

    08/01/2025, 7:21 AM
    Hi, how are you doing? I found a problem using non-ASCII characters in custom connector stream names as well as field names: given a stream name "καταστήματα" and a field name "όνομα", when loading data to a PostgreSQL destination the table is created with "_" in place of the non-ASCII symbols. At the same time, the destination database is created with UTF8 support and allows CREATE TABLE and INSERT INTO with non-ASCII table and field names. Are there any settings I should use on the connector to keep UTF8 symbols in stream and field names? Thank you!

    Morgan Kerle

    08/05/2025, 12:27 AM
    Hey @Ian Alton, tagging you since you previously posted about Helm chart releases and Marcos seems to be gone. The fix for this issue has been merged. However, it is not included in the latest release/tag on that repo, nor in the Helm chart releases in https://github.com/airbytehq/helm-charts. Is it possible to get a new patch release cut to include this fix?

    Alexei Kozhushkov

    08/05/2025, 9:17 AM
    Hello, how are you? Please advise on how to troubleshoot the Connector Builder (OAuth in this particular case), e.g. where can I find the logs for the Authorize (consent) and Token endpoints? Currently I'm getting a 500, which is not really informative. Basically, I would like to build a connector for Allegro, which has the following auth flow:
    • Authorize:

        https://allegro.pl/auth/oauth/authorize?response_type=code&client_id=a21...6be&redirect_uri=http://exemplary.redirect.uri

    • Token:

        curl -X POST \
          https://allegro.pl/auth/oauth/token \
          -H 'Authorization: Basic base64(clientId:secret)' \
          -H 'Content-Type: application/x-www-form-urlencoded' \
          -d 'grant_type=authorization_code&code=pOPEy9Tq94aEss540azzC7xL6nCJDWto&redirect_uri=http://exemplary.redirect.uri'

    Thank you!

    Grivine Ochieng'

    08/05/2025, 2:54 PM
    My Connector Builder is not returning any records, yet the response is 200 OK! I have applied the record selector to fetch results from the response payload, but it isn't working. See the attached screenshot for reference. Please let me know how to get past this blocker. Request payload:
    {
      "url": "<https://backstage.taboola.com/backstage/api/1.0/sinoinc-nal-plaudus-sc/reports/campaign-summary/dimensions/day?start_date=2025-01-01&end_date=2025-08-05>",
      "headers": {
        "User-Agent": "python-requests/2.32.4",
        "Accept-Encoding": "gzip, deflate",
        "Accept": "*/*",
        "Connection": "keep-alive",
        "Authorization": "Bearer ****"
      },
      "http_method": "GET",
      "body": ""
    }
    Response payload:
    {"status": 200,
      "body": {
        "last-used-rawdata-update-time": "2025-08-05 05:00:00.0",
        "last-used-rawdata-update-time-gmt-millisec": 1754395200000,
        "timezone": "PDT",
        "results": [
          {
            "date": "2025-08-05 00:00:00.0",
            "date_end_period": "2025-08-05 00:00:00.0",
            "clicks": 628,
            "impressions": 271757,
            "visible_impressions": 147123,
            "spent": 105.26,
            "conversions_value": 0,
            "roas": 0,
            "roas_clicks": 0,
            "roas_views": 0,
            "ctr": 0.2310888036002752,
            "vctr": 0.4268537210361398,
            "cpm": 0.39,
            "vcpm": 0.72,
            "cpc": 0.168,
            "campaigns_num": 20,
            "cpa": 6.192,
            "cpa_clicks": 6.579,
            "cpa_views": 105.264,
            "cpa_actions_num": 17,
            "cpa_actions_num_from_clicks": 16,
            "cpa_actions_num_from_views": 1,
            "cpa_conversion_rate": 2.7070063694267517,
            "cpa_conversion_rate_clicks": 2.5477707006369426,
            "cpa_conversion_rate_views": 0.1592356687898089,
            "currency": "USD"
          },
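    (Given the response above, the records live under the top-level results key; if the selector's field path points anywhere else, zero records with a 200 is exactly what you'd see. A minimal sketch:)

        record_selector:
          type: RecordSelector
          extractor:
            type: DpathExtractor
            field_path:
              - results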

    Carmela Beiro

    08/05/2025, 3:54 PM
    Hi! Is it possible to create a Docker image from a connector using source-declarative-manifest as the base and copying in the manifest.yaml generated with the Builder UI? I can't find documentation about it.
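    (That's essentially how the platform itself runs Builder connectors: it launches airbyte/source-declarative-manifest and injects the manifest into the connector config under the __injected_declarative_manifest key. A Dockerfile sketch under that assumption; the in-image destination path is a guess, so verify it against the base image's entrypoint before relying on it:)

        # base image that ships the generic declarative runner
        FROM airbyte/source-declarative-manifest:latest

        # assumed location read by the entrypoint -- confirm for your image tag
        COPY manifest.yaml /airbyte/integration_code/source_declarative_manifest/manifest.yaml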

    Mateo Colina

    08/05/2025, 6:27 PM
    Does anyone have experience working with the new bulk-cdk for database sources? I'm currently trying to implement it for source-oracle by reverse-engineering source-mysql, since there is only limited documentation available.

    Sebastian Miranda

    08/07/2025, 8:25 PM
    hey everyone, is it possible to use a composite key for CDC tracking when using the Postgres connector? e.g. when a table doesn't have a PK

    Patrick McCoy

    08/11/2025, 4:12 PM
    hi, I'm working on creating a custom connector via the guide shared elsewhere in this Slack for a makeshift setup. Looking at the Fivetran docs, would their setup process work for Airbyte with some modification? https://fivetran.com/docs/connectors/applications/microsoft-dynamics/business-central/setup-guide

    Ofek Eliahu

    08/12/2025, 2:46 PM
    Hi everyone, I’ve built a custom Python connector for GitLab. GitLab’s OAuth system uses a single-use refresh token, which means that after each authentication, the refresh token becomes invalid and must be replaced with a new one from the response. Here’s an example of the response:
    {
      "access_token": "c97d1fe52119f38c7f67f0a14db68d60caa35ddc86fd12401718b649dcfa9c68",
      "token_type": "bearer",
      "expires_in": 7200,
      "refresh_token": "803c1fd487fec35562c205dac93e9d8e08f9d3652a24079d704df3039df1158f",
      "created_at": 1628711391
    }
    To re-authenticate, I need to use the new refresh token from the response. I'm using the SingleUseRefreshTokenOauth2Authenticator to handle this, which saves the new refresh_token, access_token, and expire_time in memory for the next authentication. The issue is that while these config values are correctly saved and used in memory, they are not being persisted to storage for future runs. When I create a new source, a validation check is performed, which passes and creates the source. However, after this check the OAuth refresh token becomes invalid, and the new one isn't saved to storage. As a result, I can't create a new connection based on this source, since the refresh token isn't being updated in storage. Has anyone faced this issue before and knows how to solve it?

    Will Skelton

    08/12/2025, 3:30 PM
    Hi all! We have PostgreSQL source databases that run with SQL_ASCII encoding and have a number of tables containing non-UTF8-compatible characters. I believe I've discovered that the JDBC driver used by the primary Airbyte PostgreSQL connector doesn't support connecting to the source with SQL_ASCII client encoding. We are interested in pursuing the possibility of creating a custom connector that would let us connect to PostgreSQL without using a JDBC driver. Does anyone out here in Slack world have experience or knowledge of how feasible this might be? I'd love to learn from your mistakes before we head off on what might be a difficult path. Thanks!