# help-connector-development
  • Göktuğ Aşcı

    09/17/2025, 12:56 PM
    I believe the JDBC connector 0.3.2 is outdated and there is 0.6.0 available. How can we upgrade it?
  • Bastin Headmon

    09/17/2025, 1:51 PM
    Hello 👋 I just joined the community. Our company (Glowtify) runs Airbyte open-source, self-hosted on GKE, and we’ve been using several connectors. I recently added the LinkedIn Pages connector to our system, and I’d like to extend it with a few more streams to capture more granular post-level stats (since the current streams only provide org-level stats). This is my first time working with the Airbyte source code, and I’m having trouble setting it up locally. Are there any guides on how to set up the environment to start developing a connector?
  • Jabbar Memon

    09/18/2025, 7:09 AM
    Hi Team, I’m using a self-hosted setup of Airbyte and have created a connection from MongoDB to BigQuery. While trying to refresh the schema, I’m consistently getting a 502 HTTP error (Airbyte temporarily unavailable). Details:
    • Source: MongoDB (database size ~200GB)
    • Destination: BigQuery
    • Issue: Unable to refresh the schema due to the 502 error
    • Tried solutions: Increased the node pool size, but the problem persists
    My main requirement is to sync this ~200GB MongoDB database into BigQuery, but I’m blocked at the schema refresh step. Attaching a screenshot for reference.
  • Lucas Rodrigues Dhein

    09/22/2025, 5:07 PM
    Hi everyone, I need help with the HubSpot source connector. I need to fetch custom-object associations with contacts, deals, and tickets, just as is done for the other native objects.
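    For illustration, a minimal sketch of what pulling those associations looks like against HubSpot's v4 Associations API directly (the custom-object type name, record ID, and token below are placeholder assumptions, not the connector's internals):
    # Sketch: list associations from a custom object to contacts via the v4 API.
    import requests

    HUBSPOT_TOKEN = "pat-..."        # placeholder private-app token
    FROM_OBJECT = "p_subscription"   # hypothetical fully qualified custom-object type
    RECORD_ID = "12345"              # placeholder record ID

    resp = requests.get(
        f"https://api.hubapi.com/crm/v4/objects/{FROM_OBJECT}/{RECORD_ID}/associations/contacts",
        headers={"Authorization": f"Bearer {HUBSPOT_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    for assoc in resp.json().get("results", []):
        print(assoc["toObjectId"], assoc.get("associationTypes"))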
  • Louis Adam

    09/23/2025, 3:25 PM
    Hey team, we are looking for someone who has already connected Google Ads and Google Analytics to a self-hosted instance. We are having trouble with the authentication (sign-in with Google is deactivated). If you have any ideas of who can help us, happy to discuss. Thanks.
  • Göktuğ Aşcı

    09/24/2025, 12:58 PM
    Hi, could you review my PR? We have a pressing project that requires moving data from ClickHouse to Postgres: https://github.com/airbytehq/airbyte/pull/66482
  • Jason Anderson

    09/24/2025, 6:57 PM
    Can anyone please help me troubleshoot the Facebook Marketing connector? Our numbers have matched Facebook Ads Manager ever since we started using it 6 months ago, but on 9/11/25 they started to be under-reported. If I call the Facebook Ads Insights API directly with the following curl, I get the correct amount, so I'm curious how the Airbyte connector is fetching data. Thanks!
    curl -G \
      -d "access_token=MYTOKEN" \
      -d "time_range={'since':'2025-09-16','until':'2025-09-16'}" \
      -d "fields=conversions" \
      "<https://graph.facebook.com/v23.0/MYADNUMBER/insights>"
  • Rishabh Bhardwaj

    09/25/2025, 5:28 AM
    Hi everyone, we have built a connector and made a Docker image out of it. When we try to add this image through the Airbyte UI by providing the image name, Airbyte can't link up to that image. I believe the reason is that Airbyte uses abctl (Kubernetes via kind) while the Docker image is hosted locally, so they have different daemons, and I can't load the image via kind load either. Any solution for this?
  • Anil Thapa

    09/25/2025, 3:57 PM
    Hello team, can we test a forked connector of Google Sheets in Airbyte open source version 1.7? The option to test the custom connector is not working with either a mouse click or Ctrl+Enter.
  • Rishabh Bhardwaj

    09/26/2025, 6:45 AM
    Hey team, I had installed Airbyte locally and it was working fine for setting up source and destination connectors, but for the past two days I have been seeing an error. I have reinstalled multiple times and looked the issue up online; it said the Airbyte UI couldn't connect to the backend. How can this issue be resolved?
  • Rishabh Bhardwaj

    09/27/2025, 3:42 PM
    Can anyone help me build a custom connector for the Neo4j graph database? We will first transform structured data into graph data; the node schema and relationships will be provided by the user in a JSON/YAML file, then we make a connection to Neo4j and write the data to the graph database.
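    For illustration, a minimal sketch of the write path with the official neo4j Python driver, assuming a hypothetical user-supplied mapping that turns each record into a node plus a relationship (all names and credentials below are placeholders):
    # Sketch: write one record as a node plus a relationship, per a user-defined mapping.
    from neo4j import GraphDatabase

    driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))
    record = {"user_id": "u1", "name": "Ada", "company_id": "c9"}  # placeholder record

    with driver.session() as session:
        session.run(
            """
            MERGE (u:User {id: $user_id}) SET u.name = $name
            MERGE (c:Company {id: $company_id})
            MERGE (u)-[:WORKS_AT]->(c)
            """,
            **record,
        )
    driver.close()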
  • מילכה כץ

    09/29/2025, 12:34 PM
    Hi, I'm interested in contributing to Google Search Console connector issue #40522 and noticed the previous maintainer left. Could someone tell me who currently maintains connector development, or who could review PRs for that issue?
  • dilan silva

    10/02/2025, 3:16 PM
    I'm building a low-code Python connector for our company. I'm now running the CI tests, and they always fail in the Python CLI smoke test using PyAirbyte. I'll provide more details below; hope someone can help.
    Dev environment: Windows, using WSL and Docker with Airbyte abctl.
    This is the metadata.yaml:
    data:
      allowedHosts:
        hosts:
          - TODO # Please change to the hostname of the source.
      registries:
        oss:
          enabled: true
        cloud:
          enabled: false
      remoteRegistries:
        pypi:
          enabled: true
          packageName: airbyte-source-nexus-datasets
      connectorBuildOptions:
          baseImage: docker.io/airbyte/python-connector-base:3.0.2@sha256:73697fbe1c0e2ebb8ed58e2268484bb4bfb2cb56b653808e1680cbc50bafef75
      connectorSubtype: api
      connectorType: source
      definitionId: 9e1fe63c-80ad-44fe-8927-10e66c9e209b
      dockerImageTag: 0.1.0
      dockerRepository: airbyte/source-nexus-datasets
      githubIssueLabel: source-nexus-datasets
      icon: nexus-datasets.svg
      license: MIT
      name: Nexus Datasets
      releaseDate: TODO
      releaseStage: alpha
      supportLevel: community
      documentationUrl: https://docs.airbyte.com/integrations/sources/nexus-datasets
      tags:
        - language:python
        - cdk:low-code
      connectorTestSuitesOptions:
        - suite: liveTests
          testConnections:
            - name: nexus_datasets_config_dev_null
              id: TODO
        - suite: unitTests
        - suite: acceptanceTests
          #optional: if your acceptance tests need credentials from CI secret store
          #testSecrets:
          #  - name: SECRET_SOURCE-NEXUS-DATASETS__CREDS
          #    fileName: config.json
          #    secretStore:
          #      type: GSM
          #      alias: airbyte-connector-testing-secret-store       
    metadataSpecVersion: "1.0"
    This is the pyproject.toml:
    [build-system]
    requires = [ "poetry-core>=1.0.0",]
    build-backend = "poetry.core.masonry.api"
    
    [tool.poetry]
    version = "0.3.21"
    name = "source-nexus-datasets"
    description = "Source implementation for nexus-datasets."
    authors = [ "Airbyte <contact@airbyte.io>",]
    license = "MIT"
    readme = "README.md"
    documentation = "https://docs.airbyte.com/integrations/sources/nexus-datasets"
    homepage = "https://airbyte.com"
    repository = "https://github.com/airbytehq/airbyte"
    packages = [ { include = "source_nexus_datasets" }, {include = "main.py" } ]
    
    [tool.poetry.dependencies]
    python = ">=3.10,<3.12"
    airbyte-cdk = "6.60.0"
    pyarrow = ">=16.1,<22.0"
    pandas = "2.2.2"
    
    [tool.poetry.scripts]
    source-nexus-datasets = "source_nexus_datasets.run:run"
    
    [tool.poetry.group.dev.dependencies]
    requests-mock = "^1.9.3"
    pytest-mock = "^3.6.1"
    pytest = "^8.0.0"
    So this is the error I can see in the logs:
    Traceback (most recent call last):
      File "/usr/local/bin/pyab", line 5, in <module>
        from airbyte.cli import cli
      File "/usr/local/lib/python3.11/site-packages/airbyte/__init__.py", line 126, in <module>
        from airbyte import (
      File "/usr/local/lib/python3.11/site-packages/airbyte/cloud/__init__.py", line 59, in <module>
        from airbyte.cloud import connections, constants, sync_results, workspaces
      File "/usr/local/lib/python3.11/site-packages/airbyte/cloud/workspaces.py", line 27, in <module>
        from airbyte.sources.base import Source
      File "/usr/local/lib/python3.11/site-packages/airbyte/sources/__init__.py", line 6, in <module>
        from airbyte.sources import base, util
      File "/usr/local/lib/python3.11/site-packages/airbyte/sources/util.py", line 10, in <module>
        from airbyte._executors.util import get_connector_executor
      File "/usr/local/lib/python3.11/site-packages/airbyte/_executors/util.py", line 13, in <module>
        from airbyte._executors.declarative import DeclarativeExecutor
      File "/usr/local/lib/python3.11/site-packages/airbyte/_executors/declarative.py", line 14, in <module>
        from airbyte_cdk.sources.declarative.manifest_declarative_source import ManifestDeclarativeSource
    ModuleNotFoundError: No module named 'airbyte_cdk.sources.declarative.manifest_declarative_source'
    Can anyone please help me with this?
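    One way to narrow this down locally (a diagnostic sketch; that the pinned airbyte-cdk = "6.60.0" is out of step with the module PyAirbyte imports is an assumption to verify, not a confirmed cause):
    # Sketch: check whether the installed airbyte-cdk still ships the module
    # that PyAirbyte fails to import, and print the version for comparison.
    import importlib.util
    from importlib.metadata import version

    print("airbyte-cdk version:", version("airbyte-cdk"))
    spec = importlib.util.find_spec(
        "airbyte_cdk.sources.declarative.manifest_declarative_source"
    )
    print("manifest_declarative_source present:", spec is not None)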
  • MTA

    10/03/2025, 7:53 AM
    Hi everyone, we're currently using Airbyte Cloud and building a new connector through the UI for an API source via OAuth2. The source API needs client_id and client_secret, so no problem there, but it also needs a third parameter (id) in order to generate the token. In the UI, I don't see where I can add a custom extra parameter. I thought this might not be the first time this issue has been raised, so I thought I'd ask the community. Ideally, we would like to avoid going the development (CDK) route, because the client will not be able to maintain the code in the long run. Thanks for any guidance or help on this.
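    For context, the token exchange being described is just one extra form field (a sketch with a hypothetical token URL and field names; in the low-code CDK such a field would typically go in the OAuth authenticator's refresh_request_body, though whether the Cloud Builder UI exposes that is exactly the open question):
    # Sketch: token request with client_id, client_secret, and the extra "id" field.
    import requests

    resp = requests.post(
        "https://api.example.com/oauth/token",  # hypothetical token URL
        data={
            "grant_type": "client_credentials",
            "client_id": "CLIENT_ID",
            "client_secret": "CLIENT_SECRET",
            "id": "EXTRA_ID",  # the third parameter this API requires
        },
        timeout=30,
    )
    resp.raise_for_status()
    token = resp.json()["access_token"]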
  • Daniel Popowich

    10/03/2025, 4:09 PM
    I am developing a custom connector and am using abctl for testing. I have runtime features I would like to turn on/off during testing without having to rebuild my image. How do I set (and later change) an arbitrary environment variable for my connector? For example, I can run a source connector image from my command line like this:
    docker run --rm $MYIMAGE spec
    But if I want to turn on a custom feature I could do this:
    docker run --rm -e MYCUSTOMENV=true $MYIMAGE spec
    How do I do this while running in the abctl (kind) environment?
  • Mike Braden

    10/06/2025, 5:03 PM
    EDIT: I think this was just a version issue. Using the latest 1.8.5, there was no problem with parent stream/partitions in the polling URL. Original question: I am developing a custom connector using the Connector Builder. I have successfully configured sync streams partitioned off of a parent stream. I need to convert a couple of my streams to be async, but I am having an issue with the partitioning. I can use stream_partition.<id_name> for the creation and download URLs, but when it is used in the polling URL it inserts a null value for the current stream-partition ID, causing the request to fail. Am I missing something? Is stream partitioning not supported for the polling portion of the async job?
  • Sunny Matharu

    10/06/2025, 10:16 PM
    Hello, I have recently been exploring Airbyte open source. I can see that there is a Salesforce source connector (which is a bit clunky to configure but works fine). However, I could not see a Salesforce destination connector. (Is that only available with Airbyte paid plans?) So I decided to use a community YAML API spec of the Salesforce Bulk API 2.0 (which renders and works correctly in Swagger, and imports successfully into Postman). However, when selecting said YAML file in the connector builder, it errors out with the following error plus an enormous stack trace:
    Could not validate your connector as specified:
    
    Error handling request: Validation against json schema defined in declarative_component_schema.yaml schema failed
    Has anyone successfully managed to set up a custom Salesforce destination connector? Is it even allowed, given that it seems to be explicitly excluded from the list of connectors in the self-hosted/open-source version of Airbyte?
  • Fungai

    10/07/2025, 1:37 AM
    Hello everyone! I'm looking for a tutorial that shows the process of creating a destination connector. I found tutorials for source connectors but not destination connectors. I know there's open source code for destination connectors but is there any other resource available?
  • Gloria Usanase

    10/08/2025, 7:19 AM
    Hello everyone, I am facing issues after an Airbyte on-prem upgrade: after upgrading the ClickHouse destination from v1 to v2.1.4, syncs have become much heavier. Before the upgrade, a full DB sync finished in <30 min; now only small tables complete after 5 hours, while most large tables either queue or fail. Logs show the source occasionally dying mid-snapshot with
    org.postgresql.util.PSQLException: An I/O error occurred while sending to the backend (EOF)
    from the CTID/Xmin iterator. What we have already tried: verified the ClickHouse side (Atomic DB, correct namespace, no cluster, JSON disabled), limited concurrency to 1, ran Full refresh | Overwrite for the big tables, and adjusted the Postgres source (JDBC `socketTimeout=0&tcpKeepAlive=true&defaultRowFetchSize=1000`; role GUCs removing statement/idle timeouts). Inserts into ClickHouse occur in 2k batches, but overall throughput is far slower than v1, and several large streams still don't finish. Looking for any known v2.1.x performance regressions or tuning tips for PG→CH initial loads (batch/window sizes, fetch size, CTID vs Xmin, CDC for big tables, etc.).
  • Zeineb Ben Nessib

    10/09/2025, 9:19 AM
    Hello everyone! I'm trying to update the Pennylane connector to adapt it to v2 of the API. I'm facing an issue when I update the pagination section, which should go from an increment-based method to a cursor-based one. I updated it accordingly, but I'm getting a 500 Internal Server Error. My setup is in the screenshot below. Thanks in advance for your help!
  • Jean Baptiste Guerraz

    10/14/2025, 11:08 PM
    Hello mates! I'm planning to write a (set of) custom Java connectors (first a destination, then a source). I'm having trouble figuring out the right way to do so: I know the Java CDK is "on the way" somehow, but maybe we can still do something like it now? Does anyone here have related insights? Thank you a lot!
  • J Bob

    10/17/2025, 9:13 AM
    I've created a custom connector using the Builder, but I can't find the API for extracting and loading the YAML. Any ideas?
  • Kanhaiya Gupta

    10/17/2025, 12:33 PM
    Hi, can we achieve real-time replication from a MySQL database using Change Data Capture (CDC)? Currently the UI shows a minimum interval of 1 hour to sync with the destination.
  • Göktuğ Aşcı

    10/17/2025, 7:26 PM
    Dear Airbyte community, I have started to develop custom connectors for my company via the cool Connector Builder feature, but I am a little confused by the incremental sync section. Is there a way to set the stream to use the last max cursor value and generate the datetime cutoff dynamically? What is the best practice here?
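    Conceptually, the pattern being asked about looks like this (an illustrative sketch, not the CDK's actual implementation): the connector persists the maximum cursor value it has seen and uses it as the next run's cutoff, so the datetime boundary is derived dynamically rather than hard-coded.
    # Sketch: deriving the incremental cutoff from the last saved max cursor value.
    state = {"updated_at": "2025-10-01T00:00:00Z"}  # persisted after the previous sync

    def read_incremental(records, state):
        cutoff = state["updated_at"]          # dynamic cutoff from saved state
        new_max = cutoff
        for rec in records:
            if rec["updated_at"] > cutoff:    # only records newer than the cursor
                yield rec
                if rec["updated_at"] > new_max:
                    new_max = rec["updated_at"]
        state["updated_at"] = new_max         # becomes the next run's cutoff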
  • J Bob

    10/18/2025, 1:55 PM
    I've updated my version of Airbyte, and when I try to reload my connector YAML file I get a cryptic: "Your YAML manifest is not compatible with the Connector Builder UI, due to the following error: An unknown error has occurred. There is no UI value to switch back to. Please resolve this error with the YAML editor in order to switch to UI mode." Not very helpful. The YAML editor doesn't seem to have any validation; is there a schema I can validate the YAML against? Any clues whatsoever?
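    One way to get an actionable error message is to validate the manifest locally against the schema the CDK itself ships (a sketch; the resource path inside the airbyte-cdk package is an assumption and may vary across CDK versions):
    # Sketch: validate a Builder manifest against the CDK's declarative schema.
    from importlib.resources import files

    import jsonschema
    import yaml

    schema = yaml.safe_load(
        files("airbyte_cdk.sources.declarative")
        .joinpath("declarative_component_schema.yaml")  # assumed resource path
        .read_text()
    )
    with open("manifest.yaml") as f:
        manifest = yaml.safe_load(f)

    jsonschema.validate(instance=manifest, schema=schema)  # raises on the first violation
    print("manifest is valid")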
  • Isaac Harris-Holt

    10/22/2025, 11:19 AM
    Hey folks, we're integrating with an API that has slightly strange auth requirements. The flow looks like this:
    • Open a session
    • Reuse the same session for the sync on every stream
    • Explicitly close the session
    This doesn't appear to be possible with Airbyte Cloud, unless I'm missing something. Would appreciate any guidance.
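    For concreteness, the flow in plain Python (a sketch with hypothetical endpoints and header names), i.e., the open/reuse/close pattern that doesn't map onto per-request authenticators:
    # Sketch: open a session once, reuse it for every stream, close it at the end.
    import requests

    BASE = "https://api.example.com"  # hypothetical

    http = requests.Session()
    session_id = http.post(f"{BASE}/sessions", json={"user": "...", "key": "..."}).json()["id"]
    http.headers["X-Session-Id"] = session_id  # hypothetical session header

    try:
        for stream in ("orders", "customers"):
            for record in http.get(f"{BASE}/{stream}").json():
                ...  # emit record
    finally:
        http.delete(f"{BASE}/sessions/{session_id}")  # explicit close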
  • Aymen NEGUEZ

    10/24/2025, 1:24 PM
    The current HubSpot destination in Airbyte already supports writing to both standard objects (Contacts, Companies, Deals, etc.) and custom objects. However, it does not yet provide native support for creating associations between these records. This feature request proposes extending the HubSpot destination to handle associations via the HubSpot API. For example:
    • Linking a Contact to a Company
    • Associating a Deal with multiple Contacts
    • Relating custom objects to standard objects (e.g., a custom Subscription object linked to a Contact)
    • Supporting associations between custom objects themselves
    Key benefits:
    • Preserve the relational structure of CRM data when syncing into HubSpot.
    • Ensure that objects written via Airbyte reflect real-world business relationships.
    • Enable more advanced HubSpot use cases by leveraging both default and custom associations.
    Potential implementation details:
    • Extend the destination configuration to define association mappings (e.g., contactId → companyId).
    • Support both default HubSpot associations and custom associations defined in the HubSpot account.
    • Handle upserts gracefully to prevent duplicate or broken associations.
    Adding this functionality would make the HubSpot destination more complete and better aligned with HubSpot's data model.
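    For reference, the API surface this would build on: creating a default association between two records is a single call in HubSpot's v4 Associations API (a sketch; the token, record IDs, and object types are placeholders):
    # Sketch: associate a contact with a company using HubSpot's default association.
    import requests

    resp = requests.put(
        "https://api.hubapi.com/crm/v4/objects/contacts/101/associations/default/companies/202",
        headers={"Authorization": "Bearer pat-..."},  # placeholder token
        timeout=30,
    )
    resp.raise_for_status()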
  • Lucas Hild

    10/24/2025, 5:40 PM
    Hey, I am building a custom connector for DoiT to pull cloud costs. This is the response schema:
    [
      {
        "result": {
          "rows": [
            [
              "amazon-web-services",
              "2025",
              "10",
              "14",
              123.456,
              1760400000
            ],
            [
              "microsoft-azure",
              "2025",
              "10",
              "12",
              654.321,
              1760227200
            ],
            ...
          ]
        }
      }
    ]
    How can I access these fields properly? When I select the field path result, rows, I get this error:
    Exception while syncing stream cloud_costs: dictionary update sequence element #0 has length 12; 2 is required
    Is there any way to transform this data properly? Thanks!
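    The error suggests each positional row (a plain list) is being coerced into a dict. One way to see the shape of the fix is to zip the rows against their column names (an illustrative sketch; the column names and their order are assumptions read off the sample payload):
    # Sketch: turn DoiT's positional rows into keyed records.
    columns = ["provider", "year", "month", "day", "cost", "timestamp"]  # assumed order

    payload = [
        {"result": {"rows": [
            ["amazon-web-services", "2025", "10", "14", 123.456, 1760400000],
            ["microsoft-azure", "2025", "10", "12", 654.321, 1760227200],
        ]}}
    ]

    records = [dict(zip(columns, row)) for row in payload[0]["result"]["rows"]]
    # [{'provider': 'amazon-web-services', 'year': '2025', ...}, ...]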
  • Ляшик Іван

    10/27/2025, 9:22 AM
    Hello, we are using the Incremental | Append + Deduped sync mode in our connection, where the cursor field is based on a datetime column (e.g., updated_at). We’ve noticed a potential issue with how Airbyte handles incremental syncs: if new records appear in the source with the same cursor value as the last recorded cursor (for example, 2025-10-20 09:42:11), these rows are not picked up in the next sync. As far as we understand, this happens because Airbyte always applies the condition cursor_field > last_cursor_value, not >=, when filtering incremental data. This creates a risk of data loss if multiple rows share the same timestamp down to the same second or microsecond, which is common in high-frequency data sources. Could you please confirm:
    1. Whether there is a way to configure or modify this behavior (e.g., use >= or apply a grace period/offset for the cursor)?
    2. If not, is there any recommended best practice or workaround (such as cursor windowing or state rollback) to avoid missing records that share the same cursor value?
    Thank you for your help and clarification. Best regards!
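    The concern in miniature (an illustrative sketch, not Airbyte's actual filter): with a strict >, a row whose cursor equals the saved value is skipped on the next run, while an inclusive >= re-reads ties and relies on deduplication to drop the repeats.
    # Sketch: a strict > comparison drops rows that tie with the saved cursor.
    last_cursor = "2025-10-20 09:42:11"
    rows = [
        {"id": 1, "updated_at": "2025-10-20 09:42:11"},  # arrived after the last sync
        {"id": 2, "updated_at": "2025-10-20 09:42:12"},
    ]
    strict = [r for r in rows if r["updated_at"] > last_cursor]      # misses id=1
    inclusive = [r for r in rows if r["updated_at"] >= last_cursor]  # re-reads ties
    print(len(strict), len(inclusive))  # 1 2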
  • Stockton Fisher

    11/02/2025, 7:38 AM
    I'm using the Connector Builder. It was able to fetch my streams for testing before, but now it just seems to be dead. Any tips on understanding what the error is so I can fix it? Version: Airbyte Cloud.