# help-connector-development
  • dilan silva

    05/22/2025, 8:55 AM
    Hi Team, I'm seeking help to build my first source connector. I created a connector in the cloud version of Airbyte, and there I tried to stream the data, which returns fine. The data comes from a public API and is JSONL content. This is the manifest it produced:
    version: 6.48.15
    
    type: DeclarativeSource
    
    check:
      type: CheckStream
      stream_names:
        - datasets
    
    definitions:
      streams:
        datasets:
          type: DeclarativeStream
          name: datasets
          retriever:
            type: SimpleRetriever
            requester:
              $ref: "#/definitions/base_requester"
              path: /api/export
              http_method: GET
            record_selector:
              type: RecordSelector
              extractor:
                type: DpathExtractor
                field_path: []
            decoder:
              type: JsonlDecoder
          schema_loader:
            type: InlineSchemaLoader
            schema:
              $ref: "#/schemas/datasets"
      base_requester:
        type: HttpRequester
        url_base: >-
          https://03100670-8969-4472-a593-d7a8cef4488b-00-15uejhh5ki8b9.janeway.replit.dev
    
    streams:
      - $ref: "#/definitions/streams/datasets"
    
    spec:
      type: Spec
      connection_specification:
        type: object
        $schema: http://json-schema.org/draft-07/schema#
        required: []
        properties: {}
        additionalProperties: true
    
    schemas:
      datasets:
        type: object
        $schema: http://json-schema.org/draft-07/schema#
        additionalProperties: true
        properties: {}
    Now when I try to do the same thing in my local connector, it does not load the data; the sync reports success with 0 bytes. In the log I can see these messages (not pasting the full log here):
    2025-05-22 13:54:15 source ERROR Marking stream records as STARTED
    2025-05-22 13:54:15 source ERROR Syncing stream instance: records
    2025-05-22 13:54:15 source ERROR Setting state of SourceNexusDatasets stream to {}
    2025-05-22 13:54:15 source ERROR Syncing stream: records 
    2025-05-22 13:54:15 source ERROR Making outbound API request
    2025-05-22 13:54:15 source INFO Starting syncing SourceNexusDatasets
    2025-05-22 13:54:15 source INFO Marking stream records as STARTED
    2025-05-22 13:54:15 source INFO Malformed non-Airbyte record (connectionId = 034b7256-278b-4830-bba5-953c366de232): {"type": "DEBUG", "message": "Syncing stream instance: records", "data": {"message": "Syncing stream instance: records", "cursor_field": "[]", "primary_key": "None"}}
    2025-05-22 13:54:15 source INFO Setting state of SourceNexusDatasets stream to {}
    2025-05-22 13:54:15 source INFO Syncing stream: records 
    2025-05-22 13:54:15 source INFO Malformed non-Airbyte record (connectionId = 034b7256-278b-4830-bba5-953c366de232): {"type": "DEBUG", "message": "Making outbound API request", "data": {"request_body": "None", "headers": "{'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}", "url": "<https://03100670-8969-4472-a593-d7a8cef4488b-00-15uejhh5ki8b9.janeway.replit.dev/api/export>", "message": "Making outbound API request"}}
    2025-05-22 13:54:15 replication-orchestrator INFO Stream status TRACE received of status: STARTED for stream records
    2025-05-22 13:54:15 replication-orchestrator INFO Sending update for records - null -> RUNNING
    2025-05-22 13:54:15 replication-orchestrator INFO Stream Status Update Received: records - RUNNING
    2025-05-22 13:54:15 replication-orchestrator INFO Creating status: records - RUNNING
    2025-05-22 13:54:17 source ERROR Receiving response
    2025-05-22 13:54:17 source INFO Malformed non-Airbyte record (connectionId = 034b7256-278b-4830-bba5-953c366de232): {"type": "DEBUG", "message": "Receiving response", "data": {"headers": "{'Content-Disposition': 'attachment; filename=\"ExportData.json\"', 'Content-Type': 'application/json', 'Date': 'Thu, 22 May 2025 08:24:19 GMT', 'Replit-Cluster': 'janeway', 'X-Powered-By': 'Express', 'X-Robots-Tag': 'none, noindex, noarchive, nofollow, nositelinkssearchbox, noimageindex, none, noindex, noarchive, nofollow, nositelinkssearchbox, noimageindex', 'Transfer-Encoding': 'chunked'}", "body": "{\"id\": 1, \"name\": \"Example Item 1\", \"category\": \"Category A\", \"price\": 19.99}\n{\"id\": 2, \"name\": \"Example Item 2\", \"category\": \"Category B\", \"price\": 29.99}\n{\"id\": 3, \"name\": \"Example Item 3\", \"category\": \"Category A\", \"price\": 15.50}\n{\"id\": 4, \"name\": \"Example Item 4\", \"category\": \"Category C\", \"price\": 45.00}\n{\"id\": 5, \"name\": \"Example Item 5\", \"category\": \"Category B\", \"price\": 35.25}", "message": "Receiving response", "status": "200"}}
    2025-05-22 13:54:17 source ERROR Read 0 records from records stream
    2025-05-22 13:54:17 source ERROR Marking stream records as STOPPED
    2025-05-22 13:54:17 source ERROR Finished syncing records
    2025-05-22 13:54:17 source ERROR SourceNexusDatasets runtimes:
    2025-05-22 13:54:17 source ERROR Syncing stream records 0:00:01.715473
    2025-05-22 13:54:17 source ERROR Finished syncing SourceNexusDatasets
    I have the same manifest configuration in the local connector except the version:
    version: 0.90.0
    
    type: DeclarativeSource
    
    check:
      type: CheckStream
      stream_names:
        - "records"
    
    definitions:
      streams:
        records:
          type: DeclarativeStream
          name: records
          retriever:
            type: SimpleRetriever
            requester:
              type: HttpRequester
          url_base: https://03100670-8969-4472-a593-d7a8cef4488b-00-15uejhh5ki8b9.janeway.replit.dev
              path: /api/export
              http_method: GET
            record_selector:
              type: RecordSelector
              extractor:
                type: DpathExtractor
                field_path: []
            decoder:
              type: JsonlDecoder
          schema_loader:
            type: InlineSchemaLoader
            schema:
              $ref: "#/schemas/datasets"
    
    
    streams:
      - "#/definitions/streams/records"
      
    schemas:
      datasets:
        type: object
        $schema: http://json-schema.org/draft-07/schema#
        additionalProperties: true
        properties: {}
    Can someone please help with this? I tried to change the version, but it gives an error:
    jsonschema.exceptions.ValidationError: The manifest version 6.48.15 is greater than the airbyte-cdk package version (0.90.0). Your manifest may contain features that are not in the current CDK version.
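    One difference worth noting between the two manifests: the cloud manifest lists the stream as a $ref mapping, while the local one lists a plain string. A minimal sketch of the referenced form, assuming the locally pinned CDK resolves stream references the same way the newer builder does:

    streams:
      - $ref: "#/definitions/streams/records"

    The version error itself is what the message says: the manifest exported from the cloud builder (6.48.15) targets a much newer airbyte-cdk than the locally pinned 0.90.0, so the two presumably need to be aligned (upgrade the local airbyte-cdk, or author the manifest against 0.90.0).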
  • Mathieu Dumoulin

    05/22/2025, 5:53 PM
    Hi everyone, I have a question about parent streams. Given the API endpoint https://api.inscribe.ai/api/v2/customers/{customer_id}/bank_accounts/{bank_account_id}/transactions, is it possible to define two parent streams where I get customer_id from one, and then for each customer id, get the bank_account_id? I'm using the Connector Builder in Airbyte Cloud, and the result I get is that every combination of A and B is sent to the API, which doesn't work at all.
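    For reference, the low-code framework can chain parent streams by giving each child its own SubstreamPartitionRouter, so transactions is partitioned by bank_accounts, which is itself partitioned by customers; that yields only valid (customer, bank account) pairs rather than a cross-product. A rough sketch, abridged as it would appear under definitions/streams (stream names, parent_key fields, and the parent_slice lookup are assumptions, not taken from the Inscribe API):

    bank_accounts:
      type: DeclarativeStream
      retriever:
        type: SimpleRetriever
        requester:
          $ref: "#/definitions/base_requester"
          path: /api/v2/customers/{{ stream_partition.customer_id }}/bank_accounts
        partition_router:
          type: SubstreamPartitionRouter
          parent_stream_configs:
            - type: ParentStreamConfig
              stream:
                $ref: "#/definitions/streams/customers"
              parent_key: id
              partition_field: customer_id
    transactions:
      type: DeclarativeStream
      retriever:
        type: SimpleRetriever
        requester:
          $ref: "#/definitions/base_requester"
          # bank_account_id comes from the bank_accounts partition; the customer id
          # is read from that partition's own parent slice, not from a cross-product
          path: /api/v2/customers/{{ stream_partition.parent_slice.customer_id }}/bank_accounts/{{ stream_partition.bank_account_id }}/transactions
        partition_router:
          type: SubstreamPartitionRouter
          parent_stream_configs:
            - type: ParentStreamConfig
              stream:
                $ref: "#/definitions/streams/bank_accounts"
              parent_key: id
              partition_field: bank_account_id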
  • Hadrien Lepousé

    05/24/2025, 7:51 PM
    Hey everyone, I'm working on the HubSpot source connector; on branch master I have lots of unit test failures. Does anybody know what's wrong? I'm using Python 3.11.6. For example:
    poetry run pytest unit_tests/integrations/test_web_analytics_streams.py::TestCRMWebAnalyticsStream::test_given_two_pages_when_read_then_return_records
    Result:
    ======================================== short test summary info =========================================
    FAILED unit_tests/integrations/test_web_analytics_streams.py::TestCRMWebAnalyticsStream::test_given_two_pages_when_read_then_return_records[tickets_web_analytics-tickets-ticket-parent_stream_associations0] - ValueError: Invalid number of matches for `HttpRequestMatcher(request_to_match=ParseResult(scheme='https', netloc='<http://api.hubapi.com|api.hubapi.com>', path='/crm/v3/objects/ticket', params='', query='archived=false&associations=contacts&associations=deals&associations=companies&limit=100&properties=closed_date,createdate', fragment='') with headers {} and body None), minimum_number_of_expected_match=1, actual_number_of_matches=0)`
    FAILED unit_tests/integrations/test_web_analytics_streams.py::TestCRMWebAnalyticsStream::test_given_two_pages_when_read_then_return_records[deals_web_analytics-deals-deal-parent_stream_associations1] - ValueError: Request ParseResult(scheme='https', netloc='<http://api.hubapi.com|api.hubapi.com>', path='/crm/v3/schemas', params='', query='', fragment='') with headers {} and body None) already mocked
    FAILED unit_tests/integrations/test_web_analytics_streams.py::TestCRMWebAnalyticsStream::test_given_two_pages_when_read_then_return_records[companies_web_analytics-companies-company-parent_stream_associations2] - ValueError: Request ParseResult(scheme='https', netloc='<http://api.hubapi.com|api.hubapi.com>', path='/crm/v3/schemas', params='', query='', fragment='') with headers {} and body None) already mocked
    FAILED unit_tests/integrations/test_web_analytics_streams.py::TestCRMWebAnalyticsStream::test_given_two_pages_when_read_then_return_records[contacts_web_analytics-contacts-contact-parent_stream_associations3] - ValueError: Request ParseResult(scheme='https', netloc='<http://api.hubapi.com|api.hubapi.com>', path='/crm/v3/schemas', params='', query='', fragment='') with headers {} and body None) already mocked
    FAILED unit_tests/integrations/test_web_analytics_streams.py::TestCRMWebAnalyticsStream::test_given_two_pages_when_read_then_return_records[engagements_calls_web_analytics-engagements_calls-calls-parent_stream_associations4] - ValueError: Request ParseResult(scheme='https', netloc='<http://api.hubapi.com|api.hubapi.com>', path='/crm/v3/schemas', params='', query='', fragment='') with headers {} and body None) already mocked
    FAILED unit_tests/integrations/test_web_analytics_streams.py::TestCRMWebAnalyticsStream::test_given_two_pages_when_read_then_return_records[engagements_emails_web_analytics-engagements_emails-emails-parent_stream_associations5] - ValueError: Request ParseResult(scheme='https', netloc='<http://api.hubapi.com|api.hubapi.com>', path='/crm/v3/schemas', params='', query='', fragment='') with headers {} and body None) already mocked
    FAILED unit_tests/integrations/test_web_analytics_streams.py::TestCRMWebAnalyticsStream::test_given_two_pages_when_read_then_return_records[engagements_meetings_web_analytics-engagements_meetings-meetings-parent_stream_associations6] - ValueError: Request ParseResult(scheme='https', netloc='<http://api.hubapi.com|api.hubapi.com>', path='/crm/v3/schemas', params='', query='', fragment='') with headers {} and body None) already mocked
    FAILED unit_tests/integrations/test_web_analytics_streams.py::TestCRMWebAnalyticsStream::test_given_two_pages_when_read_then_return_records[engagements_notes_web_analytics-engagements_notes-notes-parent_stream_associations7] - ValueError: Request ParseResult(scheme='https', netloc='<http://api.hubapi.com|api.hubapi.com>', path='/crm/v3/schemas', params='', query='', fragment='') with headers {} and body None) already mocked
    FAILED unit_tests/integrations/test_web_analytics_streams.py::TestCRMWebAnalyticsStream::test_given_two_pages_when_read_then_return_records[engagements_tasks_web_analytics-engagements_tasks-tasks-parent_stream_associations8] - ValueError: Request ParseResult(scheme='https', netloc='<http://api.hubapi.com|api.hubapi.com>', path='/crm/v3/schemas', params='', query='', fragment='') with headers {} and body None) already mocked
  • Giulliano Bueno

    06/02/2025, 8:41 AM
    I'm trying to collect data from my client's Databricks and save it to our BigQuery dataset using Airbyte Cloud. However, I've hit a roadblock with building a JDBC connector. After copying an existing connector, I need guidance on creating the repository and configuring the gradlew build process. Can anyone point me to tutorials or resources on contributing to this project, or on setting up Databricks as a source without creating a new connector?
  • Erin Yener

    06/02/2025, 11:23 AM
    Hey, community! I’m interested in forking a community connector and adding an API budget parameter. I’m using Airbyte Cloud and would love to do this in the Builder UI if possible. • Is it possible to add an API budget using the UI? It looks like it would be a ‘new user input’ in the Inputs section, but I would love confirmation on how best to do this. • If it’s best to modify the YAML directly, could anyone advise me on which section of the YAML to add the API budget to? The docs have some examples, but I’m not clear on how to add this to a forked connector. • Are there existing connectors that have this parameter as an optional input, so that I can see how it ‘fits’? Thanks for the help!
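    For what it's worth, the rate-limiting (API budget) section of the CDK docs describes a top-level api_budget key in the YAML, sitting next to streams and spec rather than inside a stream, roughly like the sketch below (field names are per my reading of those docs; the numbers are placeholders). Making it user-configurable would then mean declaring a new user input and interpolating it, e.g. {{ config['requests_per_minute'] }}, into the rate:

    api_budget:
      type: HTTPAPIBudget
      policies:
        - type: MovingWindowCallRatePolicy
          rates:
            - limit: 60        # placeholder: max calls per window
              interval: PT1M   # ISO-8601 window length (one minute)
          matchers: []         # add HttpRequestRegexMatcher entries to scope the policy to specific endpoints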
  • Jens Mostaert

    06/04/2025, 12:53 PM
    I could use some guidance in developing an Exact Online connector using the connector builder. After some effort I got the OAuth call working to the point that I receive an access_token. However, in the "testingValues" in the browser console I see this token is stored under the key "undefined", which causes issues when trying to call one of the streams after authenticating.
  • Nick Zombolas

    06/04/2025, 8:06 PM
    Hey, I'm working on a connector that has some YAML streams and some Python streams. I want to implement a parent stream that gets a secret from AWS Secrets Manager and uses the result in the child streams. I've implemented this in Python already and it works as a parent stream for my Python stream, but I'm having trouble getting it to work for the YAML streams. I've tried to set the Python stream as the parent stream for all YAML streams, but I can't get the schema to validate, since the YAML doesn't have knowledge of the Python stream during the spec job. I've been looking into how to implement this stream in YAML instead of calling my Python class, but I'm not sure of the best way to get this done. Any advice? Thanks!
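    One pattern that may fit (a sketch, untested): keep the AWS Secrets Manager logic in Python, but expose it to the YAML streams as a custom component referenced by class_name, e.g. a custom partition router that yields the secret as a partition value, instead of making the Python stream their parent. The module path and field names below are hypothetical:

    records:
      type: DeclarativeStream
      retriever:
        type: SimpleRetriever
        requester:
          $ref: "#/definitions/base_requester"
          # the value produced by the custom router is available to interpolation
          path: /items/{{ stream_partition.secret_value }}
        partition_router:
          type: CustomPartitionRouter
          # hypothetical class in the connector package that reads AWS Secrets Manager
          class_name: source_my_connector.components.SecretsManagerPartitionRouter

    The custom class is only imported when the streams are actually built, so it shouldn't need to be resolvable during the spec job.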
  • Mert Ors

    06/05/2025, 9:05 AM
    This PR broke all my Airtable connections: https://github.com/airbytehq/airbyte/commit/0a99c45298d53f6c42d02332e1882399d66fb84c
  • Paul

    06/05/2025, 1:35 PM
    I am beginning to learn how to modify connectors and have tried to make a small change to the blob storage destination. I made the code change and have attempted to run:
    airbyte-ci connectors --name destination-azure-blob-storage build
    But sadly, after about 30 seconds it fails. The build output HTML is empty. The dagger.log file is also empty. This is what I get in the window log below - any ideas what I've missed? I've updated the secrets as per the README, but no luck.
    [23:33:09] INFO     root: Setting working directory to /home/pladmin/airbyte/airbyte                                                                                                         ensure_repo_root.py:58
    [23:33:10] INFO     root: Setting working directory to /home/pladmin/airbyte/airbyte                                                                                                         ensure_repo_root.py:58
               INFO     pipelines: airbyte-ci is up to date. Installed version: 5.2.5. Latest version: 5.2.5                                                                                          auto_update.py:89
               INFO     pipelines: Called with dagger run: False                                                                                                                                      airbyte_ci.py:127
               INFO     pipelines.cli.dagger_run: Running command: ['/home/pladmin/bin/dagger', '--silent', 'run', 'airbyte-ci', 'connectors', '--name', 'destination-azure-blob-storage', 'build']   dagger_run.py:120
    [23:33:18] INFO     root: Setting working directory to /home/pladmin/airbyte/airbyte                                                                                                         ensure_repo_root.py:58
    [23:33:19] INFO     root: Setting working directory to /home/pladmin/airbyte/airbyte                                                                                                         ensure_repo_root.py:58
               INFO     pipelines: airbyte-ci is up to date. Installed version: 5.2.5. Latest version: 5.2.5                                                                                          auto_update.py:89
               INFO     pipelines: Called with dagger run: True                                                                                                                                       airbyte_ci.py:127
    [23:33:27] INFO     pipelines: Will run on the following 1 connectors: destination-azure-blob-storage.                                                                                               commands.py:32
               INFO     pipelines: Running Dagger Command build...                                                                                                                        dagger_pipeline_command.py:32
               INFO     pipelines: If you're running this command for the first time the Dagger engine image will be pulled, it can take a short minute...                                dagger_pipeline_command.py:33
               INFO     pipelines: Saving dagger logs to:                                                                                                                                 dagger_pipeline_command.py:43
                        /home/pladmin/airbyte/airbyte/airbyte-ci/connectors/pipelines/pipeline_reports/airbyte-ci/connectors/build/manual/master/1749094400/b2ffb0185be442ddf72677067d3a8
                        243fbba770f/dagger.log
               INFO     pipelines: Building connectors for ['linux/amd64'], use --architecture to change this.                                                                                           commands.py:46
               INFO     Build connector destination-azure-blob-storage: Should send status check: False                                                                                         pipeline_context.py:222
    [23:33:29] INFO     root: Using storage driver: fuse-overlayfs                                                                                                                                         docker.py:85
    [23:33:56] INFO     Build connector destination-azure-blob-storage: Caching the latest CDK version...                                                                                       pipeline_context.py:284
               INFO     Build connector destination-azure-blob-storage: Should send status check: False                                                                                         pipeline_context.py:222
               INFO     Build connector destination-azure-blob-storage - Build connector tar: 🚀 Start Build connector tar                                                                                 steps.py:303
               ERROR    Build connector destination-azure-blob-storage: An error got handled by the ConnectorContext                                                                                     context.py:253
                        ╭───────────────────────────────────────────────────────────────────── Traceback (most recent call last) ──────────────────────────────────────────────────────────────────────╮
                        │ in run_connector_build_pipeline:49                                                                                                                                           │
                        │                                                                                                                                                                              │
                        │ in run_connector_build:33                                                                                                                                                    │
                        │                                                                                                                                                                              │
                        │ in run_connector_build:60                                                                                                                                                    │
                        │                                                                                                                                                                              │
                        │ in run:307                                                                                                                                                                   │
                        │                                                                                                                                                                              │
                        │ in __aexit__:772                                                                                                                                                             │
                        ╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
                        ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
               ERROR    Build connector destination-azure-blob-storage: No test report was provided. This is probably due to an upstream error                                                           context.py:255
    ╭──────────────────────────────────────────────────────────────────────────────────── DESTINATION-AZURE-BLOB-STORAGE - REPORT ────────────────────────────────────────────────────────────────────────────────────╮
    │        Steps results                                                                                                                                                                                            │
    │ ┏━━━━━━┳━━━━━━━━┳━━━━━━━━━━┓                                                                                                                                                                                    │
    │ ┃ Step ┃ Result ┃ Duration ┃                                                                                                                                                                                    │
    │ ┡━━━━━━╇━━━━━━━━╇━━━━━━━━━━┩                                                                                                                                                                                    │
    │ └──────┴────────┴──────────┘                                                                                                                                                                                    │
    │ ℹ️  You can find more details with step executions logs in the saved HTML report.                                                                                                                                │
    ╰───────────────────────────────────────────────────────────────────── ⏲️  Total pipeline duration for destination-azure-blob-storage: 0.36s ────────────────────────────────────────────────────────────────────
  • Aphonso Henrique do Amaral Rafael

    06/05/2025, 5:10 PM
    Hello community, I built a custom source connector in my Airbyte instance that works fine when testing (it can fetch the data and display the results, etc., so everything looks good), as shown in the first picture. However, due to requirements from my company, I cannot fetch images from remote repositories, only from the authorised remote registry where we store all the images we pull from. So, for instance, one of my connectors is MySQL, hence I created a "custom connector" over the original MySQL image from Airbyte, hosted it internally, and then it works. My struggle is that for this custom connector I don't have an "original image" to fetch and host in my repository, so I am trying to identify where I can find the image of this custom connector, so I can host it in my remote repository as well - if that makes sense. When I try to use this custom connector in my sources, I see this "source-declarative-manifest" file (2nd picture), but I don't believe this is the "image" of the connector, which is what I'm actually looking for. Any thoughts? Thank you very much!
  • Juliette Duizabo

    06/09/2025, 4:39 PM
    Hello, I am trying to create a super simple connector to retrieve Airbyte metadata (inception). I tested this connector in the builder and it works; however, when I want to add the new connection, in "define source", I paste the same Client Secret as the one I used in the builder, the test takes a long time (feels like ~3 minutes), and then says:
    Configuration check failed
    'Encountered an error while checking availability of stream sources. Error: Request URL: <https://api.airbyte.com/v1/applications/token>, Response Code: 500, Response Text: {"message":"Internal Server Error","_links":{"self":{"href":"/api/public/v1/applications/token","templated":false}},"_embedded":{"errors":[{"message":"Internal Server Error: class org.jboss.resteasy.client.jaxrs.engines.ManualClosingApacheHttpClient43Engine$3 cannot be cast to class io.micronaut.jaxrs.common.JaxRsMutableResponse (org.jboss.resteasy.client.jaxrs.engines.ManualClosingApacheHttpClient43Engine$3 and io.micronaut.jaxrs.common.JaxRsMutableResponse are in unnamed module of loader \'app\')","_links":{},"_embedded":{}}]}}'
    It looks like the issue is on Airbyte's side. Have any of you managed to set up the import of Airbyte metadata to get observability in your warehouse?
  • Gergely Imreh

    06/10/2025, 12:58 PM
    Heyhey, I'm building a custom connector in the UI, and coming up short figuring out how to configure a specific query. • I have one stream that returns a bunch of IDs (let's call the field id, with values id1, id2, etc.). • The child stream would need a query payload that puts those ids into a list in the request body, such as:
    {"input": [{"id": id1}, {"id": id2}, ....]}
    and sends off that query (it's a batch one by default). Is this possible to configure (with a parent substream like this)? Or do I have to just run a sequential list of queries with
    {"input": [{"id": id1}]}
    then
    {"input": [{"id": id2}]}
    .... This would likely work, though it would probably hit rate limits and take longer than running everything in one go. Any suggestions? 🤔
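    If the sequential fallback turns out to be the way to go, the usual low-code shape would be a SubstreamPartitionRouter over the parent ids plus a request_body_json that interpolates one id per request; sending all ids in a single batched body isn't something the stock components obviously express, so that would likely need a custom component. A rough sketch (stream names, the endpoint path, and the body field are assumptions):

    child_stream:
      type: DeclarativeStream
      retriever:
        type: SimpleRetriever
        requester:
          $ref: "#/definitions/base_requester"
          path: /batch-endpoint        # hypothetical path
          http_method: POST
          request_body_json:
            # one id per request, taken from the parent stream's records;
            # depending on the CDK version this may be sent as a string rather
            # than a real JSON array, which is another reason a custom component
            # may be needed for the true batched form
            input: '[{"id": "{{ stream_partition.item_id }}"}]'
        partition_router:
          type: SubstreamPartitionRouter
          parent_stream_configs:
            - type: ParentStreamConfig
              stream:
                $ref: "#/definitions/streams/ids"
              parent_key: id
              partition_field: item_id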
  • Mike Moyer

    06/10/2025, 9:34 PM
    Hi all, I want to build a custom connector using the Python CDK to fetch data from an API where the API has the following requirements: 1. the request must be encrypted 2. the API provides an encrypted response that must then be decrypted Could a custom connector be used here to encrypt the request before sending and then decrypt the response? Or are there limitations to how custom connectors function that make this impossible/impractical? Thanks in advance for the help.
  • Anthony Smart

    06/11/2025, 1:40 PM
    Airbyte iterable EU connector - Do we know why this PR hasn't been merged? @[DEPRECATED] Marcos Marx
  • Albert Le

    06/11/2025, 6:57 PM
    Problem: Hi all, I'm using the new Builder Tool to connect to an API which has an endpoint that returns a list of study ids; this endpoint is /studies. There is another sub-endpoint called /studies/{id}/availability, where {id} is a single study id. Does the new Builder Tool have an automated way of letting me call the /studies endpoint, get a list of study_ids, and use that response as the input parameter for the sub-endpoint? What I tried: I searched through the documentation, but couldn't find anything for my use case.
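    This is the standard parent/child (substream) pattern: in the Builder it lives under the parameterized-requests / parent-stream options, and in YAML it becomes a SubstreamPartitionRouter whose partition value is interpolated into the path. A rough sketch, assuming each /studies record exposes its id under an id field:

    availability:
      type: DeclarativeStream
      retriever:
        type: SimpleRetriever
        requester:
          $ref: "#/definitions/base_requester"
          path: /studies/{{ stream_partition.study_id }}/availability
        partition_router:
          type: SubstreamPartitionRouter
          parent_stream_configs:
            - type: ParentStreamConfig
              stream:
                $ref: "#/definitions/streams/studies"
              parent_key: id
              partition_field: study_id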
  • Jeroen de Vries

    06/12/2025, 10:26 AM
    -- Building Python connectors with Poetry V2 -- Should it be possible to build a Python connector using Poetry 2.1? The docs state that Poetry should be >= 1.1.8. Right now my build fails with
    Failed to build connector image for platform linux/arm64: resolve: process "poetry check" did not complete successfully: exit code: 1
    But I want to exclude the use of Poetry.
  • Matthew Wagaman

    06/12/2025, 7:17 PM
    I'm attempting to use the AsyncRetriever in my low-code source connector. I keep getting JSON schema validation errors and I can't track down what is wrong. Anyone see the problem? Manifest in thread.
  • Rashi Bhave

    06/12/2025, 7:40 PM
    Hi all, I've been trying to add semantic chunking to my connector (Drive ---> Weaviate). It seems like airbyte-cdk only supports fixed-length chunking. I traced back the code and updated the necessary part in the airbyte-cdk repo ----> https://github.com/RashiBhave-nur/airbyte-python-cdk-custom/blob/main/airbyte_cdk/destinations/vector_db_based/document_processor.py I am building the Docker image (only for the destination-weaviate code from the airbyte repo) with the attached Dockerfile. The Docker image is added successfully in Settings -> Destinations -> + New Connector. While creating a connector it passes all destination tests too, although the sync is failing with the below error (logs in txt file). I need some guidance on this. 1. Is there an easier way to do this? 2. Am I missing something? 3. If this is the right way, can someone help with the error? Thanks in advance!
    Dockerfile
    job_6108_attempt_5_txt.txt
  • Theo Marchal

    06/16/2025, 9:35 AM
    Hello, why isn't it possible to fork the HubSpot connector on Airbyte Cloud? Many connectors are available, but I would need to fork HubSpot specifically. Any workaround would be appreciated, thanks.
  • Alejandro De La Cruz López

    06/16/2025, 10:51 AM
    Hey, I'm trying to upgrade the ClickUp source connector to get new fields that the API is returning. However, the "task" stream test returns a 400 error every time. This happens even if I don't change a thing on the source, which is weird because the source works well when using it in a connector. Do you have any idea?
    Client error : 400 Bad Request {"exceptionStack":"Traceback (most recent call last):\n File \"/home/airbyte/.pyenv/versions/3.10.17/lib/python3.10/site-packages/airbyte_cdk/connector_builder/connector_builder_handler.py\", line 83, in read_stream\n stream_read = test_read_handler.run_test_read(\n File \"/home/airbyte/.pyenv/versions/3.10.17/lib/python3.10/site-packages/airbyte_cdk/connector_builder/test_reader/reader.py\", line 128, in run_test_read\n schema, log_messages = self._get_infered_schema(\n File \"/home/airbyte/.pyenv/versions/3.10.17/lib/python3.10/site-packages/airbyte_cdk/connector_builder/test_reader/reader.py\", line 328, in _get_infered_schema\n schema = schema_inferrer.get_stream_schema(configured_stream.stream.name)\n File \"/home/airbyte/.pyenv/versions/3.10.17/lib/python3.10/site-packages/airbyte_cdk/utils/schema_inferrer.py\", line 266, in get_stream_schema\n self._clean(self.stream_to_builder[stream_name].to_schema())\n File \"/home/airbyte/.pyenv/versions/3.10.17/lib/python3.10/site-packages/airbyte_cdk/utils/schema_inferrer.py\", line 154, in _clean\n self._clean_properties(node)\n File \"/home/airbyte/.pyenv/versions/3.10.17/lib/python3.10/site-packages/airbyte_cdk/utils/schema_inferrer.py\", line 131, in _clean_properties\n self._clean(value)\n File \"/home/airbyte/.pyenv/versions/3.10.17/lib/python3.10/site-packages/airbyte_cdk/utils/schema_inferrer.py\", line 157, in _clean\n self._clean(node[_ITEMS])\n File \"/home/airbyte/.pyenv/versions/3.10.17/lib/python3.10/site-packages/airbyte_cdk/utils/schema_inferrer.py\", line 154, in _clean\n self._clean_properties(node)\n File \"/home/airbyte/.pyenv/versions/3.10.17/lib/python3.10/site-packages/airbyte_cdk/utils/schema_inferrer.py\", line 131, in _clean_properties\n self._clean(value)\n File \"/home/airbyte/.pyenv/versions/3.10.17/lib/python3.10/site-packages/airbyte_cdk/utils/schema_inferrer.py\", line 160, in _clean\n self._ensure_null_type_on_top(node)\n File \"/home/airbyte/.pyenv/versions/3.10.17/lib/python3.10/site-packages/airbyte_cdk/utils/schema_inferrer.py\", line 134, in _ensure_null_type_on_top\n if isinstance(node[_TYPE], list):\nKeyError: 'type'\n","exceptionClassName":"io.airbyte.protocol.models.v0.AirbyteTraceMessage
  • Cadu Magalhães

    06/16/2025, 4:48 PM
    Some connectors are broken due to a constraint requiring airbyte >=1.7.0. Not sure if this was on purpose, but I guess not, since 1.7.0 has no public release date or anything. There are currently 6 connectors that can't be upgraded, or in my case, can't even be installed in a new instance. https://github.com/airbytehq/airbyte/issues/61525
  • Arthur Dev

    06/17/2025, 2:42 PM
    Hi everyone! Has anyone here successfully built a source connector for Bitrix24? I'm trying to set up incremental sync and ran into some issues with how the Bitrix24 API handles filters in the request body. Before going too deep, I wanted to check if anyone has tackled this before and could share insights or tips. Thanks in advance!
  • Kailash Bisht

    06/19/2025, 1:05 PM
    Hey, does anybody use source-connector-facebook-pages with OSS?
  • Sushmita Sen

    06/20/2025, 7:16 AM
    Hi everyone, we have recently migrated to Airbyte v1.4, and we upgraded the Google Ads connector version to 3.8.2, which at the backend uses API version 18. Now I see that v18 is also going to be deprecated in August. I would like to understand how Airbyte will handle the upgrade. Are we going to get an option to upgrade the Google Ads connector before the sunset? How will it reflect in the UI, and do we need to restart the Docker instance?
  • Forrest Hicks

    06/20/2025, 5:22 PM
    Hey everyone, I recently upgraded to Airbyte v1.7 and had an issue with older custom connectors I had built. I couldn't test them because of an unknown error. I also can't switch from the YAML view to the UI view because of incompatible YAML. I'll attach messages below. I saw in an issue raised recently on GitHub that downgrading to v1.6.3 fixes the issue temporarily, but I tried that and it didn't work for me. Could anyone help me with this? Maybe I'm missing something.
  • Ananta Patil

    06/23/2025, 10:50 AM
    Hi all, has anyone built a connector for Apache Iceberg? Thanks
  • Jens Mostaert

    06/23/2025, 1:43 PM
    I'm trying to build a low-code connector for Exact Online. Exact Online uses OAuth authentication with single-use refresh tokens. I got this working correctly with one stream using the OAuthAuthenticator with a refresh_token_updater. However, as soon as I add a second stream, the sync starts to fail when the access_token expires for the first time. How does Airbyte handle token refreshes with multiple streams? Will each stream try to refresh the token individually? That might explain the error. Is there a way to orchestrate token refreshes of single-use refresh tokens when using multiple streams?
  • Vivien Morlet

    06/23/2025, 5:32 PM
    Hi, I am working on the source-notion connector. I am trying: 1) to run "airbyte-ci connectors --name source-notion test". Everything works except the acceptance tests, which fail with no explicit logs -> see here for the detailed message and logs: https://airbytehq.slack.com/archives/C021JANJ6TY/p1750699518735949 2) to add the version "dev" in the Airbyte UI so that I can test it, but the update doesn't work. Do you know why? -> see here for the detailed message: https://airbytehq.slack.com/archives/C021JANJ6TY/p1750699638874139 Thank you for your help!
  • Kailash Bisht

    06/24/2025, 8:19 AM
    Hi, can anyone please share the latest working manifest for the facebook-pages connector?
  • Eyþór Helgason

    06/24/2025, 9:59 AM
    Hi, does the Shopify connector support poe instead of airbyte-ci? (airbyte-ci should be deprecated according to a comment above.) I am working on adding metafield definition streams, and when I run poe test-integration-tests in the directory /airbyte-integrations/connectors/source-shopify I get the following error:
    Poe => set -eu # Ensure we return non-zero exit code upon failure
    
    if ls integration_tests/test_*.py >/dev/null 2>&1; then
      poetry run pytest --junitxml=build/test-results/pytest-integration-tests-junit.xml integration_tests
    else
      echo "No 'integration_tests' directory found; skipping integration tests."
    fi
    The currently activated Python version 3.12.8 is not supported by the project (^3.10,<3.12).
    Trying to find and use a compatible version. 
    Using python3.11 (3.11.11)
    No 'integration_tests' directory found; skipping integration tests.
    Even though integration_tests is defined with the following files:
    integration_tests
      __init__.py
      abnormal_state.json
      acceptance.py
      configured_catalog.json
      expected_records.jsonl
      expected_records_transactions_with_user_id.jsonl
      invalid_config.json
      invalid_config_old.json
      invalid_oauth_config.json
      state.json
    Any help would be greatly appreciated :)