# help-connector-development
  • Pankaj Lal

    05/05/2025, 7:28 AM
    I am building a connector to cxone.niceincontact.com, which uses OAuth2 authentication. This is what I am providing as the consent URL (I have already provided client_id and client_secret in Inputs):
    Copy code
    https://cxone.niceincontact.com/auth/authorize?{{client_id_key}}={{client_id_value}}&{{redirect_uri_key}}={{{{redirect_uri_value}}%7CurlEncoder}}&{{state_key}}={{state_value}}&scope=openid&response_type=code
    I see this error in the console when I test the connection and Airbyte calls
    Copy code
    get_oauth_consent_url
    {
        "message": "Internal Server Error: Error parsing '{{redirect_uri_value': syntax error at position 129, encountered '}', expected ':'",
        "exceptionClassName": "com.hubspot.jinjava.interpret.FatalTemplateErrorsException",
        "exceptionStack": [],
        "rootCauseExceptionStack": []
    }
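    A possible fix, assuming the urlEncoder filter can be applied inside a single pair of braces (the nested {{{{...}}}} braces are what Jinja is choking on), would be a consent URL like:
    Copy code
    consent_url: >-
      https://cxone.niceincontact.com/auth/authorize?{{client_id_key}}={{client_id_value}}&{{redirect_uri_key}}={{redirect_uri_value|urlEncoder}}&{{state_key}}={{state_value}}&scope=openid&response_type=code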
  • Max Werner

    05/05/2025, 3:04 PM
    Is it possible to inject a JSON array into the request body for incremental syncs? The API spec I'm working with accepts filters, but as arrays, like this:
    Copy code
    {
      "filters": [
        {
          "$gt": {
            "audit.modifiedDateTime": "2025-05-05T12:34:56Z"
          }
        }
      ]
    }
    but when I have an Incremental Sync section in the YAML like
    Copy code
    type: DatetimeBasedCursor
    cursor_field: audit.modifiedDateTime
    cursor_datetime_formats:
      - '%Y-%m-%dT%H:%M:%SZ'
    datetime_format: '%Y-%m-%dT%H:%M:%SZ'
    start_datetime:
      type: MinMaxDatetime
      datetime: '{{ config["start_date"] }}'
      datetime_format: '%Y-%m-%dT%H:%M:%SZ'
    start_time_option:
      type: RequestOption
      inject_into: body_json
      field_path:
        - filters
        - '0'
        - $gt
        - audit.modifiedDateTime
    The filters part of the request does not become an array but a keyed object like
    Copy code
    "filters": {
          "0": {
            "$gt": {
              "audit.modifiedDateTime": "2005-05-01T00:00:00Z"
            }
          }
        }
    How can I tell Airbyte that filters in the request JSON body is supposed to be an array?
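    One workaround to sketch (an assumption, not verified: it relies on the low-code interpolation returning the evaluated list rather than a string when the whole value is a single Jinja expression) is to drop start_time_option and template the array directly on the requester:
    Copy code
    requester:
      type: HttpRequester
      http_method: POST
      request_body_json:
        filters: "{{ [{'$gt': {'audit.modifiedDateTime': stream_interval['start_time']}}] }}"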
  • Robert Kolb

    05/06/2025, 1:54 PM
    Has anyone found a reliable workaround to connect to Databricks as a source with Airbyte Cloud?
  • Sirine Hdiji

    05/07/2025, 8:39 AM
    Hi everyone! I'm working on a declarative stream using YAML, and I'm trying to use DatetimeStreamSlicer to generate monthly date slices between a start_date and an end_date defined in the config: 202501, 202502... My goal is to pass the date from each slice (in the format YYYYMM) as a required query parameter in the API request. Here's a simplified version of my YAML configuration:
    Copy code
    stream_slicer:
        type: DatetimeStreamSlicer
        cursor_field: date
        start_datetime: "{{ config['start_date'] }}"
        end_datetime: "{{ config['end_date'] }}"
        step: "1M"
        datetime_format: "%Y%m"
        cursor_granularity: "P1M"
    
    retriever:
      type: SimpleRetriever
      requester:
        request_parameters:
          date: "{{ stream_slice['date'] }}"
    However, in the actual requests, the date parameter is missing and the API returns a 400 error: "must have required property 'date'". Is DatetimeStreamSlicer still supported, or has it been deprecated in favor of another approach? Also, what are the best practices for passing dynamic query parameters? Any tips or examples would be appreciated! Thanks a lot 🙏
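    DatetimeStreamSlicer comes from an older version of the low-code schema; current CDK versions express this with DatetimeBasedCursor under incremental_sync. A rough equivalent of the config above (a sketch, with formats taken from the question) that also injects the slice start as the date query parameter:
    Copy code
    incremental_sync:
      type: DatetimeBasedCursor
      cursor_field: date
      cursor_datetime_formats:
        - "%Y%m"
      datetime_format: "%Y%m"
      step: P1M
      cursor_granularity: P1M
      start_datetime:
        type: MinMaxDatetime
        datetime: "{{ config['start_date'] }}"
        datetime_format: "%Y%m"
      end_datetime:
        type: MinMaxDatetime
        datetime: "{{ config['end_date'] }}"
        datetime_format: "%Y%m"
      start_time_option:
        type: RequestOption
        inject_into: request_parameter
        field_name: date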
  • Afif

    05/08/2025, 6:56 AM
    Hi everyone. I'm working on a custom destination connector. I'm trying to upload the Docker image to my local Airbyte (using abctl). Somehow the connector fails with exit 1 when running spec, but it works well when I run docker run <image> spec directly.
  • Andres Pulgarin

    05/08/2025, 11:14 AM
    Good afternoon, everyone. It's a pleasure to be part of this incredible community. I wanted to ask if anyone has worked with this connector. I was asked to connect the ERP to our data warehouse. Is it working correctly? Thank you. 🙂
  • Satish Chinthanippu

    05/08/2025, 3:11 PM
    Hi Team, how can I add UTF-8 encoding support to destination connectors like Teradata? Currently, the Teradata destination connector is not able to handle the special characters ®, È, ¨, °, É, ©, », Ö, «, ¼, Ü, and the Teradata JDBC driver fails with the error "[Teradata Database] [TeraJDBC 20.00.00.43] [Error 6706] [SQLState HY000] The string contains an untranslatable character." To avoid this error I need to use UTF8 encoding.
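    For reference, the Teradata JDBC driver accepts a CHARSET connection parameter, so the fix likely amounts to getting CHARSET=UTF8 into the JDBC URL the destination builds; whether the connector exposes a pass-through field for this is an assumption to verify:
    Copy code
    # hypothetical destination config field; the driver-side setting is CHARSET=UTF8,
    # i.e. a URL of the form jdbc:teradata://<host>/CHARSET=UTF8
    jdbc_url_params: "CHARSET=UTF8"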
  • Carolina Buckler

    05/09/2025, 2:17 PM
    Hello, I found these PRs and issues in GitHub, which have been closed, about allowing query_tag in the JDBC parameters of the Snowflake destination connector. Is there a way to have it dynamically add the Airbyte connection name instead of just a hardcoded value? https://github.com/airbytehq/airbyte/pull/9623 https://github.com/airbytehq/airbyte/issues/9467
  • Thomas Bazin

    05/12/2025, 9:51 AM
    Hi, I am facing an issue with a source connector set up using the connector builder. The source API exposes an endpoint to search for customers with a filter on modification date; it only returns a collection of identifiers. The API also exposes an endpoint to read a single customer (with full details, including the modification date). So I set up my first stream as incremental, using an added property to get a cursor field, and a "child" stream, incremental too, with the following configuration for the parent stream:
    Copy code
    - type: SubstreamPartitionRouter
      parent_stream_configs:
        - type: ParentStreamConfig
          parent_key: idCustomer
          partition_field: idCustomer
          stream:
            $ref: '#/definitions/streams/Customer'
          incremental_dependency: true
    It seems to work when I test it directly in the builder, but it does not return anything when I use it in a connection. Connection logs do not show any call to the parent stream, and the state is the following:
    {"states": [], "parent_state": {"Customers": {}}, "lookback_window": 7, "use_global_cursor": false}
    What am I doing wrong?
  • Namay Jindal

    05/13/2025, 1:32 PM
    Hi everyone 👋, I'm experiencing rate limiting issues with my Luma API connector in Airbyte and would appreciate some help troubleshooting.
    Issue: My Airbyte connection to the Luma API keeps hitting rate limits, but a custom Python script I've written with similar functionality works fine without any rate limiting errors.
    Context:
    - I'm trying to sync events and event guests from the Luma API to BigQuery
    - The API has a rate limit of 290 requests per minute (as specified in my connector YAML)
    - I've included my connector configuration YAML and a working Python script below for reference
    What I've tried:
    - The Python script respects rate limits with simple sleep timers and works perfectly
    - The Airbyte connector seems to be making requests too quickly or not properly handling rate limiting errors
    Questions:
    1. How can I modify my Airbyte connector configuration to better handle rate limits?
    2. Is there a way to slow down the request rate or add more robust backoff handling?
    3. Should I consider switching to a custom connector based on the Python code that works?
    Here's my connector YAML and the Python script that works without hitting rate limits. Thanks in advance for your help! Looking forward to any insights or suggestions.
    (attached: new_events.py, lu_ma.yaml)
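    One declarative option to try (a sketch; it assumes a recent Builder/CDK version that supports the api_budget component, with the 290 requests/minute figure taken from the question) is a top-level HTTP API budget in the manifest:
    Copy code
    api_budget:
      type: HTTPAPIBudget
      policies:
        - type: MovingWindowCallRatePolicy
          rate_limits:
            - limit: 290
              interval: PT1M
          matchers: []
    Pairing this with an error handler that backs off on 429 responses may also help on the retry side.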
  • Nick Zombolas

    05/13/2025, 5:21 PM
    Hey everyone! I'm working with the Airbyte Instagram connector and have some questions. Facebook has two methods for getting Instagram insights: the Instagram API with Facebook login and the Instagram API with Instagram login. The former allows us to use a Facebook app to get insights on an IG account linked to the Facebook page, and the latter allows us to get insights via Instagram credentials alone, without a linked Facebook page. The Airbyte connector currently supports the Facebook login method, and from what I can tell, it doesn't currently support login via Instagram. There are some differences between the two approaches, like different data being accessible, different types of access tokens, and the use of the base URL graph.instagram.com or graph.facebook.com. 1. Can anybody familiar with the IG connector confirm that it would not be possible to ingest via the Instagram login method? Looking at the facebook_business Python package we are using, it uses graph.facebook.com and I didn't see any options to use the Instagram API instead. 2. If this is indeed the case, are there any future plans to include the Instagram login method? From what I can tell, there would need to be some changes in the Api stream to hit the correct initial endpoints for accounts, but the insights endpoints for Media Insights, etc. should be the same. For further context, here's the documentation overview describing the difference between the two methods: https://developers.facebook.com/docs/instagram-platform/overview Thanks!
  • Mohammad Soori

    05/14/2025, 9:34 AM
    Hi Airbyte community, I have a requirement to fetch data from the Subskribe API in Airbyte. The workflow is: 1. Fetch Plans from the Plan endpoint (https://api.app.subskribe.com/plans) 2. Extract charge-ids from all plans. 3. For each charge-id, make an API call to https://api.app.subskribe.com/charges/{{charge-id}} to extract custom fields. 4. Stream the API responses (all charge details) to a destination (e.g., Postgres). Is it possible to achieve this entirely within Airbyte using existing connectors or features? If not, can this be accomplished by building a custom source connector with the Python CDK? If so, are there examples of connectors that combine database queries (e.g., Postgres) with HTTP API calls, and any tips for handling API rate limits or optimizing performance? Thanks for any insights or pointers!
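    This pattern is usually expressible with the declarative parent/child (substream) setup rather than a custom Python connector. A sketch for the charges stream, assuming a plans stream is already defined and that the charge id field names below are placeholders:
    Copy code
    retriever:
      type: SimpleRetriever
      requester:
        type: HttpRequester
        url_base: https://api.app.subskribe.com
        path: "/charges/{{ stream_partition.charge_id }}"
      record_selector:
        type: RecordSelector
        extractor:
          type: DpathExtractor
          field_path: []
      partition_router:
        type: SubstreamPartitionRouter
        parent_stream_configs:
          - type: ParentStreamConfig
            stream:
              $ref: "#/definitions/streams/plans"
            parent_key: chargeId
            partition_field: charge_id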
  • Håkon Guttulsrud

    05/15/2025, 1:08 PM
    Hello 🙂 Does there currently exist a method of using either of these two: 1. https://en.wikipedia.org/wiki/Apache_Cassandra 2. https://docs.opensearch.org/docs/latest/about/ as sources in Airbyte? Has someone tried to achieve this previously? Thank you
  • Mathieson

    05/19/2025, 12:24 PM
    Silly question, but the docs don't seem to answer this - is it possible to use custom Docker-container-based connectors with Airbyte Cloud?
  • Erhan Tuna

    05/21/2025, 5:56 AM
    Hello, I am new to connector development. I want to write code that searches a JSON record for the value of a specific key anywhere in the JSON and, if found, adds that value as a dedicated field via Transformations.
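    If the key lives at a known path, the declarative AddFields transformation already covers this (a sketch; the field names are placeholders, and a truly recursive search across arbitrary nesting would likely need a custom transformation class instead):
    Copy code
    transformations:
      - type: AddFields
        fields:
          - path: ["special_field"]
            value: "{{ record.get('special_key') }}"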
  • dilan silva

    05/22/2025, 8:55 AM
    Hi Team, I'm seeking help to build my first source connector. I created a connector in the cloud version of Airbyte, and there the stream returns the data fine. The data comes from a public API and is JSONL content. This is the manifest it produced:
    Copy code
    version: 6.48.15
    
    type: DeclarativeSource
    
    check:
      type: CheckStream
      stream_names:
        - datasets
    
    definitions:
      streams:
        datasets:
          type: DeclarativeStream
          name: datasets
          retriever:
            type: SimpleRetriever
            requester:
              $ref: "#/definitions/base_requester"
              path: /api/export
              http_method: GET
            record_selector:
              type: RecordSelector
              extractor:
                type: DpathExtractor
                field_path: []
            decoder:
              type: JsonlDecoder
          schema_loader:
            type: InlineSchemaLoader
            schema:
              $ref: "#/schemas/datasets"
      base_requester:
        type: HttpRequester
        url_base: >-
          https://03100670-8969-4472-a593-d7a8cef4488b-00-15uejhh5ki8b9.janeway.replit.dev
    
    streams:
      - $ref: "#/definitions/streams/datasets"
    
    spec:
      type: Spec
      connection_specification:
        type: object
        $schema: http://json-schema.org/draft-07/schema#
        required: []
        properties: {}
        additionalProperties: true
    
    schemas:
      datasets:
        type: object
        $schema: http://json-schema.org/draft-07/schema#
        additionalProperties: true
        properties: {}
    Now when I try to do the same in my local connector, it does not load the data, but says the sync succeeded with 0 bytes. In the log I can see these messages (not pasting the full log here):
    Copy code
    2025-05-22 13:54:15 source ERROR Marking stream records as STARTED
    2025-05-22 13:54:15 source ERROR Syncing stream instance: records
    2025-05-22 13:54:15 source ERROR Setting state of SourceNexusDatasets stream to {}
    2025-05-22 13:54:15 source ERROR Syncing stream: records 
    2025-05-22 13:54:15 source ERROR Making outbound API request
    2025-05-22 13:54:15 source INFO Starting syncing SourceNexusDatasets
    2025-05-22 13:54:15 source INFO Marking stream records as STARTED
    2025-05-22 13:54:15 source INFO Malformed non-Airbyte record (connectionId = 034b7256-278b-4830-bba5-953c366de232): {"type": "DEBUG", "message": "Syncing stream instance: records", "data": {"message": "Syncing stream instance: records", "cursor_field": "[]", "primary_key": "None"}}
    2025-05-22 13:54:15 source INFO Setting state of SourceNexusDatasets stream to {}
    2025-05-22 13:54:15 source INFO Syncing stream: records 
    2025-05-22 13:54:15 source INFO Malformed non-Airbyte record (connectionId = 034b7256-278b-4830-bba5-953c366de232): {"type": "DEBUG", "message": "Making outbound API request", "data": {"request_body": "None", "headers": "{'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}", "url": "https://03100670-8969-4472-a593-d7a8cef4488b-00-15uejhh5ki8b9.janeway.replit.dev/api/export", "message": "Making outbound API request"}}
    2025-05-22 13:54:15 replication-orchestrator INFO Stream status TRACE received of status: STARTED for stream records
    2025-05-22 13:54:15 replication-orchestrator INFO Sending update for records - null -> RUNNING
    2025-05-22 13:54:15 replication-orchestrator INFO Stream Status Update Received: records - RUNNING
    2025-05-22 13:54:15 replication-orchestrator INFO Creating status: records - RUNNING
    2025-05-22 13:54:17 source ERROR Receiving response
    2025-05-22 13:54:17 source INFO Malformed non-Airbyte record (connectionId = 034b7256-278b-4830-bba5-953c366de232): {"type": "DEBUG", "message": "Receiving response", "data": {"headers": "{'Content-Disposition': 'attachment; filename=\"ExportData.json\"', 'Content-Type': 'application/json', 'Date': 'Thu, 22 May 2025 08:24:19 GMT', 'Replit-Cluster': 'janeway', 'X-Powered-By': 'Express', 'X-Robots-Tag': 'none, noindex, noarchive, nofollow, nositelinkssearchbox, noimageindex, none, noindex, noarchive, nofollow, nositelinkssearchbox, noimageindex', 'Transfer-Encoding': 'chunked'}", "body": "{\"id\": 1, \"name\": \"Example Item 1\", \"category\": \"Category A\", \"price\": 19.99}\n{\"id\": 2, \"name\": \"Example Item 2\", \"category\": \"Category B\", \"price\": 29.99}\n{\"id\": 3, \"name\": \"Example Item 3\", \"category\": \"Category A\", \"price\": 15.50}\n{\"id\": 4, \"name\": \"Example Item 4\", \"category\": \"Category C\", \"price\": 45.00}\n{\"id\": 5, \"name\": \"Example Item 5\", \"category\": \"Category B\", \"price\": 35.25}", "message": "Receiving response", "status": "200"}}
    2025-05-22 13:54:17 source ERROR Read 0 records from records stream
    2025-05-22 13:54:17 source ERROR Marking stream records as STOPPED
    2025-05-22 13:54:17 source ERROR Finished syncing records
    2025-05-22 13:54:17 source ERROR SourceNexusDatasets runtimes:
    2025-05-22 13:54:17 source ERROR Syncing stream records 0:00:01.715473
    2025-05-22 13:54:17 source ERROR Finished syncing SourceNexusDatasets
    I have the same manifest configuration in the local connector, except the version:
    Copy code
    version: 0.90.0
    
    type: DeclarativeSource
    
    check:
      type: CheckStream
      stream_names:
        - "records"
    
    definitions:
      streams:
        records:
          type: DeclarativeStream
          name: records
          retriever:
            type: SimpleRetriever
            requester:
              type: HttpRequester
              url_base: https://03100670-8969-4472-a593-d7a8cef4488b-00-15uejhh5ki8b9.janeway.replit.dev
              path: /api/export
              http_method: GET
            record_selector:
              type: RecordSelector
              extractor:
                type: DpathExtractor
                field_path: []
            decoder:
              type: JsonlDecoder
          schema_loader:
            type: InlineSchemaLoader
            schema:
              $ref: "#/schemas/datasets"
    
    
    streams:
      - "#/definitions/streams/records"
      
    schemas:
      datasets:
        type: object
        $schema: http://json-schema.org/draft-07/schema#
        additionalProperties: true
        properties: {}
    Can someone please help with this? I tried to change the version, but it gives an error:
    Copy code
    jsonschema.exceptions.ValidationError: The manifest version 6.48.15 is greater than the airbyte-cdk package version (0.90.0). Your manifest may contain features that are not in the current CDK version.
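    One visible difference between the two manifests (an observation, not a guaranteed fix): the cloud manifest references the stream with a $ref mapping, while the local streams list is a plain string, so the stream definition may never be resolved:
    Copy code
    streams:
      - $ref: "#/definitions/streams/records"
    Separately, JsonlDecoder may not exist in airbyte-cdk 0.90.0 at all, which would also explain zero records from a JSONL response; matching the local CDK version to what the cloud builder generates is probably the safer route.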
  • Mathieu Dumoulin

    05/22/2025, 5:53 PM
    Hi everyone, I have a question about parent streams. Given the API endpoint https://api.inscribe.ai/api/v2/customers/{customer_id}/bank_accounts/{bank_account_id}/transactions, is it possible to define two parent streams where I get customer_id from one, and then for each customer id, get the bank_account_id? I'm using the Connector Builder in Airbyte Cloud, and the result I get is that every combination of the two IDs is sent to the API, which doesn't work at all.
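    The usual approach is to chain the routers: a bank_accounts stream takes customers as its parent, and the transactions stream takes bank_accounts as its parent, so only real customer/account pairs are requested. A sketch with hypothetical stream and field names (the parent_slice lookup for the grandparent id is the part to verify):
    Copy code
    # two stream definitions, abridged
    bank_accounts:
      retriever:
        type: SimpleRetriever
        requester:
          type: HttpRequester
          path: "/customers/{{ stream_partition.customer_id }}/bank_accounts"
        partition_router:
          type: SubstreamPartitionRouter
          parent_stream_configs:
            - type: ParentStreamConfig
              stream:
                $ref: "#/definitions/streams/customers"
              parent_key: id
              partition_field: customer_id
    transactions:
      retriever:
        type: SimpleRetriever
        requester:
          type: HttpRequester
          path: >-
            /customers/{{ stream_partition.parent_slice.customer_id }}/bank_accounts/{{ stream_partition.bank_account_id }}/transactions
        partition_router:
          type: SubstreamPartitionRouter
          parent_stream_configs:
            - type: ParentStreamConfig
              stream:
                $ref: "#/definitions/streams/bank_accounts"
              parent_key: id
              partition_field: bank_account_id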
  • Hadrien Lepousé

    05/24/2025, 7:51 PM
    Hey everyone, I'm working on the HubSpot source connector; on the master branch I have lots of unit test failures. Anybody know what's wrong? I'm using Python 3.11.6. For example:
    poetry run pytest unit_tests/integrations/test_web_analytics_streams.py::TestCRMWebAnalyticsStream::test_given_two_pages_when_read_then_return_records
    Result:
    Copy code
    ======================================== short test summary info =========================================
    FAILED unit_tests/integrations/test_web_analytics_streams.py::TestCRMWebAnalyticsStream::test_given_two_pages_when_read_then_return_records[tickets_web_analytics-tickets-ticket-parent_stream_associations0] - ValueError: Invalid number of matches for `HttpRequestMatcher(request_to_match=ParseResult(scheme='https', netloc='api.hubapi.com', path='/crm/v3/objects/ticket', params='', query='archived=false&associations=contacts&associations=deals&associations=companies&limit=100&properties=closed_date,createdate', fragment='') with headers {} and body None), minimum_number_of_expected_match=1, actual_number_of_matches=0)`
    FAILED unit_tests/integrations/test_web_analytics_streams.py::TestCRMWebAnalyticsStream::test_given_two_pages_when_read_then_return_records[deals_web_analytics-deals-deal-parent_stream_associations1] - ValueError: Request ParseResult(scheme='https', netloc='api.hubapi.com', path='/crm/v3/schemas', params='', query='', fragment='') with headers {} and body None) already mocked
    FAILED unit_tests/integrations/test_web_analytics_streams.py::TestCRMWebAnalyticsStream::test_given_two_pages_when_read_then_return_records[companies_web_analytics-companies-company-parent_stream_associations2] - ValueError: Request ParseResult(scheme='https', netloc='api.hubapi.com', path='/crm/v3/schemas', params='', query='', fragment='') with headers {} and body None) already mocked
    FAILED unit_tests/integrations/test_web_analytics_streams.py::TestCRMWebAnalyticsStream::test_given_two_pages_when_read_then_return_records[contacts_web_analytics-contacts-contact-parent_stream_associations3] - ValueError: Request ParseResult(scheme='https', netloc='api.hubapi.com', path='/crm/v3/schemas', params='', query='', fragment='') with headers {} and body None) already mocked
    FAILED unit_tests/integrations/test_web_analytics_streams.py::TestCRMWebAnalyticsStream::test_given_two_pages_when_read_then_return_records[engagements_calls_web_analytics-engagements_calls-calls-parent_stream_associations4] - ValueError: Request ParseResult(scheme='https', netloc='api.hubapi.com', path='/crm/v3/schemas', params='', query='', fragment='') with headers {} and body None) already mocked
    FAILED unit_tests/integrations/test_web_analytics_streams.py::TestCRMWebAnalyticsStream::test_given_two_pages_when_read_then_return_records[engagements_emails_web_analytics-engagements_emails-emails-parent_stream_associations5] - ValueError: Request ParseResult(scheme='https', netloc='api.hubapi.com', path='/crm/v3/schemas', params='', query='', fragment='') with headers {} and body None) already mocked
    FAILED unit_tests/integrations/test_web_analytics_streams.py::TestCRMWebAnalyticsStream::test_given_two_pages_when_read_then_return_records[engagements_meetings_web_analytics-engagements_meetings-meetings-parent_stream_associations6] - ValueError: Request ParseResult(scheme='https', netloc='api.hubapi.com', path='/crm/v3/schemas', params='', query='', fragment='') with headers {} and body None) already mocked
    FAILED unit_tests/integrations/test_web_analytics_streams.py::TestCRMWebAnalyticsStream::test_given_two_pages_when_read_then_return_records[engagements_notes_web_analytics-engagements_notes-notes-parent_stream_associations7] - ValueError: Request ParseResult(scheme='https', netloc='api.hubapi.com', path='/crm/v3/schemas', params='', query='', fragment='') with headers {} and body None) already mocked
    FAILED unit_tests/integrations/test_web_analytics_streams.py::TestCRMWebAnalyticsStream::test_given_two_pages_when_read_then_return_records[engagements_tasks_web_analytics-engagements_tasks-tasks-parent_stream_associations8] - ValueError: Request ParseResult(scheme='https', netloc='api.hubapi.com', path='/crm/v3/schemas', params='', query='', fragment='') with headers {} and body None) already mocked
  • Giulliano Bueno

    06/02/2025, 8:41 AM
    I'm trying to collect data from my client's Databricks and save it to our BigQuery dataset using Airbyte Cloud. However, I've hit a roadblock with building a JDBC connector. After copying an existing connector, I need guidance on creating the repository and configuring the gradlew build process. Can anyone point me to tutorials or resources on contributing to this project, or on setting up Databricks as a source without creating a new connector?
  • Erin Yener

    06/02/2025, 11:23 AM
    Hey, community! I'm interested in forking a community connector and adding an API budget parameter. I'm using Airbyte Cloud and would love to do this in the Builder UI if possible.
    • Is it possible to add an API budget using the UI? It looks like it would be a 'new user input' in the Inputs section, but would love confirmation on how best to do this.
    • If it's best to modify the YAML directly, would anyone be able to advise me on what section of the YAML to add the API budget to? The docs have some examples, but I'm not clear on how to add this to a forked connector.
    • Are there existing connector examples that have this parameter as an optional input so that I can see how it 'fits'? Thanks for the help!
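    For the YAML route, api_budget sits at the top level of the manifest (the same level as definitions and streams). A sketch, assuming a recent CDK version; whether the limit field can be wired to a user input instead of a literal is something to verify in the Builder:
    Copy code
    api_budget:
      type: HTTPAPIBudget
      policies:
        - type: MovingWindowCallRatePolicy
          rate_limits:
            - limit: 100
              interval: PT1M
          matchers: []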
  • Jens Mostaert

    06/04/2025, 12:53 PM
    I could use some guidance in developing an Exact Online connector using the connector builder. After some effort I got the OAuth call working to the point that I receive an access_token. However, in the "testingValues" in the browser console I see this token is stored under the key "undefined", which causes issues when trying to call one of the streams after authenticating.
  • Nick Zombolas

    06/04/2025, 8:06 PM
    hey, I"m working on a connector that has some yaml streams and some python streams. I want to implement a parent stream that gets a secret from aws secrets manager and uses the results in the child streams. I've implemented this in python already and it works as a parent stream for my python stream, but I'm having trouble getting this to work in yaml streams. I've tried to set the python stream as the parent stream for all yaml streams, but i can't get the schema to validate since yaml doesn't have knowledge of the python stream during the
    spec
    job. I've been looking into how to implement this steam in yaml instead of calling my python class, but not sure the best way to get this done. any advice? thanks!
  • Mert Ors

    06/05/2025, 9:05 AM
    This PR broke all my Airtable connections: https://github.com/airbytehq/airbyte/commit/0a99c45298d53f6c42d02332e1882399d66fb84c
  • Paul

    06/05/2025, 1:35 PM
    I am beginning to learn how to modify connectors and have tried to make a small change to the Azure Blob Storage destination. I made the code change and attempted to run airbyte-ci connectors --name destination-azure-blob-storage build. But sadly, after about 30 seconds it fails. The build output HTML is empty. The dagger.log file is also empty. This is what I get in the window log below - any ideas what I've missed? I've updated the secrets as per the README, but no luck.
    Copy code
    [23:33:09] INFO     root: Setting working directory to /home/pladmin/airbyte/airbyte                                                                                                         ensure_repo_root.py:58
    [23:33:10] INFO     root: Setting working directory to /home/pladmin/airbyte/airbyte                                                                                                         ensure_repo_root.py:58
               INFO     pipelines: airbyte-ci is up to date. Installed version: 5.2.5. Latest version: 5.2.5                                                                                          auto_update.py:89
               INFO     pipelines: Called with dagger run: False                                                                                                                                      airbyte_ci.py:127
               INFO     pipelines.cli.dagger_run: Running command: ['/home/pladmin/bin/dagger', '--silent', 'run', 'airbyte-ci', 'connectors', '--name', 'destination-azure-blob-storage', 'build']   dagger_run.py:120
    [23:33:18] INFO     root: Setting working directory to /home/pladmin/airbyte/airbyte                                                                                                         ensure_repo_root.py:58
    [23:33:19] INFO     root: Setting working directory to /home/pladmin/airbyte/airbyte                                                                                                         ensure_repo_root.py:58
               INFO     pipelines: airbyte-ci is up to date. Installed version: 5.2.5. Latest version: 5.2.5                                                                                          auto_update.py:89
               INFO     pipelines: Called with dagger run: True                                                                                                                                       airbyte_ci.py:127
    [23:33:27] INFO     pipelines: Will run on the following 1 connectors: destination-azure-blob-storage.                                                                                               commands.py:32
               INFO     pipelines: Running Dagger Command build...                                                                                                                        dagger_pipeline_command.py:32
               INFO     pipelines: If you're running this command for the first time the Dagger engine image will be pulled, it can take a short minute...                                dagger_pipeline_command.py:33
               INFO     pipelines: Saving dagger logs to:                                                                                                                                 dagger_pipeline_command.py:43
                        /home/pladmin/airbyte/airbyte/airbyte-ci/connectors/pipelines/pipeline_reports/airbyte-ci/connectors/build/manual/master/1749094400/b2ffb0185be442ddf72677067d3a8
                        243fbba770f/dagger.log
               INFO     pipelines: Building connectors for ['linux/amd64'], use --architecture to change this.                                                                                           commands.py:46
               INFO     Build connector destination-azure-blob-storage: Should send status check: False                                                                                         pipeline_context.py:222
    [23:33:29] INFO     root: Using storage driver: fuse-overlayfs                                                                                                                                         docker.py:85
    [23:33:56] INFO     Build connector destination-azure-blob-storage: Caching the latest CDK version...                                                                                       pipeline_context.py:284
               INFO     Build connector destination-azure-blob-storage: Should send status check: False                                                                                         pipeline_context.py:222
               INFO     Build connector destination-azure-blob-storage - Build connector tar: 🚀 Start Build connector tar                                                                                 steps.py:303
               ERROR    Build connector destination-azure-blob-storage: An error got handled by the ConnectorContext                                                                                     context.py:253
                        ╭─────────────────────────────────────────── Traceback (most recent call last) ───────────────────────────────────────────╮
                        │ in run_connector_build_pipeline:49                                                                                       │
                        │                                                                                                                          │
                        │ in run_connector_build:33                                                                                                │
                        │                                                                                                                          │
                        │ in run_connector_build:60                                                                                                │
                        │                                                                                                                          │
                        │ in run:307                                                                                                               │
                        │                                                                                                                          │
                        │ in __aexit__:772                                                                                                         │
                        ╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
                        ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
               ERROR    Build connector destination-azure-blob-storage: No test report was provided. This is probably due to an upstream error                                                           context.py:255
    ╭──────────────────────────────────────────── DESTINATION-AZURE-BLOB-STORAGE - REPORT ────────────────────────────────────────────╮
    │        Steps results                                                                                                             │
    │ ┏━━━━━━┳━━━━━━━━┳━━━━━━━━━━┓                                                                                                     │
    │ ┃ Step ┃ Result ┃ Duration ┃                                                                                                     │
    │ ┡━━━━━━╇━━━━━━━━╇━━━━━━━━━━┩                                                                                                     │
    │ └──────┴────────┴──────────┘                                                                                                     │
    │ ℹ️  You can find more details with step executions logs in the saved HTML report.                                                │
    ╰──────────────────────────── ⏲️  Total pipeline duration for destination-azure-blob-storage: 0.36s ──────────────────────────────╯
  • Aphonso Henrique do Amaral Rafael

    06/05/2025, 5:10 PM
    Hello community, I built a custom source connector in my Airbyte that works fine when testing (it can fetch the data and display the results, so everything looks good), as shown in the first picture. However, due to requirements from my company, I cannot fetch images from remote repositories, only from the authorised registry where we store all the images we pull from. So, for instance, one of my connectors is MySQL, hence I created a "custom connector" over the original MySQL image from Airbyte, hosted it internally, and then it works. My struggle is that for this custom connector I don't have an "original image" to fetch and then host in my repository, so I am trying to identify where I can find the image of this custom connector, so I can host it in my remote registry as well - if that makes sense. When I try to use this custom connector in my sources, I see this "source-declarative-manifest" file (second picture), but I don't believe this is the "image" of the connector, which is actually what I'm looking for. Any thoughts? Thank you very much!
  • Juliette Duizabo

    06/09/2025, 4:39 PM
    Hello, I am trying to create a super simple connector to retrieve Airbyte metadata (inception). I tested this connector in the builder and it works; however, when I add the new connection in "define source" and paste the same Client Secret as the one I used in the builder, the test takes a long time (feels like ~3 minutes) and then says:
    Copy code
    Configuration check failed
    'Encountered an error while checking availability of stream sources. Error: Request URL: https://api.airbyte.com/v1/applications/token, Response Code: 500, Response Text: {"message":"Internal Server Error","_links":{"self":{"href":"/api/public/v1/applications/token","templated":false}},"_embedded":{"errors":[{"message":"Internal Server Error: class org.jboss.resteasy.client.jaxrs.engines.ManualClosingApacheHttpClient43Engine$3 cannot be cast to class io.micronaut.jaxrs.common.JaxRsMutableResponse (org.jboss.resteasy.client.jaxrs.engines.ManualClosingApacheHttpClient43Engine$3 and io.micronaut.jaxrs.common.JaxRsMutableResponse are in unnamed module of loader \'app\')","_links":{},"_embedded":{}}]}}'
    It looks like the issue is on Airbyte's side. Have any of you managed to set up the import of Airbyte metadata to get this observability in your warehouse?
  • Gergely Imreh

    06/10/2025, 12:58 PM
    Heyhey, I'm building a custom connector in the UI, and coming up short figuring out how to configure a specific query. • I have one stream that returns a bunch of IDs (let's call the field id, with values id1, id2, etc). • The child stream would need a query payload that puts those ids into a list in the request body, such as:
    Copy code
    {"input": [{"id": id1}, {"id": id2}, ....]}
    and sends off that query (it's a batch one by default). Is this possible to configure (with a parent substream like this)? Or do I have to just run a sequential list of queries with
    Copy code
    {"input": [{"id": id1}]}
    then
    Copy code
    {"input": [{"id": id2}]}
    ...and so on. This would likely work, though it would probably hit rate limits and take longer than running everything in one go. Any suggestions? 🤔
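    For the sequential fallback, the per-id request is a plain substream (a sketch; the parent stream name and id field are placeholders, and it assumes the interpolated expression is returned as a list rather than a string). Batching all parent ids into one request body is the part with no obvious built-in component:
    Copy code
    retriever:
      type: SimpleRetriever
      requester:
        type: HttpRequester
        http_method: POST
        request_body_json:
          input: "{{ [{'id': stream_partition['id']}] }}"
      partition_router:
        type: SubstreamPartitionRouter
        parent_stream_configs:
          - type: ParentStreamConfig
            stream:
              $ref: "#/definitions/streams/ids"
            parent_key: id
            partition_field: id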
  • Mike Moyer

    06/10/2025, 9:34 PM
    Hi all, I want to build a custom connector using the Python CDK to fetch data from an API where the API has the following requirements: 1. the request must be encrypted 2. the API provides an encrypted response that must then be decrypted Could a custom connector be used here to encrypt the request before sending and then decrypt the response? Or are there limitations to how custom connectors function that make this impossible/impractical? Thanks in advance for the help.
  • Anthony Smart

    06/11/2025, 1:40 PM
    Airbyte Iterable EU connector - do we know why this PR hasn't been merged? @[DEPRECATED] Marcos Marx
  • Albert Le

    06/11/2025, 6:57 PM
    Problem: Hi all, I'm using the new Builder Tool to connect to an API which has an endpoint, /studies, that returns a list of study ids. There is another sub-endpoint, /studies/{id}/availability, where {id} is a single study id. Does the new Builder Tool have an automated way of letting me call the /studies endpoint, get the list of study_ids, and use that response as the input parameter for the sub-endpoint? What I tried: I searched through the documentation but couldn't find anything for my use case.
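    This is what the parent stream / substream feature in the Builder covers; in YAML terms the availability stream would look roughly like this (a sketch; parent_key depends on the actual field name in the /studies response):
    Copy code
    retriever:
      type: SimpleRetriever
      requester:
        type: HttpRequester
        path: "/studies/{{ stream_partition.study_id }}/availability"
      partition_router:
        type: SubstreamPartitionRouter
        parent_stream_configs:
          - type: ParentStreamConfig
            stream:
              $ref: "#/definitions/streams/studies"
            parent_key: id
            partition_field: study_id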