# help-connector-development
  • j

    Jens Mostaert

    07/10/2025, 10:29 AM
    I'm running into an issue when configuring the api_budget section of my low code Exact Online connector. Exact returns the ratelimit_reset_header as milliseconds instead of seconds. Is there a way to configure airbyte to divide the value by 1000?
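    I'm not aware of a built-in unit conversion. For reference, a minimal sketch of the block in question, assuming the documented `HTTPAPIBudget` fields; the CDK takes the reset header value as seconds, which is exactly the mismatch described (the header name here is illustrative):
    ```
    api_budget:
      type: HTTPAPIBudget
      # Assumption: Exact Online's actual header name. The CDK reads this
      # value as seconds, while Exact returns milliseconds.
      ratelimit_reset_header: "X-RateLimit-Reset"
      status_codes_for_ratelimit_hit: [429]
      policies:
        - type: MovingWindowCallRatePolicy
          rates:
            - limit: 60
              interval: "PT1M"
          matchers: []
    ```
    Without a unit-conversion hook, a custom component, or leaving `ratelimit_reset_header` unset and relying on the policy's own window, looks like the workaround.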
  • d

    Dhiraj Bhalerao

    07/10/2025, 10:50 AM
    Getting the following error when saving schema changes in Airbyte Cloud: "An unknown error occurred. (HTTP 409)"
  • s

    Sunil Jimenez

    07/10/2025, 7:25 PM
    @Ian Alton @[DEPRECATED] Marcos Marx Hello guys! The site is down. Can't even open a ticket.
  • d

    Damon Gudaitis

    07/10/2025, 10:35 PM
    I'm trying to use the Declarative Oauth2.0. All I want to do is to use the client_credentials grant_type but add an Authorization header that base64 encodes the client ID and secret separated by a colon. There's an example that seems to do this in the documentation.
    Example with the header value encoded into a base64 string:
    ```
    --- secret_header_manifest.yml
    +++ secret_header_manifest.yml
    spec:
      https://yourconnectorservice.com/oauth/consent?client_id={{client_id_value}}&redirect_uri={{ redirect_uri_value }}&state={{ state }}
      access_token_url: >-
    -   https://yourconnectorservice.com/oauth/token?client_id={{client_id_value}}&client_secret={{client_secret_value}}&code={{auth_code_value}}
    +   https://yourconnectorservice.com/oauth/token?client_id={{client_id_value}}&code={{auth_code_value}}
    + access_token_headers:
    -   SECRETHEADER: "{{ client_secret_value }}"
    +   SECRETHEADER: "{{ (client_id_value ~ ':' ~ client_secret_value) | b64encode }}"
    complete_oauth_output_specification:
      required:
    ```
    I've tried that code and various variations on it, including just hard-coding an authorization header (`Authorization: testval`), but no matter what I do, I can't get any custom header to appear.
    ```
    base_requester:
        type: HttpRequester
        url_base: <snip>
        authenticator:
          type: OAuthAuthenticator
          client_id: "{{ config[\"client_id\"] }}"
          grant_type: client_credentials
          client_secret: "{{ config[\"client_secret\"] }}"
          access_token_value: "{{ config[\"client_access_token\"] }}"
          refresh_request_body: {}
          token_refresh_endpoint: https://api-platform.cvent.com/ea/oauth2/token
          access_token_headers:
            Authorization: >-
              Basic {{ (config[\"client_id\"] ~ ':' ~ config[\"client_secret\"] ) |
              b64encode }}
    ```
    Regardless of what I put in `access_token_headers`, nothing below changes.
    ```
    {
      "url": "<snip although I don't think anyone cares>",
      "body": "grant_type=client_credentials&client_id=****&client_secret=****",
      "headers": {
        "User-Agent": "python-requests/2.32.4",
        "Accept-Encoding": "gzip, deflate",
        "Accept": "*/*",
        "Connection": "keep-alive",
        "Content-Length": "139",
        "Content-Type": "application/x-www-form-urlencoded"
      },
      "http_method": "POST"
    }
    ```
    The values were first populated by the OAuth Legacy connector since that one is the closest to what I want to do. I must be misunderstanding something with the Declarative OAuth2.0.
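    For what it's worth, `access_token_headers` in that docs example belongs to the spec-level Declarative OAuth flow (`advanced_auth`), not to the `OAuthAuthenticator` that makes the token request at sync time, which may be why the header never appears. A hedged sketch of an alternative, assuming your CDK version's `OAuthAuthenticator` supports `refresh_request_headers` (check the declarative schema for your version):
    ```
    authenticator:
      type: OAuthAuthenticator
      grant_type: client_credentials
      client_id: "{{ config['client_id'] }}"
      client_secret: "{{ config['client_secret'] }}"
      token_refresh_endpoint: https://api-platform.cvent.com/ea/oauth2/token
      # Assumption: refresh_request_headers is honored by your CDK version;
      # it adds headers to the token request itself.
      refresh_request_headers:
        Authorization: "Basic {{ (config['client_id'] ~ ':' ~ config['client_secret']) | b64encode }}"
    ```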
  • m

    Morgan Kerle

    07/10/2025, 11:41 PM
    Hey Folks, does anyone have a good idea of how Airbyte's schema evolution works? The docs state that only non-breaking changes should be actioned, but we found that in some circumstances Airbyte's schema evolution will remove columns, which seems to contradict this.
  • b

    Bohdan Stadnyk

    07/11/2025, 3:39 PM
    Hey everyone, I'm running into an issue with `api_budget` and rate limiting. It's either a bug where `api_budget` doesn't work, or confusion on my part about its functionality. Given the Bitbucket rate limits of 1k requests per 1h rolling window, you would expect this config to never hit rate limits:
    ```
    requester:
        type: HttpRequester
        url_base: "https://api.bitbucket.org/2.0/"
        http_method: GET
        authenticator:
          type: BasicHttpAuthenticator
          username: "{{ config['email'] }}"
          password: "{{ config['api_token'] }}"
        error_handler:
          type: DefaultErrorHandler
          max_retries: 100
          backoff_strategies:
            - type: ExponentialBackoffStrategy
              factor: 5
          response_filters:
            - type: HttpResponseFilter
              action: RETRY
              http_codes: [429, 500, 502, 503, 504]
    ....
    api_budget:
      type: HTTPAPIBudget
      status_codes_for_ratelimit_hit: [429]
      policies:
        - type: MovingWindowCallRatePolicy
          rates:
            - limit: 15
              interval: "PT1M"
            - limit: 900
              interval: "PT1H"
          matchers:
            - method: "GET"
              url_base: "https://api.bitbucket.org/2.0/"

    concurrency_level:
      type: ConcurrencyLevel
      default_concurrency: 1
      max_concurrency: 1
    ```
    However, in the output logs you can clearly see that Airbyte hits the rate limits:
    ```
    {"type":"LOG","log":{"level":"INFO","message":"Retrying. Sleeping for 10.0 seconds"}}
    {"type":"LOG","log":{"level":"INFO","message":"Backing off _send(...) for 0.0s (airbyte_cdk.sources.streams.http.exceptions.UserDefinedBackoffException: Too many requests.)"}}
    {"type":"LOG","log":{"level":"INFO","message":"Retrying. Sleeping for 10.0 seconds"}}
    {"type":"LOG","log":{"level":"INFO","message":"Backing off _send(...) for 0.0s (airbyte_cdk.sources.streams.http.exceptions.UserDefinedBackoffException: Too many requests.)"}}
    {"type":"LOG","log":{"level":"INFO","message":"Retrying. Sleeping for 20.0 seconds"}}
    {"type":"LOG","log":{"level":"INFO","message":"Backing off _send(...) for 0.0s (airbyte_cdk.sources.streams.http.exceptions.UserDefinedBackoffException: Too many requests.)"}}
    {"type":"LOG","log":{"level":"INFO","message":"Retrying. Sleeping for 40.0 seconds"}}
    {"type":"LOG","log":{"level":"INFO","message":"Backing off _send(...) for 0.0s (airbyte_cdk.sources.streams.http.exceptions.UserDefinedBackoffException: Too many requests.)"}}
    {"type":"LOG","log":{"level":"INFO","message":"Retrying. Sleeping for 20.0 seconds"}}
    {"type":"LOG","log":{"level":"INFO","message":"Backing off _send(...) for 0.0s (airbyte_cdk.sources.streams.http.exceptions.UserDefinedBackoffException: Too many requests.)"}}
    {"type":"LOG","log":{"level":"INFO","message":"Retrying. Sleeping for 10.0 seconds"}}
    {"type":"LOG","log":{"level":"INFO","message":"Backing off _send(...) for 0.0s (airbyte_cdk.sources.streams.http.exceptions.UserDefinedBackoffException: Too many requests.)"}}
    {"type":"LOG","log":{"level":"INFO","message":"Retrying. Sleeping for 160.0 seconds"}}
    ```
    Does anyone know what could be wrong, or is this expected behaviour? I found a few threads on this topic, but all of them were left unanswered; any help would be appreciated. Yes, I read the docs, and I searched the repo for examples, but I couldn't find the reason behind this.
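    One thing worth noting: `MovingWindowCallRatePolicy` only counts calls made by the current sync process, so requests from a previous run inside Bitbucket's 1h rolling window are invisible to it and can still trigger 429s. A hedged tweak, assuming the documented `HTTPAPIBudget` header fields, that lets the budget update itself from the server's own rate-limit headers when present:
    ```
    api_budget:
      type: HTTPAPIBudget
      status_codes_for_ratelimit_hit: [429]
      # Assumption: Bitbucket returns these headers; names are illustrative.
      ratelimit_remaining_header: "X-RateLimit-Remaining"
      ratelimit_reset_header: "X-RateLimit-Reset"
      policies:
        - type: MovingWindowCallRatePolicy
          rates:
            # Leave headroom below the real 1k/h limit for calls made
            # outside this process's window.
            - limit: 500
              interval: "PT1H"
          matchers:
            - method: "GET"
              url_base: "https://api.bitbucket.org/2.0/"
    ```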
  • j

    James Carter

    07/11/2025, 11:49 PM
    Greetings! 😄 I am evaluating Airbyte for use at my place of work, and I decided to build custom source & destination connectors using Python. I created my own git repo from scratch and got the code written, with passing unit tests and integration tests. My next step is to package and deploy to a locally running instance of Airbyte (installed via abctl). I see conflicting information in the documentation about whether airbyte-ci is deprecated or not. But, I forged ahead and created a little docker container from the Airbyte Repo to build airbyte-ci and I can now run it over my own repo via a directory mount. The tool is telling me that my connector is not one of the bundled connectors and errors out. How can I package & deploy my own (python) connectors, from my own repo, to my local instance of Airbyte?
  • a

    Anudhyan Datta LT-23

    07/14/2025, 4:30 AM
    Hi community, I need your help! I recently graduated with a BTech in leather technology from Government College of Engineering and Leather Technology. I recently got a placement in a company (role: software engineer trainee) through campus, but after joining they told me there would be a 2-year bond, and that if I left before 2 years I would have to pay an amount of 2 lakh. I was asked to leave as I did not agree to sign the bond, and I am actively seeking employment. I already have the skills, and I have been a part of the Airbyte community for a long time. Please feel free to DM me any opportunities/jobs you might have! Regards, Anudhyan
  • f

    Forrest Hicks

    07/14/2025, 5:33 PM
    👋 Hi everyone, I’m working on internal documentation for rollback and recovery procedures on Airbyte Cloud, and I’d love some clarity on a few things:
    1. Is there any way to manually roll back a connection to a previous working state (e.g. after a breaking schema change or version update)?
    2. Does Airbyte Cloud offer any form of automated backups or version history for connection configs or sync states?
    3. What are the current best practices around version control for connections on Airbyte Cloud?
    I’ve already reviewed the official docs on connector updates and state management, but I’d appreciate any additional insights or tips from the team or community, especially for production-critical use cases 🙏 Thanks in advance!
  • v

    Victor K.

    07/14/2025, 7:22 PM
    Hi Team 👏, I have a connection that pulls data from an API to a third-party platform. However, the response does not include any date fields. In this case, is there a way to set up incremental sync? Please advise. Thanks in advance! 🙏
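    If the response truly has no date or increasing ID, there is nothing for an incremental cursor to read. One common workaround (a sketch, not an official feature) is to stamp records at extraction time and deduplicate in the destination; `AddFields` and the `now_utc()` macro are documented low-code components:
    ```
    transformations:
      - type: AddFields
        fields:
          # Stamp each record with the extraction time. Streams can then
          # at least be deduplicated downstream, though the API is still
          # read in full on each sync.
          - path: ["extracted_at"]
            value: "{{ now_utc() }}"
    ```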
  • b

    Blake

    07/14/2025, 9:06 PM
    @Natik Gadzhi @Maxime Carbonneau-Leclerc @Alexandre Girard (Airbyte) Hey folks - I'm looking into putting together a PR for: log http calls to api; I need HTTPS and logging to work for a low-code connector so I can debug some parent calls (source-track-pms). The contents would of course be sensitive info, but it would be cool to make it accessible for others who also need to debug their stuff. Totally understand if this isn't a high priority, but I'm open to doing some grunt work if we know what constraints are needed. Open to taking the conversation offline too if that's worthwhile. Thanks!
  • a

    Alex Johnson

    07/15/2025, 3:39 AM
    Hi guys, I’m trying to create my first custom connector and can’t figure out the consent and access token URLs. The API guide is here: https://docs.podium.com/docs/oauth. Also, how do I see the consent URL that is being generated and sent? I can’t seem to locate this, so it’s very hard to troubleshoot the issue. I just get an HTTP 500 error. Any help is greatly appreciated! Thanks! Alex
  • e

    Eric Hendrickson

    07/15/2025, 7:18 PM
    @kapa.ai just updated to 1.7.1 and I'm getting Forbidden in the Connector Builder UI (basically it defaults me to the YAML and won't let me switch back to UI). The only error that I can find is Forbidden. Any ideas? (including here in case others have seen this)
  • j

    juhani connolly

    07/16/2025, 6:09 AM
    Heya, I understand that airbyte-ci is kind of deprecated now, but I'm trying to verify that our custom Python CDK connector will work on 1.7 before doing the upgrade. I'm working in WSL with Podman (with kind for the test cluster), and for some reason airbyte-ci is trying to access docker-credential-desktop.exe (even though I've removed Docker entirely and everything is aliased/symlinked to Podman):
    ```
    The above exception was the direct cause of the following exception:

    ╭──────── Traceback (most recent call last) ─────────╮
    │ in invoke:48                                       │
    │ in invoke:1485                                     │
    │ in invoke:824                                      │
    │ in build:74                                        │
    │ in run_connectors_pipelines:107                    │
    │ in start:12                                        │
    │ in start:7334                                      │
    │ in execute:162                                     │
    ╰────────────────────────────────────────────────────╯
    QueryError: resolve: failed to resolve image docker.io/library/docker:24-dind:
    failed to resolve source metadata for docker.io/library/docker:24-dind:
    failed to get credentials: error getting credentials -
    err: exec: "docker-credential-desktop.exe": executable file not found in $PATH, out: ``
    INFO     pipelines: Dagger logs saved to                        dagger_pipeline_command.py:56
    /mnt/c/Users/A12527/Workspaces/airbyte/airbyte-ci/connectors/pipelines/pipeline_reports/airbyte-ci/connectors/build/manual/main/1752612580/105533dd3155234b7294c50deb62060decf051bc/dagger.log
    ```
    Any ideas as to the cause? (Kind of a long shot, as I guess WSL + Podman really weirds things out.)
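    Not Airbyte-specific, but that `docker-credential-desktop.exe` lookup usually comes from a leftover `credsStore` entry that Docker Desktop wrote into `~/.docker/config.json`, and WSL still reads that file after Docker is removed. Deleting the `"credsStore": "desktop.exe"` line (a common fix, worth verifying for your setup) leaves something like:
    ```
    {
      "auths": {}
    }
    ```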
  • k

    Kamil Maruta

    07/16/2025, 1:44 PM
    Hi everyone, I'm building a connector using the Connector Builder (Airbyte version 6.33.4) to fetch a list of workspace IDs from the Power BI Admin API endpoint `/admin/workspaces/modified`. The API response is a simple JSON object:
    ```
    { "status": 200, "body": [ { "id": "abc-123" }, { "id": "def-456" } ], "headers": { } }
    ```
    My goal is to extract the array of objects from the `body` field in the API response and collect all `id` values into a single list. I don’t want each `id` to be treated as a separate partition. Instead, I want to group the `id`s into batches (e.g., 80 per request) and pass each batch as an array in the request body to downstream endpoints like `getInfo` and `scanResult`. I attempted to use the following `record_selector`:
    ```
    record_selector:
      type: RecordSelector
      field_path: ["body"]
    ```
    or `body.*`. However, this results in an empty list, and I get the warning "No records could be extracted from the response. Check the Response tab to see the raw response. If this looks correct, make sure the Record Selector is configured correctly." What is the correct and schema-compliant way to extract a top-level array (like `body`) from the JSON response? Any working example would be greatly appreciated. Thanks
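    A hedged guess at the fix: in the declarative schema, `RecordSelector` takes the path via a nested `extractor`, not a top-level `field_path`. A sketch, assuming `body` really is a top-level array in the raw response:
    ```
    record_selector:
      type: RecordSelector
      extractor:
        type: DpathExtractor
        # "body" selects the array; each element becomes one record.
        field_path: ["body"]
    ```
    Batching the resulting `id`s 80-per-request would still need custom partitioning, though; the stock partition routers emit one partition per parent record.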
  • j

    Jens Mostaert

    07/17/2025, 11:28 AM
    I'm getting 500 errors on Airbyte Cloud from the "Test and save" button on Sources. I already created a support ticket yesterday, but no response so far. Does anyone know what's going on?
    ```
    POST /api/v1/sources/update

    {
        "message": "Internal Server Error: io.grpc.StatusRuntimeException: ALREADY_EXISTS: Secret [projects/332657581931/secrets/airbyte_workspace_67db3dcd-eb80-424c-9ae2-e42d654f28bc_secret_78eadfdf-dde2-45e9-b287-d3f37b8c5ee3_v7084] already exists.",
        "exceptionClassName": "com.google.api.gax.rpc.AlreadyExistsException",
        "exceptionStack": [],
        "rootCauseExceptionStack": []
    }
    ```
  • m

    Martin Andonov

    07/17/2025, 6:54 PM
    Hi All, can you please help me set up the Builder configuration correctly to achieve the pull below?

    Goal: I have a response that contains a nested array in a column called `vendorItems`. I want to pull only the values in the nested array, without the remaining data.

    Approach: I'm attempting to use the Record Selector in the Builder UI to drill into the nested array. Below is my current setup:
    ```
    type: RecordSelector
    extractor:
      type: DpathExtractor
      field_path:
        - '*'
        - result
        - '*'
        - results
    ```
    Outcome: I get the error "No records could be extracted from the response. Check the Response tab to see the raw response. If this looks correct, make sure the Record Selector is configured correctly."

    Response: Below is a sample of the API response for reference:
    ```
    { "status": 200, "body": [
      { "vendorId": "928b864a-9656-4dea-822f-d7cc4e94430f", "contactName": "", "currencyId": "88285281-a77e-4e96-ad51-6cfa2d0bbd31", "customFields": { "custom1": "", "custom2": "", "custom3": "", "custom4": "", "custom5": "", "custom6": "", "custom7": "", "custom8": "", "custom9": "", "custom10": "" }, "defaultAddressId": null, "defaultCarrier": "", "defaultPaymentMethod": "", "defaultPaymentTermsId": null, "discount": "0.00", "email": "", "fax": "", "isActive": true, "isTaxInclusivePricing": false, "lastModifiedById": "89aebc56-db26-4ee8-8325-0b9518519e94", "lastModifiedDttm": "2024-02-12T215104.2057167+00:00", "leadTimeDays": null, "name": "1000Bulbs", "phone": "", "remarks": "", "taxingSchemeId": null, "timestamp": "0000000004CA71B7", "website": "", "vendorItems": [] },
      { "vendorId": "6677fe2e-081f-485b-baf0-e0de68787cd6", "contactName": "", "currencyId": "88285281-a77e-4e96-ad51-6cfa2d0bbd31", "customFields": { "custom1": "", "custom2": "", "custom3": "", "custom4": "", "custom5": "", "custom6": "", "custom7": "", "custom8": "", "custom9": "", "custom10": "" }, "defaultAddressId": null, "defaultCarrier": "", "defaultPaymentMethod": "", "defaultPaymentTermsId": null, "discount": "0.00", "email": "", "fax": "", "isActive": true, "isTaxInclusivePricing": false, "lastModifiedById": "89aebc56-db26-4ee8-8325-0b9518519e94", "lastModifiedDttm": "2023-02-28T195126.5667782+00:00", "leadTimeDays": null, "name": "220 Electronics", "phone": "", "remarks": "", "taxingSchemeId": null, "timestamp": "0000000003EF04F6", "website": "", "vendorItems": [] },
      { "vendorId": "07c9f0e3-d7f5-41cc-93c8-aee3a5913d2b", "contactName": "", "currencyId": "88285281-a77e-4e96-ad51-6cfa2d0bbd31", "customFields": { "custom1": "", "custom2": "", "custom3": "", "custom4": "", "custom5": "", "custom6": "", "custom7": "", "custom8": "", "custom9": "", "custom10": "" }, "defaultAddressId": null, "defaultCarrier": "", "defaultPaymentMethod": "", "defaultPaymentTermsId": null, "discount": "0.00", "email": "", "fax": "", "isActive": true, "isTaxInclusivePricing": false, "lastModifiedById": "d5da1d41-953c-4896-a41c-2e133e3dc6d0", "lastModifiedDttm": "2016-07-21T134201.62+00:00", "leadTimeDays": null, "name": "220togo", "phone": "", "remarks": "", "taxingSchemeId": null, "timestamp": "000000000027A931", "website": "", "vendorItems": [
        { "vendorItemId": "a322e7d1-fe27-4926-95e9-5ec342f6a4a1", "cost": "0.00000", "leadTimeDays": null, "lineNum": null, "productId": "1d2d5fa8-3e05-4113-8b3e-1612880dae53", "timestamp": "0000000001520AC6", "vendorId": "07c9f0e3-d7f5-41cc-93c8-aee3a5913d2b", "vendorItemCode": "" },
        { "vendorItemId": "039ea565-cd6e-4c8c-9b02-3e12e77b6d65", "cost": "69.95000", "leadTimeDays": null, "lineNum": null, "productId": "450b5f66-a3d6-4ba4-bbe5-8acd02bfda7f", "timestamp": "000000000027D277", "vendorId": "07c9f0e3-d7f5-41cc-93c8-aee3a5913d2b", "vendorItemCode": "" },
        { "vendorItemId": "a4df002e-e812-4423-8a41-637d62c241d3", "cost": "0.00000", "leadTimeDays": null, "lineNum": null, "productId": "7bdd5145-e6df-42dc-8be9-e9513cad9942", "timestamp": "0000000001520613", "vendorId": "07c9f0e3-d7f5-41cc-93c8-aee3a5913d2b", "vendorItemCode": "" },
        { "vendorItemId": "d78e43ec-2365-4409-be10-da34006c5ae9", "cost": "0.00000", "leadTimeDays": null, "lineNum": null, "productId": "1515cb1b-9bf8-4f91-932d-ed903402bee4", "timestamp": "0000000001520643", "vendorId": "07c9f0e3-d7f5-41cc-93c8-aee3a5913d2b", "vendorItemCode": "" },
        { "vendorItemId": "40664f08-56f2-4b8a-a843-e4df9d23d637", "cost": "129.00000", "leadTimeDays": null, "lineNum": 1, "productId": "a2dd5938-9421-435f-b1ff-34711c2180aa", "timestamp": "00000000034DCB12", "vendorId": "07c9f0e3-d7f5-41cc-93c8-aee3a5913d2b", "vendorItemCode": "Whirlpool 25 Liter microwave with Grill #WMGO9SDE" }
      ] },
    ```
  • l

    Lorenzo

    07/17/2025, 10:21 PM
    I'm using the SFTP Bulk connector. How do I add a new stream to an existing connector?
  • p

    Paul Houghton

    07/18/2025, 12:52 PM
    Is anyone building a Cloudflare R2 connector?
  • m

    Mukesh Kumar

    07/19/2025, 5:18 AM
    Hello everyone, we just installed open-source Airbyte on our machine. We're encountering an issue with the BigQuery connector: the table is created in the `airbyte_internal` dataset, but it is not appearing in the final dataset.
  • b

    Bilal Mubasher

    07/21/2025, 8:48 AM
    Hi everyone, I am trying to connect using the S3 connector with a Parquet file as a source, but it's showing a "please wait a bit more" progress bar. Below is my JSON configuration; can anyone please help with what the issue is? Is it a configuration issue?
    ```
    {
      "name": "S3",
      "workspaceId": "21897ffe-476a-4a13-a808-587f7134ef47",
      "definitionId": "69589781-7828-43c5-9f63-8925b1c1ccc2",
      "configuration": {
        "format": { "encoding": "utf8", "filetype": "csv", "delimiter": ",", "block_size": 10000, "quote_char": "\"", "double_quote": true, "infer_datatypes": true, "newlines_in_values": false },
        "schema": "{}",
        "streams": [
          {
            "globs": ["**/*.parquet"],
            "format": { "filetype": "parquet", "decimal_as_float": false },
            "schemaless": false,
            "validation_policy": "Emit Record",
            "days_to_sync_if_history_is_full": 3,
            "name": "s3-stream"
          }
        ],
        "provider": {},
        "delivery_method": { "delivery_type": "use_records_transfer" },
        "bucket": "costsage-ai",
        "aws_access_key_id": "*****",
        "aws_secret_access_key": "*****"
      }
    }
    ```
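    One thing that stands out (an observation, not a confirmed diagnosis): the top-level `format` block describes CSV while the stream declares Parquet; in current source-s3 versions the per-stream `format` is what counts and the top-level block is a legacy leftover. A trimmed sketch of the configuration section under that assumption:
    ```
    {
      "bucket": "costsage-ai",
      "aws_access_key_id": "*****",
      "aws_secret_access_key": "*****",
      "streams": [
        {
          "name": "s3-stream",
          "globs": ["**/*.parquet"],
          "format": { "filetype": "parquet", "decimal_as_float": false },
          "validation_policy": "Emit Record",
          "days_to_sync_if_history_is_full": 3,
          "schemaless": false
        }
      ]
    }
    ```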
  • b

    Bilal Mubasher

    07/21/2025, 8:57 AM
    Internal message: "Failed to find output files from connector within timeout of 9 minute(s). Is the connector still running?" Is this an Airbyte error? I'm getting the above error after 9 minutes.
  • b

    Bilal Mubasher

    07/21/2025, 1:33 PM
    Hi everyone, what's the maximum file size that Airbyte can successfully process as a source?
  • r

    Rafael Capelo

    07/21/2025, 5:59 PM
    Could someone help me implement the OAuth 2.0 authentication process for the Bling API? For some reason, I'm having issues during the redirection to the Airbyte callback endpoint. I don't know if it's due to some configuration in the connector or if it has something to do with the application not being served over HTTPS. If someone has the YAML or could guide me, it would be very helpful. Here’s the link to the authentication flow: https://developer.bling.com.br/aplicativos#fluxo-de-autoriza%C3%A7%C3%A3o
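    A hedged sketch of the token-refresh side, assuming Bling's standard authorization-code flow from the linked docs. The endpoint URL and any Basic-auth requirement on the token request are placeholders to verify against that page, and the consent/redirect leg is configured separately from the authenticator:
    ```
    authenticator:
      type: OAuthAuthenticator
      grant_type: refresh_token
      client_id: "{{ config['client_id'] }}"
      client_secret: "{{ config['client_secret'] }}"
      refresh_token: "{{ config['refresh_token'] }}"
      # Placeholder: confirm the real token URL in Bling's docs.
      token_refresh_endpoint: https://www.bling.com.br/Api/v3/oauth/token
    ```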
  • l

    Lorenzo

    07/23/2025, 4:02 AM
    Has anyone had issues adding streams to the SFTP Bulk connector? I continue to run into this error
  • g

    Gergely Imreh

    07/23/2025, 7:33 AM
    Heyhey, I am developing against an API that I need to query for each calendar date to provide the right data, but it doesn't provide any cursor-like object that would let me use incremental sync to do this. In particular, I have a start date, say `2025-06-01`, and from there I have to run requests with the query parameter `from_date` set to `2025-06-01` and `to_date` set to `2025-06-02`; after this I have to query again, setting `2025-06-02` and `2025-06-03`, and so on up until today.
    • Is this possible with incremental sync (if there's no cursor on the API, but the connector would keep its own "up to this date" value)? This would probably be best, meaning the fastest sync.
    • Or is it possible through pagination?
    Or would I need to build a more custom connector for this? Cheers!
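    This is the shape `DatetimeBasedCursor` is built for: a one-day `step` slices the range from the start date to now into windows, and the connector keeps its own "up to this date" state, no API cursor needed. A sketch, assuming day-granularity dates and a config-supplied start date; the window overlap semantics may need tuning to match the API's inclusive/exclusive behaviour:
    ```
    incremental_sync:
      type: DatetimeBasedCursor
      # Assumption: the slice date serves as the cursor field for state.
      cursor_field: from_date
      datetime_format: "%Y-%m-%d"
      cursor_granularity: P1D
      step: P1D                      # one request per calendar day
      start_datetime: "{{ config['start_date'] }}"
      end_datetime: "{{ today_utc() }}"
      start_time_option:
        type: RequestOption
        inject_into: request_parameter
        field_name: from_date
      end_time_option:
        type: RequestOption
        inject_into: request_parameter
        field_name: to_date
    ```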
  • r

    Rubén Laguna

    07/24/2025, 7:34 AM
    Is there any way in the Connector Builder to drop all fields except X and Y? From the YAML Reference for RemoveFields, it seems like I need to explicitly list all fields. My use case is that I just want to get a single field from an API, and I don't want to pick up new fields if they update the API.
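    As far as I know there's no keep-only transformation in the declarative framework; `RemoveFields` is subtractive, so the documented form really does mean listing each field to drop. A sketch of that form (field names are placeholders); whether the optional `condition` parameter on `RemoveFields` can express "everything except X" is something I haven't verified:
    ```
    transformations:
      - type: RemoveFields
        field_pointers:
          - ["field_to_drop_a"]
          - ["nested", "field_to_drop_b"]
    ```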
  • a

    Ander Aranburu

    07/25/2025, 7:59 AM
    Hi everyone! I’m working on a custom Salesforce destination using airbyte-cdk==6.60.0 and Airbyte Platform v7.2.0. I posted this message in this merged GitHub PR, and I’d love to better understand the context behind the fix that helped me avoid this error: `State message does not contain id`. If anyone is familiar with how/why this issue arises, I’d really appreciate your insight. A few specific questions I’m trying to figure out:
    • Was this issue introduced by a recent change in the Airbyte Protocol or CDK?
    • Is it related to the source connector at all, or only to how the destination handles state messages?
    • Why don’t other destinations using CDK 6.x (e.g. destination-sqlite) seem to be affected?
    Thanks in advance for any help you can share! 🙏
  • a

    Alexei Kozhushkov

    07/25/2025, 9:05 AM
    Hello, how are you? I have a local `airbyte` install and it works great with MySQL, MSSQL and Postgres. But I can't make the `connector builder` work; e.g. even the simplest unauthenticated request to `https://restcountries.com/v3.1/all?fields=name` fails with a `Draft can't be saved` error. Please advise 🙏
  • m

    Michael Hernandez

    07/25/2025, 5:28 PM
    Hi team, does anyone have experience customizing connectors to feed output from one stream into another? As my concrete example, the JIRA connector stream for `issue_remote_links` only supports full refresh, which makes sense as Atlassian didn't include a date or ID column that could be used for incremental syncing. If I could use the keys from the `issues` stream sync, I could achieve something similar by fetching only particular issues.
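    In the low-code CDK, this parent-to-child pattern is what `SubstreamPartitionRouter` provides: each record of a parent stream becomes a partition for the child stream. A sketch, assuming an `issues_stream` definition elsewhere in the manifest (the actual JIRA connector is Python, where the equivalent is the `HttpSubStream` parent/child pattern):
    ```
    retriever:
      type: SimpleRetriever
      partition_router:
        type: SubstreamPartitionRouter
        parent_stream_configs:
          - type: ParentStreamConfig
            stream: "#/definitions/issues_stream"  # assumption: defined elsewhere
            parent_key: id                         # field read from each parent record
            partition_field: issue_id              # exposed as stream_partition.issue_id
    ```
    The child stream's `path` can then interpolate `{{ stream_partition.issue_id }}` to fetch remote links per issue.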