# ask-ai

    Júlia Lemes

    08/22/2025, 2:22 PM
    @Deepak Kamble How do I downgrade the Airbyte version?

    Lisha Zhang

    08/22/2025, 2:53 PM
    @kapa.ai I want to use the Account ID filter in Pinterest. However, in the Airbyte UI, when I enter a valid account ID that I can access, I get this error message: "Configuration check failed: Forbidden. You don't have permission to access this resource."

    Kiet Luu (Ken)

    08/22/2025, 3:13 PM
    @kapa.ai what should I put in the PyAirbyte config dict given the following YAML? Wait for my next prompt please, I'll send the code.

    Renu Fulmali

    08/22/2025, 4:18 PM
    Hi @kapa.ai I need to increase these env vars:
    MAX_SYNC_WORKERS: 50    # Example: allows 50 concurrent syncs per worker
    MAX_CHECK_WORKERS: 10
    MAX_SPEC_WORKERS: 10
    MAX_DISCOVER_WORKERS: 10
    TEMPORAL_WORKER_PORTS: 80    # Sum of MAX_*_WORKERS (50+10+10+10)
    I am currently using the Helm chart of Airbyte.
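The inline comment in Renu's snippet encodes a constraint worth double-checking: TEMPORAL_WORKER_PORTS is described as the sum of the MAX_*_WORKERS values. A trivial sketch, using only the numbers from the message:

```python
# Sanity-check the arithmetic from the message above: the comment says
# TEMPORAL_WORKER_PORTS should equal the sum of the MAX_*_WORKERS values.
workers = {
    "MAX_SYNC_WORKERS": 50,
    "MAX_CHECK_WORKERS": 10,
    "MAX_SPEC_WORKERS": 10,
    "MAX_DISCOVER_WORKERS": 10,
}
temporal_worker_ports = sum(workers.values())
print(f"TEMPORAL_WORKER_PORTS = {temporal_worker_ports}")  # prints 80
```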

    yingting

    08/22/2025, 4:20 PM
    @kapa.ai point me to the source code that throws this error on postgres source connector:
    io.airbyte.cdk.integrations.source.relationaldb.state.FailedRecordIteratorException: java.lang.RuntimeException: java.lang.RuntimeException: org.postgresql.util.PSQLException: ERROR: feature not supported on beam relations

    Lukas Heinz

    08/22/2025, 4:27 PM
    @kapa.ai Clockify connector returns
    An unknown error occurred. (HTTP 504)

    Todd Matthews

    08/22/2025, 4:39 PM
    I need help troubleshooting the bootloader getting stuck in version 1.6.5

    Júlia Lemes

    08/22/2025, 7:18 PM
    @kapa.ai I'm receiving this error when trying to install airbyte: unable to install airbyte chart: unable to install helm: failed pre-install: 1 error occurred: * pod airbyte-abctl-bootloader failed

    Júlia Lemes

    08/22/2025, 7:33 PM
    @kapa.ai How do I reset local credentials to log in to Airbyte?

    Cody Redmond

    08/22/2025, 7:46 PM
    Hi there -- does Airbyte provide a Data Privacy Agreement that I can source for my own GDPR compliance?

    Júlia Lemes

    08/22/2025, 8:20 PM
    @kapa.ai I'm trying to set up sources and destinations and receiving this error: Airbyte is temporarily unavailable. Please try again. (HTTP 502)

    Steve Caldwell

    08/22/2025, 8:41 PM
    @kapa.ai - I have a connection using the Hubspot source that has a current sync in progress. This sync has been processing for the past 100 days and still hasn't finished, producing over 4 billion rows so far from 2 years of data, and it still has over 6 months of data to sync. This doesn't seem feasible. Could there be a problem with my sync settings? Is there a way to backup the raw__stream table in the airbyte_internal schema for this stream? If I stop the sync, will it damage the raw__stream table? Does airbyte gradually load records directly to the final table in my destination so that if I stopped the sync, I'll still keep the records that have already been loaded?

    Júlia Lemes

    08/22/2025, 9:11 PM
    @kapa.ai I'm trying to create sources and destinations but am receiving an error. I installed this way: abctl local install --chart-version 1.6.1 --insecure-cookies. Error: ERROR i.a.c.s.e.h.IdNotFoundExceptionHandler(handle):33 - Not found exception class NotFoundKnownExceptionInfo { id: null message: Id not found: Could not find configuration for STANDARD_WORKSPACE: 257cfeba-801f-4bfc-a046-c7e1eb63854f. exceptionClassName: io.airbyte.commons.server.errors.IdNotFoundKnownException exceptionStack: [io.airbyte.commons.server.errors.IdNotFoundKnownException: Id not found: Could not find configuration for STANDARD_WORKSPACE: 257cfeba-801f-4bfc-a046-c7e1eb63854f

    Júlia Lemes

    08/22/2025, 9:22 PM
    @kapa.ai I've tried installing Airbyte version 1.8 but couldn't set up sources or destinations and kept receiving "Airbyte is temporarily unavailable. Please try again. (HTTP 502)". I had previously installed version 1.6 but had to uninstall using the --persisted flag and rm -rf ~/.airbyte/abctl, and then reinstalled version 1.8. I noticed that all pods were running and didn't find any errors in the log, though the check and write pods weren't appearing there. I then uninstalled again using the --persisted flag and rm -rf ~/.airbyte/abctl and reinstalled Airbyte using this command: abctl local install --chart-version=1.6.1 --insecure-cookies. But it's still returning the same error when trying to connect to the source, and I found this in the server log: ERROR i.a.c.s.e.h.IdNotFoundExceptionHandler(handle):33 - Not found exception class NotFoundKnownExceptionInfo { id: null message: Id not found: Could not find configuration for STANDARD_WORKSPACE: 257cfeba-801f-4bfc-a046-c7e1eb63854f. exceptionClassName: io.airbyte.commons.server.errors.IdNotFoundKnownException exceptionStack: [io.airbyte.commons.server.errors.IdNotFoundKnownException: Id not found: Could not find configuration for STANDARD_WORKSPACE: 257cfeba-801f-4bfc-a046-c7e1eb63854f.

    Júlia Lemes

    08/22/2025, 10:00 PM
    @kapa.ai My sync has been running for 20 minutes, is this correct?

    Hari Haran R

    08/23/2025, 5:55 AM
    @kapa.ai I want to build a CDK connector for Acumatica; here the APIs work with cookies. How do I build a CDK connector for this?

    INISH KASHYAP

    08/23/2025, 10:37 AM
    I'm experiencing persistent installation failures with abctl on AWS EC2 and would appreciate any guidance or insights.
    Environment Setup:
    • Instance: AWS EC2 t3.large (2 vCPUs, 8GB RAM, 45GB storage)
    • OS: Amazon Linux 2023
    • Docker: 25.0.8
    • abctl: 0.30.1
    • Region: ap-south-1 (India)
    The Problem: Every abctl local install attempt fails at the exact same point, during the nginx/ingress-nginx Helm chart installation. The process runs for 75+ minutes before timing out.
    Command used:
    abctl local install --host myairbytezin.duckdns.org --insecure-cookies --port 8000
    Error Pattern:
    1. ✅ Cluster creation succeeds
    2. ✅ Initial setup completes
    3. ❌ Gets stuck at: Installing 'nginx/ingress-nginx' (version: 4.13.1) Helm Chart
    4. ❌ Repeated timeout errors:
    W0823 04:39:38.002418 13320 reflector.go:561] failed to list *unstructured.Unstructured: Get "https://127.0.0.1:34281/apis/batch/v1/namespaces/ingress-nginx/jobs": dial tcp 127.0.0.1:34281: i/o timeout
    Resources Confirmed Sufficient:
    • Storage: 36GB free (21% usage)
    • Memory: 7.0GB available (only 387MB used)
    • Docker: Healthy with 6.7GB reclaimable space

    Thomas Niederberger

    08/23/2025, 11:32 PM
    I'm trying to build a connector for Walmart using the Cloud version and run into the issue that I cannot have a header like "WM_SVC.NAME" because of the use of a period.

    Poorna Premachandra

    08/24/2025, 3:28 AM
    @kapa.ai I'm getting the following error on the second sync of a connection that has MySQL as the source and Snowflake as the destination.
    [config_error] MySQL Connector Error: The sync encountered an unexpected error in the change event producer and has stopped. Please check the logs for details and troubleshoot accordingly.
    The connection is set to sync via CDC. The initial sync completes in 12 hours with 375 GB loaded. Binlogs are available for 16 hours. After the initial sync completed, the second sync ran and failed after 1 hour. In the source connection, I have set "initial loadout time" to 16 hours and "concurrency" to 2 as well. What could be the issue here?

    Stav Hans

    08/24/2025, 9:27 AM
    When an incremental sync fails, will the next sync start from the start time defined in the connector, or from the last incremental point?

    Ofek Eliahu

    08/24/2025, 11:42 AM
    @kapa.ai I am creating a Python connector and I want to print my secret refresh token for testing. Why, when I print it, is the output ** Token refresh token: ** <class 'str'>, even though there is a value that works at runtime? How can I retrieve the value itself and not the * characters?
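For context on the masking Ofek describes: Airbyte hides fields marked `airbyte_secret` when they appear in log or console output, but the underlying Python string still holds the real value. A minimal sketch (the config shape and token value here are hypothetical stand-ins) that confirms the secret is populated without echoing it:

```python
# Minimal sketch; the config shape and token value are hypothetical.
# Airbyte masks `airbyte_secret` fields in log/console output, but the
# string object itself is unchanged, so a fingerprint can confirm the
# value is present without exposing it.
import hashlib

config = {"credentials": {"refresh_token": "example-refresh-token"}}  # stand-in

token = config["credentials"]["refresh_token"]
assert isinstance(token, str) and token  # a normal, non-empty string

# Log a fingerprint instead of the raw secret:
digest = hashlib.sha256(token.encode()).hexdigest()[:12]
print(f"refresh token present: length={len(token)}, sha256 prefix={digest}")
```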

    Noam Moskowitz

    08/24/2025, 11:44 AM
    @kapa.ai How can i avoid getting this error?
    io.grpc.StatusRuntimeException: RESOURCE_EXHAUSTED: grpc: received message larger than max (4194365 vs. 4194304)

    Ofek Eliahu

    08/24/2025, 1:12 PM
    @kapa.ai I am using the single-use refresh token for OAuth in my Python connector. When I first initialize my source it works and the OAuth check succeeds, but it doesn't save the values to the config for future use. It sets them in memory but doesn't persist them for future runs. This is my configuration; is there something wrong? I used the same configuration for another source that I created and everything worked.
    basic_auth = BasicHttpAuthenticator(
        username=credentials["client_id"],
        password=credentials["client_secret"],
        config={},
        parameters={},
    )

    return SingleUseRefreshTokenOauth2Authenticator(
        connector_config=config,
        token_refresh_endpoint="https://app.gong.io/oauth2/generate-customer-token",
        client_id=credentials["client_id"],
        client_secret=credentials["client_secret"],
        access_token_config_path=("access_token",),
        refresh_token_config_path=("refresh_token",),
        token_expiry_date_config_path=("token_expiry_date",),
        refresh_request_body={
            "access_type": "offline",
        },
        refresh_request_headers=basic_auth.get_auth_header(),
    )

    Temitope Sam

    08/24/2025, 2:32 PM
    @kapa.ai whenever I create a connection from any source to MongoDB as the destination, it always appends "airbyte_raw_"... how do I stop it?

    kapa.ai

    08/24/2025, 7:16 PM
    The error message you're seeing:
    > AirbyteConnectorInstallationError: The provided manifest requires additional code files (class_name key detected). This feature is not compatible with the declarative YAML executor. To use this executor, please try again with the Python executor.
    means that your connector's manifest references a Python class (via the class_name key), which requires actual Python code to be executed. The declarative YAML executor in Airbyte is designed to run connectors defined purely in YAML, without any custom Python code. When a manifest references a class, it expects the Python executor to be used, which can interpret and run Python code.
    What does this mean for you?
    - If your connector (like source-mailchimp) uses the class_name key or otherwise requires Python code, you cannot use it with the declarative YAML executor.
    - You need to use the Python executor, which is capable of running connectors that include both YAML and Python code.
    What should you do?
    - If you want to use the declarative YAML executor, ensure your connector is fully declarative (YAML only, no class_name or custom Python code).
    - If your connector requires Python code, run it with the Python executor instead.
    This limitation is by design and is not a bug. The knowledge sources do not provide a workaround for using Python-based connectors with the declarative YAML executor, so you must use the appropriate executor for your connector type. If you need to use your connector in environments like PyAirbyte or Jupyter, and it requires Python code, you must ensure the environment supports the Python executor, not just the declarative YAML executor. For purely declarative connectors, you can use the YAML executor as described in the [PyAirbyte tutorial](https://airbyte.com/tutorials/how-to-add-custom-source-to-pyairbyte-using-the-no-code-builder). If you need more details on how to structure your connector or switch executors, please provide more context or refer to the official Airbyte documentation.

    Colin

    08/24/2025, 7:16 PM
    @kapa.ai AirbyteConnectorInstallationError: The provided manifest requires additional code files (class_name key detected). This feature is not compatible with the declarative YAML executor. To use this executor, please try again with the Python executor. Connector Name: 'source-mailchimp'

    Colin

    08/24/2025, 7:33 PM
    @kapa.ai Why does PyAirbyte, when using source-mailchimp, say "ModuleNotFoundError: No module named 'source_declarative_manifest'"?

    Hari Haran R

    08/25/2025, 6:14 AM
    @kapa.ai [ { "id": "6d9698ce-09cb07381", "rowNumber": 1, "note": { "value": "" }, "AccountRef": {}, "AutoApplyPayments": { "value": false }, "BAccountID": { "value": 787 }, "BillingAddressOverride": { "value": false }, "BillingContactOverride": { "value": false }, "CreatedDateTime": { "value": "2024-02-22T001226.91+00:00" }, "CreditLimit": { "value": 0 }, "CurrencyID": { "value": "USD" }, "CurrencyRateType": {}, "CustomerCategory": { "value": "I" }, "CustomerClass": { "value": "INTERCOMP" }, "CustomerID": { "value": "" }, "CustomerName": { "value": "" }, "Email": {}, "EnableCurrencyOverride": { "value": false }, "EnableRateOverride": { "value": false }, "EnableWriteOffs": { "value": false }, "FOBPoint": {}, "LastModifiedDateTime": { "value": "2024-02-22T001226.91+00:00" },
    This is an Acumatica API response for the customer stream. Since these fields are nested, each field is getting stored as its own table in Snowflake. I'm using Airbyte 0.44.5 and I built the connector using Connector Builder. I know this is happening because of normalization, and normalization is happening due to the VARIANT data type. I want the data to be stored in a single customer table instead of getting normalized.

    Fabrizio Spini

    08/25/2025, 7:35 AM
    @kapa.ai I'm running Airbyte OSS 1.2.0 (self-hosted) with the MySQL Source Connector v3.11.1 (CDC mode with Debezium embedded). I'm encountering a problem where Airbyte replicates UPDATE events to BigQuery with all columns as NULL values, even when the actual updated record in MySQL has non-null fields. I analyzed the binlog and noticed that Debezium is not including unchanged fields in the payload (as expected by default). I've learned that setting emit.unchanged.fields = true in Debezium would solve this, but this parameter is not exposed in the Airbyte UI for MySQL Source v3.11.1.
    My Questions:
    1. Is there a supported way in OSS to enable emit.unchanged.fields in v3.11.1?
    2. Is there any plan to expose emit.unchanged.fields in the Airbyte UI (or env vars) for OSS users?

    Matheus Dantas

    08/25/2025, 8:29 AM
    After upgrading to version 1.8, I have some connections finishing with success but showing this error in the logs:
    2025-08-25 08:03:43 platform ERROR Stage Pipeline Exception: io.airbyte.workload.launcher.pipeline.stages.model.StageError: java.lang.RuntimeException: Init container for Pod: pods did not complete successfully. Actual termination reason: Error.
    message: java.lang.RuntimeException: Init container for Pod: pods did not complete successfully. Actual termination reason: Error.
    stackTrace: [Ljava.lang.StackTraceElement;@5e394e0a
    2025-08-25 08:03:43 platform INFO Attempting to update workload: d749198f-04d4-49e7-8560-2ccaec19c5c2_25675_0_sync to FAILED.
    2025-08-25 08:03:43 platform INFO Pipeline aborted after error for workload: d749198f-04d4-49e7-8560-2ccaec19c5c2_25675_0_sync.
    2025-08-25 08:03:43 platform INFO 
    ----- START POST REPLICATION OPERATIONS -----
    
    2025-08-25 08:03:43 platform INFO No post-replication operation(s) to perform.
    2025-08-25 08:03:43 platform INFO 
    ----- END POST REPLICATION OPERATIONS -----