Michal Krawczyk
11/21/2025, 10:30 AM
Mahmoud Mostafa
11/21/2025, 2:31 PM
Diego Dias
11/21/2025, 5:12 PM
worker:
  enabled: true
  replicaCount: 2
  maxNotifyWorkers: "20"
  maxCheckWorkers: "20"
  maxSyncWorkers: "20"
  resources:
    requests:
      cpu: "500m"
      memory: "1Gi"
    limits:
      cpu: "1000m"
      memory: "2Gi"
  extraEnv:
    - name: MAX_SYNC_WORKERS
      value: "20"
    - name: MAX_CHECK_WORKERS
      value: "20"
    - name: MAX_DISCOVERY_WORKERS
      value: "10"
    - name: MAX_SPEC_WORKERS
      value: "10"
workloadLauncher:
  enabled: true
  replicaCount: 2
  resources:
    requests:
      cpu: "500m"
      memory: "1Gi"
    limits:
      cpu: "1000m"
      memory: "2Gi"
  extraEnv:
    - name: WORKLOAD_LAUNCHER_PARALLELISM
      value: "25"
Tigran Zalyan
11/21/2025, 6:36 PM
When setting global.database.user, the airbyte-bootloader fails with:
Error: couldn't find key DATABASE_USER in Secret airbyte-airbyte-secrets
The workaround was to move the value into the secret under that key.
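Concretely, the workaround amounts to something like this (the DATABASE_USER key and the airbyte-airbyte-secrets name come from the error message above; the user value shown is a placeholder):

```yaml
apiVersion: v1
kind: Secret
metadata:
  name: airbyte-airbyte-secrets
type: Opaque
stringData:
  DATABASE_USER: airbyte   # value moved here from global.database.user
```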
I'm also seeing another error in airbyte-server when using GCS as the storage provider:
io.micronaut.context.exceptions.BeanInstantiationException: Bean definition [io.airbyte.server.apis.controllers.SourceDefinitionApiController] could not be loaded:
Error instantiating bean of type [io.airbyte.commons.storage.GcsStorageClient]: Is a directory
The last error prevents Airbyte from starting successfully. Any ideas why this might be happening?
Dan Cook
11/21/2025, 6:37 PM
Jonathan Clemons
11/21/2025, 9:56 PM
Gaurav Jain
11/22/2025, 9:26 AM
Bogdan
11/24/2025, 9:53 AM
Samy-Alexandre LICOUR
11/24/2025, 11:50 AM
sudo abctl local install --secret secrets.yaml --values values.yaml
Here are my values and secrets files:
values.yaml
global:
  storage:
    type: "GCS"
    secretName: airbyte-config-secrets
    bucket: # GCS bucket names that you've created. We recommend storing the following all in one bucket.
      log: dev-airbyte-abctl-logs-
      state: dev-airbyte-abctl-logs
      workloadOutput: dev-airbyte-abctl-logs
    gcs:
      projectId: source-plm-dev
      credentialsJsonPath: /secrets/gcs-log-creds/gcp.json
  jobs:
    resources:
      ## Example:
      requests:
        memory: 8Gi
        cpu: 2
      # -- Job resource requests
      #requests: {}
      ## Example:
      limits:
        cpu: 4
        memory: 16Gi
      # -- Job resource limits
      #limits: {}
  auth:
    enabled: false
secrets.yaml:
apiVersion: v1
kind: Secret
metadata:
  name: airbyte-config-secrets
type: Opaque
stringData:
  gcp.json: "..."
I am getting this error:
Caused by: java.io.IOException: Is a directory
There seems to be a problem with credentialsJsonPath: /secrets/gcs-log-creds/gcp.json even though I followed the documentation.
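For context on the "Is a directory" error: Kubernetes mounts a Secret as a directory whose keys become files, so the configured path must be <mountPath>/<key>, not the mount point itself. A generic sketch of the mechanics (names illustrative, not the chart's actual manifest):

```yaml
volumes:
  - name: gcs-log-creds
    secret:
      secretName: airbyte-config-secrets    # each key becomes a file
containers:
  - name: server
    volumeMounts:
      - name: gcs-log-creds
        mountPath: /secrets/gcs-log-creds   # this path is a directory
# the credentials file itself is /secrets/gcs-log-creds/gcp.json;
# pointing a client at the directory instead raises "Is a directory"
```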
Thanks for your help!
Akshata Shanbhag
11/25/2025, 7:36 AM
kapa.ai
11/25/2025, 2:55 PM
Mihály Dombi
11/25/2025, 2:55 PM
report_task_id) in the response header that can be polled to check its status. The problem is that this Location is in the response header, but the response doesn't have a payload. Because of this, _get_creation_response_interpolation_context always fails. Also, in the Connector Builder the HTTP Response Format section doesn't have an option indicating that the response has no body. Is there a workaround for this, or should the CDK be extended with this option?
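Outside the CDK, the pattern being described can be sketched like this (illustrative only, not an existing CDK option; the URL, auth header, and Location shape are assumptions based on the endpoint discussed here):

```python
import urllib.request


def create_report_task(url: str, token: str, body: bytes) -> str:
    """POST the report-task creation request; the response body is empty,
    so only the Location header is read (never parsed as JSON)."""
    req = urllib.request.Request(
        url, data=body, method="POST",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return resp.headers["Location"]


def report_task_id(location: str) -> str:
    """Extract the trailing task id from a Location header value."""
    return location.rstrip("/").rsplit("/", 1)[-1]
```

The polling step would then GET the Location URL until the task reports completion.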
The report creation endpoint in question: https://developer.ebay.com/api-docs/sell/marketing/resources/ad_report_task/methods/createReportTask#request.dateFrom
Lucas Segers
11/25/2025, 2:56 PM
Dyllan Pascoe
11/25/2025, 9:04 PM
Rob Kwark
11/26/2025, 1:39 AM
Kevin O'Keefe
11/26/2025, 4:00 AM
JadperNL
11/26/2025, 9:03 AM
Isaac Steele
11/26/2025, 3:45 PM
I'm using requests.get() and the python airbyte-api on sources for my source configuration information.
For example, when I run a GET on ...v1/sources, my sources have connection information like "configuration":{"auth_type":"Client","client_id":"myclient123456789","is_sandbox":true,"client_secret":"**********","refresh_token":"**********","streams_criteria":[],"stream_slice_step":"P30D","force_use_bulk_api":false}.
But when I use the python API get_source() or list_sources(), the configuration information returned is just: configuration=SourceAirtable(credentials=None, SOURCE_TYPE=<SourceAirtableAirtable.AIRTABLE: 'airtable'>).
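As a workaround sketch (not an official airbyte-api feature): hit the REST endpoint directly with the standard library and inspect the raw configuration dict, plus a small helper to list which fields came back masked. The URL shape and Bearer auth are assumptions based on the GET described above:

```python
import json
import urllib.request


def get_raw_source_config(base_url: str, token: str, source_id: str) -> dict:
    """Fetch one source from the raw REST API and return its configuration
    dict, bypassing typed SDK models that drop unrecognized fields."""
    req = urllib.request.Request(
        f"{base_url}/v1/sources/{source_id}",
        headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("configuration", {})


def masked_fields(configuration: dict) -> list:
    """Keys whose values the API returned masked (a run of asterisks)."""
    return [k for k, v in configuration.items()
            if isinstance(v, str) and v != "" and set(v) == {"*"}]
```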
This example is from my Salesforce connector, but the same is true for my other connectors as well; they all come back with Airtable as the configuration. Is this a python-api bug? How can I get/view my masked source configuration data from the python API?
Nikita
11/26/2025, 4:36 PM
Francis Carr
11/26/2025, 5:00 PM
We are running airbyte/source-snowflake on version 1.0.8. The source table within Snowflake is very large and it seems I am hitting a credentials timeout of around 6 hours.
What seems to be happening;
1. Airbyte makes a query for all data from the Snowflake table and this is staged internally within an S3 bucket
2. Airbyte then starts transferring this data out of the bucket piece-by-piece sending it from the Source to the Destination, the destination being BigQuery
3. After 6 hours or so, sometimes longer, we run into a 403 Forbidden on reading the data and the job within Airbyte runs another attempt.
a. After 5 attempts of around 6 hours each, it gives up and reports the sync as a failure.
b. When the job starts again we seem to start from the very beginning; even though the job has written a lot of records to the BigQuery destination, it doesn't seem to save where it has got to in the Source. The sync mode is Incremental | Append with a Date as the Cursor
Has anyone had similar experiences with this type of Snowflake timeout? It sounds like something that could be configured on the Snowflake side, but we aren't sure which configuration would control this. There is documentation around the 6-hour timeout in "Using Persisted Query Results" from Snowflake, but it doesn't say much beyond "A new token can be retrieved to access results while they are still in cache."
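For the "break up the data" workaround mentioned here: one shape of it is slicing the cursor range into windows small enough that each query finishes well inside the roughly 6-hour result lifetime (purely illustrative; this is not an Airbyte or Snowflake setting):

```python
from datetime import date, timedelta


def date_windows(start: date, end: date, days: int):
    """Yield (lo, hi) half-open date windows covering [start, end),
    each at most `days` long, for issuing smaller per-window queries."""
    lo = start
    while lo < end:
        hi = min(lo + timedelta(days=days), end)
        yield lo, hi
        lo = hi
```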
We are also not sure what we can do on our side, as this seems to be internal to the JDBC driver that reads from Snowflake. Maybe there is a JDBC query param we could add? We know we can break up this data, and there are lots of workarounds like that, but that still avoids the core issue, which can arise again. Any recommendations would be greatly appreciated!
Alex Tasioulis
11/26/2025, 5:27 PM
Baudilio García Hernández
11/26/2025, 5:54 PM
Chanakya Pendem
11/27/2025, 8:46 AM
Linas Leščinskas
11/27/2025, 12:51 PM
I ran sudo abctl local uninstall and sudo abctl local install. In the process Airbyte was upgraded from 0.22x to 0.3x.
However, I still cannot run any syncs. When I run a sync, it just hangs indefinitely, generating zero logs. When I try to replicate the connection, I encounter "Airbyte is temporarily unavailable. Please try again." (HTTP 502) in the Select streams stage.
I tried kicking workload launchers and restarting cluster, but this yields no tangible results.
P. S. I have made a machine image so I can start anew on a fresh VM (with expired certs).
Any guidance appreciated!
Fernand Ramat
11/27/2025, 7:29 PM
io.airbyte.cdk.TransientErrorException: Input was fully read, but some streams did not receive a terminal stream status message. If the destination did not encounter other errors, this likely indicates an error in the source or platform. Streams without a status message: [import_mongodb_hh_production_product2_v2 ......]
I am not sure how I could fix this pipeline.
Alex Johnson
11/28/2025, 3:32 AM
Alex Johnson
11/28/2025, 5:28 AM
Prabhu Agarwal
11/28/2025, 8:53 AM
source_declarative_manifest.components.
I have checked for source_declarative_manifest traces in the other streams, but that's not the case there. Not sure why this error is occurring.
How can I fix this issue? Any leads would be highly appreciated.
Thanks
Leon Kozlowski
11/29/2025, 4:09 PM
Sam Riggleman
11/29/2025, 5:37 PM