Giridhar Gopal Vemula
11/19/2025, 3:39 PM

Tanuja
11/19/2025, 4:16 PM
Google Ads - Airbyte connector? Our pipelines have not been coming up for the past few hours.

Dan Schlosser
11/19/2025, 10:02 PM
The contacts field is all of a sudden null for all of the deals that we sync into BigQuery from Airbyte – no config changes (it is definitely selected in the Airbyte mapping UI); it is just coming in as null.

Mochamad Eka Pramudita
11/20/2025, 4:34 AM

Yuvaraj Prem Kumar
11/20/2025, 9:49 AM

Barun Pattanaik
11/20/2025, 10:12 AM

Kevin O'Keefe
11/20/2025, 6:42 PM
{
  "dataplane-client-id": "Redacted",
  "dataplane-client-secret": "Redacted==",
  "instance-admin-client-id": "Redacted",
  "instance-admin-client-secret": "Redacted=",
  "jwt-signature-secret": "Redacted=",
  "instance-admin-email": "Redacted=",
  "instance-admin-password": "Redacted"
}
This is getting properly imported into the server container. It seems that if I create any secret with the email and password, it breaks all of the other generated secrets. I have tried splitting these two into another secret and pointing the auth config at it, but then the airbyte-auth-secrets secret is not autogenerated the way it is when auth is disabled. Has anyone else had to deal with this? Sorry if this has been answered before; I searched and there was not much to go on. Here is the values file I am using:
global:
  serviceAccountName: &service-account-name airbyte-sa
  edition: community
  airbyteUrl: "airbyte.k8s.haus"
  annotations:
    argocd.argoproj.io/sync-wave: "2"
  database:
    type: external
    secretName: "airbyte-config-secrets"
    host: "10.23.1.3"
    port: 5432
    database: "postgres"
    userSecretKey: "database-user"
    passwordSecretKey: "database-password"
  auth:
    enabled: true
    secretName: "airbyte-auth2-secrets"
    instanceAdmin:
      firstName: "security"
      lastName: "admin"
      emailSecretKey: "instance-admin-email"
      passwordSecretKey: "instance-admin-password"
minio:
  enabled: false
postgresql:
  enabled: false
serviceAccount:
  create: true
  name: *service-account-name
  annotations:
    iam.gke.io/gcp-service-account: "Redacted"

Aviran Moshe
11/20/2025, 10:51 PM

Michal Krawczyk
11/21/2025, 10:30 AM

Mahmoud Mostafa
11/21/2025, 2:31 PM

Diego Dias
11/21/2025, 5:12 PM
worker:
  enabled: true
  replicaCount: 2
  maxNotifyWorkers: "20"
  maxCheckWorkers: "20"
  maxSyncWorkers: "20"
  resources:
    requests:
      cpu: "500m"
      memory: "1Gi"
    limits:
      cpu: "1000m"
      memory: "2Gi"
  extraEnv:
    - name: MAX_SYNC_WORKERS
      value: "20"
    - name: MAX_CHECK_WORKERS
      value: "20"
    - name: MAX_DISCOVERY_WORKERS
      value: "10"
    - name: MAX_SPEC_WORKERS
      value: "10"
workloadLauncher:
  enabled: true
  replicaCount: 2
  resources:
    requests:
      cpu: "500m"
      memory: "1Gi"
    limits:
      cpu: "1000m"
      memory: "2Gi"
  extraEnv:
    - name: WORKLOAD_LAUNCHER_PARALLELISM
      value: "25"

Tigran Zalyan
11/21/2025, 6:36 PM
With global.database.user set, the airbyte-bootloader fails with:
Error: couldn't find key DATABASE_USER in Secret airbyte-airbyte-secrets
The workaround was to move the value into the secret under that key.
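That workaround might look like the following sketch; the secret name and key are taken from the bootloader error message above, and the user value is a placeholder:

```yaml
# Hypothetical sketch, not a confirmed fix: adding the DATABASE_USER key
# to the secret named in the bootloader error. "airbyte" is a placeholder
# for whatever value was previously in global.database.user.
apiVersion: v1
kind: Secret
metadata:
  name: airbyte-airbyte-secrets
type: Opaque
stringData:
  DATABASE_USER: airbyte
```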
I'm also seeing another error in airbyte-server when using GCS as the storage provider:
io.micronaut.context.exceptions.BeanInstantiationException: Bean definition [io.airbyte.server.apis.controllers.SourceDefinitionApiController] could not be loaded:
Error instantiating bean of type [io.airbyte.commons.storage.GcsStorageClient]: Is a directory
The last error prevents Airbyte from starting successfully. Any ideas why this might be happening?

Dan Cook
11/21/2025, 6:37 PM

Jonathan Clemons
11/21/2025, 9:56 PM

Gaurav Jain
11/22/2025, 9:26 AM

Bogdan
11/24/2025, 9:53 AM

Samy-Alexandre LICOUR
11/24/2025, 11:50 AM
sudo abctl local install --secret secrets.yaml --values values.yaml
Here are my values and secrets files:
values.yaml
global:
  storage:
    type: "GCS"
    secretName: airbyte-config-secrets
    bucket: # GCS bucket names that you've created. We recommend storing the following all in one bucket.
      log: dev-airbyte-abctl-logs-
      state: dev-airbyte-abctl-logs
      workloadOutput: dev-airbyte-abctl-logs
    gcs:
      projectId: source-plm-dev
      credentialsJsonPath: /secrets/gcs-log-creds/gcp.json
  jobs:
    resources:
      # -- Job resource requests
      requests:
        memory: 8Gi
        cpu: 2
      # -- Job resource limits
      limits:
        cpu: 4
        memory: 16Gi
  auth:
    enabled: false
secrets.yaml:

apiVersion: v1
kind: Secret
metadata:
  name: airbyte-config-secrets
type: Opaque
stringData:
  gcp.json: "..."
I am getting this error:
Caused by: java.io.IOException: Is a directory
There seems to be a problem with credentialsJsonPath: /secrets/gcs-log-creds/gcp.json even though I followed the documentation.
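One thing that may be worth double-checking (an assumption, not a confirmed fix): `Is a directory` can occur when the mounted secret key doesn't match the filename part of the configured path, leaving the mount point resolving to a directory instead of a file. A sketch of the expected pairing:

```yaml
# Sketch: the stringData key must match the final segment of
# global.storage.gcs.credentialsJsonPath (/secrets/gcs-log-creds/gcp.json),
# and the value must be the raw service-account JSON, not a path to it.
apiVersion: v1
kind: Secret
metadata:
  name: airbyte-config-secrets
type: Opaque
stringData:
  gcp.json: |
    { "type": "service_account", "project_id": "source-plm-dev", "...": "..." }
```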
Thanks for your help!

Akshata Shanbhag
11/25/2025, 7:36 AM

kapa.ai
11/25/2025, 2:55 PM

Mihály Dombi
11/25/2025, 2:55 PM
report_task_id) in the response header that can be polled to check its status. The problem is that this Location is in the response header, but the response doesn't have a payload. Because of this, _get_creation_response_interpolation_context always fails. Also, in the Connector Builder the HTTP Response Format section doesn't have an option to indicate that the response has no body. Is there a workaround for this, or should the CDK be extended with this option?
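For illustration only, the desired behavior could be sketched outside the CDK as a small helper that tolerates an empty body and reads only the Location header (the names below are hypothetical, not CDK APIs):

```python
# Hypothetical sketch, not Airbyte CDK code: extract the polling URL for an
# async report when the creation response carries only a Location header and
# an empty body, as with eBay's createReportTask endpoint.

def extract_polling_url(status_code: int, headers: dict, body: bytes) -> str:
    """Return the URL to poll, tolerating an empty response body."""
    if status_code not in (201, 202):
        raise RuntimeError(f"unexpected status {status_code}")
    location = headers.get("Location") or headers.get("location")
    if not location:
        raise RuntimeError("creation response carried no Location header")
    # An empty body is fine here: nothing ever tries to parse JSON out of it,
    # which is what trips up _get_creation_response_interpolation_context.
    return location

url = extract_polling_url(
    201,
    {"Location": "https://api.ebay.com/sell/marketing/v1/ad_report_task/123"},
    b"",
)
print(url)
```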
The report creation endpoint in question: https://developer.ebay.com/api-docs/sell/marketing/resources/ad_report_task/methods/createReportTask#request.dateFrom

Lucas Segers
11/25/2025, 2:56 PM

Dyllan Pascoe
11/25/2025, 9:04 PM

Rob Kwark
11/26/2025, 1:39 AM

Kevin O'Keefe
11/26/2025, 4:00 AM

JadperNL
11/26/2025, 9:03 AM

Isaac Steele
11/26/2025, 3:45 PM
I'm getting different results between requests.get() and the python airbyte-api on sources for my source configuration information.
For example, when I run a GET on ...v1/sources, my sources have connection information like "configuration":{"auth_type":"Client","client_id":"myclient123456789","is_sandbox":true,"client_secret":"**********","refresh_token":"**********","streams_criteria":[],"stream_slice_step":"P30D","force_use_bulk_api":false}.
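As an illustration only (the payload shape below is an assumption about the list-sources REST response, mirroring the example above), the raw JSON can be post-processed directly instead of going through the typed client:

```python
# Sketch: read (masked) source configurations straight from the raw
# /v1/sources JSON payload. The sample dict is hypothetical and mirrors
# the shape shown in the message above.
import json

def configs_by_source(payload: dict) -> dict:
    """Map sourceId -> (masked) configuration dict from a list-sources response."""
    return {s["sourceId"]: s.get("configuration", {}) for s in payload.get("data", [])}

sample = {
    "data": [
        {
            "sourceId": "src-1",
            "configuration": {
                "auth_type": "Client",
                "client_secret": "**********",
                "is_sandbox": True,
            },
        }
    ]
}
print(json.dumps(configs_by_source(sample)["src-1"]))
```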
But when I use the python API get_source() or list_sources() the configuration information returned is just: configuration=SourceAirtable(credentials=None, SOURCE_TYPE=<SourceAirtableAirtable.AIRTABLE: 'airtable'>) .
This example is from my Salesforce connector, but the same is true for my other connectors as well: they all come back as the Airtable configuration. Is this a python-api bug? How can I get/view my masked source configuration data from the python API?

Nikita
11/26/2025, 4:36 PM

Francis Carr
11/26/2025, 5:00 PM
I'm using airbyte/source-snowflake on version 1.0.8. The source table within Snowflake is very large, and it seems I am hitting a credentials timeout of around 6 hours.
What seems to be happening:
1. Airbyte makes a query for all data from the Snowflake table and this is staged internally within an S3 bucket
2. Airbyte then starts transferring this data out of the bucket piece-by-piece sending it from the Source to the Destination, the destination being BigQuery
3. After 6 hours or so, sometimes longer, we run into a 403 Forbidden on reading the data and the job within Airbyte runs another attempt.
a. After 5 attempts of around 6 hours each, it gives up and reports the job as a failure.
b. When the job starts again we seem to start from the very beginning; even though the job has written a lot of records to the BigQuery destination, it doesn't seem to save where it has got to in the Source. The sync mode is Incremental | Append with a Date as the Cursor
Has anyone had similar experiences with this type of Snowflake timeout? It sounds like something that could be configured on the Snowflake side, but we aren't sure which setting would control this. There is documentation around the 6-hour timeout in Using Persisted Query Results from Snowflake, but it doesn't say much beyond "A new token can be retrieved to access results while they are still in cache."
We are also not sure what we can do on our side, as this seems to be internal to the JDBC driver that reads from Snowflake. Maybe there is a JDBC query param we could add? We know we can break up this data, and there are lots of workarounds like that, but they don't address the core issue, which can arise again. Any recommendations would be greatly appreciated! 🙏

Alex Tasioulis
11/26/2025, 5:27 PM

Baudilio García Hernández
11/26/2025, 5:54 PM