Slackbot
09/11/2025, 6:24 AM
Karthik
09/11/2025, 6:27 AM
curl --location 'http://localhost:8080/api/v1/applications/token' \
--header 'Content-Type: application/json' \
--data '{"client_id":"<placeholder_client_id>","client_secret":"<placeholder_client_secret>"}'
Johannes Müller
09/11/2025, 11:06 AM
/api/v1/workspaces/list endpoint and were surprised by suddenly getting a 404 after the upgrade.
Lisha Zhang
09/11/2025, 2:25 PM
kapa.ai
09/11/2025, 4:38 PM
Mfundo Radebe
09/11/2025, 8:26 PM
Stephen Kim
09/11/2025, 8:56 PM
Asha Ravilla
09/11/2025, 10:54 PM
Rommel
09/12/2025, 7:26 AM
GET /v1/sources, but when I try those on self-hosted (e.g. https://my-airbyte/api/v1/sources), I get:
{
  "message": "Forbidden",
  "_links": {
    "self": {
      "href": "/api/v1/sources"
    }
  },
  "_embedded": {
    "errors": [
      {
        "message": "Forbidden"
      }
    ]
  }
}
If I switch to the RPC-style call (POST /api/v1/sources/list with a workspaceId), it works.
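Concretely, the call that works looks like this (same host as above, token redacted; the workspaceId value is a placeholder):
curl --location 'https://my-airbyte/api/v1/sources/list' \
--header 'Content-Type: application/json' \
--header 'Authorization: ••••••' \
--data '{"workspaceId": "<your-workspace-id>"}'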
This raises a few questions:
• Are the docs at reference.airbyte.com intended only for Cloud?
• For self-hosted OSS, should we always be using the RPC-style endpoints like /api/v1/sources/list, /api/v1/connections/create, etc.?
• Is there a plan to unify the Cloud and OSS APIs in the future?
Also, when I tried to create a Google Search Console source via API, I hit this error:
curl --location 'https://dev.airbyte.jepto.com/api/v1/sources/create' \
--header 'Content-Type: application/json' \
--header 'Authorization: ••••••' \
--data '{
  "configuration": {
    "sourceType": "google-search-console",
    "authorization": {
      "auth_type": "Client",
      "client_id": "...redacted...",
      "client_secret": "...redacted...",
      "refresh_token": "...redacted..."
    },
    "site_urls": ["https://www.mysite.com/"],
    "start_date": "2021-05-01"
  },
  "name": "Sample GSC Connection",
  "sourceDefinitionId": "eb4c9e00-db83-4d63-a386-39cfa91012a8",
  "workspaceId": "48d59dac-ba6e-466f-8cff-9f2a291cfa1c"
}'
The response I got was:
{
  "message": "Internal Server Error: null",
  "exceptionClassName": "java.lang.NullPointerException",
  "exceptionStack": [],
  "rootCauseExceptionStack": []
}
Is this related to a mismatch between the Cloud docs and the OSS API, or am I formatting the sources/create payload incorrectly for OSS?
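For reference, this is the payload shape I'd try next if the internal config API expects connectionConfiguration rather than the public API's configuration + sourceType wrapper; that's my guess from comparing the two API styles, not something I've confirmed in the docs:
curl --location 'https://dev.airbyte.jepto.com/api/v1/sources/create' \
--header 'Content-Type: application/json' \
--header 'Authorization: ••••••' \
--data '{
  "name": "Sample GSC Connection",
  "sourceDefinitionId": "eb4c9e00-db83-4d63-a386-39cfa91012a8",
  "workspaceId": "48d59dac-ba6e-466f-8cff-9f2a291cfa1c",
  "connectionConfiguration": {
    "authorization": {
      "auth_type": "Client",
      "client_id": "...redacted...",
      "client_secret": "...redacted...",
      "refresh_token": "...redacted..."
    },
    "site_urls": ["https://www.mysite.com/"],
    "start_date": "2021-05-01"
  }
}'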
Kamil M
09/12/2025, 11:40 AM
Lillian Jiang
09/12/2025, 6:59 PM
AddFields transformations on a substream. The substream returns a JSON object as expected, but it doesn't add any fields - not even a static one.
Setup:
• DeclarativeStream with SubstreamPartitionRouter
• Parent stream provides composite_key via partition_field
• Simple AddFields transformation to add a static test_field
What's not working:
transformations:
  - type: AddFields
    fields:
      - path: ["test_field"]
        value: "hello world?"
The test_field never appears in the output records, even though:
• Schema includes the field with the correct type
• additionalProperties: true is set
• Field is not in the required array
• Partition router is working (API calls succeed)
• I am able to use AddFields for another substream
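In case placement matters, here's a trimmed sketch of how the stream is wired up (the requester details, stream names, and example URL are placeholders), with transformations defined at the stream level as a sibling of retriever, not inside it:
- type: DeclarativeStream
  name: child_stream
  retriever:
    type: SimpleRetriever
    requester:
      type: HttpRequester
      url_base: "https://api.example.com"
      path: "/parents/{{ stream_partition.composite_key }}/children"
    record_selector:
      type: RecordSelector
      extractor:
        type: DpathExtractor
        field_path: []
    partition_router:
      type: SubstreamPartitionRouter
      parent_stream_configs:
        - type: ParentStreamConfig
          stream: "#/definitions/parent_stream"
          parent_key: "id"
          partition_field: "composite_key"
  transformations:
    - type: AddFields
      fields:
        - path: ["test_field"]
          value: "hello world?"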
Asha Ravilla
09/12/2025, 7:58 PM
Tonja Rand
09/14/2025, 11:21 AM
Allan Delmare
09/15/2025, 12:28 AM
Tanuj Shriyan
09/15/2025, 8:08 AM
ali chadordouzan
09/15/2025, 12:26 PM
1. Magento creates a temp table in my database → MariaDB writes a TABLE_MAP event to my binlog
2. Magento writes data to the temp table → MariaDB writes WRITE_ROWS events to my binlog
3. Magento drops the temp table → The table no longer exists in my database
4. Airbyte reads the binlog → Finds WRITE_ROWS events for a table ID (e.g. 223)
5. Airbyte looks for the table metadata → Can't find the TABLE_MAP event in its reading window
6. I get this error: "No TableMapEventData has been found for table id:223"
Is there a way for me to tell Airbyte to ignore those temp tables? I can't find anything in the UI. Thank you in advance.
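In case it's useful for debugging, this is how I've been inspecting the binlog around a given table id to confirm the TABLE_MAP event is missing from the window (the binlog file name is a placeholder):
mysqlbinlog --base64-output=decode-rows --verbose mysql-bin.000123 \
  | grep -n -B2 -A2 'table id 223'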
Oliver Alluard
09/15/2025, 12:57 PM
ALBAGNAC Damien
09/15/2025, 1:02 PM
2025-09-15 15:01:37 info
2025-09-15 15:01:37 info Connector exited, processing output
2025-09-15 15:01:37 info ----- START CHECK -----
2025-09-15 15:01:37 info
2025-09-15 15:01:37 info Output file jobOutput.json found
2025-09-15 15:01:37 info Connector exited with exit code 0
2025-09-15 15:01:37 info Reading messages from protocol version 0.2.0
2025-09-15 15:01:37 info INFO main i.m.c.e.DefaultEnvironment(<init>):168 Established active environments: [k8s, cloud, cli, destination, connector]
2025-09-15 15:01:37 info INFO main i.a.c.AirbyteConnectorRunnable(run):33 Executing class io.airbyte.cdk.load.check.CheckOperation operation.
2025-09-15 15:01:37 warn WARN main i.a.c.l.c.CheckOperation(execute):58 Caught throwable during CHECK java.lang.IllegalArgumentException: Failed to insert expected rows into check table. Actual written: 0
at io.airbyte.integrations.destination.clickhouse.check.ClickhouseChecker.check(ClickhouseChecker.kt:48) ~[io.airbyte.airbyte-integrations.connectors-destination-clickhouse.jar:?]
at io.airbyte.integrations.destination.clickhouse.check.ClickhouseChecker.check(ClickhouseChecker.kt:20) ~[io.airbyte.airbyte-integrations.connectors-destination-clickhouse.jar:?]
at io.airbyte.cdk.load.check.CheckOperation.execute(CheckOperation.kt:48) [bulk-cdk-core-load-0.1.20.jar:?]
at io.airbyte.cdk.AirbyteConnectorRunnable.run(AirbyteConnectorRunnable.kt:34) [bulk-cdk-core-base-0.1.20.jar:?]
at picocli.CommandLine.executeUserObject(CommandLine.java:2030) [picocli-4.7.6.jar:4.7.6]
at picocli.CommandLine.access$1500(CommandLine.java:148) [picocli-4.7.6.jar:4.7.6]
at picocli.CommandLine$RunLast.executeUserObjectOfLastSubcommandWithSameParent(CommandLine.java:2465) [picocli-4.7.6.jar:4.7.6]
at picocli.CommandLine$RunLast.handle(CommandLine.java:2457) [picocli-4.7.6.jar:4.7.6]
at picocli.CommandLine$RunLast.handle(CommandLine.java:2419) [picocli-4.7.6.jar:4.7.6]
at picocli.CommandLine$AbstractParseResultHandler.execute(CommandLine.java:2277) [picocli-4.7.6.jar:4.7.6]
at picocli.CommandLine$RunLast.execute(CommandLine.java:2421) [picocli-4.7.6.jar:4.7.6]
at picocli.CommandLine.execute(CommandLine.java:2174) [picocli-4.7.6.jar:4.7.6]
at io.airbyte.cdk.AirbyteDestinationRunner$Companion.run(AirbyteConnectorRunner.kt:289) [bulk-cdk-core-base-0.1.20.jar:?]
at io.airbyte.cdk.AirbyteDestinationRunner$Companion.run$default(AirbyteConnectorRunner.kt:75) [bulk-cdk-core-base-0.1.20.jar:?]
at io.airbyte.integrations.destination.clickhouse.ClickhouseDestinationKt.main(ClickhouseDestination.kt:10) [io.airbyte.airbyte-integrations.connectors-destination-clickhouse.jar:?]
Stack Trace: java.lang.IllegalArgumentException: Failed to insert expected rows into check table. Actual written: 0
at io.airbyte.integrations.destination.clickhouse.check.ClickhouseChecker.check(ClickhouseChecker.kt:48)
at io.airbyte.integrations.destination.clickhouse.check.ClickhouseChecker.check(ClickhouseChecker.kt:20)
at io.airbyte.cdk.load.check.CheckOperation.execute(CheckOperation.kt:48)
at io.airbyte.cdk.AirbyteConnectorRunnable.run(AirbyteConnectorRunnable.kt:34)
at picocli.CommandLine.executeUserObject(CommandLine.java:2030)
at picocli.CommandLine.access$1500(CommandLine.java:148)
at picocli.CommandLine$RunLast.executeUserObjectOfLastSubcommandWithSameParent(CommandLine.java:2465)
at picocli.CommandLine$RunLast.handle(CommandLine.java:2457)
at picocli.CommandLine$RunLast.handle(CommandLine.java:2419)
at picocli.CommandLine$AbstractParseResultHandler.execute(CommandLine.java:2277)
at picocli.CommandLine$RunLast.execute(CommandLine.java:2421)
at picocli.CommandLine.execute(CommandLine.java:2174)
at io.airbyte.cdk.AirbyteDestinationRunner$Companion.run(AirbyteConnectorRunner.kt:289)
at io.airbyte.cdk.AirbyteDestinationRunner$Companion.run$default(AirbyteConnectorRunner.kt:75)
at io.airbyte.integrations.destination.clickhouse.ClickhouseDestinationKt.main(ClickhouseDestination.kt:10)
2025-09-15 15:01:37 info INFO main i.a.c.AirbyteConnectorRunnable(run):46 Flushing output consumer prior to shutdown.
2025-09-15 15:01:37 info INFO main i.a.c.AirbyteConnectorRunnable(run):48 Completed integration: airbyte/destination-clickhouse.
2025-09-15 15:01:37 info Checking for optional control message...
2025-09-15 15:01:37 info Optional control message not found. Skipping...
2025-09-15 15:01:37 info Writing output of ce0d828e-1dc4-496c-b122-2da42e637e48_96361554-8782-4233-b8e4-10373f363714_0_check to the doc store
2025-09-15 15:01:37 info Marking workload ce0d828e-1dc4-496c-b122-2da42e637e48_96361554-8782-4233-b8e4-10373f363714_0_check as successful
2025-09-15 15:01:37 info
2025-09-15 15:01:37 info Deliberately exiting process with code 0.
2025-09-15 15:01:37 info ----- END CHECK -----
2025-09-15 15:01:37 info
Do you have any ideas?
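For what it's worth, I plan to reproduce by hand what the check seems to do (insert one row, then count it) with the same credentials Airbyte uses; the probe table name here is just an illustration, not the one the connector creates:
clickhouse-client --host <host> --user <airbyte_user> --password '<password>' \
  --query "CREATE TABLE IF NOT EXISTS default._airbyte_check_probe (ts DateTime) ENGINE = MergeTree ORDER BY ts"
clickhouse-client --host <host> --user <airbyte_user> --password '<password>' \
  --query "INSERT INTO default._airbyte_check_probe VALUES (now())"
clickhouse-client --host <host> --user <airbyte_user> --password '<password>' \
  --query "SELECT count() FROM default._airbyte_check_probe"
If the INSERT succeeds but the count comes back 0, that would suggest something on the server side (e.g. a user setting or policy) rather than the connector itself.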
Kamil M
09/15/2025, 1:37 PM
Kamil M
09/15/2025, 1:53 PM
Mounika Naga
09/15/2025, 3:40 PM
Karthik
09/15/2025, 4:06 PM
Leonardo Amorim
09/15/2025, 6:03 PM
Missing Facebook Ads Purchase Data Since July 29th
I'm following up on the issue discussed in this thread: https://airbytehq.slack.com/archives/C021JANJ6TY/p1754590543220719
We are approaching almost two months since this problem was first reported, and it continues to be a critical issue for us.
We're in the same position as many other users and are strongly considering a move to Fivetran if a fix isn't implemented for this problem.
Allan Delmare
09/15/2025, 7:11 PM
Robin Smith-Gilbert
09/15/2025, 7:23 PM
Cameron Foy
09/15/2025, 10:00 PM
Internal message: java.lang.RuntimeException: io.airbyte.integrations.source.elasticsearch.UnsupportedDatatypeException: Cannot map unsupported data type to Airbyte data type: completion
Failure type: system_error
java.lang.RuntimeException: io.airbyte.integrations.source.elasticsearch.UnsupportedDatatypeException: Cannot map unsupported data type to Airbyte data type: completion
at io.airbyte.integrations.source.elasticsearch.typemapper.ElasticsearchTypeMapper.lambda$formatJSONSchema$0(ElasticsearchTypeMapper.java:133)
at java.base/java.util.Iterator.forEachRemaining(Iterator.java:133)
at io.airbyte.integrations.source.elasticsearch.typemapper.ElasticsearchTypeMapper.formatJSONSchema(ElasticsearchTypeMapper.java:129)
at io.airbyte.integrations.source.elasticsearch.typemapper.ElasticsearchTypeMapper.lambda$formatJSONSchema$0(ElasticsearchTypeMapper.java:131)
at java.base/java.util.Iterator.forEachRemaining(Iterator.java:133)
at io.airbyte.integrations.source.elasticsearch.typemapper.ElasticsearchTypeMapper.formatJSONSchema(ElasticsearchTypeMapper.java:129)
at io.airbyte.integrations.source.elasticsearch.ElasticsearchSource.discover(ElasticsearchSource.java:78)
at io.airbyte.cdk.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:159)
at io.airbyte.cdk.integrations.base.IntegrationRunner.run(IntegrationRunner.java:125)
at io.airbyte.integrations.source.elasticsearch.ElasticsearchSource.main(ElasticsearchSource.java:37)
Caused by: io.airbyte.integrations.source.elasticsearch.UnsupportedDatatypeException: Cannot map unsupported data type to Airbyte data type: completion
at io.airbyte.integrations.source.elasticsearch.typemapper.ElasticsearchTypeMapper.formatJSONSchema(ElasticsearchTypeMapper.java:127)
at io.airbyte.integrations.source.elasticsearch.typemapper.ElasticsearchTypeMapper.lambda$formatJSONSchema$0(ElasticsearchTypeMapper.java:131)
... 9 more
I understand that there's an issue with the "completion" data type. It looks like it's not being handled by the Airbyte connector. But I cannot fork the connector in the connector builder, nor can I edit any configs in the source. Does anyone have any advice on what I should do? Do I need to create my own connector from scratch?
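In case someone can suggest a workaround per field, this is how I listed which fields in the mapping use the completion type (assumes curl and jq; host and index are placeholders):
curl -s 'http://<your-es-host>:9200/<your-index>/_mapping' \
  | jq '[paths(. == "completion")]'
Each returned path ends in "type", so the elements before it name the offending field.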
Seb J
09/16/2025, 11:55 AM
values.yaml under `global.storage`:
global:
  storage:
    type: "S3"
    secretName: "airbyte-config-secrets"
    bucket:
      log: airbyte-s3
      state: airbyte-s3
      workloadOutput: airbyte-s3
    s3:
      region: fr-par
      endpoint: https://s3.fr-par.scw.cloud
      authenticationType: credentials
• In the Kubernetes secret airbyte-config-secrets, I added my s3-access-key-id and s3-secret-access-key.
• I tried forcing S3_PATH_STYLE_ACCESS=true to work around DNS resolution issues, but the worker still tries to reach airbyte-s3.s3.fr-par.amazonaws.com and fails.
My understanding: the Helm chart V1 seems designed for AWS S3, and the custom endpoint is not picked up correctly.
My question:
• Has anyone successfully configured Airbyte 1.8.x with Helm chart V1 to use an external S3-compatible storage directly (without an intermediate MinIO)?
• If yes, could you share a working values.yaml and secret example, or your process?
Thanks a lot for your feedback 🙏
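For transparency, the next thing I plan to try is forcing the endpoint and path-style access onto the worker pods via extraEnv; I have not confirmed that the chart honors these exact variables, so treat this as a guess:
worker:
  extraEnv:
    - name: S3_PATH_STYLE_ACCESS
      value: "true"
    - name: AWS_ENDPOINT_URL   # honored by newer AWS SDKs; unverified whether the worker picks it up
      value: "https://s3.fr-par.scw.cloud"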
Christopher Vreugdenhil
09/16/2025, 12:59 PM
Neeraj N
09/16/2025, 1:40 PM
Garrett Thornburg
09/16/2025, 3:21 PM
[ASYNC QUEUE INFO] Global: max: 593.92 MB, allocated: 10 MB (9.9977445602417 MB), % used: 0.016833345242386704 | Queue .................... | State Manager memory usage: Allocated: 9 MB, Used: -2365 bytes, percentage Used -0.000226
It just runs forever and never finishes. It used to take (consistently) ~5 minutes.
I tried deleting the connection and recreating it, but that did not work. I also tested each endpoint in the connector builder to see if anything had changed, but everything works as it should.
There is one table in the connection that has ~140k records, so I disabled that and re-ran. It finished, but took ages. I saw lots of log lines like:
gs://my_bucket/airbyte/my_table/2025/09/16/15/c29bb93c-80dc-47e7-9115-92f43ea8a485/0.csv.gz
gs://my_bucket/airbyte/my_table/2025/09/16/15/c29bb93c-80dc-47e7-9115-92f43ea8a485/1.csv.gz
...
One file has 42 KB of data, and then there are roughly 900 files with 0 bytes.
So, something weird is happening here but I'm not sure what it is.
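For reference, this is how I counted the empty objects (size is the first column of gsutil's long listing; the path is copied from the log above):
gsutil ls -l 'gs://my_bucket/airbyte/my_table/2025/09/16/15/**' | awk '$1 == 0' | wc -l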