Göktuğ Aşcı
09/17/2025, 12:56 PM
Bastin Headmon
09/17/2025, 1:51 PM
Jabbar Memon
09/18/2025, 7:09 AM
Lucas Rodrigues Dhein
09/22/2025, 5:07 PM
Louis Adam
09/23/2025, 3:25 PM
Göktuğ Aşcı
09/24/2025, 12:58 PM
Jason Anderson
09/24/2025, 6:57 PM
curl -G \
-d "access_token=MYTOKEN" \
-d "time_range={'since':'2025-09-16','until':'2025-09-16'}" \
-d "fields=conversions" \
"<https://graph.facebook.com/v23.0/MYADNUMBER/insights>"Rishabh Bhardwaj
Rishabh Bhardwaj
09/25/2025, 5:28 AM
Anil Thapa
09/25/2025, 3:57 PM
Rishabh Bhardwaj
09/26/2025, 6:45 AM
Rishabh Bhardwaj
09/27/2025, 3:42 PM
מילכה כץ
09/29/2025, 12:34 PM
dilan silva
10/02/2025, 3:16 PM
Dev Environment
------------------------
OS: Windows, using WSL and Docker with Airbyte abctl.
Metadata.yaml
-------------------------------------------
data:
  allowedHosts:
    hosts:
      - TODO # Please change to the hostname of the source.
  registries:
    oss:
      enabled: true
    cloud:
      enabled: false
  remoteRegistries:
    pypi:
      enabled: true
      packageName: airbyte-source-nexus-datasets
  connectorBuildOptions:
    baseImage: docker.io/airbyte/python-connector-base:3.0.2@sha256:73697fbe1c0e2ebb8ed58e2268484bb4bfb2cb56b653808e1680cbc50bafef75
  connectorSubtype: api
  connectorType: source
  definitionId: 9e1fe63c-80ad-44fe-8927-10e66c9e209b
  dockerImageTag: 0.1.0
  dockerRepository: airbyte/source-nexus-datasets
  githubIssueLabel: source-nexus-datasets
  icon: nexus-datasets.svg
  license: MIT
  name: Nexus Datasets
  releaseDate: TODO
  releaseStage: alpha
  supportLevel: community
  documentationUrl: https://docs.airbyte.com/integrations/sources/nexus-datasets
  tags:
    - language:python
    - cdk:low-code
  connectorTestSuitesOptions:
    - suite: liveTests
      testConnections:
        - name: nexus_datasets_config_dev_null
          id: TODO
    - suite: unitTests
    - suite: acceptanceTests
      # optional: if your acceptance tests need credentials from CI secret store
      # testSecrets:
      #   - name: SECRET_SOURCE-NEXUS-DATASETS__CREDS
      #     fileName: config.json
      #     secretStore:
      #       type: GSM
      #       alias: airbyte-connector-testing-secret-store
metadataSpecVersion: "1.0"
This is the Pyproject.toml
-------------------------
Pyproject.toml
-------------------------
[build-system]
requires = [ "poetry-core>=1.0.0",]
build-backend = "poetry.core.masonry.api"
[tool.poetry]
version = "0.3.21"
name = "source-nexus-datasets"
description = "Source implementation for nexus-datasets."
authors = [ "Airbyte <contact@airbyte.io>",]
license = "MIT"
readme = "README.md"
documentation = "<https://docs.airbyte.com/integrations/sources/nexus-datasets>"
homepage = "<https://airbyte.com>"
repository = "<https://github.com/airbytehq/airbyte>"
packages = [ { include = "source_nexus_datasets" }, {include = "main.py" } ]
[tool.poetry.dependencies]
python = ">=3.10,<3.12"
airbyte-cdk = "6.60.0"
pyarrow = ">=16.1,<22.0"
pandas = "2.2.2"
[tool.poetry.scripts]
source-nexus-datasets = "source_nexus_datasets.run:run"
[tool.poetry.group.dev.dependencies]
requests-mock = "^1.9.3"
pytest-mock = "^3.6.1"
pytest = "^8.0.0"
So this is the error I see in the logs:
Traceback (most recent call last):
File "/usr/local/bin/pyab", line 5, in <module>
from airbyte.cli import cli
File "/usr/local/lib/python3.11/site-packages/airbyte/__init__.py", line 126, in <module>
from airbyte import (
File "/usr/local/lib/python3.11/site-packages/airbyte/cloud/__init__.py", line 59, in <module>
from airbyte.cloud import connections, constants, sync_results, workspaces
File "/usr/local/lib/python3.11/site-packages/airbyte/cloud/workspaces.py", line 27, in <module>
from airbyte.sources.base import Source
File "/usr/local/lib/python3.11/site-packages/airbyte/sources/__init__.py", line 6, in <module>
from airbyte.sources import base, util
File "/usr/local/lib/python3.11/site-packages/airbyte/sources/util.py", line 10, in <module>
from airbyte._executors.util import get_connector_executor
File "/usr/local/lib/python3.11/site-packages/airbyte/_executors/util.py", line 13, in <module>
from airbyte._executors.declarative import DeclarativeExecutor
File "/usr/local/lib/python3.11/site-packages/airbyte/_executors/declarative.py", line 14, in <module>
from airbyte_cdk.sources.declarative.manifest_declarative_source import ManifestDeclarativeSource
ModuleNotFoundError: No module named 'airbyte_cdk.sources.declarative.manifest_declarative_source'
Can anyone please help me with this?
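One way to narrow this down (a diagnostic sketch, not a confirmed fix): the traceback comes from PyAirbyte importing airbyte_cdk, so checking which airbyte-cdk is actually installed where pyab runs, and whether that version still ships the module PyAirbyte expects, would show whether the CDK pinned in pyproject.toml is the one in effect:
```python
# Diagnostic sketch: run in the same environment/image that produced the traceback.
# It only prints the installed airbyte-cdk version and whether the module that
# PyAirbyte imports (per the traceback above) is present.
import importlib.metadata

print("airbyte-cdk version:", importlib.metadata.version("airbyte-cdk"))

try:
    import airbyte_cdk.sources.declarative.manifest_declarative_source  # noqa: F401
    print("manifest_declarative_source: importable")
except ModuleNotFoundError as exc:
    print("manifest_declarative_source: missing ->", exc)
```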
MTA
10/03/2025, 7:53 AM
Daniel Popowich
10/03/2025, 4:09 PM
I'm running abctl for testing. I have runtime features I would like to turn on/off during testing without having to rebuild my image. How do I set (and later change) an arbitrary environment variable for my connector? For example, I can run a source connector image from my command line like this:
docker run --rm $MYIMAGE spec
But if I want to turn on a custom feature I could do this:
docker run --rm -e MYCUSTOMENV=true $MYIMAGE spec
How do I do this while running in the abctl (kind) environment?
Mike Braden
10/06/2025, 5:03 PM
I can interpolate stream_partition.<id_name> in the Creation and Download URLs, but when it is used in the Polling URL it inserts a null value for the current stream partition ID, causing the request to fail. Am I missing something? Is stream partitioning not supported for the polling portion of the async job?
Sunny Matharu
10/06/2025, 10:16 PM
Could not validate your connector as specified:
Error handling request: Validation against json schema defined in declarative_component_schema.yaml schema failed
Has anyone successfully managed to set up a custom Salesforce destination connector? Is it even allowed, given it seems to be explicitly excluded from the list of connectors in the self-hosted / open-source version of Airbyte?
Fungai
10/07/2025, 1:37 AM
Gloria Usanase
10/08/2025, 7:19 AM
The Postgres source keeps failing with org.postgresql.util.PSQLException: An I/O error occurred while sending to the backend (EOF) from the CTID/Xmin iterator.
What’s already tried: Verified CH side (Atomic DB, correct namespace, no cluster, JSON disabled), limited concurrency to 1, ran Full refresh | Overwrite for big tables, and adjusted the Postgres source (JDBC `socketTimeout=0&tcpKeepAlive=true&defaultRowFetchSize=1000`; role GUCs removing statement/idle timeouts). Inserts into CH occur in 2k batches, but overall throughput is far slower than v1 and several large streams still don’t finish. Looking for any known v2.1.x performance regressions or tuning tips for PG→CH initial loads (batch/window sizes, fetch size, CTID vs Xmin, CDC for big tables, etc.).
Zeineb Ben Nessib
10/09/2025, 9:19 AM
Jean Baptiste Guerraz
10/14/2025, 11:08 PM
J Bob
10/17/2025, 9:13 AM
Kanhaiya Gupta
10/17/2025, 12:33 PM
Göktuğ Aşcı
10/17/2025, 7:26 PM
J Bob
10/18/2025, 1:55 PM
Isaac Harris-Holt
10/22/2025, 11:19 AM
Aymen NEGUEZ
10/24/2025, 1:24 PM
Lucas Hild
10/24/2025, 5:40 PM[
{
"result": {
"rows": [
[
"amazon-web-services",
"2025",
"10",
"14",
123.456,
1760400000
],
[
"microsoft-azure",
"2025",
"10",
"12",
654.321,
1760227200
],
...
]
}
}
]
How can I access these fields properly?
When I select the field path result, rows, I get this error: Exception while syncing stream cloud_costs: dictionary update sequence element #0 has length 12; 2 is required
Is there any way to transform this data properly?
Thanks!
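For illustration: the error appears because each row is a plain list of values rather than key/value pairs, so it cannot be read as a record directly. A minimal sketch of the flattening the stream would need, using hypothetical column names (provider, year, month, day, cost, timestamp) for the positional values in the sample above:
```python
# Minimal sketch: flatten the nested "result" -> "rows" arrays into flat records.
# COLUMNS is hypothetical and only names the positional values shown in the sample.
COLUMNS = ["provider", "year", "month", "day", "cost", "timestamp"]

def flatten(response: list[dict]) -> list[dict]:
    records = []
    for entry in response:
        for row in entry["result"]["rows"]:
            records.append(dict(zip(COLUMNS, row)))
    return records

sample = [{"result": {"rows": [["amazon-web-services", "2025", "10", "14", 123.456, 1760400000]]}}]
print(flatten(sample))
# [{'provider': 'amazon-web-services', 'year': '2025', 'month': '10', 'day': '14',
#   'cost': 123.456, 'timestamp': 1760400000}]
```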
Ляшик Іван
10/27/2025, 9:22 AM
We run incremental syncs with a timestamp cursor field (updated_at).
We’ve noticed a potential issue with how Airbyte handles incremental syncs:
If new records appear in the source with the same cursor value as the last recorded cursor (for example, 2025-10-20 09:42:11), these rows are not picked up in the next sync.
As far as we understand, this happens because Airbyte always applies the condition cursor_field > last_cursor_value, not >=, when filtering incremental data.
This creates a risk of data loss if multiple rows share the same timestamp down to the same second or microsecond — which is common in high-frequency data sources.
Could you please confirm:
1. Whether there is a way to configure or modify this behavior (e.g., use >= or apply a grace period/offset for the cursor)?
2. If not, is there any recommended best practice or workaround (such as cursor windowing or state rollback) to avoid missing records that share the same cursor value?
Thank you for your help and clarification.
Best regards!
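For what it's worth, a common mitigation when the cursor comparison is strict is a lookback window: re-read a short interval before the saved cursor and de-duplicate by primary key. A rough sketch of the idea, with hypothetical names and assuming nothing about Airbyte internals:
```python
# Rough sketch of a lookback-window workaround for a strict (>) cursor comparison.
# Names are hypothetical; the idea is to query from (last_cursor - lookback) and then
# drop rows that were already loaded, so rows sharing the last cursor timestamp are kept.
from datetime import datetime, timedelta

def query_lower_bound(last_cursor: datetime, lookback: timedelta = timedelta(seconds=1)) -> datetime:
    """Lower bound to use in the incremental filter, shifted back by the lookback window."""
    return last_cursor - lookback

def drop_already_seen(records: list[dict], seen_ids: set) -> list[dict]:
    """De-duplicate re-read rows by primary key."""
    fresh = [r for r in records if r["id"] not in seen_ids]
    seen_ids.update(r["id"] for r in fresh)
    return fresh

# Example: with last_cursor 2025-10-20 09:42:11 the query reads from 09:42:10,
# so rows that share the 09:42:11 timestamp are picked up on the next sync.
print(query_lower_bound(datetime(2025, 10, 20, 9, 42, 11)))  # 2025-10-20 09:42:10
```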
Stockton Fisher
11/02/2025, 7:38 AM