Rishabh Bhardwaj
09/27/2025, 3:42 PM

מילכה כץ
09/29/2025, 12:34 PM

dilan silva
10/02/2025, 3:16 PM
Dev Environment
------------------------
OS: Windows, using WSL and Docker with Airbyte abctl.
Metadata.yaml
------------------------------------------
data:
  allowedHosts:
    hosts:
      - TODO # Please change to the hostname of the source.
  registries:
    oss:
      enabled: true
    cloud:
      enabled: false
  remoteRegistries:
    pypi:
      enabled: true
      packageName: airbyte-source-nexus-datasets
  connectorBuildOptions:
    baseImage: docker.io/airbyte/python-connector-base:3.0.2@sha256:73697fbe1c0e2ebb8ed58e2268484bb4bfb2cb56b653808e1680cbc50bafef75
  connectorSubtype: api
  connectorType: source
  definitionId: 9e1fe63c-80ad-44fe-8927-10e66c9e209b
  dockerImageTag: 0.1.0
  dockerRepository: airbyte/source-nexus-datasets
  githubIssueLabel: source-nexus-datasets
  icon: nexus-datasets.svg
  license: MIT
  name: Nexus Datasets
  releaseDate: TODO
  releaseStage: alpha
  supportLevel: community
  documentationUrl: https://docs.airbyte.com/integrations/sources/nexus-datasets
  tags:
    - language:python
    - cdk:low-code
  connectorTestSuitesOptions:
    - suite: liveTests
      testConnections:
        - name: nexus_datasets_config_dev_null
          id: TODO
    - suite: unitTests
    - suite: acceptanceTests
      # optional: if your acceptance tests need credentials from CI secret store
      # testSecrets:
      #   - name: SECRET_SOURCE-NEXUS-DATASETS__CREDS
      #     fileName: config.json
      #     secretStore:
      #       type: GSM
      #       alias: airbyte-connector-testing-secret-store
metadataSpecVersion: "1.0"
This is the pyproject.toml:

Pyproject.toml
-------------------------
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"

[tool.poetry]
version = "0.3.21"
name = "source-nexus-datasets"
description = "Source implementation for nexus-datasets."
authors = ["Airbyte <contact@airbyte.io>"]
license = "MIT"
readme = "README.md"
documentation = "https://docs.airbyte.com/integrations/sources/nexus-datasets"
homepage = "https://airbyte.com"
repository = "https://github.com/airbytehq/airbyte"
packages = [ { include = "source_nexus_datasets" }, { include = "main.py" } ]

[tool.poetry.dependencies]
python = ">=3.10,<3.12"
airbyte-cdk = "6.60.0"
pyarrow = ">=16.1,<22.0"
pandas = "2.2.2"

[tool.poetry.scripts]
source-nexus-datasets = "source_nexus_datasets.run:run"

[tool.poetry.group.dev.dependencies]
requests-mock = "^1.9.3"
pytest-mock = "^3.6.1"
pytest = "^8.0.0"
So this is the error I can see in the logs:
Traceback (most recent call last):
  File "/usr/local/bin/pyab", line 5, in <module>
    from airbyte.cli import cli
  File "/usr/local/lib/python3.11/site-packages/airbyte/__init__.py", line 126, in <module>
    from airbyte import (
  File "/usr/local/lib/python3.11/site-packages/airbyte/cloud/__init__.py", line 59, in <module>
    from airbyte.cloud import connections, constants, sync_results, workspaces
  File "/usr/local/lib/python3.11/site-packages/airbyte/cloud/workspaces.py", line 27, in <module>
    from airbyte.sources.base import Source
  File "/usr/local/lib/python3.11/site-packages/airbyte/sources/__init__.py", line 6, in <module>
    from airbyte.sources import base, util
  File "/usr/local/lib/python3.11/site-packages/airbyte/sources/util.py", line 10, in <module>
    from airbyte._executors.util import get_connector_executor
  File "/usr/local/lib/python3.11/site-packages/airbyte/_executors/util.py", line 13, in <module>
    from airbyte._executors.declarative import DeclarativeExecutor
  File "/usr/local/lib/python3.11/site-packages/airbyte/_executors/declarative.py", line 14, in <module>
    from airbyte_cdk.sources.declarative.manifest_declarative_source import ManifestDeclarativeSource
ModuleNotFoundError: No module named 'airbyte_cdk.sources.declarative.manifest_declarative_source'
Can anyone please help me on this?

MTA
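The traceback above suggests the installed airbyte-cdk does not expose the module path PyAirbyte imports, which typically means a version mismatch. A minimal sketch (not Airbyte code) of how one might probe which module in the installed CDK actually provides `ManifestDeclarativeSource` — the candidate paths in the comment are assumptions based on the traceback:

```python
# Probe which module path actually exposes a class in the installed
# environment; the airbyte-cdk import path has moved between releases.
import importlib

def find_class(class_name, candidate_paths):
    """Return the first module path that exposes class_name, else None."""
    for path in candidate_paths:
        try:
            module = importlib.import_module(path)
        except ModuleNotFoundError:
            continue
        if hasattr(module, class_name):
            return path
    return None

# Example probe (path taken from the traceback; run inside the failing image):
# find_class("ManifestDeclarativeSource",
#            ["airbyte_cdk.sources.declarative.manifest_declarative_source"])
```

If the probe returns None, comparing `pip show airbyte-cdk` inside the image against the `airbyte-cdk = "6.60.0"` pin in pyproject.toml would be the next step.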
10/03/2025, 7:53 AM

Daniel Popowich
10/03/2025, 4:09 PM
I'm using abctl for testing. I have runtime features I would like to turn on/off during testing without having to rebuild my image. How do I set (and later change) an arbitrary environment variable for my connector? For example, I can run a source connector image from my command line like this:
docker run --rm $MYIMAGE spec
But if I want to turn on a custom feature I could do this:
docker run --rm -e MYCUSTOMENV=true $MYIMAGE spec
How do I do this while running in the abctl (kind) environment?

Mike Braden
10/06/2025, 5:03 PM
I'm using stream_partition.<id_name> for the Creation and Download URLs, but when it's used in the Polling URL it inserts a null value for the stream partition's current ID, causing the request to fail. Am I missing something? Is stream partitioning not supported for the polling portion of the async job?

Sunny Matharu
10/06/2025, 10:16 PM
Could not validate your connector as specified:
Error handling request: Validation against json schema defined in declarative_component_schema.yaml schema failed
Has anyone successfully managed to set up a custom Salesforce destination connector? Is it even allowed, given it seems to be explicitly excluded from the list of connectors in the self-hosted / open-source version of Airbyte?

Fungai
10/07/2025, 1:37 AM

Gloria Usanase
10/08/2025, 7:19 AM
org.postgresql.util.PSQLException: An I/O error occurred while sending to the backend (EOF) from the CTID/Xmin iterator.
What's already been tried: verified the ClickHouse side (Atomic DB, correct namespace, no cluster, JSON disabled), limited concurrency to 1, ran Full refresh | Overwrite for big tables, and adjusted the Postgres source (JDBC `socketTimeout=0&tcpKeepAlive=true&defaultRowFetchSize=1000`; role GUCs removing statement/idle timeouts). Inserts into ClickHouse occur in 2k batches, but overall throughput is far slower than v1 and several large streams still don't finish. Looking for any known v2.1.x performance regressions or tuning tips for PG→CH initial loads (batch/window sizes, fetch size, CTID vs Xmin, CDC for big tables, etc.).

Zeineb Ben Nessib
10/09/2025, 9:19 AM

Jean Baptiste Guerraz
10/14/2025, 11:08 PM

J Bob
10/17/2025, 9:13 AM

Kanhaiya Gupta
10/17/2025, 12:33 PM

Göktuğ Aşcı
10/17/2025, 7:26 PM

J Bob
10/18/2025, 1:55 PM

Isaac Harris-Holt
10/22/2025, 11:19 AM

Aymen NEGUEZ
10/24/2025, 1:24 PM

Lucas Hild
10/24/2025, 5:40 PM
[
  {
    "result": {
      "rows": [
        [
          "amazon-web-services",
          "2025",
          "10",
          "14",
          123.456,
          1760400000
        ],
        [
          "microsoft-azure",
          "2025",
          "10",
          "12",
          654.321,
          1760227200
        ],
        ...
      ]
    }
  }
]
How can I pull/access these fields properly?
When I select the field path result, rows, I get this error: Exception while syncing stream cloud_costs: dictionary update sequence element #0 has length 12; 2 is required
Is there any way to transform this data properly?
Thanks!

Ляшик Іван
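Regarding the question above: the API returns each row as a positional array, so selecting the field path result, rows yields lists rather than objects, which is why the "dictionary update sequence" error appears. One hedged sketch of a workaround is to zip each row with its column names before emitting records. The column names below are assumptions; substitute the real column order from the API's documentation:

```python
# Assumed column order for the positional row arrays (verify against the API).
COLUMNS = ["provider", "year", "month", "day", "cost", "timestamp"]

def rows_to_records(payload):
    """Flatten [{"result": {"rows": [[...], ...]}}] into a list of dicts."""
    records = []
    for page in payload:
        for row in page.get("result", {}).get("rows", []):
            # Zip positional values with their assumed column names.
            records.append(dict(zip(COLUMNS, row)))
    return records
```

In a low-code connector, the same mapping could live in a record transformation; the standalone function above just illustrates the shape change.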
10/27/2025, 9:22 AM
We run incremental syncs with a datetime cursor field (updated_at).
We’ve noticed a potential issue with how Airbyte handles incremental syncs:
If new records appear in the source with the same cursor value as the last recorded cursor (for example, 2025-10-20 09:42:11), these rows are not picked up in the next sync.
As far as we understand, this happens because Airbyte always applies the condition cursor_field > last_cursor_value, not >=, when filtering incremental data.
This creates a risk of data loss if multiple rows share the same timestamp down to the same second or microsecond — which is common in high-frequency data sources.
Could you please confirm:
1. Whether there is a way to configure or modify this behavior (e.g., use >= or apply a grace period/offset for the cursor)?
2. If not, is there any recommended best practice or workaround (such as cursor windowing or state rollback) to avoid missing records that share the same cursor value?
Thank you for your help and clarification.
Best regards!

Stockton Fisher
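On the question above, the usual workaround is a lookback: re-read a small window at or behind the saved cursor (i.e. effectively >=) and de-duplicate by primary key, so rows sharing the last cursor value are not skipped. In the low-code CDK, the DatetimeBasedCursor's lookback_window option serves this purpose. A hedged sketch of the idea; the field names ("updated_at", "id") are assumptions:

```python
from datetime import datetime, timedelta

def incremental_slice(records, last_cursor, seen_ids, lookback=timedelta(seconds=1)):
    """Keep records with updated_at >= last_cursor - lookback, skipping ids
    already emitted, so same-timestamp rows are neither lost nor duplicated."""
    threshold = last_cursor - lookback
    return [r for r in records
            if r["updated_at"] >= threshold and r["id"] not in seen_ids]
```

The de-duplication step matters because widening the window necessarily re-reads some rows that were already synced.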
11/02/2025, 7:38 AM

Pranay S
11/03/2025, 12:39 PM

Florian Brüser
11/03/2025, 4:20 PM

Simon Duvergier
11/03/2025, 4:48 PM
The PR feedback says to "run the `bump-version` Airbyte-CI command locally". Should I try to add a commit on my existing PR?
It also says: "Connector CI Tests. Some failures here may be expected if your tests require credentials. Please review these results to ensure (1) unit tests are passing, if applicable, and (2) integration tests pass to the degree possible and expected." I have some tests failing because no credentials are provided, but I am not sure how I can edit/update tests when using the Connector Builder, nor whether I should update these tests. I would be more than happy to be redirected to documentation or a better channel 🙂
Levon Galstyan
11/04/2025, 2:34 PM

David Aichelin
11/06/2025, 10:28 AM

Sai Manaswini Reddy I
11/07/2025, 3:55 PM

William Kaper
11/12/2025, 1:57 PM

Vinayak R
11/13/2025, 3:48 AM

Jonathan Ben-Harosh
11/13/2025, 7:33 AM

Austin Fay
11/14/2025, 6:00 PM
base_requester:
  type: HttpRequester
  url_base: https://services.leadconnectorhq.com/
  authenticator:
    type: OAuthAuthenticator
    grant_type: refresh_token
    client_id: "{{ config['client_id'] }}"
    client_secret: "{{ config['client_secret'] }}"
    refresh_token: "{{ config['refresh_token'] }}"
    access_token_name: access_token
    refresh_request_body:
      grant_type: refresh_token
      client_id: "{{ config['client_id'] }}"
      client_secret: "{{ config['client_secret'] }}"
      refresh_token: "{{ config['refresh_token'] }}"
    refresh_token_updater:
      refresh_token_name: refresh_token
      refresh_token_config_path:
        - refresh_token
...
spec:
  advanced_auth:
    oauth_config_specification:
      complete_oauth_server_input_specification:
        required:
          - client_id
      complete_oauth_output_specification:
        required:
          - access_token
          - refresh_token
          - token_expiry_date
        properties:
          access_token:
            type: string
            path_in_connector_config:
              - access_token
            path_in_oauth_response:
              - access_token
          refresh_token:
            type: string
            path_in_connector_config:
              - refresh_token
            path_in_oauth_response:
              - refresh_token
          token_expiry_date:
            type: number
            path_in_connector_config:
              - token_expiry_date
            path_in_oauth_response:
              - expires_in
When I test it in the connector builder, it works fine. However, when I create a real source/connection with it, it manages to test the source successfully, then upon actually syncing the connection, it fails to authenticate with the refresh token. The other thing to note is that this worked until about a month ago, and then it simply stopped working. I'm not sure what happened, but we tried everything including a full re-install of airbyte. Am I missing something here?
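One way to narrow this down is to exercise the refresh-token grant outside Airbyte and see whether the stored refresh token itself is still accepted (some providers rotate refresh tokens on every use, so a token consumed once by the Builder test may already be invalid for the sync). A hedged debugging sketch; the token endpoint URL is an assumption, so confirm the real path in the LeadConnector API docs:

```python
import json
from urllib import parse, request

# Assumed token endpoint -- verify against the provider's OAuth documentation.
TOKEN_URL = "https://services.leadconnectorhq.com/oauth/token"

def build_refresh_body(client_id, client_secret, refresh_token):
    """Encode a standard OAuth2 refresh_token grant as a form body."""
    return parse.urlencode({
        "grant_type": "refresh_token",
        "client_id": client_id,
        "client_secret": client_secret,
        "refresh_token": refresh_token,
    })

def refresh_access_token(client_id, client_secret, refresh_token):
    """POST the refresh grant and return the parsed JSON token response."""
    body = build_refresh_body(client_id, client_secret, refresh_token).encode()
    req = request.Request(
        TOKEN_URL, data=body, method="POST",
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)  # typically access_token, refresh_token, expires_in
```

If this call succeeds with the credentials stored in the source config but the sync still fails, the problem is likely in how the platform persists the rotated refresh_token between syncs rather than in the credentials themselves.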