Pankaj Lal
05/05/2025, 7:28 AM
https://cxone.niceincontact.com/auth/authorize?{{client_id_key}}={{client_id_value}}&{{redirect_uri_key}}={{{{redirect_uri_value}}%7CurlEncoder}}&{{state_key}}={{state_value}}&scope=openid&response_type=code
I see this error in the console when I test the connection and Airbyte calls get_oauth_consent_url:
{
  "message": "Internal Server Error: Error parsing '{{redirect_uri_value': syntax error at position 129, encountered '}', expected ':'",
  "exceptionClassName": "com.hubspot.jinjava.interpret.FatalTemplateErrorsException",
  "exceptionStack": [],
  "rootCauseExceptionStack": []
}
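A likely culprit is the nested braces: Jinja reads the {{{{ in {{{{redirect_uri_value}}%7CurlEncoder}} as the start of a dict literal inside an expression, hence "encountered '}', expected ':'". A sketch of a form that should at least parse: keep the filter inside a single pair of braces and use a literal pipe rather than the URL-encoded %7C (whether redirect_uri_value and urlEncoder are exposed to the template like this depends on your CDK version, so verify before relying on it):

consent_url: >-
  https://cxone.niceincontact.com/auth/authorize?{{client_id_key}}={{client_id_value}}&{{redirect_uri_key}}={{ redirect_uri_value | urlEncoder }}&{{state_key}}={{state_value}}&scope=openid&response_type=code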
Max Werner
05/05/2025, 3:04 PM
My API expects a request body like:
{
  "filters": [
    {
      "$gt": {
        "audit.modifiedDateTime": "2025-05-05T12:34:56Z"
      }
    }
  ]
}
but when I have an Incremental Sync section in the YAML like
type: DatetimeBasedCursor
cursor_field: audit.modifiedDateTime
cursor_datetime_formats:
  - '%Y-%m-%dT%H:%M:%SZ'
datetime_format: '%Y-%m-%dT%H:%M:%SZ'
start_datetime:
  type: MinMaxDatetime
  datetime: '{{ config["start_date"] }}'
  datetime_format: '%Y-%m-%dT%H:%M:%SZ'
start_time_option:
  type: RequestOption
  inject_into: body_json
  field_path:
    - filters
    - '0'
    - $gt
    - audit.modifiedDateTime
the filters part of the request does not become an array but a KV pair like:
"filters": {
"0": {
"$gt": {
"audit.modifiedDateTime": "2005-05-01T00:00:00Z"
}
}
}
How can I tell Airbyte that filters in the request JSON body is supposed to be an array?
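One workaround, as a sketch: RequestOption.field_path builds nested objects, so the '0' segment becomes a string key rather than an array index. Dropping start_time_option and templating the array directly in request_body_json may work, assuming your CDK version literal-evaluates the rendered string into a native list (worth verifying on your version):

requester:
  type: HttpRequester
  http_method: POST
  request_body_json:
    filters: >-
      [{"$gt": {"audit.modifiedDateTime": "{{ stream_interval.start_time }}"}}]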
Robert Kolb
05/06/2025, 1:54 PM
Sirine Hdiji
05/07/2025, 8:39 AM
I'm using a DatetimeStreamSlicer
to generate monthly date slices between a start_date and end_date defined in the config: 202501, 202502...
My goal is to pass the date from each slice (in the format YYYYMM) as a required query parameter in the API request. Here's a simplified version of my YAML configuration:
stream_slicer:
  type: DatetimeStreamSlicer
  cursor_field: date
  start_datetime: "{{ config['start_date'] }}"
  end_datetime: "{{ config['end_date'] }}"
  step: "1M"
  datetime_format: "%Y%m"
  cursor_granularity: "P1M"

retriever:
  type: SimpleRetriever
  requester:
    request_parameters:
      date: "{{ stream_slice['date'] }}"
However, in the actual requests, the date parameter is missing, and the API returns a 400 error: "must have required property 'date'".
Is DatetimeStreamSlicer still supported, or has it been deprecated in favor of another approach?
Also, what are the best practices for passing dynamic query parameters? Any tips or examples would be appreciated!
Thanks a lot 🙏
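For what it's worth, DatetimeStreamSlicer has been superseded by DatetimeBasedCursor, which injects the slice value for you via start_time_option, so no manual request_parameters template is needed. A minimal sketch against the current low-code schema (field names worth double-checking on your CDK version):

incremental_sync:
  type: DatetimeBasedCursor
  cursor_field: date
  cursor_datetime_formats:
    - "%Y%m"
  datetime_format: "%Y%m"
  start_datetime:
    type: MinMaxDatetime
    datetime: "{{ config['start_date'] }}"
    datetime_format: "%Y%m"
  end_datetime:
    type: MinMaxDatetime
    datetime: "{{ config['end_date'] }}"
    datetime_format: "%Y%m"
  step: P1M
  cursor_granularity: P1M
  start_time_option:
    type: RequestOption
    inject_into: request_parameter
    field_name: date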
Afif
05/08/2025, 6:56 AM
My connector exits with exit 1 when running spec, but it works well when I docker run <image> spec
Andres Pulgarin
05/08/2025, 11:14 AM
Satish Chinthanippu
05/08/2025, 3:11 PM
Carolina Buckler
05/09/2025, 2:17 PM
I'm setting a query_tag in the JDBC parameters of the Snowflake destination connector. Is there a way to have it dynamically add the Airbyte connection name instead of just a hardcoded value? https://github.com/airbytehq/airbyte/pull/9623
https://github.com/airbytehq/airbyte/issues/9467
Thomas Bazin
05/12/2025, 9:51 AM
- type: SubstreamPartitionRouter
  parent_stream_configs:
    - type: ParentStreamConfig
      parent_key: idCustomer
      partition_field: idCustomer
      stream:
        $ref: '#/definitions/streams/Customer'
      incremental_dependency: true
It seems to work when I test it directly in the Builder, but it does not return anything when I use it in a connection.
The connection logs do not show any call to the parent stream, and the state is the following:
{"states": [], "parent_state": {"Customers": {}}, "lookback_window": 7, "use_global_cursor": false}
What am I doing wrong?
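One thing to check: incremental_dependency: true only works if the parent stream itself defines an incremental cursor, and the empty parent_state for Customers suggests it never produced one. A sketch of what the parent stream might need, assuming a hypothetical updatedAt cursor field:

Customer:
  type: DeclarativeStream
  name: Customer
  incremental_sync:
    type: DatetimeBasedCursor
    cursor_field: updatedAt
    cursor_datetime_formats:
      - "%Y-%m-%dT%H:%M:%SZ"
    datetime_format: "%Y-%m-%dT%H:%M:%SZ"
    start_datetime:
      type: MinMaxDatetime
      datetime: "{{ config['start_date'] }}"
      datetime_format: "%Y-%m-%dT%H:%M:%SZ"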
Namay Jindal
05/13/2025, 1:32 PM
Nick Zombolas
05/13/2025, 5:21 PM
graph.instagram.com or graph.facebook.com.
1. Can anybody familiar with the IG connector confirm that it would not be possible to ingest via the Instagram login method? Looking at the facebook_business Python package we are using, it uses graph.facebook.com, and I didn't see any options to use the Instagram API instead.
2. If this is indeed the case, are there any future plans to include the Instagram login method? From what I can tell, there would need to be some changes in the API stream to hit the correct initial endpoints for accounts, but insights endpoints for Media Insights, etc. should be the same.
For further context, here's the documentation overview describing the difference between the two methods: https://developers.facebook.com/docs/instagram-platform/overview
Thanks!
Mohammad Soori
05/14/2025, 9:34 AM
Håkon Guttulsrud
05/15/2025, 1:08 PM
Mathieson
05/19/2025, 12:24 PM
Erhan Tuna
05/21/2025, 5:56 AM
dilan silva
05/22/2025, 8:55 AM
This manifest works fine in the Builder:
version: 6.48.15
type: DeclarativeSource
check:
  type: CheckStream
  stream_names:
    - datasets
definitions:
  streams:
    datasets:
      type: DeclarativeStream
      name: datasets
      retriever:
        type: SimpleRetriever
        requester:
          $ref: "#/definitions/base_requester"
          path: /api/export
          http_method: GET
        record_selector:
          type: RecordSelector
          extractor:
            type: DpathExtractor
            field_path: []
        decoder:
          type: JsonlDecoder
      schema_loader:
        type: InlineSchemaLoader
        schema:
          $ref: "#/schemas/datasets"
  base_requester:
    type: HttpRequester
    url_base: >-
      https://03100670-8969-4472-a593-d7a8cef4488b-00-15uejhh5ki8b9.janeway.replit.dev
streams:
  - $ref: "#/definitions/streams/datasets"
spec:
  type: Spec
  connection_specification:
    type: object
    $schema: http://json-schema.org/draft-07/schema#
    required: []
    properties: {}
    additionalProperties: true
schemas:
  datasets:
    type: object
    $schema: http://json-schema.org/draft-07/schema#
    additionalProperties: true
    properties: {}
Now when I try to do this locally in my local connector, it does not load the data, but it says Sync success with 0 bytes. In the log I can see these messages (not pasting the full log here):
2025-05-22 13:54:15 source ERROR Marking stream records as STARTED
2025-05-22 13:54:15 source ERROR Syncing stream instance: records
2025-05-22 13:54:15 source ERROR Setting state of SourceNexusDatasets stream to {}
2025-05-22 13:54:15 source ERROR Syncing stream: records
2025-05-22 13:54:15 source ERROR Making outbound API request
2025-05-22 13:54:15 source INFO Starting syncing SourceNexusDatasets
2025-05-22 13:54:15 source INFO Marking stream records as STARTED
2025-05-22 13:54:15 source INFO Malformed non-Airbyte record (connectionId = 034b7256-278b-4830-bba5-953c366de232): {"type": "DEBUG", "message": "Syncing stream instance: records", "data": {"message": "Syncing stream instance: records", "cursor_field": "[]", "primary_key": "None"}}
2025-05-22 13:54:15 source INFO Setting state of SourceNexusDatasets stream to {}
2025-05-22 13:54:15 source INFO Syncing stream: records
2025-05-22 13:54:15 source INFO Malformed non-Airbyte record (connectionId = 034b7256-278b-4830-bba5-953c366de232): {"type": "DEBUG", "message": "Making outbound API request", "data": {"request_body": "None", "headers": "{'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}", "url": "https://03100670-8969-4472-a593-d7a8cef4488b-00-15uejhh5ki8b9.janeway.replit.dev/api/export", "message": "Making outbound API request"}}
2025-05-22 13:54:15 replication-orchestrator INFO Stream status TRACE received of status: STARTED for stream records
2025-05-22 13:54:15 replication-orchestrator INFO Sending update for records - null -> RUNNING
2025-05-22 13:54:15 replication-orchestrator INFO Stream Status Update Received: records - RUNNING
2025-05-22 13:54:15 replication-orchestrator INFO Creating status: records - RUNNING
2025-05-22 13:54:17 source ERROR Receiving response
2025-05-22 13:54:17 source INFO Malformed non-Airbyte record (connectionId = 034b7256-278b-4830-bba5-953c366de232): {"type": "DEBUG", "message": "Receiving response", "data": {"headers": "{'Content-Disposition': 'attachment; filename=\"ExportData.json\"', 'Content-Type': 'application/json', 'Date': 'Thu, 22 May 2025 08:24:19 GMT', 'Replit-Cluster': 'janeway', 'X-Powered-By': 'Express', 'X-Robots-Tag': 'none, noindex, noarchive, nofollow, nositelinkssearchbox, noimageindex, none, noindex, noarchive, nofollow, nositelinkssearchbox, noimageindex', 'Transfer-Encoding': 'chunked'}", "body": "{\"id\": 1, \"name\": \"Example Item 1\", \"category\": \"Category A\", \"price\": 19.99}\n{\"id\": 2, \"name\": \"Example Item 2\", \"category\": \"Category B\", \"price\": 29.99}\n{\"id\": 3, \"name\": \"Example Item 3\", \"category\": \"Category A\", \"price\": 15.50}\n{\"id\": 4, \"name\": \"Example Item 4\", \"category\": \"Category C\", \"price\": 45.00}\n{\"id\": 5, \"name\": \"Example Item 5\", \"category\": \"Category B\", \"price\": 35.25}", "message": "Receiving response", "status": "200"}}
2025-05-22 13:54:17 source ERROR Read 0 records from records stream
2025-05-22 13:54:17 source ERROR Marking stream records as STOPPED
2025-05-22 13:54:17 source ERROR Finished syncing records
2025-05-22 13:54:17 source ERROR SourceNexusDatasets runtimes:
2025-05-22 13:54:17 source ERROR Syncing stream records 0:00:01.715473
2025-05-22 13:54:17 source ERROR Finished syncing SourceNexusDatasets
I have the same manifest configuration in the local connector, except the version:
version: 0.90.0
type: DeclarativeSource
check:
  type: CheckStream
  stream_names:
    - "records"
definitions:
  streams:
    records:
      type: DeclarativeStream
      name: records
      retriever:
        type: SimpleRetriever
        requester:
          type: HttpRequester
          url_base: https://03100670-8969-4472-a593-d7a8cef4488b-00-15uejhh5ki8b9.janeway.replit.dev
          path: /api/export
          http_method: GET
        record_selector:
          type: RecordSelector
          extractor:
            type: DpathExtractor
            field_path: []
        decoder:
          type: JsonlDecoder
      schema_loader:
        type: InlineSchemaLoader
        schema:
          $ref: "#/schemas/datasets"
streams:
  - "#/definitions/streams/records"
schemas:
  datasets:
    type: object
    $schema: http://json-schema.org/draft-07/schema#
    additionalProperties: true
    properties: {}
Can someone please help with this? I tried to change the version, but it gives an error:
jsonschema.exceptions.ValidationError: The manifest version 6.48.15 is greater than the airbyte-cdk package version (0.90.0). Your manifest may contain features that are not in the current CDK version.
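Two things stand out here. First, the version error means the local airbyte-cdk (0.90.0) is too old for a 6.48.15 manifest, so it's the local package that needs upgrading rather than the manifest downgrading. Second, the streams list differs between the two manifests: the Builder one uses a $ref mapping, while the local one uses a bare string, which the reference resolver may not expand. A sketch of the usual form:

streams:
  - $ref: "#/definitions/streams/records"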
Mathieu Dumoulin
05/22/2025, 5:53 PM
Hadrien Lepousé
05/24/2025, 7:51 PM
poetry run pytest unit_tests/integrations/test_web_analytics_streams.py::TestCRMWebAnalyticsStream::test_given_two_pages_when_read_then_return_records
Result:
======================================== short test summary info =========================================
FAILED unit_tests/integrations/test_web_analytics_streams.py::TestCRMWebAnalyticsStream::test_given_two_pages_when_read_then_return_records[tickets_web_analytics-tickets-ticket-parent_stream_associations0] - ValueError: Invalid number of matches for `HttpRequestMatcher(request_to_match=ParseResult(scheme='https', netloc='api.hubapi.com', path='/crm/v3/objects/ticket', params='', query='archived=false&associations=contacts&associations=deals&associations=companies&limit=100&properties=closed_date,createdate', fragment='') with headers {} and body None), minimum_number_of_expected_match=1, actual_number_of_matches=0)`
FAILED unit_tests/integrations/test_web_analytics_streams.py::TestCRMWebAnalyticsStream::test_given_two_pages_when_read_then_return_records[deals_web_analytics-deals-deal-parent_stream_associations1] - ValueError: Request ParseResult(scheme='https', netloc='api.hubapi.com', path='/crm/v3/schemas', params='', query='', fragment='') with headers {} and body None) already mocked
FAILED unit_tests/integrations/test_web_analytics_streams.py::TestCRMWebAnalyticsStream::test_given_two_pages_when_read_then_return_records[companies_web_analytics-companies-company-parent_stream_associations2] - ValueError: Request ParseResult(scheme='https', netloc='api.hubapi.com', path='/crm/v3/schemas', params='', query='', fragment='') with headers {} and body None) already mocked
FAILED unit_tests/integrations/test_web_analytics_streams.py::TestCRMWebAnalyticsStream::test_given_two_pages_when_read_then_return_records[contacts_web_analytics-contacts-contact-parent_stream_associations3] - ValueError: Request ParseResult(scheme='https', netloc='api.hubapi.com', path='/crm/v3/schemas', params='', query='', fragment='') with headers {} and body None) already mocked
FAILED unit_tests/integrations/test_web_analytics_streams.py::TestCRMWebAnalyticsStream::test_given_two_pages_when_read_then_return_records[engagements_calls_web_analytics-engagements_calls-calls-parent_stream_associations4] - ValueError: Request ParseResult(scheme='https', netloc='api.hubapi.com', path='/crm/v3/schemas', params='', query='', fragment='') with headers {} and body None) already mocked
FAILED unit_tests/integrations/test_web_analytics_streams.py::TestCRMWebAnalyticsStream::test_given_two_pages_when_read_then_return_records[engagements_emails_web_analytics-engagements_emails-emails-parent_stream_associations5] - ValueError: Request ParseResult(scheme='https', netloc='api.hubapi.com', path='/crm/v3/schemas', params='', query='', fragment='') with headers {} and body None) already mocked
FAILED unit_tests/integrations/test_web_analytics_streams.py::TestCRMWebAnalyticsStream::test_given_two_pages_when_read_then_return_records[engagements_meetings_web_analytics-engagements_meetings-meetings-parent_stream_associations6] - ValueError: Request ParseResult(scheme='https', netloc='api.hubapi.com', path='/crm/v3/schemas', params='', query='', fragment='') with headers {} and body None) already mocked
FAILED unit_tests/integrations/test_web_analytics_streams.py::TestCRMWebAnalyticsStream::test_given_two_pages_when_read_then_return_records[engagements_notes_web_analytics-engagements_notes-notes-parent_stream_associations7] - ValueError: Request ParseResult(scheme='https', netloc='api.hubapi.com', path='/crm/v3/schemas', params='', query='', fragment='') with headers {} and body None) already mocked
FAILED unit_tests/integrations/test_web_analytics_streams.py::TestCRMWebAnalyticsStream::test_given_two_pages_when_read_then_return_records[engagements_tasks_web_analytics-engagements_tasks-tasks-parent_stream_associations8] - ValueError: Request ParseResult(scheme='https', netloc='api.hubapi.com', path='/crm/v3/schemas', params='', query='', fragment='') with headers {} and body None) already mocked
Giulliano Bueno
06/02/2025, 8:41 AM
I'm struggling with the gradlew build process. Can anyone point me to tutorials or resources on contributing to this project or setting up Databricks as a source without creating a new connector?
Erin Yener
06/02/2025, 11:23 AM
I'd like to add an API budget to a forked connector. I assume I can expose it in the Inputs section, but would love confirmation on how best to do this.
• If it's best to modify the YAML directly, would anyone be able to advise me on what section of the YAML to add the API budget to? The docs have some examples, but I'm not clear on how to add this to a forked connector.
• Are there existing connector examples that have this parameter as an optional input so that I can see how it "fits"? (A sketch follows below.)
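A minimal sketch of a top-level api_budget block from the declarative schema, with illustrative limits (not a drop-in for any particular connector; to make it user-configurable you would also add a matching property to the spec):

api_budget:
  type: HTTPAPIBudget
  policies:
    - type: MovingWindowCallRatePolicy
      rates:
        - limit: 100
          interval: PT1M
      matchers: []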
Thanks for the help!
Jens Mostaert
06/04/2025, 12:53 PM
Nick Zombolas
06/04/2025, 8:06 PM
I'm running into issues with the spec job. I've been looking into how to implement this stream in YAML instead of calling my Python class, but I'm not sure of the best way to get this done. Any advice? Thanks!
Mert Ors
06/05/2025, 9:05 AM
Paul
06/05/2025, 1:35 PM
I'm running:
airbyte-ci connectors --name destination-azure-blob-storage build
But sadly, after about 30 seconds it fails. The build output HTML is empty. The dagger.log file is also empty. This is what I get in the window log below. Any ideas what I've missed? I've updated the secrets as per the README, but no luck.
[23:33:09] INFO root: Setting working directory to /home/pladmin/airbyte/airbyte ensure_repo_root.py:58
[23:33:10] INFO root: Setting working directory to /home/pladmin/airbyte/airbyte ensure_repo_root.py:58
INFO pipelines: airbyte-ci is up to date. Installed version: 5.2.5. Latest version: 5.2.5 auto_update.py:89
INFO pipelines: Called with dagger run: False airbyte_ci.py:127
INFO pipelines.cli.dagger_run: Running command: ['/home/pladmin/bin/dagger', '--silent', 'run', 'airbyte-ci', 'connectors', '--name', 'destination-azure-blob-storage', 'build'] dagger_run.py:120
[23:33:18] INFO root: Setting working directory to /home/pladmin/airbyte/airbyte ensure_repo_root.py:58
[23:33:19] INFO root: Setting working directory to /home/pladmin/airbyte/airbyte ensure_repo_root.py:58
INFO pipelines: airbyte-ci is up to date. Installed version: 5.2.5. Latest version: 5.2.5 auto_update.py:89
INFO pipelines: Called with dagger run: True airbyte_ci.py:127
[23:33:27] INFO pipelines: Will run on the following 1 connectors: destination-azure-blob-storage. commands.py:32
INFO pipelines: Running Dagger Command build... dagger_pipeline_command.py:32
INFO pipelines: If you're running this command for the first time the Dagger engine image will be pulled, it can take a short minute... dagger_pipeline_command.py:33
INFO pipelines: Saving dagger logs to: dagger_pipeline_command.py:43
/home/pladmin/airbyte/airbyte/airbyte-ci/connectors/pipelines/pipeline_reports/airbyte-ci/connectors/build/manual/master/1749094400/b2ffb0185be442ddf72677067d3a8
243fbba770f/dagger.log
INFO pipelines: Building connectors for ['linux/amd64'], use --architecture to change this. commands.py:46
INFO Build connector destination-azure-blob-storage: Should send status check: False pipeline_context.py:222
[23:33:29] INFO root: Using storage driver: fuse-overlayfs docker.py:85
[23:33:56] INFO Build connector destination-azure-blob-storage: Caching the latest CDK version... pipeline_context.py:284
INFO Build connector destination-azure-blob-storage: Should send status check: False pipeline_context.py:222
INFO Build connector destination-azure-blob-storage - Build connector tar: ๐ Start Build connector tar steps.py:303
ERROR Build connector destination-azure-blob-storage: An error got handled by the ConnectorContext context.py:253
╭───────────────────────────────────── Traceback (most recent call last) ─────────────────────────────────────╮
│ in run_connector_build_pipeline:49                                                                          │
│                                                                                                              │
│ in run_connector_build:33                                                                                    │
│                                                                                                              │
│ in run_connector_build:60                                                                                    │
│                                                                                                              │
│ in run:307                                                                                                   │
│                                                                                                              │
│ in __aexit__:772                                                                                             │
╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
ERROR Build connector destination-azure-blob-storage: No test report was provided. This is probably due to an upstream error context.py:255
╭──────────────────────────────── DESTINATION-AZURE-BLOB-STORAGE - REPORT ────────────────────────────────╮
│ Steps results                                                                                            │
│ ┏━━━━━━┳━━━━━━━━┳━━━━━━━━━━┓                                                                             │
│ ┃ Step ┃ Result ┃ Duration ┃                                                                             │
│ ┡━━━━━━╇━━━━━━━━╇━━━━━━━━━━┩                                                                             │
│ └──────┴────────┴──────────┘                                                                             │
│ ℹ️ You can find more details with step executions logs in the saved HTML report.                         │
╰───────────────── ⏲️ Total pipeline duration for destination-azure-blob-storage: 0.36s ──────────────────╯
Aphonso Henrique do Amaral Rafael
06/05/2025, 5:10 PM
Juliette Duizabo
06/09/2025, 4:39 PM
Configuration check failed
'Encountered an error while checking availability of stream sources. Error: Request URL: https://api.airbyte.com/v1/applications/token, Response Code: 500, Response Text: {"message":"Internal Server Error","_links":{"self":{"href":"/api/public/v1/applications/token","templated":false}},"_embedded":{"errors":[{"message":"Internal Server Error: class org.jboss.resteasy.client.jaxrs.engines.ManualClosingApacheHttpClient43Engine$3 cannot be cast to class io.micronaut.jaxrs.common.JaxRsMutableResponse (org.jboss.resteasy.client.jaxrs.engines.ManualClosingApacheHttpClient43Engine$3 and io.micronaut.jaxrs.common.JaxRsMutableResponse are in unnamed module of loader \'app\')","_links":{},"_embedded":{}}]}}'
It looks like the issue is on Airbyte's side. Has any of you managed to set up the import of Airbyte metadata to have the observability in their warehouse?
Gergely Imreh
06/10/2025, 12:58 PM
• The parent stream returns records that each have a field id, with values id1, id2, etc.
• The child stream would need a query payload that puts those ids into a list in the request body, such as:
{"input": [{"id": id1}, {"id": id2}, ...]}
and sends off that query (it's a batch one by default).
Is this possible to configure (with a parent substream like this)?
Or do I have to just run a sequential list of queries with
{"input": [{"id": id1}]}
then
{"input": [{"id": id2}]}
and so on? This would likely work, though it would probably hit rate limits and take longer than running everything in one go.
Any suggestions? 🤔
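Out of the box, SubstreamPartitionRouter issues one request per parent record, so the sequential form is the natural fit; batching several parent ids into one request body is not something the declarative router does natively and would likely need a custom component. A sketch of the per-partition setup, assuming a parent stream named parent and illustrative field names:

retriever:
  type: SimpleRetriever
  requester:
    type: HttpRequester
    http_method: POST
    request_body_json:
      input: '[{"id": "{{ stream_partition.parent_id }}"}]'
  partition_router:
    type: SubstreamPartitionRouter
    parent_stream_configs:
      - type: ParentStreamConfig
        parent_key: id
        partition_field: parent_id
        stream:
          $ref: "#/definitions/streams/parent"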
Mike Moyer
06/10/2025, 9:34 PM
Anthony Smart
06/11/2025, 1:40 PM
Albert Le
06/11/2025, 6:57 PM
The API has a main endpoint called /studies.
There is another sub-endpoint called /studies/{id}/availability, where {id} is a single study id.
Does the new Builder tool have an automated way of letting me call the /studies endpoint to get a list of study_ids, and use that response as the input parameter for the sub-endpoint?
What I tried: searched through the documentation, but couldn't find anything for my use case.
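This is what a parent stream (SubstreamPartitionRouter) does in the Builder's partitioning options. A sketch of the child stream, assuming the parent stream is defined as studies and study_id is used as the partition field:

availability:
  type: DeclarativeStream
  name: availability
  retriever:
    type: SimpleRetriever
    requester:
      $ref: "#/definitions/base_requester"
      path: "/studies/{{ stream_partition.study_id }}/availability"
      http_method: GET
    partition_router:
      type: SubstreamPartitionRouter
      parent_stream_configs:
        - type: ParentStreamConfig
          parent_key: id
          partition_field: study_id
          stream:
            $ref: "#/definitions/streams/studies"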