Imon
04/07/2022, 11:48 AM
I changed `request_params` so that instead of returning one dictionary, it returns a list of dictionaries, where each entry corresponds to the same request but for a different city. However, using a list of dictionaries instead of a single dictionary results in a "too many values to unpack" error. Does anyone have an idea of how I could solve this problem?

Mahdi Dibaiee
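An aside on the question above: in HTTP-stream frameworks like the Airbyte CDK, `request_params` is expected to return exactly one dictionary per request, which is why returning a list unpacks incorrectly. The usual pattern is to emit one stream slice per city and build a single params dict from the current slice. A minimal, dependency-free sketch of that idea (the `CITIES` list and parameter names are made up, not from the original message):

```python
# Sketch of the "stream slices" pattern, without the airbyte_cdk
# dependency: stream_slices yields one dict per city, and request_params
# builds the parameters for a single request from the current slice.

CITIES = ["berlin", "paris", "madrid"]  # hypothetical configuration

def stream_slices():
    # One slice per city; the framework calls request_params once per slice.
    for city in CITIES:
        yield {"city": city}

def request_params(stream_slice):
    # Always return a single dict -- returning a list here is what
    # triggers "too many values to unpack" inside the framework.
    return {"q": stream_slice["city"], "units": "metric"}

all_params = [request_params(s) for s in stream_slices()]
```

This keeps `request_params` returning one dict per call while still covering every city in one sync.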
04/07/2022, 2:26 PM
I've found a bug in `source-facebook-marketing` where it assumes `user_tos_accepted` to be a number, but it's actually an object, and so read operations fail. I've opened an issue and a PR where I'm reusing the schema definition of `tos_accepted` for `user_tos_accepted`, which makes both nullable objects. Can I get a review from one of the maintainers, please? It's a small PR, but it blocks our use of this connector. Thanks!
Issue: 11800
PR: 11801

Jove Zhong
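A note on the schema fix above: making both fields nullable objects amounts to giving them the JSON-schema type `["null", "object"]`. A hedged sketch of that idea, with a toy type check standing in for real JSON-schema validation (the record values are invented):

```python
# Shared schema fragment: both fields accept an object or null.
TOS_SCHEMA = {"type": ["null", "object"]}

PROPERTIES = {
    "tos_accepted": TOS_SCHEMA,
    # Reusing tos_accepted's definition for user_tos_accepted, as in the PR.
    "user_tos_accepted": TOS_SCHEMA,
}

def type_ok(value, schema):
    # Toy stand-in for a JSON-schema type check.
    allowed = schema["type"]
    if value is None:
        return "null" in allowed
    if isinstance(value, dict):
        return "object" in allowed
    return False

accepts_object = type_ok({"id": 1}, PROPERTIES["user_tos_accepted"])
rejects_number = not type_ok(42, PROPERTIES["user_tos_accepted"])
```

With the old number-typed schema, the object value in `user_tos_accepted` is what made reads fail.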
04/07/2022, 5:27 PM
`discover` hangs? Hello, I am building the source and destination connectors for our own product. I had this working before, but recently schema discovery failed. I'm not sure if it's due to an Airbyte change or a change in our product. When I ran the UI test (read from our product and send to CSV), the schema discovery never finished. Then I ran
docker run --rm -i -v ~/Dev:/secrets timeplus/airbyte-source-timeplus:dev discover --config /secrets/source_config.json
It shows tables/columns but then hangs again, with many INFO messages like
2022-04-07 17:19:26 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):137 - Table bookings column cid (type string[0]) -> Json type io.airbyte.protocol.models.JsonSchemaType@492fc69e
It's a Java-based connector. Can I run this via IntelliJ and see which method it is stuck in? Thanks!
David Anderson
04/07/2022, 8:32 PM

Gujjalapati Raju
04/08/2022, 9:28 AM

Johan Strand
04/08/2022, 1:47 PM

Alpana Shukla
04/08/2022, 1:59 PM

Sean Zinsmeister
04/08/2022, 9:18 PM

Achmad Syarif Hidayatullah
04/09/2022, 6:16 AM
JSON schema validation failed.
errors: $: null found, object expected
The whole Zendesk pipeline failed because of this, and the logs aren't clear enough to tell which stream and column are causing this error. I want to make a fix, but this leaves me clueless.
Also, is there any way to suppress this error? For example, could certain columns with object type be made optional?
Second, does JSON schema validation run in the source or in the destination?
This problem pretty much annoys me (it makes Airbyte keep retrying, even now), and it started after moving to version 2 of the zendesk-support connector.

AndrewD
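On the question above, as a hedged sketch (not the connector's actual code): the error means some field whose schema declares `"type": "object"` arrived as `null`. Declaring the type as `["null", "object"]` instead suppresses it, and a small helper can point at which columns are affected. The schema and record below are invented for illustration:

```python
# Find record fields that are null but declared as non-nullable objects
# in a JSON-schema "properties" map -- a way to locate the column behind
# "$: null found, object expected".

SCHEMA = {
    "properties": {
        "via": {"type": "object"},              # not nullable: fails on null
        "fields": {"type": ["null", "object"]},  # nullable: null is fine
    }
}

def null_object_violations(record, schema):
    bad = []
    for name, spec in schema["properties"].items():
        types = spec.get("type")
        if isinstance(types, str):
            types = [types]
        # Null value + object-only type = the validation error above.
        if record.get(name) is None and "object" in types and "null" not in types:
            bad.append(name)
    return bad

violations = null_object_violations({"via": None, "fields": None}, SCHEMA)
```

Running each failing record through a helper like this against the stream's schema narrows the search to specific columns.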
04/09/2022, 5:09 PM

Natan Yellin
04/10/2022, 2:38 PM

Enrico Tuvera Jr.
04/11/2022, 6:32 AM
These are just the `requests` package's session authenticators, aren't they? The `requests_native` package name, and the fact that they're passed into `self._session.auth`, both kinda hint at that. I just want to make sure that there's nothing special about them before trying to roll my own.

Oliver Franz
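On the authenticator question above: in the `requests` model, anything assigned to `session.auth` is simply a callable that receives the outgoing request and returns it, usually after setting a header, which is what `requests.auth.AuthBase` subclasses do. A dependency-free sketch of rolling your own, with a made-up stand-in for `requests.PreparedRequest`:

```python
class BearerAuth:
    # Same shape as a requests.auth.AuthBase subclass: a callable that
    # receives the outgoing request, mutates it, and returns it.
    def __init__(self, token):
        self.token = token

    def __call__(self, request):
        request.headers["Authorization"] = f"Bearer {self.token}"
        return request

class FakeRequest:
    # Minimal stand-in for requests.PreparedRequest.
    def __init__(self):
        self.headers = {}

auth = BearerAuth("abc123")   # in real code: session.auth = auth
prepared = auth(FakeRequest())
```

So there is nothing magical to preserve when rolling your own, beyond honoring that call signature.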
04/11/2022, 9:34 AM

Ehmad Zubair
04/11/2022, 11:57 AM

Louis-Marius Gendreau
04/11/2022, 2:56 PM

Tien Nguyen
04/11/2022, 8:40 PM

gunu
04/11/2022, 10:55 PM
My understanding is that a sync saves the latest record's date as the `cursor_field` state, so that on the next sync the API will pull records after this date. However, all the connectors I've looked at appear to pull records on or after this date, meaning the last record synced will always be pulled again, causing duplicate records to appear.
This is the case for `zendesk-support`, `zendesk-talk`, `survey-monkey`, and I imagine others too. Given that no other users are complaining about this, I'm wondering if I'm misunderstanding some fundamental aspect of the incremental stream.
Posted additional details in the discussion forum if you're interested.

Shubham Pinjwani
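On the cursor question above: the duplicate comes from filtering with "on or after" (`>=`) rather than "strictly after" (`>`). One common reason connectors deliberately choose `>=` is that several records can share the same cursor value, and a strict `>` would silently drop the later ones; the duplicates are then removed by deduplication in the destination. A small sketch of the difference (record shape and values are made up):

```python
from datetime import date

# Two records; the previous sync ended on the 04/01 record, so the
# saved cursor equals that record's date.
records = [
    {"id": 1, "updated_at": date(2022, 4, 1)},
    {"id": 2, "updated_at": date(2022, 4, 5)},
]
cursor = date(2022, 4, 1)

# "on or after" re-emits the already-synced record -> duplicate id 1.
on_or_after = [r for r in records if r["updated_at"] >= cursor]

# "strictly after" avoids the duplicate, but would lose any *other*
# record that happened to share the cursor's timestamp exactly.
strictly_after = [r for r in records if r["updated_at"] > cursor]
```

So the observed behavior trades a re-pulled record (easy to deduplicate downstream) against potential data loss on tied timestamps.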
04/11/2022, 10:57 PM
Task :airbyte-integrations:connectors:source-custom-postgres:compileJava FAILED
/Users/shubham.pinjwani/airbyte/airbyte-integrations/connectors/source-custom-postgres/src/main/java/io/airbyte/integrations/source/custom_postgres/CustomPostgresSource.java:25: error: cannot find symbol
public class CustomPostgresSource extends PostgresSource {
symbol: class PostgresSource
1 error
gunu
04/11/2022, 11:12 PM
(Same `cursor_field` question as above.) Also attached an image to show how the last record keeps getting synced again.

Shubham Pinjwani
04/12/2022, 12:31 AM
Internal Server Error: Get Spec job failed.
Ryan Werth
04/12/2022, 1:51 AM
python main.py read --config sample_files/config.json --catalog sample_files/configured_catalog.json
returns the desired result. I need some help adding the connector to the Airbyte UI.

Aakash Kumar
04/12/2022, 7:13 AM
'_csv.Error: field larger than field limit (131072)'
We are using Ubuntu, where Airbyte is deployed. We are also attaching logs for your reference.

Augustin Lafanechere (Airbyte)
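On the `_csv` error above (general Python behavior, not Airbyte-specific): 131072 bytes is the `csv` module's default limit for a single field, and anything longer raises `_csv.Error`. A hedged sketch of the usual workaround, raising the limit and backing off when the platform's C long cannot hold `sys.maxsize`:

```python
import csv
import sys

# The 131072 in the error is csv's default field size limit (128 KiB).
# csv.field_size_limit(new) sets the limit and returns the previous one.
def raise_csv_field_limit():
    limit = sys.maxsize
    while True:
        try:
            csv.field_size_limit(limit)
            return limit
        except OverflowError:
            # Some platforms reject sys.maxsize; halve until it fits.
            limit //= 2

new_limit = raise_csv_field_limit()
```

Where the limit is raised depends on which side does the CSV parsing; in this setup that would be the component whose logs show the `_csv.Error`.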
04/12/2022, 5:31 PM

Jonathan Alvarado
04/12/2022, 5:59 PM
./gradlew :airbyte-integrations:connectors:source-postgres:integrationTest
Tien Nguyen
04/12/2022, 8:53 PM

Daniel Vengoechea
04/13/2022, 1:29 PM

Daniel Vengoechea
04/13/2022, 1:30 PM

Augustin Lafanechere (Airbyte)
04/13/2022, 1:35 PM

user
04/13/2022, 1:42 PM

[DEPRECATED] Augustin Lafanechere