Zawar Khan
03/30/2022, 5:33 PM

Prakash
03/30/2022, 7:16 PM
ERROR i.a.s.RequestLogger(filter):110 - REQ 172.18.0.2 POST 500 /api/v1/source_definitions/create - {"name":"okta","documentationUrl":"https://github.com/faros-ai/airbyte-connectors/tree/main/sources/okta-source","dockerImageTag":"latest","dockerRepository":"connectprakash/okta-source"}
Marcos Marx (Airbyte)
03/30/2022, 8:30 PM

Mert Karabulut
03/31/2022, 9:42 AM

Marcos Marx (Airbyte)
03/31/2022, 8:30 PM

Avijit Mandal
04/01/2022, 8:19 AM
cursor_field
@property
def cursor_field(self) -> str:
    """
    TODO
    Override to return the cursor field used by this stream, e.g. an API entity might always use created_at as the cursor field. This is
    usually id- or date-based. This field's presence tells the framework this is an incremental stream. Required for incremental.
    :return str: The name of the cursor field.
    """
    return ""
I mean, if my table had 2000 rows and a new row got added, would this be sufficient to append just the 1 new row, or would it fetch all 2001 rows from scratch?
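For reference, a minimal sketch of how an incremental stream usually answers this: the cursor field plus a saved state value let the next sync request only rows after the last-seen cursor, so only the 1 new row is appended rather than re-fetching all 2001. The class and field names below (`EmployeesStream`, `updated_at`) are illustrative assumptions, not from the thread, and this is only the cursor/state portion of a CDK stream, not a full implementation.

```python
from typing import Any, Mapping


class EmployeesStream:
    """Sketch of incremental cursor logic (not a complete CDK stream)."""

    @property
    def cursor_field(self) -> str:
        # Assumed column name; use whichever date or id column marks new rows.
        return "updated_at"

    def get_updated_state(
        self,
        current_stream_state: Mapping[str, Any],
        latest_record: Mapping[str, Any],
    ) -> Mapping[str, Any]:
        # Keep the maximum cursor value seen so far. The next sync starts
        # from this checkpoint, so only records after it are requested.
        latest = latest_record.get(self.cursor_field, "")
        current = current_stream_state.get(self.cursor_field, "")
        return {self.cursor_field: max(latest, current)}
```

Whether the API actually returns only new rows still depends on `request_params` using the saved state (e.g. a `since` query parameter); the cursor alone just tells the framework where the checkpoint is.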
Thanks in advance!

Marcos Marx (Airbyte)
04/01/2022, 8:30 PM

Kirsten Hipolito
04/03/2022, 11:21 PM
400 Client Error: Bad Request
, because the scores are saved for only a certain amount of time, and scores from beyond that interval return that error. But we can't pinpoint exactly which customer IDs were scored within that interval, so it's only through the 400 error code that we can tell. How can I modify the Python source connector template so that I can handle the 400 error myself, and it doesn't reflect as an error in a sync (causing the sync to be marked as 'failed')? Or what might be a better approach to handling these errors?

Daniel Lundkvist
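One common pattern for this (a sketch, not the official template): catch the expected 400 inside the record-reading loop and skip those customers, instead of letting the exception bubble up and fail the sync. The `fetch` callable below is a hypothetical stand-in for the real API call; in the CDK you would do the equivalent inside `read_records`, and recent CDK versions also let an `HttpStream` opt out of automatic raising via the `raise_on_http_errors` property.

```python
from typing import Callable, Iterable, Mapping, Tuple


def read_scores(
    customer_ids: Iterable[str],
    fetch: Callable[[str], Tuple[int, dict]],
) -> Iterable[Mapping]:
    """Yield score records, treating HTTP 400 ('score expired') as a skip.

    `fetch` is a hypothetical stand-in for the real API call; it returns
    (status_code, payload).
    """
    for cid in customer_ids:
        status, payload = fetch(cid)
        if status == 400:
            # Expected for customers scored outside the retention window:
            # skip instead of raising, so the sync is not marked failed.
            continue
        if status >= 400:
            # Anything else is a real failure and should surface.
            raise RuntimeError(f"unexpected HTTP {status} for customer {cid}")
        yield {"customer_id": cid, "score": payload["score"]}
```

The trade-off is that silently skipped IDs are invisible unless you log them, so emitting a warning per skipped customer is usually worth the noise.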
04/04/2022, 8:31 AM

Marcos Marx (Airbyte)
04/04/2022, 7:59 PM

Timo Klock
04/04/2022, 8:12 PM

Luis Gomez
04/05/2022, 1:32 PM

Pranit
04/05/2022, 2:36 PM

Shubham Pinjwani
04/05/2022, 7:26 PM

Yanni Iyeze - Toucan Toco
04/06/2022, 8:13 AM

Michael Cooper
04/06/2022, 9:36 PM
Internal Server Error: java.lang.IllegalArgumentException: no JSON input found
for an error code. I've checked the credentials, so it's not invalid credentials.

Christopher Wu
04/06/2022, 10:22 PM
The destination connector should only output state messages if they were previously received as input on stdin. Outputting a state message indicates that all records which came before it have been successfully written to the destination.
Does Airbyte do anything with the state messages outputted by the destination, or is it purely for surfacing info to the user?

Imon
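For context, a toy sketch of the contract quoted above. My understanding is that the platform does use the echoed state: it persists it as the checkpoint, so a failed or interrupted sync can resume from the last acknowledged state. The function shape here is illustrative (a real destination reads stdin and writes the echoed state to stdout); only the JSON-lines message shape follows the Airbyte protocol.

```python
import json


def run_destination(lines, write_record):
    """Sketch of the destination-side loop: a STATE message is echoed back
    (here, collected; in a real connector, printed to stdout) only after
    every record that preceded it has been written."""
    acked = []
    for line in lines:
        msg = json.loads(line)
        if msg["type"] == "RECORD":
            write_record(msg["record"])
        elif msg["type"] == "STATE":
            # Echoing acknowledges that all prior records are durable;
            # the platform checkpoints on it so a retry can resume here.
            acked.append(msg["state"])
    return acked
```

If a destination buffers records, it must flush that buffer before echoing the state, otherwise the acknowledgment lies about durability.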
04/07/2022, 11:48 AM
request_params
such that instead of returning one dictionary, I return a list of dictionaries where each entry corresponds to the same request but for a different city.
However, using a list of dictionaries instead of one dictionary results in a Too many values to unpack
error. Does anyone have an idea how I could solve this problem?

Mahdi Dibaiee
04/07/2022, 2:26 PM
source-facebook-marketing
where it assumes user_tos_accepted
to be a number but it’s actually an object, and so read operations fail. I’ve opened an issue and a PR where I’m reusing the schema definition of tos_accepted
for user_tos_accepted
which makes both nullable objects. Can I get a review from one of the maintainers, please? It's a small PR, but it blocks our use of this connector. Thank you!
Issue: 11800
PR: 11801

Jove Zhong
04/07/2022, 5:27 PM
discover
hangs? Hello, I am building the source and destination connectors for our own product. I had this working before, but recently the schema discovery failed. Not sure if it's due to an Airbyte change or our product change. When I ran the UI test (read from our product and send to CSV), the schema discovery never finished. Then I ran docker run --rm -i -v ~/Dev:/secrets timeplus/airbyte-source-timeplus:dev discover --config /secrets/source_config.json
It shows tables/columns but also hang again, with many INFO messages like
2022-04-07 171926 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):137 - Table bookings column cid (type string[0]) -> Json type io.airbyte.protocol.models.JsonSchemaType@492fc69e
It's a Java-based connector. Can I run this via IntelliJ and see which method it is stuck in? Thank you!
David Anderson
04/07/2022, 8:32 PM

Gujjalapati Raju
04/08/2022, 9:28 AM

Johan Strand
04/08/2022, 1:47 PM

Alpana Shukla
04/08/2022, 1:59 PM

Sean Zinsmeister
04/08/2022, 9:18 PM

Achmad Syarif Hidayatullah
04/09/2022, 6:16 AM
JSON schema validation failed.
errors: $: null found, object expected
The whole Zendesk pipeline failed because of this, and the logs aren't clear enough to tell which stream and column are causing the error.
I want to make a fix, but this leaves me clueless.
Also, is there any way to suppress this error? For example, making certain object-type columns optional?
Second, does the JSON schema validation run in the source or in the destination?
This problem is pretty annoying (it makes Airbyte keep retrying, even now), and it started after using zendesk-support connector version 2.

AndrewD
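On the "make certain columns optional" idea: in JSON Schema terms, that means allowing "null" in the property's type, exactly the change that would let "$: null found, object expected" pass validation. A small illustrative helper (`make_nullable` is my name for it, not an Airbyte API) showing what the schema edit looks like:

```python
def make_nullable(schema: dict, *fields: str) -> dict:
    """Relax JSON-schema properties so null values pass validation,
    i.e. turn {"type": "object"} into {"type": ["null", "object"]}."""
    for name in fields:
        prop = schema["properties"][name]
        t = prop.get("type")
        # Normalize: "object" -> ["object"], already-a-list stays a list.
        types = [t] if isinstance(t, str) else list(t or [])
        if "null" not in types:
            prop["type"] = ["null"] + types
    return schema
```

In a connector, the equivalent change would go in the stream's .json schema file for the offending field; the hard part here is still identifying which field that is from the logs.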
04/09/2022, 5:09 PM

Natan Yellin
04/10/2022, 2:38 PM

Enrico Tuvera Jr.
04/11/2022, 6:32 AM
requests
package's session authenticators, aren't they? The requests_native package name, and the fact that they're passed into self._session.auth, both kinda hint at that. I just want to make sure that there's nothing special about them before trying to roll my own.

Oliver Franz
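For what it's worth, anything assigned to a requests session's auth just needs to be a callable that takes and returns the prepared request; that callable protocol is all requests.auth.AuthBase formalizes, so rolling your own is straightforward. A duck-typed sketch (BearerAuth is an illustrative name, not a CDK class):

```python
class BearerAuth:
    """Duck-typed `requests` authenticator: `session.auth` accepts any
    callable that takes the outgoing PreparedRequest and returns it,
    which is the same hook requests.auth.AuthBase subclasses implement."""

    def __init__(self, token: str):
        self.token = token

    def __call__(self, prepared_request):
        # Mutate the outgoing request in place, then hand it back.
        prepared_request.headers["Authorization"] = f"Bearer {self.token}"
        return prepared_request
```

Usage would just be `session.auth = BearerAuth("…")`; subclassing `requests.auth.AuthBase` instead is equivalent but makes the intent explicit.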
04/11/2022, 9:34 AM