Yokesh RS
06/15/2023, 10:06 AM
Patrick Elsen
06/15/2023, 12:23 PM
Patrick Elsen
06/15/2023, 12:49 PM
Andy Smith
06/15/2023, 1:00 PM
Full Refresh - Append
would seem appropriate. Sometimes, however, data for a given date (e.g. 7 days ago) is updated by the remote server, and we need to re-ingest that date, which would yield records with duplicate primary keys in the destination table for that date. The only other sync mode that seems close is Incremental - Dedupe with History. However, this requires a cursor, and because the re-ingested date could be well before the current cursor value, it is not clear to me whether the dedupe would work; it looks like the dedupe only applies to records at or above the last cursor value. How can we achieve this behaviour?
What we essentially need is for the dedupe to apply only to records with the given date (i.e. the newly ingested date).
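(A sketch of the behaviour being asked for, implemented as a post-load step rather than a built-in Airbyte sync mode; the id/date/_airbyte_emitted_at field names are illustrative assumptions:)

    # Date-scoped deduplication as a post-processing step (not an Airbyte
    # sync mode). Field names "id", "date", "_airbyte_emitted_at" are assumed.
    from itertools import groupby
    from operator import itemgetter

    def dedupe_for_date(records, reingested_date, pk="id"):
        """Within the re-ingested date only, keep the most recently emitted
        record per primary key; leave every other date untouched."""
        in_scope = [r for r in records if r["date"] == reingested_date]
        untouched = [r for r in records if r["date"] != reingested_date]
        in_scope.sort(key=itemgetter(pk, "_airbyte_emitted_at"))
        deduped = [list(grp)[-1] for _, grp in groupby(in_scope, key=itemgetter(pk))]
        return untouched + deduped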
Octavia Squidington III
06/15/2023, 1:45 PM
Tom Anderson
06/15/2023, 2:51 PM
[
  {
    "headers": [
      "trackingId",
      "timeStamp",
      "userId",
      "email",
      "userRole",
      "emailDomain",
      "userGroups",
      "dashboardTitle",
      "dashboardId",
      "dashboardPath",
      "action",
      "loadTime",
      "category",
      "str1",
      "str2",
      "int1",
      "int2"
    ],
    "values": [
      [
        "8abeffd5-af24-4d6f-b929-ac56d7319220",
        "2022-06-28T20:40:17",
        "obfuscated",
        "obfuscated",
        "Sys. Admin",
        "obfuscated",
        "Admins;",
        "N\\A",
        "N\\A",
        "N\\A",
        "page.navigate.analytics",
        "N\\A",
        "General",
        "N\\A",
        "N\\A",
        "N\\A",
        "N\\A"
      ],
      [
        "8abeffd5-af24-4d6f-b929-ac56d7319220",
        "2022-06-28T20:40:17",
        "obfuscated",
        "obfuscated",
        "Sys. Admin",
        "obfuscated",
        "Admins;",
        "N\\A",
        "N\\A",
        "N\\A",
        "page.navigate.analytics",
        "N\\A",
        "General",
        "N\\A",
        "N\\A",
        "N\\A",
        "N\\A"
      ]
    ]
  }
]
CDK Version 0.44.5
David Anderson
06/15/2023, 2:55 PM
{{ stream_partition.id }} notation logic? I want to end up with something like: /path/stream_partition1.id/path/stream_partition2.id
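(A guess at what is being asked: if two partition routers each contribute one field, both values can be interpolated into a single path. Purely illustrative Python, not the CDK's API; first_id and second_id are made-up field names:)

    # Hypothetical illustration of composing one request path from two
    # partition values; "first_id" and "second_id" are assumed names.
    def build_path(stream_partition: dict) -> str:
        return f"/path/{stream_partition['first_id']}/path/{stream_partition['second_id']}"

    print(build_path({"first_id": "123", "second_id": "456"}))  # /path/123/path/456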
Aazam Thakur
06/15/2023, 10:21 PM
ListMembers, which gets the list_id from the Lists class to use it in the URL lists/{list_id}/members:
class Lists(IncrementalMailChimpStream):
    cursor_field = "date_created"
    data_field = "lists"

    def path(self, **kwargs) -> str:
        return "lists"
Hoang Ho
06/16/2023, 8:42 AM
Janis Karimovs
06/16/2023, 12:14 PM
2023-06-16 11:35:10 normalization > Unhandled error while executing model.airbyte_utils.App_Name_2_0
Pickling client objects is explicitly not supported.
Clients have non-trivial state that is local and unpickleable.
From some of the research I've done, it seems the issue lies within the BigQuery connector, but I'm confused about why the other streams (Podio apps), which seem more or less the same as the one that's failing, are working just fine.
Anyone else encounter something similar? Any info on this would be highly appreciated... thanks 🙏
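(For context on that error: the quoted message comes from the google-cloud client library itself, which refuses to be pickled. A minimal reproduction, assuming google-cloud-bigquery is installed and default credentials are configured:)

    import pickle
    from google.cloud import bigquery

    # google-cloud clients hold live auth/network state, so the library
    # blocks pickling with exactly the error quoted above.
    client = bigquery.Client()
    try:
        pickle.dumps(client)
    except pickle.PicklingError as err:
        print(err)  # Pickling client objects is explicitly not supported. ...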
Anthony Smart
06/16/2023, 12:51 PM
Octavia Squidington III
06/16/2023, 7:45 PM
Aazam Thakur
06/16/2023, 11:32 PM
GetMemberInfo(authenticator=authenticator)
TypeError: Can't instantiate abstract class GetMemberInfo with abstract method data_field
"failure_type": "system_error"
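(That TypeError means the abstract attribute was never defined on the subclass. A guess at the fix, with the base class and field name assumed:)

    # data_field is declared abstract on the base stream, so the subclass
    # must define it before it can be instantiated. "members" is a
    # placeholder value, not the connector's actual field name.
    class GetMemberInfo(MailChimpStream):
        data_field = "members"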
Cody Scott
06/17/2023, 12:19 AM
Biondi Septian S
06/17/2023, 1:46 PM
Chính Bùi Quang
06/19/2023, 6:41 AM
Chidambara Ganapathy
06/19/2023, 9:05 AM
Luke Whittaker
06/19/2023, 10:37 AM
%Y-%m-%d....
laila ribke
06/19/2023, 11:32 AM
Anthony Smart
06/19/2023, 12:06 PM
Slackbot
06/19/2023, 5:47 PM
Octavia Squidington III
06/19/2023, 7:45 PM
Thomas van Latum
06/19/2023, 7:53 PM
Mahesh Thirunavukarasu
06/19/2023, 8:08 PM
Mahesh Thirunavukarasu
06/19/2023, 9:00 PM
Micky
06/19/2023, 9:22 PM
Luis Peña
06/20/2023, 12:28 AM
Quang Dang Vu Nhat
06/20/2023, 3:00 AM
merge_request_commits schema:
"approvals_before_merge": {
  "type": ["null", "boolean", "string", "object"]
},
or in the Pipedrive product_fields schema:
"options": {
  "type": ["null", "array"],
  "items": {
    "type": "object",
    "properties": {
      "id": {
        "type": ["null", "boolean", "integer"]
      },
      "label": {
        "type": ["null", "string"]
      }
    }
  }
}
I don’t see how they handle multiple data types in their code, apart from these declarations. Can anyone support me with this? 🙇
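(A note on what those declarations mean: a JSON Schema type array is a union, so any of the listed types is valid for the field; connectors typically don't branch on it in code, they just emit records and let validation and typing happen downstream. A minimal sketch with the jsonschema package, reusing the GitLab field above:)

    from jsonschema import validate

    schema = {
        "type": "object",
        "properties": {
            "approvals_before_merge": {
                "type": ["null", "boolean", "string", "object"]
            }
        },
    }

    # Each of these records is valid against the same declaration.
    for value in (None, True, "2", {"enabled": True}):
        validate(instance={"approvals_before_merge": value}, schema=schema)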