Kfir
11/19/2022, 6:00 PM
airbyte/source-google-workspace-admin-reports ?
It wasn't built for arm64 as the other sources were.
https://hub.docker.com/r/airbyte/source-google-workspace-admin-reports/tags
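While no arm64 build is published, one workaround is to run the amd64 image under emulation. This is a hedged sketch; the tag is a placeholder, pick a real one from the Docker Hub page above.
# Pull and smoke-test the amd64 image on an arm64 host via emulation
docker pull --platform linux/amd64 airbyte/source-google-workspace-admin-reports:<tag>
docker run --rm --platform linux/amd64 airbyte/source-google-workspace-admin-reports:<tag> spec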
Rytis Zolubas
11/19/2022, 6:01 PM
Venkat Dasari
11/20/2022, 5:50 AM
Rytis Zolubas
11/20/2022, 8:51 AM
Murat Cetink
11/20/2022, 7:40 PM
WorkerException: Could not find image: airbyte/destination-snowflake:0.40.40
error. I wonder if anyone else is getting the same error or it's just me.
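A hedged first check, not a confirmed fix: verify the tag the worker is asking for can actually be pulled on that machine, since 0.40.40 looks more like an Airbyte platform version than a destination-snowflake connector version.
# If this pull fails with "not found", re-select the destination's connector version in the UI
docker pull airbyte/destination-snowflake:0.40.40
docker images | grep destination-snowflake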
Hrvoje Piasevoli
11/20/2022, 9:16 PM
Nipuna Prashan
11/21/2022, 3:08 AM
Nipuna Prashan
11/21/2022, 3:10 AM
Rishabh D
11/21/2022, 8:42 AM
Michael Sonnleitner
11/21/2022, 9:13 AM
....
2022-11-20 10:21:55 source > Nov 20, 2022 10:21:55 AM com.github.shyiko.mysql.binlog.BinaryLogClient$5 run
2022-11-20 10:21:55 source > INFO: Keepalive: Trying to restore lost connection to database-server.com:3306
2022-11-20 10:25:16 source > Stopping the task and engine
2022-11-20 10:25:16 source > Stopping down connector
2022-11-20 10:26:46 source > Coordinator didn't stop in the expected time, shutting down executor now
2022-11-20 10:28:16 source > Connection gracefully closed
2022-11-20 10:28:16 source > Stopped FileOffsetBackingStore
2022-11-20 10:28:16 source > Debezium engine shutdown.
2022-11-20 10:28:16 source > The main thread is exiting while children non-daemon threads from a connector are still active.
Ideally, this situation should not happen...
Please check with maintainers if the connector or library code should safely clean up its threads before quitting instead.
The main thread is: main (RUNNABLE)
Thread stacktrace: java.base/java.lang.Thread.getStackTrace(Thread.java:1610)
at io.airbyte.integrations.base.IntegrationRunner.dumpThread(IntegrationRunner.java:334)
at io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:282)
at io.airbyte.integrations.base.IntegrationRunner.produceMessages(IntegrationRunner.java:219)
at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:141)
at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:100)
at io.airbyte.integrations.source.mysql.MySqlSource.main(MySqlSource.java:309)
2022-11-20 10:28:16 source > Active non-daemon thread: debezium-mysqlconnector-database-change-event-source-coordinator (TIMED_WAITING)
Thread stacktrace: java.base@17.0.4.1/jdk.internal.misc.Unsafe.park(Native Method)
....
We can already rule out an out-of-memory problem on our Airbyte server; at least according to the server's logs, there is no issue there.
Do any of you have an idea how we could solve this problem?
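One hedged avenue to rule out, assuming the lost keepalive connection above is a cause rather than a symptom: server-side MySQL timeouts that silently drop long-lived Debezium connections. Host and user below are placeholders.
# Compare these against how long the snapshot/binlog read actually runs
mysql -h database-server.com -u airbyte -p \
  -e "SHOW VARIABLES WHERE Variable_name IN ('wait_timeout','net_read_timeout','net_write_timeout','interactive_timeout');"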
Ivan Pilipchuk
11/21/2022, 10:24 AM
https://user-images.githubusercontent.com/51704301/203024551-59110a9c-75d2-47a0-a427-0c851d56dd83.png
https://user-images.githubusercontent.com/51704301/203024974-225dd6da-bce3-4b1c-ab1e-e583fe736fb0.png
Rahul Borse
11/21/2022, 11:13 AM
Mads Christensen
11/21/2022, 12:22 PM
non-json response error.
I use Google Analytics 4 as source and PostgreSQL as destination.
It shows up after 60 seconds, which indicates the default timeout. I tried to set it to 0 (infinity) in the JDBC string, but it doesn't work.
I see that many have had this problem, but has anyone ever found the solution?
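For reference, the Postgres JDBC driver has no single timeout parameter; if the 60-second cutoff really comes from the destination connection, the relevant driver options would go into the destination's JDBC URL params field. A sketch with assumed values:
# pgjdbc options, in seconds; socketTimeout=0 disables the read timeout
socketTimeout=0&connectTimeout=60&loginTimeout=60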
Agung Pratama
11/21/2022, 12:58 PM
octavia import all
and my goal is to check in all the manifests in the git repo, so my work colleagues can just run octavia apply to configure their local Airbyte.
However, when I tried, I got this exception:
🐙 - Octavia is targetting your Airbyte instance running at http://localhost:8000 on workspace 419064f1-2e08-480c-be95-69c932d2a463.
🐙 - TimescaleDB (tracking_db) exists on your Airbyte instance according to your state file, let's check if we need to update it!
😴 - Did not update because no change detected.
🐙 - MySQL (space_database) exists on your Airbyte instance according to your state file, let's check if we need to update it!
😴 - Did not update because no change detected.
🐙 - MySQL (user_info_database) exists on your Airbyte instance according to your state file, let's check if we need to update it!
😴 - Did not update because no change detected.
🐙 - MySQL (space_database) <> TimescaleDB (tracking_db) does not exists on your Airbyte instance, let's create it!
Traceback (most recent call last):
File "/usr/local/bin/octavia", line 8, in <module>
sys.exit(octavia())
File "/usr/local/lib/python3.9/site-packages/click/core.py", line 1128, in __call__
return self.main(*args, **kwargs)
File "/usr/local/lib/python3.9/site-packages/click/core.py", line 1053, in main
rv = self.invoke(ctx)
File "/usr/local/lib/python3.9/site-packages/click/core.py", line 1659, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/usr/local/lib/python3.9/site-packages/octavia_cli/base_commands.py", line 54, in invoke
raise e
File "/usr/local/lib/python3.9/site-packages/octavia_cli/base_commands.py", line 51, in invoke
result = super().invoke(ctx)
File "/usr/local/lib/python3.9/site-packages/click/core.py", line 1395, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/usr/local/lib/python3.9/site-packages/click/core.py", line 754, in invoke
return __callback(*args, **kwargs)
File "/usr/local/lib/python3.9/site-packages/click/decorators.py", line 26, in new_func
return f(get_current_context(), *args, **kwargs)
File "/usr/local/lib/python3.9/site-packages/octavia_cli/check_context.py", line 91, in wrapper
f(ctx, **kwargs)
File "/usr/local/lib/python3.9/site-packages/octavia_cli/apply/commands.py", line 29, in apply
apply_single_resource(resource, force)
File "/usr/local/lib/python3.9/site-packages/octavia_cli/apply/commands.py", line 66, in apply_single_resource
messages = create_resource(resource)
File "/usr/local/lib/python3.9/site-packages/octavia_cli/apply/commands.py", line 127, in create_resource
created_resource, state = resource.create()
File "/usr/local/lib/python3.9/site-packages/octavia_cli/apply/resources.py", line 696, in create
return self._create_or_update(self._create_fn, self.create_payload)
File "/usr/local/lib/python3.9/site-packages/octavia_cli/apply/resources.py", line 670, in create_payload
return WebBackendConnectionCreate(
File "/usr/local/lib/python3.9/site-packages/airbyte_api_client/model_utils.py", line 46, in wrapped_init
return fn(_self, *args, **kwargs)
File "/usr/local/lib/python3.9/site-packages/airbyte_api_client/model/web_backend_connection_create.py", line 345, in __init__
setattr(self, var_name, var_value)
File "/usr/local/lib/python3.9/site-packages/airbyte_api_client/model_utils.py", line 185, in __setattr__
self[attr] = value
File "/usr/local/lib/python3.9/site-packages/airbyte_api_client/model_utils.py", line 510, in __setitem__
self.set_attribute(name, value)
File "/usr/local/lib/python3.9/site-packages/airbyte_api_client/model_utils.py", line 157, in set_attribute
value = validate_and_convert_types(
File "/usr/local/lib/python3.9/site-packages/airbyte_api_client/model_utils.py", line 1582, in validate_and_convert_types
raise get_type_error(input_value, path_to_item, valid_classes,
airbyte_api_client.exceptions.ApiTypeError: Invalid type for variable 'geography'. Required value type is Geography and passed type was str at ['geography']
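A hedged workaround, assuming the generated connection configuration carries geography as a plain string that this client version cannot coerce: pin octavia-cli to the same version as the Airbyte instance, or drop the field before applying. The paths and image tag below follow the usual octavia project layout and are assumptions.
# Remove the geography key from generated connection configs (keeps a .bak copy)
sed -i.bak '/^  geography:/d' connections/*/configuration.yaml
# Or re-run apply with an octavia-cli image matching the Airbyte deployment
docker run --rm -v "$(pwd)":/home/octavia-project --network host airbyte/octavia-cli:<airbyte-version> apply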
Krzysztof
11/21/2022, 2:12 PM
Krzysztof
11/21/2022, 2:12 PM
Krzysztof
11/21/2022, 2:14 PM
Regitze Sdun
11/21/2022, 2:36 PM
The form is invalid. Please make sure that all fields are correct.
Any idea what I'm doing wrong?
JP
11/21/2022, 2:36 PM
Exception attempting to access the Gcs bucket
Stack Trace: com.amazonaws.services.s3.model.AmazonS3Exception: Access denied.
The service account currently has:
"storage.multipartUploads.abort",
"storage.multipartUploads.create",
"storage.multipartUploads.list",
"storage.multipartUploads.listParts",
"storage.buckets.get",
"storage.buckets.create",
"storage.buckets.getIamPolicy",
"storage.buckets.list",
"storage.objects.create",
"storage.objects.get",
"storage.objects.getIamPolicy",
"storage.objects.list",
"bigquery.config.get",
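One gap worth ruling out (hedged; exact requirements depend on the destination version): the list above has no storage.objects.delete, which the staging cleanup step typically needs, and the HMAC key used for the S3-compatible path must belong to the same service account that holds these permissions. A sketch using a predefined role; project and service account names are placeholders.
# Grant create/get/list/delete on objects to the staging service account
gcloud projects add-iam-policy-binding my-project \
  --member="serviceAccount:airbyte-staging@my-project.iam.gserviceaccount.com" \
  --role="roles/storage.objectAdmin"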
Hiep Minh Pham
11/21/2022, 2:44 PM
Full refresh | Overwrite
and I changed it to Incremental | Append
in the UI (I did not choose to reset all streams, as I need to keep historical data). However, the connector still gets full data rather than running an incremental sync. Is this a bug?
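A hedged note rather than a confirmed answer: after switching from Full refresh | Overwrite without a reset, the connection has no saved cursor yet, so the first run can legitimately read everything once before later runs turn incremental. One way to confirm is to inspect the saved state via the API; the endpoint and placeholder ID are assumptions about this Airbyte version.
# Empty per-stream state here would explain a full first read
curl -s -X POST http://localhost:8000/api/v1/state/get \
  -H "Content-Type: application/json" \
  -d '{"connectionId": "<connection-id>"}'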
Joviano Cicero Costa Junior
11/21/2022, 3:32 PM
Cesar Santos
11/21/2022, 6:20 PM
Sapin Dahal
11/21/2022, 7:47 PM
{"type": "LOG", "log": {"level": "ERROR", "message": "Check failed"}}
{"type": "CONNECTION_STATUS", "connectionStatus": {"status": "FAILED", "message": "'Unable to connect to stream action - '"}}
But when I run python main.py check --debug --config secrets/config.json
I get a response and a connection success message:
{"type": "LOG", "log": {"level": "INFO", "message": "Check succeeded"}}
{"type": "CONNECTION_STATUS", "connectionStatus": {"status": "SUCCEEDED"}}
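When the local Python check passes but the platform check fails, a common cause is that Airbyte is running a stale or different image of the connector. A sketch of rebuilding and testing the container directly; the image name is a placeholder for your connector.
# Rebuild the dev image and run the same check inside the container
docker build . -t airbyte/source-custom:dev
docker run --rm -v "$(pwd)/secrets:/secrets" airbyte/source-custom:dev check --config /secrets/config.json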
Nelson Rafael Perez
11/21/2022, 10:40 PM
"The datastore operation timed out, or the data was temporarily unavailable."
The BigQuery table has 2,240,000 records and each record has 30 fields; it is around 550 MB. I know that is a lot of data, but is there something I can do from the Airbyte side to avoid this situation?
Adrian Bakula
11/21/2022, 11:15 PM
0.40.18
. Noticing that /v1/connections/search doesn't return results anymore, though it was working fine in 0.40.14. Anyone else experiencing this? Seems like no search parameters work.
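For anyone trying to reproduce this, a minimal call against the config API; this is a sketch, and whether an empty search body is accepted (or which filters are) may differ by version.
# A healthy endpoint should return the workspace's connections for a broad search
curl -s -X POST http://localhost:8000/api/v1/connections/search \
  -H "Content-Type: application/json" \
  -d '{}'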
Mohammad Abu Aboud
11/22/2022, 1:27 AM
Hao Kuang
11/22/2022, 2:05 AM
MAX_ITERATION_VALUE
configurable? We do sometimes see init:Error when the source pod gets created, due to the 1-minute hard-coded timeout?
PALLAPOTHU MANOJ SAI KUMAR
11/22/2022, 6:04 AM
cost_per_conversion
column (given full permission of data to sync). Could anyone please let us know how to get it?
Image version of the source Facebook Marketing connector - 0.2.72
Faris
11/22/2022, 7:03 AM
Sync Failed
Last attempt: 67.92 MB | 513,973 emitted records | no records | 2m 44s
Failure Origin: source, Message: Something went wrong in the connector. See the logs for more details.
2022-11-21 14:26:40 - Additional Failure Information: java.lang.RuntimeException: org.postgresql.util.PSQLException: FATAL: terminating connection due to conflict with recovery Detail: User query might have needed to see row versions that must be removed. Hint: In a moment you should be able to reconnect to the database and repeat your command.
source connector version 1.0.22
destination connector version 0.3.26
@Nataly Merezhuk (Airbyte) any direction on why this error happens? (I posted this question long ago but couldn't get any clues to solve it.)
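That error is the standard Postgres hot-standby conflict: the sync reads from a replica while the primary vacuums away row versions the query still needs. If the source is indeed a replica, the usual knobs live on the replica side; the values below are examples to discuss with a DBA, not recommendations.
# postgresql.conf on the standby/replica
hot_standby_feedback = on            # ask the primary to keep row versions the replica still reads
max_standby_streaming_delay = 900s   # let long-running queries delay WAL replay this long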
Rahul Borse
11/22/2022, 7:19 AM