Developer Crunchy
10/10/2022, 7:26 AM

Georg Heiler
10/10/2022, 8:00 AM
403 Client Error: Forbidden for url: https://analyticsdata.googleapis.com/v1beta/properties/<my_property>/metadata. Is this a bug in the GA4 connector (alpha) or a permission/networking issue on my side?
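A quick way to isolate this is to call the same metadata endpoint directly with the service account the connector uses. A minimal sketch, assuming a service-account JSON key; the key file name and property ID below are placeholders:

```python
import requests
import google.auth.transport.requests
from google.oauth2 import service_account

# Obtain an access token for the Analytics read-only scope, then call
# the same GA4 Data API metadata endpoint the connector hits.
creds = service_account.Credentials.from_service_account_file(
    "service_account.json",  # placeholder: path to your key file
    scopes=["https://www.googleapis.com/auth/analytics.readonly"],
)
creds.refresh(google.auth.transport.requests.Request())

resp = requests.get(
    "https://analyticsdata.googleapis.com/v1beta/properties/123456789/metadata",
    headers={"Authorization": f"Bearer {creds.token}"},
)
print(resp.status_code, resp.text[:300])
```

If this also returns 403, the service account likely lacks access to the GA4 property (it usually needs to be added as a user on the property), which would rule out a connector bug.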
Nicola Corda
10/10/2022, 8:52 AM

Asmaa Althakafi
10/10/2022, 12:23 PM

Qamarudeen Muhammad
10/10/2022, 12:35 PM

Sheshan
10/10/2022, 1:22 PM

Rocky Appiah
10/10/2022, 1:35 PM

Luca Moity
10/10/2022, 1:44 PM
Could not find a version that satisfies the requirement airbyte-cdk~=0.1.56
My setup:
- Ubuntu 20.04
- Python 3.9.1
I looked for similar posts and did find a few, but none of the suggestions helped.
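Before digging into pip flags, it can help to confirm the pinned version exists on PyPI at all. A minimal check via PyPI's JSON API:

```python
import requests

# List released airbyte-cdk versions matching the pin; if 0.1.56 is
# present, the resolver failure is local (old pip, custom index, or an
# unsupported Python) rather than a missing release.
releases = requests.get("https://pypi.org/pypi/airbyte-cdk/json").json()["releases"]
print(sorted(v for v in releases if v.startswith("0.1.5")))
```

If the version is listed, upgrading pip (python -m pip install --upgrade pip) inside a fresh virtualenv is a cheap first thing to try.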
Tony Lewis
10/10/2022, 1:45 PM

Nicola Corda
10/10/2022, 2:02 PM

Samantha Duggan
10/10/2022, 6:39 PM
Incremental | deduped + history is "succeeding" but finding "no records" on an initial sync from an Aurora Postgres database into Snowflake. I was able to do an initial sync using the same source/destination and the same replication settings for Full refresh and Incremental | append. My cursor is an updated_at timestamp column and the table I'm testing has a primary key.
Any initial ideas on why the "deduped + history" mode might not be emitting any records?
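One way to narrow this down is to check whether any records reached the destination's raw table at all; if they did, the source read worked and the gap is in the normalization/dedup step. A sketch, assuming the snowflake-connector-python package; connection parameters and the raw table name are placeholders:

```python
import snowflake.connector  # pip install snowflake-connector-python

# Count rows in the raw table Airbyte writes before normalization runs.
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",
    warehouse="my_wh", database="my_db", schema="my_schema",
)
cur = conn.cursor()
cur.execute('SELECT COUNT(*) FROM "_AIRBYTE_RAW_MY_TABLE"')  # placeholder name
print(cur.fetchone()[0])
```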
Grant Pendrey
10/10/2022, 7:28 PM

Pradyumna Thakur Deshmukh
10/10/2022, 1:35 PM

Marcelo Pio de Castro
10/11/2022, 12:10 AM

Nathan Chan
10/11/2022, 12:46 AM
I set up the .env file to point to the directory where I place application_default_credentials.json, configured the volume mapping in docker-compose.yaml, and made sure the GOOGLE_APPLICATION_CREDENTIALS environment variable is accessible in both the worker and webserver containers. But in the UI I still get this error: java.io.IOException: The Application Default Credentials are not available. They are available if running in Google Compute Engine. Otherwise, the environment variable GOOGLE_APPLICATION_CREDENTIALS must be defined pointing to a file defining the credentials. See https://developers.google.com/accounts/docs/application-default-credentials for more information.
Did I miss something?
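For reference, a minimal check that can be run inside the worker container (e.g. via docker exec) to confirm that Application Default Credentials actually resolve in that environment:

```python
import google.auth

# Raises DefaultCredentialsError with the same message as above if the
# env var or the mounted key file is not visible inside the container.
credentials, project = google.auth.default()
print(project, type(credentials).__name__)
```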
Matt Webster
10/11/2022, 1:53 AM
State code: 08S01; Message: Communications link failure. The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
There's an Airbyte worker message:
airbyte-worker | 2022-10-11 01:47:42 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):109 - HikariPool-1 - Start completed.
And 60 seconds later
airbyte-worker | 2022-10-11 01:48:42 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):109 - HikariPool-1 - Shutdown initiated...
I'm using the latest server (0.40.14) and the latest connector (airbyte/source-mysql:1.0.3). I'm connecting to MySQL 5.6 in AWS Aurora.
I tried the enabledTLSProtocols=TLSv1.2 JDBC parameter even though I'm not seeing the poolable connection error.
I've tested my credentials connecting directly to MySQL both from inside my VPC and externally through an SSH tunnel, and it works fine everywhere other than from an Airbyte server.
Any other ideas for me to try? Any other ways to troubleshoot?
Thanks in advance for the help!
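One more thing to try: reproduce the connection at the driver level from the same network context as the worker (e.g. inside the airbyte-worker container). A sketch using PyMySQL; host and credentials are placeholders:

```python
import pymysql  # pip install pymysql

# A plain connect plus SELECT 1 exercises DNS, routing, and auth the
# same way the connector's JDBC driver does.
conn = pymysql.connect(
    host="my-cluster.cluster-xxxx.us-east-1.rds.amazonaws.com",
    user="airbyte",
    password="...",
    connect_timeout=10,
)
with conn.cursor() as cur:
    cur.execute("SELECT 1")
    print(cur.fetchone())
```

If this works from inside the container, the TLS handshake between the newer JDBC driver and MySQL 5.6 becomes the prime suspect; if it hangs, it's networking.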
Nathan Chan
10/11/2022, 3:28 AM
Response Code: 429, Response Text: Pagination limit reached for offset model. Is this something we need to configure on the Zendesk side, or can we somehow set the pagination limit in Airbyte? Thanks!
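For context, Zendesk caps offset-based pagination on its side, so the usual fix is cursor-based pagination rather than raising a limit. A hedged sketch of the cursor style against the Zendesk API (subdomain and credentials are placeholders):

```python
import requests

# Cursor pagination: request pages via page[size] and follow links.next
# until meta.has_more is false. This avoids the offset-model cap.
url = "https://mycompany.zendesk.com/api/v2/tickets.json?page[size]=100"
auth = ("user@example.com/token", "MY_API_TOKEN")
while url:
    data = requests.get(url, auth=auth).json()
    for ticket in data["tickets"]:
        print(ticket["id"])
    url = data["links"]["next"] if data["meta"]["has_more"] else None
```

Whether the connector version in use already does this is worth checking in its changelog; if it still uses offset pagination, upgrading may resolve the 429.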
Keshav Agarwal
10/11/2022, 5:03 AM

Zachary Damcevski
10/11/2022, 5:11 AM

Aditya Shelke
10/11/2022, 6:38 AM

Mark Elayan
10/11/2022, 7:10 AM
airbyte-worker | 2022-10-11 07:02:14 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/1/0/normalize --log-driver none --name normalization-normalize-1-0-jmoyq --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.40.14 airbyte/normalization:0.2.22 run --integration-type postgres --config destination_config.json --catalog destination_catalog.json
airbyte-worker | 2022-10-11 07:02:16 normalization > [FATAL tini (7)] exec /airbyte/entrypoint.sh failed: Exec format error
airbyte-worker | 2022-10-11 07:02:16 INFO i.a.w.g.DefaultNormalizationWorker(run):82 - Normalization executed in 36 seconds.
airbyte-worker | 2022-10-11 07:02:16 INFO i.a.w.t.TemporalAttemptExecution(lambda$getWorkerThread$2):164 - Completing future exceptionally...
airbyte-worker | io.airbyte.workers.exception.WorkerException: Normalization Failed.
airbyte-worker | at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:92) ~[io.airbyte-airbyte-workers-0.40.14.jar:?]
airbyte-worker | at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:27) ~[io.airbyte-airbyte-workers-0.40.14.jar:?]
airbyte-worker | at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:161) ~[io.airbyte-airbyte-workers-0.40.14.jar:?]
airbyte-worker | at java.lang.Thread.run(Thread.java:833) ~[?:?]
I ran the same installation on my localhost (Apple M1) and everything worked fine with good results.
Used the below for deployments:
- git clone https://github.com/airbytehq/airbyte.git
- cd airbyte
- docker-compose up
Appreciate any tips if possible.
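"Exec format error" usually means the image's CPU architecture does not match the host's (e.g. an arm64 image on an amd64 machine, or vice versa). A small sketch with the Docker SDK for Python to compare the two, using the normalization image from the log above:

```python
import docker  # pip install docker

# Compare the pulled image's architecture with the Docker host's.
client = docker.from_env()
image = client.images.get("airbyte/normalization:0.2.22")
print("image:", image.attrs["Architecture"])
print("host :", client.info()["Architecture"])
```

A mismatch would also explain why the identical setup works on the Apple M1 laptop but not on the deployment host.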
Rachel RIZK
10/11/2022, 7:22 AM

高松拳人
10/11/2022, 8:11 AM
Internal Server Error: The specified bucket does not exist.
However, the specified bucket does in fact exist, and a job-logging/workspace path has actually been created in it.
What is wrong?
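Worth noting: GCS reports "bucket does not exist" both when the bucket is genuinely missing and when the supplied credentials cannot see it. A minimal visibility check with the same service-account key Airbyte uses (file and bucket names are placeholders):

```python
from google.cloud import storage  # pip install google-cloud-storage

# True only if the bucket exists AND these credentials can see it.
client = storage.Client.from_service_account_json("gcs_credentials.json")
print(client.bucket("my-airbyte-log-bucket").exists())
```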
Ahmad Zamrik
10/11/2022, 9:02 AM

Selman Ay
10/11/2022, 9:14 AM

Rahul Borse
10/11/2022, 10:45 AM

Kazi Khayruzzaman
10/11/2022, 10:19 AM
Just before moving on to https://docs.airbyte.com/connector-development/config-based/tutorial/incremental-reads#supporting-incremental-syncs, we are checking whether the stream_slicer is working or not.
I run this command:
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
But I am getting this error: [ATTACHED THE LOG IN A FILE]
Can anyone help, please?
Donk
10/11/2022, 11:35 AM

Nikita Kogut
10/11/2022, 12:09 PM
Do sourceDefinitionIds differ from machine to machine for built-in connectors like S3 / Postgres? I need to hardcode that value in an automated script, and I'm not sure if it will be different on another machine, which would require changes to be made.
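A safer alternative to hardcoding is to look the ID up by connector name at runtime through the Airbyte Configuration API. A sketch assuming a local deployment on the default port (add basic auth if your proxy requires it):

```python
import requests

# Fetch all source definitions and pick the one named "Postgres".
resp = requests.post(
    "http://localhost:8000/api/v1/source_definitions/list", json={}
)
defs = resp.json()["sourceDefinitions"]
print(next(d["sourceDefinitionId"] for d in defs if d["name"] == "Postgres"))
```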
Florian Melki
10/11/2022, 1:27 PM