RajeevKumar Garikapati
08/23/2024, 5:05 AM
Configuration check failed
State code: 08S01; Message: Communications link failure The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
Support for the 'rds-ca-2019' certificate ended today, so we have been seeing this error since 10 AM this morning.
Certificate authority Info: rds-ca-rsa2048-g1
Certificate authority date: May 25, 2061, 04:29 (UTC+05:30)
Old CA info: On August 22, 2024 - 'rds-ca-2019' will expire. You will need to take action before August 22, 2024
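The dates quoted above explain the timing: a minimal sketch (using only the expiry dates from this message, with the new CA's timestamp converted from UTC+05:30 to UTC) of why connections started failing right after August 22, 2024:

```python
# Sketch: compare the two CA expiry dates from the message against "now"
# to confirm why connections began failing after 2024-08-22.
from datetime import datetime, timezone

def cert_expired(not_after: datetime, now: datetime) -> bool:
    """Return True if the certificate's notAfter date is in the past."""
    return now >= not_after

now = datetime(2024, 8, 23, tzinfo=timezone.utc)             # date of this thread
old_ca = datetime(2024, 8, 22, tzinfo=timezone.utc)          # 'rds-ca-2019' expiry
new_ca = datetime(2061, 5, 24, 22, 59, tzinfo=timezone.utc)  # 'rds-ca-rsa2048-g1' expiry (UTC)

print(cert_expired(old_ca, now))  # True  - rds-ca-2019 has expired
print(cert_expired(new_ca, now))  # False - rds-ca-rsa2048-g1 is still valid
```

Since the instance already reports `rds-ca-rsa2048-g1`, the client side usually also needs its trust store updated to the new AWS RDS CA bundle before JDBC connections succeed again.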
Can someone help us with this?
k monish
08/23/2024, 5:59 AM
user
08/23/2024, 8:38 AM
Youngsam Roh
08/23/2024, 9:42 AM
Leo Giroldo
08/23/2024, 10:42 AM
Christopher Daniel
08/23/2024, 12:02 PM
1.0.0 to 0.2.5 to support normalization.
If there is a way to achieve it without changing the proxy, I am okay with that too.
Please assist.
Hassan Razzaq
08/23/2024, 12:02 PM
The credentials in the values.yaml file work fine with port forwarding, but they do not work when using the LoadBalancer.
I would greatly appreciate any assistance or insights you can provide to help resolve this issue. If there are specific configurations or steps that I might have overlooked, or if there are best practices for ensuring that credentials work properly with a LoadBalancer, your guidance would be invaluable.
Thank you very much for your support!
Best regards,
Hassan
user
08/23/2024, 12:05 PM
Julie Choong
08/23/2024, 2:18 PM
I ran helm install again, but this time with the values.yaml file containing the Postgres credentials. However, this time my local port 8080 returns a 502 Bad Gateway error.
I need some assistance debugging why this is happening, and how I can gracefully migrate all my data from Docker Compose to Kubernetes.
Himanshu Dube
08/23/2024, 4:28 PM
user
08/24/2024, 2:22 PM
INFO Using Kubernetes provider:
Provider: kind
Kubeconfig: /home/airbyteVM/.airbyte/abctl/abctl.kubeconfig
Context: kind-airbyte-abctl
ERROR Unable to determine organization email
ERROR unable to determine organization email: failed to get organization: unable to fetch token: unable to decode token request: invalid character '<' looking for beginning of value
I can access Airbyte in the browser, and I have checked the email in the database, but I still can't fetch the password. When I use localhost and try to access it through the browser, I get a 404 Nginx error.
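The "invalid character '<' looking for beginning of value" error means abctl received an HTML page (for example that nginx 404 page) where it expected a JSON token response. A small sketch of that failure mode, with a hypothetical decode helper (not abctl's actual code, which is Go):

```python
import json

def decode_token_response(body: str) -> dict:
    """Hypothetical helper: fail with a clearer hint when the auth endpoint
    returns an HTML error page (e.g. an nginx 404/502) instead of JSON."""
    text = body.lstrip()
    if text.startswith("<"):
        # This is the situation behind abctl's "invalid character '<'" error.
        raise ValueError(
            "got HTML instead of JSON - likely an nginx error page; check the "
            "host/port abctl uses to reach the Airbyte API"
        )
    return json.loads(text)

print(decode_token_response('{"access_token": "abc"}'))
```

So the 404 in the browser and the abctl error are probably the same problem: requests are landing on nginx's error page rather than on the Airbyte API.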
airbytehq/airbyte
Hassan Razzaq
08/24/2024, 2:28 PM
Ishan Anilbhai Koradiya
08/24/2024, 4:22 PM
Patrick Blank Cassol
08/25/2024, 4:01 AM
Rabea Yousef
08/25/2024, 9:20 AM
Hassan Razzaq
08/25/2024, 2:53 PM
I am getting this error: {"status":422,"type":"<https://reference.airbyte.com/reference/errors#unprocessable-entity>","title":"unprocessable-entity","detail":"The body of the request was not understood","documentationUrl":null,"data":{"message":"json schema validation failed when comparing the data to the json schema. \nErrors: $: required property 'api_key' not found, $: required property 'url' not found "}}
This is the URL I am using: url = "http:localhost:8000/api/public/v1/sources"
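Two things stand out in this message: the 422 body says the connector's required configuration properties `api_key` and `url` are missing, and the pasted endpoint is missing the `//` after `http:`. A hedged sketch of a corrected request (the field names come from the error message; the workspace and configuration values are placeholders):

```python
import json

# Endpoint from the message, with the missing "//" restored:
url = "http://localhost:8000/api/public/v1/sources"

# The 422 lists the missing required properties: 'api_key' and 'url'.
# They belong inside the source's configuration object (values are placeholders).
payload = {
    "name": "my-source",                   # placeholder
    "workspaceId": "<workspace-uuid>",     # placeholder
    "configuration": {
        "api_key": "<secret>",             # required by the connector's schema
        "url": "https://example.com/api",  # required by the connector's schema
    },
}

body = json.dumps(payload)  # send as the JSON request body
print(url.startswith("http://"))  # True - scheme now has its slashes
```

With both properties present in the configuration object and a well-formed URL, the schema validation that produced the 422 should pass.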
Aldo Orozco
08/25/2024, 11:24 PM
Set up destination
Could not connect with provided SSH configuration. Error: getSQLState(...) must not be null
I'm using no SSH tunnel (screenshot). However, when I look at the logs, I see this error
WARN main c.z.h.u.DriverDataSource(<init>):68 Registered driver with driverClassName=com.amazon.redshift.jdbc.Driver was not found, trying direct instantiation.
I wonder if I need to configure anything in particular in values.yaml so that the driver gets installed.
cc - @Bryce Groff 🙏
Jayant Kumar
08/26/2024, 6:41 AM
Yannik Voß
08/26/2024, 8:40 AM
L Theisen
08/26/2024, 9:34 AM
Adam COHEN
08/26/2024, 10:29 AM
ab_cdc_deleted_at to be active.
In the target BigQuery database, all records have the field ab_cdc_deleted_at set to null.
Thank you for your time and help!
Alkis
08/26/2024, 10:50 AM
airbyte_internal schema. It doesn’t seem to be unpacking into the expected tables in the actual schema.
I’m using the default setup without any custom transformations or dbt, so I’m not sure what might be causing this. Has anyone else experienced this or know how to ensure the data is correctly unpacked into the final destination schema?
Thanks for any help you can provide!
Stockton Fisher
08/26/2024, 11:09 AM
> message='io.airbyte.workers.exception.WorkloadMonitorException: Airbyte could not track the sync progress. No heartbeat within the time limit indicates the process might have died.', type='java.lang.RuntimeException', nonRetryable=false
Causing a sync to repeatedly fail.
Sourav Sikka
08/26/2024, 11:50 AM
2024-08-26 15:34:23 airbyte-db | 2024-08-26 10:04:23.022 UTC [746] FATAL: role "airflow" does not exist
2024-08-26 15:34:28 airflow-webserver-1 | 127.0.0.1 - - [26/Aug/2024:10:04:28 +0000] "GET /health HTTP/1.1" 200 318 "-" "curl/7.88.1"
2024-08-26 15:34:33 airbyte-db | 2024-08-26 10:04:33.078 UTC [760] FATAL: role "airflow" does not exist
2024-08-26 15:34:43 airbyte-db | 2024-08-26 10:04:43.154 UTC [771] FATAL: role "airflow" does not exist
2024-08-26 15:34:53 airbyte-db | 2024-08-26 10:04:53.206 UTC [780] FATAL: role "airflow" does not exist
2024-08-26 15:34:58 airflow-webserver-1 | 127.0.0.1 - - [26/Aug/2024:10:04:58 +0000] "GET /health HTTP/1.1" 200 318 "-" "curl/7.88.1"
2024-08-26 15:35:03 airbyte-db | 2024-08-26 10:05:03.250 UTC [793] FATAL: role "airflow" does not exist
2024-08-26 15:35:07 airflow-triggerer-1 | [2024-08-26T10:05:07.694+0000] {triggerer_job_runner.py:481} INFO - 0 triggers currently running
2024-08-26 15:35:13 airbyte-db | 2024-08-26 10:05:13.320 UTC [806] FATAL: role "airflow" does not exist
2024-08-26 15:35:16 airflow-scheduler-1 | [2024-08-26T10:05:16.338+0000] {scheduler_job_runner.py:1608} INFO - Adopting or resetting orphaned tasks for active dag runs
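The repeated FATAL lines show Airflow's health checks hitting the Airbyte Postgres container (`airbyte-db`), which has no `airflow` role. One hedged fix sketch is creating that role and database in the shared Postgres; the container name, superuser, and password below are assumptions to match against your docker-compose setup:

```python
import shlex

# SQL to create the missing role and its metadata database
# (role name comes from the error; the password is a placeholder):
sql = (
    "CREATE ROLE airflow WITH LOGIN PASSWORD 'airflow'; "
    "CREATE DATABASE airflow OWNER airflow;"
)

# Run it inside the Postgres container via psql. "airbyte-db" and the
# superuser "docker" are assumptions - substitute your actual values.
cmd = ["docker", "exec", "airbyte-db", "psql", "-U", "docker", "-c", sql]
print(" ".join(shlex.quote(part) for part in cmd))
```

Alternatively, and usually more cleanly, point Airflow's database connection at its own Postgres instance instead of sharing Airbyte's.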
user
08/26/2024, 11:52 AM
Luis Echegaray
08/26/2024, 12:59 PM
Thomas Vannier
08/26/2024, 1:56 PM
_airbyte_data column in my Databricks destination.
From what I see in the docs (https://docs.airbyte.com/integrations/destinations/databricks), I should also get the tabular data in the Databricks SQL warehouse, no?
I also see that the "Normalization and Transformation operations are not supported for this connection" from the connection setup.
Thanks for your time and help 🙂
GUANGYU QU
08/26/2024, 3:36 PM
user
08/26/2024, 4:58 PM