# ask-community-for-troubleshooting

    RajeevKumar Garikapati

    08/23/2024, 5:05 AM
    Hey Team, all of a sudden today our Airbyte server is giving the error below:
    ```
    Configuration check failed
    State code: 08S01; Message: Communications link failure The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
    ```
    Support for the 'rds-ca-2019' certificate ended today, so we have been seeing this error since 10 AM this morning.
    ```
    Certificate authority info: rds-ca-rsa2048-g1
    Certificate authority date: May 25, 2061, 04:29 (UTC+05:30)
    ```
    Old CA notice: "On August 22, 2024, 'rds-ca-2019' will expire. You will need to take action before August 22, 2024."
    Can someone help us with this?
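    A hedged sketch of one common remediation, assuming the instance still needs rotating to the new authority and the connector's JVM reads the default truststore (the instance identifier, paths, and alias prefix are illustrative placeholders):
    ```bash
    # Rotate the RDS instance to the new CA named in the message above.
    aws rds modify-db-instance \
      --db-instance-identifier my-db-instance \
      --ca-certificate-identifier rds-ca-rsa2048-g1 \
      --apply-immediately

    # Refresh the truststore the JDBC driver reads. keytool imports only one
    # certificate per file, so split AWS's combined bundle first.
    curl -fsSL https://truststore.pki.rds.amazonaws.com/global/global-bundle.pem \
      -o global-bundle.pem
    csplit -z -f rds-ca- global-bundle.pem '/BEGIN CERTIFICATE/' '{*}'
    for cert in rds-ca-*; do
      keytool -importcert -noprompt -alias "$cert" -file "$cert" \
        -keystore "$JAVA_HOME/lib/security/cacerts" -storepass changeit
    done
    ```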

    k monish

    08/23/2024, 5:59 AM
    Hi Airbyte team, I just wanted to clarify my understanding with you. I am trying to install Airbyte using abctl on an EC2 instance that does not have an FQDN:
    1. The host parameter expects an FQDN so that Airbyte can be accessed from outside. I previously tried installing with an IP address, but it said IP addresses are not allowed as the host. So to install Airbyte and access it from outside, I will need an FQDN. Is this correct? (See the sketch after this list.) This is my understanding from the doc (https://docs.airbyte.com/using-airbyte/getting-started/oss-quickstart#using-an-ec2-instance-with-abctl).
    2. I get this error when I try installing:
       a. ERROR Cluster 'airbyte-abctl' could not be created
       b. ERROR unable to create kind cluster: failed to generate kubeadm config content: failed to get kubernetes version from node: failed to get file: command "docker exec --privileged airbyte-abctl-control-plane cat /kind/version" failed with error: exit status 126
    3. Can I do a native installation of Airbyte without Docker, if I have a dedicated server just for Airbyte?
    It would really be helpful if the above are answered.
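    On points 1 and 2, a hedged sketch of what the install looks like once a DNS name points at the instance, plus a retry after clearing a half-created kind cluster (the domain is a placeholder, and the uninstall step is a common remedy rather than a verified fix for the exit-status-126 error):
    ```bash
    # If a previous attempt left a broken kind cluster behind, clear it first.
    abctl local uninstall

    # Assumes a DNS A record (airbyte.example.com) resolves to this EC2 host.
    abctl local install --host airbyte.example.com
    ```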

    user

    08/23/2024, 8:38 AM
    #44582 Error in Source Connector "YouTube Analytics": "The YouTube account is not valid." New discussion created by smdraeger: Hi, we have a problem with the Airbyte open source connector 'YouTube Analytics' (https://docs.airbyte.com/integrations/sources/youtube-analytics). We followed the instructions in the documentation exactly and receive the error message "The YouTube account is not valid. Please make sure you're trying to use the active YouTube Account connected to your Google Account." (see screenshot). We made sure that within the YouTube channel the Google account has 'Manager' rights, the same Google account that generated the refresh_token. The YouTube account is active. The scope within the Google API configuration has been double-checked. Do you have any ideas how to fix this? Thanks. airbytehq/airbyte
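    One way to narrow this down is to check which channel the connector's credentials can actually see. A hedged sketch against the YouTube Data API, assuming $ACCESS_TOKEN was minted from the same client ID/secret and refresh token given to the connector:
    ```bash
    # Lists the channel(s) visible to the authorized account; an empty "items"
    # array usually means this Google account owns no channel of its own.
    curl -s "https://www.googleapis.com/youtube/v3/channels?part=id,snippet&mine=true" \
      -H "Authorization: Bearer $ACCESS_TOKEN"
    ```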

    Youngsam Roh

    08/23/2024, 9:42 AM
    Hello. I am just starting to explore Airbyte. Does Airbyte only support replication through transaction logs (i.e., CDC)? I am looking to perform full replication between various heterogeneous DBMSs, such as MySQL or Oracle, on a scheduled basis, or to carry out one-time data transfers on demand. Would Airbyte be an appropriate solution for this use case, or is it not suitable?

    Leo Giroldo

    08/23/2024, 10:42 AM
    Hello, I am trying to create a SharePoint source using service key authentication, but I’m unsure what to enter in the ‘User Principal Name’ field. Could someone please guide me on what needs to be provided here?

    Christopher Daniel

    08/23/2024, 12:02 PM
    Hi Team, I would like to add a proxy to Airbyte Open Source inside a Linux VM. I tried adding the proxy to:
    • docker-compose.yaml file [not working]
    • /etc/environment [not working]
    • /etc/systemd/system/docker.service.d/http-proxy.conf [not working]
    • JAVA_OPTS variable [not working]
    • JAVA_TOOL_OPTIONS variable [not working]
    Can someone please advise me where I should add the proxy for Airbyte to accept it? Why do I need a proxy? I wanted to change the ClickHouse connector version from 1.0.0 to 0.2.5 to support normalization. If there is a way to achieve that without changing the proxy, I am okay with that too. Please assist.
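    For the docker compose route, a minimal sketch of one approach: passing JVM proxy flags into the Airbyte containers through a compose override file (the proxy address and the service names are assumptions to check against your docker-compose.yaml):
    ```bash
    cat > docker-compose.override.yaml <<'EOF'
    services:
      server:
        environment:
          JAVA_TOOL_OPTIONS: "-Dhttp.proxyHost=proxy.internal -Dhttp.proxyPort=3128 -Dhttps.proxyHost=proxy.internal -Dhttps.proxyPort=3128"
      worker:
        environment:
          JAVA_TOOL_OPTIONS: "-Dhttp.proxyHost=proxy.internal -Dhttp.proxyPort=3128 -Dhttps.proxyHost=proxy.internal -Dhttps.proxyPort=3128"
    EOF
    # compose merges the override on top of docker-compose.yaml
    docker compose up -d
    ```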

    Hassan Razzaq

    08/23/2024, 12:02 PM
    Hello everyone, I hope this message finds you well. I am currently working on deploying Airbyte on Azure Kubernetes Service (AKS). I have successfully deployed Airbyte, and I can access it using port forwarding. However, I am facing an issue with authentication when accessing Airbyte through the LoadBalancer. Specifically, while the credentials configured in my values.yaml file work fine with port forwarding, they do not work when using the LoadBalancer. I would greatly appreciate any assistance or insights you can provide to help resolve this issue. If there are specific configurations or steps that I might have overlooked, or if there are best practices for ensuring that credentials work properly with a LoadBalancer, your guidance would be invaluable. Thank you very much for your support! Best regards, Hassan

    user

    08/23/2024, 12:05 PM
    #44586 Auth issue New discussion created by muhammadhrazzaq (same question as the message above). airbytehq/airbyte

    Julie Choong

    08/23/2024, 2:18 PM
    Hi everyone. I previously hosted Airbyte via Docker Compose. Now I'm trying to set it up on Kubernetes with the Helm chart. I can get it to run locally on port 8080, but I'm trying to load my data from the Docker Compose deployment into the Kubernetes one. I managed to do a pg_dump in the VM where I have the Docker Compose version and ingested the dump file into DigitalOcean's Postgres database. I connected to this database from TablePlus with no issues viewing all the data. Then I ran helm install again, this time with the values.yaml file containing the Postgres credentials. However, this time my local port 8080 returns a 502 Bad Gateway error. I need some assistance debugging why this is happening, and how I can gracefully migrate all my data from Docker Compose to Kubernetes.
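    A hedged sketch of the values shape for pointing the chart at the restored external database instead of the bundled Postgres (the key names follow the airbyte Helm chart's externalDatabase convention and should be verified against your chart version; host and credentials are placeholders):
    ```bash
    cat > values.yaml <<'EOF'
    postgresql:
      enabled: false              # skip the in-cluster Postgres
    externalDatabase:
      host: my-cluster.db.ondigitalocean.com
      port: 25060
      database: airbyte
      user: airbyte
      password: "<db-password>"
    EOF
    helm upgrade --install airbyte airbyte/airbyte -n airbyte --values values.yaml

    # A 502 from the webapp usually traces back to the server pod; assuming the
    # release is named "airbyte", its logs typically name the failing DB handshake.
    kubectl logs -n airbyte deploy/airbyte-server
    ```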

    Himanshu Dube

    08/23/2024, 4:28 PM
    Hi everyone. I have a doubt, not sure if it's fully Airbyte related.
    • I was trying to use the Instagram connector and needed an access_token for it.
    • So I connected my FB page to my Instagram business account.
    • But when I used the Meta Graph API Explorer, the token I got there returned the user but no other details such as the page, posts, etc., despite my granting all the permissions and scopes.
    • Later on I created a user from the Meta Business Suite and generated an access_token, and when I entered that in Airbyte, it worked.
    So can you tell me what this is all about? Why couldn't I generate a valid token from the Graph API Explorer, but could from the Meta Business Suite?
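    A hedged way to compare the two tokens is Meta's token debugger. The sketch below assumes an app access token (the "APP_ID|APP_SECRET" form) and a token to inspect:
    ```bash
    # Shows the token's type (USER vs PAGE), the granting app, and its granular
    # scopes; running it for both tokens usually explains the difference.
    curl -s "https://graph.facebook.com/debug_token?input_token=$SUSPECT_TOKEN&access_token=$APP_ID|$APP_SECRET"
    ```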

    user

    08/24/2024, 2:22 PM
    #44716 abctl/ unable to fetch creds New discussion created by muhammadhrazzaq Hi everyone :) I have deployed Airbyte on an Azure VM using abctl. I am able to fetch the credentials when I use localhost in the ingress configuration. However, when I update it to my domain name, I receive the following error:
    ```
    INFO    Using Kubernetes provider:
               Provider: kind
               Kubeconfig: /home/airbyteVM/.airbyte/abctl/abctl.kubeconfig
               Context: kind-airbyte-abctl
    ERROR   Unable to determine organization email
    ERROR   unable to determine organization email: failed to get organization: unable to fetch token: unable to decode token request: invalid character '<' looking for beginning of value
    ```
    I can access Airbyte in the browser, and I have checked the email in the database, but I still can't fetch the password. When I use localhost and try to access it through the browser, I get a 404 Nginx error. airbytehq/airbyte

    Hassan Razzaq

    08/24/2024, 2:28 PM
    (Same question as GitHub discussion #44716 above.)

    Ishan Anilbhai Koradiya

    08/24/2024, 4:22 PM
    Hello community, I am looking for some guidance and recommendations on how to scale Airbyte if we aren't planning to go down the Kubernetes route. Currently we have a Docker deployment. We might move to Kubernetes in the future, but for now I am interested to hear what options exist to improve or scale the Docker setup. CPU is the main issue for us; it spikes when our syncs are running. RAM doesn't seem to be much of a problem. Currently we have 4 cores and 16 GB RAM. Thank you for your help in advance.
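    Not full guidance, but a hedged sketch of one lever available on Docker deployments: capping worker parallelism so concurrent syncs can't saturate four cores (the variable names come from Airbyte's worker environment configuration; defaults vary by version):
    ```bash
    # Append to the .env beside docker-compose.yaml, then recreate containers.
    cat >> .env <<'EOF'
    MAX_SYNC_WORKERS=2
    MAX_CHECK_WORKERS=2
    MAX_DISCOVER_WORKERS=2
    EOF
    docker compose up -d
    ```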

    Patrick Blank Cassol

    08/25/2024, 4:01 AM
    Hello community, I'm new to Airbyte and just installed it locally. I'm copying MySQL to SQL Server, and the destination table saves all the columns as raw JSON. I can't find the basic normalization option in version 0.63.13. Any tips, or does this local version not have that option? Thank you for your help in advance.

    Rabea Yousef

    08/25/2024, 9:20 AM
    Hello community, I upgraded my Airbyte to VERSION=0.63.13 and I'm using the Airbyte UI. I configured a connection with HubSpot as source and Redshift as target, with Incremental Append + Dedupe for the deals, tickets, and contacts objects, but the sync keeps running the whole time and never stops, although the full load completed in the target tables. I also tried a full refresh, but the sync started automatically once the full load completed in the airbyte_internal schema, while the data wasn't stored in the target tables. Meanwhile, I'm using Intercom and Jira as sources with incremental load and everything is going well. Can you help please? Thanks.

    Hassan Razzaq

    08/25/2024, 2:53 PM
    I am trying to create a source using the developer APIs and I am getting this error:
    ```
    {"status":422,"type":"https://reference.airbyte.com/reference/errors#unprocessable-entity","title":"unprocessable-entity","detail":"The body of the request was not understood","documentationUrl":null,"data":{"message":"json schema validation failed when comparing the data to the json schema. \nErrors: $: required property 'api_key' not found, $: required property 'url' not found "}}
    ```
    This is the URL I am using: url = "http://localhost:8000/api/public/v1/sources"
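    For reference, a hedged sketch of a request body that would satisfy that schema error; the workspace/definition IDs and token are placeholders, and the configuration keys ('api_key', 'url') are exactly the ones the validation message says are required:
    ```bash
    curl -X POST "http://localhost:8000/api/public/v1/sources" \
      -H "Content-Type: application/json" \
      -H "Authorization: Bearer $AIRBYTE_API_TOKEN" \
      -d '{
        "name": "my-source",
        "workspaceId": "<workspace-uuid>",
        "definitionId": "<source-definition-uuid>",
        "configuration": {
          "api_key": "<api-key>",
          "url": "https://example.com"
        }
      }'
    ```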

    Aldo Orozco

    08/25/2024, 11:24 PM
    Hi community! I'm having a problem setting up Redshift as the destination in Airbyte OSS deployed from the official Helm chart. I want to transfer data from S3 to Redshift. When I manually create the Redshift destination and click "Set up destination", I see this error in the UI:
    ```
    Could not connect with provided SSH configuration. Error: getSQLState(...) must not be null
    ```
    I'm using no SSH tunnel (screenshot). However, when I look at the logs, I see this error:
    ```
    WARN main c.z.h.u.DriverDataSource(<init>):68 Registered driver with driverClassName=com.amazon.redshift.jdbc.Driver was not found, trying direct instantiation.
    ```
    I wonder if I need to configure anything in particular in the values.yaml so the driver is installed. cc @Bryce Groff 🙏

    Jayant Kumar

    08/26/2024, 6:41 AM
    Hi Team, I am using the GA4 Airbyte source to ingest data into the BigQuery warehouse. GA4 supports a new model for events and properties. On the Airbyte source setup page, I could only see a list of reports and their sync modes. Is it possible to ingest GA4 events using the Airbyte GA4 source?

    Yannik Voß

    08/26/2024, 8:40 AM
    Hey folks, is it possible that the airbyte-worker is caching a lot of data in memory because the source is reading more data than the destination can write? We are running Airbyte on a VM with only 2 cores, and I noticed that behaviour for syncs with a lot of data.

    L Theisen

    08/26/2024, 9:34 AM
    Hi all, I am trying to run Airbyte OSS in an air-gapped environment. It runs in Kubernetes via the Helm chart. At startup I get the following:
    ```
    2024-08-26 09:08:13 INFO i.a.c.s.RemoteDefinitionsProvider(<init>):75 - Creating remote definitions provider for URL 'https://connectors.airbyte.com/files/' and registry 'OSS'...
    ```
    Is it possible to provide the connectors via an Artifactory in the air-gapped environment, and if yes, where can that be configured?

    Adam COHEN

    08/26/2024, 10:29 AM
    Hello community! I'm using Airbyte v0.63.3 with the UI. I configured a connection with MongoDB as source and BigQuery as target with Incremental Append + Deduped. The issue is that when I delete a record in the source database, the record is also deleted in the BigQuery target. I would like the record not to disappear, but rather to have its ab_cdc_deleted_at field set. In the BigQuery target database, all records have the ab_cdc_deleted_at field set to null. Thank you for your time and help!

    Alkis

    08/26/2024, 10:50 AM
    Hi everyone, I'm running into an issue with Airbyte where the data syncs from BigQuery to SQL Server, but the data seems to be stuck in the airbyte_internal schema. It doesn't seem to be unpacking into the expected tables in the actual schema. I'm using the default setup without any custom transformations or dbt, so I'm not sure what might be causing this. Has anyone else experienced this or know how to ensure the data is correctly unpacked into the final destination schema? Thanks for any help you can provide!

    Stockton Fisher

    08/26/2024, 11:09 AM
    Hi there, any idea how to solve this issue on Airbyte Cloud?
    > message='io.airbyte.workers.exception.WorkloadMonitorException: Airbyte could not track the sync progress. No heartbeat within the time limit indicates the process might have died.', type='java.lang.RuntimeException', nonRetryable=false
    It is causing a sync to fail repeatedly.

    Sourav Sikka

    08/26/2024, 11:50 AM
    Hello, I am getting the error below. Could anyone please help here?
    ```
    2024-08-26 15:34:23 airbyte-db | 2024-08-26 10:04:23.022 UTC [746] FATAL: role "airflow" does not exist
    2024-08-26 15:34:28 airflow-webserver-1 | 127.0.0.1 - - [26/Aug/2024:10:04:28 +0000] "GET /health HTTP/1.1" 200 318 "-" "curl/7.88.1"
    2024-08-26 15:34:33 airbyte-db | 2024-08-26 10:04:33.078 UTC [760] FATAL: role "airflow" does not exist
    2024-08-26 15:34:43 airbyte-db | 2024-08-26 10:04:43.154 UTC [771] FATAL: role "airflow" does not exist
    2024-08-26 15:34:53 airbyte-db | 2024-08-26 10:04:53.206 UTC [780] FATAL: role "airflow" does not exist
    2024-08-26 15:34:58 airflow-webserver-1 | 127.0.0.1 - - [26/Aug/2024:10:04:58 +0000] "GET /health HTTP/1.1" 200 318 "-" "curl/7.88.1"
    2024-08-26 15:35:03 airbyte-db | 2024-08-26 10:05:03.250 UTC [793] FATAL: role "airflow" does not exist
    2024-08-26 15:35:07 airflow-triggerer-1 | [2024-08-26T10:05:07.694+0000] {triggerer_job_runner.py:481} INFO - 0 triggers currently running
    2024-08-26 15:35:13 airbyte-db | 2024-08-26 10:05:13.320 UTC [806] FATAL: role "airflow" does not exist
    2024-08-26 15:35:16 airflow-scheduler-1 | [2024-08-26T10:05:16.338+0000] {scheduler_job_runner.py:1608} INFO - Adopting or resetting orphaned tasks for active dag runs
    ```
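    The FATAL lines come from the airbyte-db Postgres container rejecting logins from an "airflow" role that was never created there. If Airflow is meant to share that Postgres instance, a hedged sketch of creating the role (the container name and superuser credentials below are the docker compose defaults and may differ in your setup):
    ```bash
    docker exec -it airbyte-db psql -U docker -d postgres \
      -c "CREATE ROLE airflow WITH LOGIN PASSWORD 'airflow';" \
      -c "CREATE DATABASE airflow OWNER airflow;"
    ```
    Alternatively, point Airflow's sql_alchemy_conn at its own Postgres instead of Airbyte's metadata database.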

    user

    08/26/2024, 11:52 AM
    Comment on #44716 abctl/ unable to fetch creds. Discussion answered by muhammadhrazzaq: I figured out the issue. When deploying without --insecure-cookies, you have to add TLS. If you deploy Airbyte using the --insecure-cookies flag, it'll work. airbytehq/airbyte
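    For anyone hitting the same error, a hedged sketch of the resolution above (the domain is a placeholder): when serving plain HTTP on a real hostname, install with the flag; otherwise terminate TLS in front of Airbyte and keep secure cookies.
    ```bash
    abctl local install --host airbyte.example.com --insecure-cookies
    ```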

    Luis Echegaray

    08/26/2024, 12:59 PM
    Hello everyone! I finally moved my little EC2 installation over to an EKS rollout, and I'm looking for advice on how to sync every 15-30 minutes. Our data sets are not huge, and we're mainly using CDC Postgres -> Snowflake. The performance looks much better now that I am on k8s, thankfully.
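    A hedged sketch of pinning a connection to a 15-minute cadence through the public API (endpoint shape per reference.airbyte.com; the host, connection ID, and token are placeholders, and the cron expression uses Quartz syntax):
    ```bash
    curl -X PATCH "http://<airbyte-host>/api/public/v1/connections/<connection-uuid>" \
      -H "Content-Type: application/json" \
      -H "Authorization: Bearer $AIRBYTE_API_TOKEN" \
      -d '{"schedule": {"scheduleType": "cron", "cronExpression": "0 0/15 * * * ?"}}'
    ```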

    Thomas Vannier

    08/26/2024, 1:56 PM
    Hello Airbyte community, I set up a connection from MySQL to Databricks Lakehouse with full refresh sync in Airbyte 0.50.47. I don't understand why all the MySQL tabular data is inserted into one single JSON in an _airbyte_data column in my Databricks destination. From what I see in the doc https://docs.airbyte.com/integrations/destinations/databricks , I should also get the tabular data in the Databricks SQL warehouse, no? I also see that "Normalization and Transformation operations are not supported for this connection" in the connection setup. Thanks for your time and help 🙂

    GUANGYU QU

    08/26/2024, 3:36 PM
    Hi there, I have a question about private registries. Is it possible to point Airbyte to a private registry for all public connectors, not just for custom connectors? I have not found any documentation about this.

    user

    08/26/2024, 4:58 PM
    #44785 [source-dynamodb] enable choosing a role ARN. New discussion created by arodrber0: Hello, in this connector https://github.com/airbytehq/airbyte/tree/master/airbyte-integrations/connectors/source-dynamodb I cannot specify the ARN of the role that I want the connector to assume. Would it be possible to enable this, instead of the connector picking up the system default? This would enable cross-account consumption in AWS. airbytehq/airbyte