# ask-community-for-troubleshooting
  • p

    Paul Houghton

    09/04/2024, 2:46 PM
    I have started a new Airbyte instance in Docker and have logged in successfully, but I'm having networking issues. Ideally I would prefer to add Airbyte to an existing network rather than have Airbyte create a new one. Is that possible?
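    One way to do this with a Compose-based install is to point the project's default network at a pre-existing one. A minimal sketch, assuming a network already created with `docker network create shared_net` (the name `shared_net` is hypothetical) and an Airbyte version driven by Compose files you can override:

```yaml
# docker-compose.override.yml — a sketch; verify against your Airbyte version's compose files.
# Assumes the network already exists: docker network create shared_net
networks:
  default:
    name: shared_net   # Compose joins this network instead of creating its own
    external: true
```

    With `external: true`, Compose will fail fast if the network does not already exist rather than silently creating a new one.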
  • d

    Dhinesh Balakrishnan

    09/04/2024, 3:46 PM
    Hi, I'm facing the issue I mentioned here: I'm loading data from Redis to MSSQL. All columns are correctly mapped, but the '_airbyte_data_' column in my output table is not populated.
  • r

    Renat Zubayrov

    09/04/2024, 3:58 PM
    Hello #C021JANJ6TY, hope you're well. I'm facing an issue with the NetSuite connector (REST, Python connector):
    java.text.ParseException: Unparseable date: "2024-03-04".
    (detailed stacktrace in thread) What is strange is that this date is the start date I configured for the connector, not a date received from a NetSuite object. Could it be an issue in the Python CDK?
  • b

    Berwin Rayen

    09/04/2024, 4:05 PM
    Hi everyone. I have created a Redis source connector (a custom connector) and I am able to pull data from Redis in JSON format; I have integrated it into the Airbyte UI. My objective is to push the data from Redis to MS SQL Server. Discover is working fine, but airbyte_data is coming back empty. The logs look fine and show no errors. If anybody has been able to resolve this issue, please reply.
  • u

    user

    09/04/2024, 5:28 PM
    #45130 MongoDB - Support for user-defined schemas New discussion created by evantahler Airbyte is a schema-aware application, as are most of the destinations we sync to (e.g. data warehouses with a fixed set of columns). For Airbyte's MongoDB source, we have 2 methods of discovering the schema: 1. schema enforced: sample a number of objects (configurable) and inspect the properties of those objects to build the schema. 2. schema not enforced: disable schema enforcement and move the objects as blobs, with only the primary key _id and the data in a data blob. Option one (schema-enforced syncs) works well for MongoDB collections with similar-ish properties on all objects. If you have a collection with differently shaped objects, sampling will likely miss some properties. A third option was presented in #42862, which would be to allow users with varying objects in the collection to provide a schema they want to use for the sync (e.g. via a JSONSchema document). Are you interested in Airbyte's MongoDB source providing this new "describe your schema explicitly" option for schema-aware syncs? airbytehq/airbyte
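    To make the proposal concrete, a user-supplied JSONSchema for a mixed-shape collection might look like the following (illustrative only; the field names are invented, and the exact shape Airbyte would accept is precisely what the discussion is asking about):

```json
{
  "type": "object",
  "properties": {
    "_id":   { "type": "string" },
    "email": { "type": "string" },
    "age":   { "type": "integer" },
    "tags":  { "type": "array", "items": { "type": "string" } }
  }
}
```

    Objects missing some of these properties would simply sync with nulls in those columns, which is exactly the case sampling-based discovery tends to miss.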
  • u

    user

    09/04/2024, 7:49 PM
    #45138 Deployment with preconfigured access to API New discussion created by djpirra I am currently planning to build a SaaS application that can ingest data from multiple sources as one of the capabilities. I saw that Airbyte Self-Hosted seems to be quite promising to help deliver that. However, I am not sure if Airbyte can be installed already with a preconfigured application that enables access to the API out of the box. Does anyone know if this is possible? Thank you airbytehq/airbyte
  • s

    Suresh Antony

    09/04/2024, 7:59 PM
    We are using Airbyte Cloud to import HubSpot contacts to BigQuery, with the Incremental | Append + Dedup sync mode. On the first day we got almost all records. After that we are seeing missing records and duplicate records. Has anyone faced this issue?
  • d

    Danton Bertuol

    09/04/2024, 8:12 PM
    Hello, I am facing a critical problem with Discover on a PostgreSQL source: it always returns the error An unknown error occurred. (HTTP 504). Has anyone run into this? In the discover pod that is created, this warning is displayed at the time of the timeout: WARN c.a.l.CommonsLog(warn):113 - JAXB is unavailable. Will fallback to SDK implementation which may be less performant. If you are using Java 9+, you will need to include javax.xml.bind:jaxb-api as a dependency. The pod continues running, but within 30 seconds the error message appears in the web app.
  • p

    Pranay Mule

    09/05/2024, 6:38 AM
    Hello #C021JANJ6TY, I’m using the Xero Airbyte connector from the Airbyte Marketplace to sync data to BigQuery with the 'Bearer Access Token' authentication method. Given that the Xero access token expires every 30 minutes, I need to understand if Airbyte automatically handles the generation of a new token upon expiration or if the sync process will be interrupted and the connection will fail after 30 minutes if the token is not refreshed. Any insights on how token management is handled when using the 'Bearer Access Token' authentication type would be greatly appreciated.
  • f

    Faraz Ashraf

    09/05/2024, 7:13 AM
    2024-09-05 07:11:22 destination > Retrying langchain_community.embeddings.openai.embed_with_retry.<locals>._embed_with_retry in 16.0 seconds as it raised RateLimitError: Rate limit reached for text-embedding-ada-002 in organization org-hhuVUVP2FFjsS32XGgfT3lGn on requests per day (RPD): Limit 200, Used 200, Requested 1. Please try again in 7m12s. Visit https://platform.openai.com/account/rate-limits to learn more. You can increase your rate limit by adding a payment method to your account at https://platform.openai.com/account/billing.
    2024-09-05 07:11:38 destination > Retrying langchain_community.embeddings.openai.embed_with_retry.<locals>._embed_with_retry in 20.0 seconds as it raised RateLimitError: Rate limit reached for text-embedding-ada-002 in organization org-hhuVUVP2FFjsS32XGgfT3lGn on requests per min (RPM): Limit 3, Used 3, Requested 1. Please try again in 20s. Visit https://platform.openai.com/account/rate-limits to learn more. You can increase your rate limit by adding a payment method to your account at https://platform.openai.com/account/billing.
    Why do I get a rate limit error? I have a paid OpenAI API key, so why do I get this error while syncing from Postgres to Pinecone?
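    For context, the limits in the log (200 requests/day, 3 requests/min) match OpenAI's historical free-tier limits, so the key's organization may simply not have billing attached, even if a different organization on the same account does; that is worth checking before touching code. Independent of that, a generic jittered exponential backoff helper, sketched here in plain Python (this is not Airbyte's or langchain's internal retry logic), looks like:

```python
import random
import time


def with_backoff(fn, max_tries=5, base=1.0, cap=30.0):
    """Call fn(), retrying on RuntimeError with jittered exponential backoff."""
    for attempt in range(max_tries):
        try:
            return fn()
        except RuntimeError:
            if attempt == max_tries - 1:
                raise  # out of retries: surface the error to the caller
            delay = min(cap, base * 2 ** attempt)
            time.sleep(delay * random.uniform(0.5, 1.0))  # jitter avoids retry bursts
```

    Backoff only helps with the per-minute limit, though; once the per-day quota is exhausted, retries cannot succeed until the quota resets or billing is enabled.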
  • j

    Jon Seymour

    09/05/2024, 8:03 AM
    re: https://airbytehq.slack.com/archives/C01AHCD885S/p1725522809127229 I am experiencing issues with a new airbyte install (chart 0.524.0), running abctl v0.14.1, and trying to verify a destination.
    • I have verified postgres connectivity from the instance using psql running on the same EC2 instance as the k8s pods, and there is zero evidence of a postgres destination connection failure in the sum total of all the k8s logs.
    • I have enough resources: airbyte has been freshly installed on an otherwise bone-idle EC2 instance with 16GB of RAM (AWS t3.xlarge instance type).
    • postgres sources are verifying just fine.
    Per the linked message, there are odd messages in the pod that appears to relate to the destination check: "2024-09-05T07:38:35.642324378Z pool-3-thread-1 ERROR Recursive call to appender SecretMaskRewrite", but I have no clue what has caused this issue or whether it is related to the 504 error experienced in the UI, although it does seem possible based on the circumstantial evidence in the logs.
  • p

    Pranay Mule

    09/05/2024, 8:52 AM
    Hello #C021JANJ6TY, I’m looking to integrate Xero with Airbyte and noticed that Custom Connections are only available for organizations in Australia, New Zealand, and the UK, as mentioned in the documentation. Since my Xero subscription is in the U.S., am I unable to use Custom Connections? If so, are there any alternative methods for U.S.-based accounts besides using a Bearer Access Token (which expires every 30 minutes)? Any assistance would be greatly appreciated.
  • k

    Krishna Channa

    09/05/2024, 9:55 AM
    Can you help me with installing Airbyte on Docker?
  • j

    James N

    09/05/2024, 10:00 AM
    Hi - first time Airbyte user here. I've followed the instructions on the quickstart page for installing on EC2 (https://docs.airbyte.com/using-airbyte/getting-started/oss-quickstart?_gl=1*zn15cq*_gcl_au*MTQ5MTA1NjQ3MC4xNzIxMjM1NDcy), but my first source connection is failing to connect to the Postgres DB:
    Internal message: io.airbyte.workload.launcher.pipeline.stages.model.StageError: io.airbyte.workload.launcher.pods.KubeClientException: Failed to create pod rce-postgres-check-16697a14-10f0-4bb4-a6d0-67c183837eab-0-ydybe
    Further down the stack I see:
    Caused by: io.fabric8.kubernetes.client.KubernetesClientException: Failure executing: PATCH at: https://10.96.0.1:443/api/v1/namespaces/airbyte-abctl/pods/rce-postgres-check-16697a14-10f0-4bb4-a6d0-67c183837eab-0-ydybe?fieldManager=fabric8. Message: Unauthorized. Received status: Status(apiVersion=v1, code=401, details=null, kind=Status, message=Unauthorized, metadata=ListMeta(_continue=null, remainingItemCount=null, resourceVersion=null, selfLink=null, additionalProperties={}), reason=Unauthorized, status=Failure, additionalProperties={})
    Anyone have any ideas or help?
  • d

    Du Trần

    09/05/2024, 10:01 AM
    Hello #C021JANJ6TY, I'm using the MySQL source, and I see in the logs that the default fetch size in the configuration is 5,000 records. I want to change the default fetch size to 10,000 records. How can I change this config?
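    Some versions of the MySQL source expose a "JDBC URL Params" field in the connector setup form; if yours does, MySQL Connector/J's `defaultFetchSize` connection property may be worth trying there (hedged: whether the connector honors this over its own internal fetch-size logic is not guaranteed):

```text
defaultFetchSize=10000
```

    Multiple parameters in that field are joined with `&`, as in a JDBC URL query string.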
  • g

    Gabriele Cacchioni

    09/05/2024, 11:58 AM
    Self hosted, k8s Does anyone know how to configure the image pull secret for temporal?
  • m

    Maximilian Hein

    09/05/2024, 12:35 PM
    Hey all, I've been setting up Airbyte and Airflow, and I have both running smoothly in Docker. I've also successfully created a connection in Airbyte to sync data and set up a DAG in Airflow to trigger the sync; sadly the DAG is not working. I'm now facing an issue I can't quite figure out: my Airflow DAG is throwing a network error when trying to trigger the Airbyte sync job (the first task in my DAG). The error says that the connection to Airbyte (host.docker.internal:8000) is unreachable (Max retries exceeded), but I'm not sure why this is happening, as everything is set up similar to how it is explained here: Airflow & Airbyte: Better Together. Any ideas on what might be going wrong or things I could check? Appreciate any pointers!
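    One frequent cause of exactly this symptom: when Airflow itself runs in a container on Linux, `host.docker.internal` does not resolve unless it is mapped explicitly (Docker Desktop on macOS/Windows adds the mapping automatically). A sketch of the workaround; the service names below are assumptions, so adjust them to your compose file:

```yaml
# In the Airflow docker-compose file: make host.docker.internal resolve
# to the Docker host's gateway address (needed on Linux).
services:
  airflow-scheduler:
    extra_hosts:
      - "host.docker.internal:host-gateway"
  airflow-worker:
    extra_hosts:
      - "host.docker.internal:host-gateway"
```

    Alternatively, putting the Airflow and Airbyte containers on a shared Docker network and addressing Airbyte by service name avoids `host.docker.internal` entirely.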
  • j

    Josh Lee

    09/05/2024, 12:37 PM
    Hi all, I have been trying to test Airbyte with a Snowflake to MySQL sync. I get about 9 hours into the sync and MySQL just stops responding. I'm on v8.4. Are there any special settings I should be using?
  • a

    Andrei Delcea

    09/05/2024, 2:06 PM
    Hello, can someone help me with the NGINX config? I got it running but I can't get past the setup page; it returns 403 Forbidden when submitting the setup form on route api/v1/instance_configuration/setup, and it also throws 403 on route /api/v1/users/get. My config is:
    location / {
        proxy_pass http://localhost:8000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "Upgrade";
        proxy_set_header Cookie $http_cookie;
        proxy_pass_request_headers on;
        proxy_pass_header Accept;
        proxy_pass_header Server;
        proxy_set_header Authorization $http_authorization;
        proxy_pass_header Authorization;
        proxy_set_header ns_server-ui yes;
        proxy_read_timeout 3600;
        proxy_send_timeout 3600;
    }
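    One thing the config above never sets is the Host and forwarding headers. When a reverse-proxied app returns 403 on POSTs, adding these inside the `location /` block is a common first thing to try (a hedged suggestion using standard nginx directives, not a confirmed fix for Airbyte's setup endpoint):

```nginx
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;
```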
  • s

    Shay Dahan

    09/05/2024, 2:49 PM
    Hi #C021JANJ6TY, I'm running Airbyte with Helm chart version 0.524.0. When I try to set up a source or destination, I get a 504 error (see screenshot). Has anyone encountered this issue and can help? Here is the POST request that returns the 504 error:
    curl -X POST {host}/api/v1/scheduler/sources/check_connection \
      -H "Content-Type: application/json" \
      -d '{
        "connectionConfiguration": {
          "ssl": true,
          "port": 3306,
          "ssl_mode": { "mode": "preferred" },
          "tunnel_method": { "tunnel_method": "NO_TUNNEL" },
          "replication_method": { "method": "STANDARD" },
          "username": "xxxxx",
          "password": "xxxxx",
          "database": "xxxxx",
          "host": "xxxxx"
        },
        "workspaceId": "d4e3d7e3-a8d7-4190-b12a-4dcd777b8945",
        "sourceDefinitionId": "435bb9a5-7887-4809-aa58-28c27df0d7ad"
      }'
  • n

    Narayan Zeermire

    09/05/2024, 5:18 PM
    Hi team, we are getting the error: Saved offset is not valid. Please reset the connection, and then increase oplog retention and/or increase sync frequency to prevent this from happening in the future. How do we resolve this error permanently? We are getting it every day in all our pipelines.
  • u

    user

    09/05/2024, 6:10 PM
    #45177 Pod-airbyte-bootloader-error New discussion created by nikhilnicky1 Here is my values.yaml file, which connects to RDS and to S3 for storage. By default it's creating an airbyte-db-0 pod, and I'm getting a pod-airbyte-bootloader error; when I checked the logs, it's trying to connect to a localhost db. Can anyone please help? Am I missing something in the configuration? I'm new to k8s. New Text Document.txt image airbytehq/airbyte
  • a

    Angela Ashe

    09/05/2024, 8:12 PM
    We upgraded our GCP-hosted instance today without the migrate flag set. All of our connections are now gone. Is there any way we can recover the connections? Can we recover our data?
  • u

    user

    09/05/2024, 10:36 PM
    #45182 Unable to list workspace or create connexion using self hosted airbyte python sdk New discussion created by Abdi-Arslene Hello Airbyte team, I’m working with the Airbyte Python SDK and have a self-hosted Airbyte setup running locally on my machine using abctl. Does the Python SDK support this setup? If not, can you guide me on how to use the Airbyte API with a local installation? Also, where can I find relevant documentation? This is what I did, but I always get errors: image Thanks in advance! airbytehq/airbyte
  • a

    Abdullah Uddin

    09/06/2024, 2:51 AM
    Hello, is anyone able to help with connecting Synology (server) to PostgreSQL?
  • m

    Mukund Yadav

    09/06/2024, 6:42 AM
    Hi team, we have installed open source Airbyte on a VM in GCP and are trying to get data from Mixpanel to BigQuery. Out of 7 tables, 3 tables' data is not getting pulled to BQ. Has anyone faced this issue, and what is the fix? I have also tried upgrading the Mixpanel connector, with no effect.
  • p

    Patricio Villanueva

    09/06/2024, 7:09 AM
    Hi all, I have a self-hosted Airbyte on Kubernetes and multiple connections that work OK. However, I have two connectors each syncing only one table (because they are very big tables and initial syncs can take some time). My problem is that these two single-table connectors are not using the full resources of their pods, while all the other connectors do, so I don't understand what's different about these ones. Does anyone know what the problem might be?
  • p

    Paul Houghton

    09/06/2024, 8:19 AM
    I have created a local Airbyte install using abctl in an Unraid home lab for testing. I have created the instance and am able to connect, and I'm now trying to connect to the example exchange rate API (https://api.apilayer.com), but am getting resolution errors when testing. I can run nslookup and curl on the URL from inside the container, but not from the UI. How can I go about diagnosing this network issue?