# ask-community-for-troubleshooting

    Abubakar Alaro

    01/04/2023, 9:05 PM
    https://discuss.airbyte.io/t/how-to-log-when-building-custom-destination-connector/3592?u=abuton
👍 1

    Robert Put

    01/04/2023, 10:42 PM
What is the current best practice for excluding columns?

    Nahid Oulmi

    01/04/2023, 3:02 PM
• Is this your first time deploying Airbyte?: No
• OS Version / Instance: Debian
• Memory / Disk: 32GB memory / 50GB disk
• Deployment: docker-compose
• Airbyte Version: 0.40.26
• Source name/version: elastic-search (custom connector)
• Destination name/version: BigQuery Destination Version 1.2.9
• Step: Load raw data in destination
• Description: I get the following error:
    com.google.cloud.bigquery.BigQueryException: Destination deleted/expired during operation: mycompany-gcp-2020:airbyte_elasticsearch._airbyte_tmp_tzj_stat_accounts
    at the end of my synchronization process. I am using the Incremental/Append load method. My BigQuery destination is using GCS Staging with Interactive Run Type and Default Chunk Size of 15MB. Also, all data in the dataset is deleted when I get this error. I have no Error logs on BigQuery side. Attached you can find the full logs of the sync.
    failed_sync.txt

    shivam Kumar

    01/05/2023, 10:39 AM
How can I protect a self-hosted Airbyte server? I installed Airbyte with docker-compose and set up HTTP security, but I see the webapp can still be accessed through another port without any security check.
    ✅ 1

    Ross Douglas

    01/05/2023, 11:51 AM
    👋 Is there an airbyte connector for... airbyte (cloud)? 🙂 I'm interested in metadata about when things were last synced etc.

    Omprakash Kumar

    01/05/2023, 1:01 PM
Hi, can anyone help me run the Airbyte open-source API using Postman?

    Ignacio Alasia

    01/05/2023, 1:45 PM
Hi guys! I have this trouble when I try to send data from PG to SF.
• Is this your first time deploying Airbyte?: No
• OS Version / Instance: EC2 m6a.2xlarge
• Deployment: Docker
• Airbyte Version: 0.40.26
• Source name/version: Postgres 1.0.35
• Destination name/version: Snowflake 0.4.40
• Step: My sync is failing for full refresh | append.
• Description: The error message I'm getting is Cannot drop column '_AIRBYTE_UNIQUE_KEY' which belongs to a clustering key.
• Table size: 325GB
• Note: The syncs complete, but when normalization begins it fails with the message above: "Normalization failed for job 35."
Any ideas? I searched the community already and didn't find anything useful. Thanks!

    Annika Maybin

    01/05/2023, 5:01 PM
    Hey everyone! I was wondering if anybody has an update on when multiple cursors could potentially be implemented. We have a table that technically has three cursors that need to be considered. Or any good workaround ideas?

    Resford Rouzer

    01/05/2023, 7:10 PM
    Hi everyone. I was curious how you create multiple Custom Reports using the Google Analytics Data API? I can't find an example and keep getting failures.

    Ohad

    01/05/2023, 8:47 PM
Hi all! I had a working MSSQL connector which suddenly stopped working; the error below is generated when I test the connector.
```
State code: 08S01; Message: The TCP/IP connection to the host <server_name>, port 1433 has failed. Error: "No route to host. Verify the connection properties. Make sure that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port. Make sure that TCP connections to the port are not blocked by a firewall.".
```
The connector was turned off during the holiday time, and during that time I upgraded Airbyte to the latest version. I noticed the MSSQL connector got updated, and I tried all the MSSQL connector versions from 4.18 up to 4.26. The SQL connection does work locally on the machine, as I'm able to use SSMS. Any ideas?
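As an aside on the error text above: "No route to host" points at network-layer reachability rather than SQL Server itself. A minimal sketch (editor's illustration, not Airbyte code) for testing raw TCP reachability from the same environment the connector runs in, e.g. from inside the worker container:

```python
import socket


def can_reach(host: str, port: int = 1433, timeout: float = 5.0) -> bool:
    """Return True if a plain TCP connection to host:port succeeds within timeout."""
    try:
        # create_connection resolves the host and attempts a TCP handshake;
        # any failure (refused, unreachable, timeout) surfaces as OSError.
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

If this returns False from inside the container but True from the host (where SSMS works), the problem is routing or firewalling between the container network and the SQL Server, not the connector version.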

    Lucas Gonthier

    01/06/2023, 1:00 AM
Hi all, can Airbyte run without problems on Google Cloud Run?

    Taylor Facen

    01/06/2023, 3:26 AM
Hi. The Docker template is producing errors. The main one is that I am unable to add a new connector; I get the error Internal Server Error: Get Spec job failed.

    Sheshan

    01/06/2023, 9:45 AM
Hi, I'm getting this error while testing our custom destination built with Python:
```
(.venv) sheshan@sheshan:~/Documents/Airbyte/airbyte/airbyte-integrations/connectors/destination-weav-destination$ cat messages.jsonl | python main.py write --config secrets/config.json --catalog sample_files/configured_catalog.json
{"type": "LOG", "log": {"level": "INFO", "message": "Begin writing to the destination..."}}
{"type": "LOG", "log": {"level": "FATAL", "message": "write() missing 1 required positional argument: 'logger'\nTraceback (most recent call last):\n  File \"main.py\", line 11, in <module>\n    DestinationWeavDestination().run(sys.argv[1:])\n  File \"/home/sheshan/Documents/Airbyte/airbyte/airbyte-integrations/connectors/destination-weav-destination/.venv/lib/python3.8/site-packages/airbyte_cdk/destinations/destination.py\", line 109, in run\n    for message in output_messages:\n  File \"/home/sheshan/Documents/Airbyte/airbyte/airbyte-integrations/connectors/destination-weav-destination/.venv/lib/python3.8/site-packages/airbyte_cdk/destinations/destination.py\", line 104, in run_cmd\n    yield from self._run_write(config=config, configured_catalog_path=parsed_args.catalog, input_stream=wrapped_stdin)\n  File \"/home/sheshan/Documents/Airbyte/airbyte/airbyte-integrations/connectors/destination-weav-destination/.venv/lib/python3.8/site-packages/airbyte_cdk/destinations/destination.py\", line 48, in _run_write\n    yield from self.write(config=config, configured_catalog=catalog, input_messages=input_messages)\nTypeError: write() missing 1 required positional argument: 'logger'"}}
```
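Editor's note on the traceback: the CDK's `_run_write` calls `self.write(config=..., configured_catalog=..., input_messages=...)` with exactly those three keyword arguments, so a subclass whose `write()` declares an extra positional `logger` parameter fails in precisely this way. A minimal sketch of a matching signature (the class name and echo logic are illustrative, not the actual connector):

```python
from typing import Any, Iterable, Mapping


class DestinationSketch:
    """Illustrative stand-in for an airbyte_cdk Destination subclass."""

    def write(
        self,
        config: Mapping[str, Any],
        configured_catalog: Mapping[str, Any],
        input_messages: Iterable[Any],
    ) -> Iterable[Any]:
        # No 'logger' parameter here: per the traceback, the CDK only passes
        # config, configured_catalog, and input_messages as keywords.
        for message in input_messages:
            yield message
```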

    Sheshan

    01/06/2023, 10:02 AM
Hi all, I'm getting the error below while testing the Docker container of my Python custom destination; can someone help?
```
(.venv) sheshan@sheshan:~/Documents/Airbyte/airbyte/airbyte-integrations/connectors/destination-weav-destination$ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/sample_files:/sample_files peeyushweav/weav_destination-custom-python:c4prod write --config /secrets/config.json --catalog /sample_files/configured_catalog.json
{"type": "LOG", "log": {"level": "INFO", "message": "Begin writing to the destination..."}}
{"type": "LOG", "log": {"level": "FATAL", "message": "write() missing 1 required positional argument: 'logger'\nTraceback (most recent call last):\n  File \"/airbyte/integration_code/main.py\", line 11, in <module>\n    DestinationWeavDestination().run(sys.argv[1:])\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/destinations/destination.py\", line 108, in run\n    for message in output_messages:\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/destinations/destination.py\", line 103, in run_cmd\n    yield from self._run_write(config=config, configured_catalog_path=parsed_args.catalog, input_stream=wrapped_stdin)\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/destinations/destination.py\", line 47, in _run_write\n    yield from self.write(config=config, configured_catalog=catalog, input_messages=input_messages)\nTypeError: write() missing 1 required positional argument: 'logger'"}}
```

    Ishan Anilbhai Koradiya

    01/06/2023, 10:04 AM
Hi all, I have come across a weird scenario while using the Airbyte APIs. I installed Airbyte using Docker and I am trying to manage workspaces via the APIs. However, the API POST http://localhost:8000/api/v1/workspaces/get (fetching a workspace by ID) always fetches the one default workspace (created originally while setting up Airbyte), even after passing a different workspace ID. The same thing happens with the delete workspace API, POST http://localhost:8000/api/v1/workspaces/delete. Any idea why these APIs are ignoring the workspace ID passed in the body and always acting on the default workspace? Note that this default workspace was created when I first opened the Airbyte UI after installation.
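For reference, the config API's workspace endpoints expect a JSON body keyed `workspaceId` (camelCase), so a misspelled key or an ID sent as a query parameter instead of the body is worth ruling out first. A small request-builder sketch (editor's illustration; the helper itself is not part of any Airbyte client):

```python
import json


def build_workspace_request(action: str, workspace_id: str,
                            base_url: str = "http://localhost:8000") -> dict:
    """Build the POST request for /api/v1/workspaces/get or .../delete.

    Illustrative helper: returns the URL, headers, and JSON body that the
    endpoint expects, so the exact payload can be inspected before sending.
    """
    if action not in {"get", "delete"}:
        raise ValueError("action must be 'get' or 'delete'")
    return {
        "url": f"{base_url}/api/v1/workspaces/{action}",
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"workspaceId": workspace_id}),
    }
```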

    Renato Todorov

    01/06/2023, 10:24 AM
Hello everyone. I'm having a strange failure with Airbyte deployed to Kubernetes. When trying to set up any connection or source, the `check_connection` calls are returning an "upstream request timeout" error, and I'm currently not able to understand where it is coming from because there are absolutely no errors in the logs, even with DEBUG enabled. I've logged an issue here, any help is appreciated: https://github.com/airbytehq/airbyte/issues/20963. Can anyone tell me how I can further debug this issue? I've exec'd into the server container and it can communicate and connect to all the other hosts (temporal and webapp, for example). I'm stuck and have no idea how to debug it further.

    Talha Asif

    01/06/2023, 2:30 PM
Hi, I am new to Airbyte and need to ask a few questions. Can we hit REST/SOAP APIs using a destination in the open-source Airbyte implementation? I was unable to find any such destination connector. Also, for the source there is only the Public API connector (can't we specify an API of our own to get data from?). I am unable to find these answers in the documentation. If anyone can help, or point me to a documentation reference, that would be appreciated. TIA

    Hasan Khan

    01/06/2023, 3:32 PM
Hello admin, can you help me?

    Mora forti

    01/06/2023, 4:41 PM
Hi all, I'm trying to use the Airbyte API to trigger a sync. I have no problem when using a local Airbyte running on localhost, but what I really need is to trigger a connection in an Airbyte that's running with Plural. I'm having trouble figuring out how to make the authentication work; I get the error `405 Not Allowed`. I found in the docs (https://docs.airbyte.com/deploying-airbyte/on-plural/) that I can create a user in the context.yaml file, but I'm not sure of the correct syntax for that, or whether it's the right way to make the API call work. Can anyone guide me with this? Thanks!
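For orientation, triggering a sync through the config API is a POST to `/api/v1/connections/sync` with a `{"connectionId": ...}` body; an instance behind an auth proxy generally also needs an Authorization header, and a 405 can mean the proxy rejected the request before Airbyte saw it. A hedged sketch (the base URL and the basic-auth scheme are assumptions; Plural setups may instead front the API with an OIDC proxy, so check what your ingress expects):

```python
import base64
import json


def build_trigger_sync_request(connection_id: str, user: str, password: str,
                               base_url: str = "https://airbyte.example.com") -> dict:
    """Build a sync-trigger request with an HTTP Basic Authorization header.

    Illustrative only: shows the endpoint path, JSON body, and header shape
    so each piece can be verified against the instance's actual auth layer.
    """
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return {
        "url": f"{base_url}/api/v1/connections/sync",
        "headers": {
            "Content-Type": "application/json",
            "Authorization": f"Basic {token}",
        },
        "body": json.dumps({"connectionId": connection_id}),
    }
```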

    Ben Greene

    01/06/2023, 6:01 PM
Hi everyone, I'm running Airbyte on an EKS cluster. I am currently testing size, scope, and scale of the cluster, so I'm purposefully trying to blast it and identify how many concurrent sync jobs I can start before overloading the server. However, this has led our product team to pose an interesting question. When I overload the cluster, even before failure, we start to see performance of the web app be impacted. Is there any way we can separate the blast radius of our sync jobs from the performance of the web app? i.e. Can I push our daily sync jobs to the limit, but still be confident that our data science team can interact with and use the interface for adhoc/one-off work without being impacted from a performance standpoint? Thanks for all your help in advance! Really love the service, Airbyte is great! P.S. At a higher level, do you have any suggestions/templates/guidance for EKS Cluster Node Size, Amount, Configuration, etc.?

    Sean Glynn

    01/06/2023, 6:05 PM
Happy Friday, Airbyters! We're currently facing an issue deploying Airbyte via the Helm chart within our internal AKS cluster. Our infrastructure team blocks us from pulling any image references which do not have an explicit registry prefix. (Basically, we need to prepend all of our image references with docker.io/; for example, we would need to reference docker.io/busybox:latest to pull the latest busybox image from within our K8s infrastructure, due to a security policy which we cannot avoid.) As a result of this policy, we are unable to proceed with our Airbyte deployment 😞 I've opened an issue here: https://github.com/airbytehq/airbyte/issues/21123. I'm willing to help out on this in any way I can, as I would love to get Airbyte deployed on our k8s infrastructure asap. Any help is greatly appreciated, thank you. CC: @Tim Frazer
    🙏 1

    Lucas Migliorini De Freitas

    01/06/2023, 9:07 PM
Hello! I'm testing the MongoDB connector, writing into BQ. One column present in both tables (_id) is coming through with an unexpected value. When we extract via pymongo we get _id = {"uuid":"5e5f084c76ce47f3b1f9835c309f2471"}, but via Airbyte we get something like vWL0kuKdQqqNaH4VcWFR/A==. Can you advise on the best way to solve this?
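Editor's note: a value like `vWL0kuKdQqqNaH4VcWFR/A==` is 24 base64 characters, i.e. exactly 16 raw bytes, which is the shape of a BSON binary UUID that was base64-encoded on the way into BigQuery. A sketch of recovering the hex form (with the caveat, stated as an assumption, that legacy BSON subtype-3 UUIDs use a driver-specific byte order, so verify against a known document first):

```python
import base64
import uuid


def base64_binary_to_uuid(value: str) -> str:
    """Decode a base64 string holding a 16-byte binary UUID into hex form."""
    raw = base64.b64decode(value)
    if len(raw) != 16:
        raise ValueError(f"expected 16 bytes, got {len(raw)}")
    # uuid.UUID renders the 16 bytes as the familiar 8-4-4-4-12 hex string.
    return str(uuid.UUID(bytes=raw))
```

Comparing the output for a single document against what pymongo reports will show whether the bytes match directly or need reordering.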

    Arthur

    01/06/2023, 9:34 PM
Tried two ways to connect to a vendor's AWS-hosted RDS Postgres, using Airbyte OS and Cloud. Both failed in different ways with: FAILED TO FETCH SCHEMA. ERROR: non-json response
OS: 0.40.17. Whitelisted my local IP. The first attempt to connect worked and displayed the public schema. Went back and added a new schema explicitly; received the error above when setting up the connection. The total table size in the schema may be large. Got the server logs and docker logs for the worker (see attached, at 17:20+). Are there any other troubleshooting tips or logs? MESSAGE: HikariPool-1 - Connection is not available, request timed out after 10001ms.
Cloud: 0.40.23. Whitelisted the 3 IPs from the docs (later realized there are 8, so I need to re-test). Enabled SSL require. Received the error above when testing the source. I don't see any available logs in this case. Appreciate any help!

    Ignacio Contreras

    01/06/2023, 10:01 PM
Hello! I am having trouble with the Jira connector. I know that it is still in Alpha, but maybe someone has found a workaround. The documentation says (point 9) that it is possible to get the changelog; I turned on that option, but the column is empty. Is there any constraint on that field? Thanks in advance!

    Philip Johnson

    01/06/2023, 11:51 PM
Hey there! I'm developing a connector locally and running Airbyte via docker compose on my laptop. How does one pull the new Docker image down locally to make it available to the Airbyte UI running in Docker?

    Phani Sura

    01/07/2023, 3:07 AM
    Hello!

    Phani Sura

    01/07/2023, 3:08 AM
    I am trying to stream facebook ads data of all my customers who authenticate through my developer app.

    Phani Sura

    01/07/2023, 3:09 AM
I am able to get an access token, which is valid for only 60 days. Is there any mechanism where I can get an access token that never expires? Also, does Airbyte have any support for notifying when the token is about to expire?

    Peer Richelsen

    01/07/2023, 5:08 PM
Has someone been able to use the orbit.love connector? I am getting a generic error (using default settings). I know it says "Alpha connectors are in development and support is not provided. See our documentation for more details.", but I was wondering where I could find help; the link to the docs was not helpful.

    Rocky Appiah

    01/07/2023, 6:16 PM
    Using a serverless rds instance, if the instance is idle, sometimes the sync will fail since it’s waiting for it to spin up. How do I increase the timeout?