# ask-community-for-troubleshooting
  • a

    Ashish Gupta

    07/04/2021, 12:56 PM
    any suggestions
  • s

    sajan P

    07/08/2021, 4:45 PM
    Hi, I am unable to bring airbyte up. Here is the issue I am receiving.
  • r

    Ranjith Kumar

    07/12/2021, 2:52 AM
    image.png
  • r

    Riya Tyagi

    07/19/2021, 10:23 AM
    Hi everyone, I just started with Airbyte. I'm using SFTP / the local file system as the source, and my CSV contains 700 columns, but after syncing I got 0 bytes and 0 records in the logs.
  • r

    Riya Tyagi

    07/19/2021, 10:24 AM
    The same setup works fine with 400 columns.
  • d

    Dean Lau

    09/06/2024, 8:37 AM
    Is "refreshes + remove" meant to remove a manually created table index?
  • d

    Dani Toro

    09/06/2024, 8:49 AM
    Hi all! I have a connection between Zendesk Support and a MySQL DB to export all tickets and comments. Comments is JSON with an array of the comment's attachments. In previous versions of the Zendesk Support source connector, this attachments array created a new table in MySQL with the data. Is there any way to nest this array into a separate table?
  • l

    Laurence Trefor Williams

    09/06/2024, 9:22 AM
    Hi all, I'm wondering: is it only possible to generate an access token with Self-Managed Enterprise? What about with self-hosted (free tier)?
  • d

    Daniel Holleran

    09/06/2024, 11:14 AM
    hi! In this documentation here, it says the heartbeat configuration for sources and destinations can be edited in the
    flags.yml
    file. How can these configurations be set in the Helm chart? Should I mount an edited version of this file into a particular deployment? If so, which one?
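    For reference, one possible (unofficial) approach is to ship a customized flags.yml in a ConfigMap and mount it into the relevant pods through the chart's extra-volume hooks. The values keys (worker.extraVolumes / worker.extraVolumeMounts), the choice of the worker deployment, the mount path, and the flag name below are all assumptions to verify against the Helm chart and the docs, not a confirmed recipe:
    Copy code
    # Hypothetical sketch: custom flags.yml delivered via a ConfigMap.
    # The flag name, values keys and mount path are assumptions -- verify against the docs.
    apiVersion: v1
    kind: ConfigMap
    metadata:
      name: airbyte-feature-flags
    data:
      flags.yml: |
        flags:
          - name: example-heartbeat-flag   # placeholder; use the flag name from the documentation
            serve: 10800
    ---
    # values.yaml fragment for the Airbyte Helm chart (assumed keys)
    worker:
      extraVolumes:
        - name: feature-flags
          configMap:
            name: airbyte-feature-flags
      extraVolumeMounts:
        - name: feature-flags
          mountPath: /flags.yml   # assumed location; confirm where Airbyte expects flags.yml
          subPath: flags.yml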
  • t

    Tigran Zalyan

    09/06/2024, 2:49 PM
    Hi everyone! I just deployed Airbyte to GKE using Helm. When I try to create a source, the request times out with a 502 error code every time. I'm using an external database. Everything works fine locally. Cluster configuration: 3 nodes, each with 8 GB RAM and 2 vCPUs. Has anyone faced this issue before?
  • f

    Fredrik

    09/06/2024, 2:58 PM
    Hi, I'm currently trying to configure an external MinIO instance with the latest Airbyte Helm chart but am running into issues with directing all S3 requests to MinIO properly. Here’s the key configuration I’m using in the Helm chart:
    Copy code
    - name: global.storage.type
      value: "minio"
    - name: global.storage.minio.endpoint
      value: "https://<masked-minio-endpoint>:443"
    - name: global.storage.storageSecretName
      value: "airbyte-custom-secret"
    - name: global.storage.minio.accessKeyIdSecretKey
      value: "MINIO_ACCESS_KEY"
    - name: global.storage.minio.secretAccessKeySecretKey
      value: "MINIO_SECRET_KEY"
    - name: global.storage.bucket.log
      value: "airbyte"
    - name: global.storage.bucket.state
      value: "airbyte"
    - name: global.storage.bucket.workloadOutput
      value: "airbyte"
    It seems that logs are being correctly stored in the external MinIO, but the state storage is not working as expected. Is there any additional configuration I should consider to fully support an external MinIO instance, or am I missing anything specific in the Helm chart? I would greatly appreciate any guidance or suggestions to troubleshoot this issue. Thanks in advance!
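    For anyone comparing setups: with the values above, global.storage.storageSecretName points at a Secret named airbyte-custom-secret, and that Secret must contain keys matching accessKeyIdSecretKey and secretAccessKeySecretKey. A minimal sketch of that Secret (credential values are placeholders):
    Copy code
    # Secret referenced by global.storage.storageSecretName in the values above.
    # Key names must match accessKeyIdSecretKey / secretAccessKeySecretKey.
    apiVersion: v1
    kind: Secret
    metadata:
      name: airbyte-custom-secret
    type: Opaque
    stringData:
      MINIO_ACCESS_KEY: <minio-access-key>   # placeholder
      MINIO_SECRET_KEY: <minio-secret-key>   # placeholder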
  • u

    user

    09/06/2024, 3:41 PM
    Comment on #45177 Pod-airbyte-bootloader-error Discussion answered by NAjustin @nikhilnicky1 The localhost errors you're seeing are a red herring . . . that's the stats collector, not Postgres. The line that actually diagnoses the issue is this one:
    Copy code
    Caused by: java.net.UnknownHostException: postgres.cluster-czcu8isw0xzm.us-east-1.rds.amazonaws.com
    It would seem that the host machine can't resolve that name, likely because it's an internal DNS record for AWS and the host machine doesn't have that zone available (it's definitely not a public hostname). I would start by making sure that you have the right connection details, and that the host machine for Airbyte can resolve the hostname (you can always test this using `nslookup`/`dig`/`ping`). Some hostnames in AWS are regional, meaning they'll only resolve for instances in the same region and not outside. In other cases, you may have to do additional configuration of the VPC settings or access rules. If you're ever not sure about connectivity, you can always install the postgres client and try to connect directly; sometimes you'll get a more helpful error message. (Or if nothing else you can rule out Airbyte as the issue and know where to better focus your efforts.) airbytehq/airbyte
  • u

    user

    09/06/2024, 7:19 PM
    Comment on #45182 Unable to list workspace or create connexion using self hosted airbyte python sdk Discussion answered by szemek https://reference.airbyte.com/reference/standalone-server-deprecation-and-migration-to-airbyte-server Instead of
    http://localhost:8000/api/v1
    use
    http://localhost:8000/api/public/v1
    airbytehq/airbyte
  • u

    user

    09/06/2024, 7:53 PM
    #45201 Ozone Connector. New discussion created by jonatasamorim Why not a Cloudera Ozone destination connector? airbytehq/airbyte
  • j

    Jon Seymour

    09/07/2024, 3:28 AM
    Re: https://airbytehq.slack.com/archives/C01AHCD885S/p1725679219551029 I am having trouble setting up data synchronization on a new abctl instance. To the best of my knowledge: • source connectivity is working (evidence: I have synced a single stream with this source connector) • destination connectivity is working (evidence: I have synced a single stream with this destination connector) • the node is not under-provisioned (evidence: 4 CPU, 16 GB, no pending pods due to resource constraints) • there are no errors in the connection specification (evidence: no error messages, of any type, in the connection logs) Yet when I extend the selection to a wider set of 8 tables, the sync does not work, with these symptoms: • no indication of errors in the connection logs or any k8s log • no pending pods • destination tables are created • the orchestration, source, and destination logs show everything to be "running" but idle, even after 30 minutes of apparently doing nothing • node CPU is idle • there is no evidence of data being transferred, nor any indication of why data cannot be transferred
  • p

    Parth Agrawal

    09/08/2024, 1:38 PM
    Please help. Problem status: data is being fetched completely from Amazon Ads, but not all rows are ingested into BigQuery; some are missing. Can you help me with this?
  • b

    Brian Henson

    09/08/2024, 3:58 PM
    I am working on updating the community connector for QuickBooks to include two streams,
    CreditCardPayments
    and
    GeneralLedger
    that are not included in the current connector's definitions. Steps I have taken so far: • Cloned repo and modified the
    manifest.yaml
    file to include the missing streams • Created the
    secrets/config.json
    which is required for integration testing • Built a local development container with
    airbyte-ci connectors --name source-quickbooks build
    • Built an image with my changes by creating a Dockerfile in the
    airbyte-integrations/connectors/source-quickbooks
    directory and running
    sudo docker build -t airbyte/source-quickbooks:dev .
    • Ran integration tests using the above
    secrets/config.json
    to validate my
    manifest.yaml
    configuration by running
    docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-quickbooks:dev check --config /secrets/config.json
    When the integration test completes I receive the following errors:
    Copy code
    {"type": "LOG", "log": {"level": "ERROR", "message": "Encountered an error trying to connect to stream accounts. Error:
    Traceback (most recent call last):
    File \"/usr/local/lib/python3.10/site-packages/airbyte_cdk/sources/declarative/checks/check_stream.py\", line 42, in check_connection
    stream_is_available, reason = availability_strategy.check_availability(stream, logger, source)
    File \"/usr/local/lib/python3.10/site-packages/airbyte_cdk/sources/streams/http/availability_strategy.py\", line 50, in check_availability
    get_first_record_for_slice(stream, stream_slice)
    File \"/usr/local/lib/python3.10/site-packages/airbyte_cdk/sources/streams/utils/stream_helper.py\", line 40, in get_first_record_for_slice
    return next(records_for_slice)
    File \"/usr/local/lib/python3.10/site-packages/airbyte_cdk/sources/declarative/declarative_stream.py\", line 126, in read_records
    yield from self.retriever.read_records(self.get_json_schema(), stream_slice)
    File \"/usr/local/lib/python3.10/site-packages/airbyte_cdk/sources/declarative/retrievers/simple_retriever.py\", line 339, in read_records
    self.cursor.close_slice(_slice, most_recent_record_from_slice)
    File \"/airbyte/integration_code/source_quickbooks/components.py\", line 72, in close_slice
    most_recent_record=LastRecordDictProxy(most_recent_record, {self.cursor_field.eval(self.config): \"MetaData/LastUpdatedTime\"}),
    AttributeError: 'str' object has no attribute 'eval'
    "
    }}
    Does anybody have experience with this build pipeline who might be willing to help me work through these issues?
  • u

    user

    09/08/2024, 6:26 PM
    #45333 Webhook URL in Powered by Airbyte New discussion created by sudip-mondal-2002 Topic Powered by Airbyte Relevant information We can add webhook URLs from Airbyte Cloud. Can we also add them through Powered by Airbyte? Please update Powered by Airbyte with this endpoint as well. airbytehq/airbyte
  • s

    Scheduled message

    09/09/2024, 4:00 AM
    Please post your weekly update in thread🧵. Thanks, team!
  • d

    Daniel Shaw

    09/09/2024, 5:55 AM
    Hey all, I'm running Airbyte under docker-compose via run-ab-platform.sh. I understand this is being deprecated, but right now I have to update the version to enable deduplication on full refresh. However, upgrading VERSION in the .env fails when trying to pull airbyte-proxy from Docker Hub. How is the script meant to work if not all versions of the services are published to Docker Hub?
  • b

    Balakumar Krishnasamy

    09/10/2024, 4:47 PM
    Hi all, I have configured an Oracle to Redshift connection (both connectors on the latest version). For the Redshift destination connector I have enabled the option "Purge Staging Files and Tables". However, I do not see the files purged after the sync. Has anyone faced this issue? Any inputs?
  • h

    Hari Gudla

    09/10/2024, 4:48 PM
    Hey team, I seem to have hit a roadblock. I’ve set up an Airbyte connection to capture CDC data from an MSSQL source database into Snowflake. During a schema drift (where I added a new column and dropped an old one), Airbyte detected the stream changes, as shown in the attached image. However, I’m unable to select either option to properly refresh the target warehouse. The issue I’m facing is that if I allow both options, Airbyte creates the new column but also deletes the old column from the target SCD table, resulting in data loss in the warehouse. Any ideas on what I might be missing here or how to resolve this?
  • s

    Shivani Bothra

    09/10/2024, 5:13 PM
    Hi team, I have a scenario where I am using date parameters to filter my data in an API request. I see we have the option to pass the parameter using the ValueList, but when I put now_utc in the ValueList it passes now_utc as-is without converting it into an actual date, and when I use it in the Request Body JSON as {{ stream_partition.date_from }} it does not convert the date either and passes it as-is. Is there anything I am doing wrong? Can we pass date parameters like this using the ValueList in a parameterized request? Thanks
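    For context, a value passed through a ValueList is treated as a literal string; macros such as now_utc() are only evaluated inside {{ }} interpolation. A minimal, hypothetical manifest fragment that interpolates the date directly in the request body instead (the field name, URL and date format are placeholders, not from the original post):
    Copy code
    # Hypothetical low-code manifest fragment: evaluate now_utc() via interpolation
    # instead of passing the literal string "now_utc" through a ValueList.
    requester:
      type: HttpRequester
      url_base: https://api.example.com   # placeholder
      http_method: POST
      request_body_json:
        date_from: "{{ now_utc().strftime('%Y-%m-%d') }}"   # placeholder field name and format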
  • b

    Blake Miller

    09/10/2024, 6:44 PM
    Hi team, after weeks of running smoothly, my login screen now gives the error
    Cannot read properties of undefined (reading 'edition')
    I tried reinstalling via abctl (abctl local uninstall / abctl local install) and rebooting the machine. I'm scared to try an uninstall that purges data because I have a lot of info saved in those connections. Does anyone know how I can start debugging this? OS: Ubuntu 22.04 abctl: Latest Airbyte: Latest
  • m

    Madison Mae

    09/10/2024, 7:07 PM
    Hi all, I'm trying to ingest CSV files from S3 into my warehouse. However, the CSV files don't have the .csv extension due to the way AWS creates part files. Do these need to have the .csv extension in order to be ingested with the CSV file format?
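    For what it's worth, in the file-based S3 source the file format is declared per stream and files are matched by glob patterns, so a .csv suffix shouldn't be strictly required as long as the glob matches the part files. A rough sketch of the idea (bucket, stream name and glob are placeholders, and the exact option names should be checked against the connector spec):
    Copy code
    # Rough sketch of an S3 source stream definition: the format is declared explicitly,
    # so matched files do not need a .csv extension. Names and globs are placeholders.
    bucket: my-bucket
    streams:
      - name: part_files
        globs:
          - "exports/part-*"   # matches AWS part files that have no extension
        format:
          filetype: csv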
  • m

    Mikanoch

    09/10/2024, 9:03 PM
    Hi, is there a way to install airbyte with
    abctl
    and email populated without having to open a browser and go through the registration process? • I tried using
    abctl local install --secret secrets.yaml
    And the
    secrets.yaml
    looks like below. It did not work.
    Copy code
    apiVersion: v1
    kind: Secret
    metadata:
      name: airbyte-auth-secrets
    type: Opaque
    stringData:
      instance-admin-email: airbyte@myemail.com
    • I tried
    abctl local credentials --email airbyte@myemail.com
    but got an error: unexpected status code: 401. It only worked after I opened a browser and registered with a random email. How do I skip the browser registration process after the installation?
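    One thing worth double-checking (an assumption, not a confirmed fix): the airbyte-auth-secrets Secret is usually expected to carry a password alongside the email, so a Secret with only instance-admin-email may not be enough to pre-register the instance. A sketch extending the Secret from the message above (the password key and value are assumptions/placeholders to verify against the abctl docs):
    Copy code
    # Sketch: same Secret as in the message above, extended with an admin password.
    # instance-admin-password is an assumed key; verify it against the abctl/auth docs.
    apiVersion: v1
    kind: Secret
    metadata:
      name: airbyte-auth-secrets
    type: Opaque
    stringData:
      instance-admin-email: airbyte@myemail.com
      instance-admin-password: <choose-a-password>   # placeholder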
  • o

    Oluwapelumi Adeosun

    09/10/2024, 9:12 PM
    Hello everyone, I have some streams set to full refresh | overwrite, and they don't work as expected: only a few rows are ingested during every sync unless I reset the stream and run a sync afterwards. The source is MySQL and the destination is Snowflake. I'd appreciate any opinions on how to fix this.
  • n

    Nikhil Chittimalla

    08/30/2024, 5:17 AM
    Hello everyone, I am getting an Airbyte bootloader error when I try to deploy Airbyte with Helm on EKS. Can you please help with this?
    airbyte-bootloader-logs.rtf
  • m

    Madison Mae

    09/10/2024, 9:35 PM
    Was this issue on empty CSVs to S3 ever resolved? What is the solution? https://discuss.airbyte.io/t/source-s3-issue-reading-csv-when-only-header-row-present/1155
  • a

    Adam Marcus

    09/11/2024, 2:43 AM
    Hello! After having no issues for >1 month with a self-hosted docker compose-based installation, we upgraded to
    abctl
    today. We had to double our machine size due to CPU limitations, but are now getting
    io.airbyte.commons.exceptions.ConnectionErrorException: java.sql.SQLTransientConnectionException: HikariPool-1 - Connection is not available, request timed out after 10196ms (total=0, active=0, idle=0, waiting=0)
    (for a Postgres source). Has anyone seen / resolved this?