# troubleshooting
  • r

    Raj

    03/25/2022, 6:42 AM
    Hello!!! Our scheduled jobs are not getting triggered, whereas manually triggering the same jobs works fine. Scheduled jobs used to work fine. Restarting the server seems to solve the problem temporarily, but after a few hours the issue starts again. Any pointers on how to solve this problem would be really helpful.
    OS: Ubuntu
    Memory: 32GB / 200GB SSD
    Airbyte version: 35.15
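A first diagnostic step for stalled scheduled jobs, assuming a default docker-compose deployment with Airbyte's standard container names (an editor's sketch, not a confirmed fix):

```shell
# Check whether the scheduler container is still up, and what it last logged
# before jobs stopped firing.
docker ps --filter name=airbyte-scheduler
docker logs airbyte-scheduler --tail 100
```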
  • h

    Huib

    03/25/2022, 7:26 AM
    For some reason I can’t create a new topic in the troubleshooting discourse forum; it tells me to post in the sources or destinations categories, but these don’t seem to exist…
  • a

    Alexandre Chouraki

    03/25/2022, 9:09 AM
    Is this your first time deploying Airbyte: No
    OS Version / Instance: Ubuntu 20.04 on AWS EC2
    Deployment: Docker
    Airbyte Version: 0.35.53-alpha
    Source name: PostgreSQL (0.4.9)
    Destination: Snowflake (0.4.22)
    Step: Creating connection
    Hello! When creating a new connection with a PostgreSQL source, I get no tables displayed at all - Airbyte seems not to discover them. I have to set the source connector back to version 0.4.4 in order to see data...
  • f

    Fabian Benik

    03/25/2022, 10:19 AM
    Hello everyone 👋 I'm currently facing `No database selected` errors during normalization when syncing from SQL Server to MySQL. Can anyone help me fix this issue?
    Is this your first time deploying Airbyte: Yes
    OS Version / Instance: GCP GKE 1.20.15-gke.1000
    Deployment: Helm
    Airbyte version: 0.35.35-alpha
    Source name: Microsoft SQL Server (MSSQL) 0.3.17
    Destination: MySQL 0.1.18
    Step: Sync - Normalization
    I have attached the stripped failed log.
  • a

    Ali Hussain Mir

    03/25/2022, 12:09 PM
    @Eric how do you get the client ID and client secret for Lever?
  • n

    Nahid Oulmi

    03/25/2022, 3:25 PM
    Is this your first time deploying Airbyte: No
    OS Version / Instance: Debian 10 (buster) on AWS EC2, 8 cores, 16GB RAM
    Deployment: Docker
    Airbyte Version: 0.35.45-alpha
    Source name: MySQL (0.4.13) - Incremental
    Destination: BigQuery (0.5.0) - Deduped + history
    Step: Writing data to BigQuery
    Hello! My connection is running well on the source side, getting 8 million rows from 10 tables in MySQL, with the latest log being:
    2022-03-25 14:36:32 source > 2022-03-25 14:36:32 INFO i.a.i.s.m.MySqlSource(main):200 - {} - completed source: class io.airbyte.integrations.source.mysql.MySqlSource
    However, after the source succeeds there is no info from the destination worker; the latest log line is the one above about the source completing. The sync has stayed in the running state for 20 hours (so far), with no further logs to investigate, whether in the UI, the Docker containers, or the BigQuery console.
  • y

    Yanni Iyeze - Toucan Toco

    03/25/2022, 4:15 PM
    Hello, I ran into some issues when testing new streams on source-salesloft. I’m not sure I understand the error ...
  • j

    Johan Strand

    03/25/2022, 7:51 PM
    Is this your first time deploying Airbyte: Yes, tried removing everything and redoing it
    OS Version / Instance: macOS, GCP, e2-medium
    Memory / Disk: 30GB storage
    Deployment: Docker
    Airbyte Version: 0.35.60-alpha
    Step: Connect to Airbyte
    Description: Following the GCP guide https://docs.airbyte.com/deploying-airbyte/on-gcp-compute-engine#troubleshooting I connect via SSH successfully, but when I try the port forward
    gcloud beta compute ssh airbyte -- -L 8000:localhost:8000 -N -f
    I get the error:
    ERROR: (gcloud.beta.compute.ssh) PERMISSION_DENIED: Request had insufficient authentication scopes.
    - '@type': type.googleapis.com/google.rpc.ErrorInfo
      domain: googleapis.com
      metadata:
        method: compute.beta.InstancesService.Get
        service: compute.googleapis.com
      reason: ACCESS_TOKEN_SCOPE_INSUFFICIENT
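ACCESS_TOKEN_SCOPE_INSUFFICIENT usually means gcloud is authenticating with credentials (often a VM service account) whose OAuth scopes don't cover Compute Engine. A hedged sketch of one common fix - re-authenticating as your own user account - assuming you run gcloud from your workstation (`YOUR_PROJECT_ID` is a placeholder):

```shell
gcloud auth login                           # re-authenticate with full user scopes
gcloud config set project YOUR_PROJECT_ID   # placeholder project ID
gcloud beta compute ssh airbyte -- -L 8000:localhost:8000 -N -f
```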
  • a

    Andras N.

    03/25/2022, 8:44 PM
    Hey All 👋, I am new to Airbyte, just started using the cloud version (still in free trial). On the first day a Google Ads sync took around 7-9 minutes for a small account (17-18k records), but after a few days it started taking forever, like hours, and frequently terminating in errors, even with the same replication sync mode (full refresh & overwrite). Is there a simple explanation for this? If any additional info is needed, I'll be glad to provide it! Thanks!
  • m

    Marcos Marx (Airbyte)

    03/25/2022, 8:47 PM
    @Igor Moura why are you building the project from scratch? Are you planning to work on a feature or fix a bug? If not, use the latest version and run only `docker-compose up -d`; this will get the version from the `.env` file
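To illustrate Marcos's point: docker-compose substitutes the `VERSION` entry from `.env` into image tags like `airbyte/server:${VERSION}`, so `docker-compose up -d` runs whatever tag that file pins. A minimal sketch (the sample `.env` contents here are made up):

```shell
# Write a sample .env and extract the tag docker-compose would use.
printf 'VERSION=0.35.60-alpha\nDATABASE_USER=docker\n' > /tmp/sample.env
grep '^VERSION=' /tmp/sample.env | cut -d= -f2    # prints: 0.35.60-alpha
```

Upgrading is therefore just updating that `VERSION` value (or pulling a newer copy of the repo, which updates `.env`) and re-running `docker-compose up -d`.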
  • t

    Tarun Anand

    03/26/2022, 12:49 AM
    Hi, I am Tarun. Interestingly, I'm trying the same use case as @James Sullivan. I tried the open source package, but for some reason the connector doesn't work for Shopify - the password-based API is no longer supported. Then I headed over to the cloud instance and saw that there is no Shopify source in the connector list. Please help!
  • t

    Tarun Anand

    03/26/2022, 12:56 AM
    Is this your first time deploying Airbyte: Yes
    OS Version / Instance: Mac OS
    Memory / Disk: 16GB / 512GB SSD
    Deployment: Docker
    Airbyte Version: 0.35.59-alpha
    Source name/version: Shopify 001
    Destination name/version: Local CSV
    Step: Setting new connection, source / On sync
    Description: I'm trying to sync for the first time and the process ends in an error.
    requests.exceptions.ConnectionError: HTTPSConnectionPool(host='mystore00123.myshopify.com', port=443): Max retries exceeded with url: /admin/api/2021-07/checkouts.json?limit=250&order=updated_at+asc&updated_at_min=2022-03-01&status=any (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f8582012850>: Failed to establish a new connection: [Errno -2] Name does not resolve'))
    2022-03-26 003043 INFO i.a.w.DefaultReplicationWorker(lambda$getReplicationRunnable$5):305 - Total records read: 0 (0 bytes)
  • a

    Alex Bondar

    03/27/2022, 7:34 AM
    Is this your first time deploying Airbyte: No
    OS Version / Instance: Azure Standard_DS3_v2
    Deployment: K8s
    Airbyte Version: 0.35.51-alpha
    Source name/version: All
    Destination name/version: BigQuery
    Step: Scheduler
    Description: Periodic sync doesn't work in any of the connections; e.g. a connection with a sync frequency of once every 12 hours doesn't trigger
  • s

    Sigmundt Kappel

    03/27/2022, 4:45 PM
    tmp - I'm assuming these are temp tables. Will these tables convert into the shorter names after the full sync is done?
  • r

    Rytis Zolubas

    03/28/2022, 7:48 AM
    Hello, which Python file is used to create the dbt normalization models?
  • a

    Anton Escalante

    03/28/2022, 7:53 AM
    Does anyone have any idea why `normalized_at` != `current_timestamp` in my Snowflake destination? I ran this a few minutes ago
  • s

    Samir

    03/28/2022, 8:52 AM
    Hey @[DEPRECATED] Augustin Lafanechere, I was wondering where we are with this? Every time we want to add new tables or update existing table schemas, we have to create entirely new connections or delete and rebuild old ones. We have a lot of tables and data.
  • d

    Dimitris Bougas

    03/28/2022, 12:17 PM
    Is this your first time deploying Airbyte: Yes
    OS Version / Instance: Debian / n1-standard-2
    Memory / Disk: 40GB
    Deployment: GCP
    Airbyte Version: 0.35.59-alpha
    Source name/version: Oracle
    Destination name/version: MySQL
    Step: Connection is working
    Description: I have configured my pipeline and the sync is working. When I set the destination namespace to "Mirror source structure", I don't see any tables in my MySQL. When I set it to "Destination default", I saw the below:
  • a

    Alpana Shukla

    03/28/2022, 12:57 PM
    I am creating a Zendesk Support source. The source has been created successfully, but when I connect it to an MSSQL destination it gives me this `ERROR i.a.w.DefaultReplicationWorker(run):168 - Sync worker failed.` error in the log while syncing data. Attaching the log file for further reference. Thanks in anticipation. 😀
  • m

    Madara Ranawake

    03/28/2022, 1:26 PM
    Hello, are there any IP addresses associated with the platform that we can whitelist? (e.g. in an AWS RDS security group). Thanks.
  • m

    M B

    03/28/2022, 1:58 PM
    Hello everyone. I have a question regarding the Google Sheets connector. I created a connection from Google Sheets -> BigQuery, and Airbyte connects properly to both the source and the destination. I have one tab in the spreadsheet, and its schema is read correctly (header names and 'string' types are detected well). The job is marked as succeeded and there are no errors in the job log. But there is no result table in the data warehouse, only the _airbyte_raw table. In the logs I can see a message saying 1 row synced (0 bytes), yet the table has many more rows. What might be the issue? The header names do not have illegal characters.
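One way to see what actually landed in BigQuery, assuming the `bq` CLI is installed and that the raw table follows Airbyte's `_airbyte_raw_<stream>` naming (the dataset and stream names below are hypothetical):

```shell
# Count the rows Airbyte actually wrote to the raw table;
# if this is 1, the problem is on the read side, not normalization.
bq query --use_legacy_sql=false \
  'SELECT COUNT(*) AS raw_rows FROM `my_dataset._airbyte_raw_my_sheet_tab`'
```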
  • a

    Ali Hussain Mir

    03/28/2022, 3:33 PM
    Hello Team, I am working on bringing in data from Lever, and I have seen that there is a connector available on Airbyte. What Airbyte needs is a Client ID, Client Secret, and Refresh Token; what Lever provides us with is just an API key. How can we get the above information?
  • e

    Enrico Tuvera Jr.

    03/28/2022, 4:46 PM
    Hi guys, I had trouble with the HubSpot connector a week or so back, and even after the changes I'm still getting failed syncs. A new log file is attached
  • l

    Leonardo de Almeida

    03/28/2022, 6:22 PM
    Is this your first time deploying Airbyte: No
    Memory / Disk: 50GB
    Deployment: K8s
    Airbyte Version: 0.35.60-alpha
    Description: I'm upgrading my Airbyte version from 0.35.30-alpha to 0.35.60-alpha, but after the webapp, worker, server, scheduler, and temporal have been deployed I can't access the webapp. The logs seem to show an error with columns in the database. I'm using a Postgres RDS on Amazon. Do I have to execute something manually when upgrading the Airbyte version?
  • s

    Sunrise

    03/28/2022, 8:12 PM
    I’m trying to build Airbyte locally to evaluate it against our product needs. I’m on M1, so I’m using
    VERSION=dev docker-compose up
    which returns the following error. It seems like others are running into this issue, but I couldn’t quite find a resolution:
    Pulling init (airbyte/init:dev)...
    ERROR: manifest for airbyte/init:dev not found: manifest unknown: manifest unknown
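That "manifest unknown" error is expected for `:dev` tags: they are not published to Docker Hub and only exist after a local build. A hedged sketch of the flow the Airbyte developer docs of this era describe (the exact gradle invocation is an assumption; building on M1 may need additional steps):

```shell
# Build the airbyte/*:dev images locally, then start them.
SUB_BUILD=PLATFORM ./gradlew build
VERSION=dev docker-compose up
```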
  • m

    Marcos Marx (Airbyte)

    03/28/2022, 8:30 PM
    If you encounter any issues using Airbyte, check out our [Troubleshooting](https://discuss.airbyte.io/c/issues/11) forum. You’ll see how others have got their issues resolved, and our team will be there to assist if your issue hasn’t been encountered yet.
  • a

    Adam Schmidt

    03/29/2022, 12:13 AM
    Hey team, wondering what fine-tuning options are available for improving job throughput? I'm running a Gitlab -> Snowflake connection, and it's been running for 23 hours (weird, given that the start date was only set to the beginning of the year). Based on the logs, the ingest seems pretty slow at ~1,000 records every 6 minutes, but I'd have thought it could crunch through data a lot faster.
  • f

    Faisal Anees

    03/29/2022, 2:34 AM
    @s @Marcos Marx (Airbyte) Can you please respond to @Christian Persson’s question? We have the same question: writing to Salesforce is a requirement for us, and we're curious to know whether it's feasible at all with Airbyte. We're quite interested in building a Salesforce destination, but we first want to know if this flow is feasible in Airbyte
  • h

    Hamza Liaqat

    03/29/2022, 5:56 AM
    Hi Everyone, I'm using the Airbyte API (with Dagster). My source connector is GitHub. When I use the
    /sources/discover_schema
    endpoint (link) and my GitHub source has more than 10 repos, I get a timeout error on this particular endpoint only:
    Request to Airbyte API failed: HTTPConnectionPool(host='localhost', port=8000): Read timed out. (read timeout=15)
    When the number of repos is less than 10, it works fine and I get the source catalog. I don't face this issue when I use the Airbyte UI.
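Since the 15 s read timeout in that traceback is client-side, one workaround is calling the endpoint directly with a much longer timeout. A hedged curl sketch against a local deployment (the source UUID is a placeholder; the endpoint path is the one named in the message above, assumed to be served under /api/v1):

```shell
# Give schema discovery up to 5 minutes instead of the client's 15 s default.
curl --max-time 300 -X POST http://localhost:8000/api/v1/sources/discover_schema \
  -H 'Content-Type: application/json' \
  -d '{"sourceId": "YOUR-SOURCE-UUID"}'
```

If the Dagster integration exposes a request-timeout setting, raising it there would be the equivalent fix.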