# troubleshooting
  • Raj

    04/04/2022, 12:01 PM
    Hello!!! Our scheduled jobs are not getting triggered, whereas manually triggering the same jobs works fine. Scheduled jobs used to work fine. Restarting the server solves the problem temporarily, but after a few hours the issue starts again. We found the cause: connections to the database are not being closed properly, leaving idle connections that eventually exhaust the maximum number of connections our Postgres DB allows. Is there any solution for this issue? Currently we are running a script to close all the idle connections. Is this something that can be fixed by Airbyte?
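    A cleanup job like the one described above can be a few lines of Python. This is a minimal sketch, assuming psycopg2 is installed and the placeholder credentials below belong to a role allowed to call pg_terminate_backend; it is a workaround on the database side, not an Airbyte-side fix.

    import psycopg2

    # Placeholder connection details -- point this at the Postgres instance
    # that is accumulating idle connections.
    conn = psycopg2.connect(host="localhost", dbname="airbyte",
                            user="postgres", password="secret")
    conn.autocommit = True
    with conn.cursor() as cur:
        # Terminate backends that have sat idle for more than 10 minutes,
        # skipping this script's own connection.
        cur.execute("""
            SELECT pg_terminate_backend(pid)
            FROM pg_stat_activity
            WHERE state = 'idle'
              AND state_change < now() - interval '10 minutes'
              AND pid <> pg_backend_pid();
        """)
    conn.close()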
  • Alban van Rijsewijk

    04/04/2022, 4:29 PM
    Is this your first time deploying Airbyte: Yes Airbyte Version: 0.22.2-alpha Source name: Intercom Destination name: Snowflake Step: Setting new connection, source / On sync Description: the sync is successful when selecting “raw data normalization”, but I get normalization issues with the “basic normalization” option. Do you have any idea why this is happening?
  • Connor Lough

    04/04/2022, 4:58 PM
    Is this your first time deploying Airbyte: Yes Airbyte Version: 0.35.61-alpha Source name: MSSQL Destination name: S3 Step: Setting new destination, on sync with output format AVRO Description: the sync is successful for PARQUET and other output formats, but AVRO fails with Internal Server Error: null.
  • Kris Hagel

    04/04/2022, 7:01 PM
    Is this your first time deploying Airbyte: No OS Version / Instance: EC2 t3a.xlarge Memory / Disk: 16GB Deployment: Docker Airbyte Version: 0.35.64-alpha Source name/version: File 0.2.9 Destination name/version: Redshift 0.3.28 Step: Setting new connection, source / On sync Description: Multiple sources from SFTP over to Redshift are failing on the first sync. These work just fine with a MySQL destination, but we are trying to move them over to Redshift now. This does not affect every SFTP connection, but a large number of them are now failing.
  • Marcos Marx (Airbyte)

    04/04/2022, 8:00 PM
    If you encounter any issues using Airbyte, check out our Troubleshooting forum. You’ll see how others have got their issues resolved, and our team will be there to assist if your issue hasn’t been encountered yet.
  • Jerri Comeau (Airbyte)

    04/04/2022, 8:28 PM
    Hi there! Thank you for your time posting in the Troubleshooting Slack Channel. This week we’re moving our support discussions over to discuss.airbyte.io; if you would be so kind as to post your questions there, you’ll be able to ask for help from fellow users as well as the Airbyte team. It is our intention that as of Friday afternoon (Pacific Time) we will lock this channel to everything except administrator posts. We’ll leave everything in place for the rest of April if you need to get information or troubleshooting from existing threads, with the intention of archiving the channel in early May. We really appreciate everyone helping us to build our community, and we’re looking forward to the next phase. Sincerely, Jerri and the Airbyte team.
  • gunu

    04/04/2022, 10:56 PM
    Is this your first time deploying Airbyte: no OS Version / Instance: Linux EC2 m5.4xlarge Deployment: Docker Airbyte Version: 0.35.64-alpha Source: Survey Monkey 0.1.7 Destination: Snowflake 0.4.24 Description:
    Please check with maintainers if the connector or library code should safely clean up its threads before quitting instead.
    Is this considered an issue? Our EC2 Docker instance crashes occasionally after running without problems for a while, and this often follows attempts to use or troubleshoot the Survey Monkey connector.
    2022-04-04 22:50:33 destination > 2022-04-04 22:50:33 WARN i.a.i.b.IntegrationRunner(watchForOrphanThreads):229 - The main thread is exiting while children non-daemon threads from a connector are still active.
    2022-04-04 22:50:33 destination > Ideally, this situation should not happen...
    2022-04-04 22:50:33 destination > Please check with maintainers if the connector or library code should safely clean up its threads before quitting instead.
    2022-04-04 22:50:33 destination > The main thread is: main (RUNNABLE)
    2022-04-04 22:50:33 destination >  Thread stacktrace: java.base/java.lang.Thread.getStackTrace(Thread.java:1610)
    2022-04-04 22:50:33 destination >         at io.airbyte.integrations.base.IntegrationRunner.dumpThread(IntegrationRunner.java:264)
    2022-04-04 22:50:33 destination >         at io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:233)
    2022-04-04 22:50:33 destination >         at io.airbyte.integrations.base.IntegrationRunner.runConsumer(IntegrationRunner.java:190)
    2022-04-04 22:50:33 destination >         at io.airbyte.integrations.base.IntegrationRunner.lambda$runInternal$1(IntegrationRunner.java:163)
    2022-04-04 22:50:33 destination >         at io.airbyte.integrations.base.sentry.AirbyteSentry.executeWithTracing(AirbyteSentry.java:54)
    2022-04-04 22:50:33 destination >         at io.airbyte.integrations.base.sentry.AirbyteSentry.executeWithTracing(AirbyteSentry.java:38)
    2022-04-04 22:50:33 destination >         at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:163)
    2022-04-04 22:50:33 destination >         at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:105)
    2022-04-04 22:50:33 destination >         at io.airbyte.integrations.destination.snowflake.SnowflakeDestination.main(SnowflakeDestination.java:30)
    2022-04-04 22:50:33 destination > 2022-04-04 22:50:33 WARN i.a.i.b.IntegrationRunner(watchForOrphanThreads):243 - Active non-daemon thread: pool-4-thread-1 (TIMED_WAITING)
    2022-04-04 22:50:33 destination >  Thread stacktrace: java.base@17.0.1/jdk.internal.misc.Unsafe.park(Native Method)
    2022-04-04 22:50:33 destination >         at java.base@17.0.1/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:252)
    2022-04-04 22:50:33 destination >         at java.base@17.0.1/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:1672)
    2022-04-04 22:50:33 destination >         at java.base@17.0.1/java.util.concurrent.LinkedBlockingQueue.poll(LinkedBlockingQueue.java:460)
    2022-04-04 22:50:33 destination >         at java.base@17.0.1/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1061)
    2022-04-04 22:50:33 destination >         at java.base@17.0.1/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1122)
    2022-04-04 22:50:33 destination >         at java.base@17.0.1/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
    2022-04-04 22:50:33 destination >         at java.base@17.0.1/java.lang.Thread.run(Thread.java:833)
    2022-04-04 22:50:33 destination > 2022-04-04 22:50:33 INFO i.a.i.b.FailureTrackingAirbyteMessageConsumer(close):65 - Airbyte message consumer: succeeded.
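    The warning is about non-daemon worker threads outliving the connector's main thread. The connector itself is Java, but the cleanup the log asks for amounts to shutting worker pools down before exit; a minimal Python sketch of the idea follows (the names are illustrative, not Airbyte code):

    from concurrent.futures import ThreadPoolExecutor

    def flush_buffer(batch):
        # Stand-in for the per-batch work a destination connector does.
        return len(batch)

    # Non-daemon worker threads: if the pool is never shut down, they can
    # outlive the main thread -- the situation the log above warns about.
    pool = ThreadPoolExecutor(max_workers=4)
    try:
        futures = [pool.submit(flush_buffer, [1, 2, 3]) for _ in range(8)]
        results = [f.result() for f in futures]
    finally:
        # Explicit cleanup before exiting, so no orphan threads remain.
        pool.shutdown(wait=True)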
  • Shubham Kumar

    04/05/2022, 5:56 AM
    Hey guys, can someone please help with this?
  • Rytis Zolubas

    04/05/2022, 7:22 AM
    The S3 v0.3 destination deleted all my data in the bucket... I read in the documentation why, but there should never be an option to delete something after an upgrade. Pretty furious about this one.
  • Ha Pham

    04/05/2022, 8:42 AM
    Hi, we have just upgraded Airbyte from v0.26.1 to v0.35.64-alpha, and a sync from Hubspot to BigQuery (denormalized destination) failed.
  • Haithem (WOOP)

    04/05/2022, 8:42 AM
    Hey, is Airbyte affected by CVE-2022-22965, the Spring remote code execution vulnerability?
  • Kyle Vorster

    04/05/2022, 8:49 AM
    Hey all, hope I am in the correct channel. I've been having issues pulling data from Google Analytics. Version: 0.35.65-alpha airbyte/source-google-analytics-v4: 0.1.17 airbyte/destination-s3: 0.3.0 I'm using S3 to store the data from Google Analytics, and I keep getting these errors:
    2022-04-05 08:41:12 INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):235 - Stopping temporal heartbeating...
    2022-04-05 08:41:12 INFO i.a.v.j.JsonSchemaValidator(test):56 - JSON schema validation failed. 
    errors: $.format_type: does not have a value in the enumeration [Avro], $.compression_codec: string found, object expected, $.compression_codec: should be valid to one and only one of the schemas 
    2022-04-05 08:41:12 INFO i.a.v.j.JsonSchemaValidator(test):56 - JSON schema validation failed. 
    errors: $.flattening: is missing but it is required, $.format_type: does not have a value in the enumeration [CSV]
    2022-04-05 08:41:12 INFO i.a.v.j.JsonSchemaValidator(test):56 - JSON schema validation failed. 
    errors: $.format_type: does not have a value in the enumeration [JSONL]
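    Reading those errors: the saved format config matches none of the three sub-schemas, because compression_codec is a plain string where the Avro schema expects an object, flattening is missing for the CSV schema, and format_type matches neither CSV nor JSONL. A sketch of the shape the Avro branch appears to expect, inferred only from the messages above (the codec key and value are assumptions; check the S3 destination spec):

    # Inferred from the validation errors, not the authoritative S3 connector spec.
    # Key point: compression_codec must be an object, not a bare string.
    s3_format = {
        "format_type": "Avro",
        "compression_codec": {
            "codec": "no compression",  # assumed key/value; verify against the spec
        },
    }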
  • Kiran Desi Reddy Venkata

    04/05/2022, 1:01 PM
    Hi team, we are using version 0.35.60-alpha running in Kubernetes, and we have a question from our POC on Airbyte for data ingestion from MSSQL Server to a Snowflake data warehouse. In the Airbyte MSSQL Server source connection, the replication method has the following description: "Replication Method - The replication method used for extracting data from the database. STANDARD replication requires no setup on the DB side but will not be able to represent deletions incrementally. CDC uses {TBC} to detect inserts, updates, and deletes. This needs to be configured on the source database itself." When we tried the CDC option, Airbyte threw the error "MSSQL requires snapshot isolation enabled for CDC". We have gone through the usage of the snapshot isolation configuration for MSSQL Server in detail, and we have one question: 1. Why do we need to enable the "ALLOW_SNAPSHOT_ISOLATION" DB option on the MSSQL Server databases? This is not a requirement of CDC in MSSQL Server. Can anyone clarify the intended use of the "ALLOW_SNAPSHOT_ISOLATION" DB option? (We don't want to enable it on MSSQL Server unless we understand its intended use.)
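    For what it's worth, Airbyte's MSSQL CDC is built on Debezium, and as I understand it the requirement comes from the initial snapshot phase: reading under SNAPSHOT isolation gives a consistent point-in-time view of the tables without holding locks against writers, which plain CDC capture alone does not need. Enabling it is one T-SQL statement; a sketch using pyodbc with placeholder connection details and an illustrative database name:

    import pyodbc

    # Placeholder DSN; run as a login with ALTER DATABASE permission.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=myserver;DATABASE=master;UID=sa;PWD=secret",
        autocommit=True,
    )
    cur = conn.cursor()
    # Illustrative database name -- substitute the database being replicated.
    cur.execute("ALTER DATABASE MyDatabase SET ALLOW_SNAPSHOT_ISOLATION ON")
    conn.close()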
  • Hamza Liaqat

    04/05/2022, 2:10 PM
    With the GitHub source and Postgres destination, each sync misses some data. Specifically, for a set of repos, the size of _airbyte_raw_issues is 1041 after sync 1, for example. Yet if I rerun this experiment from scratch, the size of _airbyte_raw_issues would be 1030 (for example), so it's not reproducible. I ran many experiments today, but Airbyte seems to miss at least a few records in the raw issues table in each sync.
  • Pranit

    04/05/2022, 2:35 PM
    My source and target are both set up properly, but it still gives me the error below. Source: Klaviyo. Target: BigQuery.
    errors: $.credential: is not defined in the schema and the schema does not allow additional properties, $.part_size_mb: is not defined in the schema and the schema does not allow additional properties, $.gcs_bucket_name: is not defined in the schema and the schema does not allow additional properties, $.gcs_bucket_path: is not defined in the schema and the schema does not allow additional properties, $.keep_files_in_gcs-bucket: is not defined in the schema and the schema does not allow additional properties, $.method: must be a constant value Standard
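    Reading that error: the saved BigQuery destination config carries GCS staging fields (credential, gcs_bucket_name, and so on) while the loading-method branch being validated only accepts method set to the constant "Standard", with no extra keys. A sketch of the two shapes, inferred from the message alone rather than the connector spec:

    # Inferred from the validation errors, not the authoritative BigQuery spec.
    # The "Standard" loading method accepts no GCS staging fields at all:
    standard_loading = {"method": "Standard"}

    # The staging fields named in the error (credential, gcs_bucket_name,
    # gcs_bucket_path, part_size_mb, ...) presumably belong under a GCS-staging
    # loading method branch instead of alongside "Standard".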
  • Pablo L.

    04/05/2022, 2:45 PM
    Is this your first time deploying Airbyte: Yes OS Version / Instance: Ubuntu 20.04 Memory / Disk: 32GB / 1TB SSD Deployment: Docker Airbyte Version: 0.35.?-alpha Source name/version: Microsoft SQL Server 2014 - 12.0.5687.1 Destination name/version: Microsoft SQL Server 2019 - 15.0.2080.9 Step: Performed some 300 GB worth of syncs, but lost all connection data in Airbyte after running docker-compose down -v. Would like to restore the connection data without having to rerun the 30-hour sync process.
  • Jing Zhang

    04/05/2022, 4:18 PM
    Hi team, does Airbyte have a port number that we can use for healthcheck? Trying to set it up with cloud load balancer on GCP. Thanks!
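    For later readers: the Airbyte server has historically exposed a health endpoint at GET /api/v1/health on the API port (8001 in the default Docker compose), which works as a load-balancer health check. A sketch of a probe, with the port and path as assumptions to verify against your version's API docs:

    import requests

    # Assumed defaults for a Docker deployment: server API on port 8001,
    # health endpoint at /api/v1/health.
    resp = requests.get("http://localhost:8001/api/v1/health", timeout=5)
    resp.raise_for_status()
    print(resp.json())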
  • Kevin Chan

    04/05/2022, 4:32 PM
    Hi. I just started using Airbyte and set it up on DigitalOcean. I'm trying to sync Harvest to MySQL. However, it keeps failing at normalization.
  • Kevin Chan

    04/05/2022, 4:53 PM
    Never mind, fixed it by turning off normalization, as MySQL is not version 8.
  • komal azram

    04/05/2022, 7:18 PM
    Hi everyone, can we use a local directory as a source?
  • Andrés Bravo

    04/05/2022, 9:17 PM
    I’m having this same issue when upgrading from v0.35.3-beta to v0.35.46-beta. Upgrading is fine, but after that, exporting and importing the same file fails with this message. Going to v0.35.30-beta first does not help.
  • Steve Reeling

    04/05/2022, 10:36 PM
    Here is a log of a sync.
  • komal azram

    04/05/2022, 10:43 PM
    What should the URL be here when I am trying to use a local directory as a source?
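    For later readers: on Docker deployments, Airbyte has historically mounted a host folder (default /tmp/airbyte_local) into the containers as /local, so the File source is pointed at a path under /local rather than a host path. A sketch of such a config, where the field names and the "local" storage value are assumptions to check against the File source docs:

    # Assumed Docker defaults: host /tmp/airbyte_local appears in-container as /local.
    file_source_config = {
        "dataset_name": "my_dataset",
        "format": "csv",
        "provider": {"storage": "local"},  # assumed storage value
        "url": "/local/my_file.csv",       # file placed at /tmp/airbyte_local/my_file.csv on the host
    }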
  • Piyush Bajaj

    04/06/2022, 8:01 AM
    Hey team, I have a concern. I am moving data from MongoDB to BigQuery, and all the nested columns are there in the raw table, but in the final table the values are null. I tried the same thing back in December; at that time the JSON objects came through in string format, but now they come through blank. Can anyone help me resolve this?
  • Abdulrahman Abu Tayli

    04/06/2022, 8:23 AM
    Hi everyone! I’m having the same issue described in these tickets, where I can’t run custom transformations on Kubernetes: 1. https://github.com/airbytehq/airbyte/issues/9899 2. https://github.com/airbytehq/airbyte/issues/5091 One of the proposed solutions is to ship a custom Docker image for dbt. Is there a detailed guide for how to do that and what changes are required?
  • Isaac Harris-Holt

    04/06/2022, 9:36 AM
    Is this your first time deploying Airbyte: Yes OS Version / Instance: Windows 10 Memory / Disk: 32GB / 1TB Deployment: Docker Airbyte Version: 0.35.42-alpha Source name/version: Postgres 4.4 Destination name/version: MS SQL Server 0.1.15 Step: Incremental sync - deduped + history Description: I am using incremental sync with dedupe and history for a Postgres to MS SQL Server connection. The tables all have primary keys defined in the source, and the cursor field is the UpdatedDate field. After the initial sync, existing records that get changed in the source are getting updated in the destination, as expected. However, new records entered in the source after the initial sync are not being added to the destination table or the destination _scd table. Any ideas why this would happen?
  • Isaac Harris-Holt

    04/06/2022, 9:43 AM
    Is this your first time deploying Airbyte: No OS Version / Instance: Alpine Linux Memory / Disk: 8GB / 32GB Deployment: Docker Airbyte Version: 0.35.47-alpha Source name/version: MySQL / 0.5.6 Destination name/version: BigQuery / 0.6.11 Step: Incremental | Append Description: I'm using incremental sync on an auto-incrementing integer ID for a few tables, and this is usually fine. However, there was an instance where Airbyte failed to run a job on the first attempt, then succeeded on the second. Unfortunately, this has created some duplicate entries in the BigQuery dataset, and I'm not sure why this has happened. What's the normal best practice for making sure this doesn't occur again? I'm thinking of switching to the Deduped write mode, but it seems like this issue shouldn't happen with an incremental run in the first place.
  • Matheus Guinezi

    04/06/2022, 1:26 PM
    Hi guys! I am trying to make Airbyte use the facebook_business Python library v13 instead of v12 in the source-facebook-marketing setup.py, but it keeps giving the same error, even when I start from scratch and rebuild the containers and images. Is the way I am getting the lib from git correct? Is there any cache that could prevent me from getting this facebook_business source code?
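    Two things usually matter here: the dependency line itself, and Docker layer caching serving a stale image unless you rebuild with docker build --no-cache. A PEP 508 direct reference is the usual way to pin setup.py to a git revision; a sketch, where the repo URL and tag name are illustrative rather than checked:

    from setuptools import setup, find_packages

    setup(
        name="source-facebook-marketing",
        packages=find_packages(),
        install_requires=[
            # PEP 508 direct reference to a git tag (URL and tag are illustrative).
            "facebook_business @ git+https://github.com/facebook/facebook-python-business-sdk.git@v13.0.0",
        ],
    )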
  • Pranit

    04/06/2022, 5:13 PM
    Team, I need help resolving the error below. My input is Klaviyo and my output is BigQuery; both connections are set up properly, but the sync still fails with this error.
    2022-04-06 17:12:06 INFO i.a.v.j.JsonSchemaValidator(test):56 - JSON schema validation failed. 
    errors: $.credential: is not defined in the schema and the schema does not allow additional properties, $.part_size_mb: is not defined in the schema and the schema does not allow additional properties, $.gcs_bucket_name: is not defined in the schema and the schema does not allow additional properties, $.gcs_bucket_path: is not defined in the schema and the schema does not allow additional properties, $.keep_files_in_gcs-bucket: is not defined in the schema and the schema does not allow additional properties, $.method: must be a constant value Standard
  • Schuyler Duveen

    04/06/2022, 7:57 PM
    Is this your first time deploying Airbyte: Yes Airbyte Version: 0.35.65-alpha Source name: BigQuery Destination name: Postgres Step: On sync Description: the sync is successful and the schema is mapped correctly, but there are no records. The logs show
    i.a.w.DefaultReplicationWorker(lambda$getReplicationRunnable$5):305 - Total records read: 1 (0 bytes)