# troubleshooting
• Rodrigo Menezes (07/12/2021, 8:42 PM)
    I’m replicating Postgres -> Postgres. Airbyte reported a successful sync, but zero rows were copied over for a few tables. Is this your first time deploying Airbyte: No OS Version / Instance: Ubuntu 20 Memory / Disk: t3.medium (8GB memory, not sure disk) Deployment: Docker Airbyte Version: 0.27.0-alpha Source name/version: Postgres 0.3.6 Destination name/version: Postgres 0.3.6 Step: Manual sync Description: I see a “Succeeded”, but I have zero rows for many tables. Using CDC. The only errors I see are:
```
2021-07-12 20:10:45 INFO () JsonSchemaValidator(test):76 - JSON schema validation failed.
errors: $: unknown found, object expected
2021-07-12 20:10:45 ERROR () DefaultAirbyteStreamFactory(lambda$create$1):83 - Validation failed: null
```
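The error above means the connector emitted a value that is not a JSON object where the schema expects one; records that fail validation are not written, which would be consistent with a "Succeeded" sync that copies zero rows for the affected tables. A minimal sketch of the failure mode, assuming Python's `jsonschema` package (Airbyte's actual validator is the Java class named in the log):

```python
# Minimal reproduction of an "object expected" schema failure,
# assuming the `jsonschema` package (pip install jsonschema).
from jsonschema import Draft7Validator

schema = {"type": "object", "properties": {"id": {"type": "integer"}}}
validator = Draft7Validator(schema)

# None stands in for whatever non-object value the connector emitted.
for record in ({"id": 1}, None):
    errors = [e.message for e in validator.iter_errors(record)]
    print(record, "->", errors or "valid")
```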
• Xing Fang (10/04/2021, 5:44 PM)
Hi, I want to follow up on the retry checkpoint issue when transferring a large amount of data between source and destination. Is there a timeline for fixing it so we can try it out? Thank you very much!
• Tom Gordon (12/16/2021, 6:57 PM)
    Is this your first time deploying Airbyte: No  OS Version / Instance: macOS 11.6.1 Memory / Disk: 16GB / 500GB SSD  Deployment: Docker Airbyte Version: 0.34.0-alpha Source name/version: MySQL 0.4.13 (0.5.0 and 0.5.1 fail the connection test) Destination name/version: S3 0.1.16 Step: On sync  Description: Extracting MySQL tables to S3 as parquet files with snappy compression is working for tables as large as 6GB so far, but trying on a 14GB table fails without reading any rows. It seems to time out after 5 minutes while waiting for query results
```
source - 2021-12-16 18:38:36 INFO () DefaultAirbyteStreamFactory(lambda$create$0):61 - 2021-12-16 18:38:36 INFO i.a.i.s.m.MySqlSource(getIncrementalIterators):181 - {} - using CDC: false
source - 2021-12-16 18:38:36 INFO () DefaultAirbyteStreamFactory(lambda$create$0):61 - 2021-12-16 18:38:36 INFO i.a.i.s.r.AbstractRelationalDbSource(queryTableFullRefresh):35 - {} - Queueing query for table: establishments
source - 2021-12-16 18:43:48 INFO () DefaultAirbyteStreamFactory(lambda$create$0):61 - 2021-12-16 18:43:48 INFO i.a.i.s.r.AbstractDbSource(lambda$read$2):123 - {} - Closing database connection pool.
source - 2021-12-16 18:43:48 INFO () DefaultAirbyteStreamFactory(lambda$create$0):61 - 2021-12-16 18:43:48 INFO i.a.i.s.r.AbstractDbSource(lambda$read$2):125 - {} - Closed database connection pool.
source - 2021-12-16 18:43:48 ERROR () LineGobbler(voidCall):82 - Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: com.mysql.cj.jdbc.exceptions.CommunicationsException: Communications link failure
source - 2021-12-16 18:43:48 ERROR () LineGobbler(voidCall):82 -
source - 2021-12-16 18:43:48 ERROR () LineGobbler(voidCall):82 - The last packet successfully received from the server was 310,727 milliseconds ago. The last packet sent successfully to the server was 310,727 milliseconds ago.
```
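The ~310-second gap in the log points at a network or idle timeout between Airbyte and the server rather than a row-count limit: the query on the 14GB table produces no output for several minutes before the connection drops. A sketch for inspecting the server-side timeouts that commonly kill long-running result streams, assuming the `pymysql` package; the host and credentials are placeholders:

```python
# Inspect MySQL's network timeouts; if the long scan phase of a big
# table exceeds these, the server drops the connection and the driver
# reports "Communications link failure". Assumes `pymysql`
# (pip install pymysql); connection details are placeholders.
import pymysql

conn = pymysql.connect(host="mysql.example.com", user="airbyte", password="...")
try:
    with conn.cursor() as cur:
        cur.execute(
            "SHOW VARIABLES WHERE Variable_name IN "
            "('net_read_timeout', 'net_write_timeout', 'wait_timeout')"
        )
        for name, value in cur.fetchall():
            print(f"{name} = {value}")
finally:
    conn.close()
```

If these are low, raising them (or checking for an intermediate proxy or load balancer with its own idle timeout) is a reasonable first experiment.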
• Emah Bright (02/01/2022, 3:42 AM)
Is this your first time deploying Airbyte: No OS Version: Ubuntu 20.04 Memory / Disk: 4 Gb / 80 Gb Deployment: Digital Ocean Droplet Airbyte Version: 0.30.25 Source name/version: Amazon Seller Partner/0.2.14 Destination name/version: BigQuery Description: Unable to set up the source; getting an error.
```
The connection tests failed.
Exception('Error while refreshing access token: 401 Client Error: Unauthorized for url: https://api.amazon.com/auth/o2/token')
```
I followed the Amazon Seller setup guide and double-checked the permissions, but I still get this error. Is there any way to get more details from the logs? It would be great to see more detail on the "Check failed" ERROR line. Logs below:
```
2022-02-01 03:32:35 INFO () TemporalAttemptExecution(get):94 - Executing worker wrapper. Airbyte version: 0.30.25-alpha
2022-02-01 03:32:35 INFO () LineGobbler(voidCall):82 - Checking if airbyte/source-amazon-seller-partner:0.2.14 exists...
2022-02-01 03:32:35 INFO () DockerProcessFactory(create):127 - Preparing command: docker run --rm --init -i -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -w /data/06df2588-f803-473f-8ba3-3a443e8fea50/0 --network host --log-driver none airbyte/source-amazon-seller-partner:0.2.14 check --config source_config.json
2022-02-01 03:32:36 ERROR () DefaultAirbyteStreamFactory(internalLog):96 - Check failed
2022-02-01 03:32:36 INFO () TemporalAttemptExecution(get):115 - Stopping cancellation check scheduling...
```
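One way to get more detail than the connector's "Check failed" line: replay the Login with Amazon token refresh outside Airbyte, since the 401 comes from that endpoint and its response body names the actual failure (for example invalid_grant versus invalid_client). A sketch assuming the `requests` package; all credential values are placeholders:

```python
# Reproduce the connector's LWA token refresh to see the full 401 body.
# Assumes `requests` (pip install requests); credentials are placeholders.
import requests

resp = requests.post(
    "https://api.amazon.com/auth/o2/token",
    data={
        "grant_type": "refresh_token",
        "refresh_token": "Atzr|IwEBI...",   # placeholder refresh token
        "client_id": "amzn1.application-oa2-client.abc123",
        "client_secret": "...",
    },
)
print(resp.status_code)
print(resp.text)  # the 'error' / 'error_description' fields explain the 401
```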
• Roy Peter (02/08/2022, 4:05 AM)
Hi Team, a few records present in the source are missing in the destination. Is this your first time deploying Airbyte: No OS Version / Instance: EC2, Ubuntu 20.04 Memory / Disk: 120Gb / 100GB SSD Deployment: EC2 Airbyte Version: 0.35.12-alpha Source name/version: Salesforce 0.1.21 Destination name/version: Redshift 0.3.23 (S3 is used for staging data) Description: Records present in the source are missing in the destination.
• Serhii Сheredko (02/08/2022, 11:32 AM)
Hi! Is this your first time deploying Airbyte: Yes OS Version / Instance: Local Airbyte in a Docker container, my machine is on Ubuntu 20.04 Deployment: Docker Airbyte Version: 0.35.23-alpha Source name/version: MongoDB v4.2.13 Destination name/version: - Step: Setting up a new source Description: I'm trying to configure a new source, which is a MongoDB replica set on MongoDB Atlas Cloud. I chose the "MongoDB Instance type" option "Replica Set" and entered my replicas along with the replica set name. But I'm getting an "SSL peer shut down incorrectly" error. I thought it might be an issue with IP whitelisting, but my IP is whitelisted on the Cloud side and I can connect to it locally. If I change it, I get a different error ("Connect timed out"). Hence, it seems to be something to do with SSL, not whitelisting.
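Atlas only accepts TLS connections, so "SSL peer shut down incorrectly" usually means the handshake never completes (TLS disabled on the client side, or something intercepting the connection). A sketch for testing the same replica set outside Airbyte, assuming the `pymongo` package; hostnames and the replica set name are placeholders:

```python
# Connect to the Atlas replica set directly to isolate TLS problems.
# Assumes `pymongo` (pip install pymongo); all names are placeholders.
from pymongo import MongoClient

client = MongoClient(
    "mongodb://user:pass@node1.mongodb.net:27017,node2.mongodb.net:27017,"
    "node3.mongodb.net:27017/?replicaSet=atlas-abc-shard-0&authSource=admin",
    tls=True,                        # Atlas requires TLS
    serverSelectionTimeoutMS=10000,
)
print(client.admin.command("ping"))  # raises on handshake/auth failure
```

If this works with tls=True and fails without it, the likely fix is making sure the connector's TLS/SSL option is enabled for the Atlas source.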
• Elliot Trabac (02/13/2022, 8:44 AM)
Is this your first time deploying Airbyte: No OS Version / Instance: AWS EC2 t2.medium Memory / Disk: 4GB RAM / 30 GiB Deployment: Docker Airbyte Version: 0.35.28-alpha Source name/version: airbyte/source-mysql 0.54 Destination name/version: airbyte/destination-clickhouse 0.1.2 Description: I want to load my MySQL data into ClickHouse, and I need basic normalization before migration (not a JSON format). But I got a `normalization failed` error when syncing data. CC: @Arash Layeghi @Parham
• Rytis Zolubas (02/14/2022, 10:24 AM)
Is this your first time deploying Airbyte: Yes OS Version / Instance: MacOS Catalina Memory / Disk: 16Gb / 500 GB Deployment: Docker, locally Airbyte Version: 0.35.28-alpha Source name/version: MySQL / 5.7 Destination name/version: none yet Description: I am trying to set up a MySQL source, but I'm getting the following error message. The MySQL database is reachable from my computer via Talend or via MySQL Workbench.
```
Could not connect with provided configuration. Error: Cannot create PoolableConnectionFactory (Communications link failure The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.)
```
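A frequent cause when the database is reachable from Workbench but not from Airbyte: the hostname in the source config is resolved inside the Airbyte container, where localhost or 127.0.0.1 refers to the container itself, not the host machine. A quick reachability sketch to run inside the container (for example via docker exec); the hosts and port are illustrative, and host.docker.internal is a Docker Desktop convention:

```python
# TCP reachability check from inside the Airbyte container; only the
# standard library is used. Host names and the port are illustrative.
import socket

for host in ("localhost", "host.docker.internal", "192.168.1.10"):
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.settimeout(3)
    try:
        sock.connect((host, 3306))
        print(f"{host}:3306 reachable")
    except OSError as exc:
        print(f"{host}:3306 NOT reachable: {exc}")
    finally:
        sock.close()
```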
• Marc Heymann (02/14/2022, 11:23 AM)
Is this your first time deploying Airbyte: Yes OS Version / Instance: Linux/UNIX Memory / Disk: 30GiB Deployment: Docker Airbyte Version: 0.35.28-alpha Source name/version: not yet Destination name/version: none yet Description: I have installed Airbyte on an EC2 instance, but I am unable to connect to the UI via browser.
• Aditya Chatterjee (02/14/2022, 1:20 PM)
Hello, I have some issues when installing Airbyte. Is this your first time deploying Airbyte: Yes OS Version / Instance: Windows 10 Memory / Disk: 16Gb / 1Tb SSD Deployment: Docker Airbyte Version: 0.35.27-alpha (also with latest version) Step: Setting up the first connection to localhost:8000 Description: I just followed the getting started guide from GitHub (docker compose up works, I got my containers), but when I go to localhost:8000 I get the message "Cannot reach server. The server may still be starting up." My airbyte-temporal container is looping continuously, restarting again and again. Here is the log from the airbyte-temporal container:
• Antonio Grass (02/14/2022, 1:39 PM)
Is this your first time deploying Airbyte: no OS Version / Instance: Linux EC2 m5.4xlarge Deployment: Docker Airbyte Version: 0.35.6-alpha Destination: MySQL 0.1.15 Description: When the MySQL database is full, Airbyte retries infinitely, and due to the infinite loop of retries the service stops working. I needed to delete the destination to recover it. It would be great if, after some number of retries, it paused the connection or just retried with longer timeframes.
• Antonio Grass (02/14/2022, 2:25 PM)
Is this your first time deploying Airbyte: No OS Version / Instance: ubuntu 20.04 Memory / Disk: 32Gb / 120GB Deployment: Docker compose Version: 0.35.15-alpha source: MSSQL 2012 destination: MSSQL 2019 Need help with configuring Airbyte to move tables from one SQL Server to another faster. The SQL table being moved has 15M rows and takes up about 2GB of space. Tested using SSIS and the runtime was 1.5 minutes; tested using Airbyte after and it took roughly 1.5 hours. I looked at the scaling Airbyte page and saw it mostly had to do with memory and storage. I upgraded to 64Gb of memory and moved the machine to 16 cores. After another few tests I kept seeing pretty low CPU and memory usage, so I then updated `JOB_MAIN_CONTAINER_MEMORY_REQUEST` and `JOB_MAIN_CONTAINER_MEMORY_LIMIT` to say that I had double the memory in my system, in hopes that it would use more, but I never saw it use more than 10GB. The source and destination DB servers have 32GB memory each and plenty of storage. I also increased the number of workers from 5 to 15, set new CPU request limits near the maximum, and restarted the server after making these changes. From watching the log file as the job runs, it seems like the bottleneck is when it flushes the buffer: it reports that this only takes seconds, but in reality it appears to take 2-3 minutes between the flush and running through the next group of record reads. I have attached a log from my recent run that I stopped at 6 minutes, as all the adjustments I made did not seem to yield any changes. Really my issue is: how can I configure Airbyte to better move this type of table between SQL Servers? Since SSIS completed this in a minute and a half, I didn't expect Airbyte to match it, but taking over an hour was far from the expected result.
• Peter Leiwakabessy (02/14/2022, 3:09 PM)
Hello everyone! Is this your first time deploying Airbyte: No OS Version / Instance: Ubuntu 20.04 on AWS EC2 Deployment: Docker Airbyte Version: 0.35.27-alpha Source name: MySQL (0.5.4) Destination: Snowflake (0.4.8) I'm having issues with the `source-mysql` Airbyte connector, with a CDC configuration. Incremental sync fails with this type of error log:
```
2022-02-14 14:15:11 source > 2022-02-14 14:15:11 ERROR i.d.r.TableSchemaBuilder(lambda$createValueGenerator$5):269 - Failed to properly convert data value for 'prestashopprod.ps_product.available_date' of type DATE for row [1, 0, 0, 13, 1, 97, 0, 0, [], [54, 54, 48, 48, 52, 50, 55, 49, 51, 54, 55, 52], 0.000000, 0, 1, 499.000000, 0.000000, [], 0.000000, 0.00, [67, 82, 88, 48, 48], [], [], 0.000000, 0.000000, 0.000000, 1.900000, 2, 0, 0, 0, 0, 0, 404, 0, 1, null, new, 1, 0, both, 0, 0, 0, 0, 2017-05-30T17:06:36.000+0000, 2020-04-23T10:32:53.000+0000, 0, 3]:
2022-02-14 14:15:11 source > org.apache.kafka.connect.errors.DataException: Invalid Java object for schema type STRING: class java.lang.Integer for field: "available_date"
2022-02-14 14:15:11 source > 	at org.apache.kafka.connect.data.ConnectSchema.validateValue(ConnectSchema.java:245) ~[connect-api-2.6.1.jar:?]
2022-02-14 14:15:11 source > 	at org.apache.kafka.connect.data.Struct.put(Struct.java:216) ~[connect-api-2.6.1.jar:?]
2022-02-14 14:15:11 source > 	at io.debezium.relational.TableSchemaBuilder.lambda$createValueGenerator$5(TableSchemaBuilder.java:265) ~[debezium-core-1.4.2.Final.jar:1.4.2.Final]
```
Even though, when looking at the table, the row actually seems to have OK values... To sum up: https://github.com/airbytehq/airbyte/issues/6884, the issue I was encountering previously, seems to be fixed, but now I'm having the cousin of that one, https://github.com/airbytehq/airbyte/issues/9118... Any plans to upgrade Debezium at some point?
• Alexandre Chouraki (02/14/2022, 3:37 PM)
    And another one: Is this your first time deploying Airbyte: No OS Version / Instance: Ubuntu 20.04 on AWS EC2 Deployment: Docker Airbyte Version: 0.35.27-alpha Source name: Jira (0.2.18) Destination: Snowflake (0.4.8) Somehow, some streams seem to have no data.
```
2022-02-14 14:53:10 source > Syncing stream: issues
2022-02-14 14:53:10 source > Read 0 records from issues stream
2022-02-14 14:53:10 source > Finished syncing SourceJira
2022-02-14 14:53:10 source > SourceJira runtimes:
```
This is strange, because: • I know for a fact this data exists • The token and user associated with my source are basically superadmins • Other streams do return records • Going to https://XXX.atlassian.net/rest/api/3/search?jql= does yield the issues I want...
• Alexandre Chouraki (02/14/2022, 3:59 PM)
Hi, did anyone manage to spin up Airbyte on Kubernetes? I tried to follow the relevant docs, and as soon as I ran the kustomization `kubectl apply -k kube/overlays/stable` I got an `error: json: unknown field "envs"` error. I haven't really dived into the code, just read the docs to do a quick start. Should I change something in one of the manifests?
• Peter (02/14/2022, 5:13 PM)
Hi, just a question which I haven't been able to answer by myself: for the Postgres source connector, in the Incremental + Deduped History mode, the deduplication is performed at the normalization step, right? So if I choose not to have the basic normalization step, I would have to implement the deduplication logic on my side. If so, is there a recommended way to do this? For example, using ROW_NUMBER or DISTINCT ON?
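For readers with the same question: both suggestions in the message work over the raw table, keyed on its standard `_airbyte_data` and `_airbyte_emitted_at` columns. A sketch of each on Postgres; the stream name and the `id` primary key are illustrative:

```python
# Two DIY deduplication patterns over an Airbyte raw table in Postgres.
# `_airbyte_raw_my_stream` and the `id` key are illustrative names.

# Keep the most recently emitted record per primary key via ROW_NUMBER.
ROW_NUMBER_SQL = """
SELECT _airbyte_data
FROM (
    SELECT _airbyte_data,
           ROW_NUMBER() OVER (
               PARTITION BY _airbyte_data ->> 'id'
               ORDER BY _airbyte_emitted_at DESC
           ) AS rn
    FROM _airbyte_raw_my_stream
) ranked
WHERE rn = 1;
"""

# Equivalent Postgres-specific form with DISTINCT ON.
DISTINCT_ON_SQL = """
SELECT DISTINCT ON (_airbyte_data ->> 'id') _airbyte_data
FROM _airbyte_raw_my_stream
ORDER BY _airbyte_data ->> 'id', _airbyte_emitted_at DESC;
"""
```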
• Houman Farokhzad (02/14/2022, 6:59 PM)
Hello, how do I merge tables when running incremental dedup? My stream creates two tables because of its nested structure. The main table has _airbyte_unique_key, but I don't see any sensible value to map this to in the second table:
```
_airbyte_hubspot_companies_hashid STRING 	NULLABLE
_airbyte_ab_id STRING 	NULLABLE
_airbyte_emitted_at TIMESTAMP 	NULLABLE
_airbyte_normalized_at TIMESTAMP 	NULLABLE
_airbyte_properties_hashid STRING 	NULLABLE
```
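A guess rather than a confirmed answer: with normalization's nested streams, the child table's `_airbyte_<parent>_hashid` column is usually the join key back to the hashid generated on the parent row, so the mapping does not go through `_airbyte_unique_key` at all. A hypothetical join, assuming the parent table carries a matching `_airbyte_hubspot_companies_hashid` column; table names are illustrative:

```python
# Hypothetical hashid-to-hashid join between the parent stream table
# and the nested table described above; table names are illustrative.
JOIN_SQL = """
SELECT parent.*, nested.*
FROM hubspot_companies AS parent
JOIN hubspot_companies_properties AS nested
  ON nested._airbyte_hubspot_companies_hashid
   = parent._airbyte_hubspot_companies_hashid;
"""
```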
• Titas Skrebė (02/14/2022, 7:06 PM)
Hi. I have a postgres->bigquery connection, and both of its pods are in an Error state, as you can see below.
```
k get pods | grep 126044
airbyte-bigquery-sync-126044-0-bghap       0/5     Error       0          20h
source-postgres-sync-126044-0-zgnvm        0/4     Error       0          20h
```
But the UI shows it as running and doesn't restart / retry it even after 20h. Any idea what's going on? I am on 0.35.10-alpha.
• Cayden Brasher (02/15/2022, 8:08 PM)
Is this your first time deploying Airbyte: Yes OS Version / Instance: Mac OS 12.2 Beta Memory / Disk: 32Gb / 1Tb SSD Deployment: Local Airbyte Version: most recent Source name/version: CDK Python API Destination name/version: Postgres 0.3.14 Step: On connection Description: I am attempting to connect Airbyte to my company's QuickBooks to retrieve a CSV every 30 minutes or so. I have been able to find the Realm ID, Client ID, and Client Secret. I do not know what to put for User Agent, Start Date, and Refresh Token.
• gunu (02/16/2022, 2:54 AM)
Anyone experiencing Google Ads connector issues?
```
2022-02-16 02:04:10 WARN i.a.w.p.a.DefaultAirbyteStreamFactory(internalLog):96 - Request made: ClientCustomerId: ####, Host: googleads.googleapis.com, Method: /google.ads.googleads.v9.services.GoogleAdsService/Search, RequestId: XXX, IsFault: True, FaultMessage: User doesn't have permission to access customer. Note: If you're accessing a client customer, the manager's customer id must be set in the 'login-customer-id' header. See https://developers.google.com/google-ads/api/docs/concepts/call-structure#cid
```
It appears the `ClientCustomerId` is not properly being set to the manager's customer ID.
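As the fault message says, querying a client account through a manager (MCC) account requires the manager's ID in the login-customer-id header, separate from the client's own customer ID. A hypothetical sketch of how the two IDs relate in a source config; the field names are assumptions about the connector spec, not confirmed from it:

```python
# Hypothetical source-google-ads config fragment illustrating the two
# distinct IDs involved; field names are assumptions, and credentials
# (developer token, OAuth client/refresh token) are omitted.
config = {
    "customer_id": "1234567890",        # the client account being queried
    "login_customer_id": "9876543210",  # the manager (MCC) account's ID
}
```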
• gunu (02/16/2022, 4:14 AM)
Is this your first time deploying Airbyte: Yes OS Version / Instance: AWS Airbyte Version: 0.35.27 Source name/version: Notion Destination name/version: S3 bucket Step: Post sync Description: After the data sync, I see blocks only from main pages, not sub-pages (structure attached). Do I have to configure anything explicitly to get those blocks?
• Divya (Proximity) (02/16/2022, 9:17 AM)
Is this your first time deploying Airbyte: No OS Version / Instance: Ubuntu 20.04.3 LTS Memory / Disk: 16Gb Deployment: Docker Airbyte Version: 0.35.10-alpha Source name/version: Custom connector Destination name/version: CSV Description: After upgrading Airbyte from 0.19.0-alpha to 0.35.10-alpha, I can't add my custom connector to the platform. All tests for that connector pass, and I can add it to the Airbyte platform, but when I add it to the yml file for the platform I get an error like `Internal Server Error: Get Spec job failed.` I checked the airbyte-scheduler logs and they say something like this:
2022-02-16 08:52:51 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword example - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-02-16 08:52:51 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword existingJavaType - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
Does anyone know how I can resolve this issue, or have any advice?
• user (02/16/2022, 12:56 PM)
source: Facebook Marketing destination: BigQuery • Only one table is chosen for syncing • location is US-west2 • logs show creation of the table -> [CREATE TABLE (0.0 rows, 0.0 Bytes processed) in 2.56s] • Syncing stops at ◦ INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):234 - Stopping temporal heartbeating...
• SyedHamza (02/16/2022, 3:32 PM)
Hello Team, I am the MLOps and Backend Lead at Cerebra.ai. We are a well-funded AI startup in the Bay Area helping the fashion retail industry maximize profits and reduce risks using AI. I came across Airbyte a couple of days ago and find it very useful for our use case, as we integrate a different data source every week. I am facing a problem and wondering if anyone here can help me. Source: Postgres (DB1) Destination: Postgres (DB2) Problem: 1. The source does not show any tables to sync to the destination. 2. When I run the connection, nothing gets transferred, but it shows success. Ask: How can I solve or debug this? Thanks.
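Both symptoms (no tables offered, and a "successful" sync that moves nothing) fit a permissions or schema-visibility gap: discovery only lists tables the connector's role can actually see in the configured schema. A sketch for checking that directly, assuming the `psycopg2` package; connection details are placeholders:

```python
# List the tables visible to the role Airbyte connects as; if this is
# empty for the schema configured in the source (often 'public'),
# discovery will offer nothing to sync. Assumes `psycopg2`
# (pip install psycopg2-binary); connection details are placeholders.
import psycopg2

conn = psycopg2.connect(host="db1.example.com", dbname="db1",
                        user="airbyte", password="...")
with conn.cursor() as cur:
    cur.execute("""
        SELECT table_schema, table_name
        FROM information_schema.tables
        WHERE table_schema NOT IN ('pg_catalog', 'information_schema')
        ORDER BY 1, 2
    """)
    for schema, table in cur.fetchall():
        print(f"{schema}.{table}")
conn.close()
```

If the list is empty, granting USAGE on the schema and SELECT on its tables to that role, then re-running discovery, is the usual fix.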
• Preet Singh (02/16/2022, 7:27 PM)
Is this your first time deploying Airbyte: No OS Version / Instance: Mac OS 11.5.2 Memory / Disk: 32 GB Deployment: Docker Airbyte Version: 0.35.10-alpha Source name/version: Twilio 0.1.2 Destination name/version: S3 0.2.7 Description: Tables (`messages`, `calls`, `message_media`) using the "Incremental | Append" sync mode cannot have their date columns (`date_sent`, `date_created`, `date_updated`) queried. When we try to query `date_created` directly, we receive a `Spectrum Scan Error`. When we try `date_created.member0`, we receive `relation "date_created" does not exist`. Moving to a "Full Refresh | Overwrite" sync mode (which does work, as evidenced by some of our other tables) isn't possible because we have 4+ million records in the `messages` table.
• Nicolas Smith (02/16/2022, 9:01 PM)
    Is this your first time deploying Airbyte: No OS Version / Instance: Mac OS 11.5.2 Memory / Disk: 32 GB Deployment: Docker Airbyte Version: 0.35.10-alpha Source name/version: Twilio 0.1.2 Destination name/version: Redshift 0.3.25 Description: Messages table triggers a 400 Client Error
• Nicolas Smith (02/16/2022, 9:18 PM)
Ask-for-help submission from @Krzysztof Karski. Is this your first time deploying Airbyte: Yes OS Version / Instance: AWS Memory / Disk: 5GB Deployment: Docker Airbyte Version: 0.35.9-alpha Source name/version: Salesforce Destination name/version: Redshift Step: Normalization Description: We are trying to sync all tables from Salesforce into Redshift. We can see all the tmp tables created and rows being downloaded, but when the normalization phase runs, we get errors for what looks like all models. We first see ERRORs on the incremental models: 1 of 561 ERROR creating incremental model airbyte.sforce_acceptedeventrelation............................... [ERROR in 40.04s] Then errors on relations: relation "airbyte._airbyte_raw_sforce_acceptedeventrelation" does not exist 2022-02-16 21:09:35 normalization > compiled SQL at ../build/run/airbyte_utils/models/generated/airbyte_incremental/airbyte/sforce_acceptedeventrelation.sql Finally: Done. PASS=0 WARN=0 ERROR=545 SKIP=16 TOTAL=561 2022-02-16 21:09:41 normalization > 2022-02-16 21:09:36.056207 (MainThread): Flushing usage events
• Maish Maseeh (02/17/2022, 9:49 AM)
Hello there 🙂 First I want to say that using Airbyte has been a pretty great experience so far. The UI and available resources are awesome! Not sure if this is the right place, but I am struggling with the Hubspot source output: I cannot find a way to join companies and line items to the deals data. Looking at the source code, I noticed that `associations` are specified for certain data types. Are these associations simply missing? But it feels more like I am doing something wrong, since I imagine that anyone using the source would need these links. So I am probably simply not seeing something in the existing data. Thanks, Jorin
• Jorin (02/17/2022, 11:59 AM)
Hello, does the migration to the mandatory intermediate version `v0.32.0-alpha-patch-1` take long (upgrading from `v0.29.4`)? Doing it in GKE and it's been ~30 mins. CPU/memory are being utilized, but I'm not seeing anything in the logs, so I'm wondering. The data volume is not that high. Counts from tables (configs: 230, jobs: ~8000, attempts: ~100000).
• Pras (02/17/2022, 2:55 PM)
Bug/Issue found during configuration migration: if a user exports the configuration and imports it on another server, the cursors for all connections are lost (on and above v0.34.3-alpha). Description: I am trying to move my configuration from an EC2 instance to EKS, hence performing a configuration migration. If I export the configuration and import it on another server (with Airbyte running the same version on both, at or above v0.34.3-alpha), then after the import all the connections are restored, but all the existing logs and executions are lost, along with the cursors. The logs are not important, but the cursors are needed; otherwise, all the old data will be synced again. I tried the same process with v0.34.2-alpha and Airbyte was able to restore all the configuration perfectly. Is this because of this issue? I experimented with the following versions:
```
0.32.0-alpha-patch-1	working fine
0.32.11-alpha	        working fine
0.34.2-alpha	        working fine
0.34.3-alpha	        cursor lost
0.34.4-alpha	        cursor lost
0.35.1-alpha	        cursor lost
0.35.18-alpha	        cursor lost
```