# ask-community-for-troubleshooting
  • Jake Vernon

    02/03/2023, 5:36 PM
    Hello - I have Airbyte running locally and for some reason I don't have the WordPress source installed. How do I install it? I can't find the Docker repo for it either.
  • Rafael Paraíso Rossim

    02/03/2023, 6:30 PM
    I'm using Airbyte OSS with the Facebook Pages source and the BigQuery destination, and I'm getting this error:
    Failure Origin: normalization, Message: Something went wrong during normalization
    The full log is attached. Could you help me solve this error, please?
    4d1c80c9_7338_4796_8826_e17981c1f9ef_logs_48_txt.txt
  • German Bettarel

    02/03/2023, 6:38 PM
    Hi everyone, we are trying to develop a source connector for the crowd.dev API. We have no problems running it on an older version of Airbyte (0.40.14), but when we try it on the latest version (0.40.32) we get the logs attached in the file. We think some changes in this version are causing problems. If anyone has any idea what it could be or how to fix it, we would appreciate it. From the logs, we think the problem is in how the schemas are being detected.
    72b1866a_24aa_4cfc_ae2f_7d3216ac79e6_logs_1_txt.txt
  • Rocky Appiah

    02/03/2023, 7:16 PM
    Getting this error when trying to do `docker compose up -d`:
    Error response from daemon: invalid mount config for type "bind": bind source path does not exist: /home/ec2-user/airbyte/flags.yml
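    A minimal workaround sketch, assuming the compose file bind-mounts a `flags.yml` from the checkout directory shown in the error: the mount only resolves if the file exists on the host, and creating it (even empty) is a commonly used fix, though whether your Airbyte version expects content in the file is not guaranteed.
    ```python
    from pathlib import Path

    # Path taken from the error message above; adjust to your checkout location.
    airbyte_dir = Path("/home/ec2-user/airbyte")
    flags = airbyte_dir / "flags.yml"

    if not flags.exists():
        flags.touch()  # an empty placeholder lets the bind mount resolve
        print(f"created placeholder {flags}; re-run `docker compose up -d`")
    ```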
  • Rafael Ferreira

    02/03/2023, 7:42 PM
    Hey guys!
  • Rafael Ferreira

    02/03/2023, 7:43 PM
    Are there any alpha or beta connectors for Pendo?
  • Luis Gomez

    02/03/2023, 7:44 PM
    Hi Airbyte team. We recently migrated to v0.40.28 and when the sync jobs of some of our connections run, they don't save a state (even though they complete successfully). That's not good on our end and because of that, we're running full refresh every round, instead of only getting incremental updates. Any idea what might be wrong?
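    One way to narrow this down is to check whether any state rows are being written at all. A sketch, assuming the default docker-compose deployment (container `airbyte-db`, database `airbyte`, user `docker`); adjust names and credentials to your setup:
    ```python
    import subprocess

    # Show the most recently updated state rows in Airbyte's config database.
    query = "SELECT connection_id, updated_at FROM state ORDER BY updated_at DESC LIMIT 20;"
    subprocess.run(
        ["docker", "exec", "airbyte-db",
         "psql", "-U", "docker", "-d", "airbyte", "-c", query],
        check=True,
    )
    ```
    If the affected connections never show up here, the state is being dropped before it reaches the database rather than being ignored at read time.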
  • Steve Wilkinghoff

    02/03/2023, 7:54 PM
    Getting an error when trying to run `pip install -r requirements.txt` in the connector's folder using the low-code SDK. The error is: "ERROR: Could not find a version that satisfies the requirement airbyte-cdk~=0.2 (from connector-acceptance-test) (from versions: none) ERROR: No matching distribution found for airbyte-cdk~=0.2"
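    "from versions: none" usually means pip, for that particular interpreter, cannot see any published release at all, most often because the virtualenv's Python version falls outside what the package supports or a restricted index is in use. A quick sanity check (the version floor in the comment is an assumption, not taken from the CDK's metadata):
    ```python
    import subprocess
    import sys

    print("running under", sys.version)
    if sys.version_info < (3, 9):
        # Assumption: the CDK of this era targets recent Python 3.x;
        # recreate the venv with a newer interpreter if this triggers.
        print("this interpreter may be too old for airbyte-cdk~=0.2")

    # Ask pip (for this same interpreter) which versions it can actually resolve.
    # `pip index` is experimental and needs a reasonably recent pip.
    subprocess.run(
        [sys.executable, "-m", "pip", "index", "versions", "airbyte-cdk"],
        check=False,
    )
    ```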
  • Sanjeev

    02/02/2023, 5:15 PM
    Hi, I am trying to upgrade Airbyte from 0.35.12 to the latest version. Can I export all the configurations and import them? There is no UI to do it in the latest version as far as I can see (I tried following the method in the Airbyte docs, but it was causing issues during the upgrade).
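    As far as I know, the export/import screen was removed in later releases and the internal database is migrated automatically on upgrade, so the main precaution is a backup of the config database before bumping the version. A sketch, assuming the default docker-compose container and credentials (`airbyte-db`, database `airbyte`, user `docker`):
    ```python
    import subprocess
    from datetime import date

    dump_file = f"airbyte_config_backup_{date.today()}.sql"

    # Dump the config database inside the container, then copy it to the host.
    subprocess.run(
        ["docker", "exec", "airbyte-db",
         "pg_dump", "-U", "docker", "-d", "airbyte", "-f", f"/tmp/{dump_file}"],
        check=True,
    )
    subprocess.run(["docker", "cp", f"airbyte-db:/tmp/{dump_file}", dump_file], check=True)
    print("wrote", dump_file)
    ```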
  • Rocky Appiah

    02/03/2023, 9:58 PM
    I have a large number of views, ~20k. I suspect that when doing a discover_schema I'm getting a timeout, based on the error below. Is there a way to ignore views during schema discovery for a Postgres DB?
    airbyte-server                      | Feb 03, 2023 9:57:20 PM org.glassfish.jersey.server.ServerRuntime$Responder writeResponse
    airbyte-server                      | SEVERE: An I/O error has occurred while writing a response message entity to the container output stream.
    airbyte-server                      | org.glassfish.jersey.server.internal.process.MappableException: org.eclipse.jetty.io.EofException
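    A quick way to see how much work discovery is being asked to do is to count tables versus views per schema on the source. A sketch using psycopg2 (connection details are placeholders):
    ```python
    import psycopg2  # pip install psycopg2-binary

    # Placeholder connection details; point these at the source database.
    conn = psycopg2.connect(host="source-host", dbname="mydb",
                            user="airbyte", password="secret")
    with conn, conn.cursor() as cur:
        cur.execute("""
            SELECT table_schema, table_type, count(*)
            FROM information_schema.tables
            GROUP BY table_schema, table_type
            ORDER BY count(*) DESC;
        """)
        for schema, table_type, n in cur.fetchall():
            print(f"{schema:<30} {table_type:<12} {n}")
    ```
    If the views are not needed, one option is to limit what the Airbyte database user can see (separate schemas in the connector config, or revoking SELECT on the views) so they never enter discovery.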
  • Chris Nogradi

    02/03/2023, 10:40 PM
    I am seeing normalization into Postgres change column names from an uppercase first letter to all lowercase. It seems this is a Postgres thing that requires columns to be quoted to keep their case. I can't figure out how to make Airbyte preserve the case during normalization. I tried setting the advanced options for the CSV S3 ingestor to set the column name with escaped quotes: "\"Column\"", but that did not work. Any other ideas?
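    The lowercasing is standard Postgres behaviour: unquoted identifiers fold to lower case, and only double-quoted identifiers keep their case (and must then be quoted on every reference). A small sketch that illustrates the difference (placeholder connection details):
    ```python
    import psycopg2  # pip install psycopg2-binary

    conn = psycopg2.connect(host="localhost", dbname="scratch",
                            user="postgres", password="secret")
    with conn, conn.cursor() as cur:
        cur.execute('CREATE TEMP TABLE t_unquoted (ColumnA int);')   # folded to columna
        cur.execute('CREATE TEMP TABLE t_quoted ("ColumnA" int);')   # case preserved
        cur.execute("""
            SELECT table_name, column_name
            FROM information_schema.columns
            WHERE table_name IN ('t_unquoted', 't_quoted');
        """)
        # Expect columna for the unquoted table and ColumnA for the quoted one.
        print(cur.fetchall())
    ```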
  • Jordan Fox

    02/03/2023, 11:06 PM
    Is there an API endpoint to just flush logs older than X days? If not, that would be nice.
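    If no such endpoint exists, a stopgap is a scheduled job that prunes old job log files directly. A sketch; the log directory is a placeholder and depends on where your deployment keeps workspace logs:
    ```python
    import time
    from pathlib import Path

    MAX_AGE_DAYS = 30
    LOG_ROOT = Path("/tmp/airbyte_local_logs")  # placeholder path

    cutoff = time.time() - MAX_AGE_DAYS * 86400
    for path in LOG_ROOT.rglob("*.log"):
        if path.is_file() and path.stat().st_mtime < cutoff:
            print("removing", path)
            path.unlink()
    ```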
  • Tien Nguyen

    02/04/2023, 12:32 AM
    Hi all, I have deployed a Kubernetes Airbyte instance and tried to sync a large amount of data from MSSQL. However, it fails during the replication process and I don't know why. The first time, I guessed it was a sync timeout problem, so I changed the config to 7 days. It then gave me a different error. Can someone please help? Thanks very much in advance.
    WARN i.t.i.a.ActivityTaskExecutors$BaseActivityTaskExecutor(execute):114 - Activity failure. ActivityId=faf75225-c60a-35d8-ab74-012dafc6f873, activityType=RunWithJobOutput, attempt=4
    java.util.concurrent.ExecutionException: io.airbyte.workers.exception.WorkerException: Unexpected error while getting checking connection.
    	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?]
    	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?]
    	at io.airbyte.workers.temporal.TemporalAttemptExecution.get(TemporalAttemptExecution.java:161) ~[io.airbyte-airbyte-workers-0.40.28.jar:?]
    	at io.airbyte.workers.temporal.check.connection.CheckConnectionActivityImpl.runWithJobOutput(CheckConnectionActivityImpl.java:117) ~[io.airbyte-airbyte-workers-0.40.28.jar:?]
    	at jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:104) ~[?:?]
    	at java.lang.reflect.Method.invoke(Method.java:578) ~[?:?]
    	at io.temporal.internal.activity.RootActivityInboundCallsInterceptor$POJOActivityInboundCallsInterceptor.executeActivity(RootActivityInboundCallsInterceptor.java:64) ~[temporal-sdk-1.17.0.jar:?]
    	at io.temporal.internal.activity.RootActivityInboundCallsInterceptor.execute(RootActivityInboundCallsInterceptor.java:43) ~[temporal-sdk-1.17.0.jar:?]
    	at io.temporal.internal.activity.ActivityTaskExecutors$BaseActivityTaskExecutor.execute(ActivityTaskExecutors.java:95) ~[temporal-sdk-1.17.0.jar:?]
    	at io.temporal.internal.activity.ActivityTaskHandlerImpl.handle(ActivityTaskHandlerImpl.java:92) ~[temporal-sdk-1.17.0.jar:?]
    	at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handleActivity(ActivityWorker.java:241) ~[temporal-sdk-1.17.0.jar:?]
    	at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:206) ~[temporal-sdk-1.17.0.jar:?]
    	at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:179) ~[temporal-sdk-1.17.0.jar:?]
    	at io.temporal.internal.worker.PollTaskExecutor.lambda$process$0(PollTaskExecutor.java:93) ~[temporal-sdk-1.17.0.jar:?]
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
    	at java.lang.Thread.run(Thread.java:1589) ~[?:?]
    Caused by: io.airbyte.workers.exception.WorkerException: Unexpected error while getting checking connection.
    	at io.airbyte.workers.general.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:122) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?]
    	at io.airbyte.workers.general.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:40) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?]
    	at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$5(TemporalAttemptExecution.java:195) ~[io.airbyte-airbyte-workers-0.40.28.jar:?]
    
    Caused by: io.airbyte.workers.exception.WorkerException: An error has occurred.
    	at io.airbyte.workers.process.KubeProcessFactory.create(KubeProcessFactory.java:140) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?]
    	at io.airbyte.workers.process.AirbyteIntegrationLauncher.check(AirbyteIntegrationLauncher.java:95) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?]
    	at io.airbyte.workers.general.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:68) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?]
    	at io.airbyte.workers.general.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:40) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?]
    	at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$5(TemporalAttemptExecution.java:195) ~[io.airbyte-airbyte-workers-0.40.28.jar:?]
    
    Caused by: io.fabric8.kubernetes.client.KubernetesClientException: An error has occurred.
    	at io.fabric8.kubernetes.client.KubernetesClientException.launderThrowable(KubernetesClientException.java:103) ~[kubernetes-client-5.12.2.jar:?]
    	at io.fabric8.kubernetes.client.KubernetesClientException.launderThrowable(KubernetesClientException.java:97) ~[kubernetes-client-5.12.2.jar:?]
    	at io.fabric8.kubernetes.client.dsl.base.CreateOnlyResourceOperation.create(CreateOnlyResourceOperation.java:63) ~[kubernetes-client-5.12.2.jar:?]
    	at io.fabric8.kubernetes.client.utils.CreateOrReplaceHelper.createOrReplace(CreateOrReplaceHelper.java:48) ~[kubernetes-client-5.12.2.jar:?]
    	at io.fabric8.kubernetes.client.dsl.base.BaseOperation.createOrReplace(BaseOperation.java:318) ~[kubernetes-client-5.12.2.jar:?]
    	at io.fabric8.kubernetes.client.dsl.base.BaseOperation.createOrReplace(BaseOperation.java:83) ~[kubernetes-client-5.12.2.jar:?]
    	at io.fabric8.kubernetes.client.dsl.base.BaseOperation.createOrReplace(BaseOperation.java:308) ~[kubernetes-client-5.12.2.jar:?]
    	at io.fabric8.kubernetes.client.dsl.base.BaseOperation.createOrReplace(BaseOperation.java:83) ~[kubernetes-client-5.12.2.jar:?]
    	at io.airbyte.workers.process.KubePodProcess.<init>(KubePodProcess.java:536) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?]
    	at io.airbyte.workers.process.KubeProcessFactory.create(KubeProcessFactory.java:136) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?]
    	at io.airbyte.workers.process.AirbyteIntegrationLauncher.check(AirbyteIntegrationLauncher.java:95) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?]
    	at io.airbyte.workers.general.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:68) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?]
    	at io.airbyte.workers.general.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:40) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?]
    	at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$5(TemporalAttemptExecution.java:195) ~[io.airbyte-airbyte-workers-0.40.28.jar:?]
    
    Caused by: java.net.SocketTimeoutException: Connect timed out
    	at sun.nio.ch.NioSocketImpl.timedFinishConnect(NioSocketImpl.java:539) ~[?:?]
    	at sun.nio.ch.NioSocketImpl.connect(NioSocketImpl.java:585) ~[?:?]
    	at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:327) ~[?:?]
    	at java.net.Socket.connect(Socket.java:666) ~[?:?]
    	at okhttp3.internal.platform.Platform.connectSocket(Platform.kt:128) ~[okhttp-4.10.0.jar:?]
    	at okhttp3.internal.connection.RealConnection.connectSocket(RealConnection.kt:295) ~[okhttp-4.10.0.jar:?]
    	at okhttp3.internal.connection.RealConnection.connect(RealConnection.kt:207) ~[okhttp-4.10.0.jar:?]
    	at okhttp3.internal.connection.ExchangeFinder.findConnection(ExchangeFinder.kt:226) ~[okhttp-4.10.0.jar:?]
    	at okhttp3.internal.connection.ExchangeFinder.findHealthyConnection(ExchangeFinder.kt:106) ~[okhttp-4.10.0.jar:?]
    	at okhttp3.internal.connection.ExchangeFinder.find(ExchangeFinder.kt:74) ~[okhttp-4.10.0.jar:?]
    	at okhttp3.internal.connection.RealCall.initExchange$okhttp(RealCall.kt:255) ~[okhttp-4.10.0.jar:?]
    	at okhttp3.internal.connection.ConnectInterceptor.intercept(ConnectInterceptor.kt:32) ~[okhttp-4.10.0.jar:?]
    	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109) ~[okhttp-4.10.0.jar:?]
    	at okhttp3.internal.cache.CacheInterceptor.intercept(CacheInterceptor.kt:95) ~[okhttp-4.10.0.jar:?]
    	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109) ~[okhttp-4.10.0.jar:?]
    	at okhttp3.internal.http.BridgeInterceptor.intercept(BridgeInterceptor.kt:83) ~[okhttp-4.10.0.jar:?]
    	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109) ~[okhttp-4.10.0.jar:?]
    	at okhttp3.internal.http.RetryAndFollowUpInterceptor.intercept(RetryAndFollowUpInterceptor.kt:76) ~[okhttp-4.10.0.jar:?]
    	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109) ~[okhttp-4.10.0.jar:?]
    	at io.fabric8.kubernetes.client.okhttp.OkHttpClientBuilderImpl$InteceptorAdapter.intercept(OkHttpClientBuilderImpl.java:62) ~[kubernetes-client-5.12.2.jar:?]
    	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109) ~[okhttp-4.10.0.jar:?]
    	at io.fabric8.kubernetes.client.okhttp.OkHttpClientBuilderImpl$InteceptorAdapter.intercept(OkHttpClientBuilderImpl.java:62) ~[kubernetes-client-5.12.2.jar:?]
    	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109) ~[okhttp-4.10.0.jar:?]
    	at io.fabric8.kubernetes.client.okhttp.OkHttpClientBuilderImpl$InteceptorAdapter.intercept(OkHttpClientBuilderImpl.java:62) ~[kubernetes-client-5.12.2.jar:?]
    	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109) ~[okhttp-4.10.0.jar:?]
    	at io.fabric8.kubernetes.client.okhttp.OkHttpClientBuilderImpl$InteceptorAdapter.intercept(OkHttpClientBuilderImpl.java:62) ~[kubernetes-client-5.12.2.jar:?]
    	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109) ~[okhttp-4.10.0.jar:?]
    	at okhttp3.internal.connection.RealCall.getResponseWithInterceptorChain$okhttp(RealCall.kt:201) ~[okhttp-4.10.0.jar:?]
    	at okhttp3.internal.connection.RealCall.execute(RealCall.kt:154) ~[okhttp-4.10.0.jar:?]
    	at io.fabric8.kubernetes.client.okhttp.OkHttpClientImpl.send(OkHttpClientImpl.java:138) ~[kubernetes-client-5.12.2.jar:?]
    	at io.fabric8.kubernetes.client.dsl.base.OperationSupport.retryWithExponentialBackoff(OperationSupport.java:574) ~[kubernetes-client-5.12.2.jar:?]
    	at io.fabric8.kubernetes.client.dsl.base.OperationSupport.handleResponse(OperationSupport.java:553) ~[kubernetes-client-5.12.2.jar:?]
    	at io.fabric8.kubernetes.client.dsl.base.OperationSupport.handleResponse(OperationSupport.java:518) ~[kubernetes-client-5.12.2.jar:?]
    	at io.fabric8.kubernetes.client.dsl.base.OperationSupport.handleCreate(OperationSupport.java:305) ~[kubernetes-client-5.12.2.jar:?]
    	at io.fabric8.kubernetes.client.dsl.base.BaseOperation.handleCreate(BaseOperation.java:644) ~[kubernetes-client-5.12.2.jar:?]
    	at io.fabric8.kubernetes.client.dsl.base.BaseOperation.handleCreate(BaseOperation.java:83) ~[kubernetes-client-5.12.2.jar:?]
    	at io.fabric8.kubernetes.client.dsl.base.CreateOnlyResourceOperation.create(CreateOnlyResourceOperation.java:61) ~[kubernetes-client-5.12.2.jar:?]
    	at io.fabric8.kubernetes.client.utils.CreateOrReplaceHelper.createOrReplace(CreateOrReplaceHelper.java:48) ~[kubernetes-client-5.12.2.jar:?]
    	at io.fabric8.kubernetes.client.dsl.base.BaseOperation.createOrReplace(BaseOperation.java:318) ~[kubernetes-client-5.12.2.jar:?]
    	at io.fabric8.kubernetes.client.dsl.base.BaseOperation.createOrReplace(BaseOperation.java:83) ~[kubernetes-client-5.12.2.jar:?]
    	at io.fabric8.kubernetes.client.dsl.base.BaseOperation.createOrReplace(BaseOperation.java:308) ~[kubernetes-client-5.12.2.jar:?]
    	at io.fabric8.kubernetes.client.dsl.base.BaseOperation.createOrReplace(BaseOperation.java:83) ~[kubernetes-client-5.12.2.jar:?]
    	at io.airbyte.workers.process.KubePodProcess.<init>(KubePodProcess.java:536) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?]
    	at io.airbyte.workers.process.KubeProcessFactory.create(KubeProcessFactory.java:136) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?]
    	at io.airbyte.workers.process.AirbyteIntegrationLauncher.check(AirbyteIntegrationLauncher.java:95) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?]
    	at io.airbyte.workers.general.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:68) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?]
    	at io.airbyte.workers.general.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:40) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?]
    	at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$5(TemporalAttemptExecution.java:195) ~[io.airbyte-airbyte-workers-0.40.28.jar:?]
    	... 1 more
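    The root cause at the bottom of the trace is a `java.net.SocketTimeoutException: Connect timed out` while the worker calls the Kubernetes API to create the check/replication pod, so the failure happens before the connector itself runs. A sketch for verifying that a pod in the workers' namespace can reach the API server at all, using the official `kubernetes` Python client (the namespace is an assumption):
    ```python
    # Run from a pod in the same namespace as the Airbyte workers
    # (pip install kubernetes).
    from kubernetes import client, config

    config.load_incluster_config()           # use the pod's service account
    v1 = client.CoreV1Api()
    pods = v1.list_namespaced_pod(namespace="default", limit=5)
    for pod in pods.items:
        print(pod.metadata.name, pod.status.phase)
    ```
    If this also times out, the issue is cluster networking or API-server load rather than anything in the Airbyte sync configuration.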
  • Danish Raza

    02/04/2023, 11:52 AM
    Hey Airbyte team, how can I do API development more efficiently? As a test, I added a log statement in one of the API methods. In order to see the log, I have to stop Docker, rebuild with `SUB_BUILD=PLATFORM ./gradlew build`, and then bring Docker up again. This is such a productivity killer; I must be missing something. I already tried building only the server with `./gradlew :airbyte-server:build`, and also tried the `--continuous` flag so Gradle keeps building, but I still have to stop and restart Docker to see the effect. I understand why Docker needs to be restarted so it can pick up the new jar, but is there an easier/better way to do API development? Can someone please assist? I am trying to debug and create new API methods. Thank you very much for your help in advance.
  • Dhanji Mahto

    02/04/2023, 3:18 PM
    Hi Team
  • Dhanji Mahto

    02/04/2023, 3:20 PM
    I am using Airbyte for the first time to create a destination connector using Java. I ran the generator but am getting the error below. Please guide me.
  • Ajay Kulkarni

    02/04/2023, 5:14 PM
    Hi All
  • Ajay Kulkarni

    02/04/2023, 5:15 PM
    I am getting this error when replicating from MySQL to MySQL.
  • Ajay Kulkarni

    02/04/2023, 5:16 PM
    ERROR i.a.c.i.LineGobbler(voidCall):114 - SLF4J: Found binding in [jarfile/airbyte/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
  • Ajay Kulkarni

    02/04/2023, 5:16 PM
    java.lang.RuntimeException: io.airbyte.workers.exception.WorkerException: Running the launcher replication-orchestrator failed
  • Ajay Kulkarni

    02/04/2023, 6:20 PM
    software.amazon.awssdk.services.s3.model.NoSuchBucketException: The specified bucket does not exist (Service: S3, Status Code: 404,
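    `NoSuchBucketException` here typically points at the S3 bucket configured for the orchestrator's log/state storage not existing in that account and region, rather than at the MySQL connectors themselves. A sketch to verify the bucket with boto3 (the bucket name is a placeholder for whatever is in your configuration):
    ```python
    import boto3
    from botocore.exceptions import ClientError

    bucket = "my-airbyte-logs-bucket"  # placeholder: use the bucket name from your config
    s3 = boto3.client("s3")

    try:
        s3.head_bucket(Bucket=bucket)
        print(f"bucket {bucket} exists and is reachable with these credentials")
    except ClientError as err:
        code = err.response["Error"]["Code"]
        print(f"head_bucket failed with {code}; create the bucket or fix the name/region")
    ```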
  • Chris

    02/04/2023, 8:30 PM
    Hello, I am running Airbyte Open Source on GCP (Google Compute Engine), where the source is Bing Ads and the destination is BigQuery. For the GCE VM instance I am using n1-standard-2 with boot disk debian-10-buster-v20221206 (size 10 GB). I also added an additional 40 GB disk because I thought I might need more space (it doesn't seem to be in use). When I try to sync three months' worth of Bing Ads data, it crashes with the error message: "Failure Origin: destination, Message: Something went wrong in the connector. See the logs for more detail." What could be the issue?
    63c72a05_6590_4758_b3d2_a8302f8dc086_logs_2_txt.txt
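    With a 10 GB boot disk, one thing worth ruling out before digging into the connector logs is Docker filling the root filesystem; the extra 40 GB disk only helps if Docker's data directory actually lives on it. A quick check (default data-root assumed; may need root to read it):
    ```python
    import shutil

    # Free space where Docker keeps images, volumes, and container logs.
    for path in ("/", "/var/lib/docker"):
        total, used, free = shutil.disk_usage(path)
        print(f"{path:<18} {free / 1e9:6.1f} GB free of {total / 1e9:6.1f} GB")
    ```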
  • Zawar Khan

    02/04/2023, 11:08 PM
    Hi Team, my PR has not received a response for the last 2 months. Can anyone take a look? https://github.com/airbytehq/airbyte/pull/19930
  • Assaf Pinhasi

    02/05/2023, 8:03 AM
    Hi, I am using Airbyte to load data to BigQuery. My users don't want to have to scroll through hundreds of `_airbyte_raw` tables in their schema. Is there a way to specify one schema for the `_airbyte_raw_*` tables and a separate schema for the normalized data? According to this, it seems that all of the destination namespace configuration applies to both raw and normalized data together… Thanks
  • Lior Chen

    02/05/2023, 3:01 PM
    Hey All, my Airbyte k8s deployment is not persisting state. The sync completes successfully but there's no state in the `state` table in the database. Connector: Shopify. Airbyte version: 0.40.27. Destination: Snowflake.
    2023-02-05 12:49:47 replication-orchestrator > State in DefaultReplicationWorker from destination: io.airbyte.protocol.models.AirbyteMessage@1883e30e[type=STATE,log=<null>,spec=<null>,connectionStatus=<null>,catalog=<null>,record=<null>,state=io.airbyte.protocol.models.AirbyteStateMessage@782370d9[type=STREAM,stream=io.airbyte.protocol.models.AirbyteStreamState@40d139b5[streamDescriptor=io.airbyte.protocol.models.StreamDescriptor@7d4eb003[name=customers,namespace=<null>,additionalProperties={}],streamState={"updated_at":"2023-02-05T13:22:39+01:00"},additionalProperties={}],global=<null>,data={"customers":{"updated_at":"2023-02-05T13:22:39+01:00"}},additionalProperties={}],trace=<null>,control=<null>,additionalProperties={}]
    2023-02-05 12:49:47 replication-orchestrator > State in DefaultReplicationWorker from destination: io.airbyte.protocol.models.AirbyteMessage@62672ce4[type=STATE,log=<null>,spec=<null>,connectionStatus=<null>,catalog=<null>,record=<null>,state=io.airbyte.protocol.models.AirbyteStateMessage@57742206[type=STREAM,stream=io.airbyte.protocol.models.AirbyteStreamState@3197f389[streamDescriptor=io.airbyte.protocol.models.StreamDescriptor@761d7bfe[name=order_refunds,namespace=<null>,additionalProperties={}],streamState={"orders":{"updated_at":"2023-02-05T13:22:41+01:00"},"created_at":"2023-02-03T10:25:32+01:00"},additionalProperties={}],global=<null>,data={"customers":{"updated_at":"2023-02-05T13:22:39+01:00"},"order_refunds":{"orders":{"updated_at":"2023-02-05T13:22:41+01:00"},"created_at":"2023-02-03T10:25:32+01:00"}},additionalProperties={}],trace=<null>,control=<null>,additionalProperties={}]
    2023-02-05 12:49:47 replication-orchestrator > State in DefaultReplicationWorker from destination: io.airbyte.protocol.models.AirbyteMessage@55d1b121[type=STATE,log=<null>,spec=<null>,connectionStatus=<null>,catalog=<null>,record=<null>,state=io.airbyte.protocol.models.AirbyteStateMessage@480ef2ee[type=STREAM,stream=io.airbyte.protocol.models.AirbyteStreamState@5698b2fb[streamDescriptor=io.airbyte.protocol.models.StreamDescriptor@2d64ed83[name=orders,namespace=<null>,additionalProperties={}],streamState={"updated_at":"2023-02-05T13:22:41+01:00"},additionalProperties={}],global=<null>,data={"customers":{"updated_at":"2023-02-05T13:22:39+01:00"},"order_refunds":{"orders":{"updated_at":"2023-02-05T13:22:41+01:00"},"created_at":"2023-02-03T10:25:32+01:00"},"orders":{"updated_at":"2023-02-05T13:22:41+01:00"}},additionalProperties={}],trace=<null>,control=<null>,additionalProperties={}]
    2023-02-05 12:49:47 replication-orchestrator > State in DefaultReplicationWorker from destination: io.airbyte.protocol.models.AirbyteMessage@59de70be[type=STATE,log=<null>,spec=<null>,connectionStatus=<null>,catalog=<null>,record=<null>,state=io.airbyte.protocol.models.AirbyteStateMessage@53dc86f3[type=STREAM,stream=io.airbyte.protocol.models.AirbyteStreamState@6a0948e[streamDescriptor=io.airbyte.protocol.models.StreamDescriptor@5c5b9109[name=tender_transactions,namespace=<null>,additionalProperties={}],streamState={"processed_at":"2023-02-05T13:22:36+01:00"},additionalProperties={}],global=<null>,data={"customers":{"updated_at":"2023-02-05T13:22:39+01:00"},"order_refunds":{"orders":{"updated_at":"2023-02-05T13:22:41+01:00"},"created_at":"2023-02-03T10:25:32+01:00"},"orders":{"updated_at":"2023-02-05T13:22:41+01:00"},"tender_transactions":{"processed_at":"2023-02-05T13:22:36+01:00"}},additionalProperties={}],trace=<null>,control=<null>,additionalProperties={}]
    2023-02-05 12:49:47 replication-orchestrator > State in DefaultReplicationWorker from destination: io.airbyte.protocol.models.AirbyteMessage@437042d0[type=STATE,log=<null>,spec=<null>,connectionStatus=<null>,catalog=<null>,record=<null>,state=io.airbyte.protocol.models.AirbyteStateMessage@4323506e[type=STREAM,stream=io.airbyte.protocol.models.AirbyteStreamState@67205e17[streamDescriptor=io.airbyte.protocol.models.StreamDescriptor@13766afd[name=transactions,namespace=<null>,additionalProperties={}],streamState={"orders":{"updated_at":"2023-02-05T13:22:41+01:00"},"created_at":"2023-02-05T13:22:36+01:00"},additionalProperties={}],global=<null>,data={"customers":{"updated_at":"2023-02-05T13:22:39+01:00"},"order_refunds":{"orders":{"updated_at":"2023-02-05T13:22:41+01:00"},"created_at":"2023-02-03T10:25:32+01:00"},"orders":{"updated_at":"2023-02-05T13:22:41+01:00"},"tender_transactions":{"processed_at":"2023-02-05T13:22:36+01:00"},"transactions":{"orders":{"updated_at":"2023-02-05T13:22:41+01:00"},"created_at":"2023-02-05T13:22:36+01:00"}},additionalProperties={}],trace=<null>,control=<null>,additionalProperties={}]
    2023-02-05 12:49:47 destination > Completed integration: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
    2023-02-05 12:49:47 destination > Completed destination: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
    2023-02-05 12:49:48 replication-orchestrator > (pod: default / destination-snowflake-write-9991-0-purzr) - Closed all resources for pod
    2023-02-05 12:49:48 replication-orchestrator > Source and destination threads complete.
    2023-02-05 12:49:48 replication-orchestrator > Source output at least one state message
    2023-02-05 12:49:48 replication-orchestrator > State capture: Updated state to: Optional[io.airbyte.config.State@73da303e[state=[{"type":"STREAM","stream":{"stream_descriptor":{"name":"transactions"},"stream_state":{"orders":{"updated_at":"2023-02-05T13:22:41+01:00"},"created_at":"2023-02-05T13:22:36+01:00"}}}]]]
    2023-02-05 12:49:48 replication-orchestrator > sync summary: {
      "status" : "completed",
      "recordsSynced" : 720,
      "bytesSynced" : 2025353,
      "startTime" : 1675601180508,
    691fa6a3_11e4_40fc_a7a7_08053c91b2c5_logs_9991_txt (2).txt
  • Lenin Mishra

    02/05/2023, 3:10 PM
    It seems that building custom connectors is failing on the new version, 0.40.32. It was working well with 0.40.27 and previous versions. Is it possible to downgrade the Airbyte version and give it another try?
  • Wajdi M

    02/05/2023, 6:59 PM
    Hey All, are the Airbyte log timestamps UTC? If not, how can I change them to UTC or another format? Is there a way to edit that? Also, is there a way to ship my Airbyte logs directly to a log service like CloudWatch or GCP Logging instead of a local log file?
  • Sabbiu Shah

    02/06/2023, 12:50 AM
    Hi all, I have a question regarding the `Primary key` definition. I receive a nested JSON response from one of the Square APIs (Orders), and the data we receive from `orders` looks like this:
    {
       "uid": "abc-123" // this is setup as primary key in our source",
       "line_items": [
          {
             "uid": "cde-234", // can we set nested value as primary key as well?
             // other fields 
          }
       ],
       // other data
    }
    During an incremental sync, if we receive any changes in `orders`, it updates the previous `orders` correctly, but it creates new `line_items` with the same id. Can we also define `primary_keys` for these nested fields (somewhere)? If not, how should we approach this?
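    As far as I know, the primary key in the configured catalog applies to the top-level record, and normalization unnests `line_items` into its own table linked to the parent order rather than deduplicating on the nested `uid`. If the duplicates need resolving downstream, one approach is a post-processing step that keeps only the latest row per (order uid, line item uid); a sketch of the idea:
    ```python
    # Stand-in for raw records synced from the Orders endpoint.
    orders = [
        {"uid": "abc-123", "line_items": [{"uid": "cde-234", "qty": 1}]},
        {"uid": "abc-123", "line_items": [{"uid": "cde-234", "qty": 2}]},  # updated copy
    ]

    latest = {}
    for order in orders:
        for item in order.get("line_items", []):
            # Later records overwrite earlier ones for the same composite key.
            latest[(order["uid"], item["uid"])] = item

    for (order_uid, item_uid), item in latest.items():
        print(order_uid, item_uid, item)
    ```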
  • Lenin Mishra

    02/06/2023, 4:55 AM
    Anyone experienced with building custom Airbyte connectors? I can pay per hour (if results are delivered as per the requirements!)
  • Keshav Agarwal

    02/06/2023, 7:17 AM
    Hi, can we change the temp table names Airbyte creates, i.e. rename `_airbyte_tmp` to something else?