# ask-community-for-troubleshooting
  • Daniel Antwi (08/29/2024, 2:21 PM)
    Hi guys, I'm running PyAirbyte in an Airflow DAG inside a Docker container using docker-compose. I have a Postgres server on my local host machine that I'm trying to connect to as a PostgresCache. I've added extra_hosts: - "host.docker.internal:host-gateway" to my docker-compose YAML and provided a host value of host.docker.internal in my PostgresCache constructor. However, it's still not able to connect to Postgres on the host machine. Some direction would be greatly appreciated. Thanks
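    A minimal sketch of the setup described above, assuming PyAirbyte's documented PostgresCache arguments and placeholder credentials. With extra_hosts in place, the usual remaining culprits are on the Postgres side: the host server must listen on a non-loopback interface (listen_addresses in postgresql.conf) and pg_hba.conf must allow connections from the Docker bridge network.
    # Hedged sketch: point PyAirbyte's PostgresCache at a Postgres server running on
    # the Docker host. Assumes the compose service has
    #   extra_hosts: ["host.docker.internal:host-gateway"]
    # and that the credentials and database below (placeholders) match your server.
    from airbyte.caches import PostgresCache

    cache = PostgresCache(
        host="host.docker.internal",  # resolves to the Docker host via host-gateway
        port=5432,
        username="postgres",          # placeholder credentials
        password="secret",
        database="pyairbyte_cache",
    )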
  • user (08/29/2024, 6:01 PM)
    #44913 Modification of field within nested path (for pagination) New discussion created by KrishivGubba I'm trying to build a source connector for the Bill API, and I'm trying to implement pagination in the Airbyte UI, but I'm facing an issue. This is what the request body to a Bill API endpoint looks like:
    { "data": "{\"start\":0,\"max\":999}", "devKey": "xxx", "sessionId": "xxx" }
    For pagination, I need the "start" field, which is nested within the data object, to be modified/incremented (by 999 in this case). Note: I need pagination because the maximum number of records the endpoint can return in one go is 999 (which is what I've set it to above). My problem lies in the Airbyte connector builder, where you cannot modify a nested field for pagination: [screenshot] As you can see, the Key Name field only asks for a key, not the path to the key. By the way, I need the body data to be in URL-encoded format, and I need the injected field to be in the same format as well. Could you please add this feature? I need to be able to modify the field according to the path I provide. Thank you! airbytehq/airbyte
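    A hedged sketch of the behavior being requested (plain Python, outside the connector builder, since the builder cannot do this today per the message above): increment a field nested inside the URL-encoded "data" value between pages. The endpoint URL and response shape are assumptions.
    # Paginate by rewriting a nested offset inside a form-encoded JSON string.
    import json
    import requests

    BASE_URL = "https://api.bill.com/api/v2/List/Bill.json"  # hypothetical endpoint
    PAGE_SIZE = 999

    def fetch_all(dev_key: str, session_id: str) -> list:
        records, start = [], 0
        while True:
            body = {
                "data": json.dumps({"start": start, "max": PAGE_SIZE}),
                "devKey": dev_key,
                "sessionId": session_id,
            }
            # requests form-encodes the dict, so "data" stays a URL-encoded JSON string
            resp = requests.post(BASE_URL, data=body, timeout=30)
            resp.raise_for_status()
            page = resp.json().get("response_data", [])  # response shape is an assumption
            records.extend(page)
            if len(page) < PAGE_SIZE:
                return records
            start += PAGE_SIZE  # advance the nested "start" offset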
  • Lucas Abreu (08/29/2024, 7:54 PM)
    Hi everyone, I'm using OTel as the Airbyte metrics collector, and it seems that all data for GSheets comes through as zero; other integrations seem to be OK. There is nothing in the OpenTelemetry collector logs nor in the Airbyte metrics server. Has anyone seen this?
  • Rajesh K (08/30/2024, 4:58 AM)
    Does Airbyte support a self-hosted MongoDB cluster with self-signed SSL as a source?
  • Antoine (08/30/2024, 6:48 AM)
    Hi everyone, we are syncing from our production database (Postgres source) to our warehouse database (Postgres destination). Our connection was working fine before August 20, but we paused it for a few days to manage our billing. The WAL grew quite a lot on our source database (status 'extended'), without any loss, because we never purge the WAL. Since the 28th we have been trying to unpause the connection, but it fails with the error "Socket is closed". We sometimes see this INFO log:
    2024-08-29 05:33:52 source > INFO main i.a.c.i.d.i.DebeziumRecordIterator(requestClose):276 No records were returned by Debezium in the timeout seconds 600, closing the engine and iterator
    We tried increasing the initial sync timeout to the maximum (2400), but it did not help. Do you have any pro tips to make it work? Note: the AI bot tells me it may come from the fact that we do not synchronize enough tables from the source, but we are syncing 97 tables out of 122 from our database. Is that not enough? Thanks in advance for your help! Yours faithfully, LCDP
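    For anyone triaging a similar backlog, here is a hedged diagnostic sketch (assumes psycopg2, a placeholder DSN, and a role allowed to read pg_replication_slots): it compares the slot's confirmed LSN with the current WAL position, which quantifies how far behind the paused connection's slot is before you unpause.
    # Hedged sketch: measure logical replication slot lag on the source database.
    # Connection details are placeholders.
    import psycopg2

    conn = psycopg2.connect("host=prod-db dbname=app user=airbyte password=...")
    with conn, conn.cursor() as cur:
        cur.execute("""
            SELECT slot_name,
                   confirmed_flush_lsn,
                   pg_current_wal_lsn() AS current_lsn,
                   pg_size_pretty(
                       pg_wal_lsn_diff(pg_current_wal_lsn(), confirmed_flush_lsn)
                   ) AS replication_lag
            FROM pg_replication_slots
            WHERE slot_type = 'logical';
        """)
        for row in cur.fetchall():
            print(row)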
  • Stefano Messina (08/30/2024, 7:21 AM)
    Hi all, I've been encountering an issue performing syncs in "Incremental | Append" mode from MSSQL sources: tables randomly get completely re-loaded, as if Airbyte cannot correctly evaluate the cursor on LastUpdated (microsecond resolution, but without a timezone offset). Has anyone encountered this problem before?
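    A speculative illustration (assuming the saved state is persisted at lower precision than the column): if sub-second precision is lost anywhere between the cursor column and the stored state, the comparison no longer lines up and rows can be re-read.
    # Hedged illustration only, not Airbyte's actual state handling: a cursor saved
    # with truncated microseconds compares as "older" than the row it came from,
    # so the same row is re-emitted on the next sync.
    from datetime import datetime

    row_cursor  = datetime(2024, 8, 30, 7, 21, 5, 123456)  # LastUpdated value in MSSQL
    saved_state = row_cursor.replace(microsecond=0)         # state stored at second precision

    print(row_cursor > saved_state)  # True: the row looks "new" again next sync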
  • user (08/30/2024, 11:11 AM)
    #44924 Supporting rootless (non-root) connectors with v0.63.9 New discussion created by nataliekwong What: As a security best practice, Airbyte has begun migrating our connectors to run as rootless (non-root) versions. Most organizations' IT/Ops departments mandate non-root privileges for containers in their Kubernetes clusters. To ensure Airbyte aligns with those security policies, the Airbyte platform v0.63.9 was updated to run both root and non-root versions of connectors. Next steps: If you have upgraded to Airbyte v0.63.9 or beyond, no action is required. If you have not yet upgraded to v0.63.9 or beyond, you are required to update your Airbyte platform version. If you do not upgrade, your syncs will begin to fail as connectors are migrated to the non-root version. We expect all connectors to be migrated to their non-root versions by the end of October 2024. airbytehq/airbyte
  • PaoloF (08/30/2024, 11:48 AM)
    Hi, I have tried to use the Commercetools source connector, but I'm running into problems. The generated request URL is missing the project_key parameter (defined in the config):
    airbyte_cdk.sources.declarative.exceptions.ReadException: Request to https://api.europe-west1.gcp.commercetools.com/orders?sort=lastModifiedAt+asc&where=lastModifiedAt+%3E%3D+%3Adate&limit=500&var.date=2024-06-01T00%3A00%3A00.000000%2B0000 failed with status code 404 and error message Not found
    Any idea? Thanks
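    A hedged sketch of what the URL should look like: per the commercetools HTTP API docs, resources are scoped under the project key as a path segment, which is the piece missing from the failing URL above. Host, credentials, and token handling are placeholders.
    # Build the request the connector should be making; PROJECT_KEY comes from the
    # connector config, the bearer token is a placeholder.
    import requests

    PROJECT_KEY = "my-project"
    BASE = "https://api.europe-west1.gcp.commercetools.com"

    resp = requests.get(
        f"{BASE}/{PROJECT_KEY}/orders",
        params={
            "sort": "lastModifiedAt asc",
            "where": "lastModifiedAt >= :date",
            "var.date": "2024-06-01T00:00:00.000000+0000",
            "limit": 500,
        },
        headers={"Authorization": "Bearer <access-token>"},
        timeout=30,
    )
    print(resp.status_code)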
  • Ravi Nathwani (08/30/2024, 6:39 PM)
    Hi @all, can we upgrade Airbyte deployed on Kubernetes from application version 0.59.1 (Helm chart 0.67.17)? When I try to upgrade beyond this version, it asks for an enterprise license key. Helm chart: https://artifacthub.io/packages/helm/airbyte/airbyte/0.78.4 Has anyone tried upgrading Airbyte beyond this version?
    enterprise:
        # -- Secret name where an Airbyte license key is stored
        secretName: "airbyte-config-secrets"
        # -- The key within `licenseKeySecretName` where the Airbyte license key is stored
        licenseKeySecretKey: "license-key"
  • Mike Nguyen (08/31/2024, 3:04 PM)
    Hi team, I am trying to do CDC from Aurora Postgres 16 to MySQL RDS with the latest open-source Airbyte deployed on my Mac, following this guide for deployment (abctl version: v0.13.1). But I can never get it working: the table is created at the MySQL destination, but there are no records in it. The job log has an error I cannot understand. Could anyone shed some light on how to resolve it? Quite frustrated, as I could not find any solution in Slack or on the internet :(
    2024-08-31 14:50:39 destination > ERROR main i.a.c.i.b.AirbyteExceptionHandler(uncaughtException):30 Something went wrong in the connector. See the logs for more details.
    java.util.concurrent.CompletionException: java.lang.RuntimeException: java.sql.SQLSyntaxErrorException: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '`intermediate_data` as (select cast(JSON_VALUE(JSON_EXTRACT(`_airbyte_data`, '$.' at line 1
        at java.base/java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:315) ~[?:?]
        at java.base/java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:320) ~[?:?]
        at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1770) ~[?:?]
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
        at java.base/java.lang.Thread.run(Thread.java:1583) [?:?]
    Caused by: java.lang.RuntimeException: java.sql.SQLSyntaxErrorException: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '`intermediate_data` as (select cast(JSON_VALUE(JSON_EXTRACT(`_airbyte_data`, '$.' at line 1
        at io.airbyte.integrations.base.destination.typing_deduping.DefaultTyperDeduper.prepareTablesFuture$lambda$4(DefaultTyperDeduper.kt:264) ~[airbyte-cdk-typing-deduping-0.33.0.jar:?]
        at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1768) ~[?:?]
        ... 3 more
    Caused by: java.sql.SQLSyntaxErrorException: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '`intermediate_data` as (select cast(JSON_VALUE(JSON_EXTRACT(`_airbyte_data`, '$.' at line 1
        at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:120) ~[mysql-connector-java-8.0.22.jar:8.0.22]
        at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:97) ~[mysql-connector-java-8.0.22.jar:8.0.22]
        at com.mysql.cj.jdbc.exceptions.SQLExceptionsMapping.translateException(SQLExceptionsMapping.java:122) ~[mysql-connector-java-8.0.22.jar:8.0.22]
        at com.mysql.cj.jdbc.StatementImpl.executeInternal(StatementImpl.java:764) ~[mysql-connector-java-8.0.22.jar:8.0.22]
        at com.mysql.cj.jdbc.StatementImpl.execute(StatementImpl.java:648) ~[mysql-connector-java-8.0.22.jar:8.0.22]
        at com.zaxxer.hikari.pool.ProxyStatement.execute(ProxyStatement.java:94) ~[HikariCP-5.1.0.jar:?]
        at com.zaxxer.hikari.pool.HikariProxyStatement.execute(HikariProxyStatement.java) ~[HikariCP-5.1.0.jar:?]
        at io.airbyte.cdk.db.jdbc.JdbcDatabase.executeWithinTransaction$lambda$1(JdbcDatabase.kt:46) ~[airbyte-cdk-core-0.33.0.jar:?]
        at io.airbyte.cdk.db.jdbc.DefaultJdbcDatabase.execute(DefaultJdbcDatabase.kt:30) ~[airbyte-cdk-core-0.33.0.jar:?]
        at io.airbyte.cdk.db.jdbc.JdbcDatabase.executeWithinTransaction(JdbcDatabase.kt:43) ~[airbyte-cdk-core-0.33.0.jar:?]
        at io.airbyte.cdk.integrations.destination.jdbc.typing_deduping.JdbcDestinationHandler.execute(JdbcDestinationHandler.kt:180) ~[airbyte-cdk-db-destinations-0.33.0.jar:?]
        at io.airbyte.integrations.base.destination.typing_deduping.TypeAndDedupeTransaction.executeTypeAndDedupe(TypeAndDedupeTransaction.kt:56) ~[airbyte-cdk-typing-deduping-0.33.0.jar:?]
        at io.airbyte.integrations.base.destination.typing_deduping.TypeAndDedupeTransaction.executeSoftReset(TypeAndDedupeTransaction.kt:92) ~[airbyte-cdk-typing-deduping-0.33.0.jar:?]
        at io.airbyte.integrations.base.destination.typing_deduping.DefaultTyperDeduper.prepareTablesFuture$lambda$4(DefaultTyperDeduper.kt:222) ~[airbyte-cdk-typing-deduping-0.33.0.jar:?]
        at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1768) ~[?:?]
        ... 3 more
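    A hedged observation: the failing statement uses MySQL's JSON_VALUE function, which, if I recall the MySQL docs correctly, only exists as of MySQL 8.0.21. A syntax error right at JSON_VALUE(JSON_EXTRACT(... would be consistent with the destination RDS instance running an older server (e.g., 5.7). A quick check, assuming mysql-connector-python and placeholder credentials:
    # Check the server version and whether JSON_VALUE parses at all.
    import mysql.connector

    conn = mysql.connector.connect(host="mysql-rds-host", user="airbyte", password="...")
    cur = conn.cursor()
    cur.execute("SELECT VERSION()")
    print("server version:", cur.fetchone()[0])

    try:
        cur.execute("""SELECT JSON_VALUE('{"a": 1}', '$.a')""")
        print("JSON_VALUE supported, returned:", cur.fetchone()[0])
    except mysql.connector.Error as err:
        print("JSON_VALUE not supported:", err)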
  • guyarad (09/01/2024, 7:57 AM)
    [PRODUCTION ISSUE]
    We have an S3 destination with the filename pattern {sync_id}/{part_number}{format_extension}. Until 2 days ago things worked fine. Suddenly, the sync_id gets an empty value. We don't get an error; the file is stored as ...//1.jsonl. We have other identical sources (different source entities in Airbyte) with identical connection configuration that do work. I fetched the source/connection configuration via the API and they are identical (except for the name and ID). They all use the same destination object. Please assist.
  • user (09/01/2024, 4:51 PM)
    #45070 Can one use `mockserver` to mock Stripe too New discussion created by kasir-barati I know this might be out of place, but I wanted to know whether it is possible for me to mock Stripe with `mockserver`. I read this and I know it is possible for the server, but I am not sure about Stripe. https://github.com/airbytehq/airbyte/blob/master/airbyte-cdk/python/README.md#when-you-dont-have-access-to-the-api airbytehq/airbyte
  • Scheduled message (09/02/2024, 4:00 AM)
    Please post your weekly update in thread🧵. Thanks, team!
  • Nihal V Velpula (09/02/2024, 5:26 AM)
    Hi everyone. We're testing Airbyte open source and trying to add a Postgres source, but I constantly get this error: "State code: 08001; Message: Connection to localhost:5432 refused. Check that the hostname and port are correct and that the postmaster is accepting TCP/IP connections." I'm on a Mac and installed Airbyte through abctl, with Python 3.11 and Postgres 16. I tried both the default postgres user and a dedicated read-only user I created, but I still get the same error. DBeaver is able to form a connection, so I'm kind of stuck. Could someone who is using Postgres as a source please help me out? Thank you.
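    A note that may explain this: abctl runs Airbyte inside a kind cluster in Docker, so "localhost" in the source config resolves to the connector's own container, not the Mac. On Docker Desktop the host machine is usually reachable as host.docker.internal instead. A minimal reachability sketch (port and hostnames as in the message above):
    # Minimal sketch: check which hostname reaches the Mac's Postgres from inside
    # a container. "localhost" here is the container itself; Docker Desktop
    # publishes the host machine as host.docker.internal.
    import socket

    for host in ("localhost", "host.docker.internal"):
        try:
            with socket.create_connection((host, 5432), timeout=3):
                print(f"{host}:5432 reachable")
        except OSError as err:
            print(f"{host}:5432 NOT reachable: {err}")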
  • user (09/02/2024, 6:51 AM)
    #45077 Unable to connect to any database in Postgres New discussion created by nihalvv Hi everyone, we're trying to connect a Postgres source but constantly get the following error: "Configuration check failed State code: 08001; Message: Connection to localhost:5432 refused. Check that the hostname and port are correct and that the postmaster is accepting TCP/IP connections." The Postgres config file is listening on localhost, and I'm able to form a connection in DBeaver as well. We're unable to figure out what's wrong and are kind of stuck right now. I've installed using abctl, with Python 3.11 and Postgres 16. These are the settings: SSL: prefer, xmin replication, no SSH tunneling, host localhost, port 5432. Could someone please help me out? airbytehq/airbyte
  • Kaan Erdoğan (09/02/2024, 10:03 AM)
    Hi, I am getting a "Could not resolve placeholder ${AIRBYTE_API_HOST}" error when deploying Airbyte with Terraform. Has anybody faced this error?
  • Hari Haran R (09/02/2024, 10:05 AM)
    Hi, I'm building a custom connector for the Inspectlet API. Which pagination method is suitable (offset, incremental, or cursor pagination)?
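    For orientation, hedged generic sketches of the two most common styles (endpoint and parameter names are placeholders, not the Inspectlet API): use offset pagination if the API takes a numeric start index, cursor pagination if each response returns a token for the next page.
    # Generic pagination sketches; URL and parameter names are placeholders.
    import requests

    def offset_paginate(url: str, page_size: int = 100):
        offset = 0
        while True:
            page = requests.get(url, params={"offset": offset, "limit": page_size}).json()
            if not page["items"]:
                return
            yield from page["items"]
            offset += page_size

    def cursor_paginate(url: str):
        cursor = None
        while True:
            params = {"cursor": cursor} if cursor else {}
            page = requests.get(url, params=params).json()
            yield from page["items"]
            cursor = page.get("next_cursor")
            if not cursor:
                return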
  • Alexandre RG (09/02/2024, 1:01 PM)
    Hi, I'm trying to set the password and email with abctl, but the changes don't apply to my cluster and I can't log in. Thanks in advance
  • user (09/02/2024, 3:34 PM)
    #45087 [abctl] failed to verify certificate: x509 New discussion created by atlasrule
    What happened? Hello, when I start abctl.exe on a Windows host using WSL Docker, I get the following unknown-authority certificate error:
    INFO Using Kubernetes provider: Provider: kind; Kubeconfig: C:\Users\atlasrule\.airbyte\abctl\abctl.kubeconfig; Context: kind-airbyte-abctl
    SUCCESS Found Docker installation: version 27.0.3
    SUCCESS Port 8000 appears to be available
    SUCCESS Existing cluster 'airbyte-abctl' found
    SUCCESS Cluster 'airbyte-abctl' validation complete
    ERROR Failed to initialize 'local' command
    ERROR unable to initialize local command: error communicating with kubernetes: unable to fetch kubernetes server version: Get "https://127.0.0.1:61690/version": tls: failed to verify certificate: x509: certificate signed by unknown authority
    INFO An error occurred while communicating with the Kubernetes cluster. If this error persists, you may need to run the uninstall command before attempting to run the install command again.
    What did you expect to happen? I couldn't find any bypass or ignore-certificate flag in abctl, and I also couldn't edit the Dockerfile to install a certificate. Can you help me with the convention for this error?
    Abctl version: v0.12.2
    Docker version: Client 27.0.3 (API 1.46, Go 1.21.11, git 7d4bcd8, built Sat Jun 29 00:03:32 2024, windows/amd64, context desktop-linux); Server: Docker Desktop 4.32.0 (157355), Engine 27.0.3 (API 1.46, minimum 1.24, Go 1.21.11, git 662f78c, built Sat Jun 29 00:02:50 2024, linux/amd64, Experimental: false), containerd 1.7.18 (ae71819c4f5e67bb4d5ae76a6b735f29cc25774e), runc 1.7.18 (v1.1.13-0-g58aa920), docker-init 0.19.0 (de40ad0)
    OS version: Microsoft Windows 10 Pro, 64-bit, version 10.0.19045 (build 19045)
    airbytehq/airbyte
  • user (09/02/2024, 3:35 PM)
    Comment on #45087 failed to verify certificate: x509 Discussion answered by atlasrule
    "Can you check the discussion on the Docker forum? It looks like your env is missing the certificates."
    Thank you very much; I solved the issue by installing on a different system and network. airbytehq/airbyte
  • Sumit Kumar (09/02/2024, 3:41 PM)
    Hi everyone, I've deployed Airbyte on a GKE cluster using Helm. However, while syncing data from various data sources to BigQuery, I'm encountering an error (details added below). I've already upgraded Airbyte to the latest version, but the issue still persists. Any help would be greatly appreciated. Thanks! io.airbyte.workload.launcher.pipeline.stages.model.StageError: io.airbyte.workload.launcher.pods.KubeClientException: Failed to create pod source-zendesk-support-check-1663-1-qwqxh. at io.airbyte.workload.launcher.pipeline.stages.model.Stage.apply(Stage.kt:46) at io.airbyte.workload.launcher.pipeline.stages.LaunchPodStage.apply(LaunchPodStage.kt:38) at io.airbyte.workload.launcher.pipeline.stages.$LaunchPodStage$Definition$Intercepted.$$access$$apply(Unknown Source) at io.airbyte.workload.launcher.pipeline.stages.$LaunchPodStage$Definition$Exec.dispatch(Unknown Source) at io.micronaut.context.AbstractExecutableMethodsDefinition$DispatchedExecutableMethod.invoke(AbstractExecutableMethodsDefinition.java:456) at io.micronaut.aop.chain.MethodInterceptorChain.proceed(MethodInterceptorChain.java:129) at io.airbyte.metrics.interceptors.InstrumentInterceptorBase.doIntercept(InstrumentInterceptorBase.kt:61) at io.airbyte.metrics.interceptors.InstrumentInterceptorBase.intercept(InstrumentInterceptorBase.kt:44) at io.micronaut.aop.chain.MethodInterceptorChain.proceed(MethodInterceptorChain.java:138) at io.airbyte.workload.launcher.pipeline.stages.$LaunchPodStage$Definition$Intercepted.apply(Unknown Source) at io.airbyte.workload.launcher.pipeline.stages.LaunchPodStage.apply(LaunchPodStage.kt:24) at reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:132) at reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:158) at reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:158) at reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:158) at reactor.core.publisher.Operators$ScalarSubscription.request(Operators.java:2571) at reactor.core.publisher.MonoFlatMap$FlatMapMain.request(MonoFlatMap.java:194) at reactor.core.publisher.MonoFlatMap$FlatMapMain.request(MonoFlatMap.java:194) at reactor.core.publisher.MonoFlatMap$FlatMapMain.request(MonoFlatMap.java:194) at reactor.core.publisher.MonoFlatMap$FlatMapMain.request(MonoFlatMap.java:194) at reactor.core.publisher.Operators$MultiSubscriptionSubscriber.set(Operators.java:2367) at reactor.core.publisher.FluxOnErrorResume$ResumeSubscriber.onSubscribe(FluxOnErrorResume.java:74) at reactor.core.publisher.MonoFlatMap$FlatMapMain.onSubscribe(MonoFlatMap.java:117) at reactor.core.publisher.MonoFlatMap$FlatMapMain.onSubscribe(MonoFlatMap.java:117) at reactor.core.publisher.MonoFlatMap$FlatMapMain.onSubscribe(MonoFlatMap.java:117) at reactor.core.publisher.MonoFlatMap$FlatMapMain.onSubscribe(MonoFlatMap.java:117) at reactor.core.publisher.FluxFlatMap.trySubscribeScalarMap(FluxFlatMap.java:193) at reactor.core.publisher.MonoFlatMap.subscribeOrReturn(MonoFlatMap.java:53) at reactor.core.publisher.Mono.subscribe(Mono.java:4552) at reactor.core.publisher.MonoSubscribeOn$SubscribeOnSubscriber.run(MonoSubscribeOn.java:126) at reactor.core.scheduler.ImmediateScheduler$ImmediateSchedulerWorker.schedule(ImmediateScheduler.java:84) at reactor.core.publisher.MonoSubscribeOn.subscribeOrReturn(MonoSubscribeOn.java:55) at reactor.core.publisher.Mono.subscribe(Mono.java:4552) at reactor.core.publisher.Mono.subscribeWith(Mono.java:4634) at reactor.core.publisher.Mono.subscribe(Mono.java:4395) at 
io.airbyte.workload.launcher.pipeline.LaunchPipeline.accept(LaunchPipeline.kt:50) at io.airbyte.workload.launcher.pipeline.consumer.LauncherMessageConsumer.consume(LauncherMessageConsumer.kt:28) at io.airbyte.workload.launcher.pipeline.consumer.LauncherMessageConsumer.consume(LauncherMessageConsumer.kt:12) at io.airbyte.commons.temporal.queue.QueueActivityImpl.consume(Internal.kt:87) at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:103) at java.base/java.lang.reflect.Method.invoke(Method.java:580) at io.temporal.internal.activity.RootActivityInboundCallsInterceptor$POJOActivityInboundCallsInterceptor.executeActivity(RootActivityInboundCallsInterceptor.java:64) at io.temporal.internal.activity.RootActivityInboundCallsInterceptor.execute(RootActivityInboundCallsInterceptor.java:43) at io.temporal.common.interceptors.ActivityInboundCallsInterceptorBase.execute(ActivityInboundCallsInterceptorBase.java:39) at io.temporal.opentracing.internal.OpenTracingActivityInboundCallsInterceptor.execute(OpenTracingActivityInboundCallsInterceptor.java:78) at io.temporal.internal.activity.ActivityTaskExecutors$BaseActivityTaskExecutor.execute(ActivityTaskExecutors.java:107) at io.temporal.internal.activity.ActivityTaskHandlerImpl.handle(ActivityTaskHandlerImpl.java:124) at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handleActivity(ActivityWorker.java:278) at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:243) at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:216) at io.temporal.internal.worker.PollTaskExecutor.lambda$process$0(PollTaskExecutor.java:105) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: io.airbyte.workload.launcher.pods.KubeClientException: Failed to create pod source-zendesk-support-check-1663-1-qwqxh. at io.airbyte.workload.launcher.pods.KubePodClient.launchConnectorWithSidecar(KubePodClient.kt:352) at io.airbyte.workload.launcher.pods.KubePodClient.launchCheck(KubePodClient.kt:279) at io.airbyte.workload.launcher.pipeline.stages.LaunchPodStage.applyStage(LaunchPodStage.kt:44) at io.airbyte.workload.launcher.pipeline.stages.LaunchPodStage.applyStage(LaunchPodStage.kt:24) at io.airbyte.workload.launcher.pipeline.stages.model.Stage.apply(Stage.kt:42) ... 53 more
Caused by: io.fabric8.kubernetes.client.KubernetesClientException: Failure executing: PATCH at: https://34.118.224.1:443/api/v1/namespaces/default/pods/source-zendesk-support-check-1663-1-qwqxh?fieldManager=fabric8. Message: pods "source-zendesk-support-check-1663-1-qwqxh" is forbidden: User "system:serviceaccount:default:airbyte-admin" cannot patch resource "pods" in API group "" in the namespace "default". Received status: Status(apiVersion=v1, code=403, details=StatusDetails(causes=[], group=null, kind=pods, name=source-zendesk-support-check-1663-1-qwqxh, retryAfterSeconds=null, uid=null, additionalProperties={}), kind=Status, message=pods "source-zendesk-support-check-1663-1-qwqxh" is forbidden: User "system:serviceaccount:default:airbyte-admin" cannot patch resource "pods" in API group "" in the namespace "default", metadata=ListMeta(_continue=null, remainingItemCount=null, resourceVersion=null, selfLink=null, additionalProperties={}), reason=Forbidden, status=Failure, additionalProperties={}).
at io.fabric8.kubernetes.client.KubernetesClientException.copyAsCause(KubernetesClientException.java:238) at io.fabric8.kubernetes.client.dsl.internal.OperationSupport.waitForResult(OperationSupport.java:507) at io.fabric8.kubernetes.client.dsl.internal.OperationSupport.handleResponse(OperationSupport.java:524) at io.fabric8.kubernetes.client.dsl.internal.OperationSupport.handlePatch(OperationSupport.java:419) at io.fabric8.kubernetes.client.dsl.internal.OperationSupport.handlePatch(OperationSupport.java:397) at io.fabric8.kubernetes.client.dsl.internal.BaseOperation.handlePatch(BaseOperation.java:764) at io.fabric8.kubernetes.client.dsl.internal.HasMetadataOperation.lambda$patch$2(HasMetadataOperation.java:231) at io.fabric8.kubernetes.client.dsl.internal.HasMetadataOperation.patch(HasMetadataOperation.java:236) at io.fabric8.kubernetes.client.dsl.internal.HasMetadataOperation.patch(HasMetadataOperation.java:251) at io.fabric8.kubernetes.client.dsl.internal.BaseOperation.serverSideApply(BaseOperation.java:1179) at io.fabric8.kubernetes.client.dsl.internal.BaseOperation.serverSideApply(BaseOperation.java:98) at io.airbyte.workload.launcher.pods.KubePodLauncher$create$1.invoke(KubePodLauncher.kt:57) at io.airbyte.workload.launcher.pods.KubePodLauncher$create$1.invoke(KubePodLauncher.kt:52) at io.airbyte.workload.launcher.pods.KubePodLauncher.runKubeCommand$lambda$0(KubePodLauncher.kt:307) at dev.failsafe.Functions.lambda$toCtxSupplier$11(Functions.java:243) at dev.failsafe.Functions.lambda$get$0(Functions.java:46) at dev.failsafe.internal.RetryPolicyExecutor.lambda$apply$0(RetryPolicyExecutor.java:74) at dev.failsafe.SyncExecutionImpl.executeSync(SyncExecutionImpl.java:187) at dev.failsafe.FailsafeExecutor.call(FailsafeExecutor.java:376) at dev.failsafe.FailsafeExecutor.get(FailsafeExecutor.java:112) at io.airbyte.workload.launcher.pods.KubePodLauncher.runKubeCommand(KubePodLauncher.kt:307) at io.airbyte.workload.launcher.pods.KubePodLauncher.create(KubePodLauncher.kt:52) at io.airbyte.workload.launcher.pods.KubePodClient.launchConnectorWithSidecar(KubePodClient.kt:349) ... 57 more
Caused by: io.fabric8.kubernetes.client.KubernetesClientException: Failure executing: PATCH at: https://34.118.224.1:443/api/v1/namespaces/default/pods/source-zendesk-support-check-1663-1-qwqxh?fieldManager=fabric8. Message: pods "source-zendesk-support-check-1663-1-qwqxh" is forbidden: User "system:serviceaccount:default:airbyte-admin" cannot patch resource "pods" in API group "" in the namespace "default". Received status: Status(apiVersion=v1, code=403, details=StatusDetails(causes=[], group=null, kind=pods, name=source-zendesk-support-check-1663-1-qwqxh, retryAfterSeconds=null, uid=null, additionalProperties={}), kind=Status, message=pods "source-zendesk-support-check-1663-1-qwqxh" is forbidden: User "system:serviceaccount:default:airbyte-admin" cannot patch resource "pods" in API group "" in the namespace "default", metadata=ListMeta(_continue=null, remainingItemCount=null, resourceVersion=null, selfLink=null, additionalProperties={}), reason=Forbidden, status=Failure, additionalProperties={}).
at io.fabric8.kubernetes.client.dsl.internal.OperationSupport.requestFailure(OperationSupport.java:660) at io.fabric8.kubernetes.client.dsl.internal.OperationSupport.requestFailure(OperationSupport.java:640) at io.fabric8.kubernetes.client.dsl.internal.OperationSupport.assertResponseCode(OperationSupport.java:589) at io.fabric8.kubernetes.client.dsl.internal.OperationSupport.lambda$handleResponse$0(OperationSupport.java:549) at java.base/java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:646) at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:510) at java.base/java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2179) at io.fabric8.kubernetes.client.http.StandardHttpClient.lambda$completeOrCancel$10(StandardHttpClient.java:142) at java.base/java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) at java.base/java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:510) at java.base/java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2179) at io.fabric8.kubernetes.client.http.ByteArrayBodyHandler.onBodyDone(ByteArrayBodyHandler.java:51) at java.base/java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) at java.base/java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:510) at java.base/java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2179) at io.fabric8.kubernetes.client.okhttp.OkHttpClientImpl$OkHttpAsyncBody.doConsume(OkHttpClientImpl.java:136) ... 3 more
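    The root cause in the trace is an RBAC denial: the airbyte-admin service account may not PATCH pods in the default namespace. A hedged way to confirm this from Python (assumes the official kubernetes client and a kubeconfig authenticated as that same service account):
    # Ask the API server whether the current identity may PATCH pods in "default",
    # which is exactly the operation the 403 above denies.
    from kubernetes import client, config

    config.load_kube_config()  # or config.load_incluster_config() inside a pod
    review = client.V1SelfSubjectAccessReview(
        spec=client.V1SelfSubjectAccessReviewSpec(
            resource_attributes=client.V1ResourceAttributes(
                namespace="default", verb="patch", resource="pods"
            )
        )
    )
    resp = client.AuthorizationV1Api().create_self_subject_access_review(review)
    print("patch pods allowed:", resp.status.allowed)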
  • Rajesh K (09/02/2024, 4:44 PM)
    I have deployed Airbyte and MongoDB on an EKS cluster in the same namespace. MongoDB is configured with a self-signed certificate and tested working from the MongoDB pods when using the internal service name and the cert files. But in the Airbyte UI, when I give the same connection URI, it does not connect as a source. What should I do?
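    A hedged connectivity check (assumes pymongo and the CA file used to sign the cluster certificate; the service name below is a placeholder): if this succeeds from a pod in the namespace but the Airbyte UI still fails, the likely issue is that the connector cannot see or trust the custom CA.
    # Verify the TLS chain against the self-signed CA from inside the cluster.
    from pymongo import MongoClient

    client = MongoClient(
        "mongodb://mongodb.airbyte.svc.cluster.local:27017",  # placeholder service name
        tls=True,
        tlsCAFile="/certs/ca.pem",      # CA that signed the server certificate
        serverSelectionTimeoutMS=5000,
    )
    print(client.admin.command("ping"))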
  • Alan Balcazar (09/02/2024, 6:09 PM)
    Hello, does anyone know how to solve this error? It's the Mongo connector 1.5.9: Terminating due to java.lang.OutOfMemoryError: Java heap space 2024-09-02 17:42:25 platform > readFromSource: source exception io.airbyte.workers.internal.exception.SourceException: Source process exited with non-zero exit code 3 at io.airbyte.workers.general.BufferedReplicationWorker.readFromSource(BufferedReplicationWorker.java:366) ~[io.airbyte-airbyte-commons-worker-0.50.46.jar:?] at io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsyncWithHeartbeatCheck$3(BufferedReplicationWorker.java:235) ~[io.airbyte-airbyte-commons-worker-0.50.46.jar:?] at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?] at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?] at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?] at java.base/java.lang.Thread.run(Thread.java:1583) [?:?]
  • user (09/02/2024, 6:15 PM)
    #45089 Sync from Postgres to Pinecone succeeds, but the record count in Pinecone is still 0 New discussion created by RosasSebastian2003 Hello, I'm trying to sync my Postgres database to a Pinecone index. Although the sync succeeds, my record count in Pinecone remains 0. [two screenshots attached] airbytehq/airbyte
  • Sumit Kumar (09/02/2024, 7:04 PM)
    Hi team, greetings. I've deployed Airbyte on a GKE cluster using Helm. While syncing data from various data sources to BigQuery, I was encountering the error: message='io.temporal.serviceclient.CheckedExceptionWrapper: io.airbyte.workers.exception.WorkerException: Running the launcher replication-orchestrator failed', type='java.lang.RuntimeException', nonRetryable=false. Later I upgraded Airbyte to the latest version and tried to sync the data again, but we have encountered a new issue. Any help would be greatly appreciated. Thanks!
  • Hari Haran R (09/03/2024, 3:32 AM)
    Hi, in the custom connector builder, how does the incremental sync method work?
    "Start datetime" is set to "user input" to allow the user of the connector configuring a Source to specify the time to start syncing
    What does this mean? When the user gives the input, does the sync start daily from the same date the user provided? For example, if the user gives 2020-09-01, does the daily run start from that same date every time? And what is the main difference between the cursor field and the start datetime field in the Airbyte UI?
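    A hedged sketch of the usual semantics (simplified; the names below are illustrative, not the builder's internals): the start datetime only bounds the very first sync. After that, the connector saves the highest cursor-field value it has seen as state and resumes from there, so it does not re-read from 2020-09-01 every day. The cursor field is the record field compared against that saved state; the start datetime is just the initial lower bound.
    # Simplified model of incremental sync state handling; names are illustrative.
    from datetime import datetime

    def sync_lower_bound(state: dict, start_datetime: str) -> datetime:
        """Lower bound for the records this sync should read."""
        if "cursor" in state:                              # later syncs resume from state
            return datetime.fromisoformat(state["cursor"])
        return datetime.fromisoformat(start_datetime)      # first sync uses the start date

    state: dict = {}
    print(sync_lower_bound(state, "2020-09-01T00:00:00+00:00"))  # first run: 2020-09-01
    state["cursor"] = "2024-09-02T00:00:00+00:00"                # saved after reading records
    print(sync_lower_bound(state, "2020-09-01T00:00:00+00:00"))  # next run: 2024-09-02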
  • user (09/03/2024, 6:27 AM)
    #45096 error: Postgres to AWS S3, "Saved offset is before replication slot's confirmed lsn" New discussion created by joew-extremenetworks Hello, how do I fix the error below when reading CDC from a Postgres database in AWS and writing to an AWS S3 bucket? The source and destination each tested successfully on their own. I also confirmed I was able to read from the DB with a separate script, as well as read and write to S3. I tried resetting the connection as well. Any help? Thanks. error: Saved offset is before replication slot's confirmed lsn. Reset instructions: https://www.restack.io/docs/airbyte-knowledge-airbyte-data-reset-guide#:~:text=Steps%20to%20Reset%20Your%20Data airbytehq/airbyte
  • Eric Yoo (09/03/2024, 6:57 AM)
    Hi, after the sync for a custom API connector was executed, the following log repeated forever before the sync was force-terminated:
    2024-09-02 08:31:35 destination > 2024-09-02T08:31:35,556`pool-3-thread-1`17`INFO`i.a.c.i.d.b.BufferManager(printQueueInfo):118 - [ASYNC QUEUE INFO] Global: max: 296.96 MB, allocated: 10 MB (10.0 MB), % used: 0.03367428551701215 | State Manager memory usage: Allocated: 10 MB, Used: 0 bytes, percentage Used 0.000000
    2024-09-02 08:31:35 destination > 2024-09-02T08:31:35,627`pool-5-thread-1`19`INFO`i.a.c.i.d.FlushWorkers(printWorkerInfo):143 - [ASYNC WORKER INFO] Pool queue size: 0, Active threads: 0
    The following log was output once in between:
    2024-09-02 11:30:31 platform > thread status... heartbeat thread: true , replication thread: false
    2024-09-02 11:30:31 platform > Do not terminate as feature flag is disable
    Could this be related to a server resource issue?
  • Ritvik Nagpal (09/03/2024, 7:30 AM)
    Hi everyone, I am using Airbyte to fetch all the ad performance data from Meta, and I want to parse all the creatives properly. Each type of creative is stored in a different format in the Meta tables: single image ads, carousels (image/video), flexible ads, dynamic creative ads, catalogue ads, placement customisations, partnership ads, and manual uploads vs. existing posts. It's getting difficult for me to figure out robust parsing logic. Can anyone help me out? If you have used Motion or Triple Whale for creative analytics, we want a similar outcome.
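    One hedged way to tame this: dispatch on the fields that distinguish the formats. The field names below (object_story_spec, asset_feed_spec, product_set_id, child_attachments) follow Meta's Marketing API docs, but verify them against the actual stream schema from source-facebook-marketing before relying on this.
    # Hedged sketch: route each ad-creative record to a per-format parser by probing
    # distinguishing fields. Field names are assumptions from Meta's API docs.
    def parse_creative(c: dict) -> dict:
        spec = c.get("object_story_spec") or {}
        if c.get("asset_feed_spec"):                           # dynamic / flexible creatives
            return {"type": "dynamic", "assets": c["asset_feed_spec"]}
        if (spec.get("link_data") or {}).get("child_attachments"):
            return {"type": "carousel", "cards": spec["link_data"]["child_attachments"]}
        if c.get("product_set_id"):                            # catalogue ads
            return {"type": "catalogue", "product_set_id": c["product_set_id"]}
        if spec.get("video_data"):
            return {"type": "video", "video": spec["video_data"]}
        if spec.get("link_data"):
            return {"type": "single_image", "link": spec["link_data"]}
        return {"type": "unknown", "raw": c}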