# ask-ai

    Jonathan Golden

    10/31/2024, 7:39 AM
    @kapa.ai getting this now: User "system:serviceaccount:<ksa>" cannot patch resource "pods" in API group "" in the namespace "<namespace>".
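    A Kubernetes "cannot patch" error like the one above usually means the pod's service account has no Role granting that verb in the namespace. As a minimal sketch (the role and binding names are made up for illustration; the <ksa> and <namespace> placeholders are kept from the message), a Role/RoleBinding allowing patch on pods could look like:

```yaml
# Hypothetical Role/RoleBinding sketch; the names are placeholders,
# not something confirmed in the thread. Adjust to your deployment.
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  name: pod-patcher
  namespace: <namespace>
rules:
  - apiGroups: [""]          # core API group, matching the error message
    resources: ["pods"]
    verbs: ["get", "list", "patch"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: pod-patcher-binding
  namespace: <namespace>
subjects:
  - kind: ServiceAccount
    name: <ksa>
    namespace: <namespace>
roleRef:
  kind: Role
  name: pod-patcher
  apiGroup: rbac.authorization.k8s.io
```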

    Henrik Nilsson

    10/31/2024, 7:54 AM
    My connection between Postgres and Redshift stopped working. In the logs I can see "WARN main c.n.s.UnknownKeywordFactory(lambda$getKeyword$0):37 Unknown keyword group - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword". What does this mean?

    Girish N

    10/31/2024, 9:30 AM
    at io.airbyte.integrations.destination.mongodb.MongodbDestination.main(MongodbDestination.java:64) ~[io.airbyte.airbyte-integrations.connectors-destination-mongodb-24.0.2.jar:?]
    Stack Trace: java.util.MissingFormatArgumentException: Format specifier '%s'
        at java.base/java.util.Formatter.format(Formatter.java:2688)
        at java.base/java.util.Formatter.format(Formatter.java:2625)
        at java.base/java.lang.String.format(String.java:4143)
        at io.airbyte.integrations.destination.mongodb.MongodbDestination.buildConnectionString(MongodbDestination.java:168)
        at io.airbyte.integrations.destination.mongodb.MongodbDestination.getConnectionString(MongodbDestination.java:150)
        at io.airbyte.integrations.destination.mongodb.MongodbDestination.getDatabase(MongodbDestination.java:136)
        at io.airbyte.integrations.destination.mongodb.MongodbDestination.check(MongodbDestination.java:71)
        at io.airbyte.integrations.base.ssh.SshTunnel.sshWrap(SshTunnel.java:270)
        at io.airbyte.integrations.base.ssh.SshWrappedDestination.check(SshWrappedDestination.java:67)
        at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:124)
        at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:99)
        at io.airbyte.integrations.destination.mongodb.MongodbDestination.main(MongodbDestination.java:64)
    2024-10-31 09:20:19 platform > INFO i.a.i.b.IntegrationRunner(runInternal):197 Completed integration: io.airbyte.integrations.base.ssh.SshWrappedDestination
    2024-10-31 09:20:19 platform > INFO i.a.i.d.m.MongodbDestination(main):65 completed destination: class io.airbyte.integrations.destination.mongodb.MongodbDestination
    2024-10-31 09:20:20 platform > Check connection job received output: io.airbyte.config.StandardCheckConnectionOutput@604ecce1[status=failed,message=Format specifier '%s',additionalProperties={}]
    2024-10-31 09:20:20 platform >
    2024-10-31 09:20:20 platform > ----- END CHECK -----
    2024-10-31 09:20:21 platform >
    I am using MongoDB as the destination connector and I am getting the above error.

    Will

    10/31/2024, 10:05 AM
    What are the new ClusterRole and ClusterRoleBindings added in the v1 Helm chart?

    srinivasa

    10/31/2024, 11:07 AM
    How do I install Airbyte on an AWS EKS cluster?

    Marcello Caron

    10/31/2024, 11:56 AM
    I was using the Google Cloud Storage (GCS) connector (airbyte/source-gcs) and it was working fine (version 0.7.4). Now I updated to the latest version (0.8.1) and it's throwing this error:
    2024-10-31 11:43:50 platform > Retry State: RetryManager(completeFailureBackoffPolicy=BackoffPolicy(minInterval=PT10S, maxInterval=PT30M, base=3), partialFailureBackoffPolicy=null, successiveCompleteFailureLimit=5, totalCompleteFailureLimit=10, successivePartialFailureLimit=1000, totalPartialFailureLimit=20, successiveCompleteFailures=4, totalCompleteFailures=4, successivePartialFailures=0, totalPartialFailures=0)
    2024-10-31 11:43:50 platform > Backing off for: 4 minutes 30 seconds.
    2024-10-31 11:48:21 INFO i.a.w.l.p.s.m.Stage(apply):39 - APPLY Stage: CLAIM — (workloadId = 2a8c41ae-8c23-4be0-a73f-2ab10ca1a820_334_4_check) — (dataplaneId = local)
    2024-10-31 11:48:33 INFO i.a.c.i.LineGobbler(voidCall):166 - 
    2024-10-31 11:48:34 INFO i.a.c.i.LineGobbler(voidCall):166 - ----- START CHECK -----
    2024-10-31 11:48:34 INFO i.a.c.i.LineGobbler(voidCall):166 - 
    2024-10-31 11:48:35 INFO i.a.c.ConnectorWatcher(processConnectorOutput):114 - Connector exited, processing output
    2024-10-31 11:48:35 INFO i.a.c.ConnectorWatcher(getConnectorOutputStream):151 - Output file jobOutput.json found
    2024-10-31 11:48:35 INFO i.a.c.ConnectorWatcher(processConnectorOutput):117 - Connector exited with exit code 0
    2024-10-31 11:48:35 INFO i.a.w.i.VersionedAirbyteStreamFactory(create):189 - Reading messages from protocol version 0.2.0
    2024-10-31 11:48:35 WARN i.a.m.l.MetricClientFactory(getMetricClient):43 - MetricClient has not been initialized. Must call MetricClientFactory.CreateMetricClient before using MetricClient. Using a dummy client for now. Ignore this if Airbyte is configured to not publish any metrics.
    2024-10-31 11:48:35 INFO i.a.c.ConnectorMessageProcessor(updateConfigFromControlMessage):231 - Checking for optional control message...
    2024-10-31 11:48:35 INFO i.a.c.ConnectorWatcher(saveConnectorOutput):162 - Writing output of 2a8c41ae-8c23-4be0-a73f-2ab10ca1a820_334_4_check to the doc store
    2024-10-31 11:48:35 INFO i.a.c.ConnectorWatcher(markWorkloadSuccess):167 - Marking workload 2a8c41ae-8c23-4be0-a73f-2ab10ca1a820_334_4_check as successful
    2024-10-31 11:48:35 INFO i.a.c.i.LineGobbler(voidCall):166 - 
    2024-10-31 11:48:35 INFO i.a.c.ConnectorWatcher(exitProperly):215 - Deliberately exiting process with code 0.
    2024-10-31 11:48:35 INFO i.a.c.i.LineGobbler(voidCall):166 - ----- END CHECK -----
    2024-10-31 11:48:35 INFO i.a.c.i.LineGobbler(voidCall):166 - 
    2024-10-31 11:48:36 platform > Retry State: RetryManager(completeFailureBackoffPolicy=BackoffPolicy(minInterval=PT10S, maxInterval=PT30M, base=3), partialFailureBackoffPolicy=null, successiveCompleteFailureLimit=5, totalCompleteFailureLimit=10, successivePartialFailureLimit=1000, totalPartialFailureLimit=20, successiveCompleteFailures=5, totalCompleteFailures=5, successivePartialFailures=0, totalPartialFailures=0)
     Backoff before next attempt: 13 minutes 30 seconds
    2024-10-31 11:48:36 platform > Failing job: 334, reason: Job failed after too many retries for connection e5bcbb6b-2710-4e12-90aa-8bff33e108c4
    2024-10-31 11:48:21 INFO i.a.w.l.c.WorkloadApiClient(claim):75 - Claimed: true for 2a8c41ae-8c23-4be0-a73f-2ab10ca1a820_334_4_check via API for local
    2024-10-31 11:48:21 INFO i.a.w.l.p.s.m.Stage(apply):39 - APPLY Stage: CHECK_STATUS — (workloadId = 2a8c41ae-8c23-4be0-a73f-2ab10ca1a820_334_4_check) — (dataplaneId = local)
    2024-10-31 11:48:22 INFO i.a.w.l.p.s.CheckStatusStage(applyStage):59 - No pod found running for workload 2a8c41ae-8c23-4be0-a73f-2ab10ca1a820_334_4_check
    2024-10-31 11:48:22 INFO i.a.w.l.p.s.m.Stage(apply):39 - APPLY Stage: BUILD — (workloadId = 2a8c41ae-8c23-4be0-a73f-2ab10ca1a820_334_4_check) — (dataplaneId = local)
    2024-10-31 11:48:22 INFO i.a.w.l.p.s.m.Stage(apply):39 - APPLY Stage: MUTEX — (workloadId = 2a8c41ae-8c23-4be0-a73f-2ab10ca1a820_334_4_check) — (dataplaneId = local)
    2024-10-31 11:48:22 INFO i.a.w.l.p.s.EnforceMutexStage(applyStage):50 - No mutex key specified for workload: 2a8c41ae-8c23-4be0-a73f-2ab10ca1a820_334_4_check. Continuing...
    2024-10-31 11:48:22 INFO i.a.w.l.p.s.m.Stage(apply):39 - APPLY Stage: LAUNCH — (workloadId = 2a8c41ae-8c23-4be0-a73f-2ab10ca1a820_334_4_check) — (dataplaneId = local)
    2024-10-31 11:48:26 INFO i.a.w.l.c.WorkloadApiClient(updateStatusToLaunched):60 - Attempting to update workload: 2a8c41ae-8c23-4be0-a73f-2ab10ca1a820_334_4_check to LAUNCHED.
    2024-10-31 11:48:26 INFO i.a.w.l.p.h.SuccessHandler(accept):60 - Pipeline completed for workload: 2a8c41ae-8c23-4be0-a73f-2ab10ca1a820_334_4_check.
    Already rolled back to make it work.

    Euan Blackledge

    10/31/2024, 12:35 PM
    @kapa.ai, how does airbyte store the passwords or the aws access keys for connections in its database? What table is used? How are these stored?

    Marcello Caron

    10/31/2024, 12:45 PM
    Tried to use the dbt source connector (airbyte/source-dbt) and it returned the error "'Unauthorized. Please ensure you are authenticated correctly.'"

    Ishan Anilbhai Koradiya

    10/31/2024, 12:52 PM
    Hi @kapa.ai, how do I uninstall Airbyte on Kubernetes?

    srinivasa

    10/31/2024, 1:06 PM
    Is a DB required to install Airbyte?

    srinivasa

    10/31/2024, 1:22 PM
    Could you share a sample YAML file to install Airbyte on AWS EKS?

    Tobias Willi

    10/31/2024, 1:37 PM
    @kapa.ai when creating connectors, where do I specify the operating system?

    Talha Naeem

    10/31/2024, 2:31 PM
    @kapa.ai I am getting this warning and an error in the airbyte-workload-launcher pod logs. Can you help me identify the issue:
    Caused by: io.grpc.netty.shaded.io.netty.channel.AbstractChannel$AnnotatedConnectException: finishConnect(..) failed: Connection refused: airbyte-temporal/10.100.66.5:7233
    Caused by: java.net.ConnectException: finishConnect(..) failed: Connection refused
    	at io.grpc.netty.shaded.io.netty.channel.unix.Errors.newConnectException0(Errors.java:166) ~[grpc-netty-shaded-1.66.0.jar:1.66.0]
    	at io.grpc.netty.shaded.io.netty.channel.unix.Errors.handleConnectErrno(Errors.java:131) ~[grpc-netty-shaded-1.66.0.jar:1.66.0]
    	at io.grpc.netty.shaded.io.netty.channel.unix.Socket.finishConnect(Socket.java:359) ~[grpc-netty-shaded-1.66.0.jar:1.66.0]
    	at io.grpc.netty.shaded.io.netty.channel.epoll.AbstractEpollChannel$AbstractEpollUnsafe.doFinishConnect(AbstractEpollChannel.java:710) ~[grpc-netty-shaded-1.66.0.jar:1.66.0]
    	at io.grpc.netty.shaded.io.netty.channel.epoll.AbstractEpollChannel$AbstractEpollUnsafe.finishConnect(AbstractEpollChannel.java:687) ~[grpc-netty-shaded-1.66.0.jar:1.66.0]
    	at io.grpc.netty.shaded.io.netty.channel.epoll.AbstractEpollChannel$AbstractEpollUnsafe.epollOutReady(AbstractEpollChannel.java:567) ~[grpc-netty-shaded-1.66.0.jar:1.66.0]
    	at io.grpc.netty.shaded.io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:499) ~[grpc-netty-shaded-1.66.0.jar:1.66.0]
    	at io.grpc.netty.shaded.io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:407) ~[grpc-netty-shaded-1.66.0.jar:1.66.0]
    	at io.grpc.netty.shaded.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997) ~[grpc-netty-shaded-1.66.0.jar:1.66.0]
    	at io.grpc.netty.shaded.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) ~[grpc-netty-shaded-1.66.0.jar:1.66.0]
    	at io.grpc.netty.shaded.io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) ~[grpc-netty-shaded-1.66.0.jar:1.66.0]
    	... 1 more
    2024-10-31 14:24:42 WARN i.t.i.w.Poller$PollerUncaughtExceptionHandler(logPollErrors):324 - Failure in poller thread Workflow Poller taskQueue="workload_default", namespace="default": 1
    io.grpc.StatusRuntimeException: UNAVAILABLE: io exception
    2024-10-31 14:27:50 ERROR i.a.w.l.p.h.FailureHandler(apply):39 - Pipeline Error
    io.airbyte.workload.launcher.pipeline.stages.model.StageError: io.airbyte.workers.exception.KubeClientException: Failed to create pod ce-mailchimp-check-ab0d555a-cbc9-48ad-b61b-4e59b2c43f9b-0-kqnfe.
    	at io.airbyte.workload.launcher.pipeline.stages.model.Stage.apply(Stage.kt:46) ~[io.airbyte-airbyte-workload-launcher-1.0.0.jar:?]
    	at io.airbyte.workload.launcher.pipeline.stages.LaunchPodStage.apply(LaunchPodStage.kt:38) ~[io.airbyte-airbyte-workload-launcher-1.0.0.jar:?]
    	at io.airbyte.workload.launcher.pipeline.stages.$LaunchPodStage$Definition$Intercepted.$$access$$apply(Unknown Source) ~[io.airbyte-airbyte-workload-launcher-1.0.0.jar:?]
    	at io.airbyte.workload.launcher.pipeline.stages.$LaunchPodStage$Definition$Exec.dispatch(Unknown Source) ~[io.airbyte-airbyte-workload-launcher-1.0.0.jar:?]
    	at io.micronaut.context.AbstractExecutableMethodsDefinition$DispatchedExecutableMethod.invoke(AbstractExecutableMethodsDefinition.java:456) ~[micronaut-inject-4.6.5.jar:4.6.5]
    	at io.micronaut.aop.chain.MethodInterceptorChain.proceed(MethodInterceptorChain.java:134) ~[micronaut-aop-4.6.5.jar:4.6.5]
    	at io.airbyte.metrics.interceptors.InstrumentInterceptorBase.doIntercept(InstrumentInterceptorBase.kt:61) ~[io.airbyte.airbyte-metrics-metrics-lib-1.0.0.jar:?]
    	at io.airbyte.metrics.interceptors.InstrumentInterceptorBase.intercept(InstrumentInterceptorBase.kt:44) ~[io.airbyte.airbyte-metrics-metrics-lib-1.0.0.jar:?]
    	at io.micronaut.aop.chain.MethodInterceptorChain.proceed(MethodInterceptorChain.java:143) ~[micronaut-aop-4.6.5.jar:4.6.5]
    	at io.airbyte.workload.launcher.pipeline.stages.$LaunchPodStage$Definition$Intercepted.apply(Unknown Source) ~[io.airbyte-airbyte-workload-launcher-1.0.0.jar:?]
    	at io.airbyte.workload.launcher.pipeline.stages.LaunchPodStage.apply(LaunchPodStage.kt:24) ~[io.airbyte-airbyte-workload-launcher-1.0.0.jar:?]
    	at reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:132) ~[reactor-core-3.6.9.jar:3.6.9]
    	at reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:158) ~[reactor-core-3.6.9.jar:3.6.9]
    	at reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:158) ~[reactor-core-3.6.9.jar:3.6.9]
    	at reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:158) ~[reactor-core-3.6.9.jar:3.6.9]
    	at reactor.core.publisher.Operators$ScalarSubscription.request(Operators.java:2571) ~[reactor-core-3.6.9.jar:3.6.9]
    	at reactor.core.publisher.MonoFlatMap$FlatMapMain.request(MonoFlatMap.java:194) ~[reactor-core-3.6.9.jar:3.6.9]
    	at reactor.core.publisher.MonoFlatMap$FlatMapMain.request(MonoFlatMap.java:194) ~[reactor-core-3.6.9.jar:3.6.9]
    	at reactor.core.publisher.MonoFlatMap$FlatMapMain.request(MonoFlatMap.java:194) ~[reactor-core-3.6.9.jar:3.6.9]
    	at reactor.core.publisher.MonoFlatMap$FlatMapMain.request(MonoFlatMap.java:194) ~[reactor-core-3.6.9.jar:3.6.9]
    	at reactor.core.publisher.Operators$MultiSubscriptionSubscriber.set(Operators.java:2367) ~[reactor-core-3.6.9.jar:3.6.9

    Carlos Bernal Carvajal

    10/31/2024, 3:16 PM
    @kapa.ai My syncs are failing with this error:
    2024-10-31 15:12:26 platform > readFromSource: exception caught
    io.airbyte.workers.exception.WorkerException: A stream status (public.GridStatistics) has been detected for a stream not present in the catalog
            at io.airbyte.workers.helper.StreamStatusCompletionTracker.track(StreamStatusCompletionTracker.kt:36) ~[io.airbyte-airbyte-commons-worker-0.63.13.jar:?]
            at io.airbyte.workers.general.BufferedReplicationWorker.readFromSource(BufferedReplicationWorker.java:361) ~[io.airbyte-airbyte-commons-worker-0.63.13.jar:?]
            at io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsyncWithHeartbeatCheck$3(BufferedReplicationWorker.java:242) ~[io.airbyte-airbyte-commons-worker-0.63.13.jar:?]
            at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
            at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
            at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
    Can you tell me why and how to fix it?

    Talha Naeem

    10/31/2024, 4:09 PM
    I have set up the JOB_KUBE_TOLERATIONS env variable correctly in my Helm values, and we can see it in the airbyte-workload-launcher pod as well. But my sync pods are not being created with this toleration applied. Can you please help me out here?
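    For reference, a values.yaml fragment passing the toleration is sketched below. It assumes (from the Airbyte env-var docs as I recall them; verify against your version) that JOB_KUBE_TOLERATIONS takes ','-separated key=value pairs per toleration, with ';' between tolerations. The taint key and value here are illustrative placeholders, not from the thread:

```yaml
# Sketch only -- the taint key/value are placeholders, and the
# "k=v,...;..." format should be verified against your Airbyte version.
global:
  env_vars:
    JOB_KUBE_TOLERATIONS: "key=dedicated,operator=Equal,value=airbyte-jobs,effect=NoSchedule"
```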

    Hercules Gabriel da Silva e Mazucato

    10/31/2024, 4:35 PM
    In the connector-sidecar container running on OpenShift, I am getting this WARN message while performing discover actions: "JAXB is unavailable. Will fallback to SDK implementation which may be less performant. If you are using Java 9+, you will need to include javax.xml.bind:jaxb-api as a dependency."

    Mor Iluz

    10/31/2024, 4:44 PM
    If we are using Airbyte Cloud with a source connector from the marketplace and it is missing some data that we need, what are our options? The missing data is a nested stream, so apparently it requires using the CDK, which we can't use in the Cloud version.

    Josh Taylor

    10/31/2024, 5:28 PM
    I'm getting this error when running abctl local install --values ./values.yaml --insecure-cookies:
    INFO    Using Kubernetes provider:
                Provider: kind
                Kubeconfig: /home/ec2-user/.airbyte/abctl/abctl.kubeconfig
                Context: kind-airbyte-abctl
     SUCCESS  Found Docker installation: version 25.0.6                                                                                                                                                                                                         
     SUCCESS  Existing cluster 'airbyte-abctl' found                                                                                                                                                                                                            
     SUCCESS  Cluster 'airbyte-abctl' validation complete                                                                                                                                                                                                       
      INFO    Namespace 'airbyte-abctl' already exists                                                                                                                                                                                                          
      INFO    Persistent volume 'airbyte-minio-pv' already exists                                                                                                                                                                                               
      INFO    Persistent volume 'airbyte-volume-db' already exists                                                                                                                                                                                              
      INFO    Persistent volume claim 'airbyte-minio-pv-claim-airbyte-minio-0' already exists                                                                                                                                                                   
      INFO    Persistent volume claim 'airbyte-volume-db-airbyte-db-0' already exists                                                                                                                                                                           
      INFO    Starting Helm Chart installation of 'airbyte/airbyte' (version: 1.1.1)                                                                                                                                                                            
      ERROR   Failed to install airbyte/airbyte Helm Chart                                                                                                                                                                                                      
      ERROR   Unable to install Airbyte locally                                                                                                                                                                                                                 
      ERROR   unable to install airbyte chart: unable to install helm: failed to create patch: The order in patch list:
              [map[name:JOB_MAIN_CONTAINER_MEMORY_REQUEST value:2Gi] map[name:JOB_MAIN_CONTAINER_MEMORY_REQUEST valueFrom:map[configMapKeyRef:map[key:JOB_MAIN_CONTAINER_MEMORY_REQUEST name:airbyte-abctl-airbyte-env]]] map[name:JOB_MAIN_CONTAINER_MEMORY_LIMIT value:4Gi] map[name:JOB_MAIN_CONTAINER_MEMORY_LIMIT valueFrom:map[configMapKeyRef:map[key:JOB_MAIN_CONTAINER_MEMORY_LIMIT name:airbyte-abctl-airbyte-env]]] map[name:SECRET_PERSISTENCE value:<nil>]]
               doesn't match $setElementOrder list:
              [map[name:API_AUTHORIZATION_ENABLED] map[name:LOG_LEVEL] map[name:LOG4J_CONFIGURATION_FILE] map[name:AIRBYTE_API_HOST] map[name:AIRBYTE_VERSION] map[name:AIRBYTE_EDITION] map[name:AIRBYTE_URL] map[name:CONFIG_ROOT] map[name:MICROMETER_METRICS_ENABLED] map[name:MICROMETER_METRICS_STATSD_FLAVOR] map[name:MICRONAUT_ENVIRONMENTS] map[name:SEGMENT_WRITE_KEY] map[name:STATSD_HOST] map[name:STATSD_PORT] map[name:TRACKING_STRATEGY] map[name:WORKER_ENVIRONMENT] map[name:WORKSPACE_ROOT] map[name:WEBAPP_URL] map[name:TEMPORAL_HOST] map[name:JOB_MAIN_CONTAINER_CPU_REQUEST] map[name:JOB_MAIN_CONTAINER_CPU_LIMIT] map[name:JOB_MAIN_CONTAINER_MEMORY_REQUEST] map[name:JOB_MAIN_CONTAINER_MEMORY_LIMIT] map[name:CONFIGS_DATABASE_MINIMUM_FLYWAY_MIGRATION_VERSION] map[name:JOBS_DATABASE_MINIMUM_FLYWAY_MIGRATION_VERSION] map[name:KEYCLOAK_INTERNAL_HOST] map[name:CONNECTOR_BUILDER_SERVER_API_HOST] map[name:AIRBYTE_API_AUTH_HEADER_NAME] map[name:AIRBYTE_API_AUTH_HEADER_VALUE] map[name:ENTERPRISE_SOURCE_STUBS_URL] map[name:AB_INSTANCE_ADMIN_PASSWORD] map[name:AB_INSTANCE_ADMIN_CLIENT_ID] map[name:AB_INSTANCE_ADMIN_CLIENT_SECRET] map[name:AB_JWT_SIGNATURE_SECRET] map[name:AB_COOKIE_SECURE] map[name:AB_COOKIE_SAME_SITE] map[name:SECRET_PERSISTENCE] map[name:S3_PATH_STYLE_ACCESS] map[name:STORAGE_TYPE] map[name:STORAGE_BUCKET_ACTIVITY_PAYLOAD] map[name:STORAGE_BUCKET_LOG] map[name:STORAGE_BUCKET_STATE] map[name:STORAGE_BUCKET_WORKLOAD_OUTPUT] map[name:AWS_ACCESS_KEY_ID] map[name:AWS_SECRET_ACCESS_KEY] map[name:MINIO_ENDPOINT] map[name:DATABASE_HOST] map[name:DATABASE_PORT] map[name:DATABASE_DB] map[name:DATABASE_USER] map[name:DATABASE_PASSWORD] map[name:DATABASE_URL] map[name:AIRBYTE_INSTALLATION_ID] map[name:BASIC_AUTH_PASSWORD] map[name:JOB_MAIN_CONTAINER_MEMORY_LIMIT] map[name:JOB_MAIN_CONTAINER_MEMORY_REQUEST] map[name:TEMPORAL_HISTORY_RETENTION_IN_DAYS]]
               && failed to create patch: The order in patch list:
              [map[name:JOB_MAIN_CONTAINER_MEMORY_REQUEST valueFrom:map[configMapKeyRef:map[key:JOB_MAIN_CONTAINER_MEMORY_REQUEST name:airbyte-abctl-airbyte-env]]] map[name:JOB_MAIN_CONTAINER_MEMORY_REQUEST value:2Gi] map[name:JOB_MAIN_CONTAINER_MEMORY_LIMIT value:4Gi] map[name:JOB_MAIN_CONTAINER_MEMORY_LIMIT valueFrom:map[configMapKeyRef:map[key:JOB_MAIN_CONTAINER_MEMORY_LIMIT name:airbyte-abctl-airbyte-env]]] map[name:SECRET_PERSISTENCE value:<nil>]]
               doesn't match $setElementOrder list:
              [map[name:AIRBYTE_VERSION] map[name:CONFIG_ROOT] map[name:LOG_LEVEL] map[name:LOG4J_CONFIGURATION_FILE] map[name:MICROMETER_METRICS_ENABLED] map[name:MICROMETER_METRICS_STATSD_FLAVOR] map[name:SEGMENT_WRITE_KEY] map[name:STATSD_HOST] map[name:STATSD_PORT] map[name:TRACKING_STRATEGY] map[name:WORKSPACE_DOCKER_MOUNT] map[name:WORKSPACE_ROOT] map[name:LOCAL_ROOT] map[name:WEBAPP_URL] map[name:TEMPORAL_HOST] map[name:TEMPORAL_WORKER_PORTS] map[name:JOB_KUBE_NAMESPACE] map[name:JOB_KUBE_SERVICEACCOUNT] map[name:JOB_MAIN_CONTAINER_CPU_REQUEST] map[name:JOB_MAIN_CONTAINER_CPU_LIMIT] map[name:JOB_MAIN_CONTAINER_MEMORY_REQUEST] map[name:JOB_MAIN_CONTAINER_MEMORY_LIMIT] map[name:INTERNAL_API_HOST] map[name:WORKLOAD_API_HOST] map[name:WORKLOAD_API_BEARER_TOKEN] map[name:CONFIGS_DATABASE_MINIMUM_FLYWAY_MIGRATION_VERSION] map[name:JOBS_DATABASE_MINIMUM_FLYWAY_MIGRATION_VERSION] map[name:METRIC_CLIENT] map[name:OTEL_COLLECTOR_ENDPOINT] map[name:ACTIVITY_MAX_ATTEMPT] map[name:ACTIVITY_INITIAL_DELAY_BETWEEN_ATTEMPTS_SECONDS] map[name:ACTIVITY_MAX_DELAY_BETWEEN_ATTEMPTS_SECONDS] map[name:WORKFLOW_FAILURE_RESTART_DELAY_SECONDS] map[name:SHOULD_RUN_NOTIFY_WORKFLOWS] map[name:MICRONAUT_ENVIRONMENTS] map[name:AIRBYTE_API_AUTH_HEADER_NAME] map[name:AIRBYTE_API_AUTH_HEADER_VALUE] map[name:SECRET_PERSISTENCE] map[name:S3_PATH_STYLE_ACCESS] map[name:STORAGE_TYPE] map[name:STORAGE_BUCKET_ACTIVITY_PAYLOAD] map[name:STORAGE_BUCKET_LOG] map[name:STORAGE_BUCKET_STATE] map[name:STORAGE_BUCKET_WORKLOAD_OUTPUT] map[name:AWS_ACCESS_KEY_ID] map[name:AWS_SECRET_ACCESS_KEY] map[name:MINIO_ENDPOINT] map[name:DATABASE_HOST] map[name:DATABASE_PORT] map[name:DATABASE_DB] map[name:DATABASE_USER] map[name:DATABASE_PASSWORD] map[name:DATABASE_URL] map[name:AIRBYTE_INSTALLATION_ID] map[name:BASIC_AUTH_PASSWORD] map[name:CONTAINER_ORCHESTRATOR_ENABLED] map[name:JOB_MAIN_CONTAINER_MEMORY_LIMIT] map[name:JOB_MAIN_CONTAINER_MEMORY_REQUEST] map[name:TEMPORAL_HISTORY_RETENTION_IN_DAYS]]
               && cannot patch "airbyte-abctl-workload-launcher" with kind Deployment: Deployment.apps "airbyte-abctl-workload-launcher" is invalid: [spec.template.spec.containers[0].env[30].valueFrom: Invalid value: "": may not be specified when `value` is not empty, spec.template.spec.containers[0].env[31].valueFrom: Invalid value: "": may not be specified when `value` is not empty]
    My values.yaml:
    global:
      env_vars:
        BASIC_AUTH_PASSWORD: "<password>"
        JOB_MAIN_CONTAINER_MEMORY_REQUEST: 2Gi
        JOB_MAIN_CONTAINER_MEMORY_LIMIT: 4Gi
        TEMPORAL_HISTORY_RETENTION_IN_DAYS: 7
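    The patch conflict above suggests that JOB_MAIN_CONTAINER_MEMORY_REQUEST and JOB_MAIN_CONTAINER_MEMORY_LIMIT are already rendered into the deployment from the chart's configmap, so redefining them under global.env_vars produces duplicate env entries (one with value, one with valueFrom). A possible workaround, assuming your chart version exposes job resources under global.jobs.resources (an assumption worth verifying), is to drop the two variables from env_vars:

```yaml
# Hypothetical alternative values.yaml -- assumes the chart maps
# global.jobs.resources onto the JOB_MAIN_CONTAINER_MEMORY_* settings.
global:
  env_vars:
    BASIC_AUTH_PASSWORD: "<password>"
    TEMPORAL_HISTORY_RETENTION_IN_DAYS: 7
  jobs:
    resources:
      requests:
        memory: 2Gi
      limits:
        memory: 4Gi
```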

    Brian Bolt

    10/31/2024, 6:30 PM
    I am running a sync which logs a lot and the view logs button isn’t working because it’s too much data. What kubectl deployment or pod can I monitor to see the current logs of the job?

    Jesus Rivero

    10/31/2024, 11:36 PM
    I tried to set up IMAGE_PULL_SECRET on Airbyte, but the orchestrator job is not setting the pullSecret config on the pod.
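    If the chart version in use supports it (an assumption to verify, not something confirmed in the thread), registry credentials are usually wired through the values file rather than a raw environment variable; a sketch:

```yaml
# Sketch only -- assumes the Helm chart honors global.imagePullSecrets;
# the secret name is a placeholder for a docker-registry secret you created.
global:
  imagePullSecrets:
    - name: my-registry-credentials
```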

    ABHISHEK TRIPATHI

    11/01/2024, 12:22 AM
    How do I add Debezium configs like incremental.snapshot.chunk.size to the Airbyte SQL Server source connector?

    Ben Tennyson

    11/01/2024, 3:39 AM
    @kapa.ai Why does calling the /v1/jobs API always return the error 403 Forbidden?

    Tom Evers

    11/01/2024, 4:09 AM
    @kapa.ai Can I unzip a gzip file via the UI connector builder?

    coder xu

    11/01/2024, 7:31 AM
    With the K8s Helm chart, how do I use a private connector-sidecar image?

    Peter Holterman

    11/01/2024, 8:27 AM
    What is the meaning of the fields ab_cdc_cursor and ab_cdc_updated_at?

    Teddy Kosciuszek

    11/01/2024, 8:46 AM
    @kapa.ai I am running PyAirbyte and using the source-google-sheets connector to extract data. I keep receiving the following error with airbyte==0.19.0, indicating an issue with the ConnectorStateManager:
    2024-11-01 09:29:15 - INFO - ConnectorStateManager.__init__() got an unexpected keyword argument 'stream_instance_map'
    Traceback (most recent call last):
      File "/Users/teddykosciuszek/Documents/CodeRepos/data-infra-airflow/dags/airbyte/.venv-source-google-sheets/bin/source-google-sheets", line 8, in <module>
        sys.exit(run())
      File "/Users/teddykosciuszek/Documents/CodeRepos/data-infra-airflow/dags/airbyte/.venv-source-google-sheets/lib/python3.10/site-packages/source_google_sheets/run.py", line 15, in run
        launch(source, sys.argv[1:])
      File "/Users/teddykosciuszek/Documents/CodeRepos/data-infra-airflow/dags/airbyte/.venv-source-google-sheets/lib/python3.10/site-packages/airbyte_cdk/entrypoint.py", line 234, in launch
        for message in source_entrypoint.run(parsed_args):
      File "/Users/teddykosciuszek/Documents/CodeRepos/data-infra-airflow/dags/airbyte/.venv-source-google-sheets/lib/python3.10/site-packages/airbyte_cdk/entrypoint.py", line 122, in run
        yield from map(AirbyteEntrypoint.airbyte_message_to_string, self.read(source_spec, config, config_catalog, state))
      File "/Users/teddykosciuszek/Documents/CodeRepos/data-infra-airflow/dags/airbyte/.venv-source-google-sheets/lib/python3.10/site-packages/airbyte_cdk/entrypoint.py", line 164, in read
        for message in self.source.read(self.logger, config, catalog, state):
      File "/Users/teddykosciuszek/Documents/CodeRepos/data-infra-airflow/dags/airbyte/.venv-source-google-sheets/lib/python3.10/site-packages/source_google_sheets/source.py", line 244, in read
        yield from self._read(logger, config, catalog, state)
      File "/Users/teddykosciuszek/Documents/CodeRepos/data-infra-airflow/dags/airbyte/.venv-source-google-sheets/lib/python3.10/site-packages/source_google_sheets/source.py", line 166, in _read
        state_manager = ConnectorStateManager(stream_instance_map=stream_instances, state=state or {})
    TypeError: ConnectorStateManager.__init__() got an unexpected keyword argument 'stream_instance_map'

    Zikra Wahyudi Nazir

    11/01/2024, 9:45 AM
    I attempted to connect to an RDS PostgreSQL instance as a source database using Airbyte Cloud, but encountered the following error:
    Error Message:
    "Configuration check failed: HikariPool-1 - Connection is not available, request timed out after 10002ms (total=0, active=0, idle=0, waiting=0)"
    I have already allowed inbound traffic from Airbyte's IP addresses and set the SSL Method to "require." What steps can I take to resolve this issue?

    Al-Moatasem Mohammad

    11/01/2024, 12:55 PM
    I have a question related to the BigQuery destination. When I load data using the standard approach, can you list the full steps applied, from the point the BigQuery connector receives the data from upstream through the load (which involves deduplicating/overwriting)? I need the steps in simple, plain English, and I will elaborate.

    james 98

    11/01/2024, 12:57 PM
    For a connection from GA4 to Postgres (Postgres 17), raw JSON tables were automatically normalised without a tab to choose normalisation. Is this happening automatically?