user
10/22/2024, 9:22 PM
s3-access-key-id and s3-secret-access-key, and that I should invoke their values using the key global.storage.storageSecretName. This breaks the Helm parser. According to an archived thread on Slack it should instead be global.storage.secretName. At least with this key the parser doesn't break here. However, the Helm deployment then fails because Secret.stringData.AWS_ACCESS_KEY_ID and Secret.stringData.AWS_SECRET_ACCESS_KEY have unknown objects of type "nil". It seems that the supplied secrets never get picked up. Adding Secret.stringData.AWS_ACCESS_KEY_ID and Secret.stringData.AWS_SECRET_ACCESS_KEY to my K8s secret does not help.
There is also a side note in the documentation that if I want to use another S3-compatible interface, the endpoint key should be supplied. The documentation does not mention where this key should go. Perhaps it belongs under global.storage.s3, but I can't test that since the deployment fails anyhow.
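For anyone else stuck here, a minimal sketch of the combination described above: a storage block in values.yaml plus a Kubernetes Secret carrying the two AWS keys the parser complains about. The key names and placement (secretName vs. storageSecretName, and endpoint under global.storage.s3) follow the discussion above and are assumptions, not verified against the chart.
```yaml
# values.yaml (sketch; key names per the discussion above, not verified against the chart)
global:
  storage:
    type: s3
    secretName: airbyte-storage-secrets        # the spelling that did not break the parser here
    s3:
      region: eu-west-1
      endpoint: https://minio.example.internal # assumption: where an S3-compatible endpoint might go
# bucket names and other required storage keys omitted
---
# Kubernetes Secret referenced above, carrying the keys reported as "nil"
apiVersion: v1
kind: Secret
metadata:
  name: airbyte-storage-secrets
type: Opaque
stringData:
  AWS_ACCESS_KEY_ID: "<s3 access key id>"
  AWS_SECRET_ACCESS_KEY: "<s3 secret access key>"
```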
Secret Management
According to the documentation, the four supported secret managers are AWS Secrets Manager, Google Secrets Manager, Azure Key Vault, and HashiCorp Vault. However, there are only configuration examples for the first three. I asked ChatGPT whether it knew how to configure HashiCorp Vault as a secrets manager for Airbyte. It gave a suggestion that looked promising, but the deployment failed due to Helm parsing errors.
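Since the documentation gap is exactly the missing Vault example, the following is only a guess at what such a block might look like, modeled on the shape of the documented providers and on the platform's VAULT_ADDRESS / VAULT_PREFIX / VAULT_AUTH_TOKEN environment variables. None of these key names are confirmed, which is the point being made above.
```yaml
# values.yaml (unverified guess; every key name here is an assumption)
global:
  secretsManager:
    type: VAULT                                      # assumed enum, by analogy with SECRET_PERSISTENCE=VAULT
    vault:
      address: https://vault.example.internal:8200   # assumed to map to VAULT_ADDRESS
      prefix: secret/airbyte                         # assumed to map to VAULT_PREFIX
      authToken: "<vault token>"                     # assumed to map to VAULT_AUTH_TOKEN
```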
External database
The documentation first outlines how to disable the internal Postgres database by setting postgresql.enabled: false. This makes the Helm parser fail, complaining about "nil" in Secret.stringData.DATABASE_PASSWORD and Secret.stringData.DATABASE_USER.
Skipping the postgresql.enabled: false setting allows the deployment to continue, but the airbyte-bootloader pod then fails because it cannot connect to the external database. After a lot of trial and error, and printenv inside the failing pod, I came to the conclusion that the only way to get at least something to work is to set the env DATABASE_URL to something like jdbc:postgresql://external-db.com:5432/airbyte. That was the only way to make the bootloader care about anything other than references to a K8s-internal service.
It still ultimately fails, though, because the temporal pod fails to connect to a database. Probably another reference somewhere that doesn't get set properly.
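A hedged sketch of where this trial and error points: keep postgresql.enabled: false, but also provide the two secret keys the parser reports as nil and set DATABASE_URL explicitly on the bootloader. The key placements below follow the error messages and environment variables mentioned above; they are assumptions, not a verified recipe.
```yaml
# values.yaml (sketch; placements assumed from the errors above, not a verified recipe)
postgresql:
  enabled: false                          # disable the bundled Postgres
externalDatabase:                         # assumption: external DB block used by the chart
  host: external-db.com
  port: 5432
  database: airbyte
  user: airbyte
global:
  database:
    secretName: airbyte-db-secrets        # assumption: where the chart looks for the credentials secret
airbyte-bootloader:
  extraEnv:
    - name: DATABASE_URL                  # the env var that finally made the bootloader look outside the cluster
      value: jdbc:postgresql://external-db.com:5432/airbyte
---
# Secret carrying the keys reported as "nil"
apiVersion: v1
kind: Secret
metadata:
  name: airbyte-db-secrets
type: Opaque
stringData:
  DATABASE_USER: airbyte
  DATABASE_PASSWORD: "<password>"
```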
All in all, the current state of the documentation and Helm chart is a mess.
airbytehq/airbyte
Cale Anderson
10/22/2024, 9:54 PM
Ethan Brown
10/23/2024, 12:15 AM
message='activity ScheduleToStart timeout', timeoutType=TIMEOUT_TYPE_SCHEDULE_TO_START
It looks like some others have reported this but I don't see any resolutions. Has anyone else encountered and solved this before?
user
10/23/2024, 1:30 AM
The issue is with the node-viewer ClusterRole created by the Helm chart. As this is a cluster-wide resource, it conflicts when I try to deploy Airbyte to a second namespace.
I believe this is an avoidable conflict, but it needs either:
• a configuration option, something like .Values.serviceAccount.createClusterRole
• a check to see whether it already exists or not, something like if not (lookup "rbac.authorization.k8s.io/v1" "ClusterRole" "" "node-viewer") (see the sketch below)
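To illustrate the second option, here is a minimal sketch of how such a guard could look in the chart's ClusterRole template. Helm's lookup function takes apiVersion, kind, namespace (empty for cluster-scoped resources) and name, and returns an empty value when nothing is found; note that it also returns empty during helm template / --dry-run, so the resource would still render there. The rules shown are placeholders, not the chart's actual definition.
```yaml
# templates/node-viewer-clusterrole.yaml (sketch; rules are placeholders)
{{- if not (lookup "rbac.authorization.k8s.io/v1" "ClusterRole" "" "node-viewer") }}
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRole
metadata:
  name: node-viewer
rules:
  - apiGroups: [""]
    resources: ["nodes"]
    verbs: ["get", "list", "watch"]
{{- end }}
```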
airbytehq/airbyte
Sunil Jimenez
10/23/2024, 2:20 AM
user
10/23/2024, 4:50 AM
Shubham
10/23/2024, 7:44 AM
WareIQ (https://documenter.getpostman.com/view/17076115/U16nM5Tu#6bc956d6-d020-4029-b290-f78a7c72bc27), using the no-code connector builder.
For the Orders endpoint, we have the options shown in the attached image as the payload for the POST request.
I need to insert the start and end date to make this stream incremental, but there is only one place where I need to provide both the start and end date in the payload (as a range/list). How do I do this in the no-code connector builder?
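One way this can be expressed in the builder's YAML view (a sketch, assuming a DatetimeBasedCursor and a payload field that takes both dates as a list; the names date_range, order_date and start_date are made up for illustration): skip the start/end "inject into" options and instead reference the current slice's boundaries directly in the request body via stream_interval.
```yaml
# Declarative stream sketch (field names are illustrative assumptions)
incremental_sync:
  type: DatetimeBasedCursor
  cursor_field: order_date
  datetime_format: "%Y-%m-%d"
  start_datetime: "{{ config['start_date'] }}"
  end_datetime: "{{ now_utc().strftime('%Y-%m-%d') }}"
retriever:
  type: SimpleRetriever
  requester:
    type: HttpRequester
    http_method: POST
    request_body_json:
      # single payload key expecting both dates as a range/list
      date_range:
        - "{{ stream_interval.start_time }}"
        - "{{ stream_interval.end_time }}"
```
I haven't verified that interpolation inside a list in request_body_json is evaluated; if it isn't, building the range as a single string (or a custom transformation) may be needed instead.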
Sergi Gómez
10/23/2024, 8:07 AM
Seb J
10/23/2024, 8:26 AM
user
10/23/2024, 8:31 AM
user
10/23/2024, 8:41 AM
Sergi Gómez
10/23/2024, 8:58 AM
user
10/23/2024, 9:05 AM
getResultSet not implemented
I’ve tried using different versions of the ClickHouse JDBC driver, but the error persists. Has anyone experienced this issue before or found a workaround for it? Any suggestions on how to resolve this would be greatly appreciated! Below are more detailed logs from the error:
2024-10-18 10:55:45 source > 2024-10-18 10:55:45 ERROR i.a.c.d.j.StreamingJdbcDatabase$1(tryAdvance):109 - SQLState: 0A000, Message: getResultSet not implemented
2024-10-18 10:56:01 source > 2024-10-18 10:56:01 ERROR i.a.c.u.CompositeIterator(close):126 - exception while closing
2024-10-18 10:56:01 source > java.lang.RuntimeException: java.sql.SQLFeatureNotSupportedException: getResultSet not implemented
2024-10-18 10:56:01 source > at io.airbyte.cdk.db.jdbc.StreamingJdbcDatabase.lambda$unsafeQuery$0(StreamingJdbcDatabase.java:77) ~[airbyte-cdk-core-0.20.4.jar:?]
2024-10-18 10:56:01 source > at java.base/java.util.stream.AbstractPipeline.close(AbstractPipeline.java:323) ~[?:?]
2024-10-18 10:56:01 source > at io.airbyte.commons.concurrency.VoidCallable.call(VoidCallable.java:15) ~[airbyte-cdk-dependencies-0.20.4.jar:?]
2024-10-18 10:56:01 source > at io.airbyte.commons.util.DefaultAutoCloseableIterator.close(DefaultAutoCloseableIterator.java:53) ~[airbyte-cdk-dependencies-0.20.4.jar:?]
2024-10-18 10:56:01 source > at io.airbyte.commons.util.LazyAutoCloseableIterator.close(LazyAutoCloseableIterator.java:56) ~[airbyte-cdk-dependencies-0.20.4.jar:?]
2024-10-18 10:56:01 source > at io.airbyte.commons.concurrency.VoidCallable.call(VoidCallable.java:15) ~[airbyte-cdk-dependencies-0.20.4.jar:?]
2024-10-18 10:56:01 source > at io.airbyte.commons.util.DefaultAutoCloseableIterator.close(DefaultAutoCloseableIterator.java:53) ~[airbyte-cdk-dependencies-0.20.4.jar:?]
2024-10-18 10:56:01 source > at io.airbyte.commons.concurrency.VoidCallable.call(VoidCallable.java:15) ~[airbyte-cdk-dependencies-0.20.4.jar:?]
2024-10-18 10:56:01 source > at io.airbyte.commons.util.DefaultAutoCloseableIterator.close(DefaultAutoCloseableIterator.java:53) ~[airbyte-cdk-dependencies-0.20.4.jar:?]
2024-10-18 10:56:01 source > at io.airbyte.commons.concurrency.VoidCallable.call(VoidCallable.java:15) ~[airbyte-cdk-dependencies-0.20.4.jar:?]
2024-10-18 10:56:01 source > at io.airbyte.commons.util.DefaultAutoCloseableIterator.close(DefaultAutoCloseableIterator.java:53) ~[airbyte-cdk-dependencies-0.20.4.jar:?]
2024-10-18 10:56:01 source > at io.airbyte.commons.util.CompositeIterator.close(CompositeIterator.java:124) ~[airbyte-cdk-dependencies-0.20.4.jar:?]
2024-10-18 10:56:01 source > at io.airbyte.commons.util.AutoCloseableIterators.lambda$appendOnClose$0(AutoCloseableIterators.java:106) ~[airbyte-cdk-dependencies-0.20.4.jar:?]
2024-10-18 10:56:01 source > at io.airbyte.commons.concurrency.VoidCallable.call(VoidCallable.java:15) ~[airbyte-cdk-dependencies-0.20.4.jar:?]
2024-10-18 10:56:01 source > at io.airbyte.commons.util.DefaultAutoCloseableIterator.close(DefaultAutoCloseableIterator.java:53) ~[airbyte-cdk-dependencies-0.20.4.jar:?]
2024-10-18 10:56:01 source > at io.airbyte.commons.util.AutoCloseableIterators.lambda$appendOnClose$0(AutoCloseableIterators.java:106) ~[airbyte-cdk-dependencies-0.20.4.jar:?]
2024-10-18 10:56:01 source > at io.airbyte.commons.concurrency.VoidCallable.call(VoidCallable.java:15) [airbyte-cdk-dependencies-0.20.4.jar:?]
2024-10-18 10:56:01 source > at io.airbyte.commons.util.DefaultAutoCloseableIterator.close(DefaultAutoCloseableIterator.java:53) [airbyte-cdk-dependencies-0.20.4.jar:?]
2024-10-18 10:56:01 source > at io.airbyte.cdk.integrations.base.IntegrationRunner.readSerial(IntegrationRunner.java:275) [airbyte-cdk-core-0.20.4.jar:?]
2024-10-18 10:56:01 source > at io.airbyte.cdk.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:173) [airbyte-cdk-core-0.20.4.jar:?]
2024-10-18 10:56:01 source > at io.airbyte.cdk.integrations.base.IntegrationRunner.run(IntegrationRunner.java:125) [airbyte-cdk-core-0.20.4.jar:?]
2024-10-18 10:56:01 source > at io.airbyte.integrations.source.clickhouse.ClickHouseSource.main(ClickHouseSource.java:134) [io.airbyte.airbyte-integrations.connectors-source-clickhouse-0.50.50.jar:?]
airbytehq/airbyte
Seppo Puusa
10/23/2024, 11:59 AM
user
10/23/2024, 12:46 PM
Damien Querbes
10/23/2024, 1:50 PM
values.yaml. I plan to store these secrets in GCP Secret Manager. I have already referenced gcp.json in values.yaml as described in the doc, but I don't get how to map connector secrets (stored in GCP Secret Manager) to the Airbyte keys specific to connector tokens.
Can anyone clarify this please? 🙏
user
10/23/2024, 2:33 PM
user
10/23/2024, 2:56 PM
user
10/23/2024, 3:01 PM
Nicolas Gutierrez
10/23/2024, 3:36 PM
I'm seeing "Attempted to close a destination which is already closed." and "java.io.IOException: Broken pipe". The replication orchestrator summarized the failures like so:
2024-10-23 15:00:19 replication-orchestrator > failures: [ {
"failureOrigin" : "replication",
"internalMessage" : "No exit code found.",
"externalMessage" : "Something went wrong during replication",
"metadata" : {
"attemptNumber" : 2,
"jobId" : 8
},
"stacktrace" : "java.lang.IllegalStateException: No exit code found.\n\tat io.airbyte.workers.internal.ContainerIOHandle.getExitCode(ContainerIOHandle.kt:104)\n\tat io.airbyte.workers.internal.LocalContainerAirbyteDestination.getExitValue(LocalContainerAirbyteDestination.kt:119)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.readFromDestination(BufferedReplicationWorker.java:493)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsync$2(BufferedReplicationWorker.java:215)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1583)\n",
"timestamp" : 1729695494141
}, {
"failureOrigin" : "destination",
"internalMessage" : "Destination process message delivery failed",
"externalMessage" : "Something went wrong within the destination connector",
"metadata" : {
"attemptNumber" : 2,
"jobId" : 8,
"connector_command" : "write"
},
"stacktrace" : "io.airbyte.workers.internal.exception.DestinationException: Destination process message delivery failed\n\tat io.airbyte.workers.general.BufferedReplicationWorker.writeToDestination(BufferedReplicationWorker.java:451)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsyncWithTimeout$5(BufferedReplicationWorker.java:243)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1583)\nCaused by: java.io.IOException: Broken pipe\n\tat java.base/sun.nio.ch.UnixFileDispatcherImpl.write0(Native Method)\n\tat java.base/sun.nio.ch.UnixFileDispatcherImpl.write(UnixFileDispatcherImpl.java:65)\n\tat java.base/sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:137)\n\tat java.base/sun.nio.ch.IOUtil.write(IOUtil.java:102)\n\tat java.base/sun.nio.ch.IOUtil.write(IOUtil.java:72)\n\tat java.base/sun.nio.ch.FileChannelImpl.write(FileChannelImpl.java:300)\n\tat java.base/sun.nio.ch.ChannelOutputStream.writeFully(ChannelOutputStream.java:68)\n\tat java.base/sun.nio.ch.ChannelOutputStream.write(ChannelOutputStream.java:105)\n\tat java.base/sun.nio.cs.StreamEncoder.writeBytes(StreamEncoder.java:309)\n\tat java.base/sun.nio.cs.StreamEncoder.implWrite(StreamEncoder.java:381)\n\tat java.base/sun.nio.cs.StreamEncoder.implWrite(StreamEncoder.java:357)\n\tat java.base/sun.nio.cs.StreamEncoder.lockedWrite(StreamEncoder.java:158)\n\tat java.base/sun.nio.cs.StreamEncoder.write(StreamEncoder.java:139)\n\tat java.base/java.io.OutputStreamWriter.write(OutputStreamWriter.java:219)\n\tat java.base/java.io.BufferedWriter.implFlushBuffer(BufferedWriter.java:178)\n\tat java.base/java.io.BufferedWriter.flushBuffer(BufferedWriter.java:163)\n\tat java.base/java.io.BufferedWriter.implWrite(BufferedWriter.java:334)\n\tat java.base/java.io.BufferedWriter.write(BufferedWriter.java:313)\n\tat java.base/java.io.Writer.write(Writer.java:278)\n\tat io.airbyte.workers.internal.VersionedAirbyteMessageBufferedWriter.write(VersionedAirbyteMessageBufferedWriter.java:39)\n\tat io.airbyte.workers.internal.LocalContainerAirbyteDestination.acceptWithNoTimeoutMonitor(LocalContainerAirbyteDestination.kt:139)\n\tat io.airbyte.workers.internal.LocalContainerAirbyteDestination.accept(LocalContainerAirbyteDestination.kt:96)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.writeToDestination(BufferedReplicationWorker.java:436)\n\t... 5 more\n",
"timestamp" : 1729695499021
}, {
"failureOrigin" : "source",
"internalMessage" : "Source process read attempt failed",
"externalMessage" : "Something went wrong within the source connector",
"metadata" : {
"attemptNumber" : 2,
"jobId" : 8,
"connector_command" : "read"
},
"stacktrace" : "io.airbyte.workers.internal.exception.SourceException: Source process read attempt failed\n\tat io.airbyte.workers.general.BufferedReplicationWorker.readFromSource(BufferedReplicationWorker.java:375)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsyncWithHeartbeatCheck$3(BufferedReplicationWorker.java:222)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1583)\nCaused by: java.lang.IllegalStateException: No exit code found.\n\tat io.airbyte.workers.internal.ContainerIOHandle.getExitCode(ContainerIOHandle.kt:104)\n\tat io.airbyte.workers.internal.LocalContainerAirbyteSource.getExitValue(LocalContainerAirbyteSource.kt:90)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.readFromSource(BufferedReplicationWorker.java:355)\n\t... 5 more\n",
"timestamp" : 1729695499108
}, {
"failureOrigin" : "replication",
"internalMessage" : "io.airbyte.workers.exception.WorkerException: Destination has not terminated. This warning is normal if the job was cancelled.",
"externalMessage" : "Something went wrong during replication",
"metadata" : {
"attemptNumber" : 2,
"jobId" : 8
},
"stacktrace" : "java.lang.RuntimeException: io.airbyte.workers.exception.WorkerException: Destination has not terminated. This warning is normal if the job was cancelled.\n\tat io.airbyte.workers.general.BufferedReplicationWorker$CloseableWithTimeout.lambda$close$0(BufferedReplicationWorker.java:545)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsyncWithTimeout$5(BufferedReplicationWorker.java:243)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1583)\nCaused by: io.airbyte.workers.exception.WorkerException: Destination has not terminated. This warning is normal if the job was cancelled.\n\tat io.airbyte.workers.internal.LocalContainerAirbyteDestination.close(LocalContainerAirbyteDestination.kt:65)\n\tat io.airbyte.workers.general.BufferedReplicationWorker$CloseableWithTimeout.lambda$close$0(BufferedReplicationWorker.java:543)\n\t... 5 more\n",
"timestamp" : 1729695559154
} ]
I wasn't able to find anything more helpful in the logs, but happy to post more snippets if that would be useful. Anyone have any suggestions for debugging?
Mert Ors
10/23/2024, 3:47 PM
Mert Ors
10/23/2024, 3:53 PM
Mert Ors
10/23/2024, 3:54 PM
Marco Hemken
10/23/2024, 5:11 PM
Is there a way to set environment variables on the orchestrator-repl-job pods? (See the sketch after this list.)
Use case:
• Logging won't write to the S3 bucket unless AWS_REGION is set. It doesn't work with AWS_DEFAULT_REGION.
• Using Helm chart version 0.64.151
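A hedged sketch of two places this is commonly attempted; whether either actually reaches the orchestrator-repl-job pods on chart 0.64.151 is not something I can confirm, and both key placements are assumptions.
```yaml
# values.yaml (sketch; both placements are assumptions for chart 0.64.151)
global:
  env_vars:                      # assumed to feed the shared airbyte-env ConfigMap
    AWS_REGION: us-east-1        # S3 logging reportedly needs AWS_REGION, not AWS_DEFAULT_REGION
workload-launcher:
  extraEnv:                      # assumption: launcher env may be passed on to the job pods it creates
    - name: AWS_REGION
      value: us-east-1
```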
Arun Addagatla
10/23/2024, 5:42 PM
Colin
10/23/2024, 7:44 PM
user
10/23/2024, 10:49 PM
user
10/23/2024, 10:50 PM
Warning from replication: Something went wrong during replication
message='io.temporal.serviceclient.CheckedExceptionWrapper: io.airbyte.workers.exception.WorkerException: Init container error encountered while processing workload for id: 30263b0f-ca05-44ec-8907-9fc7845dfe44_1_4_sync. Encountered exception of type: class com.amazonaws.SdkClientException. Exception message: Unable to load AWS credentials from any provider in the chain: [EnvironmentVariableCredentialsProvider: Unable to load AWS credentials from environment variables (AWS_ACCESS_KEY_ID (or AWS_ACCESS_KEY) and AWS_SECRET_KEY (or AWS_SECRET_ACCESS_KEY)), SystemPropertiesCredentialsProvider: Unable to load AWS credentials from Java system properties (aws.accessKeyId and aws.secretKey), WebIdentityTokenCredentialsProvider: You must specify a value for roleArn and roleSessionName, com.amazonaws.auth.profile.ProfileCredentialsProvider@402f61f5: profile file cannot be null, com.amazonaws.auth.EC2ContainerCredentialsProviderWrapper@3dc55719: Unauthorized (Service: null; Status Code: 401; Error Code: null; Request ID: null; Proxy: null)].', type='java.lang.RuntimeException', nonRetryable=false
Source Stripe image: airbyte/source-stripe:5.6.2
Destination Postgres image: airbyte/destination-postgres:2.4.0
Orchestrator image: airbyte/container-orchestrator:1.1.0
I would appreciate any guidance on resolving this issue.
Relevant log output
2024-10-22 08:45:17 ERROR i.a.w.l.p.h.FailureHandler(apply):39 - Pipeline Error
io.airbyte.workload.launcher.pipeline.stages.model.StageError: java.lang.RuntimeException: Init container for Pod: pods did not complete successfully. Actual termination reason: Error.
at io.airbyte.workload.launcher.pipeline.stages.model.Stage.apply(Stage.kt:46) ~[io.airbyte-airbyte-workload-launcher-1.1.0.jar:?]
at io.airbyte.workload.launcher.pipeline.stages.LaunchPodStage.apply(LaunchPodStage.kt:38) ~[io.airbyte-airbyte-workload-launcher-1.1.0.jar:?]
at io.airbyte.workload.launcher.pipeline.stages.$LaunchPodStage$Definition$Intercepted.$$access$$apply(Unknown Source) ~[io.airbyte-airbyte-workload-launcher-1.1.0.jar:?]
at io.airbyte.workload.launcher.pipeline.stages.$LaunchPodStage$Definition$Exec.dispatch(Unknown Source) ~[io.airbyte-airbyte-workload-launcher-1.1.0.jar:?]
at io.micronaut.context.AbstractExecutableMethodsDefinition$DispatchedExecutableMethod.invoke(AbstractExecutableMethodsDefinition.java:456) ~[micronaut-inject-4.6.5.jar:4.6.5]
at io.micronaut.aop.chain.MethodInterceptorChain.proceed(MethodInterceptorChain.java:134) ~[micronaut-aop-4.6.5.jar:4.6.5]
at io.airbyte.metrics.interceptors.InstrumentInterceptorBase.doIntercept(InstrumentInterceptorBase.kt:61) ~[io.airbyte.airbyte-metrics-metrics-lib-1.1.0.jar:?]
at io.airbyte.metrics.interceptors.InstrumentInterceptorBase.intercept(InstrumentInterceptorBase.kt:44) ~[io.airbyte.airbyte-metrics-metrics-lib-1.1.0.jar:?]
at io.micronaut.aop.chain.MethodInterceptorChain.proceed(MethodInterceptorChain.java:143) ~[micronaut-aop-4.6.5.jar:4.6.5]
at io.airbyte.workload.launcher.pipeline.stages.$LaunchPodStage$Definition$Intercepted.apply(Unknown Source) ~[io.airbyte-airbyte-workload-launcher-1.1.0.jar:?]
at io.airbyte.workload.launcher.pipeline.stages.LaunchPodStage.apply(LaunchPodStage.kt:24) ~[io.airbyte-airbyte-workload-launcher-1.1.0.jar:?]
at reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:132) ~[reactor-core-3.6.9.jar:3.6.9]
at reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:158) ~[reactor-core-3.6.9.jar:3.6.9]
at reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:158) ~[reactor-core-3.6.9.jar:3.6.9]
at reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:158) ~[reactor-core-3.6.9.jar:3.6.9]
at reactor.core.publisher.Operators$ScalarSubscription.request(Operators.java:2571) ~[reactor-core-3.6.9.jar:3.6.9]
at reactor.core.publisher.MonoFlatMap$FlatMapMain.request(MonoFlatMap.java:194) ~[reactor-core-3.6.9.jar:3.6.9]
at reactor.core.publisher.MonoFlatMap$FlatMapMain.request(MonoFlatMap.java:194) ~[reactor-core-3.6.9.jar:3.6.9]
at reactor.core.publisher.MonoFlatMap$FlatMapMain.request(MonoFlatMap.java:194) ~[reactor-core-3.6.9.jar:3.6.9]
at reactor.core.publisher.MonoFlatMap$FlatMapMain.request(MonoFlatMap.java:194) ~[reactor-core-3.6.9.jar:3.6.9]
at reactor.core.publisher.Operators$MultiSubscriptionSubscriber.set(Operators.java:2367) ~[reactor-core-3.6.9.jar:3.6.9]
at reactor.core.publisher.FluxOnErrorResume$ResumeSubscriber.onSubscribe(FluxOnErrorResume.java:74) ~[reactor-core-3.6.9.jar:3.6.9]
at reactor.core.publisher.MonoFlatMap$FlatMapMain.onSubscribe(MonoFlatMap.java:117) ~[reactor-core-3.6.9.jar:3.6.9]
at reactor.core.publisher.MonoFlatMap$FlatMapMain.onSubscribe(MonoFlatMap.java:117) ~[reactor-core-3.6.9.jar:3.6.9]
at reactor.core.publisher.MonoFlatMap$FlatMapMain.onSubscribe(MonoFlatMap.java:117) ~[reactor-core-3.6.9.jar:3.6.9]
at reactor.core.publisher.MonoFlatMap$FlatMapMain.onSubscribe(MonoFlatMap.java:117) ~[reactor-core-3.6.9.jar:3.6.9]
at reactor.core.publisher.FluxFlatMap.trySubscribeScalarMap(FluxFlatMap.java:193) ~[reactor-core-3.6.9.jar:3.6.9]
at reactor.core.publisher.MonoFlatMap.subscribeOrReturn(MonoFlatMap.java:53) ~[reactor-core-3.6.9.jar:3.6.9]
at reactor.core.publisher.Mono.subscribe(Mono.java:4560) ~[reactor-core-3.6.9.jar:3.6.9]
airbytehq/airbyte
user
10/23/2024, 10:50 PM
It turned out that authenticationType should be set to credentials instead of instanceProfile. Without this adjustment, the sync operation does not work as expected.
I hope this helps anyone encountering a similar issue.
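For reference, a sketch of where that setting appears to sit, following the storage block discussed at the top of this thread; the surrounding key names (including the secretName vs. storageSecretName spelling) are assumptions.
```yaml
# values.yaml (sketch; surrounding keys assumed)
global:
  storage:
    type: s3
    storageSecretName: airbyte-storage-secrets   # secret providing AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY
    s3:
      region: us-east-1
      authenticationType: credentials            # was instanceProfile; credentials made the sync work
```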
airbytehq/airbyte
Shubham
10/24/2024, 4:33 AM
destination-bigquery?