# ask-ai

    Steven Ayers

    11/18/2025, 2:31 PM
    @kapa.ai how can I configure the autodiscover timeout for when airbyte searches source schemas?

    Yuvraj Rimal (युवराज)

    11/18/2025, 3:08 PM
    @kapa.ai, does airbyte use any bitnami images apart from kubectl and postgresql that I need to update on my airbyte-values.yml ?

    Yuvraj Rimal (युवराज)

    11/18/2025, 3:14 PM
    Hi @kapa.ai, what are the things to consider while upgrading from 0.54.157 to 2.0? Do I need to change anything in my values.yaml?

    Kothapalli Venkata Avinash

    11/18/2025, 3:18 PM
    @kapa.ai
    Copy code
    RESOURCE_EXHAUSTED: namespace rate limit exceeded
    io.grpc.StatusRuntimeException: RESOURCE_EXHAUSTED: namespace rate limit exceeded
        at io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:268)
        at io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:249)
        at io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:167)
        at io.temporal.api.workflowservice.v1.WorkflowServiceGrpc$WorkflowServiceBlockingStub.listClosedWorkflowExecutions(WorkflowServiceGrpc.java:4620)

    Yuvraj Rimal (युवराज)

    11/18/2025, 3:43 PM
    @kapa.ai, trying to set up the Salesforce connector; when running it, it's throwing an error.
    Copy code
    Using existing AIRBYTE_ENTRYPOINT: /entrypoint.sh
    Waiting on CHILD_PID 7
    PARENT_PID: 1
    EXIT_STATUS: 2
    sh: 16: cannot create /pipes/stderr: Permission denied

    Kothapalli Venkata Avinash

    11/18/2025, 3:47 PM
    @kapa.ai All Airbyte pods are getting recreated. What could be the cause of this issue?

    JCrock8

    11/18/2025, 4:09 PM
    @kapa.ai MySQL source shows success but isn't actually successful, and 0 rows were synchronized.

    Stefano Messina

    11/18/2025, 4:19 PM
    @kapa.ai we keep getting
    java.lang.IllegalStateException: Sync completed, but unflushed states were detected.
    for a connection sync from MSSQL to ClickHouse V2

    Kothapalli Venkata Avinash

    11/18/2025, 5:13 PM
    @kapa.ai When we run a sync job in Airbyte, a discover job also gets triggered. How do we stop the discover job?

    Carmela Beiro

    11/18/2025, 7:12 PM
    Hey @kapa.ai can I prevent the initial snapshot when creating a new connection with Postgres?

    Lucas Segers

    11/18/2025, 9:17 PM
    @kapa.ai after upgrading to 2.0, my SOAP stream from a connection doesn't even pass the test anymore. Error: Something went wrong in the connector. See the logs for more details. Exception while syncing stream Empresas: Request body json cannot be a string. I've chosen "plain text body".

    Pragyash Barman

    11/18/2025, 9:56 PM
    @kapa.ai While running the MySQL to BQ connector, the orchestrator job doesn't move forward. Logs from the replication pod are:
    Copy code
    orchestrator 2025-11-18 21:49:04,818 [pool-4-thread-1]    DEBUG    i.a.c.o.w.WorkloadHeartbeatSender(sendHeartbeat):93 - Sending workload heartbeat.
    orchestrator 2025-11-18 21:49:04,819 [pool-4-thread-1]    DEBUG    i.a.a.c.a.AccessTokenInterceptor(intercept):115 - Fetching access token from control plane...
    orchestrator 2025-11-18 21:49:04,819 [pool-4-thread-1]    DEBUG    i.a.a.c.a.AccessTokenInterceptor(getCachedToken):65 - Using cached token
    orchestrator 2025-11-18 21:49:04,819 [pool-4-thread-1]    DEBUG    i.a.a.c.a.AccessTokenInterceptor(intercept):118 - Token added successfully. 1234.sdhvhvbhbvjbvjvj
    orchestrator 2025-11-18 21:49:04,852 [pool-4-thread-1]    DEBUG    i.a.w.a.c.WorkloadApiClient(createRetryPolicy$lambda$9):211 - Successfully called <http://datasync-airbyte-hr-workload-api-server-svc.datasync:8001/api/v1/workload/heartbea>
    orchestrator 2025-11-18 21:49:04,852 [pool-4-thread-1]    DEBUG    i.a.m.c.AirbyteMetricMeterFilter(accept):48 - Resolved metric ID MeterId{name='airbyte.workload-api-client.success', tags=[tag(max-retries=5),tag(method=PUT),tag(retry-att
    orchestrator 2025-11-18 21:49:11,699 [scheduled-executor-thread-2]    TRACE    i.m.c.DefaultBeanContext(resolveBeanRegistration):2738 - Looking up existing bean for key: healthMonitorTask
    orchestrator 2025-11-18 21:49:11,700 [scheduled-executor-thread-2]    DEBUG    i.m.m.h.m.HealthMonitorTask(monitor):82 - Starting health monitor check
    orchestrator 2025-11-18 21:49:11,701 [scheduled-executor-thread-2]    TRACE    i.m.c.DefaultBeanContext(resolveBeanRegistration):2738 - Looking up existing bean for key: T
    orchestrator 2025-11-18 21:49:11,703 [virtual-executor-629071439]    TRACE    i.m.m.h.m.HealthMonitorTask(lambda$monitor$1):104 - Health monitor result for compositeDiscoveryClient(): status UP, details {services={}}
    orchestrator 2025-11-18 21:49:11,704 [virtual-executor-629071439]    TRACE    i.m.m.h.m.HealthMonitorTask(lambda$monitor$1):104 - Health monitor result for gracefulShutdown: status UP, details {activeTasks=1}
    orchestrator 2025-11-18 21:49:11,704 [virtual-executor-629071439]    TRACE    i.m.m.h.m.HealthMonitorTask(lambda$monitor$1):104 - Health monitor result for service: status UP, details {}
    orchestrator 2025-11-18 21:49:11,704 [virtual-executor-629071439]    TRACE    i.m.m.h.m.HealthMonitorTask(lambda$monitor$1):104 - Health monitor result for diskSpace: status UP, details {total=32199651328, free=21779140608, threshold=1048
    orchestrator 2025-11-18 21:49:11,704 [virtual-executor-629071439]    TRACE    i.m.m.h.m.HealthMonitorTask(lambda$monitor$1):104 - Health monitor result for deadlockedThreads: status UP, details {}
    orchestrator 2025-11-18 21:49:14,853 [pool-4-thread-1]    DEBUG    i.a.c.o.w.WorkloadHeartbeatSender(sendHeartbeat):93 - Sending workload heartbeat.
    orchestrator 2025-11-18 21:49:14,854 [pool-4-thread-1]    DEBUG    i.a.a.c.a.AccessTokenInterceptor(intercept):115 - Fetching access token from control plane...
    orchestrator 2025-11-18 21:49:14,854 [pool-4-thread-1]    DEBUG    i.a.a.c.a.AccessTokenInterceptor(getCachedToken):65 - Using cached token
    orchestrator 2025-11-18 21:49:14,854 [pool-4-thread-1]    DEBUG    i.a.a.c.a.AccessTokenInterceptor(intercept):118 - Token added successfully. eyJhbGciOiJIUzI1NiJ9.eyJpc3MiOiJodHRwczovL2FpcmJ5dGUtdXNlMS1wcm9kLTAwMS1pbnRlcm5hbC5icm93c2Vyc3
    orchestrator 2025-11-18 21:49:14,883 [pool-4-thread-1]    DEBUG    i.a.w.a.c.WorkloadApiClient(createRetryPolicy$lambda$9):211 - Successfully called <http://datasync-airbyte-hr-workload-api-server-svc.datasync:8001/api/v1/workload/heartbea>
    orchestrator 2025-11-18 21:49:14,883 [pool-4-thread-1]    DEBUG    i.a.m.c.AirbyteMetricMeterFilter(accept):48 - Resolved metric ID MeterId{name='airbyte.workload-api-client.success', tags=[tag(max-retries=5),tag(method=PUT),tag(retry-att
    orchestrator 2025-11-18 21:49:15,245 [scheduled-executor-thread-1]    TRACE    i.m.c.DefaultBeanContext(resolveBeanRegistration):2738 - Looking up existing bean for key: storageUsageReporter
    orchestrator 2025-11-18 21:49:23,013 [idle-connection-reaper]    DEBUG    o.a.h.i.c.PoolingHttpClientConnectionManager(closeIdleConnections):441 - Closing connections idle longer than 60000 MILLISECONDS
    orchestrator 2025-11-18 21:49:23,013 [idle-connection-reaper]    DEBUG    o.a.h.i.c.LoggingManagedHttpClientConnection(close):79 - http-outgoing-2: Close connection
    orchestrator 2025-11-18 21:49:23,014 [idle-connection-reaper]    DEBUG    s.a.a.u.Logger(debug):85 - closing <http://sts.amazonaws.com/67.220.250.177:443|sts.amazonaws.com/67.220.250.177:443>
    orchestrator 2025-11-18 21:49:23,015 [idle-connection-reaper]    DEBUG    o.a.h.i.c.PoolingHttpClientConnectionManager(closeIdleConnections):441 - Closing connections idle longer than 60000 MILLISECONDS
    orchestrator 2025-11-18 21:49:23,015 [idle-connection-reaper]    DEBUG    o.a.h.i.c.PoolingHttpClientConnectionManager(closeIdleConnections):441 - Closing connections idle longer than 60000 MILLISECONDS
    orchestrator 2025-11-18 21:49:23,015 [idle-connection-reaper]    DEBUG    o.a.h.i.c.LoggingManagedHttpClientConnection(close):79 - http-outgoing-1: Close connection
    orchestrator 2025-11-18 21:49:23,015 [idle-connection-reaper]    DEBUG    s.a.a.u.Logger(debug):85 - closing <http://browserstack-datasync-prod-use.s3.amazonaws.com/16.15.182.215:443|browserstack-datasync-prod-use.s3.amazonaws.com/16.15.182.215:443>
    orchestrator 2025-11-18 21:49:23,016 [idle-connection-reaper]    DEBUG    o.a.h.i.c.PoolingHttpClientConnectionManager(closeIdleConnections):441 - Closing connections idle longer than 60000 MILLISECONDS
    orchestrator 2025-11-18 21:49:23,016 [idle-connection-reaper]    DEBUG    o.a.h.i.c.LoggingManagedHttpClientConnection(close):79 - http-outgoing-0: Close connection
    orchestrator 2025-11-18 21:49:23,016 [idle-connection-reaper]    DEBUG    s.a.a.u.Logger(debug):85 - closing <http://sts.amazonaws.com/67.220.250.177:443|sts.amazonaws.com/67.220.250.177:443>

    Leon Kozlowski

    11/18/2025, 11:50 PM
    Why is IRSA-based authentication (instance profile) not supported for the Nessie catalog in the S3 data lake destination? @kapa.ai

    Chiranga

    11/19/2025, 12:04 AM
    @kapa.ai Can you disable the connector builder on a Kubernetes Helm chart-based install?

    Rupak Patel

    11/19/2025, 1:36 AM
    @kapa.ai Using Airbyte Cloud, I am exporting HubSpot data. We do a full sync every few days on the companies table, but occasionally not all records are collected and there is no warning that anything was missed. This is a summary of the logs from an AI that looked at them.
    Copy code
    📌 The Final Diagnosis
    
    Your sync did not export all HubSpot company records because:
    
    ❗ The replication pod lost its heartbeat almost immediately and became unhealthy, so no data was ever read from HubSpot.
    
    This is a cluster or Airbyte orchestration issue, not a HubSpot API or connector issue.
    How do I stop this from happening again?
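    One hedged way to catch this externally (a guard, not a fix): poll the connection's most recent job from the Airbyte API after each scheduled run and alert when it "succeeded" but moved suspiciously few rows. A minimal Python sketch — the endpoint, query parameters, and response fields (data, jobId, status, rowsSynced) are assumptions based on the public Airbyte API docs and should be verified; the token, connection ID, and row threshold are placeholders.
    Copy code
    import requests

    API_URL = "https://api.airbyte.com/v1/jobs"   # public Airbyte API (assumed)
    TOKEN = "..."                                  # Airbyte API token (placeholder)
    CONNECTION_ID = "..."                          # HubSpot -> destination connection (placeholder)
    EXPECTED_MIN_ROWS = 10_000                     # rough floor for a full companies refresh (placeholder)

    resp = requests.get(
        API_URL,
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"connectionId": CONNECTION_ID, "jobType": "sync", "limit": 1},
        timeout=30,
    )
    resp.raise_for_status()
    jobs = resp.json().get("data", [])
    if not jobs:
        raise SystemExit("no jobs returned for this connection")
    latest = jobs[0]  # assumes the newest job is returned first; verify ordering for your API version
    rows = latest.get("rowsSynced", 0)
    if latest.get("status") != "succeeded" or rows < EXPECTED_MIN_ROWS:
        print(f"ALERT: job {latest.get('jobId')} status={latest.get('status')} rowsSynced={rows}")
    else:
        print(f"OK: job {latest.get('jobId')} synced {rows} rows")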

    Ishan Anilbhai Koradiya

    11/19/2025, 6:05 AM
    Hi @kapa.ai, it looks like upgrading a connector in one workspace also upgrades it in all other workspaces. This didn't happen before. I am on Airbyte v1.8.3.

    dilan silva

    11/19/2025, 6:13 AM
    @kapa.ai I created a pull request for my connector and now it has some validation errors when running the QA checks. The first one is this:
    Copy code
    Run # Exit with code 1 if the CDK is not pinned to a standard version.
    Poe the Poet (version 0.37.0)
    
    Error: Unrecognized task 'detect-cdk-prerelease'
    
    Usage:
      poe [global options] task [task arguments]
    
    Global options:
      -h [TASK], --help [TASK]
                            Show this help page and exit, optionally supply a
                            task.
      --version             Print the version and exit
      -v, --verbose         Increase output (repeatable)
      -q, --quiet           Decrease output (repeatable)
      -d, --dry-run         Print the task contents but don't actually run it
      -C PATH, --directory PATH
                            Specify where to find the pyproject.toml
      -e EXECUTOR, --executor EXECUTOR
                            Override the default task executor
      --ansi                Force enable ANSI output
      --no-ansi             Force disable ANSI output
    
    Configured tasks:
      get-connector-name          
      fetch-secrets               
      install                     
      install-cdk-cli             
      install-unit-tests-project  
      test-all                    
      test-unit-tests             
      test-integration-tests      
      format-check                
      lint-check                  
      lock                        
      get-language                Get the language of the connector from its metadata.yaml file. Use with -qq to get just the language name.
      get-base-image              Get the base image of the connector from its metadata.yaml file.
      get-version                 Get the version of the connector from its metadata.yaml file.
      run-cat-tests               Run the legacy Airbyte-CI acceptance tests (CAT tests).
    
    
    Error: Process completed with exit code 1.

    Hari Haran R

    11/19/2025, 6:40 AM
    @kapa.ai we are using Airbyte A and Airbyte B. In Airbyte A the custom CDK connector works fine, but in Airbyte B the same custom connector says: Warning from source: Something went wrong in the connector. See the logs for more details. 'list' object has no attribute 'items'. We have deployed both Airbyte instances on Kubernetes with the same Airbyte version. Why is this occurring in Airbyte B alone?
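    For reference, the error itself is plain Python: something in the connector (or the CDK) calls .items() on a value it expects to be a dict, but on instance B that value arrives as a list — which usually points to a config, state, or API response that has a different shape there. A tiny illustration with hypothetical payloads, not the connector's actual data:
    Copy code
    # The same code path, fed a dict on instance A and a list on instance B.
    payload_a = {"records": 1}     # what the code expects
    payload_b = [{"records": 1}]   # what it actually received on instance B

    print(dict(payload_a.items()))  # fine: {'records': 1}
    try:
        payload_b.items()
    except AttributeError as err:
        print(err)                  # 'list' object has no attribute 'items'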

    Santoshi Kalaskar

    11/19/2025, 7:14 AM
    hi #C01AHCD885S Team, has anyone worked with Full Refresh data sync for unstructured documents? For example, my source is Google Drive and the destination is Azure Blob Storage. When I delete files in the source, the Full Refresh sync does not delete them in the destination. Shouldn't Full Refresh remove deleted files from the destination as well?
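    If the destination really does keep files that were deleted in the source, one workaround is an external cleanup step after the sync. A rough Python sketch using the Azure SDK that deletes blobs whose names are not in a set of files currently present in the source; the connection string, container name, and how you build that set are all placeholders — dry-run it (comment out the delete) before trusting it.
    Copy code
    from azure.storage.blob import BlobServiceClient

    def delete_orphaned_blobs(connection_string: str, container: str, source_file_names: set[str]) -> None:
        """Delete blobs in `container` whose names no longer exist in the source."""
        service = BlobServiceClient.from_connection_string(connection_string)
        container_client = service.get_container_client(container)
        for blob in container_client.list_blobs():
            if blob.name not in source_file_names:
                print(f"deleting orphaned blob: {blob.name}")
                container_client.delete_blob(blob.name)

    # Example call with placeholder values; build `source_file_names` from your
    # Google Drive listing (e.g. via the Drive API) before running this for real.
    delete_orphaned_blobs("DefaultEndpointsProtocol=https;...", "airbyte-drive-docs", {"report.pdf", "notes.docx"})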

    Louis Demet

    11/19/2025, 9:15 AM
    Hi @kapa.ai, I’m using the Shopify source connector and I can’t find any data related to returns (like Return, ReturnRequest, or ReturnLineItem). These endpoints include the return reasons we need. Are these Shopify Returns API resources planned to be supported soon? If not, is there any workaround to get this data through Airbyte? Thanks!

    Vishal Garg

    11/19/2025, 9:19 AM
    @kapa.ai Getting error: Cannot invoke "io.airbyte.protocol.models.v0.AirbyteGlobalState.getStreamStates()" because the return value of "io.airbyte.protocol.models.v0.AirbyteStateMessage.getGlobal()" is null while trying to sync data from MySQL to Weaviate using fake embeddings.

    Rahul

    11/19/2025, 11:25 AM
    @kapa.ai can I use an Amazon Aurora MySQL host with the Airbyte OSS MySQL source?
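    Aurora MySQL exposes a standard MySQL-compatible endpoint, so it can generally be used as the host for the MySQL source, provided Airbyte can reach the endpoint and the user has the required permissions (and, for CDC, binlog enabled via the cluster parameter group). A quick connectivity check from the machine or cluster running Airbyte, with placeholder host and credentials:
    Copy code
    import pymysql

    conn = pymysql.connect(
        host="my-aurora.cluster-xxxxxxxx.us-east-1.rds.amazonaws.com",  # placeholder cluster endpoint
        port=3306,
        user="airbyte",
        password="...",
        database="mydb",
        connect_timeout=10,
    )
    with conn.cursor() as cur:
        cur.execute("SELECT VERSION()")   # Aurora reports a MySQL-compatible version string
        print(cur.fetchone())
    conn.close()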

    Konathala Chaitanya

    11/19/2025, 1:14 PM
    @kapa.ai Could not connect with provided configuration. Error: Driver org.postgresql.Driver claims to not accept jdbcUrl, jdbc:postgresql://shared-postgres-inv-2.chf.u-1.rds.aws.com/:5432/phis?prepareThreshold=0&tcpKeepAlive=true&currentSchema=ic&sslmode=disable
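    For reference, the Postgres JDBC driver expects jdbc:postgresql://<host>:<port>/<database>?<params>. The URL in this error has an extra "/" between the host and ":5432" (often a trailing slash pasted into the host field), which pushes the port into the path and makes the driver reject the URL. A small sketch — not Airbyte code — showing the difference:
    Copy code
    from urllib.parse import urlsplit

    def describe_postgres_jdbc_url(jdbc_url: str) -> str:
        prefix = "jdbc:"
        if not jdbc_url.startswith(prefix + "postgresql://"):
            return "not a Postgres JDBC URL"
        parts = urlsplit(jdbc_url[len(prefix):])  # the remainder parses like a normal URL
        if parts.port is None:
            return f"malformed: no port in authority {parts.netloc!r}; path is {parts.path!r}"
        return f"ok: host={parts.hostname} port={parts.port} database={parts.path.lstrip('/')}"

    # The failing URL: the stray "/" puts ":5432" into the path instead of the authority.
    print(describe_postgres_jdbc_url("jdbc:postgresql://shared-postgres-inv-2.chf.u-1.rds.aws.com/:5432/phis"))
    # The shape the driver accepts:
    print(describe_postgres_jdbc_url("jdbc:postgresql://shared-postgres-inv-2.chf.u-1.rds.aws.com:5432/phis"))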

    Lucas Segers

    11/19/2025, 1:17 PM
    @kapa.ai on a simple builder connection this error happens on only 2 streams:
    Copy code
    2025-11-19 101248 source ERROR Concurrent read failure
    Traceback (most recent call last):
      File "/airbyte/integration_code/main.py", line 4, in <module>
        run()
        ~~~^^
      File "/airbyte/integration_code/source_declarative_manifest/run.py", line 307, in run
        handle_command(args)
        ~~~~~~~~~~~~~~^^^^^^
      File "/airbyte/integration_code/source_declarative_manifest/run.py", line 93, in handle_command
        handle_remote_manifest_command(args)
        ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^
      File "/airbyte/integration_code/source_declarative_manifest/run.py", line 159, in handle_remote_manifest_command
        launch(
        ~~~~~~^
            source=source,
            ^^^^^^^^^^^^^^
            args=args,
            ^^^^^^^^^^
        )
        ^
      File "/usr/local/lib/python3.13/site-packages/airbyte_cdk/entrypoint.py", line 377, in launch
        for message in source_entrypoint.run(parsed_args):
                       ~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^
      File "/usr/local/lib/python3.13/site-packages/airbyte_cdk/entrypoint.py", line 207, in run
        yield from map(
        ...<2 lines>...
        )
      File "/usr/local/lib/python3.13/site-packages/airbyte_cdk/entrypoint.py", line 280, in read
        for message in self.source.read(self.logger, config, catalog, state):
                       ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/usr/local/lib/python3.13/site-packages/airbyte_cdk/sources/declarative/concurrent_declarative_source.py", line 381, in read
        yield from self._concurrent_source.read(selected_concurrent_streams)
      File "/usr/local/lib/python3.13/site-packages/airbyte_cdk/sources/concurrent_source/concurrent_source.py", line 126, in read
        yield from self._consume_from_queue(
        ...<2 lines>...
        )
      File "/usr/local/lib/python3.13/site-packages/airbyte_cdk/sources/concurrent_source/concurrent_source.py", line 154, in _consume_from_queue
        if queue.empty() and concurrent_stream_processor.is_done():
                             ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^
      File "/usr/local/lib/python3.13/site-packages/airbyte_cdk/sources/concurrent_source/concurrent_read_processor.py", line 223, in is_done
        raise AirbyteTracedException(
        ...<3 lines>...
        )
    airbyte_cdk.utils.traced_exception.AirbyteTracedException: Concurrent read failure

    Kothapalli Venkata Avinash

    11/19/2025, 1:21 PM
    @kapa.ai Disable Detecting Schema Changes

    Geert

    11/19/2025, 2:12 PM
    Getting this error when I create a custom api: Internal Server Error: java.net.UnknownHostException: airbyte-abctl-airbyte-connector-builder-server-svc.airbyte-abctl: Name or service not known

    Rahul

    11/19/2025, 2:20 PM
    @kapa.ai in which pod does the connection test happen for a destination in Airbyte OSS?

    guifesquet

    11/19/2025, 3:13 PM
    Running Airbyte OSS in Docker, I'm trying to configure an S3 destination. I have created the policy, the role, and the S3 bucket, but when I test the connection I get an error: failed to assume role from STS.
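    One way to narrow this down is to attempt the same role assumption outside Airbyte, using whatever credentials the Docker host has; if that also fails, the issue is in the IAM trust policy or the credentials rather than the destination config. A short boto3 sketch with a placeholder role ARN:
    Copy code
    import boto3

    ROLE_ARN = "arn:aws:iam::123456789012:role/airbyte-s3-destination"  # placeholder

    sts = boto3.client("sts")
    print("caller identity:", sts.get_caller_identity()["Arn"])   # who the Docker host is acting as
    resp = sts.assume_role(RoleArn=ROLE_ARN, RoleSessionName="airbyte-s3-test")
    print("assumed role:", resp["AssumedRoleUser"]["Arn"])
    If assume_role fails here too, check that the role's trust policy allows the principal shown by get_caller_identity (plus any external ID your setup uses).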

    Simon Veerman

    11/19/2025, 3:28 PM
    Hi @kapa.ai, I keep getting 403 Forbidden errors when setting up a WooCommerce connector. The error is like this: 'GET' request to 'https://example.com/wp-json/wc/v3/system_status/tools?order=asc&orderby=id&dates_are_gmt=true&per_page=100' failed with status code '403' and error message: '<!DOCTYPE html>. I have replaced the original domain name, but otherwise the error is exactly like that. I have done some troubleshooting and made sure the API key is present and the credentials are correct, and I have no reason to think anything is wrong with my Airbyte OSS instance, as other WooCommerce sites are syncing just fine. This particular customer has Wordfence installed; disabling it doesn't make any difference, and whitelisting the address in Wordfence also doesn't help. They have the legacy API active as well, but this key is for the REST API. I'm at a loss as to what could be causing this issue. The connector used to work before.
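    Since the failing request is a plain REST call, it may help to reproduce it outside Airbyte with the same consumer key/secret; a 403 with an HTML body from a standalone client would point at server-side blocking (a security plugin, WAF, or hosting rule keying on the client or its IP) rather than the connector. A small Python sketch with placeholder store URL and credentials — WooCommerce accepts the key/secret as HTTP Basic Auth over HTTPS:
    Copy code
    import requests

    BASE_URL = "https://example.com"   # the affected store (placeholder)
    CONSUMER_KEY = "ck_..."            # placeholder
    CONSUMER_SECRET = "cs_..."         # placeholder

    resp = requests.get(
        f"{BASE_URL}/wp-json/wc/v3/system_status/tools",
        params={"order": "asc", "orderby": "id", "dates_are_gmt": "true", "per_page": 100},
        auth=(CONSUMER_KEY, CONSUMER_SECRET),
        timeout=30,
    )
    print(resp.status_code)
    print(resp.headers.get("Server"), resp.headers.get("X-Powered-By"))
    # An HTML body on a JSON endpoint usually means something intercepted the
    # request (security plugin, WAF, or the host) before WooCommerce handled it.
    print(resp.text[:500])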

    Todd Matthews

    11/19/2025, 10:14 PM
    how do I set expiration options on the S3 objects created by Temporal?
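    If these are the job logs/state files Airbyte writes to your storage bucket, one option is an S3 lifecycle rule on that bucket rather than a setting inside Temporal itself. A hedged boto3 sketch — the bucket name, prefix, and retention period are placeholders you would need to match to wherever your deployment actually writes:
    Copy code
    import boto3

    s3 = boto3.client("s3")
    s3.put_bucket_lifecycle_configuration(
        Bucket="my-airbyte-logs-bucket",                   # placeholder
        LifecycleConfiguration={
            "Rules": [
                {
                    "ID": "expire-airbyte-job-logs",
                    "Filter": {"Prefix": "job-logging/"},  # placeholder prefix
                    "Status": "Enabled",
                    "Expiration": {"Days": 30},            # adjust retention as needed
                }
            ]
        },
    )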