# ask-community-for-troubleshooting

    Alex Johnson

    11/28/2025, 5:28 AM
    Zoho Inventory Connector - No Organisation ID? Hey Guys, Does anyone know why the Zoho Inventory connector does not use the organisation ID? I use multiple organisations so this is a required variable in all API calls. Is there a plan to include this in the future? Thanks! Alex

    Prabhu Agarwal

    11/28/2025, 8:53 AM
Hello, I am facing an issue while using the custom decoder component. I have set up the custom component in the Builder UI and I'm using it for one of the streams. When I run that stream, it works fine in Connector Builder UI 2.0, but the rest of the streams, which do not use the custom component, fail in the Builder UI with the error `Could not load module source_declarative_manifest.components`. I have checked for `source_declarative_manifest` traces in the other streams, but that's not the case. I'm not sure why this error occurs. How can I fix this issue? Any leads would be highly appreciated. Thanks

    Leon Kozlowski

    11/29/2025, 4:09 PM
I'm using RDS as the external database for my self-hosted Airbyte deployment. Are there any minimum requirements for the external database? I'm running a t3.large and my CPU is pinned at 100%; Temporal is holding 150 idle connections. Does anyone know the minimum RDS size I should use?

    Sam Riggleman

    11/29/2025, 5:37 PM
    I get this far in setting up a connection for the first time, but then it always hangs. All the tests pass in the configuration screen.

    Piyush Shakya

    12/01/2025, 9:34 AM
Has anyone faced this kind of issue before? Init container error encountered while processing workload for id: 85283132-f3b9-470d-847f-cea2c0a57301_29938_4_check. Encountered exception of type: class `io.micronaut.data.connection.jdbc.exceptions.CannotGetJdbcConnectionException`. Exception message: Failed to obtain JDBC Connection.

    Corentin Marin

    12/01/2025, 10:25 AM
Hi everyone! My team is trying to ingest Apple Search Ads data into Snowflake using the following connector; for now we are mainly interested in getting spend at the campaign, ad group, and keyword levels. We managed to create the connection between Apple Ads and Snowflake; however, we are not able to recover the full spend-data history. It actually seems we can only recover it fully for the day before (day-1) running the refresh. Has anyone faced that kind of issue and managed to find a way to recover the full history? Thanks for your help!

    Nadya Niukina

    12/02/2025, 6:26 AM
Hello, I deployed Airbyte on Kubernetes using the Helm chart and the Terraform provider (to configure sources and destinations). From what I understand based on issues #39528 and #63772, it’s not possible to create a new application in the Core version (contrary to the documentation). Is there any way to rename the existing application (`Default User Application`)? Thanks!

    Aswin

    12/02/2025, 1:38 PM
Hi, could someone clarify whether the free/open-source edition of Airbyte imposes any limits on database size, data-transfer volume, or the number of connections? Additionally, are there any architectural or operational constraints that commonly affect large-scale migrations?

    Dana Williams

    12/02/2025, 9:32 PM
Is anyone having issues with their self-hosted Airbyte? I have every connector failing with the error: Warning from source: Check took too long. Check exceeded the timeout. The AI tool says it's a platform issue, but nothing is posted on the Airbyte status page.

    Yuki Kakegawa

    12/03/2025, 12:02 AM
    Does anyone know how to trigger "refresh stream" on a schedule? Via the API or the UI

    Purna Shah

    12/03/2025, 8:34 AM
    Hi guys, I am trying to clear the data for a connection via the API, which should be doable per the official doc on clearing data. However, there is no endpoint documented to accomplish this in the official API docs. One is discussed in this doc on the internal configuration, but I have tried it using the following pattern and it doesn't seem to work. Any ideas on how to clear data for a connection via API as promised in the official docs?
```
curl -X POST \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
  -d '{
    "connectionId": "YOUR_CONNECTION_ID"
  }' \
  https://api.airbyte.com/v1/connections/clear
```
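One hedged alternative worth trying, if the `/connections/clear` pattern above doesn't work: the public API also has a job-creation endpoint (`POST /v1/jobs`) that takes a `jobType` field. Whether `"clear"` is an accepted value on your Airbyte version is an assumption here, not something confirmed in this thread, so the sketch below only builds and prints the payload and leaves the actual call commented out:

```shell
# Assumption: POST /v1/jobs accepts {"connectionId", "jobType"}; "clear" may be
# named "reset" on older versions -- check the API reference for your release.
PAYLOAD='{"connectionId": "YOUR_CONNECTION_ID", "jobType": "clear"}'
echo "$PAYLOAD"
# curl -X POST https://api.airbyte.com/v1/jobs \
#   -H "Content-Type: application/json" \
#   -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
#   -d "$PAYLOAD"
```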

    sam khizer

    12/03/2025, 3:09 PM
Hi, can anyone help me with an issue installing abctl? I'm facing this:
ERROR failed to determine if any previous psql version exists: error reading pgdata version file: open /home/ec2-user/.airbyte/abctl/data/airbyte-volume-db/pgdata/PG_VERSION: permission denied
DEBUG helm: Add/Modify event for airbyte-abctl-bootloader: MODIFIED
ERROR Failed to install airbyte/airbyte Helm Chart
DEBUG looking at airbyte-abctl-bootloader Failed()
DEBUG found logs: Unsetting empty environment variable 'AB_INSTANCE_
ERROR Unable to install Airbyte locally
DEBUG Received unexpected event: *v1.Status
DEBUG Event watcher completed.
ERROR unable to install airbyte chart: unable to install helm: failed pre-install: 1 error occurred: * pod airbyte-abctl-bootloader failed
Please give me a solution; I'm stuck in this loop.

    Vinicius Nunes

    12/03/2025, 3:19 PM
    Is anyone else having trouble setting up the Netsuite 0.1.27 connector (Community version)? I got stuck on this screen.

    sam khizer

    12/04/2025, 9:42 AM
Hi folks, can anyone suggest how to change table names after a connection has already run successfully in Airbyte? For example, I want both prefixes in the target (e.g., dim_tables and fact_tables); pasting sample images.

    James Booth

    12/04/2025, 2:58 PM
I’ve upgraded the Airbyte Helm chart from 1.7.2 to 1.9.2 (airbyte:2.0.1 image). Everything is back up except the airbyte-workload-launcher. We use a static `airbyte-auth-secrets` Secret (via External Secrets Operator), so I believe that whatever migrations have occurred haven’t been applied to the dataplane ID used in this secret. If I remove that secret, run the bootstrap Pod, and restart, the workload-launcher comes up fine but never runs any workloads. I’ve checked the DB and I have a number of `dataplane`, `dataplane_groups`, `service_accounts`, and `dataplane_client_credentials` rows, so I assume I need to somehow align what the API and the workload-launcher are using.

    Bryan Whiting

    12/04/2025, 4:37 PM
Hi everyone! I've just switched from an old version of abctl to Helm with a separate Postgres DB and Caddy as a reverse proxy. My deployment is slower now, and the bottleneck is in deploying worker pods. On one small job, for example, pod creation takes 2-3 minutes while the sync itself takes only 20 seconds. Any suggestions on where to begin debugging this?

    Enklid Hoxha

    12/04/2025, 5:53 PM
Hi guys, I have a problem with one of my destination connectors not connecting to my MinIO properly. I can't really find any fault with the credentials, and the logs don't help much in this case either. Even the older connectors to the same destination no longer work. I was wondering whether it could be a problem because the S3 connector is not at the latest version, but if someone has had a similar issue and has any ideas, that would be great! Thanks for your help!

    kapa.ai

    12/04/2025, 11:25 PM
“Proxy” can refer to a few different things in Airbyte, depending on what you’re trying to do:

1. Network proxy for outbound internet access (HTTP/HTTPS proxy)
   - Some Airbyte deployments need to go through a corporate proxy to reach the internet (e.g., to download connector images or reach APIs).
   - There are known issues when the proxy requires authentication: users have reported “Failed to Authenticate with proxy” errors when deploying via Helm on Kubernetes/OpenShift, even though `HTTPS_PROXY` works with `curl`. This is currently tracked as a bug and may block deployment in such environments. [Proxy auth bug](https://github.com/airbytehq/airbyte/issues/53227)
   - Another user reported that sidecar containers for connector pods (e.g., `ce-...-check-...` with `connector-sidecar`) do not inherit proxy environment variables set via `.Values.global.env_vars` in the Helm chart, so those containers cannot reach the internet. [Sidecar proxy vars](https://github.com/airbytehq/airbyte/issues/56917)
   - There is also a closed issue where adding the Airbyte Helm repo via `helm repo add` failed with `403 Forbidden` when going through a proxy; in that case the proxy itself was returning 403. [Helm repo 403](https://github.com/airbytehq/airbyte/issues/50840)
2. Reverse proxy in front of Airbyte (e.g., nginx)
   - Airbyte can be secured by placing it behind a reverse proxy (like nginx) that handles SSL and access control. The docs provide an example nginx config that proxies HTTPS traffic to `http://127.0.0.1:8000` and optionally forwards cookies for Airbyte’s basic auth. [Security docs](https://docs.airbyte.com/platform/operating-airbyte/security#securing-airbyte-open-source)
   - Some users run nginx in front of Airbyte and have seen `502 Bad Gateway` until Airbyte containers are restarted; this was reported as fixed in later versions (around 0.40.23) via changes to the `airbyte/proxy` container restart behavior. [502 via nginx](https://discuss.airbyte.io/t/3239)
   - The `airbyte/proxy` container is the nginx-based proxy used in the standard Docker deployment; it can be configured with `BASIC_AUTH_USERNAME` and `BASIC_AUTH_PASSWORD` to protect the UI/API. [Docker compose proxy](https://discuss.airbyte.io/t/3748); [Upstream host config](https://discuss.airbyte.io/t/3203)
3. AI Agents “Authentication Proxy” and “File Storage Proxy” (Airbyte Cloud / AI Agents)
   - Airbyte’s AI Agents include a Proxy feature (in alpha) that lets you send authenticated HTTP requests to external APIs using a configured Source as the auth context. You call it via endpoints like:
     - `POST /api/v1/sonar/apis/{SOURCE_ID}/request` for arbitrary API calls (e.g., Stripe).
     - `GET /api/v1/sonar/files/{SOURCE_ID}/list/...` and `/get/...` for file storage (e.g., S3). [AI Agents proxy](https://docs.airbyte.com/ai-agents#proxy-requests)
4. Connectors that are themselves “proxy” connectors
   - There is at least one connector described as a “Proxy connector” (e.g., for Go High Level) that forwards through a paid proxy service; this is specific to that integration and not the platform-level network proxy. [High Level source](https://docs.airbyte.com/integrations/sources/high-level)

If you can clarify whether you’re asking about:
- an outbound HTTP/HTTPS proxy for Airbyte components/connectors,
- a reverse proxy (nginx) in front of Airbyte,
- the AI Agents proxy APIs, or
- a specific “proxy” connector,
I can give more concrete, step-by-step guidance based strictly on the available docs and issues.
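For the first case (outbound proxy via the Helm chart), a minimal sketch of a values file using the `global.env_vars` mechanism referenced above. The key path and the proxy host are assumptions; verify them against your chart version, and note the sidecar issue above means job pods may still not inherit these:

```shell
# Assumption: the chart forwards global.env_vars to platform pods. Apply with:
#   helm upgrade --install airbyte airbyte/airbyte -f proxy-values.yaml
cat > proxy-values.yaml <<'EOF'
global:
  env_vars:
    HTTP_PROXY: http://proxy.example.com:3128
    HTTPS_PROXY: http://proxy.example.com:3128
    NO_PROXY: localhost,127.0.0.1,.svc,.cluster.local
EOF
cat proxy-values.yaml
```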

    H1ROME

    12/04/2025, 11:29 PM
How can I set up an outbound HTTP/HTTPS proxy for Airbyte via environment variables, for both platform components and connectors?

    kapa.ai

    12/04/2025, 11:38 PM
Based on the provided sources, there is no documented way to configure outbound proxy variables specifically for abctl-based deployments. What we do know:

- abctl wraps a Helm install of Airbyte into a local kind-based Kubernetes cluster and exposes only a limited set of flags (`--values`, `--secret`, etc.) for customization; there is no mention of proxy-related options in the abctl reference. [abctl reference 1.8](https://docs.airbyte.com/platform/1.8/deploying-airbyte/abctl#reference) [abctl reference next](https://docs.airbyte.com/platform/next/deploying-airbyte/abctl#reference)
- Proxy configuration issues are discussed for Helm-based installs (e.g., setting `global.env_vars` or `JAVA_TOOL_OPTIONS`) and for job pods that don’t inherit those env vars, but these discussions are specifically about the Helm chart, not abctl. [Helm proxy issue](https://github.com/airbytehq/airbyte/issues/53227) [Sidecar proxy issue](https://github.com/airbytehq/airbyte/issues/56917)
- For network-restricted environments, one user notes that with abctl you may need a system-wide proxy so that all components (including the internal Kubernetes controller) can access required images, but no concrete abctl flags or patterns are documented. [bootloader/proxy comment](https://github.com/airbytehq/airbyte/issues/58086)

So, from the available documentation and issues:

- There is no explicit support or example for setting outbound proxy variables via abctl flags or values.
- Workarounds like system-wide proxy configuration are mentioned but not formally documented for abctl.

Because of that, I cannot say how difficult it is in practice, only that there is currently no clearly documented, first-class mechanism for outbound proxy configuration when deploying with abctl in the provided sources.

    H1ROME

    12/04/2025, 11:39 PM
    "Is it similarly difficult at present to define outbound proxy variables when deploying with 'abctl'?"

    H1ROME

    12/05/2025, 8:13 AM
How can I set CPU requests and limits in the values.yaml?
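A minimal sketch of what such a values file could look like. The key path (`global.jobs.resources` here) and the specific numbers are assumptions; check them against the values reference for your Helm chart version before applying:

```shell
# Assumption: job-pod resources live under global.jobs.resources in this chart.
# Apply with: helm upgrade --install airbyte airbyte/airbyte -f resources-values.yaml
cat > resources-values.yaml <<'EOF'
global:
  jobs:
    resources:
      requests:
        cpu: 250m
        memory: 512Mi
      limits:
        cpu: "1"
        memory: 1Gi
EOF
cat resources-values.yaml
```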

    James Booth

    12/05/2025, 9:47 AM
    Now I’m getting a stack overflow on my replication job, but I also checked and it looks like my Connection state is empty? Is that error happening because it’s trying to validate an empty connection state, or is that expected in the UI? Or is the connection state dependent on the source? I think the connection state is a red herring as I created a new sync and it’s empty, so it’s not from the upgrade.
```
replication-job-12168-attempt-0 orchestrator 2025-12-05 09:16:13,896 [Notification Thread]      DEBUG   i.a.m.c.AirbyteMetricMeterFilter(accept):48 - Resolved metric ID MeterId{name='airbyte.jvm.gc.pause', tags=[tag(action=end of minor GC),tag(cause=Allocation Failure),tag(gc=Copy)]} to null
replication-job-12168-attempt-0 orchestrator 2025-12-05 09:16:13,901 [main]     ERROR   i.m.r.Micronaut(handleStartupException):362 - Error starting Micronaut server: Error instantiating bean of type  [io.airbyte.container.orchestrator.worker.RecordSchemaValidator]
replication-job-12168-attempt-0 orchestrator
replication-job-12168-attempt-0 orchestrator Path Taken:
replication-job-12168-attempt-0 orchestrator new @j.i.Singleton i.a.c.o.EventListeners(RecordSchemaValidator recordSchemaValidator)
replication-job-12168-attempt-0 orchestrator \---> new @j.i.Singleton i.a.c.o.EventListeners([RecordSchemaValidator recordSchemaValidator])
replication-job-12168-attempt-0 orchestrator       \---> @j.i.Singleton i.a.c.o.w.RecordSchemaValidator i.a.c.o.c.OrchestratorBeanFactory.recordSchemaValidator#recordSchemaValidator(@Named("jsonSchemaValidator") JsonSchemaValidator jsonSchemaValidator, @Named("schemaValidationExecutorService") ExecutorService schemaValidationExecutorService, [@Named("streamNamesToSchemas") Map<AirbyteStreamNameNamespacePair, JsonNode> streamNamesToSchemas])
replication-job-12168-attempt-0 orchestrator io.micronaut.context.exceptions.BeanInstantiationException: Error instantiating bean of type  [io.airbyte.container.orchestrator.worker.RecordSchemaValidator]
replication-job-12168-attempt-0 orchestrator
replication-job-12168-attempt-0 orchestrator Path Taken:
replication-job-12168-attempt-0 orchestrator new @j.i.Singleton i.a.c.o.EventListeners(RecordSchemaValidator recordSchemaValidator)
replication-job-12168-attempt-0 orchestrator \---> new @j.i.Singleton i.a.c.o.EventListeners([RecordSchemaValidator recordSchemaValidator])
replication-job-12168-attempt-0 orchestrator       \---> @j.i.Singleton i.a.c.o.w.RecordSchemaValidator i.a.c.o.c.OrchestratorBeanFactory.recordSchemaValidator#recordSchemaValidator(@Named("jsonSchemaValidator") JsonSchemaValidator jsonSchemaValidator, @Named("schemaValidationExecutorService") ExecutorService schemaValidationExecutorService, [@Named("streamNamesToSchemas") Map<AirbyteStreamNameNamespacePair, JsonNode> streamNamesToSchemas])
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.DefaultBeanContext.resolveByBeanFactory(DefaultBeanContext.java:2352)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.DefaultBeanContext.createRegistration(DefaultBeanContext.java:3150)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.SingletonScope.getOrCreate(SingletonScope.java:80)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.DefaultBeanContext.resolveBeanRegistration(DefaultBeanContext.java:3000)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.DefaultBeanContext.resolveBeanRegistration(DefaultBeanContext.java:2762)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.DefaultBeanContext.getBean(DefaultBeanContext.java:1781)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.AbstractBeanResolutionContext.getBean(AbstractBeanResolutionContext.java:210)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.AbstractInitializableBeanDefinition.resolveBean(AbstractInitializableBeanDefinition.java:2130)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.AbstractInitializableBeanDefinition.getBeanForConstructorArgument(AbstractInitializableBeanDefinition.java:1360)
replication-job-12168-attempt-0 orchestrator    at io.airbyte.container.orchestrator.config.$OrchestratorBeanFactory$RecordSchemaValidator4$Definition.instantiate(Unknown Source)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.DefaultBeanContext.resolveByBeanFactory(DefaultBeanContext.java:2337)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.DefaultBeanContext.createRegistration(DefaultBeanContext.java:3150)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.SingletonScope.getOrCreate(SingletonScope.java:80)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.DefaultBeanContext.resolveBeanRegistration(DefaultBeanContext.java:3000)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.DefaultBeanContext.resolveBeanRegistration(DefaultBeanContext.java:2762)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.DefaultBeanContext.findBean(DefaultBeanContext.java:1844)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.AbstractBeanResolutionContext.findBean(AbstractBeanResolutionContext.java:286)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.AbstractInitializableBeanDefinition.resolveBean(AbstractInitializableBeanDefinition.java:2127)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.AbstractInitializableBeanDefinition.getBeanForConstructorArgument(AbstractInitializableBeanDefinition.java:1360)
replication-job-12168-attempt-0 orchestrator    at io.airbyte.container.orchestrator.$EventListeners$Definition.instantiate(Unknown Source)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.DefaultBeanContext.resolveByBeanFactory(DefaultBeanContext.java:2337)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.DefaultBeanContext.createRegistration(DefaultBeanContext.java:3150)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.SingletonScope.getOrCreate(SingletonScope.java:80)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.DefaultBeanContext.resolveBeanRegistration(DefaultBeanContext.java:3000)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.DefaultBeanContext.resolveBeanRegistration(DefaultBeanContext.java:2762)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.DefaultBeanContext.getBean(DefaultBeanContext.java:1781)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.DefaultBeanContext$BeanExecutionHandle.getTarget(DefaultBeanContext.java:3963)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.DefaultBeanContext$BeanExecutionHandle.invoke(DefaultBeanContext.java:3985)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.aop.chain.AdapterIntroduction.intercept(AdapterIntroduction.java:91)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.aop.chain.MethodInterceptorChain.proceed(MethodInterceptorChain.java:143)
replication-job-12168-attempt-0 orchestrator    at io.airbyte.container.orchestrator.EventListeners$ApplicationEventListener$initializeValidator1$Intercepted.onApplicationEvent(Unknown Source)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.event.ApplicationEventPublisherFactory.notifyEventListeners(ApplicationEventPublisherFactory.java:266)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.event.ApplicationEventPublisherFactory$2.publishEvent(ApplicationEventPublisherFactory.java:226)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.http.server.netty.discovery.NettyServiceDiscovery.onStart(NettyServiceDiscovery.java:54)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.http.server.netty.discovery.$NettyServiceDiscovery$Definition$Exec.dispatch(Unknown Source)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.AbstractExecutableMethodsDefinition$DispatchedExecutableMethod.invoke(AbstractExecutableMethodsDefinition.java:456)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.DefaultBeanContext$BeanExecutionHandle.invoke(DefaultBeanContext.java:3987)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.aop.chain.AdapterIntroduction.intercept(AdapterIntroduction.java:91)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.aop.chain.MethodInterceptorChain.proceed(MethodInterceptorChain.java:143)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.http.server.netty.discovery.NettyServiceDiscovery$ApplicationEventListener$onStart1$Intercepted.onApplicationEvent(Unknown Source)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.event.ApplicationEventPublisherFactory.notifyEventListeners(ApplicationEventPublisherFactory.java:266)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.context.event.ApplicationEventPublisherFactory$2.publishEvent(ApplicationEventPublisherFactory.java:226)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.http.server.netty.NettyHttpServer.fireStartupEvents(NettyHttpServer.java:720)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.http.server.netty.NettyHttpServer.start(NettyHttpServer.java:330)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.http.server.netty.NettyHttpServer.start(NettyHttpServer.java:124)
replication-job-12168-attempt-0 orchestrator    at io.micronaut.runtime.Micronaut.start(Micronaut.java:81)
replication-job-12168-attempt-0 orchestrator    at io.airbyte.container.orchestrator.ApplicationKt.main(Application.kt:29)
replication-job-12168-attempt-0 orchestrator Caused by: java.lang.StackOverflowError: null
```

    Fabrizio Spini

    12/05/2025, 10:35 AM
Ciao everyone, could anyone direct me to where I can report an error in the documentation? On this page https://docs.airbyte.com/platform/operator-guides/configuring-connector-resources#configuring-instance-wide-requirements it says:

> Connect to the database and run the following query with the image name replaced to find the connector definition id.

```
select * from actor_definition where actor_definition.docker_repository like '%<image-name>';
```

but the field `docker_repository` doesn't exist on `actor_definition` in Airbyte version 2.0.
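To see what columns `actor_definition` actually has on a given Airbyte version, one sketch is to query `information_schema` (standard Postgres, so it should work regardless of schema changes). The connection string is a placeholder, so the `psql` call is left commented out and only the query itself is printed:

```shell
# The query uses only standard information_schema; the DSN is a placeholder.
SQL="select column_name from information_schema.columns
     where table_name = 'actor_definition' order by column_name;"
echo "$SQL"
# psql "$AIRBYTE_DATABASE_URL" -c "$SQL"
```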

    Slackbot

    12/05/2025, 10:40 AM
    This message was deleted.

    Bogdan

    12/05/2025, 10:55 AM
Hello everyone, please help us set up parallelism in Airbyte. We have 3 worker container replicas and max_sync_worker = 10, but we can't run multiple syncs simultaneously: they start, but each waits for the previous one to finish before proceeding. We are using Airbyte through abctl.

    H1ROME

    12/05/2025, 11:29 AM
I am currently running Airbyte and Airflow on separate machines. I set up Airbyte using abctl (Airbyte Control Tool), and Airflow runs on Docker. I access the Airbyte UI from my client PC via port forwarding. I am trying to create an Airbyte connection in Airflow and trigger an Airbyte job using a DAG, but it fails with an error. What connection settings do I need to configure on the Airflow side?

    Matthias Grawinkel

    12/05/2025, 3:56 PM
Hello everyone, I am trying to set up the https://docs.airbyte.com/integrations/sources/sendgrid integration, but something feels off. It's as if the documentation describes a completely different service from the actual product in front of me. For example, it says to create a SendGrid API key with the following permissions:
• Read-only access to all resources
• Full access to marketing resources
When I log in to SendGrid, these options are not available during API-key creation. After some testing, I gave up and created a Full Access key in SendGrid. Adding it to the Airbyte source, I can test and save the config; that looks good so far. But running the connector I see `Failure in source: HTTP Status Code: 403. Error: Forbidden. You don't have permission to access this resource.` The source was able to list all streams, but I can't sync any of them; all result in 403. Does anyone know what has changed here or how this could work? Many thanks in advance!

    Vu Nguyen

    12/05/2025, 7:28 PM
Hi, I'm getting an Access denied error for the https://api.airbyte.com/v1/sources/discover_schema API. I'm not sure whether I need to do anything additional to enable the API?

    Kexun Liu

    12/05/2025, 8:25 PM
    Hi, I’m getting Authentication error – Failed to send email when trying to log into Airbyte Cloud with my email address. Could someone from the team help check?