# ask-ai
  • s

    s

    05/07/2025, 1:07 PM
    How many parallel syncs does Airbyte Cloud support?
    k
    • 2
    • 1
  • t

    Timmy Walters

    05/07/2025, 1:22 PM
    @kapa.ai I just had to manually reactivate a bunch of source/destination actors and connections that were all soft-deleted (tombstone = true for actor, status = 'deprecated' for connection), even though we haven't done an upgrade recently. This caused the UI/API to show no workspaces, connections, or sources, and threw errors like:
    TypeError: Cannot read properties of undefined (reading 'workspaceId')
    and:
    Failed to get organization id for source id: ...
    Any idea why this might've happened? Was there a script or cleanup job that may have flagged everything as deleted? Just trying to understand so we can prevent it from happening again. Thanks!
    k
    • 2
    • 1
  • s

    Sirine Hdiji

    05/07/2025, 1:31 PM
    Hi everyone! I'm working on a declarative stream using YAML, and I'm trying to use DatetimeStreamSlicer to generate monthly date slices between a start_date and an end_date defined in the config: 202501, 202502... My goal is to pass the date from each slice (in the format YYYYMM) as a required query parameter in the API request. Here's a simplified version of my YAML configuration:
    stream_slicer:
      type: DatetimeStreamSlicer
      cursor_field: date
      start_datetime: "{{ config['start_date'] }}"
      end_datetime: "{{ config['end_date'] }}"
      step: "1M"
      datetime_format: "%Y%m"
      cursor_granularity: "P1M"

    retriever:
      type: SimpleRetriever
      requester:
        request_parameters:
          date: "{{ stream_slice['date'] }}"
    However, in the actual requests, the date parameter is missing, and the API returns a 400 error: "must have required property 'date'". Is DatetimeStreamSlicer still supported, or has it been deprecated in favor of another approach? Also, what are the best practices for passing dynamic query parameters? Any tips or examples would be appreciated! Thanks a lot 🙏
    k
    • 2
    • 2
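A possible direction, assuming a recent version of the declarative (low-code) CDK: DatetimeStreamSlicer was superseded by DatetimeBasedCursor, the slice boundaries are exposed as start_time / end_time rather than under the cursor field name, and query-parameter injection is handled by start_time_option. A sketch only, mirroring the field values above; not a verified configuration:

```yaml
incremental_sync:
  type: DatetimeBasedCursor
  cursor_field: date
  datetime_format: "%Y%m"
  cursor_granularity: P1M
  step: P1M
  start_datetime: "{{ config['start_date'] }}"
  end_datetime: "{{ config['end_date'] }}"
  start_time_option:
    type: RequestOption
    inject_into: request_parameter
    field_name: date
```

With this shape the cursor injects the parameter itself, so a manual request_parameters entry is not needed; if the slice value is referenced directly in a template, the key would be stream_slice['start_time'], not stream_slice['date'].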
  • t

    Théo

    05/07/2025, 1:44 PM
    @kapa.ai is there any way I can enable DEBUG log mode in replication-job-14-attempt-0 pods?
    k
    • 2
    • 23
  • h

    Hari Haran R

    05/07/2025, 1:54 PM
    @kapa.ai I'm trying to use the YouTube Analytics connector, but it seems there is a sync problem with the connector and there is no updated version.
    k
    • 2
    • 4
  • h

    Hari Haran R

    05/07/2025, 2:01 PM
    @kapa.ai I'm using the YouTube Analytics Business connector, but there seems to be no newer version (there is only 0.1.0), and the sync is also not getting updated.
    k
    • 2
    • 1
  • s

    Sree Shanthan Kuthuru

    05/07/2025, 2:20 PM
    @kapa.ai I am using Airbyte v1.6.1 and syncs are failing with the error below. While this is the error from the most recent sync, the previous sync succeeded, and the sync before that failed with the same error:
    message='Airbyte could not start the process within time limit. The workload was claimed but never started.', type='io.airbyte.workers.exception.WorkloadMonitorException', nonRetryable=false
    k
    • 2
    • 4
  • c

    Charles Bockelmann

    05/07/2025, 2:29 PM
    How can I set a specific password for Airbyte OSS?
    k
    • 2
    • 1
  • m

    Moe Hein Aung

    05/07/2025, 2:47 PM
    @kapa.ai I started seeing this error on my Airbyte server hosted on K8s (EKS):
    Workload failed, source: workload-monitor-heartbeat Airbyte could not track the sync progress. Sync process exited without reporting status
    From researching online, this seems to happen when the storage for MinIO logs runs out of space. As suggested on a GitHub issue, I set about deleting and re-creating the MinIO pod and the PVs attached to it as a fix. I started by running the following:
    kubectl get statefulset -n airbyte
    kubectl get pvc -n airbyte
    kubectl get pv -n airbyte
    Then deleted them all successfully:
    kubectl delete statefulset airbyte-minio -n airbyte
    kubectl delete pvc airbyte-minio-pv-claim-airbyte-minio-0 -n airbyte
    kubectl delete pv pvc-26a02143-f688-4674-a49f-1335e8c74cca
    In the process I also ran helm repo update to move from 1.5.1 to 1.6.1, then tried helm upgrade:
    helm upgrade --install airbyte airbyte/airbyte --version 1.6.1 -n airbyte -f values.yaml --debug
    However, it gets stuck with Pod airbyte-minio-create-bucket running, and the minio pod does not get created. This is part of my Helm values:
    # for v1.6.1
    logs:
      storage:
        type: minio
    
      minio:                 
        enabled: true
        persistence:
          enabled: true
          size: 20Gi
          storageClass: gp2
    
    # old for v1.5.1 commented out
    # minio:
    #   persistence:
    #     enabled: true
    #     size: 20Gi
    #     storageClass: "gp2"
    k
    • 2
    • 4
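One thing worth checking, offered as an assumption rather than a confirmed fix: in recent chart versions the storage backend is selected under global.storage rather than logs.storage, so a values fragment along these lines may be what the 1.6.1 chart expects (key names should be verified against helm show values airbyte/airbyte for the installed chart version):

```yaml
global:
  storage:
    type: minio
```

The logs of the stuck airbyte-minio-create-bucket pod (kubectl logs -n airbyte <pod-name>) should also show which endpoint or credential it is waiting on.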
  • c

    Chris

    05/07/2025, 2:48 PM
    @kapa.ai I am setting up the GA4 source on Airbyte Open Source, using a service account. Do I need to add the service account to the Google Analytics account, and if so, what permission/role does it need?
    k
    • 2
    • 4
  • g

    Gary James

    05/07/2025, 3:07 PM
    what's the expected Airbyte API URL to use for the Terraform provider when using Airbyte Teams?
    k
    • 2
    • 2
  • c

    Chris Brunton

    05/07/2025, 3:22 PM
    @kapa.ai Can I adjust the MAX_SYNC_WORKERS setting if I am using the Open Source version of Airbyte, version 1.6, on an AWS EC2 instance?
    k
    • 2
    • 7
  • n

    Nicolas Scholten

    05/07/2025, 4:03 PM
    @kapa.ai I can't connect to my Airbyte account to access Airbyte Cloud; I'm getting the following error in Chrome:
    TypeError: Cannot read properties of undefined (reading 'email')
    k
    • 2
    • 1
  • c

    Caleb NXTLI

    05/07/2025, 4:45 PM
    Is Meta Graph API supported by Airbyte?
    k
    • 2
    • 2
  • o

    Omree Gal-Oz

    05/07/2025, 5:58 PM
    @kapa.ai The Airbyte server is looking up an auth user with an ID that is our only dataplane ID, but it fails to find that ID in the auth_user table. Why is it searching for a dataplane ID as an auth user, and why can't it find it in auth_user?
    k
    • 2
    • 1
  • c

    Chris

    05/07/2025, 9:26 PM
    @kapa.ai What is the highest Facebook Marketing connector version I can use with Airbyte 0.50.xx?
    k
    • 2
    • 4
  • m

    Michael Johnsey

    05/07/2025, 10:13 PM
    @kapa.ai we're using the hosted version of Airbyte, and it seems like the new version of the HubSpot connector is throwing an error:
    ValueError: No format in ['%ms', '%ms'] matching
    but it's not clear to me which stream is causing the issue. Is there any way to figure that out?
    k
    • 2
    • 1
  • b

    Brandon Rickman

    05/08/2025, 12:29 AM
    I want to configure my asset so that there's one version that runs regularly against production data sources, and one that runs regularly against stage data sources. What's the best way to do this?
    k
    • 2
    • 1
  • s

    Sean Stach

    05/08/2025, 2:15 AM
    My API returns a CSV, how can this be handled?
    k
    e
    • 3
    • 2
  • l

    Lakshmipathy

    05/08/2025, 5:00 AM
    While creating a custom builder using the API, I am getting an error like: Internal Server Error: com.fasterxml.jackson.databind.JsonMappingException: String value length (20046850) exceeds the maximum allowed (20000000, from StreamReadConstraints.getMaxStringLength()) (through reference chain: io.airbyte.connectorbuilderserver.api.client.model.generated.StreamRead["slices"]->java.util.ArrayList[0]->io.airbyte.connectorbuilderserver.api.client.model.generated.StreamReadSlicesInner["pages"]->java.util.ArrayList[0]->io.airbyte.connectorbuilderserver.api.client.model.generated.StreamReadSlicesInnerPagesInner["response"]->io.airbyte.connectorbuilderserver.api.client.model.generated.HttpResponse["body"]). Can you please suggest how to fix this issue?
    k
    • 2
    • 2
  • l

    Lakshmipathy

    05/08/2025, 5:19 AM
    How do I increase the maximum allowed string length using MICRONAUT_SERVER_MAX_REQUEST_SIZE? Can you give the steps?
    k
    n
    • 3
    • 10
  • j

    Julian Andersen

    05/08/2025, 7:25 AM
    @kapa.ai getting this error with the facebook authentication: The domain of this URL isn't included in the app's domains. To be able to load this URL, add all domains and sub-domains of your app to the App Domains field in your app settings.
    k
    • 2
    • 7
  • v

    Vasily Safronov

    05/08/2025, 8:02 AM
    Where does Airbyte store CDC state?
    k
    • 2
    • 1
  • s

    Syed Hamza Raza Kazmi

    05/08/2025, 8:27 AM
    @kapa.ai, how does Airbyte replicate views?
    k
    • 2
    • 1
  • f

    Fasih Ullah

    05/08/2025, 8:35 AM
    @kapa.ai when using the BigQuery source, timestamp columns are read as strings in Airbyte. Can they still be used as a cursor field?
    k
    • 2
    • 1
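Whether a string-typed timestamp can serve as a cursor generally comes down to whether the string format sorts in chronological order, since comparisons on string cursors are lexicographic. ISO-8601-style values do; formats like MM/DD/YYYY do not. A quick illustration in plain Python (not Airbyte code):

```python
# ISO-8601-style strings sort lexicographically in chronological order,
# so they can work as a string cursor.
iso = ["2024-01-10T00:00:00Z", "2023-12-31T23:59:59Z", "2024-01-02T12:00:00Z"]
assert sorted(iso) == [
    "2023-12-31T23:59:59Z",
    "2024-01-02T12:00:00Z",
    "2024-01-10T00:00:00Z",
]

# MM/DD/YYYY strings do not: "01/10/2024" sorts before "12/31/2023"
# even though it is chronologically later.
us = ["01/10/2024", "12/31/2023"]
assert sorted(us) == ["01/10/2024", "12/31/2023"]
```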
  • n

    Namratha D

    05/08/2025, 9:27 AM
    @kapa.ai I have installed Airbyte using abctl with a values.yaml file, but the limit is still not increased even after adding:
    global:
      env_vars:
        MICRONAUT_SERVER_MAX_REQUEST_SIZE: "104857600"  # For HTTP request size
        JACKSON_DATABIND_MAX_STRING_LENGTH: "30000000"  # Increase Jackson string length limit
    and I have reinstalled Airbyte. Are there any other approaches to solve this issue?
    k
    • 2
    • 1
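If global.env_vars is not being picked up, one fallback worth trying is setting the variable per component via extraEnv. The component names and extraEnv support below are assumptions to verify against helm show values for the chart version in use:

```yaml
server:
  extraEnv:
    - name: MICRONAUT_SERVER_MAX_REQUEST_SIZE
      value: "104857600"
connector-builder-server:
  extraEnv:
    - name: MICRONAUT_SERVER_MAX_REQUEST_SIZE
      value: "104857600"
```

After changing values, the pods must be recreated; kubectl exec into the pod and running env confirms whether the variable actually reached the container.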
  • l

    Lakshmipathy

    05/08/2025, 10:44 AM
    Is it possible to pass the required columns in the custom builder? If possible, please provide the steps for how to pass them.
    k
    • 2
    • 1
  • k

    kanzari soumaya

    05/08/2025, 10:49 AM
    Hi, I'm trying to connect to a self-managed MongoDB replica set source. I created two nodes at these addresses: mongodb://localhost:27080,localhost:27081/?replicaSet=rs0&readPreference=primary. I have my database imported in MongoDB Compass, with collections, but when I try to add the source in Airbyte, I get this error message:
    Configuration check failed: Timed out after 30000 ms while waiting for a server that matches ReadPreferenceServerSelector{readPreference=primary}. Client view of cluster state is {type=REPLICA_SET, servers=[{address=localhost:27081, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.ConnectException: Connection refused}}, {address=localhost:27080, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.ConnectException: Connection refused}}]
    at com.mongodb.client.internal.MongoDatabaseImpl.executeCommand(MongoDatabaseImpl.java:196)
    at com.mongodb.client.internal.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:165)
    at com.mongodb.client.internal.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:160)
    at com.mongodb.client.internal.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:150)
    at io.airbyte.integrations.source.mongodb.MongoUtil.getAuthorizedCollections(MongoUtil.java:94)
    at io.airbyte.integrations.source.mongodb.MongoDbSource.check(MongoDbSource.java:68)
    at io.airbyte.cdk.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.kt:166)
    at io.airbyte.cdk.integrations.base.IntegrationRunner.run(IntegrationRunner.kt:119)
    at io.airbyte.cdk.integrations.base.IntegrationRunner.run$default(IntegrationRunner.kt:113)
    at io.airbyte.cdk.integrations.base.IntegrationRunner.run(IntegrationRunner.kt)
    at io.airbyte.integrations.source.mongodb.MongoDbSource.main(MongoDbSource.java:53)
    The check then finishes in the logs:
    2025-05-08 124351 INFO main i.a.c.i.b.IntegrationRunner(runInternal):224 Completed integration: io.airbyte.integrations.source.mongodb.MongoDbSource
    2025-05-08 124351 INFO main i.a.i.s.m.MongoDbSource(main):54 completed source: class io.airbyte.integrations.source.mongodb.MongoDbSource
    2025-05-08 124351 INFO Checking for optional control message...
    2025-05-08 124351 INFO Optional control message not found. Skipping...
    2025-05-08 124351 INFO Writing output of b2e713cd-cc36-4c0a-b5bd-b47cb8a0561e_caf84f8f-3c0c-4d15-9d01-7a4c3d13b470_0_check to the doc store
    2025-05-08 124351 INFO Marking workload b2e713cd-cc36-4c0a-b5bd-b47cb8a0561e_caf84f8f-3c0c-4d15-9d01-7a4c3d13b470_0_check as successful
    2025-05-08 124351 INFO Deliberately exiting process with code 0.
    2025-05-08 124351 INFO ----- END CHECK -----
    Can you please help me figure out how to manage this? Thank you
    k
    • 2
    • 7
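The repeated "Connection refused" for both members points at reachability rather than MongoDB configuration: when Airbyte runs inside a container, localhost refers to that container itself, not to the machine where mongod listens, so localhost:27080/27081 in the connection string (and in the member addresses the replica set advertises) cannot be reached. A small stdlib sketch of the same TCP check the driver performs first (the helper name is made up):

```python
import socket

def reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    # True if a plain TCP connection to host:port succeeds;
    # this is the first step the Mongo driver's socket open performs.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Demo against a listener we control (stands in for a reachable mongod):
member = socket.socket()
member.bind(("127.0.0.1", 0))  # ephemeral port
member.listen(1)
_, port = member.getsockname()

print(reachable("127.0.0.1", port))  # True: member reachable
member.close()
print(reachable("127.0.0.1", port))  # False: connection refused, as in the error
```

Running such a check from inside the Airbyte pod/container (or simply replacing localhost with the host's routable address or hostname in the connection string and in the replica set configuration) is usually how this class of error is narrowed down.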
  • i

    Induprakash Gowreesan

    05/08/2025, 11:12 AM
    Regarding this description: "The connector configuration. The object's keys are the same as the keys defined in the connection specification. Example: { "start_date": "2010-01-01T000000.000Z", "api_key": "*****" }". How do I add a field using config? I have added a new field called base_url (path base_url, value {{ config['base_url'] }}) in the transformation section, but I am not able to see it in the record. What exactly does config do?
    k
    • 2
    • 1
  • h

    hiwot tadese

    05/08/2025, 11:42 AM
    How can I get the token URL, client ID, and client secret on Airbyte version 0.39 alpha?
    k
    • 2
    • 1