# ask-ai
  • Henrik Rasmussen

    09/11/2024, 8:06 AM
    When setting up Airbyte to use an external database for configuration, what permissions should the user have?
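
    A minimal sketch of one common setup, assuming an external PostgreSQL instance and illustrative names (airbyte_user, airbyte). Airbyte runs its own schema migrations on startup, so the user needs full DDL and DML rights on its own database:

      psql -U postgres -c "CREATE USER airbyte_user WITH PASSWORD 'changeme';"
      psql -U postgres -c "CREATE DATABASE airbyte OWNER airbyte_user;"
      # On PostgreSQL 15+, CREATE on the public schema is no longer granted by default:
      psql -U postgres -d airbyte -c "GRANT ALL ON SCHEMA public TO airbyte_user;"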
  • ed

    09/11/2024, 8:15 AM
    I cannot replicate from a Postgres read-only replica because I get this error:

      Caused by: org.postgresql.util.PSQLException: ERROR: recovery is in progress

    but the replica will always be in that state, since it's streaming replication.
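
    For context, a streaming replica reports itself as permanently in recovery, which is what the connector trips over; you can confirm this from the replica (host name illustrative):

      psql -h replica-host -U postgres -c "SELECT pg_is_in_recovery();"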
  • Georgina Walker

    09/11/2024, 8:19 AM
    I want to set up a connection to a new destination that will test the changes that will occur upon upgrading our destinations. Is there a way I can partially sync the source data, so that not all the data is loaded into the destination, to save time? E.g. if I only wanted x number of rows, or data from a certain date.
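
    One workaround (not a built-in Airbyte feature; all names illustrative) is to expose a trimmed view in the source database and select only that view in the connection's stream list:

      psql -d sourcedb -c "CREATE VIEW public.orders_sample AS
        SELECT * FROM public.orders
        WHERE created_at >= '2024-08-01'
        LIMIT 1000;"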
  • Thomas Clavet

    09/11/2024, 8:21 AM
    @kapa.ai Using Helm chart version 0.442.3, how can we increase the max number of syncs allowed by the cluster?
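
    A sketch of one approach, assuming the worker still reads the MAX_SYNC_WORKERS environment variable and the chart exposes worker.extraEnv (verify both against the values.yaml of your chart version):

      helm upgrade airbyte airbyte/airbyte --version 0.442.3 \
        --set 'worker.extraEnv[0].name=MAX_SYNC_WORKERS' \
        --set-string 'worker.extraEnv[0].value=10'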
  • Oisin McKnight

    09/11/2024, 8:31 AM
    Hello, I want to update my Airbyte instance that is hosted on a GCP VM instance. How do I do this?
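
    A sketch assuming the instance was installed with abctl (a Docker Compose install upgrades differently): update abctl itself, then re-run the installer, which pulls the newer chart and images while keeping existing data:

      curl -LsfS https://get.airbyte.com | bash -
      abctl local install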
  • Christophe Di Prima

    09/11/2024, 9:22 AM
    I am testing Airbyte locally and wanted to try the API to launch some jobs. To do so, I guess I have to create a new application to avoid CORS issues. The thing is that I cannot find any "Create an application" button!
  • Thomas

    09/11/2024, 9:43 AM
    I'm creating workspaces through the API (Powered by Airbyte use case) and I get 500 errors from the API. It used to work perfectly. I tried running the call from the API Reference Guide (https://reference.airbyte.com/reference/createworkspace) and I still get the error:

      Internal Server Error: Client 'http://airbyte-cloud-server-svc:80': Connect Error: channel not registered to an event loop

    I wonder if there's a problem on Airbyte's end with this endpoint.
  • Pablo Martín

    09/11/2024, 10:37 AM
    I am trying to deploy on an AWS EC2 instance:
    [ec2-user@ip-172-31-7-104 ~]$ abctl local install
      INFO    Using Kubernetes provider:
                Provider: kind
                Kubeconfig: /home/ec2-user/.airbyte/abctl/abctl.kubeconfig
                Context: kind-airbyte-abctl
     SUCCESS  Found Docker installation: version 25.0.6
     SUCCESS  Existing cluster 'airbyte-abctl' found
     SUCCESS  Cluster 'airbyte-abctl' validation complete
      INFO    Namespace 'airbyte-abctl' already exists
      INFO    Persistent volume 'airbyte-minio-pv' already exists
      INFO    Persistent volume 'airbyte-volume-db' already exists
      INFO    Persistent volume claim 'airbyte-minio-pv-claim-airbyte-minio-0' already exists
      INFO    Persistent volume claim 'airbyte-volume-db-airbyte-db-0' already exists
      INFO    Starting Helm Chart installation of 'airbyte/airbyte' (version: 0.551.0)
      ERROR   Failed to install airbyte/airbyte Helm Chart
      ERROR   Unable to install Airbyte locally
      ERROR   unable to install airbyte chart: unable to install helm: post-upgrade hooks failed: 1 error occurred:
                    * timed out waiting for the condition
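
    The post-upgrade hook timeout is generic; the underlying failure is usually visible in the pods. A debugging sketch using the kubeconfig abctl writes (pod names vary, and undersized instances are a common root cause):

      kubectl --kubeconfig ~/.airbyte/abctl/abctl.kubeconfig get pods -n airbyte-abctl
      kubectl --kubeconfig ~/.airbyte/abctl/abctl.kubeconfig describe pod <pod-name> -n airbyte-abctl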
  • Hari Gudla

    09/11/2024, 10:56 AM
    @kapa.ai What do the metadata fields _airbyte_meta and _airbyte_generation_id in a Snowflake target tell you?
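
    Roughly: _airbyte_meta carries per-record sync metadata such as in-flight changes or errors, and _airbyte_generation_id tracks which refresh "generation" of the stream a record belongs to. A quick way to inspect them (database, schema, and table names illustrative):

      snowsql -q "SELECT _airbyte_raw_id, _airbyte_meta, _airbyte_generation_id FROM my_db.my_schema.users LIMIT 5;"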
  • Aditya Gupta

    09/11/2024, 11:14 AM
    @kapa.ai I want to set up a Postgres source in Airbyte where the Postgres DB is hosted locally. How do I do that?
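
    One common gotcha when Airbyte runs in Docker on the same machine: localhost inside the connector container refers to the container itself, so the source host usually needs to be host.docker.internal instead. A connectivity check from inside a container (credentials illustrative):

      # on Linux, also pass: --add-host=host.docker.internal:host-gateway
      docker run --rm -e PGPASSWORD=secret postgres:16 \
        psql -h host.docker.internal -p 5432 -U airbyte -d mydb -c "SELECT 1;"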
  • Guilherme de Souza da Silva

    09/11/2024, 12:04 PM
    @kapa.ai I recently updated to the latest version via Helm and performed all the necessary updates. However, for tables with 9 million rows, I only receive the following notification: "Warning from replication: Something went wrong during replication. Destination process is still alive, cannot retrieve exit value." The workload API logs only show it waiting for a thread to respond, and the job is then lost. No error logs are found, at least not explicitly.
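
    A debugging sketch, assuming a Helm install in the airbyte namespace (deployment and pod names vary by chart version); the destination container's own logs often hold the real failure:

      kubectl -n airbyte get pods | grep -E "orchestrator|destination"
      kubectl -n airbyte logs deploy/airbyte-workload-launcher --tail=100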
  • Slackbot

    09/11/2024, 12:46 PM
    This message was deleted.
  • Avishai Kdoshim

    09/11/2024, 12:58 PM
    @kapa.ai I pushed all of my logs to an AWS S3 bucket using Fluent Bit, and Airbyte was able to access them. Now I want to aggregate the logs and push the aggregated results to a PostgreSQL table. How can I achieve that?
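
    Airbyte itself moves records rather than aggregating them, so one sketch is: sync the S3 source into a raw Postgres table, then aggregate downstream (table and column names illustrative):

      psql -d warehouse -c "CREATE TABLE log_summary AS
        SELECT date_trunc('hour', event_time) AS hour, count(*) AS events
        FROM raw_logs
        GROUP BY 1;"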
  • ed

    09/11/2024, 1:10 PM
    Which Debezium version does Airbyte use?
  • Patricio Villanueva

    09/11/2024, 1:48 PM
    What is airbyte-helm-deployment-worker used for?
  • Jaydan Pratts

    09/11/2024, 1:57 PM
    @kapa.ai I am getting

      errors.http.internalServerError

    when using Add new connector -> Add using Docker Image in the web GUI. The Docker repository name is set up correctly, and the Docker image is on the same EC2 instance as Airbyte, with docker image ls showing that it is set up correctly.
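
    One thing to check if the instance was installed with abctl: images on the host's Docker daemon are not visible to the kind cluster Airbyte runs in, so the image has to be loaded into the cluster first (image name illustrative):

      kind load docker-image my-repo/my-connector:dev --name airbyte-abctl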
  • Luc Lagrange

    09/11/2024, 2:03 PM
    With the HubSpot connector, how are deleted deals tracked?
  • Peter Welte

    09/11/2024, 2:04 PM
    @kapa.ai Has anyone seen this error and identified a fix?

    ERROR pool-4-thread-1 i.a.c.i.d.a.FlushWorkers(flush$lambda$6):178 Flush Worker (12116) -- flush worker error: com.google.cloud.bigquery.BigQueryException: bigquery.googleapis.com
            at io.airbyte.integrations.destination.bigquery.operation.BigQueryDirectLoadingStorageOperation.initWriteChannel(BigQueryDirectLoadingStorageOperation.kt:107) ~[io.airbyte.airbyte-integrations.connectors-destination-bigquery.jar:?]
            at io.airbyte.integrations.destination.bigquery.operation.BigQueryDirectLoadingStorageOperation.writeToStage(BigQueryDirectLoadingStorageOperation.kt:69) ~[io.airbyte.airbyte-integrations.connectors-destination-bigquery.jar:?]
            at io.airbyte.integrations.destination.bigquery.operation.BigQueryDirectLoadingStorageOperation.writeToStage(BigQueryDirectLoadingStorageOperation.kt:30) ~[io.airbyte.airbyte-integrations.connectors-destination-bigquery.jar:?]
            at io.airbyte.integrations.base.destination.operation.StandardStreamOperation.writeRecordsImpl(StandardStreamOperation.kt:32) ~[airbyte-cdk-typing-deduping-0.44.19.jar:?]
            at io.airbyte.integrations.base.destination.operation.AbstractStreamOperation.writeRecords(AbstractStreamOperation.kt:320) ~[airbyte-cdk-typing-deduping-0.44.19.jar:?]
            at io.airbyte.integrations.base.destination.operation.DefaultSyncOperation.flushStream(DefaultSyncOperation.kt:107) ~[airbyte-cdk-typing-deduping-0.44.19.jar:?]
            at io.airbyte.integrations.base.destination.operation.DefaultFlush.flush(DefaultFlush.kt:18) ~[airbyte-cdk-typing-deduping-0.44.19.jar:?]
            at io.airbyte.cdk.integrations.destination.async.FlushWorkers.flush$lambda$6(FlushWorkers.kt:167) ~[airbyte-cdk-core-0.44.19.jar:?]
            at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:572) ~[?:?]
            at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) ~[?:?]
            at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
            at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
            at java.base/java.lang.Thread.run(Thread.java:1583) [?:?]
    Stack Trace: com.google.cloud.bigquery.BigQueryException: bigquery.googleapis.com
            at io.airbyte.integrations.destination.bigquery.operation.BigQueryDirectLoadingStorageOperation.initWriteChannel(BigQueryDirectLoadingStorageOperation.kt:107)
            at io.airbyte.integrations.destination.bigquery.operation.BigQueryDirectLoadingStorageOperation.writeToStage(BigQueryDirectLoadingStorageOperation.kt:69)
            at io.airbyte.integrations.destination.bigquery.operation.BigQueryDirectLoadingStorageOperation.writeToStage(BigQueryDirectLoadingStorageOperation.kt:30)
            at io.airbyte.integrations.base.destination.operation.StandardStreamOperation.writeRecordsImpl(StandardStreamOperation.kt:32)
            at io.airbyte.integrations.base.destination.operation.AbstractStreamOperation.writeRecords(AbstractStreamOperation.kt:320)
            at io.airbyte.integrations.base.destination.operation.DefaultSyncOperation.flushStream(DefaultSyncOperation.kt:107)
            at io.airbyte.integrations.base.destination.operation.DefaultFlush.flush(DefaultFlush.kt:18)
            at io.airbyte.cdk.integrations.destination.async.FlushWorkers.flush$lambda$6(FlushWorkers.kt:167)
            at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:572)
            at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
            at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
            at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
            at java.base/java.lang.Thread.run(Thread.java:1583)
  • Bikram Dhoju

    09/11/2024, 3:11 PM
    @kapa.ai adding imagePullSecret for the jobs
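
    A sketch for a Helm install, assuming the chart supports global.imagePullSecrets and the worker still reads JOB_KUBE_MAIN_CONTAINER_IMAGE_PULL_SECRET for job pods (verify both against your chart version; secret name and registry details are illustrative):

      kubectl -n airbyte create secret docker-registry regcred \
        --docker-server=registry.example.com --docker-username=me --docker-password=secret
      helm upgrade airbyte airbyte/airbyte \
        --set 'global.imagePullSecrets[0].name=regcred' \
        --set 'worker.extraEnv[0].name=JOB_KUBE_MAIN_CONTAINER_IMAGE_PULL_SECRET' \
        --set-string 'worker.extraEnv[0].value=regcred'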
  • Gabriel Levine

    09/11/2024, 4:11 PM
    @kapa.ai
    2024-09-11 16:09:19 replication-orchestrator > failures: [ {
      "failureOrigin" : "destination",
      "failureType" : "system_error",
      "internalMessage" : "Query column 2 has type TIMESTAMP which cannot be inserted into column _airbyte_data, which has type STRING at [2:1]",
      "externalMessage" : "com.google.cloud.bigquery.BigQueryException: Query column 2 has type TIMESTAMP which cannot be inserted into column _airbyte_data, which has type STRING at [2:1]",
      "metadata" : {
        "attemptNumber" : 4,
        "jobId" : 17831645,
        "from_trace_message" : true,
        "connector_command" : "write"
      },
      "timestamp" : 1726070941025
    }, {
      "failureOrigin" : "destination",
      "internalMessage" : "Destination process exited with non-zero exit code 1",
      "externalMessage" : "Something went wrong within the destination connector",
      "metadata" : {
        "attemptNumber" : 4,
        "jobId" : 17831645,
        "connector_command" : "write"
      },
  • Tobias Willi

    09/11/2024, 4:31 PM
    @kapa.ai Does Airbyte still support normalization of the data loaded to Snowflake?
  • Wajdi M

    09/11/2024, 4:32 PM
    I'm using HubSpot as a source connector, and I want to fetch archived data. How do I do that? Is there a way to pass external params, like include_archived_only, to the source connector?
  • abhinav wagle

    09/11/2024, 5:16 PM
    @kapa.ai: How can I enable dbt transformations in a Helm setup on an EKS cluster?
  • Martijn van Elferen

    09/11/2024, 6:35 PM
    @kapa.ai Running into the following error on Compute Engine with Debian 11 when running sudo abctl local install --migrate:

      unable to install airbyte chart: unable to add airbyte chart repo: open /tmp/.helmrepo: permission denied
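
    A sketch of one likely fix: /tmp/.helmrepo was created earlier by a different user, so the current process can't open it. Removing the stale file and re-running consistently as one user (here root, matching the sudo invocation) typically clears it:

      sudo rm -f /tmp/.helmrepo
      sudo abctl local install --migrate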
  • GUANGYU QU

    09/11/2024, 8:14 PM
    @kapa.ai I already disabled auto commit in the Airbyte source connector, but I still get the error below. Do you have any idea?

      org.apache.kafka.common.errors.InvalidGroupIdException: To use the group management or offset commit APIs, you must provide a valid group.id in the consumer configuration.
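
    For context: disabling auto commit is not enough, because a consumer that subscribes to topics still needs a group.id for Kafka's group management APIs. Generic consumer settings that avoid this exception (the group id value is illustrative):

      # consumer.properties
      group.id=airbyte-kafka-source
      enable.auto.commit=false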
  • Dhroov Makwana

    09/11/2024, 10:09 PM
    @kapa.ai Do we support incremental sync when the results are sorted in descending order and there is no option to pass a request param to sort them in ascending order, without building any custom logic?
  • Adam Marcus

    09/11/2024, 11:34 PM
    @kapa.ai When I run abctl local install, I get:

      container "airbyte-abctl-control-plane" is not running (status = "exited")
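
    A debugging sketch: the control plane is the kind node running as an ordinary Docker container, so its exit reason shows up in plain Docker (running out of memory on the host or Docker VM is a common culprit):

      docker ps -a --filter name=airbyte-abctl-control-plane
      docker logs airbyte-abctl-control-plane --tail 50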
  • Adam Marcus

    09/11/2024, 11:40 PM
    @kapa.ai I just upgraded to the latest abctl (v0.15.0) and a previously working secrets.yaml seems to no longer be valid. abctl local install fails with:

      unexpected error while handling the secret : resource name may not be empty
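
    The error suggests the secret manifest is missing metadata.name. A sketch of a shape that satisfies the check (secret name, key, and value are illustrative):

      # secrets.yaml
      apiVersion: v1
      kind: Secret
      metadata:
        name: airbyte-config-secrets   # "resource name may not be empty" points here
      stringData:
        database-password: changeme

      abctl local install --secret secrets.yaml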
  • Christopher Greene

    09/12/2024, 12:13 AM
    Using the connector builder YAML template format, how can I use an integer-based cursor field instead of a datetime?
  • Charles Bockelmann

    09/12/2024, 2:39 AM
    I built a custom Python destination connector using the command airbyte-ci connectors --name destination-appsheet build, which created the Docker image airbyte/destination-appsheet. I want to use this image in my local Airbyte instance. What should I do next?
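
    A sketch for an abctl-based instance (the :dev tag is illustrative): load the image into the kind cluster, then register it in the UI (Settings > Destinations > New connector, or similar depending on version) using the same image name and tag:

      docker tag airbyte/destination-appsheet:latest airbyte/destination-appsheet:dev
      kind load docker-image airbyte/destination-appsheet:dev --name airbyte-abctl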