# ask-ai

    Tom Montgomery

    08/28/2024, 2:17 PM
    Will changing the host of my db reset the stream? It is a MySQL stream and the host is changing to a read-only address

    Christopher Daniel

    08/28/2024, 2:37 PM
    Hi @kapa.ai, my Airbyte active stream got stuck at a specific number.

    Matheus Dantas

    08/28/2024, 3:02 PM
    Hello. I would like to check if it is possible to use autoImportSchema with the Low-code connector implementation.

    Rabea Yousef

    08/28/2024, 4:02 PM
    @kapa.ai I'm running Airbyte VERSION=0.63.13 and I'm getting duplicates while using sync mode Full refresh | Overwrite!?

    Aditya Gupta

    08/28/2024, 4:14 PM
    @kapa.ai How can I get the openapi.json for my server? I'm not able to access it via the api/public/v1/openapi.json endpoint.

    Abhra Gupta

    08/28/2024, 4:40 PM
    I get an HTTP 500 error while setting up an S3 source.

    Abhra Gupta

    08/28/2024, 5:00 PM
    How do I disable a soft reset?

    Keegan Haukaas

    08/28/2024, 5:08 PM
    @kapa.ai My connector is not normalizing the data. The logs say: "Unsupported operation type 'normalization' found. Skipping operation..."

    guifesquet

    08/28/2024, 5:35 PM
    Workload _sync is claimed without any progress. What's going on?

    Karl Jose Buena

    08/28/2024, 5:35 PM
    @kapa.ai What's the endpoint in OSS to get or list Airbyte jobs?

    Christopher Greene

    08/28/2024, 5:57 PM
    For the Airbyte Helm charts, what was the oldest version tag that fixed the vulnerabilities in worker:0.36.0-alpha?

    Tiago Protta Gigli

    08/28/2024, 6:05 PM
    Which is the latest stable Airbyte Helm chart available?

    Aditya Gupta

    08/28/2024, 7:09 PM
    @kapa.ai Why are some of the API endpoints pointing to api/public/v1 and some to api/v1?

    Brian Bolt

    08/28/2024, 9:57 PM
    @kapa.ai My Airbyte sync job is stuck; the last things in the logs for the job are:
    Flush Worker (9cbc1) -- Worker finished flushing. Current queue size: 52

    Herbert Sousa

    08/28/2024, 10:13 PM
    @kapa.ai How can I detect if all streams have been synced successfully using Airbyte API?

    Brian Kasen

    08/28/2024, 11:15 PM
    @kapa.ai What is the last supported version for OSS dbt custom normalization?

    Andrea Burazor

    08/29/2024, 1:29 AM
    Is data residency available for APAC?

    premsurawut

    08/29/2024, 4:33 AM
    @kapa.ai How do I set up the Airbyte GCS source connector to sync a new daily file every day?

    Andrew Nada

    08/29/2024, 4:39 AM
    Can you sync views from a MySQL or Postgres database?

    Ritika Naidu

    08/29/2024, 5:15 AM
    @kapa.ai How does Airbyte use the tables it backs up in the airbyte_internal schema?

    Julie Choong

    08/29/2024, 8:25 AM
    Can I run abctl local install on IP 0.0.0.0?

    Yannick Sacherer

    08/29/2024, 11:19 AM
    @kapa.ai I am trying to set up a new Snowflake destination:
    data "airbyte_workspace" "apt" {
      workspace_id = var.workspace_id
    }
    
    resource "airbyte_destination_snowflake" "snowflake" {
      depends_on = [ data.airbyte_workspace.apt ]
      definition_id = "14c2362c-b605-49e0-a92d-78b3c38906e8"
      workspace_id  = data.airbyte_workspace.apt.workspace_id
      name          = "Snowflake APT"
    
      configuration = {
        host      = "this.snowflakecomputing.com"
        role      = var.role
        warehouse = var.warehouse
        database  = var.database
        schema    = "AIRBYTE"
        username  = var.username
        credentials = {
            key_pair_authentication = {
                private_key = file(var.snowflake_pkey_path)
            }
        }
      }
    }
    This code fails with this error message:
    │ Error: unexpected response from API. Got an unexpected response code 400
    │ 
    │   with airbyte_destination_snowflake.snowflake,
    │   on destination.tf line 5, in resource "airbyte_destination_snowflake" "snowflake":
    │    5: resource "airbyte_destination_snowflake" "snowflake" {
    │ 
    │ Request:
    │ POST /api/public/v1/destinations HTTP/1.1
    │ Host: datenkrake-airbyte-dev-gcp-01.paas-dev.local
    │ Accept: application/json
    │ Authorization: (sensitive)
    │ Content-Type: application/json
    │ User-Agent: speakeasy-sdk/go 0.0.1 2.372.3 1.0.0
    │ github.com/airbytehq/terraform-provider-airbyte/internal/sdk
    │ 
    │ 
    │ Response:
    │ HTTP/2.0 400 Bad Request
    │ Content-Length: 185
    │ Content-Type: application/json
    │ Date: Thu, 29 Aug 2024 11:11:11 GMT
    │ Strict-Transport-Security: max-age=15768000; includeSubDomains
    │ 
    │ {"title":"resource-not-found","message":"Could not find a resource for:
    │ 13044fa7-7877-4aa9-8fe8-8654b4049399","type":"https://reference.airbyte.com/reference/errors#resource-not-found"}
    But this error message is misleading, because my workspace_id works as expected for other resources.
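    One hedged debugging step, not a confirmed fix: the UUID in the 400 response does not match the definition_id in the configuration above, and recent versions of airbytehq/terraform-provider-airbyte treat definition_id as optional, so a minimal variant that drops the hard-coded ID and lets the provider resolve the built-in Snowflake definition can isolate where the lookup fails:
    # Hedged sketch: the same destination without the hard-coded definition_id.
    # Assumes definition_id is optional in the provider version in use, in which
    # case the provider resolves the built-in Snowflake destination definition.
    resource "airbyte_destination_snowflake" "snowflake_minimal" {
      workspace_id = data.airbyte_workspace.apt.workspace_id
      name         = "Snowflake APT (minimal)"

      configuration = {
        host      = "this.snowflakecomputing.com"
        role      = var.role
        warehouse = var.warehouse
        database  = var.database
        schema    = "AIRBYTE"
        username  = var.username
        credentials = {
          key_pair_authentication = {
            private_key = file(var.snowflake_pkey_path)
          }
        }
      }
    }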

    Julie Choong

    08/29/2024, 1:21 PM
    I'm running on Kubernetes. I got this error when trying to set up the Postgres source connector: "Could not find image: airbyte/source-postgres:3.4.25"

    Luke

    08/29/2024, 1:39 PM
    On a local deployment on EC2, Airbyte seems completely incapable of starting pods; it runs fine locally but not on the EC2 instance.
    2024-08-28 21:17:51 platform > Retry State: RetryManager(completeFailureBackoffPolicy=BackoffPolicy(minInterval=PT10S, maxInterval=PT30M, base=3), partialFailureBackoffPolicy=null, successiveCompleteFailureLimit=5, totalCompleteFailureLimit=10, successivePartialFailureLimit=1000, totalPartialFailureLimit=20, successiveCompleteFailures=4, totalCompleteFailures=4, successivePartialFailures=0, totalPartialFailures=0)
     Backoff before next attempt: 4 minutes 30 seconds
    2024-08-28 21:21:23 ERROR i.a.w.l.p.h.FailureHandler(apply):39 - Pipeline Error
    io.airbyte.workload.launcher.pipeline.stages.model.StageError: io.airbyte.workload.launcher.pods.KubeClientException: Destination pod failed to start within allotted timeout of 1145 seconds. (Timed out waiting for [1140000] milliseconds for [Pod] with name:[null] in namespace [airbyte-abctl].)
            at io.airbyte.workload.launcher.pipeline.stages.model.Stage.apply(Stage.kt:46) ~[io.airbyte-airbyte-workload-launcher-0.64.0.jar:?]
            at io.airbyte.workload.launcher.pipeline.stages.LaunchPodStage.apply(LaunchPodStage.kt:38) ~[io.airbyte-airbyte-workload-launcher-0.64.0.jar:?]
            at io.airbyte.workload.launcher.pipeline.stages.$LaunchPodStage$Definition$Intercepted.$$access$$apply(Unknown Source) ~[io.airbyte-airbyte-workload-launcher-0.64.0.jar:?]
            at io.airbyte.workload.launcher.pipeline.stages.$LaunchPodStage$Definition$Exec.dispatch(Unknown Source) ~[io.airbyte-airbyte-workload-launcher-0.64.0.jar:?]
            at io.micronaut.context.AbstractExecutableMethodsDefinition$DispatchedExecutableMethod.invoke(AbstractExecutableMethodsDefinition.java:456) ~[micronaut-inject-4.5.4.jar:4.5.4]

    Daniel Antwi

    08/29/2024, 2:22 PM
    I'm running PyAirbyte in an Airflow DAG inside a Docker container using docker-compose. I have a Postgres server on my host machine that I am trying to connect to as a PostgresCache. I've added extra_hosts: - "host.docker.internal:host-gateway" to my docker-compose YAML and provided a host value of host.docker.internal in my PostgresCache constructor. However, it's still not able to connect to Postgres on the host machine.

    Abhinav Pandey

    08/29/2024, 3:48 PM
    @kapa.ai Where do I get the definition ID of a custom connector built using the Connector Builder?

    Rohit Chatterjee

    08/29/2024, 4:00 PM
    What permissions do I need to give the Postgres user for the Airbyte config DB? @kapa.ai

    Glen Aultman-Bettridge

    08/29/2024, 5:21 PM
    @kapa.ai Could you give me a Terraform example of the airbyte_connection resource that sets a primary_key in the streams configuration?
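    A hedged sketch of what that can look like, assuming the stream schema of the current airbytehq/terraform-provider-airbyte, where primary_key is a list of field paths (each inner list is one, possibly nested, field); the connection name and the source and destination references are hypothetical placeholders:
    # Hedged sketch, not a verified example; field names follow the
    # airbytehq/terraform-provider-airbyte documentation.
    resource "airbyte_connection" "users" {
      name           = "users-connection"                                      # hypothetical
      source_id      = airbyte_source_postgres.src.source_id                   # hypothetical reference
      destination_id = airbyte_destination_snowflake.snowflake.destination_id  # hypothetical reference

      configurations = {
        streams = [
          {
            name         = "users"
            sync_mode    = "incremental_deduped_history"
            cursor_field = ["updated_at"]
            # primary_key is a list of field paths: [["id"]] selects the
            # top-level "id" column; [["id"], ["tenant_id"]] would be a
            # composite key.
            primary_key  = [["id"]]
          }
        ]
      }
    }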