# ask-community-for-troubleshooting
  • Marcos Marx (Airbyte)
    04/26/2023, 2:55 PM
    has renamed the channel from "public-airbyte-connections-issues" to "public-help-connections-issues"
  • Ryan
    04/26/2023, 3:15 PM
    Hi there, the cancellation_details field from the Stripe subscriptions stream doesn't exist in the final tables; it only exists as part of the JSON fetched from Stripe. Is it possible to pull in this extra field?
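    Until the field shows up in the normalized tables, a minimal workaround sketch, assuming the raw Stripe records are still accessible as JSON (for example in a hypothetical _airbyte_raw_subscriptions table): parse the payload and read cancellation_details directly.
    import json

    # Hypothetical _airbyte_data payload for one subscription record.
    raw_record = '{"id": "sub_123", "status": "canceled", "cancellation_details": {"reason": "cancellation_requested"}}'

    record = json.loads(raw_record)
    # .get() keeps this safe for records that predate the field.
    print(record.get("cancellation_details"))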
  • Gabriel Levine
    04/26/2023, 3:29 PM
    Initial sync from Postgres Source to BigQuery with CDC succeeds, but subsequent syncs return no records. The sync is marked as successful. The only clue is a 'null' WAL resume position:
    WalPositionLocator(<init>):45 Looking for WAL restart position for last commit LSN 'null' and last change LSN 'LSN{13A/C1F3C440}'
    PostgresStreamingChangeEventSource(searchWalPosition):344 WAL resume position 'null' discovered
    Running Airbyte 0.44.2 on GKE via Helm deploy. Postgres source is 2.0.24 and BigQuery destination is 1.3.0. The Postgres source database is GCP CloudSQL.
  • Alejo
    04/26/2023, 4:00 PM
    Hi all, could anyone help me merge this PR? It has been open for months, and the supposed blocker (having a sandbox Talkdesk account) was supposedly resolved.
  • Mantas Linkis
    12/27/2022, 1:06 PM
    Hello all, merry Christmas! We’re having an issue with a Postgres connector - has anyone faced this before? It started once we upgraded our Airbyte version (deployed with docker-compose). In our logs we can see what appears to be a message sent by the client:
    could not accept SSL connection: SSLV3_ALERT_CERTIFICATE_UNKNOWN"
    We think this is related to the truststore but we’re unsure how to go about it.
  • Ben Konicek
    04/26/2023, 7:40 PM
    We just migrated one of our Airbyte instances from a standalone server to Kubernetes, and recreated all the connections. All but one of the connections is working after the migration. When we click Sync Now on the connection having problems, we get the error
    Failed to start sync: The connection manager workflow for connection bc2a79bb-bb0d-42b4-9492-8dcdbf5e7d4e is deleted, so no further operations cannot be performed on it.
  • Rafael Rossini
    04/26/2023, 7:53 PM
    Hi, is there an AMI in the AWS Marketplace that contains Linux, Docker, and Airbyte ready to run?
  • Inayet Hadi
    04/26/2023, 10:01 PM
    👋 Hello, team!
  • Kundan Kumar
    04/27/2023, 11:29 AM
    Hello, I am trying to connect MySQL (beta) as a source in Airbyte, but it is throwing a non-JSON response. Can anyone suggest which version I should use? Currently I am using the latest MySQL connector version, 2.0.18.
  • 이유진
    04/27/2023, 11:42 AM
    Hello, I'm trying to replicate data using CDC replication with incremental sync, following the instructions at https://airbyte.com/tutorials/incremental-change-data-capture-cdc-replication. Only the Full Refresh sync mode is available, as in the picture, but I need to set the sync mode to Incremental (deduped + history). What should I do to set the sync mode to incremental? Thank you!
  • Gaëtan Podevijn
    04/27/2023, 1:34 PM
    Hi Team! My team is evaluating Airbyte as our main data ingestion tool, and we’re very enthusiastic about it. I have a question regarding the integration between the Postgres and Kafka sources and the Databricks destination. When ingesting from Postgres to Databricks Lakehouse into managed tables, the data is written in an unnormalized form (in _airbyte_data), as explained in the Basic Normalization documentation. However, when writing from Postgres to Databricks Lakehouse into unmanaged tables, the data is written in a normalized form (i.e., the columns from the source are created in the Lakehouse external table). But according to the documentation (or at least my understanding of it), the unnormalized form should be the default, and normalization would have to be configured (and is not supported by the Databricks connector yet). With Kafka to Databricks in managed tables, the data is written in an unnormalized form in _airbyte_data, as I would expect. With unmanaged tables, however, the data is put in _airbyte_additional_properties. I assume that is because the Kafka source does not use the schema registry to create the schema (even though it is correctly configured in the source). Do you know:
    • Why the data is in the unnormalized form or not depending on the Lakehouse table’s type in the Postgres case?
    • Does the Kafka source use the schema registry to write the data in the destination?
    Thanks!
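    For reference, a small sketch of the difference between the raw (unnormalized) layout and the normalized one, assuming the usual raw-table columns (column names assumed; check the actual tables):
    import json

    # Hypothetical row from a raw (unnormalized) Airbyte table.
    raw_row = {
        "_airbyte_ab_id": "0f2a0000-0000-0000-0000-000000000000",
        "_airbyte_emitted_at": "2023-04-27T13:00:00Z",
        "_airbyte_data": '{"user_id": 42, "email": "a@example.com"}',
    }

    # Normalization is what promotes the JSON keys into real columns like these.
    normalized_row = json.loads(raw_row["_airbyte_data"])
    print(normalized_row)  # {'user_id': 42, 'email': 'a@example.com'}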
  • Benjamin Edwards
    04/27/2023, 2:14 PM
    Hi, when I customise the namespace to "mubi", the resulting schema created in Snowflake is "_mubi_". Is anyone aware of how to create a custom namespace without the underscores?
  • Krunal Jagani
    04/27/2023, 7:38 PM
    Hi experts, we are working on migrating our on-prem OLTP database to an Azure Delta lakehouse using Airbyte, and we have completed initial POC tests on the migration. We have CDC turned on on the source-side DB, and our current replication type is 'Incremental - Append' to Databricks Lakehouse. Everything works as it should; however, we are noticing that updates and deletes of existing rows on the source side create separate records in the Delta table, and Delta versioning is not able to pick up the latest updated version. The 'Incremental - Deduped History' option is still not available in public preview. So I would like to ask how engineers here have dealt with the additional (appended) records so that the Delta table contains only the most up-to-date records. Do I need to write logic to deal with this? E.g. logic to pull the most recent row for each primary key based on the _ab_cdc_updated_at column, and also exclude any row with a non-null _ab_cdc_deleted_at column. cc: @Michael Adaniken
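    A rough sketch of that dedup logic on Databricks with PySpark, assuming an append-only table carrying the CDC columns mentioned above; the table names and the primary key column ("id") are placeholders:
    from pyspark.sql import SparkSession, Window
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()

    # Append-only table produced by the 'Incremental - Append' sync (name assumed).
    appended = spark.table("lakehouse.orders_append")

    # Keep the latest CDC event per primary key, then drop rows flagged as deleted.
    w = Window.partitionBy("id").orderBy(F.col("_ab_cdc_updated_at").desc())
    current = (
        appended
        .withColumn("_rn", F.row_number().over(w))
        .filter((F.col("_rn") == 1) & F.col("_ab_cdc_deleted_at").isNull())
        .drop("_rn")
    )

    # Materialize the deduplicated snapshot (target table name assumed).
    current.write.mode("overwrite").saveAsTable("lakehouse.orders_current")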
  • Arjunsingh Yadav
    04/28/2023, 10:25 AM
    Hi guys, a small doubt. I’m trying to set the source as File, with the storage provider as S3, and the destination as Postgres. The reason for using File as the source is that the S3 source doesn’t offer xlsx file format support. I’m able to create a connection and transfer the files. The File config asks for a URL with the exact filename, e.g. s3://bucket/folder/filename.xlsx. The only problem is that I don’t want to create a connection for every file; I just want it to consider all the files in the folder, like s3://bucket/folder/*.xlsx. I tried adding it to the URL this way, but it gives an error:
    Failed to load <s3://bucket/folder/*.xlsx>: OSError("unable to access bucket: 'bucket' key: 'folder/*.xlsx' version: None error: An error occurred (NoSuchKey) when calling the GetObject operation: The specified key does not exist.")
    Any help would be appreciated. Thanks :)
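    The error suggests the File source requests the key literally, so the * is never expanded. One workaround sketch outside Airbyte, assuming boto3, pandas, and openpyxl are available (bucket, prefix, and output key are placeholders): combine the spreadsheets into a single object the File source can point at.
    import io

    import boto3
    import pandas as pd  # reading .xlsx also requires openpyxl

    BUCKET = "bucket"   # placeholder
    PREFIX = "folder/"  # placeholder

    s3 = boto3.client("s3")
    frames = []
    for page in s3.get_paginator("list_objects_v2").paginate(Bucket=BUCKET, Prefix=PREFIX):
        for obj in page.get("Contents", []):
            if obj["Key"].endswith(".xlsx"):
                body = s3.get_object(Bucket=BUCKET, Key=obj["Key"])["Body"].read()
                frames.append(pd.read_excel(io.BytesIO(body)))

    combined = pd.concat(frames, ignore_index=True)
    # Write one combined file back to S3 for the File source to read.
    s3.put_object(
        Bucket=BUCKET,
        Key=f"{PREFIX}combined.csv",
        Body=combined.to_csv(index=False).encode("utf-8"),
    )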
  • Slackbot
    04/28/2023, 10:41 AM
    This message was deleted.
  • Slackbot
    04/28/2023, 10:41 AM
    This message was deleted.
  • Konrad Ertmanski
    04/28/2023, 10:56 AM
    Hi Team! Pretty new to the game; hoping someone has had similar issues. We’ve been using the Stripe connector for a solid while to pull data from 4 selected streams. The connector was running on an outdated version, 0.1.33, but performed just fine for our needs. We recently upgraded it to the newest version, 3.4.1, and the sync times have sky-rocketed (and I mean it: from 30 min to 30 h). Airbyte version: 0.40.32. It seems that the upgrade has massively increased the number of records read/emitted. More info in the thread - any help/ideas would be appreciated 🙏 Thanks!
  • Slackbot
    04/28/2023, 12:21 PM
    This message was deleted.
  • Satyam Saxena
    04/28/2023, 12:31 PM
    Hi team, I am trying to deploy a Snowflake destination image in a container inside a Kubernetes pod, which should listen on a particular port where an Airbyte source image will dump data. Can someone please help me write the correct commands/YAML to run the Snowflake destination image on a Kubernetes pod?
    apiVersion: v1
    kind: Pod
    metadata:
      name: airbyte-destination-pod
    spec:
      restartPolicy: Never
      containers:
      - name: airbyte-destination
        # NOTE: this is the Snowflake *source* image; a destination pod would
        # normally use airbyte/destination-snowflake instead.
        image: airbyte/source-snowflake:latest
        # command + args together form a single "/bin/sh -c <script>" invocation;
        # the literal block (|) preserves the newlines the if/else syntax needs.
        command: ["/bin/sh", "-c"]
        args:
        - |
          if [ ! -z "$AIRBYTE_ENTRYPOINT" ]; then
            ENTRYPOINT=$AIRBYTE_ENTRYPOINT
          else
            ENTRYPOINT="/airbyte/base.sh"
          fi
          # /config/config.json and /catalog/catalog.json must be mounted into the pod before this runs.
          eval "$ENTRYPOINT write --config /config/config.json --catalog /catalog/catalog.json"
        ports:
        - containerPort: 9999
  • Bradley Penwarden
    04/28/2023, 1:21 PM
    Good afternoon team, a question about the Jira connector being very slow and hitting a socket error after 4 days of syncing. More in thread.
  • Slackbot
    04/28/2023, 3:02 PM
    This message was deleted.
  • Giuseppe Russo
    04/28/2023, 4:02 PM
    Hi there. I've set up a TalkDesk source for an account based in the Europe region. To do that, I needed to change .com to .eu in the Auth URL:
    <https://xxxxx.talkdeskid>.eu/oauth/token?grant_type=client_credentials
    I'm now trying to run a connection but I get the following error:
    Unauthorized for url: <https://api.talkdeskapp>.com/data/reports/calls/jobs
    However, this URL is unauthorized because it should have the .eu domain and not the .com domain. Do you know if there is a possibility of changing the region for the TalkDesk connection?
  • Micky
    04/28/2023, 4:46 PM
    Hi, I have deployed Airbyte on AWS. When I connected to the source with everything set up, like the replication slot and publication, I got an error: 'Configuration check failed. Message: HikariPool-1 - Connection is not available, request timed out after 10001ms.' Internal message: io.airbyte.commons.exceptions.ConnectionErrorException java.sql.SQLTransientConnectionException: HikariPool-1 - Connection is not available, request timed out after 10002ms. Failure origin: source. Failure type: config_error.
  • Gabriel Levine
    04/28/2023, 5:25 PM
    The BigQuery destination’s latest version is marked as 1.3.3 but the Changelog only details updates up to 1.3.1. Are there details on the subsequent releases?
  • Slackbot
    04/28/2023, 6:28 PM
    This message was deleted.
  • Murat Cetink
    04/28/2023, 7:06 PM
    Hello. How can I configure AWS user permissions to restrict access to specific DynamoDB tables, rather than granting full read access to all tables? Currently, I'm using the DynamoDB source connector, but it seems to require read permissions on all tables in a region, which is not ideal.
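    A hedged sketch of a narrower IAM policy, assuming the connector mainly needs to list tables and describe/scan the ones selected; dynamodb:ListTables is not a per-table action, so it generally stays on "*" (the account ID, region, table names, and exact action list below are assumptions):
    import json

    import boto3

    policy = {
        "Version": "2012-10-17",
        "Statement": [
            # Listing tables cannot be scoped to individual tables.
            {"Effect": "Allow", "Action": ["dynamodb:ListTables"], "Resource": "*"},
            # Read access only on the tables the connector should sync (placeholder ARNs).
            {
                "Effect": "Allow",
                "Action": ["dynamodb:DescribeTable", "dynamodb:Scan"],
                "Resource": [
                    "arn:aws:dynamodb:us-east-1:123456789012:table/orders",
                    "arn:aws:dynamodb:us-east-1:123456789012:table/customers",
                ],
            },
        ],
    }

    boto3.client("iam").create_policy(
        PolicyName="airbyte-dynamodb-limited-read",
        PolicyDocument=json.dumps(policy),
    )
    Whether the connector's discovery step tolerates being denied on the other tables is the part worth testing.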
  • Shubhra Ghosh
    04/28/2023, 9:28 PM
    Hello - I had a quick question. Is there any option for replicating source records into the destination without changing the destination table DDL? I see metadata being added, but I do not want the metadata in the main destination table.
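    The metadata columns are added by the destination connector itself, so one common workaround (not an Airbyte setting) is to expose a view that hides them. A tiny sketch that generates such a view statement; the schema, table, and column list are placeholders:
    # Placeholder column list for the destination table.
    columns = ["id", "name", "updated_at", "_airbyte_ab_id", "_airbyte_emitted_at", "_airbyte_normalized_at"]

    # Keep only the source columns in the view.
    source_columns = [c for c in columns if not c.startswith("_airbyte_")]
    view_sql = (
        "CREATE OR REPLACE VIEW analytics.customers_clean AS "
        f"SELECT {', '.join(source_columns)} FROM analytics.customers"
    )
    print(view_sql)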
  • Slackbot
    04/29/2023, 4:58 PM
    This message was deleted.
  • Gabriel Levine
    04/29/2023, 5:03 PM
    I’m having an issue with a connection between a Postgres source (version 2.0.28) and a BigQuery destination (version 1.3.1). The number of records emitted is greater than the number of records committed. The BigQuery destination functions as expected for several other syncs. The sync fails after several attempts. Errors include: "Destination process message delivery failed", "Failed to upload buffer to stage and commit to destination", and "Failed to upload staging files to destination table".
  • Dhanji Mahto
    04/29/2023, 11:18 PM
    Hi all, I am getting the issue below while running the integration test suite (the same stack trace is reported twice, once in an ERROR log entry and once in a LOG entry). Does anyone have an idea how to fix it?
    ERROR i.a.w.i.DefaultAirbyteStreamFactory(internalLog):155 ERROR i.a.i.b.AirbyteExceptionHandler(uncaughtException):26 Something went wrong in the connector. See the logs for more details.
    java.lang.RuntimeException: java.nio.file.NoSuchFileException: destination_config.json
        at io.airbyte.commons.io.IOs.readFile(IOs.java:74) ~[io.airbyte-airbyte-commons-0.44.2.jar:?]
        at io.airbyte.integrations.base.IntegrationRunner.parseConfig(IntegrationRunner.java:318) ~[io.airbyte.airbyte-integrations.bases-base-java-0.44.2.jar:?]
        at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:147) ~[io.airbyte.airbyte-integrations.bases-base-java-0.44.2.jar:?]
        at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:101) ~[io.airbyte.airbyte-integrations.bases-base-java-0.44.2.jar:?]
        at io.airbyte.integrations.destination.vertica.VerticaDestination.main(VerticaDestination.java:78) ~[io.airbyte.airbyte-integrations.connectors-destination-vertica-0.44.2.jar:?]
    Caused by: java.nio.file.NoSuchFileException: destination_config.json
        at sun.nio.fs.UnixException.translateToIOException(UnixException.java:92) ~[?:?]
        at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:106) ~[?:?]
        at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:111) ~[?:?]
        at sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:218) ~[?:?]
        at java.nio.file.Files.newByteChannel(Files.java:380) ~[?:?]
        at java.nio.file.Files.newByteChannel(Files.java:432) ~[?:?]
        at java.nio.file.Files.readAllBytes(Files.java:3288) ~[?:?]
        at java.nio.file.Files.readString(Files.java:3366) ~[?:?]
        at io.airbyte.commons.io.IOs.readFile(IOs.java:72) ~[io.airbyte-airbyte-commons-0.44.2.jar:?]
        ... 4 more