# ask-community-for-troubleshooting

  • Abhishek Battu

    06/05/2023, 6:11 AM
    We connected our Google Ads source to Airbyte and we are storing our data in BigQuery, but we are getting wrong data from Airbyte. Can you help me resolve the issue?

  • Chidambara Ganapathy

    06/05/2023, 11:57 AM
    Hi Team, I am trying to use the Snowflake source to fetch details about Snowflake usage and store them in a destination. When I try to connect with the required details, I get the tables present in the database but not the account usage. Any suggestions on this would be helpful. Thanks

  • Maria Martiyanova

    06/05/2023, 2:19 PM
    Hello, would you be able to clarify whether Airbyte is a non-blocking process? We are interested in running a backup process of Postgres-to-CSV, and for the duration of the backup we would like any incoming reads & writes to the database to not experience any slowdowns. Thank you for your time.

  • Luis Felipe Oliveira

    06/05/2023, 6:58 PM
    Hello people, I'm getting a weird error when I test any of my sources. The strangest thing is that until yesterday everything was working, but today those same sources are failing. I'm even creating new sources, and they're failing too. I already checked whether it could be something with the credentials, but no matter what type of source definition I choose, the same error happens. There is no detail returned about the error that is happening; the message is simply this: Configuration check failed Failed to run connection tests. When I use the Airbyte API it returns empty logLines. What can it be?

  • Octavia Squidington III

    06/05/2023, 7:45 PM
    🔥 Community Office Hours starts in 15 minutes 🔥 Topic and schedule posted in #C045VK5AF54. At 1pm PDT click here to join us on Zoom!

  • Divine

    06/05/2023, 10:48 PM
    Hello everyone, is there a way to trigger an Airbyte sync via an API call without having to use any orchestration tool? Please point me to a doc for that. Thank you
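    For reference, open-source Airbyte exposes a configuration API that can trigger a sync directly, so no orchestrator is strictly required; a minimal sketch, assuming a default local deployment reachable on localhost:8000 and a placeholder connection ID:
    # Trigger a manual sync for one connection via the Airbyte configuration API
    # (host, port and the connection ID below are assumptions for a stock local install)
    curl -X POST http://localhost:8000/api/v1/connections/sync \
      -H "Content-Type: application/json" \
      -d '{"connectionId": "<your-connection-id>"}'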

  • Dai Zhang

    06/06/2023, 1:55 AM
    Hi, could Airbyte integrate data from S3 into a kdb+ database? Great thanks

  • Sathish

    06/06/2023, 2:00 AM
    hello, appreciate any troubleshooting/suggestions with this issue - https://github.com/airbytehq/airbyte/issues/27048

  • Wei Seng Thong

    06/06/2023, 3:07 AM
    Hello guys, I'm trying to build a new connection to Zephyr Squad, which is an extension in Jira for test management. Part of the params that need to be passed into Zephyr Squad's API would be Jira's `projectId` and `versionId`. Assuming that Jira works and I have a table sitting in Redshift, how do I read the data and pass these `projectId` and `versionId` values as params into Zephyr Squad's API? Thanks.
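    A rough sketch of the pattern being asked about, purely illustrative: the Redshift table, column names, and the Zephyr Squad endpoint below are placeholders, not a confirmed API:
    # Hypothetical: read Jira projectId/versionId from a Redshift table, then pass them as query params
    read PROJECT_ID VERSION_ID < <(psql "host=<redshift-host> port=5439 dbname=<db> user=<user>" \
      -At -F ' ' -c "SELECT project_id, version_id FROM jira_versions LIMIT 1")
    curl -G "https://<zephyr-squad-host>/public/rest/api/1.0/cycles/search" \
      --data-urlencode "projectId=${PROJECT_ID}" \
      --data-urlencode "versionId=${VERSION_ID}"   # endpoint path is illustrative only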

  • Chidambara Ganapathy

    06/06/2023, 8:25 AM
    Hi Team, Any update on QuickBooks online Beta release? Thanks

  • Madhav

    06/06/2023, 10:29 AM
    Hi Team, I have created a few Airbyte connections on an EC2 machine. Can anyone let me know where the metadata and data of those connections are stored at the backend level?
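    For context, a default docker-compose deployment keeps connection metadata in an internal Postgres container; a rough sketch of where to look, assuming the stock container name and the default credentials from the bundled .env:
    # Airbyte's configuration and job metadata live in the airbyte-db Postgres container
    docker ps --filter "name=airbyte-db"
    # list the metadata tables (defaults: user "docker", database "airbyte"; adjust to your .env)
    docker exec -it airbyte-db psql -U docker -d airbyte -c '\dt'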

  • Octavia Squidington III

    06/06/2023, 1:45 PM
    🔥 Office Hours starts in 15 minutes 🔥 Topic and schedule posted in #C045VK5AF54. At 16:00 CEST / 10am EDT click here to join us on Zoom!

  • Leo Sussan

    06/06/2023, 4:59 PM
    Hello! So - I have two Postgres CDC incremental syncs running into BigQuery, using the batch run type. I have added a third connection, using a distinct replication slot & publication, to the same Postgres database. I am attempting to sync 100 partitions of a larger table that, when originally attempted, failed numerous times mid-sync. Shifting to syncs of the partitions finally succeeded after a week of partition-by-partition additions to the connection, but the sync is now not completing. Upon inspection of the logs & the files sent over to GCS, the buffers are being flushed after 302 bytes, which seems surprisingly small, and the syncs I cancelled saw 1715 changes emitted within an hour. My BigQuery file buffer count is at 40. What is the expected file size for a buffer file that has been flushed?

  • Roberto Cuellar Hattam

    06/06/2023, 5:15 PM
    Hey! I'm using Airbyte to connect Jira to GBQ and I'm getting this
    ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: Invalid timezone offset: +0100
    Any ideas?

  • Chidambara Ganapathy

    06/06/2023, 5:39 PM
    Hi Team, I am trying to add a Snowflake source to get account usage details from a database called SNOWFLAKE. The source is getting configured without issues. The destination is another Snowflake DB, which is also configured perfectly. During connection it fetched multiple streams, and while syncing it is throwing the error "278 Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword"

  • Huib

    06/06/2023, 5:42 PM
    The Clickhouse connector doesn't seem to like the `incremental deduped history` sync method. dbt fails with the following message:
    17:23:24.670768 [error] [MainThread]:   :HTTPDriver for <https://x116ylkqmo.eu-west-1.aws.clickhouse.cloud:8443> returned response code 404)
    17:23:24.671163 [error] [MainThread]:    Code: 47. DB::Exception: Missing columns: '_airbyte_unique_key' while processing query: 'SELECT SolarEfficiency, SolarOrientation, SolarMaxProduction, SolarTiltAngle, CreatedAt, AddressId, UpdatedAt, SolarProductionThreshold, _airbyte_ab_id
    17:23:24.671614 [error] [MainThread]:   compiled Code at ../build/run/airbyte_utils/models/generated/airbyte_incremental/airbyte_raw/users_SolarSettings.sql,retryable=<null>,timestamp=1686072209500,additionalProperties={}], io.airbyte.config.FailureReason@77c7a5f4[failureOrigin=normalization,failureType=system_error,internalMessage=[0m17:23:24.668621 [error] [MainThread]:   :HTTPDriver for <https://x116ylkqmo.eu-west-1.aws.clickhouse.cloud:8443> returned response code 404),externalMessage=Normalization failed during the dbt run. This may indicate a problem with the data itself.,metadata=io.airbyte.config.Metadata@36dd6edb[additionalProperties={attemptNumber=1, jobId=18, from_trace_message=true}],stacktrace=AirbyteDbtError: 
    17:23:23.231583 [error] [Thread-1 (]: 27 of 28 ERROR creating sql incremental model airbyte_raw.users_ChargeSettings ......................................... [ERROR in 1.71s]
    17:23:24.633715 [error] [Thread-1 (]: 28 of 28 ERROR creating sql incremental model airbyte_raw.users_SolarSettings .......................................... [ERROR in 1.40s]

  • Chidambara Ganapathy

    06/06/2023, 5:42 PM
    Does the Snowflake source connector have any known issues?

  • Chidambara Ganapathy

    06/06/2023, 5:49 PM
    Is the QuickBooks online refresh token issue fixed?

  • Kenneth Myers

    06/06/2023, 5:52 PM
    I'm getting the following error when setting up a private S3 bucket as a source:
    Server temporarily unavailable (http.502.tQnjxFFeB8zupnmBbSK2dQ)
    Anybody able to help with this?

  • John Olinger

    06/06/2023, 7:10 PM
    Question about the MySQL destination... we have a Google Directory source with the associated CSVs containing JSON. I have not tested this in a schema yet, but I would like to know if the MySQL destination uses JSON import to get the comma-separated JSON values into columns. Can anyone confirm if this is the expected behavior?

  • KRITIN MADHAVAN D 20BCE1536

    06/06/2023, 9:17 PM
    Hi, I'm new to Airbyte. While installing it on my local machine (Windows), the last instruction in the doc is to run "bash run-ab-platform.sh". When I run it, I get an error message: "run-ab-platform.sh: line 2: $'\r': command not found run-ab-platform.sh: line 7: $'\r': command not found". Can anyone help me out please?
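    The $'\r' errors usually mean the script was saved with Windows-style CRLF line endings; a minimal sketch of one way to strip them before running it again:
    # Convert run-ab-platform.sh to Unix (LF) line endings, then re-run it
    sed -i 's/\r$//' run-ab-platform.sh
    # dos2unix run-ab-platform.sh   # equivalent, if that package is installed
    bash run-ab-platform.sh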

  • Jatin Morar

    06/06/2023, 9:35 PM
    Hi all, I'm new to Airbyte. That said, is there a recommended way to connect to S3? The documentation states using an Access Key ID and Secret Access Key, though it is unclear if they are optional or required. Moreover, are there any ways to authenticate using roles?

  • KRITIN MADHAVAN D 20BCE1536

    06/06/2023, 10:50 PM
    I have this error: permission denied while trying to connect to the Docker daemon socket at unix:///var/run/docker.sock: Get "http://%2Fvar%2Frun%2Fdocker.sock/v1.24/containers/json?all=1&filters=%7B%22label%22%3A%7B%22com.docker.compose.config-hash%22%3Atrue%2C%22com.docker.compose.project%3Dairbyte%22%3Atrue%7D%7D": dial unix /var/run/docker.sock: connect: permission denied
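    This is the standard Docker socket permission error rather than anything Airbyte-specific; a minimal sketch of the usual fix on Linux (the group change takes effect after logging out and back in, or via newgrp):
    # Allow the current user to talk to the Docker daemon without sudo
    sudo usermod -aG docker "$USER"
    newgrp docker
    docker ps   # should now succeed, after which the Airbyte compose scripts can reach the daemon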

  • Ignacio Martínez de Toda

    06/07/2023, 7:30 AM
    Hi team, I already have a running connection on Airbyte deployed in a Google VM; I'm syncing data from MongoDB to BigQuery and it is running fine. Now when I try to set up a new connection using the same source and destination, or when I try to refresh the source schema, I get the `non-json response` problem when fetching the schema from my MongoDB. I know this is a widespread issue that happens across different sources and users of Airbyte; does someone know how to fix it? My connectors are up to date, maybe downgrading to an older version? Thanks in advance, any help is much appreciated!!

  • Anibal Blazquez

    06/07/2023, 8:04 AM
    Hello, I'm running a connection but it freezes at the moment of creating the tables. After some hours it is canceled:
    2023-06-07 07:44:51 destination > starting destination: class io.airbyte.integrations.destination.redshift.RedshiftDestination
    2023-06-07 07:44:51 destination > integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json}
    2023-06-07 07:44:51 destination > Running integration: io.airbyte.integrations.destination.redshift.RedshiftDestination
    2023-06-07 07:44:51 destination > Command: WRITE
    2023-06-07 07:44:51 destination > Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'}
    2023-06-07 07:44:51 destination > Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
    2023-06-07 07:44:51 destination > Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
    2023-06-07 07:44:51 destination > Using destination type: COPY_S3
    2023-06-07 07:44:52 destination > HikariPool-1 - Starting...
    2023-06-07 07:44:52 destination > HikariPool-1 - Start completed.
    2023-06-07 07:44:52 destination > Creating S3 client...
    2023-06-07 07:44:53 destination > Write config: WriteConfig{streamName=messages, namespace=stage_airbyte_twilio_it2, outputSchemaName=stage_airbyte_twilio_it2, tmpTableName=_airbyte_tmp_wto_messages, outputTableName=_airbyte_raw_messages, syncMode=append_dedup}
    2023-06-07 07:44:53 destination > class io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer started.
    2023-06-07 07:44:53 destination > Preparing raw tables in destination started for 1 streams
    2023-06-07 07:44:53 destination > Preparing staging area in destination started for schema stage_airbyte_twilio_it2 stream messages: target table: _airbyte_raw_messages, stage: airbyte/stage_airbyte_twilio_it2_messages/2023_06_07_07_66a40649-1157-46ec-bcdf-a7be2f573272/
    2023-06-07 07:44:53 destination > HikariPool-1 - Driver does not support get/set network timeout for connections. ([Amazon][JDBC](10220) Driver does not support this optional feature.)
    2023-06-07 07:44:55 destination > Preparing staging area in destination completed for schema stage_airbyte_twilio_it2 stream messages
    2023-06-07 07:44:55 destination > Executing finalization of tables.
    The finalization of the tables never ends.

  • Alejo

    06/07/2023, 8:57 AM
    Hi all, we're experiencing random issues when syncing from an AWS RDS MySQL database with CDC activated. Every 2-3 executions it fails with an "Error during binlog processing" error, but then the next iteration runs ok. Airbyte version: 0.40.3, MySQL connector version: 0.6.5. I attach some logs from executions that failed
    logs-2605.txt logs-2616.txt

  • Chidambara Ganapathy

    06/07/2023, 9:20 AM
    Hi Team, while configuring Snowflake as a source I am getting "No active warehouse selected in the current session. Select an active warehouse with the 'use warehouse' command." Any suggestions on this would be helpful.
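    One common cause is that the Snowflake user or role the connector logs in with has no warehouse to use by default; a sketch of setting one with the snowsql CLI, where the user and warehouse names are placeholders:
    # Give the Airbyte user a default warehouse so new sessions don't start without one
    snowsql -q "ALTER USER <airbyte_user> SET DEFAULT_WAREHOUSE = <your_warehouse>;"
    # the warehouse can also be set explicitly in the source's connection settings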

  • Chidambara Ganapathy

    06/07/2023, 9:25 AM
    Does the Snowflake source support only OAuth 2.0 authentication?

  • Soshi Nakachi仲地早司

    06/07/2023, 10:12 AM
    I have a question about incremental synchronization. Suppose there is a configuration item that sets the initial value of the cursor field during the initial source setup (e.g. start_date). 1. I set the value of the cursor field to `2023-01-01` in the initial source setup. 2. I synchronize daily until `2023-06-01`; at this point the value of state is 2023-05-15. 3. Now I overwrite the value of the cursor field in the source configuration to 2023-06-01. At this point, will the next synchronization use the stored state value (2023-05-15)? Or will the overwritten value (2023-06-01) be used?
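    One way to check which value a connection will actually resume from is to inspect its persisted state through the configuration API; a sketch assuming a local OSS instance and a placeholder connection ID:
    # Fetch the stored per-stream state (the saved cursor) for a connection
    curl -X POST http://localhost:8000/api/v1/state/get \
      -H "Content-Type: application/json" \
      -d '{"connectionId": "<your-connection-id>"}'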

  • Chidambara Ganapathy

    06/07/2023, 10:35 AM
    Hi Team, I am trying to configure Snowflake as a source, fetch details from a database named "SNOWFLAKE" (which holds the account usage information), and put them in another Snowflake destination. The database has 3 namespaces, each namespace has multiple streams, and some of the stream names are the same across the namespaces. While syncing I am getting the error "io.airbyte.commons.exceptions.ConfigErrorException: You are trying to write multiple streams to the same table. Consider switching to a custom namespace format using ${SOURCE_NAMESPACE}, or moving one of them into a separate connection with a different stream prefix. Affected streams: null.AUTOMATIC_CLUSTERING_HISTORY, null.AUTOMATIC_CLUSTERING_HISTORY" Kindly let me know what needs to be done. Thanks & Regards, Chidambara Ganapathy
    kwpu80slndck_logs_11953_txt.txt