# ask-community-for-troubleshooting
  • Syed Ali Raza, 01/03/2023, 2:50 PM
    Hi
  • Sandeep Yadav, 01/03/2023, 3:48 PM
    Deployed Airbyte with Terraform; what would be the default username / password?
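    Not answered inline above, but for open-source docker-compose deployments of this era the web UI's basic-auth credentials default to values set in the repository's .env file. A hedged sketch for checking them (a Terraform module may override these, so treat the names and defaults as assumptions to verify):

    ```bash
    # Inspect the basic-auth defaults in Airbyte's .env (verify in your checkout;
    # commonly BASIC_AUTH_USERNAME=airbyte and BASIC_AUTH_PASSWORD=password).
    grep BASIC_AUTH .env
    ```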
  • Sandeep Yadav, 01/03/2023, 3:49 PM
    image.png
  • Frederic Laithier, 01/03/2023, 4:49 PM
    Hello everyone, we are looking to process data from CSV files, and it seems it is not possible to use the "File" connector to ingest data from multiple files. Our use case:
    • We receive data in CSV format every x minutes on an SFTP server
    • Airbyte should process each file and load its data into our data lake
    Does Airbyte allow us to do that? Thanks in advance
  • Thomas Pedot, 01/03/2023, 4:51 PM
    "Architecture" question. Hello, I have a CRM with an API containing customers, plus a warehouse model for our product stock. We have an ecommerce site with all those products, and I want to sync the remaining stock. The CRM can be accessed by another service (the stock can be updated). How would you do this? I don't need a perfect match at all times, but I do need the right transactions. Both systems could trigger events, and the CRM is the main database for stock. How would you do the reconciliation?
  • Gleber Baptistella, 01/03/2023, 5:15 PM
    PostgreSQL connector version 1.0.34, Airbyte version 0.40.26. Hi folks! I've just upgraded Airbyte to 0.40.26 and the PostgreSQL connector to 1.0.34, and since then we are getting the following error message:
    ```
    Position: 243,externalMessage=Something went wrong in the connector. See the logs for more details.,metadata=io.airbyte.config.Metadata@567f3621[additionalProperties={attemptNumber=0, jobId=207613, from_trace_message=true, connector_command=read}],stacktrace=org.postgresql.util.PSQLException: ERROR: syntax error at or near "-"
      Position: 243
    	at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2675)
    	at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2365)
    	at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:355)
    	at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:490)
    	at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:408)
    	at org.postgresql.jdbc.PgStatement.executeWithFlags(PgStatement.java:329)
    	at org.postgresql.jdbc.PgStatement.executeCachedSql(PgStatement.java:315)
    	at org.postgresql.jdbc.PgStatement.executeWithFlags(PgStatement.java:291)
    	at org.postgresql.jdbc.PgStatement.executeQuery(PgStatement.java:243)
    	at com.zaxxer.hikari.pool.ProxyStatement.executeQuery(ProxyStatement.java:110)
    	at com.zaxxer.hikari.pool.HikariProxyStatement.executeQuery(HikariProxyStatement.java)
    	at io.airbyte.integrations.source.postgres.PostgresSource.lambda$verifyCursorColumnValues$12(PostgresSource.java:532)
    	at io.airbyte.db.jdbc.DefaultJdbcDatabase.bufferedResultSetQuery(DefaultJdbcDatabase.java:55)
    	at io.airbyte.integrations.source.postgres.PostgresSource.verifyCursorColumnValues(PostgresSource.java:532)
    	at io.airbyte.integrations.source.postgres.PostgresSource.verifyCursorColumnValues(PostgresSource.java:78)
    ```
  • Hrvoje Piasevoli, 01/03/2023, 5:32 PM
    Hi! I am wondering how upgrading Airbyte to v0.40.26 is working for anyone, as it seems that the move from temporalio/auto-setup:1.7.0 to airbyte/temporal-auto-setup:1.13.0 has broken the Airbyte upgrade. I have tested this both with a Helm chart deployment to GKE and with the stable overlay deployed to a local minikube cluster. Error:
    ```
    + temporal-sql-tool --plugin postgres --ep airbyte-db-svc -u docker -p 5432 create --db temporal
    2023-01-03T17:19:34.826Z ERROR Unable to create SQL database. {"error": "pq: database \"temporal\" already exists", "logging-call-at": "handler.go:97"}
    2023/01/03 17:19:35 Loading config; env=docker,zone=,configDir=config
    2023/01/03 17:19:35 Loading config files=[config/docker.yaml]
    {"level":"info","ts":"2023-01-03T17:19:35.468Z","msg":"Updated dynamic config","logging-call-at":"file_based_client.go:143"}
    {"level":"info","ts":"2023-01-03T17:19:35.469Z","msg":"Starting server for services","value":["history","matching","frontend","worker"],"logging-call-at":"server.go:123"}
    Unable to start server. Error: sql schema version compatibility check failed: version mismatch for keyspace/database: "temporal". Expected version: 1.6 cannot be greater than Actual version: 1.4
    ```
    The problem seems to be in update-and-start-temporal.sh:
    ```bash
    update_postgres_schema() {
      { export SQL_PASSWORD=${POSTGRES_PWD}; } 2> /dev/null
    
      CONTAINER_ALREADY_STARTED="CONTAINER_ALREADY_STARTED_PLACEHOLDER"
      if [ ! -e $CONTAINER_ALREADY_STARTED ]; then
          touch $CONTAINER_ALREADY_STARTED
          temporal-sql-tool --plugin postgres --ep "${POSTGRES_SEEDS}" -u "${POSTGRES_USER}" -p "${DB_PORT}" create --db "${DBNAME}"
          ...
      fi
    ```
    Steps to reproduce (Kubernetes deployment, for example minikube):
    1. git checkout v0.40.25
    2. kubectl apply -k kube/overlays/stable
    3. git checkout v0.40.26
    4. kubectl apply -k kube/overlays/stable
    This will result in the airbyte-temporal pod crashing. Fix (either works):
    • edit the deployment for airbyte-temporal and update the image used to temporalio/auto-setup:1.7.0 (see the sketch below)
    • revert kube/resources/temporal.yaml to the previous version (basically changing the spec container image) and deploy
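    A one-line sketch of the first fix; the deployment and container names are assumptions based on the stock kube manifests, so verify them in your cluster first:

    ```bash
    # List deployments to confirm names before patching
    kubectl get deployments
    # Pin the temporal pod back to the old image (names assumed; verify above)
    kubectl set image deployment/airbyte-temporal airbyte-temporal=temporalio/auto-setup:1.7.0
    ```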
  • Till Blesik, 01/03/2023, 6:14 PM
    Hi everyone, I have created a new connector using the low-code / configuration method. I successfully implemented a few streams and am now running the integration and standard tests on it. It errors on a missing spec.yaml file. If I understand the tutorial correctly (Step 3: Connecting to the API | Airbyte Documentation), the spec is part of the source_connector-name/connector-name.yaml file. Am I supposed to manually copy the spec section from that file into its own file, or is there a step I am missing?
  • Fabiano Pena, 01/03/2023, 8:30 PM
    Hi @Marcos Marx (Airbyte), I have followed the documentation steps and exported the dbt model. But unlike in your live tutorial (https://www.youtube.com/watch?v=18P5_ohcu5A&t=1450s), once I ran
    docker cp airbyte-server:/tmp/workspace/${NORMALIZE_WORKSPACE}/build/run/airbyte_utils/models/generated/ models/
    it didn't create a full dbt project containing the dbt_project.yml and packages.yml files; it could only export the models folder. Is there anything else I can do to export the whole dbt project?
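    A hedged guess at a workaround: in that workspace layout the generated dbt_project.yml and packages.yml normally live alongside the models directory under airbyte_utils, so copying from one level higher may capture the whole project (paths inferred from the command above; inspect the workspace first):

    ```bash
    # See what the generated dbt project directory actually contains
    docker exec airbyte-server ls /tmp/workspace/${NORMALIZE_WORKSPACE}/build/run/airbyte_utils/
    # Copy the whole project directory instead of just models/generated/
    docker cp airbyte-server:/tmp/workspace/${NORMALIZE_WORKSPACE}/build/run/airbyte_utils/ airbyte_utils/
    ```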
  • Robert Put, 01/03/2023, 10:28 PM
    Is there any documentation on how automatic schema discovery works? AUTO_DETECT_SCHEMA=false
    Is this supposed to pull in changes to the schema at the start of syncs?
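    Not documented in the thread, but if that flag is read from the deployment's .env (the variable name is taken verbatim from the message above), toggling it for a docker-compose install would look roughly like this sketch:

    ```bash
    # Flip the flag in .env (variable name as quoted above; verify it exists there)
    sed -i 's/^AUTO_DETECT_SCHEMA=false/AUTO_DETECT_SCHEMA=true/' .env
    # Recreate containers so the new environment value is picked up
    docker compose up -d
    ```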
  • Zaza Javakhishvili, 01/04/2023, 3:04 AM
    I found the reason... The connection had two failed attempts, and each attempt pulled some rows. Now I have a very important question: will the data be duplicated or not?
  • Avi Sagal, 01/04/2023, 8:19 AM
    Hi, I'm using a connection between Google Analytics (Universal Analytics) and Postgres. The connection failed due to an external issue (a quota, which has since been resolved), but when I trigger the connection again I get this error:
    ```
    Failure Origin: source, Message: Checking source connection failed - please review this connection's configuration to prevent future syncs from failing
    ```
    Is anyone familiar with this issue? How did you solve it? Thanks :)
  • Andrey Tyukavin, 01/04/2023, 10:36 AM
    Hey everyone! I'm running into this issue; does anyone know any workarounds? Or is there a chance it will be fixed soon?
    TL;DR: Full refresh | Overwrite clears the destination tables in the normalization phase when it fails to connect to the source database.
    This is crucial for our use case, because we have to run a rather long full-refresh replication, and if the data gets deleted there's a big gap until the next successful run. Thanks a lot!
  • Gleber Baptistella, 01/04/2023, 10:46 AM
    Hi folks! After upgrading to 0.40.26 we are seeing a lot of this error:
    ```
    java.lang.RuntimeException: io.airbyte.workers.exception.WorkerException: Running the launcher replication-orchestrator failed
    	at io.airbyte.commons.temporal.TemporalUtils.withBackgroundHeartbeat(TemporalUtils.java:314) ~[io.airbyte-airbyte-commons-temporal-0.40.26.jar:?]
    	at io.airbyte.workers.sync.LauncherWorker.run(LauncherWorker.java:114) ~[io.airbyte-airbyte-commons-worker-0.40.26.jar:?]
    	at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$5(TemporalAttemptExecution.java:195) ~[io.airbyte-airbyte-workers-0.40.26.jar:?]
    	at java.lang.Thread.run(Thread.java:1589) ~[?:?]
    Caused by: io.airbyte.workers.exception.WorkerException: Running the launcher replication-orchestrator failed
    	at io.airbyte.workers.sync.LauncherWorker.lambda$run$3(LauncherWorker.java:230) ~[io.airbyte-airbyte-commons-worker-0.40.26.jar:?]
    	at io.airbyte.commons.temporal.TemporalUtils.withBackgroundHeartbeat(TemporalUtils.java:309) ~[io.airbyte-airbyte-commons-temporal-0.40.26.jar:?]
    	... 3 more
    Caused by: io.airbyte.workers.exception.WorkerException: Orchestrator process exited with non-zero exit code: 1
    	at io.airbyte.workers.sync.LauncherWorker.lambda$run$3(LauncherWorker.java:210) ~[io.airbyte-airbyte-commons-worker-0.40.26.jar:?]
    	at io.airbyte.commons.temporal.TemporalUtils.withBackgroundHeartbeat(TemporalUtils.java:309) ~[io.airbyte-airbyte-commons-temporal-0.40.26.jar:?]
    	... 3 more
    ```
    I've found a similar thread without an answer: https://airbytehq.slack.com/archives/C021JANJ6TY/p1661440742418679. Could someone help me?
  • Nicolas Xu, 01/04/2023, 10:57 AM
    Hi everyone, wishing you all the best for the year to come. I'm trying to schedule Airbyte runs on sources that fetch data from a "start_date", but that's not optimal, since the volume of data will only increase as time goes on. I sought help in this documentation: https://airbyte-public-api-docs.s3.us-east-2.amazonaws.com/rapidoc-api-docs.html#post-/v1/sources/update. Would anyone have an example of how to use this with an orchestrator (Airflow in my case), and does it work? (I see the issue is still open: https://github.com/airbytehq/airbyte/issues/10001)
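    Not a confirmed recipe, but the endpoint linked above can be called from any orchestrator step (for example an Airflow BashOperator). A sketch with placeholder values; note that, per those docs, connectionConfiguration replaces the whole source config, so all existing fields should be resent, not just start_date:

    ```bash
    # Placeholder ID and config; resend the source's full configuration here.
    curl -s -X POST "http://localhost:8000/api/v1/sources/update" \
      -H "Content-Type: application/json" \
      -d '{
            "sourceId": "00000000-0000-0000-0000-000000000000",
            "name": "my-source",
            "connectionConfiguration": {
              "start_date": "2023-01-01T00:00:00Z"
            }
          }'
    ```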
  • Rachel RIZK, 01/04/2023, 11:27 AM
    Hello, I'm still waiting for a review on this same PR. Is anyone available please? 🙏
  • Aravindh MK, 01/04/2023, 12:45 PM
    Hi everyone! I have successfully created a connection to the EC2 instance from my machine, but when I check the deployment it fails! Can anyone please help me?
  • Mauro Veneziano, 01/04/2023, 1:37 PM
    Hello everyone! Is there an equivalent to pressing the "Refresh source schema" button in the Airbyte API? I need to update the source schema for hundreds of connections. I read that you need to run a sync, but for so many connections that would take a LONG time. Also, if I have to do it by hand each time, I think I'm just going to roll into the fetal position, cry, and quit my job.
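    Unconfirmed, but the Config API behind the UI exposes schema discovery, which can be scripted. A sketch with placeholder IDs (endpoint name from the public API docs; disable_cache forces a fresh discovery). Applying the refreshed catalog to each connection would still need a follow-up connection-update call:

    ```bash
    # Re-run schema discovery for each source (IDs are placeholders)
    for SOURCE_ID in "11111111-1111-1111-1111-111111111111" "22222222-2222-2222-2222-222222222222"; do
      curl -s -X POST "http://localhost:8000/api/v1/sources/discover_schema" \
        -H "Content-Type: application/json" \
        -d "{\"sourceId\": \"${SOURCE_ID}\", \"disable_cache\": true}"
    done
    ```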
  • Sandeep Yadav, 01/04/2023, 1:57 PM
    Hi everyone, do we think this bash script will run as user data? It doesn't look like a bash script, and it is failing for me when I run it on a Linux box.
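    The script itself isn't shown in this archive, but a common cause of user-data failures is worth noting: EC2 only executes user data as a script when it starts with a shebang. A generic sketch:

    ```bash
    #!/bin/bash
    # Without the shebang above, cloud-init will not run the content as bash.
    set -euxo pipefail   # fail fast; commands are echoed to /var/log/cloud-init-output.log
    # ... script body goes here ...
    ```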
  • Renato Todorov, 01/04/2023, 2:49 PM
    Hello everyone. I'm having a strange failure with Airbyte deployed to Kubernetes. Most of the API calls are returning an "upstream request timeout" error, and I'm currently not able to understand where it is coming from, because there are absolutely no errors in the logs, even with DEBUG enabled. I've logged an issue here; any help is appreciated: https://github.com/airbytehq/airbyte/issues/20963
  • Abubakar Alaro, 01/04/2023, 2:52 PM
    Hello everyone, I'm building a custom destination connector and would like to add some logging to it. How can I go about it? Thanks
  • Davison Rebechi, 01/04/2023, 3:22 PM
    Hi everyone, I'm using the SQL Server connector for a CDC process. The process is working correctly, but the numeric data types of the table were not mapped (recognized), making it impossible to bring in the numeric records. I would like some help, if it is possible, to edit the SQL Server source connector and add the missing fields and data types. Thanks.
    • Is this your first time deploying Airbyte?: No
    • OS Version / Instance: Ubuntu
    • Memory / Disk: 16 GB / 100 GB
    • Deployment: Kubernetes deployment?
    • Airbyte Version: 0.40.18?
    • Source name/version: Microsoft SQL Server (MSSQL) ALPHA 0.4.26
    • Destination name/version:
    • Step: The issue is happening while creating the source connection
    • Description: I configured the SQL Server source connector for the CDC process. CDC is working normally, but some numeric columns of the tables are not mapped by the SQL Server source connector, in this case the "QT_EQUIP" column. There are other numeric columns with the same problem; is it possible to edit the connector and add the columns it didn't map?
    • Thanks
    https://discuss.airbyte.io/t/sqlserver-source-connector-does-not-recognize-data-types/3587
  • Justen Walker, 01/04/2023, 4:49 PM
    Is there an optimal way of using Snowflake as a destination for Airbyte loads? We periodically run into long-running queries where the Snowflake connector has set up a transaction to copy multiple temporary tables into target tables as a single transaction, and they block progress unless they are killed. I believe this happens when queries start to queue because the warehouse is overloaded, but it seems that even sizing up the warehouse after the queuing starts doesn't resolve the issue.
  • Aravindh MK, 01/04/2023, 5:47 PM
    sudo wget https://github.com/docker/compose/releases/download/1.26.2/docker-compose-$(uname -s)-$(uname -m) -O /usr/local/bin/docker-compose
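    For completeness, the Compose install guide this command comes from follows the download with a chmod; a short sketch:

    ```bash
    sudo chmod +x /usr/local/bin/docker-compose   # make the downloaded binary executable
    docker-compose --version                      # verify the install worked
    ```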
  • Aravindh MK, 01/04/2023, 5:48 PM
    Can somebody explain what the uname is?
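    For reference (not from the thread): uname is a standard Unix command that prints system information, and the download URL above uses it to pick the right binary for the host:

    ```bash
    uname -s   # kernel name, e.g. "Linux"; fills the first placeholder in the URL
    uname -m   # machine architecture, e.g. "x86_64"; fills the second
    ```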
  • Tamas Foldi, 01/04/2023, 5:56 PM
    Hey, I'd like to specify a storage class for the Airbyte Helm chart (for instance, to use an EFS CSI driver-based storage class for MinIO), but it seems the chart does not support this. Any chance you will add it in the near future? Or, if you think it's appropriate, I can add that config option as a PR.
  • Thomas Pedot, 01/04/2023, 6:27 PM
    Hello, I'm following the low-code doc and I wonder how to make incremental reads with a query parameter rather than a URL path: https://docs.airbyte.com/connector-development/config-based/tutorial/incremental-reads
  • Murat Cetink, 01/04/2023, 6:45 PM
    Hi, do you know when v0.40.27 will be released?
  • Luan Carvalho, 01/04/2023, 7:26 PM
    Hello everyone! I'm using the Postgres connector to load some big tables to S3. Can I load more than one table at a time in a connection? Right now I can only load one table at a time, and I want to load multiple tables in parallel.
  • Ruud Erie, 01/04/2023, 8:05 PM
    Hi everyone, I'm trying to use Airbyte's Salesforce connector, which asks for a client ID, client secret, and refresh token. I've been unable to get the refresh token using the guide provided. I was wondering if anyone has gone through this hands-on and might be able to give me some pointers.
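    Not the connector guide itself, but the generic Salesforce OAuth web-server flow it is based on looks roughly like this; all placeholders come from your connected app, which must request the refresh_token / offline_access scope:

    ```bash
    # Step 1 (browser): authorize and capture the ?code=... returned to your redirect URI:
    #   https://login.salesforce.com/services/oauth2/authorize?response_type=code&client_id=<CLIENT_ID>&redirect_uri=<REDIRECT_URI>

    # Step 2: exchange the code for tokens; the JSON response includes refresh_token
    curl -s -X POST "https://login.salesforce.com/services/oauth2/token" \
      -d "grant_type=authorization_code" \
      -d "code=<CODE_FROM_STEP_1>" \
      -d "client_id=<CLIENT_ID>" \
      -d "client_secret=<CLIENT_SECRET>" \
      -d "redirect_uri=<REDIRECT_URI>"
    ```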