# ask-community-for-troubleshooting
  • Miles Poindexter
    05/27/2021, 10:47 PM
    Hi, I'm running Airbyte on macOS. The app comes up OK, but when I try the Exchange Rate API in the UI and fill in the data, the UI tells me that the check failed and the API's response was 400. But when I use my API key and go directly to the API, it works.
  • Jins Kadwood
    05/28/2021, 3:33 AM
    How do I change the namespace for the destination (if possible)? I'm using the latest Airbyte version.
  • Steve
    05/28/2021, 9:42 AM
    Hi, I'm getting a white screen when I visit localhost:8000 after attempting to launch Airbyte with
    docker compose up
    To try to debug it I've started from scratch by removing all the existing Airbyte Docker images and the Airbyte container, deleting the local repo, and re-cloning, but it's always the same issue. The console output reports errors around:
    airbyte-db         | 2021-05-28 09:33:24.676 UTC [29] ERROR:  database "temporal" already exists
    airbyte-db         | 2021-05-28 09:33:24.676 UTC [29] STATEMENT:  CREATE DATABASE temporal
    airbyte-temporal   | 2021/05/28 09:33:24 error creating database:pq: database "temporal" already exists
    airbyte-temporal   | + temporal-sql-tool --plugin postgres --ep db -u docker -p 5432 --db temporal setup-schema -v 0.0
    airbyte-temporal   | 2021/05/28 09:33:24 Starting schema setup, config=&{SchemaFilePath: InitialVersion:0.0 Overwrite:false DisableVersioning:false}
    airbyte-temporal   | 2021/05/28 09:33:24 Setting up version tables
    airbyte-db         | 2021-05-28 09:33:24.709 UTC [30] ERROR:  relation "schema_version" already exists
    and
    airbyte-server     | 2021-05-28 09:33:28 ERROR i.a.s.v.VersionMismatchServer(getServer):68 - {workspace_app_root=/tmp/workspace/server/logs} - Version mismatch between 0.24.2-alpha and 0.14.1-alpha.
    I don't understand how something can "exist" after I've deleted all the Docker components and the code repo. The full output is attached here. My environment:
    macOS 11.0.1
    Docker version 20.10.6, build 370c289
    Does anyone have any ideas please? Thanks
    airByteDockerLog.txt
    ✅ 1
  • Patric
    05/28/2021, 12:54 PM
    I followed https://docs.airbyte.io/deploying-airbyte/on-gcp-compute-engine and it looks like Airbyte is not able to launch the Stripe source image.
  • Steve
    05/28/2021, 1:54 PM
    Hi, having got Airbyte running on my local machine, I wanted to set things up on a remote Ubuntu VM on Google Cloud. I get this when attempting to launch Airbyte: /airbyte$ docker-compose up
    ERROR: Invalid interpolation format for "environment" option in service "scheduler": "AIRBYTE_ROLE=${AIRBYTE_ROLE:-}"
    Environment:
    Ubuntu 18.04
    Docker version 20.10.6, build 370c289
    It's not critical, as Airbyte is running fine locally for me to investigate with, but if it's an easy fix it would be great to have it running remotely too. Thanks.
    ✅ 1
  • Nathan Atkins
    05/28/2021, 5:19 PM
    Just fired up Airbyte this morning. Eventually I want to replicate MongoDB to Postgres. I started with Postgres to Postgres using the DVD Rental database. I got the source and destination set up and the connector working to replicate from one database to another. I had expected the table schema in the new database to match the source database, but I wound up with all the data in a jsonb blob. This matches the Postgres destination documentation, but it differs from my expectations. Is this the expected behavior, or do I need to configure something differently?
    ✅ 1
  • Nathan Atkins
    05/28/2021, 5:20 PM
    Screen Shot 2021-05-27 at 11.24.45 AM.png, Screen Shot 2021-05-27 at 11.25.01 AM.png
  • Daniel Getejanc
    05/28/2021, 9:38 PM
    Hi everyone, I was trying to set up two MongoDB to Postgres connections. For one of the Mongo databases it worked perfectly fine; for the other one it didn't. The difference I can see so far is that the one not working has to be set up using TLS. I've attached the logs. It would be great if you could help me and point me in the right direction. Thanks a lot!
    logs-e8c32a63-2946-476a-8015-f1912009ccf0-.txt
    ✅ 1
  • P.VAD
    05/29/2021, 6:51 AM
    Hello everyone, I'm having a hard time connecting my S3 as a destination. I am using MinIO as my S3. Currently my MinIO S3 is not running with SSL yet, so I need to connect in a way that doesn't require SSL. Any suggestions?
    ✅ 1
  • Munirah
    05/29/2021, 10:51 PM
    Hi everyone, I'm using Airbyte for the first time, trying to import a local CSV file as a source, but this error showed up:
    logs-a24fccf7-999e-4ab3-90d3-7c8f7c972f24-.txt
  • John Smith
    05/30/2021, 8:18 PM
    Hi. Can Airbyte be used for CDC from Postgres to Kafka? I didn't find Kafka in the list of connectors.
    ✅ 1
  • Baatch
    05/31/2021, 5:57 AM
    Hi, when can we expect support for Delta Lake? 😄
    ✅ 1
  • Vika Petrenko
    05/31/2021, 9:38 PM
    Hi, I am developing a custom connector. Let's say I have survey entities and a stream for them. For each entity I need to:
    • call a POST request to start exporting survey responses
    • call a POST request to download the csv/json file containing the responses
    • import those responses from the file
    What is the best way to do this? Thanks!
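    A rough sketch of the usual shape for this kind of export flow; every endpoint, field, and parameter name below is hypothetical, and inside a connector this logic would typically live in the stream's read_records:
    import csv
    import io
    import time

    import requests

    BASE = "https://api.example.com"  # hypothetical API root

    def export_survey_responses(survey_id, token):
        headers = {"Authorization": f"Bearer {token}"}
        # 1. POST to start the export job for this survey
        job = requests.post(f"{BASE}/surveys/{survey_id}/exports", headers=headers).json()
        # 2. poll until the export is ready, then download the file
        while True:
            status = requests.get(f"{BASE}/exports/{job['id']}", headers=headers).json()
            if status["state"] == "done":
                break
            time.sleep(5)
        body = requests.get(status["file_url"], headers=headers).text
        # 3. parse the CSV and yield one record per survey response
        yield from csv.DictReader(io.StringIO(body))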
  • Matt Hardner
    06/02/2021, 5:42 PM
    Hello, I am on the last step of getting Airbyte up and running on GCP. I am running
    gcloud beta compute ssh --zone "us-central1-a" --tunnel-through-iap "instance-1" --project "curantisapp-staging-shared" -- -L 8000:localhost:8000 -L 8001:localhost:8001 -N -f
    but am getting the following:
    ERROR: (gcloud.beta.compute.ssh) Could not fetch resource: - Request had insufficient authentication scopes.
    ✅ 1
  • Charter Smith
    06/03/2021, 2:10 AM
    Hey all! I'm just getting started and am connecting a Google Sheet as a test. When I made the connection I was only able to pull in the first tab of a spreadsheet - is there an easy way to select a different specific tab?
    ✅ 1
  • test
    06/03/2021, 5:05 AM
    Hi, I created a new ClickHouse connector as a destination; while syncing I'm getting the error below:
    2021-06-03 04:20:56 INFO (/tmp/workspace/15/0) DefaultReplicationWorker(run):132 - Waiting for source thread to join.
    2021-06-03 04:21:00 ERROR (/tmp/workspace/15/0) LineGobbler(voidCall):69 - Exception in thread "main" java.lang.NullPointerException
    2021-06-03 04:21:00 ERROR (/tmp/workspace/15/0) LineGobbler(voidCall):69 - at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:112)
    2021-06-03 04:21:00 ERROR (/tmp/workspace/15/0) LineGobbler(voidCall):69 - at io.airbyte.integrations.source.clickhouse.ClickHouseSource.main(ClickHouseSource.java:114)
    ✅ 1
  • Mohammad Shahvez
    06/03/2021, 7:10 AM
    I have created a connection with a Postgres destination. Where do I check the data after a sync?
    ✅ 1
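    For reference, Airbyte lands each stream in the destination schema as a raw table named like _airbyte_raw_<stream>, with the record JSON in an _airbyte_data column, plus typed tables when basic normalization is enabled. A quick way to look, as a sketch (all connection details are placeholders):
    import psycopg2

    # placeholders: point these at the Postgres destination configured in Airbyte
    conn = psycopg2.connect(host="localhost", dbname="postgres", user="postgres", password="change-me")
    cur = conn.cursor()
    # list the tables Airbyte created in the destination schema ("public" by default)
    cur.execute(
        "SELECT table_name FROM information_schema.tables WHERE table_schema = %s",
        ("public",),
    )
    for (name,) in cur.fetchall():
        print(name)  # expect _airbyte_raw_<stream> tables, plus normalized ones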
  • Mohammad Shahvez
    06/03/2021, 10:01 AM
    How do I integrate Airbyte with dbt on AWS EC2?
    👀 1
  • Anastasia
    06/03/2021, 10:25 AM
    Do you know any SQL editors with suggestions like in Google Docs (if there is an error, the word is underlined)?
    ✅ 1
  • Vika Petrenko
    06/03/2021, 8:29 PM
    Is it right that the state returned by get_updated_state doesn't do anything unless it is used to update the request params in the URL? Or does it skip items that are older than the latest state?
    ✅ 1
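    For context, a minimal sketch of how the pieces fit in the CDK: the state returned by get_updated_state is checkpointed and passed back as stream_state on the next sync, but nothing is skipped unless you use that state yourself, typically in request_params. The stream below is illustrative and the updated_since query param is hypothetical:
    from typing import Any, Iterable, Mapping, MutableMapping, Optional

    import requests
    from airbyte_cdk.sources.streams.http import HttpStream

    class Surveys(HttpStream):
        url_base = "https://api.example.com/"  # hypothetical API
        primary_key = "id"
        cursor_field = "updated_at"

        def path(self, **kwargs) -> str:
            return "surveys"

        def next_page_token(self, response: requests.Response) -> Optional[Mapping[str, Any]]:
            return None  # single page, to keep the sketch short

        def parse_response(self, response: requests.Response, **kwargs) -> Iterable[Mapping[str, Any]]:
            yield from response.json()

        def request_params(self, stream_state: Mapping[str, Any] = None, **kwargs) -> MutableMapping[str, Any]:
            # this is the only place the saved state changes what the API returns
            if stream_state and self.cursor_field in stream_state:
                return {"updated_since": stream_state[self.cursor_field]}  # hypothetical param
            return {}

        def get_updated_state(self, current_stream_state: MutableMapping[str, Any], latest_record: Mapping[str, Any]) -> Mapping[str, Any]:
            # the CDK checkpoints the returned dict and hands it back as
            # stream_state on the next sync; it does not filter records for you
            latest = latest_record.get(self.cursor_field, "")
            current = (current_stream_state or {}).get(self.cursor_field, "")
            # assumes ISO-formatted timestamps, so string comparison orders correctly
            return {self.cursor_field: max(latest, current)}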
  • Chirag Kakani
    06/04/2021, 9:16 AM
    Where can I get the API token from?
  • Vika Petrenko
    06/06/2021, 11:40 PM
    Does it make sense to filter items in read_records if the source API does not support query params for fetching new or modified data, so the stream can still behave like an incremental stream?
    def read_records(self, sync_mode, cursor_field=None, stream_slice=None, stream_state=None) -> Iterable[Mapping[str, Any]]:
        stream_state = stream_state or {}
        items = super().read_records(sync_mode=sync_mode, cursor_field=cursor_field, stream_slice=stream_slice, stream_state=stream_state)
        for item in items:
            # pass everything through on the first sync, when no cursor has been saved yet
            if self.cursor_field not in stream_state or self._field_to_datetime(item[self.cursor_field]) > self._field_to_datetime(stream_state[self.cursor_field]):
                yield item
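    Client-side filtering like this is the usual fallback when the API cannot filter server-side; every sync still fetches the full response, so it saves downstream work rather than API calls. The first-sync guard above matters because stream_state starts out empty, and the original lookup would raise a KeyError.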
  • Vika Petrenko
    06/06/2021, 11:45 PM
    Is it possible to set a partition field when using full refresh append? That way it would be possible to set a partition expiration and automatically get rid of old data.
  • Vika Petrenko
    06/07/2021, 12:24 AM
    How can I watch for item removal with an incremental stream? Is the assumption soft deletion, where the deleted row is still included in the response?
    ✅ 1
  • gunu
    06/07/2021, 8:55 AM
    Hey team, where can one configure the destination schema name? It seems to automatically use the source database name. Current use case: RDS (MySQL) --> Snowflake.
    ✅ 1
  • gunu
    06/07/2021, 9:00 AM
    One more (and apologies, I feel like I should know where this is but just can't find it): where can I see the changes for a connector from one version to another, e.g. mysql 0.3.3 --> mysql 0.3.4?
  • Hawkar Mahmod
    06/07/2021, 1:21 PM
    Hey folks, is it possible to stop the creation of multiple tables for streams created from API data sources that have nested data? Ideally I'd have just one table with the top-level data, and I can unnest as needed further down my pipeline. I'm not referring here to stopping normalization altogether.
    ✅ 1
  • Ly Pham
    06/07/2021, 1:51 PM
    Can anyone suggest an easier way to do this? I have tried using the puckel Airflow image and adding the requirements.txt in the Dockerfile, but none of that works. They all say that the module airflow.providers is not found, but the directories do exist. How can I fix this?
  • Ly Pham
    06/07/2021, 1:52 PM
    Also, how about executing a pipeline from the CLI? I saw the CLI on the front page but have yet to find any resources on how to use it.