# ask-community-for-troubleshooting
  • Baiyue Cao (07/14/2022, 5:56 PM)
    Hi all, how do I access secrets in the UI when setting up a source, after configuring SECRET_STORE_GCP_CREDENTIALS and SECRET_STORE_GCP_PROJECT_ID? I don't see any documentation on how to reference secrets during source setup.
  • ChrisH (07/15/2022, 2:08 AM)
    Ingesting data from OneDrive Personal.
  • ChrisH (07/15/2022, 2:27 AM)
    With my Office 365 account, I get 1 TB of storage in OneDrive with no egress costs. That's perfect for handling multiple files that are several GB in size, and quite affordable compared with many other storage options. But Airbyte has no source connector for OneDrive. I looked into using the SFTP connector, but my OneDrive Personal account can't expose an SFTP share. I have another service that can transfer my files from OneDrive to my S3 storage account on Linode (not AWS). However, the Airbyte S3 connector keeps giving me the following error:
    ValueError('Invalid endpoint: votesentry.us-southeast-1.linodeobjects.com')
    I've tried numerous configurations for the endpoint name, TLS settings, etc., but always get the same error. I'd love to connect to OneDrive, but barring that, if anyone is using Linode I would love to see their S3 connector configuration.
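    [Editor's note] botocore raises `ValueError('Invalid endpoint: …')` when the configured endpoint is not a well-formed URL, for example when it lacks an `http(s)://` scheme or still carries Slack's `<url|label>` link mangling from a copy-paste. A minimal sketch of a sanitizer (the helper name `clean_endpoint` is hypothetical, not part of Airbyte or botocore):

    ```python
    import re

    def clean_endpoint(raw: str) -> str:
        """Strip Slack's <url|label> link mangling and return a usable endpoint.

        Slack rewrites pasted URLs as <http://host|host>; if that string ends
        up in the S3 connector's endpoint field, botocore rejects it with
        ValueError('Invalid endpoint: ...').
        """
        m = re.fullmatch(r"<([^|>]+)(?:\|[^>]*)?>", raw.strip())
        url = m.group(1) if m else raw.strip()
        # botocore expects a scheme-qualified endpoint, e.g. https://host
        if not url.startswith(("http://", "https://")):
            url = "https://" + url
        return url
    ```

    For Linode Object Storage, a scheme-qualified regional endpoint such as `https://us-southeast-1.linodeobjects.com` (without the bucket name) is the form S3-compatible clients generally expect; verify this against Linode's documentation.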
  • Ivan Zhabin (07/15/2022, 7:56 AM)
    Hi all, I saw the ClickHouse connector 0.1.8 update (added a JDBC default socket timeout parameter). I did the update, but no fields for entering the parameters appeared.
  • per sunde (07/15/2022, 9:08 AM)
    I have a problem: "Incremental Sync - Deduped History" does NOT show up on the "Set up connection" page for Postgres to MySQL, so I cannot select it as the sync mode (as seen in the image). It does, however, show up if I do:
    • Postgres -> Postgres
    • MySQL -> Postgres
    So why can I not choose Incremental Sync - Deduped History for Postgres -> MySQL? If it works for Postgres to Postgres with the same source DB and tables, why would it not work for Postgres to MySQL? I am just testing out the OSS version and building an MVP, as we want to see how it works before fully committing to Airbyte.
  • Brian Mullins (07/15/2022, 9:55 AM)
    Hi all, I'm new to Airbyte. I built the container image locally but failed to create a connection to a HubSpot source; it seems to just hang. I also tried to connect to a MySQL database and the same thing happens. Any ideas what the issue may be?
  • Mangesh Nehete (07/15/2022, 10:16 AM)
    Hi team, we have set up two connections as follows:

    | Connection | Source | Destination | No. of streams | Sync mode (all streams) | Frequency | Data size/sync | Avg time/sync |
    |---|---|---|---|---|---|---|---|
    | Connection1 | Oracle DB | Snowflake | 15 | Full refresh \| Overwrite | Every 24 hours | 11.28 GB | 1 hour 38 minutes |
    | Connection2 | Oracle DB | Snowflake | 4 | Full refresh \| Overwrite | Every 24 hours | 8.32 GB | 42 minutes |

    We want to reduce the average time taken for these two jobs. Could you please suggest an approach to achieve this?
  • Stepan Chatalyan (07/15/2022, 2:53 PM)
    Hi team! Could you provide the correct link to this pls? https://docs.airbyte.com/connector-development/tutorials/cdk-tutorial-python-http/define-inputs#:~:text=ConnectorSpecification (connector specification)
  • Paul Sims (07/15/2022, 6:13 PM)
    We did some testing of Airbyte Open Source installed on our Azure VM, using a Docker container loading data into a Postgres DB on that same VM (not in Docker). Everything was working, so now we're trying to set up the same connection in Airbyte Cloud. We added Azure inbound port rules for Airbyte's public IP to allow Airbyte Cloud to connect to our Postgres DB on our Azure VM. However, we get an error when setting up our destination: "Could not connect with provided configuration. HikariPool-1 - Connection is not available, request timed out after 30009ms". The hostname is our public IP for the Azure VM and we have the correct port number. We have Rudderstack Cloud hitting that same Postgres DB using port rules and it works fine, so I know our port rules are working correctly. Any thoughts?
  • Shift 'N Tab (07/16/2022, 4:15 PM)
    Beginner question❓ Is it possible for Airbyte to send data from a Postgres (write-only) database to Python servers (Flask/Django/FastAPI)? So far, what I found is that it can replicate a Postgres database to a separate Postgres, i.e. write-only DB to read-only DB. What about database to Python servers?
  • Shift 'N Tab (07/16/2022, 4:26 PM)
    ^ nevermind it looks like it should be Postgres -> AirByte -> RabbitMQ -> External Servers 👍
  • Rajendra Badri (07/18/2022, 10:45 AM)
    It's really strange: we have tried multiple times to sync data with different options, and for all INT and INTEGER columns in our MSSQL source, Airbyte / Google BigQuery always converts the data type to FLOAT. Where is this going wrong? Can anyone suggest how to fix it? Airbyte creates `_airbyte_raw_*` tables whose `_airbyte_data` field holds the JSON data, and all INT fields end up as FLOAT; it looks really strange to us how and where this happens. We tried syncing the same data with hevodata.com and everything was fine there: the source and destination schemas and data types matched. But with Airbyte it's different. Can you please suggest how to resolve this?
  • Muhammad Imtiaz (07/18/2022, 1:22 PM)
    Hi, I'm trying to add the Microsoft SQL source connector in Airbyte version `0.35.31-alpha` with SSL Method: Encrypted (Verify Certificate). I've enabled the SSL option on the RDS instance side, but when I add the details I get an error. I don't see the `ssl_method.trustStorePassword` and `ssl_method.trustStoreName` fields in the UI in either version `0.35.31-alpha` or `0.39.17-alpha`. Can anybody let me know where to set these values? Or have these fields been added in a newer version? PS: the following is the PR that adds these fields to the UI: https://github.com/airbytehq/airbyte/pull/3195#:~:text=This%20displays%20in%20the%20UI%20as%20follows%3A
  • Arjun (07/18/2022, 1:56 PM)
    Hi all, what is the correct approach to developing an HTTP API source connector (using the Python CDK) with SSL verification disabled? Based on my understanding of the docs, I can manually specify this in the `check_connection` function using `verify=False` (if I use `requests`). But what about the stream class (which extends `HttpStream`)? How can I do the same there, since there is no explicit `requests` call? Apologies if it is a basic question, but I couldn't find an answer online. Thank you.
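    [Editor's note] For context on the question above: `HttpStream` drives its HTTP calls through a `requests` session internally, so one way to disable certificate verification everywhere is to set `verify` on a session rather than per call. A minimal sketch with plain `requests` (the helper `make_insecure_session` is hypothetical; whether and how the CDK exposes its session, e.g. via a `request_kwargs` hook, depends on the CDK version and should be checked against its source):

    ```python
    import requests

    def make_insecure_session() -> requests.Session:
        """Build a requests Session with TLS verification disabled.

        Session-level settings apply to every request sent through the
        session, so there is no need to pass verify=False per call.
        """
        session = requests.Session()
        session.verify = False  # disables certificate verification
        return session
    ```

    Disabling verification is only advisable for trusted internal endpoints, since it removes protection against man-in-the-middle attacks.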
  • Ömer Oğuz Özkahraman (07/18/2022, 3:31 PM)
    Hi, I'm trying to set up a BigQuery destination with GCS staging, but I get this error:
    Access Denied: Project <project>: User does not have bigquery.jobs.create permission in project <project>.
    I have the correct roles granted to the service account (BigQuery User and BigQuery Data Editor) and I still get the error. Can someone help? Thanks in advance.
  • maxim (07/18/2022, 3:55 PM)
    Hey all! Is it possible to set the default workspace ID myself before running docker compose? If yes, please tell me how 🙏
  • Yaning Zhang (07/18/2022, 8:15 PM)
    Hi, I’m trying to connect data from a MySQL 5.6.51 database to Snowflake using an Airbyte instance that someone else in my company set up, and I’m seeing a bunch of dbt errors during the load attempts.
    2022-07-18 18:58:08 normalization > 18:58:08  Database Error in model MRXW_LAV (models/generated/airbyte_tables/UMLS/MRXW_LAV.sql)
    2022-07-18 18:58:08 normalization > 18:58:08    002003 (42S02): SQL compilation error:
    2022-07-18 18:58:08 normalization > 18:58:08    Object 'UMLS_DATA_DEIDDEV.UMLS._AIRBYTE_RAW_MRXW_LAV' does not exist or not authorized.
    2022-07-18 18:58:08 normalization > 18:58:08    compiled SQL at ../build/run/airbyte_utils/models/generated/airbyte_tables/UMLS/MRXW_LAV.sql
    As these seem to be models that are set up by the connection at the top of the sync run, I’m not sure what’s needed or why they wouldn’t exist. The sync seems to be able to stage all of the data (using the internal “recommended” option), but fails as soon as it tries to transform that data into output to Snowflake. The data I’m trying to load includes some large tables (records in the hundreds of millions), and one of the questions I had was whether the recommended option could support staging that much data? Or if there’s some other pattern that I might be missing for ensuring that the sync can run to completion. Thanks!
  • Deepak Dalai (07/19/2022, 4:43 AM)
    Hello from India. Airbyte noob here. I'm trying to set up a connection from PostgreSQL and getting the following error on a rather large table (50M records) after it has successfully read 20M. Any help would be appreciated. Thanks in advance.
  • Chen Lin (07/19/2022, 2:49 PM)
    Hi all, I noticed that after deleting a connection I can still find it with status deprecated using /v1/web_backend/connections/list_all, is there a way to completely remove a connection from a workspace? Thanks
  • Ping-Lin Chang (07/19/2022, 9:36 PM)
    Hello team, I wonder what's the proper way to figure out the one-to-one mapping between the source and destination data? I know we have the `_airbyte_ab_id` UUID in the destination DB table, but how do I use the UUID to find the original record in the source? Thanks!
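    [Editor's note] One hedged observation on the question above: `_airbyte_ab_id` is a UUID that Airbyte generates at read time, so it has no counterpart in the source. The practical link back to a source record is the payload itself, which the raw tables keep as JSON in `_airbyte_data`; matching on the source table's own primary key inside that JSON gives the mapping. A minimal sketch (the column names follow the source text; the helper and its `pk` parameter are hypothetical):

    ```python
    import json

    def source_key_for(raw_row: dict, pk: str):
        """Given a row from an _airbyte_raw_* table, pull the source primary key.

        raw_row maps column names to values, as a DB driver would return it;
        _airbyte_data holds the full source record serialized as JSON.
        """
        record = json.loads(raw_row["_airbyte_data"])
        return record[pk]
    ```

    In SQL-capable destinations, the same idea can usually be expressed with the destination's JSON-extraction functions directly against the raw table.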
  • Joey Taleño (07/20/2022, 3:23 AM)
    Hello team! Has anyone here tried to load Google Analytics UA?
  • Shift 'N Tab (07/20/2022, 5:03 AM)
    Is Airbyte's database replication with Change Data Capture limited in the open-source version?
  • Ethan Brown (07/20/2022, 5:23 AM)
    Just exploring Airbyte as a possible solution for our team. I wanted to learn more about connector scaling. We need to pull data from various custom sources on behalf of our users, each with their own credentials. My understanding is we would have a connection set up for each of these source/user pairs. Is Airbyte a viable option for this use case, and will it scale well enough? The docs reference "running over a thousand connections"; I'm assuming that means actively syncing over a thousand at a time. If it means active syncs, then I suspect it would be fine for our use case. If we bump up against any limits, can we mitigate that with a queue, or will the built-in scheduler handle it fine? https://docs.airbyte.com/operator-guides/scaling-airbyte
    ...only uncommonly large workloads will require deployments at scale. In general, you would only encounter scaling issues when running over a thousand connections.
    As a reference point, the typical Airbyte user has 5 - 20 connectors and 10 - 100 connections configured. Almost all of these connections are scheduled, either hourly or daily, resulting in at most 100 concurrent jobs.
  • Mathieu Beau (07/20/2022, 4:56 PM)
    Hello! My developer is having some difficulty installing an unofficial data source (WooCommerce); we did not find explicit enough documentation about this process. Would you have a link to recommend, please? Thanks.
  • Leonardo Almeida Reis (07/20/2022, 11:25 PM)
    Hello! How can I set up a new MySQL source that is inside a Docker container running on a VM? On the add-source panel I'm trying to add it with an SSH connection to the VM, but I don't know how to indicate which of the containers MySQL is running in. BTW, the VM is running CentOS 6 and uses docker-compose to run microservices in separate containers.
  • Jaladanki Varaprasad (07/21/2022, 7:28 AM)
    @Channel Tools Hi, I am getting this error while syncing data from Redshift to Kafka. The sync is reported as a success even though it's failing. Any help, please?
  • Mathieu Beau (07/21/2022, 8:07 AM)
    Hello, I am trying to connect a Google Sheet as a source (with OAuth). The documentation says to click "Connect with Google", but there is no such button, only three fields: client ID, client secret, refresh token. Where do I get those, please? Or did I miss something about the "Connect with Google" button I should see? Thank you.
  • Huib (07/21/2022, 3:50 PM)
    Can anyone point me in the right direction? I'm struggling to get Airbyte to reliably sync my MSSQL tables to S3. Airbyte is running in k8s, and quite often syncs fail due to pods that aren't scheduled or things timing out somewhere. I just kicked off a very small sync job which hung for 45 minutes (after creating and running the source and destination checks); cancelling another sync immediately started the read and write pods, which kicked off the sync. What can I try to make Airbyte more responsive?
  • Vinicius Nunes (07/21/2022, 4:58 PM)
    My Airbyte doesn't show alpha connectors such as Metabase. How can I import them? Could someone explain it to me? I'm a beginner and I don't understand very well.
  • abdelrahman Ibrahim (07/21/2022, 5:54 PM)
    Hello, I just installed stable airbyte on an on-premise Kubernetes cluster. All the pods are running except airbyte-webapp