# troubleshooting
  • Hamza Ansari (03/29/2022, 7:43 AM)
    Hi Everyone,
  • Hamza Ansari (03/29/2022, 7:49 AM)
    Hi everyone, when I create a connection in Airbyte with Postgres as the source and S3 as the destination, I am not able to select Incremental | Append as the replication mode. It shows "The form is invalid. Please make sure that all fields are correct." Please help me.
  • Hamza Ansari (03/29/2022, 9:38 AM)
    I am using the latest versions of the Postgres source and S3 destination connectors, but the data is not written to the S3 path I specified when creating the destination. It replicates the data into the bucket root rather than into the bucket subdirectory I configured.
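    A quick way to confirm where the files actually landed (a sketch, assuming the AWS CLI is configured; the bucket name and prefix below are placeholders for the values set in the destination) is to list the bucket with and without the configured path:
        # List everything Airbyte wrote to the bucket root (placeholder bucket name)
        aws s3 ls s3://my-airbyte-bucket/ --recursive | head -n 20
        # List only the configured subdirectory; empty output here reproduces the issue
        aws s3 ls s3://my-airbyte-bucket/raw/postgres/ --recursive | head -n 20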
  • Alpana Shukla (03/29/2022, 11:57 AM)
    What are the tmp tables that are created during a sync with the Zendesk Support source?
  • Manish Tomar (03/29/2022, 12:25 PM)
    Airbyte adds '_Airbyte' to table names when I send data to Snowflake, but I don't want that.
  • Manish Tomar (03/29/2022, 12:26 PM)
    I want to set the table name myself. How can I do that?
  • Manish Tomar (03/29/2022, 12:30 PM)
    Also, let's say I create a connection from a REST API to Snowflake and successfully sync data to table 'XYZ', but then I rename the table to 'ABC'. What happens on the next sync from the REST API to Snowflake?
  • Nick Chao (03/29/2022, 2:15 PM)
    Hi, I'm having an issue with Airbyte (version 0.30.25-alpha): some time ago the sync scheduler stopped working. I have a job configured to sync with a 5-minute frequency, but no sync is occurring on this schedule, and there is nothing in the scheduler log to indicate that any sync attempt is being made. The schedule worked when it was initially set up, but at some point several weeks ago it stopped. I would like to avoid restarting the service to resolve this if possible. Does anyone have tips on where to look in the system logs?
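    A minimal sketch for checking whether the scheduler is still making scheduling decisions on a Docker Compose deployment (the container names below are the defaults from docker-compose.yaml; the connection id is a placeholder):
        # Recent scheduler and worker activity
        docker logs --since 1h airbyte-scheduler 2>&1 | tail -n 100
        docker logs --since 1h airbyte-worker 2>&1 | tail -n 100
        # Look for any mention of the affected connection
        docker logs airbyte-scheduler 2>&1 | grep -i '<connection-id>'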
  • Javier Llorente Mañas (03/29/2022, 4:17 PM)
    Is this your first time deploying Airbyte: No
    OS Version / Instance: Ubuntu
    Deployment: Kubernetes
    Airbyte Version: 0.35.57-alpha
    Source name/version: Salesforce 1.0.2
    Destination name/version: BigQuery Denormalized 0.2.11
    Step: When running multiple connections in different pods, the worker is not able to sync the logs between the source and destination, and the pods end up crashing.
  • Manish Tomar (03/29/2022, 6:18 PM)
    Is there any way to remove the _AIRBYTE_RAW_ prefix from table names in Snowflake?
  • Marcos Marx (Airbyte) (03/29/2022, 8:30 PM)
    If you encounter any issues using Airbyte, check out our Troubleshooting forum. You’ll see how others have got their issues resolved, and our team will be there to assist if your issue hasn’t been encountered yet.
  • gunu (03/30/2022, 1:31 AM)
    Is this your first time deploying Airbyte: No
    OS Version / Instance: Linux EC2 m5.4xlarge
    Deployment: Docker
    Airbyte Version: 0.35.50-alpha
    Source: GitHub (0.2.23)
    Destination: Snowflake (0.4.24)
    Description: The only stream connected is an incremental stream for reviews, which produces the following error:
        '>' not supported between instances of 'str' and 'NoneType'
  • Anton Escalante (03/30/2022, 5:47 AM)
    Transferring with the latest Airbyte from MSSQL 0.3.17 to Snowflake 0.4.24: I was tracking records per day and was surprised to find that the data was emitted on the 29th but some rows were only normalized today.
  • Ivan Zaykov (03/30/2022, 6:43 AM)
    Hey @Marcos Marx (Airbyte) I'm just checking if you had a chance to investigate this 🙂
  • Hokam Singh Chauhan (03/30/2022, 6:51 AM)
    Hi, can someone please share details about where the logs for the connections we run are stored?
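    A sketch of the usual places to look on a Docker deployment (the airbyte_workspace volume name and the logs.log layout are assumptions based on the default setup; job and attempt ids are placeholders):
        # Platform-level logs
        docker logs airbyte-server | tail -n 100
        docker logs airbyte-worker | tail -n 100
        # Per-sync logs live in the workspace volume, one folder per job/attempt
        docker run --rm -v airbyte_workspace:/data busybox ls /data
        docker run --rm -v airbyte_workspace:/data busybox cat /data/<job_id>/<attempt>/logs.log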
  • Javier Llorente Mañas (03/30/2022, 7:06 AM)
    Sync worker failed.
  • Jules Druelle (03/30/2022, 9:31 AM)
    Sorry about the delay
  • Manish Tomar (03/30/2022, 9:53 AM)
    @Harshith (Airbyte) I don't want the default _AIRBYTE_RAW prefix for my tables in the Snowflake destination. How can I do that? I believe there was an option that lets us set the table prefix; I'm attaching an image of it, but I can't seem to find that option now.
  • Matthieu Lombard (03/30/2022, 10:12 AM)
    I have the same problem. Is there an existing issue for that?
  • Suntx (03/30/2022, 11:14 AM)
    Is this your first time deploying Airbyte: Yes
    OS Version / Instance: Ubuntu 20.04
    Memory / Disk: 4 GB / 50 GB SSD
    Deployment: Docker
    Airbyte Version: 0.26.2-alpha
    Source name/version: File 0.29
    Destination name/version: Postgres 0.3.4
    Step: Setting up a new connection
    Description: I'm trying to sync for the first time and the process doesn't finish. I have successfully added the destination and the source, a file hosted in OneDrive/SharePoint and shared publicly, but the process fails to load the schema whenever I try to create the connection.
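    One thing worth ruling out (a sketch; the URL is a placeholder for the public share link) is whether the link returns the raw file rather than an HTML viewer page, since the File source needs a direct download URL:
        # Follow redirects and check the final Content-Type; text/html here means the
        # link points at the SharePoint preview page, not the file itself
        curl -s -L -I 'https://<public-share-url>' | grep -i '^content-type'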
  • 윤도경 (03/30/2022, 5:12 PM)
    Is this your first time deploying Airbyte: Yes
    OS Version / Instance: M1 Mac
    Deployment: Docker
    Airbyte Version: dev build of 0.35.62-alpha
    Source: mongodb-v2
    Description: I can't customize the connector image. To enable awsDocumentDB with TLS, I forked the repository and modified the code by referring to the following PR: https://github.com/airbytehq/airbyte/pull/10995
        SUB_BUILD=PLATFORM ./gradlew build
        VERSION=dev docker-compose up
    After that, it was built and executed, but the changes were not reflected. Is there any other way to build a local image?
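    If the goal is only to rebuild the modified connector image rather than the whole platform, a sketch like the following may help (the airbyteDocker Gradle task and the dev image tag are the conventions the repository used around this version; treat them as assumptions):
        # Build just the connector module and its Docker image (tagged dev)
        ./gradlew :airbyte-integrations:connectors:source-mongodb-v2:airbyteDocker
        # Confirm the freshly built image carries the dev tag
        docker images | grep source-mongodb-v2
    The source definition's image tag can then be pointed at dev in the UI so the platform picks up the locally built connector instead of the published one.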
  • Prakash (03/30/2022, 7:17 PM)
    Hi everyone, I am trying to use the open-source Faros connector but I am getting the error below. I have tried this with multiple connectors, and the same error comes up when I try to pull the image in the Airbyte source section.
        ERROR i.a.s.RequestLogger(filter):110 - REQ 172.18.0.2 POST 500 /api/v1/source_definitions/create - {"name":"okta","documentationUrl":"https://github.com/faros-ai/airbyte-connectors/tree/main/sources/okta-source","dockerImageTag":"latest","dockerRepository":"connectprakash/okta-source"}
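    Since the request in the log uses dockerImageTag "latest", one thing to try (a sketch; the host, port, and pinned tag are placeholders, and the other fields are copied from the log) is to confirm the image is pullable and then register it with a tag that actually exists in the registry:
        # Verify the image/tag can be pulled at all
        docker pull connectprakash/okta-source:<pinned-tag>
        # Re-create the source definition with the pinned tag instead of latest
        curl -s -X POST http://localhost:8000/api/v1/source_definitions/create \
          -H 'Content-Type: application/json' \
          -d '{"name":"okta","dockerRepository":"connectprakash/okta-source","dockerImageTag":"<pinned-tag>","documentationUrl":"https://github.com/faros-ai/airbyte-connectors/tree/main/sources/okta-source"}'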
  • Marcos Marx (Airbyte) (03/30/2022, 8:30 PM)
    If you encounter any issues using Airbyte, check out our Troubleshooting forum. You’ll see how others have got their issues resolved, and our team will be there to assist if your issue hasn’t been encountered yet.
  • Andreas Flakstad (03/31/2022, 7:56 AM)
    Is this your first time deploying Airbyte: No
    OS Version / Instance: M1 Mac
    Deployment: Kubernetes
    Airbyte Version: 0.35.58-alpha
    Source: Mixpanel
    Description: Hey all, I'm looking at setting up the Mixpanel source and I get:
        Your plan does not allow API calls. Upgrade at mixpanel.com/pricing
    Do any of you know what sort of plan is necessary to be able to make API calls? I'm currently on Growth with the Data Pipelines add-on.
  • Max Berezovskyi (03/31/2022, 8:06 AM)
    Hi team! I have a connector called Personalkollen with the source and destination both named test_personalkollen_11. Can I somehow access the name of this connection from code? For example, every stream has self.name, which gives you the stream's name. Is there a similar option for the connection name or the destination name?
  • Ben Hadman (03/31/2022, 8:19 AM)
    The connectors page appears to be down: https://airbyte.com/connectors
  • Hamza Ansari (03/31/2022, 8:54 AM)
    Hi everyone, I am using the Airbyte API and I am getting the error "Response Status: Failed to fetch (CORS or Network Issue)". I am running Airbyte on an AWS EC2 instance. Please help me.
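    To separate a networking problem from a browser CORS problem, a minimal check (a sketch; port 8000 is assumed to be the default proxy port and the hostname is a placeholder) is to hit the API from the instance itself and then from the machine running the browser:
        # On the EC2 instance: a 200 here means the server itself is healthy
        curl -s -o /dev/null -w '%{http_code}\n' http://localhost:8000/api/v1/health
        # From the client machine: a timeout here points at security-group/firewall
        # rules rather than CORS
        curl -s -o /dev/null -w '%{http_code}\n' http://<ec2-public-dns>:8000/api/v1/health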
  • Nahid Oulmi (03/31/2022, 9:39 AM)
    Is this your first time deploying Airbyte: No
    OS Version / Instance: Debian 10 (buster) on AWS EC2, 8 cores, 16 GB RAM
    Deployment: Docker
    Airbyte Version: 0.35.45-alpha
    Source name: MySQL (0.5.+)
    Step: MySQL connector failing to connect to the database
    Description: Using the MySQL 0.4.13 connector I can successfully create and connect to my source, but after switching to a 0.5+ connector version I get this error:
  • Arash Layeghi (03/31/2022, 11:03 AM)
    Is this your first time deploying Airbyte: No
    OS Version / Instance: AWS EC2
    Memory / Disk: 8 GB / 30 GB
    Deployment: Docker
    Airbyte Version: 0.35.62-alpha
    Source: MySQL (0.5.6)
    Destination: Clickhouse (0.1.4)
    Source name/version: MySQL 8.0.28
    Destination name/version: Clickhouse 22.1.3.7
    Description: I created a new MySQL source and then a fresh connection from scratch to ClickHouse. When I use the Incremental | Deduped + History sync mode I get an error (logs attached). The tables are created but they are empty in ClickHouse. The exact same steps work fine with MySQL 5.7.33. The documentation says MySQL 8.0 is supported, right?