https://linen.dev
# ask-community-for-troubleshooting
  • Akhil Reddy
    12/08/2022, 4:39 AM
    Hi all, I am working on building a source connector for PostgreSQL. My PostgreSQL instance is running on Google Cloud SQL. While trying to create the connector, I get a timeout error. Would anyone be able to help me with this?
  • Anand PS
    12/08/2022, 6:02 AM
    Hi all, I tried Airbyte for data ingestion from Redshift to Oracle DB (on-prem) and I see the target table has JSON data; it took 41 minutes for 50k rows. Is it slow because the destination is on-prem, or is there anything I can do for performance tuning?
  • Gopinath Sekar
    12/08/2022, 6:07 AM
    Hey guys, is there any way we could incorporate user creation and management in Airbyte Open Source?
  • JJ Nilbodee
    12/08/2022, 9:29 AM
    Hey all, just want to ask which part of Airbyte does normalisation? I upgraded from 0.40.22 to 0.40.23 and my connection between HubSpot and BigQuery failed during normalisation. I couldn't find anything related in the release notes. I've attached the logs here.
    3c827b13_29e4_41bf_9dbd_ccdf09e5f6a2_logs_92_txt.txt
  • Ramon Vermeulen
    12/08/2022, 10:11 AM
    8 days ago we upgraded Airbyte from 0.40.14 to 0.40.22, and I see there is now a specific deployment (pod in k8s) for the cron scheduler. However, all of our pipelines that had a cron schedule configured before the upgrade have not been triggered in the 8 days since (we just found out). After re-saving the form with the cron schedule, it seems to be working again. Any idea what went wrong here? Is this a known issue, maybe a bug in this specific version of Airbyte or the related Helm chart? We used Helm chart version 0.42.2 and all pods seem to be in a green state.
  • Gopinath Sekar
    12/08/2022, 10:52 AM
    Hello all, does anyone know where the env file is stored for the Docker-installed version of Airbyte Open Source?
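    For the standard docker-compose deployment described in the Airbyte docs, the environment variables typically live in a hidden `.env` file next to `docker-compose.yaml` in the cloned airbyte repository. A minimal sketch to locate it, assuming that layout (the `AIRBYTE_DIR` default is an assumption, adjust it to wherever you cloned the repo):

    ```shell
    # Assumption: Airbyte was installed by cloning the airbyte repo and running
    # `docker compose up`. In that layout the env file is the hidden `.env`
    # next to docker-compose.yaml in the repo root.
    AIRBYTE_DIR="${AIRBYTE_DIR:-$HOME/airbyte}"   # adjust to where you cloned the repo

    if [ -f "$AIRBYTE_DIR/.env" ]; then
      echo "Found env file: $AIRBYTE_DIR/.env"
      grep '^VERSION=' "$AIRBYTE_DIR/.env" || true   # quick sanity check of its contents
    else
      echo "No .env at $AIRBYTE_DIR/.env; locate your clone with:"
      echo "  find / -maxdepth 4 -name docker-compose.yaml -path '*airbyte*' 2>/dev/null"
    fi
    ```

    Because the file is hidden, a plain `ls` in the repo directory will not show it; use `ls -a`.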
  • Wasantha P. Bandara
    12/08/2022, 11:00 AM
    Hi Team Airbyte, we are deploying Airbyte 0.40.15 on our server through the Docker image and creating a WooCommerce source via the API with a wrong "api_key" and "api_secret". Request type: POST, endpoint: "<endpoint>/v1/sources/create". But the response is always a success with response code 200. It creates a source using the wrong "api_key" and "api_secret" that is unable to connect.
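    A 200 here is consistent with the API's design: `sources/create` stores the configuration without validating credentials, and a separate `check_connection` call performs the actual connectivity test. A hedged sketch of that follow-up call (the host and source id below are placeholders, and the exact response shape should be verified against your Airbyte version):

    ```shell
    # Sketch, not verified against your deployment: AIRBYTE_HOST and SOURCE_ID
    # are placeholders. sources/create stores the config and returns 200 even
    # for bad credentials; sources/check_connection actually tests them.
    AIRBYTE_HOST="${AIRBYTE_HOST:-http://localhost:8000}"
    SOURCE_ID="00000000-0000-0000-0000-000000000000"   # hypothetical source id from the create response

    CHECK_CMD="curl -s -X POST $AIRBYTE_HOST/api/v1/sources/check_connection \
      -H 'Content-Type: application/json' \
      -d '{\"sourceId\": \"$SOURCE_ID\"}'"

    echo "Run this after creating the source:"
    echo "$CHECK_CMD"
    # Uncomment to execute against a live instance; for bad credentials the
    # "status" field in the response is expected to indicate failure:
    # eval "$CHECK_CMD"
    ```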
  • Ravikant Singh
    12/08/2022, 11:31 AM
    Hi team, I am not able to download the source schema for Salesforce. Can you tell me what the issue could be? I am getting null in the response, with this log:
    b'{"catalog":null,"jobInfo":{"id":"c3761b12-b42b-41f6-802d-a3765962b061","configType":"discover_schema","configId":"Optional[b117307c-14b6-41aa-9422-947e34922962]","createdAt":1670498905844,"endedAt":1670498941647,"succeeded":false,"logs":{"logLines":["2022-12-08 11:28:25 \\u001B[32mINFO\\u001B[m i.a.w.t.TemporalAttemptExecution(get):107 - Cloud storage job log path: /workspace/84e8d005-431a-426a-beeb-eaf1239e6dfc/0/logs.log"]}}}'
  • Sebastian Brickel
    12/08/2022, 3:08 PM
    Hi folks, I have a question regarding the environment variable TEMPORAL_HISTORY_RETENTION_IN_DAYS. Yesterday I set it to 1 in the docker-compose.yml file, then ran compose up. When I checked today, I still had all of yesterday's folders in my /tmp/workspaces directory. Why is that? Is this not the correct parameter for setting a retention period for the workspaces? FYI, ultimately I want to set the retention period to one or two weeks; 1 day was just to check for a quick result. Thanks for your help!
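    One hedged observation: TEMPORAL_HISTORY_RETENTION_IN_DAYS governs Temporal's workflow-history retention, and my understanding is that cleanup of the /tmp/workspaces folders may be governed by separate workspace-retention settings, so this variable alone would not retroactively delete existing folders. In the standard docker-compose deployment it is also usually set in the `.env` file rather than directly in docker-compose.yml. A minimal sketch of setting it there idempotently (the `.env` path is the assumed repo-root location):

    ```shell
    # Hedged sketch: upsert the retention setting into the .env file that the
    # standard docker-compose deployment reads, then recreate the containers.
    ENV_FILE="${ENV_FILE:-.env}"
    RETENTION_DAYS=14   # target: one to two weeks

    touch "$ENV_FILE"
    if grep -q '^TEMPORAL_HISTORY_RETENTION_IN_DAYS=' "$ENV_FILE"; then
      # Setting already present: overwrite its value in place
      sed -i "s/^TEMPORAL_HISTORY_RETENTION_IN_DAYS=.*/TEMPORAL_HISTORY_RETENTION_IN_DAYS=$RETENTION_DAYS/" "$ENV_FILE"
    else
      # Setting absent: append it
      echo "TEMPORAL_HISTORY_RETENTION_IN_DAYS=$RETENTION_DAYS" >> "$ENV_FILE"
    fi
    grep '^TEMPORAL_HISTORY_RETENTION_IN_DAYS=' "$ENV_FILE"
    # Then recreate the containers so the new value is picked up:
    # docker compose up -d
    ```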
  • A BK
    12/08/2022, 4:14 PM
    Hi folks, glad I found this Slack channel. I have a container-based deployment in AWS and am trying to reach the Airbyte UI. When I do that using port 8000, I see nginx and not the Airbyte UI. I am pretty sure it's due to the security group inbound rules. Wondering what I am missing.
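    Worth noting: if an nginx page is visible at all, port 8000 is already reachable, so the security group inbound rule is likely not the blocker; the proxy may simply not be routing to the webapp. For completeness, though, opening the port looks roughly like this (the group id and CIDR are placeholders; restrict the CIDR to your own IP range rather than 0.0.0.0/0):

    ```shell
    # Hedged sketch: SG_ID and MY_CIDR are placeholders, not real values.
    SG_ID="sg-0123456789abcdef0"    # hypothetical security group id
    MY_CIDR="203.0.113.0/24"        # example range (TEST-NET-3), use your own

    INGRESS_CMD="aws ec2 authorize-security-group-ingress \
      --group-id $SG_ID --protocol tcp --port 8000 --cidr $MY_CIDR"

    echo "$INGRESS_CMD"
    # Uncomment to apply for real (requires AWS CLI credentials):
    # eval "$INGRESS_CMD"
    ```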
  • Don H
    12/08/2022, 4:57 PM
    When I stand up Airbyte using Helm chart 0.42.4, the minio pod is stuck in Pending and will not start. I have had this problem periodically from version to version and I do not understand the cause. Is there a way around minio?
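    A Pending pod usually means the scheduler cannot place it (insufficient resources) or a PersistentVolumeClaim cannot bind (storage-class issues). These read-only commands surface the reason; the namespace and label below are typical for the chart but may differ in your release, so verify with `kubectl get pods` first:

    ```shell
    # Diagnosis sketch: namespace and pod label are assumptions based on the
    # chart's usual naming; check your own release before relying on them.
    NS="airbyte"   # assumed namespace

    DIAG_CMDS=$(cat <<EOF
    kubectl -n $NS get pods -l app.kubernetes.io/name=minio
    kubectl -n $NS describe pod -l app.kubernetes.io/name=minio | sed -n '/Events:/,\$p'
    kubectl -n $NS get pvc
    EOF
    )
    echo "$DIAG_CMDS"
    # Run each printed line against a live cluster; a PVC stuck in Pending in
    # the last command points at a missing or misconfigured storage class.
    ```

    As for avoiding minio entirely: the chart has historically allowed pointing log storage at external S3/GCS instead, but the exact values keys vary by chart version, so check the values.yaml of 0.42.4 rather than trusting a generic snippet.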
  • Jeremy Owens
    12/08/2022, 4:57 PM
    Good morning. If I were to attempt to resurrect a connection that was accidentally deleted, how would I go about doing that?
  • Josh Chapnick
    12/08/2022, 5:10 PM
    Hi - I'm working to deploy Airbyte on k8s and was wondering: is it possible to scale the workers down to zero when there are no sync processes running? It looks like one worker is running even when no sync jobs are running.
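    Mechanically this is just a deployment scale operation, sketched below; the deployment name follows the chart's usual naming and should be verified with `kubectl get deploy` first. Caveat: the worker also picks up scheduled jobs and connection checks, so nothing will run while it sits at zero replicas, and Airbyte has no built-in scale-to-zero autoscaling that I am aware of.

    ```shell
    # Hedged sketch: NS and WORKER_DEPLOY are assumptions; confirm with
    # `kubectl -n airbyte get deploy` before running.
    NS="airbyte"
    WORKER_DEPLOY="airbyte-worker"   # assumed deployment name

    SCALE_DOWN="kubectl -n $NS scale deployment $WORKER_DEPLOY --replicas=0"
    SCALE_UP="kubectl -n $NS scale deployment $WORKER_DEPLOY --replicas=1"

    echo "$SCALE_DOWN"
    echo "$SCALE_UP"
    # Uncomment to execute against a live cluster:
    # eval "$SCALE_DOWN"
    ```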
  • Dan Leshem
    12/08/2022, 7:49 PM
    Hi folks, just starting to play with Airbyte and I wonder if it's a good fit for my use case: we need to collect and analyze data on behalf of our clients from OAuth apps (for example, GitHub). So our clients would give us (not Airbyte) OAuth permission to the source. How would that work with Airbyte? Can we still use the GitHub connector, or do we have to develop a custom connector? Also, is it possible to create a new source from the API?
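    On the last question: yes, sources can be created programmatically via the config API. A hedged sketch of what that could look like for a GitHub source configured with a client-supplied token; the host, workspace id, definition id, and the exact `connectionConfiguration` shape are all placeholders that must be matched against your instance and the GitHub connector's spec:

    ```shell
    # Sketch only: every id below is a placeholder, and the configuration
    # shape must be checked against the GitHub connector's spec for your version.
    AIRBYTE_HOST="${AIRBYTE_HOST:-http://localhost:8000}"
    WORKSPACE_ID="00000000-0000-0000-0000-000000000000"   # hypothetical
    GITHUB_DEF_ID="11111111-1111-1111-1111-111111111111"  # hypothetical source definition id

    PAYLOAD=$(cat <<EOF
    {
      "workspaceId": "$WORKSPACE_ID",
      "sourceDefinitionId": "$GITHUB_DEF_ID",
      "name": "client-acme-github",
      "connectionConfiguration": {
        "repository": "acme/*",
        "credentials": { "access_token": "CLIENT_SUPPLIED_TOKEN" }
      }
    }
    EOF
    )
    echo "$PAYLOAD"
    # Uncomment to execute against a live instance:
    # curl -s -X POST "$AIRBYTE_HOST/api/v1/sources/create" \
    #   -H 'Content-Type: application/json' -d "$PAYLOAD"
    ```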
  • Yanfeng Wang (Tom)
    12/09/2022, 6:17 AM
    Hi, team. I have just upgraded from 0.39 to 0.40.25, and all connections can no longer sync data. They report the error: ""
  • junfeng pan
    12/09/2022, 7:23 AM
    Hi guys, does upgrading the source connector require deleting the original connection and creating a new one?
  • Soshi Nakachi仲地早司
    12/09/2022, 9:50 AM
    Hi, team. I am trying to do an integration using the App Store connector. Related threads and issues are below. • https://airbytehq.slack.com/archives/C021JANJ6TY/p1661245951575079?thread_ts=1661161170.202709&cid=C021JANJ6TY • https://discuss.airbyte.io/t/normalization-failed-during-appstore-dbt-run-for-the-sales-report/2340 The date format in App Store sales reports is US-style (mm/dd/yyyy), not ISO 8601. Therefore, CAST does not work when the destination is set to BigQuery. I think it would be better to output the date type as a string type. Or is it possible to convert using a custom schema type transformation? Are there any other good solutions?
  • Luan Carvalho
    12/09/2022, 10:20 AM
    Hello everyone! There is a problem with the new Helm chart version 0.43.0. I tried to do a simple deploy using the default values, without making any changes, and I got this error:
    Error: INSTALLATION FAILED: template: airbyte/templates/env-configmap.yaml:67:42: executing "airbyte/templates/env-configmap.yaml" at <.Values.worker.containerOrchestrator.image>: nil pointer evaluating interface {}.image
    Executed command: helm install my-airbyte airbyte/airbyte --version 0.43.0 -n airbyte
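    The error says the template dereferences `.Values.worker.containerOrchestrator.image` while the default values leave that key nil. A possible workaround, inferred purely from the error message (the key names below come from the error, not from verified chart docs), is to define the key explicitly in a values file:

    ```shell
    # Workaround sketch inferred from the error message only: make the
    # .Values.worker.containerOrchestrator.image key non-nil so the template
    # can evaluate it. Verify against the chart's values.yaml before use.
    cat > /tmp/airbyte-workaround-values.yaml <<'EOF'
    worker:
      containerOrchestrator:
        image: ""   # empty string just to make the key non-nil
    EOF
    cat /tmp/airbyte-workaround-values.yaml

    INSTALL_CMD="helm install my-airbyte airbyte/airbyte --version 0.43.0 -n airbyte \
      -f /tmp/airbyte-workaround-values.yaml"
    echo "$INSTALL_CMD"
    # Uncomment to run for real:
    # eval "$INSTALL_CMD"
    ```

    If this is a genuine regression in 0.43.0, pinning to the previous chart version with `--version` is the safer interim fix.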
  • Vivek Ayer
    12/09/2022, 4:05 PM
    Hi team, I need help figuring out why an initial sync using the Snowflake source (to the S3 destination) is resulting in a network timeout. Incremental syncs work just fine! This appears to be a cold-start issue and I can't figure out whether the issue is on the Airbyte EC2 instance or on the Snowflake side:
    at io.airbyte.integrations.source.snowflake.SnowflakeSource.main(SnowflakeSource.java:47) Caused by: net.snowflake.client.jdbc.SnowflakeSQLLoggedException: JDBC driver internal error: Timeout waiting for the download of #chunk164(Total chunks: 1147) retry=1.
    The sync payload size is around 20GB. The EC2 instance was initially an xlarge and was upgraded to a 2xlarge, but changing the instance type doesn't seem to help. Any help appreciated! Airbyte version 0.40.23
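    Context on the error: large Snowflake result sets are returned as chunks that the JDBC driver downloads client-side from Snowflake-managed cloud storage, so "Timeout waiting for the download of #chunk..." is a client-side network symptom rather than a warehouse-sizing one, which fits the observation that a bigger EC2 instance did not help. A hedged avenue to try, assuming the Snowflake source's optional `jdbc_url_params` field for passing extra driver options (parameter names below are from the Snowflake JDBC docs as I recall them; verify names and units before relying on them):

    ```shell
    # Hedged sketch: candidate driver options to pass via the source's
    # jdbc_url_params field. Names/units should be verified against the
    # Snowflake JDBC documentation.
    JDBC_URL_PARAMS="networkTimeout=600&queryTimeout=0"
    echo "Candidate jdbc_url_params for the Snowflake source config:"
    echo "  $JDBC_URL_PARAMS"
    # Also worth checking from the EC2 instance itself: chunk downloads go to
    # cloud-storage endpoints associated with your Snowflake deployment, so
    # outbound access to those hosts must be open, e.g.:
    # curl -sv "https://<your-account>.snowflakecomputing.com" -o /dev/null
    ```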
  • David Anderson
    12/09/2022, 6:01 PM
    I'm getting a very odd new error after upgrading from 0.40.23 to 0.40.24. This only seems to happen with one connector (HubSpot). I've tried rebuilding the connection from scratch and I get the same error with the new connection. Any ideas?
    2022-12-09 17:59:05 - Additional Failure Information: message='java.lang.NullPointerException: Cannot invoke "java.lang.Boolean.booleanValue()" because the return value of "io.airbyte.persistence.job.models.IntegrationLauncherConfig.getIsCustomConnector()" is null', type='java.lang.RuntimeException', nonRetryable=false
  • Zander Otavka
    12/09/2022, 9:50 PM
    I'm looking to deploy Airbyte on AWS in production. We have a fairly large dataset with about 500 million monthly active rows. We are debating between using EKS and EC2. In general, which of these is easier to manage? Which is more likely to scale to meet our needs?
  • Offisong Emmanuel
    12/11/2022, 7:40 AM
    Good day. I have issues setting up Airbyte Open Source. It shows completed when I check my terminal, but when I open localhost:8000 it doesn't bring up the UI.
  • Slackbot
    12/11/2022, 9:54 PM
    This message was deleted.
  • Slackbot
    12/11/2022, 9:55 PM
    This message was deleted.
  • Victor Jin
    12/12/2022, 7:17 AM
    Hello, I have an issue connecting Airbyte to DynamoDB: "The connection tests failed: common.error". Is someone familiar with this? Thanks
  • Gaurav Kumar
    12/12/2022, 10:29 AM
    Hello, team! I have created a custom connector using the Airbyte CDK which extracts data from an API, and I am using Azure Blob Storage as the destination. I want my data in a particular format in Azure Blob Storage. To be specific, I don't want the _airbyte_ab_id and _airbyte_emitted_at columns that Airbyte adds. Is there any way to remove these columns, or not add them, while syncing?
  • Ramon Vermeulen
    12/12/2022, 10:32 AM
    Any clue on how to debug this? When trying to set a destination or source to a different version in the UI, I get "error, something went wrong" with the following logs in the server pod:
    POST 500 /api/v1/destination_definitions/update - {"destinationDefinitionId":"22f6c74f-5699-40ff-833c-4a879ea40133","dockerImageTag":"1.2.7"}
  • Alon Edelman
    12/12/2022, 11:02 AM
    Hi, I am trying to sync from MSSQL to S3. It works on a small schema, but when I have a big schema (2k tables) I can't set up the connection and get this error: "non-json response". I tried using the API and was able to add one table, but when I try to make any UI changes I am back to the error. PS: using Docker. Does anyone have any idea what I can do? Thanks!
  • Mani M
    12/12/2022, 11:03 AM
    Hello, I'm facing an issue on our staging server. We have deployed Airbyte v0.40.6. I've configured minio to log into GCS and it was working fine; however, Airbyte now throws an error saying that there are not enough permissions. But the service account being used has all the permissions, and the same setup works fine in dev and prod.
  • Tobias Löfgren
    12/12/2022, 11:56 AM
    Hi everyone! I am currently working with Octavia CLI to redeploy our Airbyte instance. I am following the tutorial on how to set it up, and creating sources and destinations is no issue at all. However, the passwords stored in my local ~/.octavia file are not exported to my Airbyte instance. Does anyone know what might be failing? Airbyte instance: version 0.40.23, running with Docker on a GCP virtual machine. Octavia CLI: installed with docker run, according to this. My ~/.octavia file includes:
    AIRTABLE_API_KEY="password"
    The configuration file for Airtable includes:
    tables: [Reported Time] # REQUIRED
    api_key: ${AIRTABLE_API_KEY} # SECRET
    base_id: appWvfaXxrKlmO7aP # REQUIRED
    When I run octavia apply, it recognises that I have updated my secret password and updates the configuration. However, when I go into the Airbyte UI and look at Airtable, it says 'Invalid API key', meaning the environment variable hasn't been applied.
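    One hedged hypothesis: the `${AIRTABLE_API_KEY}` placeholder is resolved from the environment of the octavia *container*, and the documented install wraps `docker run` in a shell function that passes `~/.octavia` in via `--env-file`; a plain `docker run` without that flag would leave the variable unset. A sketch of the invocation with the env file passed explicitly (the image tag and mount path are assumptions, match them to your setup):

    ```shell
    # Hedged sketch: octavia resolves ${AIRTABLE_API_KEY} inside its container,
    # so the variables in ~/.octavia must be injected, e.g. via --env-file.
    # The image tag and project mount below are assumptions.
    OCTAVIA_ENV_FILE="$HOME/.octavia"
    RUN_CMD="docker run -i --rm -v \$(pwd):/home/octavia-project --network host \
      --env-file $OCTAVIA_ENV_FILE \
      airbyte/octavia-cli:0.40.23 apply"

    echo "$RUN_CMD"
    # Uncomment to run for real:
    # eval "$RUN_CMD"
    ```

    A quick way to confirm the hypothesis is to run the container with `env` instead of `apply` and check whether AIRTABLE_API_KEY appears in the output.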