# ask-community-for-troubleshooting
  • Krzysztof

    11/23/2022, 11:16 AM
    Guys, if I may ask: please release one single stable version. For two months there has been no stable release, and all of the bugs either prevent us from using private connectors or stop them working properly, blocking deployment. Please let's focus on stabilisation instead of new features.
  • Dragan

    11/23/2022, 11:31 AM
    Hi all, has anyone encountered a problem when trying to move a large amount of data from BigQuery?
    Response too large to return. Consider specifying a destination table in your job configuration
    And if you have seen it, have you been able to resolve it? There is no configuration when creating a source for how the data is fetched from it. This seems to be an issue with the old BigQuery API, but is there an option to override it in Airbyte?
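    For reference, a minimal sketch of how this error is avoided with the BigQuery client library directly, outside of Airbyte: large results have to be written to a destination table (allow_large_results additionally applies to legacy SQL). The project, dataset, and table names below are hypothetical.

    # Sketch: working around "Response too large to return" by sending query
    # results to a destination table first. All names are placeholders.
    from google.cloud import bigquery

    client = bigquery.Client()
    job_config = bigquery.QueryJobConfig(
        destination="my-project.my_dataset.query_results",  # hypothetical table
        allow_large_results=True,  # only honored for legacy SQL
        use_legacy_sql=False,
    )
    job = client.query("SELECT * FROM `my-project.my_dataset.big_table`", job_config=job_config)
    job.result()  # rows land in the destination table, not in the response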
  • Rahul Borse

    11/23/2022, 12:27 PM
    Hi Team, with a HubSpot source and an S3 destination I am not able to achieve an incremental sync; no such option is available. Can someone please help me out with how to achieve it?
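    A sketch of where this is configured, assuming the stream exposes an incremental mode at all: sync modes are set per stream on the connection, and the S3 destination supports the "append" destination mode. Via the Airbyte Config API the update looks roughly like the payload below; the URL, IDs, stream name, and cursor field are placeholders.

    # Sketch: switching one stream to incremental/append via the Airbyte
    # Config API. All IDs and names are placeholders.
    import requests

    payload = {
        "connectionId": "<connection-id>",
        "syncCatalog": {
            "streams": [
                {
                    "stream": {
                        "name": "contacts",
                        "supportedSyncModes": ["full_refresh", "incremental"],
                    },
                    "config": {
                        "syncMode": "incremental",
                        "destinationSyncMode": "append",
                        "cursorField": ["updatedAt"],
                        "selected": True,
                    },
                }
            ]
        },
    }
    requests.post("http://localhost:8000/api/v1/connections/update", json=payload).raise_for_status()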
  • Abhijeet Singh

    11/23/2022, 12:32 PM
    Hello everyone, I am having an issue with data transformation on ClickHouse with Airbyte. Can you please help me, or connect me with someone who can? Issue: when Airbyte is migrating my data to ClickHouse, my materialized view is not executing; ideally it should trigger just after data gets inserted into the table. Airbyte version: 0.40.21. Logs for support:
    fad5cb26_bff0_4ae5_9f48_95d68aed9d1f_logs_24401_txt.txt
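    One possible explanation, offered as an assumption rather than a diagnosis from the logs: ClickHouse materialized views fire only on direct INSERTs into the table they are attached to, while Airbyte inserts into its _airbyte_raw_* tables and normalization may rebuild the final table in a way that never triggers the view. A sketch of attaching the view to the raw table instead (database, table, and column names are hypothetical):

    # Sketch: define the materialized view over Airbyte's raw table so it
    # fires on the inserts Airbyte actually performs. Names are placeholders.
    from clickhouse_driver import Client

    client = Client(host="localhost")
    client.execute("""
        CREATE MATERIALIZED VIEW IF NOT EXISTS mydb.orders_mv
        ENGINE = MergeTree ORDER BY tuple()
        AS SELECT
            JSONExtractString(_airbyte_data, 'id')     AS id,
            JSONExtractString(_airbyte_data, 'status') AS status
        FROM mydb._airbyte_raw_orders
    """)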
  • Rahul Borse

    11/23/2022, 1:16 PM
    Hi All, in a destination connector, is there any way I can get the column types of the data received from the source?
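    A sketch of one place this information lives, assuming a Python CDK destination: the configured catalog passed to write() carries each stream's JSON schema, including per-column types.

    # Sketch: reading column types from the configured catalog inside a
    # Python CDK destination. Only the type inspection is fleshed out.
    from typing import Any, Iterable, Mapping

    from airbyte_cdk.destinations import Destination
    from airbyte_cdk.models import (
        AirbyteConnectionStatus,
        AirbyteMessage,
        ConfiguredAirbyteCatalog,
        Status,
    )

    class MyDestination(Destination):
        def check(self, logger, config) -> AirbyteConnectionStatus:
            return AirbyteConnectionStatus(status=Status.SUCCEEDED)  # stub

        def write(
            self,
            config: Mapping[str, Any],
            configured_catalog: ConfiguredAirbyteCatalog,
            input_messages: Iterable[AirbyteMessage],
        ) -> Iterable[AirbyteMessage]:
            for configured_stream in configured_catalog.streams:
                schema = configured_stream.stream.json_schema
                for column, spec in schema.get("properties", {}).items():
                    # e.g. "string", or ["null", "integer"] for nullable columns
                    print(column, spec.get("type"))
            yield from []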
  • Guy Feldman

    11/23/2022, 3:15 PM
    Has anyone else experienced failures to run discover after upgrading Airbyte? I got
    Exception: ('Could not discover schema for source',
    when trying to run discovery in the octavia CLI. Tailing the logs in the temporal worker I see a null pointer exception:
    java.lang.NullPointerException: null
    	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:889) ~[guava-31.1-jre.jar:?]
    	at io.airbyte.workers.internal.VersionedAirbyteStreamFactory.<init>(VersionedAirbyteStreamFactory.java:65) ~[io.airbyte-airbyte-commons-worker-0.40.16.jar:?]
    	at io.airbyte.workers.internal.VersionedAirbyteStreamFactory.<init>(VersionedAirbyteStreamFactory.java:56) ~[io.airbyte-airbyte-commons-worker-0.40.16.jar:?]
    	at io.airbyte.workers.temporal.discover.catalog.DiscoverCatalogActivityImpl.lambda$getWorkerFactory$2(DiscoverCatalogActivityImpl.java:127) ~[io.airbyte-airbyte-workers-0.40.16.jar:?]
    This also happens in the UI. Airbyte version 0.40.16
  • Joviano Cicero Costa Junior

    11/23/2022, 5:23 PM
    Hello! I have a datetime error that I think somebody has already hit: 2022-11-22 21:54:45 ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2021-09-14T11:24:52.000000 Can anyone help me fix this?
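    The validator is rejecting a timestamp that carries no timezone offset. A sketch of one workaround, under the assumption (which has to be verified per source) that the values are actually UTC: make them timezone-aware before normalization sees them.

    # Sketch: turning a naive timestamp into a timezone-aware ISO-8601 string.
    # Assumes the source values are UTC; verify that before applying.
    from datetime import datetime, timezone

    raw = "2021-09-14T11:24:52.000000"
    aware = datetime.fromisoformat(raw).replace(tzinfo=timezone.utc)
    print(aware.isoformat())  # 2021-09-14T11:24:52+00:00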
  • Jerri Comeau (Airbyte)

    11/23/2022, 7:57 PM
    Hey everyone! @Jerri Comeau (Airbyte) here with a quick note about the Community Assistance Team for the rest of this week and next week. As I've mentioned before, Airbyte has a company holiday for the US Thanksgiving, so we will be spending time with our families and friends Thursday and Friday; that means we won't be directly answering questions until Monday at the earliest. In addition, next week is going to be a week for the team to focus on clarification and cleanup as we head into the Xmas/New Year's holiday period. That means we may be slower than usual to respond to questions next week. We'll still be here, available, and trying to make sure we're providing the support needed, though. We also will NOT have Office Hours next week, to allow us some focus time as a team. Rest assured that this is in service of having a productive and responsive December and on into 2023. We wish everyone a quiet, enjoyable, and restful holiday, and we'll see you all on Monday.
  • Will Callaghan

    11/23/2022, 8:36 PM
    Hi everyone, trying to understand the rationale behind only supporting a (small) subset of flags for
    wcmatch.globmatch
    in the S3 connector. In our use-case, it would be very helpful if we could use a negation pattern in addition to an inclusion pattern. This would require adding support for the NEGATE flag. The change would essentially be as follows. What currently exists:
    flags = GLOBSTAR | SPLIT
    Adding a new flag:
    flags = GLOBSTAR | SPLIT | NEGATE
    https://github.com/airbytehq/airbyte/blob/8d4f7db4e717a39acb3266497c2e8d77d71ab2b3[…]/connectors/source-s3/source_s3/source_files_abstract/source.py
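    For illustration, a small example of what the NEGATE flag enables in wcmatch (the pattern strings are hypothetical):

    # Sketch: wcmatch glob matching with SPLIT (multiple '|'-separated
    # patterns) plus NEGATE ('!'-prefixed exclusion patterns).
    from wcmatch import glob

    flags = glob.GLOBSTAR | glob.SPLIT | glob.NEGATE
    pattern = "**/*.csv|!**/tmp/**"  # every CSV, except anything under tmp/

    print(glob.globmatch("data/2022/orders.csv", pattern, flags=flags))  # True
    print(glob.globmatch("data/tmp/scratch.csv", pattern, flags=flags))  # False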
  • Shangwei Wang

    11/23/2022, 11:54 PM
    Some (hopefully) quick questions. Context:
    • airbyte version - 0.40.18
    • postgres source version - 1.0.23
    • redshift destination version - 0.3.51
    1. We use "incremental dedup" for 2 tables (postgres -> redshift). When I was examining the queries in the log, I noticed that one query has updatedAt >= ? while the other uses updatedAt > ?; the difference is the presence of =. Intuitively, I thought all incremental queries should be without the =? What's the deciding factor for adding/omitting the = sign? Update: I just found some docs on this; our updatedAt timestamps for both tables are down to millisecond resolution, so I'm not sure why there is a difference between the queries...
    2. For the query that has the = sign, the connection state has cursor_record_count = 333. a. What does "cursor_record_count" represent in Airbyte? b. Is this normal? Thanks a bunch, and happy holidays!
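    For intuition, a minimal sketch (not Airbyte's actual implementation) of why a resumed incremental read sometimes needs >= together with a count of records already emitted at the cursor value: a non-unique cursor can have ties at the checkpoint, and a strict > would drop late-arriving ties while a bare >= would re-emit the ones already synced.

    # Sketch: resuming an incremental read when several records share the
    # saved cursor value. State is (cursor_value, cursor_record_count).
    def resume(records, cursor_value, cursor_record_count):
        """records: (updated_at, payload) pairs, sorted by updated_at."""
        seen_at_cursor = 0
        for updated_at, payload in records:
            if updated_at < cursor_value:
                continue                   # synced in a previous run
            if updated_at == cursor_value:
                seen_at_cursor += 1
                if seen_at_cursor <= cursor_record_count:
                    continue               # this tie was already emitted
            yield updated_at, payload      # new tie, or strictly newer record

    rows = [("10:00", "a"), ("10:05", "b"), ("10:05", "c"), ("10:07", "d")]
    # Previous run checkpointed at 10:05 after emitting one record at that value.
    print(list(resume(rows, "10:05", 1)))  # [('10:05', 'c'), ('10:07', 'd')]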
  • Jason Maddern

    11/24/2022, 12:27 AM
    I'm looking for some advice around Airbyte transformations and dbt, in particular running a lot of pipelines at scale. We are currently:
    • running dbt + Snowflake + Airbyte
    • about to ingest 1x CRM, 1x JIRA, 1x Custom Product A, 30x Custom Product B, 100x Custom Product C into the data warehouse
    Syncing data for the 1x options is easy, but there are 30x and 100x of Product B and Product C respectively (and these numbers will continue to grow). Both products have data separated at the source via data schemas (they are not multi-tenant). I'm after some advice on setup so that we don't get into a huge mess with this many pipelines. We want to be able to have Snowflake shares & reporting per product (for our clients), so this needs to be data-striped with role-based security, but we also require cross-client aggregated reporting across everything. This drives the need for per-pipeline transformations as well as other transformations. I'm curious to know how people have done this at scale with Airbyte, specifically:
    1. Many pipelines: there doesn't appear to be an easy way in Airbyte to copy sources/destinations/connections with a few edits, particularly programmatically (see the sketch below), which means a huge volume of clicking. Am I missing something obvious here?
    2. Transformations with dbt: I've just started reading the Airbyte transformations documentation, and it could be possible to run the per-pipeline transformations from Airbyte (all connected to the same dbt source). However, it looks like I would need a dbt project per pipeline (for the separate config). Am I missing something obvious here?
    So really my question is about running a lot of pipelines at scale within Airbyte/dbt (and Snowflake) without getting into a huge mess. Thanks in advance 🙂 FYI @Varun Khanna
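    On point 1, a sketch of programmatic creation via the OSS Config API; the base URL, IDs, and connection details are placeholders, and the same pattern applies to destinations and connections:

    # Sketch: stamping out many similar sources via the Airbyte Config API
    # instead of clicking through the UI. All IDs here are placeholders.
    import requests

    API = "http://localhost:8000/api/v1"

    def create_source(name: str, schema: str) -> str:
        resp = requests.post(f"{API}/sources/create", json={
            "workspaceId": "<workspace-id>",
            "sourceDefinitionId": "<postgres-source-definition-id>",
            "name": name,
            "connectionConfiguration": {
                "host": "db.example.com",   # hypothetical connection details
                "port": 5432,
                "database": "product_b",
                "schemas": [schema],        # one tenant schema per source
                "username": "airbyte",
                "password": "<secret>",
            },
        })
        resp.raise_for_status()
        return resp.json()["sourceId"]

    # One source per tenant schema; loop over all 30 (or 100) tenants.
    for tenant in ["client_001", "client_002", "client_003"]:
        create_source(f"product-b-{tenant}", tenant)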
  • Juan Chaves

    11/24/2022, 12:35 AM
    Is this your first time deploying Airbyte: No
    OS Version / Instance: macOS Ventura (M1 Apple chip)
    Memory / Disk: 16Gb / 1Tb SSD
    Deployment: Docker
    Airbyte Version: 0.40.22
    Step: Deploy on local (docker compose up)
    Description: I'm trying to deploy Airbyte open source on my laptop using Docker, but at some point docker compose up started to show this error:
    airbyte-webapp | 2022/11/23 23:46:44 [error] 35#35: *16 connect() failed (111: Connection refused) while connecting to upstream, client: 172.21.0.9, server: localhost, request: "POST /api/v1/workspaces/list HTTP/1.0", upstream: "http://172.21.0.7:8001/api/v1/workspaces/list", host: "localhost", referrer: "http://localhost:8000/"
    I tried all the options mentioned in the related comments, even the ones for the M1 Apple chip. The bootloader container exited with:
    2022-11-23 23:45:51 WARN c.z.h.p.HikariPool(shutdown):218 - Timed-out waiting for add connection executor to shutdown
  • Hemanta

    11/24/2022, 6:13 AM
    Hi, I am facing an issue while trying to access Google Cloud SQL data with an Airbyte source connector. Can anyone help with this?
  • Faris

    11/24/2022, 6:33 AM
    How do I determine a suitable EC2 instance for an Airbyte docker compose deployment? (I have an independent DB for Airbyte; it is an RDS instance.)
  • caesee

    11/24/2022, 6:51 AM
    Hello everyone, I want to establish a connection from a MySQL server with over 2,000 tables, but it shows: Failed to fetch schema. Please try again
  • caesee

    11/24/2022, 6:54 AM
    image.png
  • Rytis Zolubas

    11/24/2022, 7:46 AM
    Hello, I get this error every morning around 22:30 UTC when trying to reach the API on my EC2 cluster:
    b'<html>\r\n<head><title>504 Gateway Time-out</title></head>\r\n<body>\r\n<center><h1>504 Gateway Time-out</h1></center>\r\n<hr><center>nginx/1.23.2</center>\r\n</body>\r\n</html>\r\n'
    I can see that the API call is passed through correctly and runs the job. Later in the day I don't get this error.
  • Rytis Zolubas

    11/24/2022, 8:33 AM
    In v0.40.22 I cannot add a custom connector through the UI. I thought it was fixed in 0.40.21, since it is not mentioned in the Known Issues section. Update: I managed to do it with the API, using the endpoints /v1/sources/list, /v1/source_definitions/list_private, and /v1/source_definitions/grant_definition.
  • Gerard Clos

    11/24/2022, 9:41 AM
    Hey guys 👋 I would like to get early access to the programmatic API for Airbyte Cloud. Who should I talk to?
  • Rahul Borse

    11/24/2022, 10:20 AM
    Hi All, I am trying to figure out how to get the job ID in the S3 destination once the connection syncs. Based on this job ID I need to perform a few operations for our company's needs. Can someone please help me work out how to get the current job ID in the S3 destination? I am still trying to figure out how jobs run and how they are mapped to the S3 destination.
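    One avenue worth checking, based on what the Docker worker logs show rather than any documented contract: the worker passes WORKER_JOB_ID into the connector container as an environment variable, so a destination can read it at runtime (treat it as an implementation detail that may change between versions).

    # Sketch: reading the current job id inside a destination container via
    # the WORKER_JOB_ID env var the Docker worker sets (undocumented detail).
    import os

    job_id = os.environ.get("WORKER_JOB_ID", "unknown")
    print(f"running as part of Airbyte job {job_id}")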
  • Andreas

    11/24/2022, 10:41 AM
    Hi! I'm trying to add a custom connector, but it fails silently. I open the "New connector" modal, fill out the fields and hit "Add". After that, I can see the loading icon. Then the modal closes without any info and I end up on the settings/account page. Airbyte version 0.40.22. There doesn't seem to be anything helpful in the logs though:
    airbyte-worker                      | 2022-11-24 10:33:25 INFO i.a.c.i.LineGobbler(voidCall):114 - Checking if redacted.pkg.dev/redacted//npg-airbyte-docker-connectors/plenigo-source:0.2 exists...
    airbyte-worker                      | 2022-11-24 10:33:25 INFO i.a.c.i.LineGobbler(voidCall):114 - redacted.pkg.dev/redacted//npg-airbyte-docker-connectors/plenigo-source:0.2 was found locally.
    airbyte-worker                      | 2022-11-24 10:33:25 INFO i.a.w.p.DockerProcessFactory(create):119 - Creating docker container = plenigo-source-spec-ce979a0a-0c83-4913-a759-f6fd056a415d-0-hpdba with resources io.airbyte.config.ResourceRequirements@1b780d57[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=]
    airbyte-worker                      | 2022-11-24 10:33:25 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/ce979a0a-0c83-4913-a759-f6fd056a415d/0 --log-driver none --name plenigo-source-spec-ce979a0a-0c83-4913-a759-f6fd056a415d-0-hpdba --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=true -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e WORKER_CONNECTOR_IMAGE=redacted.pkg.dev/redacted//npg-airbyte-docker-connectors/plenigo-source:0.2 -e WORKER_JOB_ATTEMPT=0 -e AIRBYTE_VERSION=0.40.22 -e WORKER_JOB_ID=ce979a0a-0c83-4913-a759-f6fd056a415d redacted.pkg.dev/redacted//npg-airbyte-docker-connectors
  • Avi Sagal

    11/24/2022, 10:42 AM
    Hi, I'm using the REST API to create and maintain connections (and loving it). I want to know if there's a way to tell, using the API, whether a connection is in the middle of a sync or has finished one. I can see that in the UI there's an indication for this. Thanks!
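    A sketch of one way to check this with the Config API's job listing (base URL and connection id are placeholders): the most recent sync job's status is "running" while a sync is in flight.

    # Sketch: checking whether a connection is mid-sync via the jobs API.
    import requests

    resp = requests.post("http://localhost:8000/api/v1/jobs/list", json={
        "configTypes": ["sync"],
        "configId": "<connection-id>",  # the connection's id
    })
    resp.raise_for_status()
    jobs = resp.json()["jobs"]          # newest first
    latest = jobs[0]["job"] if jobs else None
    print("syncing" if latest and latest["status"] == "running" else "idle")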
  • laila ribke

    11/24/2022, 12:58 PM
    Hi, I need to set up a custom source (for WISE, and another one for a bank). Has anyone done this who could maybe guide me through it? Also, I need to add fields to the Google Ads and Bing Ads sources, because I don't receive basic fields like keyword in Bing Ads and device in Google Ads. I read it can be done by cloning the source. I'd also appreciate it if someone could help me with that.
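    For the custom source part, a minimal sketch of a Python CDK HTTP source; the endpoint, stream name, and auth are hypothetical placeholders, not the real WISE API:

    # Sketch: skeleton of a custom HTTP source built on the Airbyte CDK.
    # The API base URL and stream are placeholders.
    from typing import Any, Iterable, List, Mapping, Optional, Tuple

    import requests
    from airbyte_cdk.sources import AbstractSource
    from airbyte_cdk.sources.streams import Stream
    from airbyte_cdk.sources.streams.http import HttpStream

    class Transfers(HttpStream):
        url_base = "https://api.example-bank.com/v1/"  # hypothetical
        primary_key = "id"

        def path(self, **kwargs) -> str:
            return "transfers"

        def next_page_token(self, response: requests.Response) -> Optional[Mapping[str, Any]]:
            return None  # single page, for brevity

        def parse_response(self, response: requests.Response, **kwargs) -> Iterable[Mapping]:
            yield from response.json()

    class SourceBank(AbstractSource):
        def check_connection(self, logger, config) -> Tuple[bool, Any]:
            return True, None  # a real connector should probe a cheap endpoint

        def streams(self, config: Mapping[str, Any]) -> List[Stream]:
            return [Transfers(authenticator=None)]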
  • Mikhail Masyagin

    11/24/2022, 1:14 PM
    Hi, guys! I'm getting these errors when trying to sync Zendesk Support to S3 Parquet. It looks pretty similar to https://github.com/airbytehq/airbyte/issues/11154
    logs-5622.txt
  • thomas trividic

    11/24/2022, 1:53 PM
    Hello here
  • thomas trividic

    11/24/2022, 1:53 PM
    I have an issue with the Salesforce connector.
  • Gustavo Maia

    11/24/2022, 2:17 PM
    Hello everyone, I ran into this pretty weird error where I can't build my connector. I am executing
    docker build <connector-path> -t <tag>
    and at the
    pip install .
    step it takes a long time building dependencies, with messages like "Installing build dependencies: still running...". Eventually it gets through those dependencies, but then it tries to build psutil and fails with
    psutil/_psutil_linux.c:19:10: fatal error: linux/version.h: No such file or directory
    Has anyone ever seen something like this?
  • laila ribke

    11/24/2022, 2:31 PM
    Hi, another question: can I create a custom source with Airbyte open source?
  • Alexander Schmidt

    11/24/2022, 2:36 PM
    Heyho, quick question: when deploying over Docker Swarm (Portainer), do I need a running nginx container in my swarm, or should the config in the webapp container be enough?
  • Oriol

    11/24/2022, 2:53 PM
    Hi! Today I opened this topic about some problems I've been facing lately with syncs of high-volume tables from Oracle: https://discuss.airbyte.io/t/poor-sync-performance-for-high-volume-loads-100-gb/3284 Has anyone faced similar performance issues? 🤔