# ask-community-for-troubleshooting

    Johnny P

    04/17/2023, 12:19 AM
    Hi all, trying to get going on my first connection but I keep getting this error after about 5 minutes: "Non-json response". The source is SQL Server; it seems to get stuck pulling back the list of primary keys. The source and destination connected fine.

    Akash

    04/17/2023, 2:58 AM
    How can I make an Airbyte connector do the following: allow users to load data and execute transformations in my data warehouse? Or is this not what an Airbyte connector is meant for?

    Abhishek Kale

    04/17/2023, 3:25 AM
    I want a list of all database sources.

    Abhishek Kale

    04/17/2023, 3:28 AM
    I want a list of all API sources

    UUBOY scy

    04/17/2023, 4:20 AM
    I have SQL Server 2017 with CDC enabled. If a row is modified multiple times within a transaction, the entries in the CDC log have the same __$start_lsn and the same __$command_id. How does Airbyte tell which log entry is the latest operation? I found that incorrect data is kept in the target destination while the correct commit was made inactive.
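Within one transaction, the CDC change-table columns __$start_lsn, __$seqval and __$command_id together order the individual operations, so the highest (__$start_lsn, __$seqval, __$command_id) tuple per row should be the latest one. A small Python sketch of that selection logic (the sample entries are made up, with the CDC column prefixes dropped):

```python
from operator import itemgetter

# Made-up CDC entries for one transaction; "start_lsn"/"seqval"/"command_id"
# mirror __$start_lsn/__$seqval/__$command_id without the prefixes.
log = [
    {"id": 1, "start_lsn": 10, "seqval": 1, "command_id": 1, "value": "first update"},
    {"id": 1, "start_lsn": 10, "seqval": 2, "command_id": 1, "value": "second update"},
    {"id": 2, "start_lsn": 10, "seqval": 3, "command_id": 1, "value": "other row"},
]

def latest_per_row(entries):
    """Keep only the most recent operation per primary key."""
    latest = {}
    # Sort ascending, so later operations overwrite earlier ones.
    for entry in sorted(entries, key=itemgetter("start_lsn", "seqval", "command_id")):
        latest[entry["id"]] = entry
    return latest

print(latest_per_row(log)[1]["value"])  # second update
```

If a connector only keys on __$start_lsn and __$command_id, two updates to the same row in one transaction would look identical, which matches the symptom described above.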

    UUBOY scy

    04/17/2023, 5:21 AM
    Is there a way to see the logic the SQL Server CDC incremental update uses to select the latest data?

    Jaroslav Loutocký

    04/17/2023, 6:34 AM
    Hi, I am using Airbyte to copy data from a PostgreSQL database (production environment) to a PostgreSQL development environment. However, I need to anonymize the data, so during the transformation I need to call a custom NodeJS application to perform anonymization over the schema/table/column. For example, it will replace the email with a generated one. I have the NodeJS application ready; it takes the SQL dump of the production database as input and outputs anonymized SQL. Is it possible to use this in Airbyte, and what are the ways to anonymize or modify the data during transformation? Thank you
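Airbyte's built-in normalization cannot shell out to an external NodeJS app mid-sync; people usually either run the anonymizer as a separate step after the sync lands, or move the logic into a custom dbt transformation. As a standalone Python sketch of the deterministic email-replacement idea (not an Airbyte hook; names are illustrative):

```python
import hashlib
import re

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def anonymize_email(email: str) -> str:
    # Deterministic: the same real address always maps to the same fake one,
    # which keeps joins between tables intact after anonymization.
    digest = hashlib.sha256(email.encode()).hexdigest()[:12]
    return f"user_{digest}@example.com"

def anonymize_text(text: str) -> str:
    # Replace every email-shaped substring with its generated stand-in.
    return EMAIL_RE.sub(lambda m: anonymize_email(m.group()), text)

print(anonymize_text("contact alice@corp.com for access"))
```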

    Bishan Singh

    04/17/2023, 6:47 AM
    Hi, I want to ask: my source is MySQL and my destination is BigQuery. Which sync mode is best for my task? I want to filter data from the source based on a condition and then push that filtered data into BigQuery after making some changes to it.

    Bishan Singh

    04/17/2023, 7:29 AM
    I want to do ETL, so which sync mode is best for it?

    Gabriele Lomuscio

    04/17/2023, 7:56 AM
    Hi, how do you create a custom connector programmatically? I need to initialize the Airbyte configuration for a series of servers on AWS. While I can do it for sources, destinations, and connections, it does not seem possible to create a custom connector programmatically (octavia does not allow creating custom connectors). Do you have any idea?
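For reference, the OSS configuration API has historically exposed a create-custom-source-definition endpoint. The path and payload shape below follow that API but may differ by Airbyte version, so treat this as a sketch and check your instance's API docs; all values shown are placeholders:

```python
import json

def custom_source_definition_payload(workspace_id, name, docker_repository, docker_image_tag):
    """Body for POST /api/v1/source_definitions/create_custom (shape may vary by version)."""
    return {
        "workspaceId": workspace_id,
        "sourceDefinition": {
            "name": name,
            "dockerRepository": docker_repository,
            "dockerImageTag": docker_image_tag,
            "documentationUrl": "https://example.com/docs",  # placeholder
        },
    }

payload = custom_source_definition_payload(
    "00000000-0000-0000-0000-000000000000",  # hypothetical workspace id
    "source-custom", "myorg/source-custom", "0.1.0",
)
print(json.dumps(payload, indent=2))
# POST this JSON to your instance's /api/v1/source_definitions/create_custom
```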

    mohd shaikh

    04/17/2023, 8:19 AM
    Hi everyone, I have set up a connection between GA4 and Google BigQuery using Airbyte Cloud. I am interested in getting 'Session Manual ad content' data. Can anyone please guide me?

    Mark Nuttall-Smith

    04/17/2023, 9:35 AM
    Hey, I've been paying a bit of attention to the memory usage on our Airbyte server and noticed the airbyte-cron container's usage was surprisingly high:
    Copy code
    CONTAINER ID   NAME                                       CPU %     MEM USAGE / LIMIT     MEM %     NET I/O           BLOCK I/O         PIDS
    ab3a5b99c835   normalization-normalize-12215-0-mficw      0.00%     140.4MiB / 15.44GiB   0.89%     0B / 0B           63.4MB / 24.6kB   19
    5b82e0e49c9b   source-postgres-read-12214-0-tpikv         11.46%    3.004GiB / 15.44GiB   19.45%    0B / 0B           406kB / 0B        27
    933713849e8e   destination-postgres-write-12214-0-bffyw   45.89%    604.4MiB / 15.44GiB   3.82%     0B / 0B           815kB / 0B        27
    13b8fc7fcd53   airbyte-proxy                              0.09%     3.766MiB / 15.44GiB   0.02%     6.92MB / 6.94MB   8.55MB / 4.1kB    3
    ea1753186749   airbyte-webapp                             0.07%     6.062MiB / 15.44GiB   0.04%     5.18MB / 6.89MB   9.57MB / 8.19kB   5
    a69204bca14e   airbyte-cron                               0.08%     1.782GiB / 15.44GiB   11.54%    137MB / 596kB     812MB / 0B        47
    ad679b9c3dea   airbyte-connector-builder-server           0.24%     41.76MiB / 15.44GiB   0.26%     1.73kB / 0B       30.3MB / 0B       1
    cab52e80e142   airbyte-worker                             31.13%    698.1MiB / 15.44GiB   4.41%     5.92MB / 1.22MB   110MB / 8.19kB    189
    f222ef078bb1   airbyte-server                             8.75%     951.7MiB / 15.44GiB   6.02%     120MB / 6.14MB    70.8MB / 0B       57
    c2dda46d7d72   airbyte-temporal                           2.67%     118MiB / 15.44GiB     0.75%     254MB / 163MB     134MB / 8.19kB    12
    I find it particularly surprising, given all our connections are set to manual replication frequency! Is it normal?

    Monika Bednarz

    04/17/2023, 9:56 AM
    Hi Team, we have Airbyte set up internally. We haven’t changed much in the setup lately, but surprisingly, all of our connectors show up as “(custom)”. That is not the case; we’re mostly using the official connectors. Could you please let me know why this might be happening? It is a bit problematic, since there’s also no clear alerting that a connector has a new version. I will attach a screenshot below.

    Guillaume Jaouen

    04/17/2023, 10:26 AM
    Hi everyone! I’m trying to run some tests on Airbyte Cloud with the Hubspot connector & Snowflake, and I have two questions:
    1. While reading the Airbyte documentation on the Hubspot connector, I did some initial tests with the data export. The Incremental | Deduped + History mode does not work and I get this message:
    Failure Origin: normalization, Message: Normalization failed during the dbt run. This may indicate a problem with the data itself.
    Do you know why?
    2. Looking at several tables in Snowflake, I can’t find all the past events. For example, the title of a deal has been changed several times, but I don’t get the history of these changes. After the first synchronization, I only get the last status of the deal. How can we get the history of all past events?
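On the second question: with Incremental | Deduped + History, Airbyte's normalization writes the change history to a separate `<stream>_scd` table, while the final table keeps only the latest version of each record. A sketch of querying that history (table and column names here are illustrative of the SCD output and may vary by normalization version):

```sql
-- History of a single deal, oldest first.
SELECT title,
       _airbyte_start_at,
       _airbyte_end_at,
       _airbyte_active_row
FROM deals_scd
WHERE id = 123
ORDER BY _airbyte_start_at;
```

Note the history only covers changes that happened between syncs; changes that occurred before the first sync, or between two syncs, cannot be reconstructed from the source's current state.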

    hedgar

    04/17/2023, 11:51 AM
    Hello, can I set up a source that's just a REST API?

    Nick Joanis

    04/17/2023, 2:06 PM
    Hello everyone! 👋 I am currently experiencing some unusual behavior during normalization of a synchronization between two Postgres databases, using the latest source and destination versions. All the tables I am syncing are Incremental | Deduped + History. Even though the first sync is successful, it appears that during subsequent syncs, normalization is not done incrementally. Instead, all tables are recreated from scratch, including _stg, _scd, and the final table. When looking at the dbt logs directly in the UI, I can see that the models are noted as “incremental”. This does not seem to reflect the expected behaviour mentioned in https://docs.airbyte.com/understanding-airbyte/basic-normalization/#incremental-runs. Do you have any tips on what might be causing this issue? It is having a significant impact on our RDS instance and is preventing us from performing frequent loads. Thank you in advance!

    Isabella Dintinjana

    04/17/2023, 2:32 PM
    Hi Team, I have a successful File source connection to GCS; however, the data is in JSON format, and I would like to unnest it into CSV with the reader_options parameter. Following the documentation https://docs.airbyte.com/integrations/sources/file/ and https://pandas.pydata.org/pandas-docs/stable/user_guide/io.html#io-json-reader, the reader_options I set for my data were not recognized. The format in question is: id1, id2, "{""var1"":"",""var2"":"",...}" Thanks in advance for ideas!
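Since the File source's reader_options are passed to the pandas CSV reader, they won't unnest a JSON string stored inside a CSV column; that usually has to happen after the load. A stdlib-only Python sketch of the unnesting, using made-up column names shaped like the example:

```python
import csv
import io
import json

# One row shaped like the example: two scalar ids plus a JSON object
# serialized into a quoted CSV field ("payload" is a made-up column name).
raw = '''id1,id2,payload
1,2,"{""var1"":""a"",""var2"":""b""}"
'''

rows = []
for row in csv.DictReader(io.StringIO(raw)):
    nested = json.loads(row.pop("payload"))  # parse the JSON-in-a-string field
    rows.append({**row, **nested})           # flatten into top-level columns

print(rows[0])
```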

    Aric Lasry

    04/17/2023, 3:19 PM
    Quick question: Is there a reason why Airbyte chose not to use the Slack Event API (slack “webhooks”) for the Slack connector?

    Yepher

    04/17/2023, 3:48 PM
    Is there any chance this GitHub issue will be serviced in the near future? https://github.com/airbytehq/airbyte/issues/8167#issue-1059752873

    Christopher Wu

    04/17/2023, 5:21 PM
    Does Airbyte Enterprise support using custom connectors?

    Sandhya Manimaran

    04/17/2023, 5:56 PM
    Hi Team . I need help in fixing the version on airbyte

    Ben Greene

    04/17/2023, 6:18 PM
    Hi Team, currently the list endpoints for getting sources/connections do not support pagination. This means that if we have many sources/destinations/connections, we have to retrieve the entire payload in one shot. This is currently causing performance issues, not on the data transfer side, but on the configuration side: our DB is timing out as we have tens of thousands of sources/connections. Is there any plan to incorporate pagination at this time? Is there any way for me to highlight the importance of this feature outside of this forum? Thank you!

    Yepher

    04/17/2023, 6:25 PM
    In EKS, I am trying to set up Airbyte without putting secrets like the Postgres password in the
    values.yaml
    file, so I can commit it to GitHub. This is what I am trying, and it is not working:
    Copy code
    kubectl create secret generic our-secrets --from-literal=airbyte-externaldatabase-password=airbytepass
    for our
    values.yaml
    we use:
    Copy code
    externalDatabase:
  host: my-postgres.rds.amazonaws.com
      user: airbyte
      # password: "airbytepass"
      existingSecret: "our-secrets"
      existingSecretPasswordKey: "airbyte-externaldatabase-password"
      database: db-airbyte
      port: 5432
    If we put
    password: "airbytepass"
    in the
    values.yaml
    file, it works fine. But if we do this in the
    values.yaml
    Copy code
    existingSecret: "our-secrets"
    existingSecretPasswordKey: "airbyte-externaldatabase-password"
    We get this error:
    Copy code
    Warning  Failed 9s (x4 over 40s) kubelet Error: couldn't find key DATABASE_PASSWORD in Secret default/airbyte-airbyte-secrets
    I am sure I am doing something silly, but I can't quite sort it out. To install I am doing this
    helm upgrade -i airbyte airbyte/airbyte -f values.yaml
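For what it's worth, the kubelet error says the chart looked for the key DATABASE_PASSWORD in the secret airbyte-airbyte-secrets, not in our-secrets, which suggests the existingSecret settings are not reaching every component on this chart version. One workaround (an assumption, not verified against your chart version) is to create the secret under the key name the error asks for and point existingSecretPasswordKey at it:

```yaml
# Sketch: give the secret the key the error message names.
apiVersion: v1
kind: Secret
metadata:
  name: our-secrets
type: Opaque
stringData:
  DATABASE_PASSWORD: airbytepass
```

with `existingSecretPasswordKey: "DATABASE_PASSWORD"` in values.yaml. If the chart still reads airbyte-airbyte-secrets regardless, the global database settings in your chart version's templates are worth checking.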

    Luke Rodgers

    04/17/2023, 6:30 PM
    I’m trying to use Airbyte open source edition in non-CDC incremental, append sync mode, from postgres to bigquery, on a large-ish (~260M rows, 420G) table, but Airbyte is unable to complete the initial extraction. It seems to correctly determine the table size, and then issues a query seemingly related to this adaptive fetch size feature (https://github.com/airbytehq/airbyte/pull/12400) to determine how many rows should be fetched per iteration. The query is
    SELECT "id","some","other","columns" FROM "public"."versions" ORDER BY "id" ASC
    (actual columns redacted, other than
    id
    ). However this query does not use a
    LIMIT
    clause, and so it attempts a full table scan. This hangs for several hours, then fails with this error
    ERROR i.a.i.b.AirbyteExceptionHandler(uncaughtException):26 Something went wrong in the connector. See the logs for more details. java.lang.RuntimeException: java.lang.RuntimeException: org.postgresql.util.PSQLException: ERROR: could not write to file "base/pgsql_tmp/pgsql_tmp236578.139": No space left on device
    . The EC2 instance has 1TB of disk. I could increase disk size and keep trying, but this seems like a bug with the code that determines fetch size. How have others handled initial sync for large tables in non-CDC mode?

    [DEPRECATED] Marcos Marx

    04/17/2023, 6:46 PM
    Hello, today the Airbyte Support Team is going to be online to help you troubleshoot issues or discuss features in #C045VK5AF54 in a Zoom meeting! See you there: https://airbytehq.slack.com/archives/C045VK5AF54/p1681757136329139

    Michael Adaniken

    04/17/2023, 9:36 PM
    Hi, I'm using Airbyte open source edition and I keep getting errors when trying to input a webhook URL for Slack notifications.

    Taylor Hubbard

    04/17/2023, 9:38 PM
    I'm trying to configure a low-code connector on Airbyte OSS. For incremental loads, there's the inject_into option to add the query string parameter name. The API I am loading from supports the following query string syntax:
    /v1/user?modified_at={'gte': '2023-01-01'}
    Does the low-code builder support this scenario?
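Possibly. One way to get a wrapped value like that out of the low-code CDK is to skip inject_into for the cursor and instead interpolate the slice boundary into request_parameters yourself. The sketch below uses names from the low-code manifest schema, but the interpolation variable (stream_interval vs stream_slice) varies by CDK version, so treat it as an assumption to verify in the builder:

```yaml
incremental_sync:
  type: DatetimeBasedCursor
  cursor_field: modified_at
  datetime_format: "%Y-%m-%d"
  start_datetime: "2023-01-01"
requester:
  request_parameters:
    # Wrap the slice start in the API's {'gte': ...} syntax via interpolation.
    modified_at: "{'gte': '{{ stream_interval.start_time }}'}"
```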