# ask-community-for-troubleshooting

    Vinicius de Souza

    10/07/2022, 12:44 PM
    Hello. A question about the Netsuite source: the docs say "You need to select manually each record on selection lists and give Full level access on next tabs: (Permissions, Reports, Lists, Setup, Custom Records). You strongly need to be careful and attentive on this point". Do I have to select all the options in the selection lists, or just the records I want?

    Abhishek Sachdeva

    10/07/2022, 12:48 PM
    Re-attempts add duplicate raw data
    I am using Airbyte to fetch data from HubSpot; it should fetch 3 rows of raw data. But due to some scope issues, the workflow completes only partially and re-attempts. After 3 re-attempts (each partially successful), I have 9 rows of raw data (3 duplicates of each row, one per attempt). Is there any way to resolve this? It works fine if I remove the specific data the token doesn't have permission for: the workflow completes and I get 3 rows.
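    Until the scope issue itself is addressed, one stopgap (my suggestion, not a built-in Airbyte feature) is to de-duplicate the raw table downstream by a business key, keeping the newest emitted copy. A pandas sketch; the DSN, the raw table name, and the assumption that each record carries an "id" are all placeholders:
    Copy code
    import json

    import pandas as pd
    from sqlalchemy import create_engine

    engine = create_engine("postgresql://user:pass@host/db")  # placeholder DSN

    # Columns mirror Airbyte's raw tables: _airbyte_ab_id, _airbyte_emitted_at, _airbyte_data.
    raw = pd.read_sql("SELECT * FROM _airbyte_raw_contacts", con=engine)  # hypothetical raw table

    # _airbyte_data may come back as a JSON string or an already-parsed dict.
    raw["record_id"] = raw["_airbyte_data"].map(
        lambda v: (json.loads(v) if isinstance(v, str) else v)["id"]  # assumes records carry an "id"
    )
    deduped = (
        raw.sort_values("_airbyte_emitted_at")
           .drop_duplicates(subset="record_id", keep="last")  # keep the newest copy per record
    )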

    Huib

    10/07/2022, 1:17 PM
    Is there a way to purge old logs from MinIO on k8s?
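    If the bundled MinIO version supports lifecycle (ILM) rules, one way to purge old logs is a bucket expiration rule. A minimal sketch with the minio Python SDK; the endpoint, credentials, bucket name, and log prefix are placeholders for your deployment's values:
    Copy code
    from minio import Minio
    from minio.commonconfig import ENABLED, Filter
    from minio.lifecycleconfig import Expiration, LifecycleConfig, Rule

    # Placeholder endpoint/credentials; point these at the in-cluster MinIO service.
    client = Minio("airbyte-minio-svc:9000", access_key="minio", secret_key="minio123", secure=False)

    # Expire anything under the (assumed) log prefix after 30 days.
    config = LifecycleConfig([
        Rule(
            ENABLED,
            rule_filter=Filter(prefix="job-logging/"),
            rule_id="expire-old-logs",
            expiration=Expiration(days=30),
        )
    ])
    client.set_bucket_lifecycle("airbyte-dev-logs", config)  # assumed bucket name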

    Rocky Appiah

    10/07/2022, 1:21 PM
    I’m getting the error below when running an incremental sync from a Postgres database using CDC via the wal2json plugin. Any insight?
    Copy code
    Stack Trace: org.postgresql.util.PSQLException: ERROR: out of memory
      Detail: Cannot enlarge string buffer containing 1073741293 bytes by 659 more bytes.
      Where: slot "airbyte_slot", output plugin "wal2json", in the change callback, associated LSN 1/92C0F500
            at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2675)
    Setup: Airbyte 0.40.10, Postgres source connector 1.0.11, Snowflake destination connector 0.4.38
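    The error means a single wal2json change set exceeded Postgres's ~1 GB string buffer, often because the slot has accumulated a large backlog or one transaction touched very many rows. A small diagnostic sketch to check how far the slot is lagging; the connection string is a placeholder:
    Copy code
    import psycopg2

    # Placeholder DSN; needs a role allowed to read pg_replication_slots.
    conn = psycopg2.connect("postgresql://airbyte:secret@db-host:5432/mydb")
    with conn, conn.cursor() as cur:
        cur.execute("""
            SELECT slot_name,
                   pg_size_pretty(pg_wal_lsn_diff(pg_current_wal_lsn(), restart_lsn)) AS lag
            FROM pg_replication_slots
            WHERE slot_name = 'airbyte_slot';
        """)
        for slot, lag in cur.fetchall():
            print(slot, lag)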

    lucien

    10/07/2022, 1:29 PM
    Hey, when trying to upgrade (version 0.4.13) I got the following error from the worker pod:
    Copy code
    Message: No bean of type [io.airbyte.config.persistence.split_secrets.SecretPersistence] exists for the given qualifier: @Named('secretPersistence')
    Is it specific to me, or is it a known issue?

    Nicola Corda

    10/07/2022, 1:42 PM
    Hey, a quick question about https://docs.airbyte.com/deploying-airbyte/on-kubernetes-via-helm/#database-external-secrets: it expects a k8s secret with the DB password, but what about the user?
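    For reference, a k8s secret can carry both keys; whether the chart reads the user from the secret or from plain values depends on the chart version, so treat the key names below as assumptions. A sketch with the official kubernetes Python client:
    Copy code
    from kubernetes import client, config

    config.load_kube_config()  # or load_incluster_config() inside the cluster
    v1 = client.CoreV1Api()

    secret = client.V1Secret(
        metadata=client.V1ObjectMeta(name="airbyte-db-secrets"),  # hypothetical secret name
        string_data={
            "DATABASE_USER": "airbyte",        # assumed key name
            "DATABASE_PASSWORD": "change-me",  # assumed key name
        },
    )
    v1.create_namespaced_secret(namespace="airbyte", body=secret)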

    Gabriel Rotermund

    10/07/2022, 2:07 PM
    Hi everyone! I am trying to create a connection using Airtable as a source. I am having an issue with the pre-built API: I don't receive all the columns in the table. How can I access the code and update it? I am not an expert with data, so any help would be amazing.

    Omar Abdullahi Ahmed

    10/07/2022, 2:42 PM
    Hello! I'm new to Airbyte, so I might be asking a question that has been answered before; bear with me. I am getting this error:
    Copy code
    2022-10-07 13:34:02 - Additional Failure Information: ('42000', "[42000] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]'COLUMNSTORE' is not supported in this service tier of the database. See Books Online for more details on feature support in different service tiers of Windows Azure SQL Database. (40536) (SQLExecDirectW)")
    And it says the sync failed! Now the weird part: data has been loaded into my data warehouse, but in JSON format. How can I solve this? How can I ingest the data with the correct data types?

    Jerri Comeau (Airbyte)

    10/07/2022, 4:56 PM
    https://join.slack.com/share/enQtNDE4NTI4OTE2ODIzMC1kZjBlYTljMDZhZDdhOTc3N2E1YTIxYTIyZjQzZmRiNzc3NmZmMzNlZWQ0Zjk1NDkxOTEzODljMzNhM2Y3MTAx

    Dusty Shapiro

    10/07/2022, 5:54 PM
    Helm/K8s question: What is the difference between the externalDatabase and the database objects in the values?

    Brijesh Singh

    10/07/2022, 9:11 PM
    I am trying to customize an existing source connector. My request has a header and/or a JSON body. How do I create a nested field in the JSON body? json_payload["limit"] = self.items_per_page_limit works fine at the parent level, but json_payload["query/filter/date_time_filter/updated_at/start_at"] = self.start_date throws an error, and so does the next line, json_payload["query/sort/sort_order"] = "ASC".
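    A slash-separated string is just one literal dict key in Python, not a nested path, so those assignments won't build the structure the API expects. A small sketch of building the nesting explicitly; the helper and the sample values are mine, not from the connector:
    Copy code
    def set_nested(payload: dict, path: str, value) -> None:
        """Walk a slash-separated path, creating intermediate dicts, then set the leaf."""
        *parents, leaf = path.split("/")
        node = payload
        for key in parents:
            node = node.setdefault(key, {})
        node[leaf] = value

    json_payload = {"limit": 100}
    set_nested(json_payload, "query/filter/date_time_filter/updated_at/start_at", "2022-10-01T00:00:00Z")
    set_nested(json_payload, "query/sort/sort_order", "ASC")
    # json_payload is now:
    # {"limit": 100,
    #  "query": {"filter": {"date_time_filter": {"updated_at": {"start_at": "2022-10-01T00:00:00Z"}}},
    #            "sort": {"sort_order": "ASC"}}}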

    Nate

    10/07/2022, 9:44 PM
    hey all! does anyone know an approximate timeline for API access to Airbyte Cloud? (i.e. triggering syncs + other workspace actions programmatically)

    Sentinel AI

    10/08/2022, 12:11 AM
    Hi all, is there already an implementation of the Plaid Investment connector out there? Otherwise, I'll look to extend the current one to add investment flows.

    Liyin Qiu

    10/08/2022, 12:21 AM
    Hi community, we have a high-throughput Kafka topic. Is there any guidance on how to correctly scale the Kafka source connectors? Or can we have multiple connections reading the same topic with the same consumer group ID? Thanks
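    On the consumer-group part of the question: in Kafka, consumers that share a group ID split a topic's partitions among themselves, so parallelism is capped by the partition count, and two connections using the same group ID would divide the data rather than each seeing all of it. A standalone sketch with kafka-python illustrating the mechanics; broker, topic, and group names are placeholders:
    Copy code
    from kafka import KafkaConsumer  # pip install kafka-python

    # Consumers that share a group_id are each assigned a disjoint subset of the
    # topic's partitions, so running N copies of this script splits the load N ways
    # (up to the partition count).
    consumer = KafkaConsumer(
        "my-high-throughput-topic",        # placeholder topic
        bootstrap_servers="broker:9092",   # placeholder broker
        group_id="shared-consumer-group",  # same id across instances
        auto_offset_reset="earliest",
    )
    for message in consumer:
        print(message.partition, message.offset)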

    Bagus Zulfikar

    10/08/2022, 3:41 AM
    Hello, I'm trying to set up Airbyte on Google Compute Engine but getting an error when trying to connect to Airbyte using compute ssh with a tunnel to port 8000. Does anyone know how to solve this? Thanks. I'm using the guide from https://docs.airbyte.com/deploying-airbyte/on-gcp-compute-engine

    Robert Put

    10/08/2022, 3:28 PM
    Any extra configs needed to get Airbyte to work on ARM? Deploying on EC2 on AWS.

    claudio viera

    10/08/2022, 6:37 PM
    Hello guys, I am trying to configure a connection with PostgreSQL and get this error:

    claudio viera

    10/08/2022, 6:37 PM
    Copy code
    Caused by: org.jooq.exception.DataAccessException: SQL [insert into "public"."actor_catalog_fetch_event" ("id", "actor_id", "actor_catalog_id", "config_hash", "actor_version", "modified_at", "created_at") values (cast(? as uuid), cast(? as uuid), cast(? as uuid), ?, ?, cast(? as timestamp with time zone), cast(? as timestamp with time zone))];
    ERROR: null value in column "actor_id" of relation "actor_catalog_fetch_event" violates not-null constraint
      Detail: Failing row contains (f8d15d30-8df9-42ae-a424-e2063a341212, 29b27343-639b-48f7-90bf-07f83593f804, null, null, null, 2022-10-08 18:27:29.247629+00, 2022-10-08 18:27:29.247629+00).

    claudio viera

    10/08/2022, 6:39 PM
    I get this same error with all sources.

    claudio viera

    10/08/2022, 6:39 PM
    I have Airbyte deployed with Helm.

    claudio viera

    10/08/2022, 6:42 PM
    I believe Airbyte is trying to write to the table "actor_catalog_fetch_event" with a null value where it expects not null.

    NITESH KUMAWAT

    10/09/2022, 3:39 AM
    Hello guys, I want to build a custom source connector based on an existing connector for a specific edge case. Can anyone tell me what steps I should follow? Thanks in advance.

    Jamiu Afolabi

    10/09/2022, 6:22 AM
    Hello guys, it's so good to be part of the community. I am a newbie with Airbyte and I really need the community's help. I am trying to replicate data from Postgres to BigQuery. However, tables with "timestamp without time zone" columns are not replicated. I am not allowed to change the source table. Is there any workaround?
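    One possible workaround, since the source table can't be changed, is to expose a view on the source side that casts the column and sync the view instead; this is a suggestion, not something from the docs. A sketch where the table, column, and DSN are placeholders:
    Copy code
    import psycopg2

    conn = psycopg2.connect("postgresql://airbyte:secret@source-host:5432/mydb")  # placeholder DSN
    with conn, conn.cursor() as cur:
        # Cast "timestamp without time zone" to timestamptz so the destination
        # receives a zoned type; sync this view in place of the table.
        cur.execute("""
            CREATE OR REPLACE VIEW public.events_utc AS
            SELECT id,
                   created_at AT TIME ZONE 'UTC' AS created_at_utc
            FROM public.events;
        """)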

    Sergio Spieler

    10/09/2022, 9:56 PM
    Good evening everyone. I am having problems connecting my MongoDB Atlas database as a source in Airbyte. Currently running open source on Docker locally on my machine (tried many times and always get the same error: Failed to fetch mongodb schema). I am able to connect properly, but receive this error when building the pipeline. The whole database is big (around 60 GB). Any ideas of what can be done?

    Sergio Spieler

    10/09/2022, 10:30 PM
    Copy code
    2022-10-09 16:00:08 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.host: is missing but it is required, $.port: is missing but it is required, $.instance: does not have a value in the enumeration [standalone]
    2022-10-09 16:00:08 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.instance: does not have a value in the enumeration [replica], $.server_addresses: is missing but it is required

    Eduard Gruy

    10/09/2022, 8:58 PM
    Hey, I am creating a custom connector and there is an issue I am hitting. This specific source has a very large number of source objects (~100k), so the AirbyteCatalog gets quite large. And please mind that these are standard tables, not something like various API stock tickers. The catalog JSON is ~28 MB (and this is actually smaller than the final size will be, since I have only added mock schemas for now, not the actual ones).

    If I send the complete catalog via discover, it seems to wreck the frontend. First the browser fires DiscoverSchema twice (I suppose this is because of React StrictMode? But I am not running Airbyte in dev mode...). It takes a few minutes for the worker to get the result, but it receives it, and the browser seems to get a result for both requests. Finally the browser sends destination_definition_specification and receives a response, but nothing happens in the frontend. It is stuck on "we are fetching the schema of your data source." The memory footprint of the Firefox tab keeps rising (6 GB) until, I suppose, the OS kills Docker to get more memory. On Chrome the screen just goes blank. No errors in the browser console.

    Running the latest version of Airbyte in Docker on Ubuntu WSL. Do you have any advice for me on what to try? Thanks
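    A 28 MB catalog is likely well beyond what the UI was designed to render. One workaround (an assumption on my part, not an official fix) is a config option that narrows which tables the connector discovers, so the catalog stays small. A rough sketch against the Python CDK; the table_allowlist option and the _build_all_streams helper are hypothetical:
    Copy code
    from typing import Any, List, Mapping, Tuple

    from airbyte_cdk.sources import AbstractSource
    from airbyte_cdk.sources.streams import Stream


    class MySource(AbstractSource):  # stand-in for the custom connector
        def check_connection(self, logger, config) -> Tuple[bool, Any]:
            return True, None  # real connectivity check elided

        def streams(self, config: Mapping[str, Any]) -> List[Stream]:
            all_streams = self._build_all_streams(config)  # hypothetical helper returning ~100k streams
            allowlist = set(config.get("table_allowlist", []))  # hypothetical spec option
            if allowlist:
                # Only surface allow-listed tables so discover emits a small catalog.
                all_streams = [s for s in all_streams if s.name in allowlist]
            return all_streams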

    srikanth

    10/10/2022, 7:15 AM
    Hi, I am trying to set up an Airbyte data flow from Oracle to Snowflake. During the initial step of setting up Oracle as the source, I came across the following errors. Not sure what the error says; need help on this.

    srikanth

    10/10/2022, 7:16 AM
    Copy code
    2022-10-10 06:52:18 INFO i.a.w.t.TemporalAttemptExecution(get):110 - Cloud storage job log path: /workspace/460c647d-e336-44f8-9bdf-3c13d00ccc0c/0/logs.log
    2022-10-10 06:52:18 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: dev-e038ea8-cloud

    srikanth

    10/10/2022, 7:16 AM
    These are the error messages that I came across.

    srikanth

    10/10/2022, 7:16 AM
    Can anyone help?