# ask-community-for-troubleshooting
  • laila ribke

    01/20/2023, 7:52 AM
    Hi all, I was asked to create a report that shows the daily ELT uploads: which ones succeeded or failed, and the count of new rows. Does anyone have an example I could take ideas from?
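    For context, one common approach is to query Airbyte's internal jobs database. A minimal sketch, assuming the default Postgres jobs/attempts schema (table, column, and JSON field names may differ across Airbyte versions, so inspect a row of attempts.output first):

    import psycopg2  # assumes the internal Airbyte metadata database is the default Postgres one

    # Daily summary of sync jobs: status counts and total records synced.
    # Note: a job can have several attempts, so this sums over all attempts.
    QUERY = """
    SELECT date(j.created_at) AS sync_date,
           j.status,
           count(DISTINCT j.id) AS jobs,
           sum((a."output" -> 'sync' -> 'standardSyncSummary' ->> 'recordsSynced')::bigint) AS rows_synced
    FROM jobs j
    JOIN attempts a ON a.job_id = j.id
    WHERE j.config_type = 'sync'
    GROUP BY 1, 2
    ORDER BY 1 DESC;
    """

    def daily_elt_report(dsn: str):
        with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
            cur.execute(QUERY)
            return cur.fetchall()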
  • Lihan Li

    01/20/2023, 9:52 AM
    Created an MR for a Helm chart bug - https://github.com/airbytehq/airbyte/issues/21638
  • Benjamin Danies

    01/20/2023, 10:11 AM
    Hi! 🙂 Is it possible not to hardcode airbyte-admin here, but to take the value from the configuration? I don't have the same name for the serviceAccountName attribute… 😕 so it's impossible to launch a sync: https://github.com/airbytehq/airbyte/blob/2a3817748aca2e0db30f7ba42ba9bf4e9f893105[…]ava/io/airbyte/workers/process/AsyncOrchestratorPodProcess.java
  • Mahdi Rabbani

    01/20/2023, 10:21 AM
    Hello everyone, I am attempting to set up a connection between Shopify and BigQuery, but am encountering issues. When enabling the connection, the sync takes approximately 15 hours and ends in failure with the message: "_Failure Origin: replication, Message: Something went wrong during replication._" In BigQuery, I am not seeing any data except for a few _airbyte_tmp_ tables. Has anyone else encountered this problem? I have included a text file with the logs and have specified that the replication is set to incremental for all tables (deduped and history) except for the shop table, which is set to full refresh.
    573f791c_d98b_4af0_99e7_137b288e6454_logs_794_txt.txt
  • Mahdi Dibaiee

    01/20/2023, 12:50 PM
    Hello 👋 Not sure which channel is the best place to ask this, but I have a pull request for a connector and I need a review on it: https://github.com/airbytehq/airbyte/pull/19478 On another note, is it possible for us to run CI tests on a pull request ourselves, instead of waiting for Airbyte members, to speed up the troubleshooting / development loop?
  • Abdul Hameed

    01/20/2023, 1:34 PM
    Hi team, I am trying to load data from a Postgres source to a Snowflake destination. I am loading one table with 185,940,496 records, but after running the pipeline I only get 5,734,317 rows in the destination. Right now I am getting an error and the sync terminates. How can I load the data in chunks successfully?
  • Akilesh V

    01/20/2023, 3:41 PM
    Hi all, after migrating to v0.40.28 the workspace is not listing connections; it throws a java.lang.NumberFormatException: For input string: "1000000000000000000000000000000000" error.
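    For reference, that input string is far larger than what a 64-bit integer can hold, which is why the numeric parse throws; a quick sanity check:

    # The failing value overflows a signed 64-bit long (max 9,223,372,036,854,775,807),
    # so any parse of it into a long/int must raise NumberFormatException on the Java side.
    value = int("1000000000000000000000000000000000")
    print(value > 2**63 - 1)  # True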
  • Jhon Edison Bambague Calderon

    01/20/2023, 4:22 PM
    Hi team, I have a question: does the Postgres connector support schema evolution?
  • Andre Santos

    01/20/2023, 4:24 PM
    Hi folks, I'm trying to create a new Jira source in my Airbyte environment.
    Is this your first time deploying Airbyte: No
    Memory / Disk: 8GB / 16GB
    Deployment: EKS
    Airbyte Version: 0.40.23
    Source name/version: Jira
    Destination name/version: Redshift
    Step: New Source
    Description: Trying to set up a new source connection to Jira and getting the following error: The connection tests failed. HTTPError('401 Client Error: Unauthorized for url: https://xxxxxxx.atlassian.net/rest/api/3/resolution') When trying the request with curl, I'm getting a message saying the endpoint is deprecated...
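    For context, a minimal way to reproduce the connector's credential check outside Airbyte and separate an auth problem from a connector problem (the domain below is hypothetical; Jira Cloud uses basic auth with the account email plus an API token):

    import requests

    def check_jira(domain: str, email: str, api_token: str):
        # /rest/api/3/myself is a simple authenticated Jira Cloud endpoint;
        # a 401 here points at the credentials rather than at Airbyte.
        url = f"https://{domain}/rest/api/3/myself"
        resp = requests.get(url, auth=(email, api_token), timeout=30)
        resp.raise_for_status()
        return resp.json()

    # check_jira("your-site.atlassian.net", "you@example.com", "<api token>")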
  • Ben Ext

    01/20/2023, 4:40 PM
    Hi, I am testing the Amazon Ads connector and I have some issues. Sometimes it works, sometimes it doesn't, and I get the following error. Has anyone already had this issue?
  • Chloe Connor

    01/20/2023, 5:33 PM
    Hi 👋, I'm having some issues with the Netsuite connector. I can set up the source ok, but when I set up a connection it runs but with error messages and then creates empty tables. Has anyone else seen this or have ideas on how to resolve? thanks!
    DATE FORMAT exception. Cannot read using known formats ['%m/%d/%Y', '%Y-%m-%d']
    Search error occurred: Parse of date/time "2022-12-31" failed with date format "dd/MM/yy" in time zone America/Los_Angeles
  • Anitha Selvaraj

    01/20/2023, 6:17 PM
    Hello team, could someone kindly assist me with connecting Facebook Marketing as a source connector in the open-source edition? I'm getting the following error when I try to connect the source.
  • Slackbot

    01/20/2023, 6:24 PM
    message has been deleted
  • Narendra Yadav

    01/20/2023, 8:20 PM
    I am trying to add a SQL Server connection and getting the error below. Did anyone face this issue?
  • Narendra Yadav

    01/20/2023, 8:20 PM
    The connection tests failed. Message: HikariPool-1 - Connection is not available, request timed out after 60002ms.
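    For context, this Hikari timeout usually means the database host/port cannot be reached from where Airbyte runs (firewall, security group, or wrong port). A quick hypothetical reachability check from that machine or pod:

    import socket

    def can_reach(host: str, port: int = 1433, timeout: float = 5.0) -> bool:
        # Succeeds only if a TCP connection to the SQL Server port can be opened.
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    print(can_reach("my-sqlserver.example.com"))  # hypothetical hostname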
  • Jason Carter

    01/20/2023, 8:38 PM
    Hello... I'm setting up Airbyte for the first time, using K8s with kustomize. I'm also using the RDS and S3 customizations. Currently my airbyte-worker pod is crashing (CrashLoopBackOff), which I'm pretty sure is because I left out the AWS access/secret info. My question though is: should I still be able to create a source or destination while the worker pod is crashing? I'm able to connect to the webapp, navigate to destinations, and attempt to connect to Snowflake, but when I run Set up destination the progress bar starts, fills up (with a message saying it'll take a bit longer), and that's it. Nothing seems to break or show related errors in the logs EXCEPT for airbyte-server showing:
    WARN i.a.s.s.AirbyteGithubStore(getLatestDestinations):58 - Unable to retrieve latest Destination list from Github. Using the list bundled with Airbyte. This warning is expected if this Airbyte cluster does not have internet access.
  • Karen (Airbyte)

    01/20/2023, 9:59 PM
    Hey community 👋 Is there anyone here who has experience with SQL Server connections? That would be great: I have a community member who is looking for some support. 🙂
  • aidan

    01/21/2023, 1:18 AM
    Has anyone set up Airbyte behind an outbound proxy server? I can see that a few people have been able to achieve this, but I cannot find any documentation. Any help or documentation would be greatly appreciated. @airbyte
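    For context, a minimal sketch for checking that outbound HTTPS works through the proxy from the host running Airbyte. The proxy URL is hypothetical; Python-based connectors generally honor the standard HTTP_PROXY / HTTPS_PROXY / NO_PROXY environment variables, while the Java services typically need JVM proxy flags as well:

    import requests

    # Example only: replace with your proxy endpoint. api.github.com is the kind of
    # external host airbyte-server calls when it refreshes the bundled connector list.
    proxies = {"http": "http://proxy.internal:3128", "https": "http://proxy.internal:3128"}
    resp = requests.get("https://api.github.com", proxies=proxies, timeout=30)
    print(resp.status_code)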
  • Somasekhar Reddy Palli

    01/21/2023, 4:58 AM
    Hello all... I'm trying to set up Databricks Lakehouse on Azure as the destination, but the connection is failing even though all the details provided are valid. Here is the error message:
    Could not connect to the staging persistence with the provided configuration. Status code 401, "<?xml version="1.0" encoding="utf-8"?><Error><Code>NoAuthenticationInformation</Code><Message>Server failed to authenticate the request. Please refer to the information in the www-authenticate header. RequestId:42b90c05-901e-0008-6d4a-2de6c7000000 Time2023 01 21T0346:35.3952225Z</Message></Error>"
    Also, I think Data Source should be optional in implementations where the storage is handled automatically. Any help or documentation would be greatly appreciated.
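    For context, the 401 appears to come from the Azure Blob Storage staging layer rather than from Databricks itself, so it can help to verify the storage account name/key independently. A minimal hypothetical sketch (requires the azure-storage-blob package; all names are placeholders):

    from azure.storage.blob import BlobServiceClient

    def check_staging(account_name: str, account_key: str, container: str):
        client = BlobServiceClient(
            account_url=f"https://{account_name}.blob.core.windows.net",
            credential=account_key,
        )
        # Listing blobs fails with an authentication error if the key is wrong.
        return [b.name for b in client.get_container_client(container).list_blobs()]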
  • Johnson Zhao

    01/21/2023, 5:59 AM
    Hello, I just began using Airbyte for a POC project, and I have 2 questions:
  • Johnson Zhao

    01/21/2023, 6:05 AM
    1. Can I give a different name to the target table?
    2. The target table is created by Airbyte, but the column types do not match the original table. Why is that? I am using SQL Server. Besides the airbyte columns, the other column types do not match and are in a random order, the original PK is removed, etc. Is this by design? I.e., my source table is:
    CREATE TABLE [dbo].[t_stg](
        [ETL_CREATE_DATE] [datetime] NULL,
        [ETL_MODIFIED_DATE] [datetime] NULL,
        [ETL_YEAR_MONTH] [int] NULL,
        [ETL_PROCESSED_FLAG] [varchar](1) NULL,
        [SEQUENCE_NO] [int] NULL,
        [YrPrd] [int] NOT NULL,
        [item_no] [varchar](30) NOT NULL,
        [location_id] [int] NOT NULL,
        [rev] [float] NULL,
        [cost] [float] NULL,
        PRIMARY KEY CLUSTERED ([YrPrd] ASC, [item_no] ASC, [location_id] ASC)
        WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, OPTIMIZE_FOR_SEQUENTIAL_KEY = OFF) ON [PRIMARY]
    ) ON [PRIMARY]
    GO
    and the autogenerated target table is:
    CREATE TABLE [dbo].[t_stg](
        [_airbyte_unique_key] [varchar](32) NULL,
        [rev] [float] NULL,
        [cost] [float] NULL,
        [yrprd] [bigint] NULL,
        [item_no] [nvarchar](max) NULL,
        [sequence_no] [bigint] NULL,
        [location_id] [bigint] NULL,
        [etl_year_month] [bigint] NULL,
        [etl_create_date] [nvarchar](max) NULL,
        [etl_modified_date] [nvarchar](max) NULL,
        [etl_processed_flag] [nvarchar](max) NULL,
        [_airbyte_ab_id] [varchar](64) NOT NULL,
        [_airbyte_emitted_at] [datetimeoffset](7) NULL,
        [_airbyte_normalized_at] [datetime2](7) NOT NULL,
        [_airbyte_t_stg_hashid] [varchar](32) NULL
    ) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
    GO
    /****** Object: Index [dbo_t_stg_cci] Script Date: 1/21/2023 120250 AM ******/
    CREATE CLUSTERED COLUMNSTORE INDEX [dbo_t_stg_cci] ON [dbo].[t_stg]
    WITH (DROP_EXISTING = OFF, COMPRESSION_DELAY = 0, DATA_COMPRESSION = COLUMNSTORE) ON [PRIMARY]
    GO
  • Prashant Kumar

    01/22/2023, 6:53 AM
    Hey, is there any documentation for setting up a dbt transformation on EC2 with Airbyte? I have a problem transforming JSON to VARIANT (source: MySQL, destination: Snowflake).
  • Noel Jacob

    01/22/2023, 8:23 AM
    Can anyone add me to the contributor channel? I've already filled out the contributor Google form, and I've opened issues I would like to build.
  • Harel Oshri

    01/22/2023, 9:53 AM
    Hello everyone! I have an S3 bucket with different folders that represent different tables (different schemas). For example: bucket: events, folder_1: click_event, folder_2: view_event. Just to make sure I understand correctly, is this what “path patterns” solve? For example, can these tables: events/click_event/…/…/…, events/view_event/…/…/…, events/next_event/…/…/… be divided by the path pattern “*/“? I want to create a source for this S3 bucket (events) and have one connection for all the different folders within it. Is that possible? Or do I have to create as many sources as I have subfolders and change the prefix each time? Thank you!
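    For context, the S3 source matches object keys against glob-style path patterns, so a pattern can be sanity-checked locally before saving the source. A small sketch with hypothetical keys (assumes wcmatch-style GLOBSTAR semantics):

    from wcmatch import glob  # pip install wcmatch

    keys = [
        "click_event/2023/01/20/part-0.parquet",   # hypothetical object keys,
        "view_event/2023/01/20/part-0.parquet",    # relative to the bucket/prefix
        "next_event/2023/01/20/part-0.parquet",
    ]

    for pattern in ("**", "click_event/**"):
        matched = [k for k in keys if glob.globmatch(k, pattern, flags=glob.GLOBSTAR)]
        print(pattern, "->", matched)
    # "**" matches every folder; "click_event/**" restricts the stream to one folder.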
  • Boopathy Raja

    01/23/2023, 6:42 AM
    For my use case, I want to use this feature: https://docs.airbyte.com/integrations/destinations/mongodb/#connection-via-ssh-tunnel But in my Airbyte UI, I'm not able to see the SSH tunnel option. Where can I input the SSH key authentication inputs? My Airbyte installation: version 0.40.28 on K8s.
  • Sharath Chandra

    01/23/2023, 8:45 AM
    Does anyone have a resolution for this error?
    ERROR creating table model <schemaname>_airbyte.pg_stat_statements.................................................. [ERROR in 0.49s]
    Database Error in model pg_stat_statements (models/generated/airbyte_tables/<schema_name>_airbyte/pg_stat_statements.sql)
      syntax error at or near "queryid"
      LINE 22: ..."queryid" != '' then _airbyte_data."queryid" end as queryid,
  • Sebastian Brickel

    01/23/2023, 8:47 AM
    Hi, after updating to v0.40.28 I run into an issue when trying to docker compose up:
    airbyte-bootloader exited with code 255
    I have updated Docker and use Compose v2 (the one without the “-”). Any idea how I could fix that?
  • Arnaud Selva

    01/23/2023, 9:20 AM
    Hello everyone 👋, happy new year! I have an open PR - to add a new data source - that is still waiting for review/feedback. It was created in December. Would somebody be able to take a look at it by any chance? I can share a sandbox API token with you to test it. Thanks in advance!
  • Omer Kolodny

    01/23/2023, 9:57 AM
    Hi everyone, I have an issue connecting Databricks as a target. For some reason I get the following error: 'Could not connect to the staging persistence with the provided configuration. You must agree to the Databricks JDBC Terms & Conditions to use this connector.' The following article says that I need to 'Agree to the Databricks JDBC Driver Terms & Conditions: *Set to True*', but I have no idea where to set this. Can anyone assist? EDIT: There is a toggle above that I must have missed. It works now, thanks anyway :)
  • Kasamba Lumwagi

    01/23/2023, 11:46 AM
    Do you have internships or mentoring initiatives for junior data engineers?