# connector-development
  • Imon

    04/07/2022, 11:48 AM
    Hi everyone! I'm currently working on an HTTP connector for a weather API and I'm having one issue. Currently, I can define a configuration with one city and a date, and my connector then successfully delivers the weather data for that city and day. What I want is to use an array of cities in my configuration instead of a single city. My idea was to change `request_params` so that instead of returning one dictionary, I return a list of dictionaries, where each entry corresponds to the same request but for a different city. However, using a list of dictionaries instead of a single dictionary only results in a `Too many values to unpack` error. Does anyone have an idea of how I could solve this problem?
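    One way this is commonly handled in the Python CDK is with stream slices rather than a list-returning `request_params`. A minimal sketch under that assumption follows; the class name, URL, and field names are hypothetical and not from the original message:
    ```python
    # Sketch: emit one stream slice per city; the CDK then issues one request per slice,
    # while request_params keeps returning a single dict.
    from airbyte_cdk.sources.streams.http import HttpStream


    class CityWeather(HttpStream):
        url_base = "https://example-weather-api.com/"  # hypothetical API
        primary_key = None

        def __init__(self, cities, date, **kwargs):
            super().__init__(**kwargs)
            self.cities = cities  # e.g. ["Berlin", "Paris"] from the config array
            self.date = date

        def path(self, **kwargs) -> str:
            return "weather"

        def next_page_token(self, response):
            return None  # no pagination in this sketch

        def stream_slices(self, **kwargs):
            # One slice per city.
            return [{"city": city} for city in self.cities]

        def request_params(self, stream_state, stream_slice=None, next_page_token=None):
            # Still a single dict; the city comes from the current slice.
            return {"city": stream_slice["city"], "date": self.date}

        def parse_response(self, response, **kwargs):
            yield response.json()
    ```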
  • Mahdi Dibaiee

    04/07/2022, 2:26 PM
    Hello 👋 I've hit a bug with `source-facebook-marketing` where it assumes `user_tos_accepted` to be a number, but it's actually an object, so read operations fail. I've opened an issue and a PR where I reuse the schema definition of `tos_accepted` for `user_tos_accepted`, which makes both nullable objects. Can I get a review from one of the maintainers, please? It's a small PR, but it blocks our use of this connector. Thank you! Issue: 11800, PR: 11801
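    For context, a "nullable object" in a stream's JSON schema typically looks like the snippet below, shown as a Python dict purely for illustration; it is not the connector's actual schema file:
    ```python
    # Illustrative only: a field declared as a nullable object instead of a number.
    USER_TOS_ACCEPTED_SCHEMA = {
        "type": ["null", "object"],
        "additionalProperties": True,
    }
    ```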
  • Jove Zhong

    04/07/2022, 5:27 PM
    How do I find why/where `discover` hangs? Hello, I am building the source and destination connectors for our own product. I had this working before, but recently the schema discover failed. I'm not sure whether it's due to an Airbyte change or a change in our product. When I ran the UI test (read from our product and send to CSV), the schema discover progress never ends. Then I ran
    `docker run --rm -i -v ~/Dev:/secrets timeplus/airbyte-source-timeplus:dev discover --config /secrets/source_config.json`
    It shows tables/columns but also hangs again, with many INFO messages like
    `2022-04-07 171926 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):137 - Table bookings column cid (type string[0]) -> Json type io.airbyte.protocol.models.JsonSchemaType@492fc69e`
    It's a Java-based connector. Can I run this via IntelliJ and see which method it is stuck in? Thank you!
  • David Anderson

    04/07/2022, 8:32 PM
    It's a fairly specialized use case, but is anyone using Airbyte with alloy? I took a spin through the community connectors and didn't see anything, but figured I'd ask anyway!
  • Gujjalapati Raju

    04/08/2022, 9:28 AM
    Hi, I want to add pandas to manipulate the JSON data in the stream's response method. Locally it works fine, but when I deploy the image (after building), it throws this error:
    2022-04-08 085710 destination > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    2022-04-08 085711 destination > SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
    2022-04-08 085711 source > No module named 'pandas'
    Traceback (most recent call last):
      File "/airbyte/integration_code/main.py", line 9, in <module>
        from source_service_now_connector import SourceServiceNowConnector
      File "/airbyte/integration_code/source_service_now_connector/__init__.py", line 6, in <module>
        from .source import SourceServiceNowConnector
      File "/airbyte/integration_code/source_service_now_connector/source.py", line 15, in <module>
        import pandas as pd
    ModuleNotFoundError: No module named 'pandas'
    2022-04-08 085712 INFO i.a.w.DefaultReplicationWorker(lambda$getReplicationRunnable$5):305 - Total records read: 0 (0 bytes)
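    The usual cause of a ModuleNotFoundError that only appears in the image is that pandas is installed in the local environment but not declared as a dependency of the connector. A sketch of one possible fix in the connector's setup.py (package name taken from the traceback; version pins omitted):
    ```python
    # setup.py -- dependencies listed here get installed inside the Docker image.
    from setuptools import find_packages, setup

    setup(
        name="source_service_now_connector",
        packages=find_packages(),
        install_requires=[
            "airbyte-cdk",
            "pandas",  # the module reported missing in the image
        ],
    )
    ```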
  • Johan Strand

    04/08/2022, 1:47 PM
    Is it possible to give any info on how far along the SFTP connector is? I'm looking to grab data from an SFTP server into Cloud Storage or BigQuery: https://airbyte.com/connectors/sftpftp
  • Alpana Shukla

    04/08/2022, 1:59 PM
    Hi, I am creating a custom connector. I get the following error, `Failed to fetch schema. Please try again`, while connecting to an MSSQL Server destination. Thanks in advance.
  • Sean Zinsmeister

    04/08/2022, 9:18 PM
    ThoughtSpot is potentially looking to contract a few freelance data engineers for specific Airbyte connectors to work with our use case templates we call SpotApps. If this sounds interesting to you, please shoot me a DM.
  • Achmad Syarif Hidayatullah

    04/09/2022, 6:16 AM
    Is there a way to trace this error?
    `JSON schema validation failed. errors: $: null found, object expected`
    The whole Zendesk pipeline failed because of this, and the logs aren't clear enough to tell which stream and column are causing it. I want to make a fix, but this leaves me clueless. Also, is there any way to suppress this error, for example by making certain object-type columns optional? Second, does JSON schema validation run in the source or in the destination? This problem pretty much annoys me (it has Airbyte retrying to this day), and it started happening after switching to version 2 of the zendesk-support connector.
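    One way to trace this kind of failure offline is to validate a sample of records against the stream's schema with the jsonschema library; the error path then points at the offending field. A rough sketch (the file names are hypothetical):
    ```python
    # Validate dumped records against a stream schema to locate the failing field.
    import json

    from jsonschema import Draft7Validator

    with open("schemas/tickets.json") as f:      # hypothetical stream schema file
        validator = Draft7Validator(json.load(f))

    with open("records.jsonl") as f:             # hypothetical dump of raw records
        for line_no, line in enumerate(f, 1):
            for error in validator.iter_errors(json.loads(line)):
                print(line_no, list(error.absolute_path), error.message)
    ```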
  • AndrewD

    04/09/2022, 5:09 PM
    Is it possible for a custom HTTP stream source connector to do a POST request, including sending a JSON body? Any examples of connectors that are doing this?
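    The Python CDK's HttpStream supports this through its http_method attribute and the request_body_json hook. A minimal sketch (URL, path, and payload are made up for illustration):
    ```python
    # Sketch of an HttpStream issuing POST requests with a JSON body.
    from airbyte_cdk.sources.streams.http import HttpStream


    class SearchResults(HttpStream):
        url_base = "https://api.example.com/"  # hypothetical API
        primary_key = "id"
        http_method = "POST"

        def path(self, **kwargs) -> str:
            return "search"

        def next_page_token(self, response):
            return None  # no pagination in this sketch

        def request_body_json(self, stream_state, stream_slice=None, next_page_token=None):
            # Sent as the JSON body of the POST request.
            return {"query": "orders", "page_size": 100}

        def parse_response(self, response, **kwargs):
            yield from response.json().get("results", [])
    ```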
  • Natan Yellin

    04/10/2022, 2:38 PM
    Hello all, I wrote a custom HTTP source for orbit.love, but I am having trouble adding it as a connector in the UI.
  • Enrico Tuvera Jr.

    04/11/2022, 6:32 AM
    Airbyte's authenticators are basically the `requests` package's session authenticators, aren't they? The `requests_native` package name, and the fact that they're passed into `self._session.auth`, both kind of hint at that. I just want to make sure there's nothing special about them before trying to roll my own.
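    For what it's worth, rolling your own along those lines usually just means subclassing requests.auth.AuthBase and assigning it to the session; a minimal sketch (the header name is invented):
    ```python
    # A plain requests authenticator: requests calls __call__ on every prepared request.
    import requests


    class MyCustomAuth(requests.auth.AuthBase):
        def __init__(self, token: str):
            self._token = token

        def __call__(self, request: requests.PreparedRequest) -> requests.PreparedRequest:
            request.headers["X-Api-Token"] = self._token  # invented header name
            return request
    ```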
  • Oliver Franz

    04/11/2022, 9:34 AM
    are there any updates on this PR?
  • Ehmad Zubair

    04/11/2022, 11:57 AM
    Hi team, any idea how I'd go about creating a source that supports webhooks?
  • Louis-Marius Gendreau

    04/11/2022, 2:56 PM
    Hi folks, is there a way to configure Airbyte so it does not unnest repeated records? I want to do this with the Shopify-to-BigQuery connectors and handle it myself. Thanks! 😀
  • Tien Nguyen

    04/11/2022, 8:40 PM
    Hi team, I am building a custom Snowflake destination with Python, but I still face an error. I have finished building a custom source and successfully output a message. When I pass that message to the destination and print out the result, it flags me with "'NoneType' object is not iterable", even though it is JSON and I was able to parse through all the keys. Here are the output and the destination file. Can someone help me with debugging this?
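    For reference, a Python destination's write() is expected to iterate the incoming messages and yield state messages back; returning None from it is one way to end up with a "'NoneType' object is not iterable" error downstream. A rough sketch against the CDK's Destination interface (the Snowflake handling and the other required methods are omitted):
    ```python
    # Sketch of a destination write() that consumes records and yields state back.
    from airbyte_cdk.destinations import Destination
    from airbyte_cdk.models import Type


    class DestinationCustomSnowflake(Destination):
        def write(self, config, configured_catalog, input_messages):
            for message in input_messages:
                if message.type == Type.RECORD:
                    row = message.record.data  # the record payload as a dict
                    # ... write `row` to Snowflake here ...
                elif message.type == Type.STATE:
                    # Yield state only after the records before it have been persisted.
                    yield message
    ```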
  • Shubham Pinjwani

    04/11/2022, 10:57 PM
    Hello team, I am currently trying to make a custom Postgres connector using the Java JDBC CDK. I am trying to inherit from the Postgres connector and I get the following error while running the build:
    `Task :airbyte-integrations:connectors:source-custom-postgres:compileJava FAILED`
    `/Users/shubham.pinjwani/airbyte/airbyte-integrations/connectors/source-custom-postgres/src/main/java/io/airbyte/integrations/source/custom_postgres/CustomPostgresSource.java:25: error: cannot find symbol — public class CustomPostgresSource extends PostgresSource { — symbol: class PostgresSource — 1 error`
  • gunu

    04/11/2022, 11:12 PM
    Hi team, can someone help me with my understanding of incremental configuration in Airbyte connectors? An incremental stream obtains the last updated date for a record and sets this as the `cursor_field`, so that on the next sync the API will pull records after this date. However, some connectors I've looked at all appear to pull records on or after this date, meaning the last record synced is always pulled again, causing duplicate records to appear. This is the case for `zendesk-support`, `zendesk-talk`, `survey-monkey`, and I imagine others too. Given no other users are complaining about this, I'm wondering if I'm misunderstanding some fundamental aspect of incremental streams. I've posted additional details in the discussion forum if you're interested, and also attached an image showing how the last record keeps getting synced again.
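    A sketch of the pattern in question, assuming an HttpStream-based connector (stream, URL, and field names are illustrative): when the API filter is inclusive (>=), the record that set the cursor comes back on the next sync unless the stream filters it out with a strictly-greater check, or the connection relies on incremental dedup at the destination.
    ```python
    # Illustrative incremental stream: inclusive start_time filter plus a strict filter.
    from airbyte_cdk.sources.streams.http import HttpStream


    class Tickets(HttpStream):
        url_base = "https://example.zendesk.com/api/v2/"  # illustrative
        primary_key = "id"
        cursor_field = "updated_at"

        def path(self, **kwargs) -> str:
            return "tickets"

        def next_page_token(self, response):
            return None  # pagination omitted

        def request_params(self, stream_state, stream_slice=None, next_page_token=None):
            # Many APIs only expose an inclusive (>=) start_time filter.
            start = (stream_state or {}).get("updated_at", "2022-01-01T00:00:00Z")
            return {"start_time": start}

        def parse_response(self, response, stream_state=None, **kwargs):
            cursor = (stream_state or {}).get("updated_at")
            for record in response.json().get("tickets", []):
                # Strictly-greater comparison avoids re-emitting the record at the cursor.
                if cursor is None or record["updated_at"] > cursor:
                    yield record
    ```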
  • Shubham Pinjwani

    04/12/2022, 12:31 AM
    Hi team. I have created a custom Postgres source using the CDK by inheriting from the existing Postgres source. I am getting the following error when I try to add the source in the UI. Any help is appreciated.
    `Internal Server Error: Get Spec job failed.`
  • Ryan Werth

    04/12/2022, 1:51 AM
    I have a custom connector that I believe is working:
    `python main.py read --config sample_files/config.json --catalog sample_files/configured_catalog.json`
    returns the desired result. I need some help adding the connector to the Airbyte UI.
  • Aakash Kumar

    04/12/2022, 7:13 AM
    Hi, while running an Amazon Seller Partner data sync, for the table `GET_FLAT_FILE_ALL_ORDERS_DATA_BY_LAST_UPDATE_GENERAL_STG`, we got the error
    `_csv.Error: field larger than field limit (131072)`
    We are using Ubuntu, where Airbyte is deployed. We are also attaching logs for your reference.
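    The 131072 in that message is the Python csv module's default per-field size limit; parsers that hit it on large report rows typically need to raise it. A sketch of the usual workaround (where exactly to apply it inside the connector is a separate question):
    ```python
    # Raise the csv module's per-field size limit before parsing the report.
    import csv
    import sys

    csv.field_size_limit(sys.maxsize)
    ```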
  • Augustin Lafanechere (Airbyte)

    04/12/2022, 5:31 PM
    Hi there! 🚨 We will archive this channel tomorrow. It will still be available for reading past messages, but you won't be able to post here anymore. We are continuing our community support on our Discourse forum, in the Connector Development category. See you there 👋🏻
  • Jonathan Alvarado

    04/12/2022, 5:59 PM
    `./gradlew :airbyte-integrations:connectors:source-postgres:integrationTest`
  • Tien Nguyen

    04/12/2022, 8:53 PM
    Hi team, I have this pyarrow error when building a Docker image. Can someone please help me out? Thank you very much.
  • Daniel Vengoechea

    04/13/2022, 1:29 PM
    Hi, why is the Pipedrive connector not available on Airbyte Cloud?
  • Daniel Vengoechea

    04/13/2022, 1:30 PM
    I got the trial and was not able to use it because of this missing connector
  • Augustin Lafanechere (Airbyte)

    04/13/2022, 1:35 PM
    Hello there, I'm archiving this channel now. We are continuing our community support on our Discourse forum, in the Connector Development category. See you there 👋🏻
  • user

    04/13/2022, 1:42 PM
    Everyone from Airbyte Team was removed from this channel because @[DEPRECATED] Augustin Lafanechere archived the channel.
  • [DEPRECATED] Augustin Lafanechere

    04/13/2022, 1:44 PM
    archived the channel