# connector-development
  • user

    09/20/2021, 4:04 AM
    I’m getting the following error from `python main.py check --config secrets/config.json`. In my spec.json I have

    "page-size": {
        "type": "number",
        "description": "Results per page"
    },

    Changing it to `string` works, but I would expect `number` to work.
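One way to check this locally: JSON `"number"` accepts both ints and floats, so a value like `25` in the config should satisfy a `"type": "number"` property. A minimal stdlib-only sketch (not Airbyte's actual validator, just the same type check) to sanity-check a config value against a schema type:

```python
import json

def matches_type(value, schema_type):
    """Mimic the JSON-schema "type" keyword for a single scalar value."""
    mapping = {
        "number": (int, float),  # JSON "number" covers ints and floats
        "integer": int,
        "string": str,
        "boolean": bool,
    }
    # bool is a subclass of int in Python, so exclude it for numeric types
    if schema_type in ("number", "integer") and isinstance(value, bool):
        return False
    return isinstance(value, mapping[schema_type])

config = json.loads('{"page-size": 25}')
# 25 parses as a JSON number, so "type": "number" should accept it
assert matches_type(config["page-size"], "number")
```

If a numeric value in the config is quoted (`"25"`), it arrives as a string and fails a `"number"` schema, which would explain the error going away when the spec type is changed to string.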
  • Marwan

    01/17/2022, 9:59 AM
    Hi guys, I'm building a connector where the refresh token changes every time you get a new access token from a refresh token. Is there some way to store the newly generated refresh token? Do you have an example connector that does this?
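A minimal sketch of the pattern (not Airbyte's API; `token_store` and `refresh_call` are illustrative stand-ins): with single-use refresh tokens, every refresh returns a new refresh token that must be persisted before the old one is discarded, or the connection breaks on the next sync.

```python
class RotatingTokenAuth:
    """Sketch of an authenticator for APIs that rotate the refresh token."""

    def __init__(self, token_store, refresh_call):
        self.token_store = token_store    # e.g. a dict backed by config persistence
        self.refresh_call = refresh_call  # callable hitting the OAuth token endpoint

    def get_access_token(self):
        resp = self.refresh_call(self.token_store["refresh_token"])
        # Persist the rotated refresh token immediately -- losing it
        # invalidates the connection.
        self.token_store["refresh_token"] = resp["refresh_token"]
        return resp["access_token"]

# Illustrative usage with a fake OAuth endpoint:
store = {"refresh_token": "old"}
def fake_refresh(rt):
    return {"access_token": "A1", "refresh_token": "new-" + rt}

auth = RotatingTokenAuth(store, fake_refresh)
token = auth.get_access_token()
```

The key point is that the store is updated as a side effect of every token fetch, so the rotated refresh token survives between syncs.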
  • Ariyo Kabir

    01/26/2022, 5:38 PM
    Hello team, I am trying to add the `accounts` stream to Stripe. I am having trouble putting the schema together. I used the `openAPIspec2Json` script, but it creates the schema with references, and when I try to use the `--stand-alone` flag, I get a circular dependency error. Is there any workaround for this, or do I have to stitch the schema together manually?
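A hedged sketch of one workaround (not the `openAPIspec2Json` script itself): inline local `#/definitions/...` `$ref`s by hand, and break cycles by emitting a permissive empty schema instead of recursing forever, which is the usual source of a circular-dependency error.

```python
def resolve(node, root, seen=()):
    """Inline local JSON-schema $refs, replacing circular refs with {}."""
    if isinstance(node, dict):
        ref = node.get("$ref")
        if ref:
            if ref in seen:
                return {}  # cycle detected: fall back to "any object"
            target = root
            for part in ref.lstrip("#/").split("/"):
                target = target[part]
            return resolve(target, root, seen + (ref,))
        return {k: resolve(v, root, seen) for k, v in node.items()}
    if isinstance(node, list):
        return [resolve(v, root, seen) for v in node]
    return node

# Illustrative self-referencing schema:
example = {
    "definitions": {
        "account": {
            "type": "object",
            "properties": {"parent": {"$ref": "#/definitions/account"}},
        }
    },
    "$ref": "#/definitions/account",
}
flat = resolve(example, example)
```

An empty schema validates anything, so the circular branch becomes a harmless catch-all rather than an infinite expansion.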
  • Jags (Kloud.io)

    01/27/2022, 11:42 AM
    Hello team, I’m trying to connect the destination to Alibaba Cloud ApsaraDB for MongoDB and I get this error:

    Timed out after 30000 ms while waiting for a server that matches com.mongodb.client.internal.MongoClientDelegate$1@2575f671. Client view of cluster state is {type=REPLICA_SET, servers=[{address=dds-pub.mongodb.ap-southeast-5.rds.aliyuncs.com:3717, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketWriteException: Exception sending message}, caused by {javax.net.ssl.SSLHandshakeException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target}, caused by {sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target}

    Can you help me fix this?
  • Faris Alfa Mauludy

    01/27/2022, 1:45 PM
    Hi, we are checking whether Airbyte is suitable for us. We want to connect WooCommerce (through open source) with Data Studio, but we can't seem to find out whether Airbyte can also read product categories.
  • Ali Mojiz

    01/27/2022, 3:30 PM
    The client can enter credentials in a custom UI, and those credentials could be imported into Airbyte through an API for that specific connector.
  • Daniel Eduardo Portugal Revilla

    01/27/2022, 5:13 PM
    Hello! I created an HTTP connector. For credentials I used parameters, but the type is string and it shows my password in plain text. Is there another type for credential inputs?
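For reference, the connector spec supports an `airbyte_secret` flag on a property, which makes the UI mask the input and store it as a secret. A minimal spec.json fragment (the property name here is illustrative):

```json
"api_password": {
    "type": "string",
    "description": "Password used to authenticate with the API",
    "airbyte_secret": true
}
```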
  • Daniel Eduardo Portugal Revilla

    01/27/2022, 5:17 PM
    There are some duplicated items in io.airbyte.workers.normalization.NormalizationRunnerFactory. No real side effect, just confusing for plugin developers: https://github.com/airbytehq/airbyte/blame/b6926d44d47073466fc360b6a21aaa88abfc405[…]o/airbyte/workers/normalization/NormalizationRunnerFactory.java
  • Jove Zhong

    01/27/2022, 7:01 PM
    Hi!! How can I pass parameters for my data ingestion to S3? For example, I need to pull only new data every day and store it in partitions on S3. I was able to create parameters, but only when I create the source.
  • Daniel Eduardo Portugal Revilla

    01/27/2022, 7:46 PM
    Hello, apologies if this was asked before. What is the easiest way to import a custom connector into a running instance of Airbyte on GCP? I wasn’t able to find any documentation beyond a mention of using Docker Hub.
  • Tomas Balciunas

    01/28/2022, 5:09 AM
    Hi, I'm having some issues with JSON validation I was hoping you could help with. The errors I'm getting are inconsistent and not reproducible, in the sense that they occur for a different data point on separate runs. An example error is below:

    2022-01-28 04:55:07 INFO i.a.v.j.JsonSchemaValidator(test):56 - JSON schema validation failed. 
    errors: $: string found, object expected
    2022-01-28 04:55:07 ERROR i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$1):70 - Validation failed: "type"

    Can you please give me any insight into this issue? I can confirm that the data points are as expected by running the script locally.
  • will

    01/28/2022, 11:33 AM
    Hi, I am trying to build a Python source (using the Python source template) and was successful in implementing a full refresh sync. However, I am unable to add the incremental feature to my stream (the incremental option does not show in the Airbyte UI). I have done the following, with no success: I am passing these values while creating the stream during the discover execution.
    "supported_sync_modes": [
    "full_refresh",
    "incremental"
    ],
    "source_defined_cursor": True,
    "default_cursor_field":["json_data","properties","updated_at"],
    "source_defined_primary_key": ["json_data","properties","_id"],
    Am I missing something?
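A hedged sketch of the piece the Python CDK usually needs for this: a source-defined incremental stream generally declares its cursor on the stream class (`cursor_field`) and implements `get_updated_state`, rather than setting those keys in the discovered catalog by hand. The class below shows only that state logic, without the `airbyte_cdk` import, so it is self-contained; a real stream would subclass the CDK's `Stream`.

```python
class MyStream:
    """Sketch of the incremental hooks on a CDK-style stream (illustrative)."""

    cursor_field = "updated_at"   # advertising a cursor enables Incremental in the UI
    source_defined_cursor = True

    def get_updated_state(self, current_state, latest_record):
        # Keep the greatest cursor value seen so far (ISO dates compare lexically).
        latest = latest_record.get(self.cursor_field, "")
        current = (current_state or {}).get(self.cursor_field, "")
        return {self.cursor_field: max(latest, current)}
```

If `cursor_field` is not exposed on the stream, the platform has no way to know the stream supports incremental, which would match the symptom of the option not appearing in the UI.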
  • Kévin Maschtaler

    01/28/2022, 3:16 PM
    My bad, I tried to do it from the destination seeds file and not from the UI 🤦‍♂️. I found it under Settings > Destinations > + add a destination. That part of the documentation wasn't 100% up to date.
  • Kévin Maschtaler

    01/28/2022, 4:37 PM
    Hi! I’m a developer working on adding a new Python HTTP API source connector, following the guide. I’ve got things working up to the step of adding the connector to the UI, but somehow I can’t add the source connector locally following the instructions. From the Docker server log, it seems it cannot get the spec. Can someone help?
  • Phoebe Yang

    01/28/2022, 4:40 PM
    Hello, I am having problems pulling data from an API: when I retrieve only 10,000 objects, the EC2 instance dies. I tried the same pull locally using python requests and it works well, in around 3 minutes. My AWS EC2 has: 30 GB SSD, 4 GB RAM, 2 vCPU. In addition, when I pull 1,000 using Airbyte, the size is too big.
    class ServicesnowApi(HttpStream):
        url_base = "https://.com/api/now/v1/"

        # Set this as a noop.
        primary_key = None

        def __init__(self, limit: str, sys_created_from: str, sys_created_to: str, **kwargs):
            super().__init__(**kwargs)
            # Here's where we set the variables from our input to pass them down to the source.
            self.limit = limit
            self.sys_created_from = sys_created_from
            self.sys_created_to = sys_created_to

        def path(self, **kwargs) -> str:
            # This defines the path to the endpoint that we want to hit.
            limit = self.limit
            sys_created_from = self.sys_created_from
            sys_created_to = self.sys_created_to
            return f"table/incident?sysparm_offset=0&sysparm_limit={limit}&sysparm_query=sys_created_on>={sys_created_from} 08:00^sys_created_on<{sys_created_to} 08:00^active=ISNOTEMPTY"

        def request_params(
                self,
                stream_state: Mapping[str, Any],
                stream_slice: Mapping[str, Any] = None,
                next_page_token: Mapping[str, Any] = None,
        ) -> MutableMapping[str, Any]:
            # Pass the limit and the created-date window through as query params.
            return {"limit": self.limit, "sys_created_from": self.sys_created_from, "sys_created_to": self.sys_created_to}

        def parse_response(
                self,
                response: requests.Response,
                stream_state: Mapping[str, Any],
                stream_slice: Mapping[str, Any] = None,
                next_page_token: Mapping[str, Any] = None,
        ) -> Iterable[Mapping]:
            # The response is a simple JSON whose schema matches our stream's schema exactly,
            # so we just return a list containing the response.
            return [response.json()]

        def next_page_token(self, response: requests.Response) -> Optional[Mapping[str, Any]]:
            # No pagination is implemented yet, so return None to indicate there are no more pages.
            return None
  • Siddharth Putuvely

    01/29/2022, 12:33 PM
    Hello; I get the following error when I test my source connector in Airbyte. If I debug it locally, it works like a charm... Any idea? Is this an error on the source-connector side, in Airbyte, or in the destination connector? What does it mean, and where is an object expected? Thanks a lot.

    2022-01-29 11:54:15 INFO i.a.v.j.JsonSchemaValidator(test):56 - JSON schema validation failed. 
    errors: $.record.data: string found, object expected
    2022-01-29 11:54:15 ERROR i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$1):70 - Validation failed: {"type":"RECORD","record":{"stream":"email","data":"{\n  \"id\": \"8994962\"\n}","emitted_at":1643457255165}}
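The log above gives a clue: `record.data` is being emitted as a *string* containing JSON (`"{\n  \"id\": \"8994962\"\n}"`) rather than as a parsed object, which is exactly what "string found, object expected" means. A minimal sketch of the fix on the source side (the `normalize` helper is illustrative, not an Airbyte API): decode any string payload before emitting it from `parse_response`.

```python
import json

def normalize(record):
    """Decode a record whose "data" field is a JSON string instead of an object."""
    data = record["data"]
    if isinstance(data, str):
        record["data"] = json.loads(data)
    return record

# The failing record from the log, reduced:
fixed = normalize({"data": "{\n  \"id\": \"8994962\"\n}"})
```

Once `data` is a dict, the platform's JSON-schema validation sees the object it expects.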
  • flow

    01/30/2022, 9:09 PM
    Hello, I have successfully generated my connector, but it says that the instructions to implement it are in the generated directory. I have gone to the directory but am not seeing any TODO files or anything like that. Does anyone know where this is?
  • Kerim Tricic

    01/31/2022, 12:11 AM
    Hi, is there any way to tune, either from the UI or otherwise, the number of retries on a particular sync job? Currently, it appears that the maximum number of failed attempts is three. If this is possible, is it possible per connection, or globally?
  • Jordan Velich

    01/31/2022, 12:22 AM
    Hi again, just wondering whether it is possible for two sync jobs to be running at the same time on a given connection. Say I set the sync frequency to 5 minutes and one of the syncs takes 6 minutes; what will happen in this case?
  • user

    01/31/2022, 7:51 PM
    This message was deleted.
  • Martin Prejean

    02/01/2022, 5:51 PM
    Hello, I have created a connector for a ServiceNow API. Is there a way to retrieve the data by date, so that each execution brings me the data for the following date, or a way of doing a backfill for each day? At the moment I have only been able to retrieve the same data using the same endpoint; fetching all data and overwriting would not be good practice.

    https://mycompany.service-now.com/api/now/v1/table/incident?sysparm_offset=0&sysparm_limit=1000000&sysparm_query=sys_created_on>=2020-01-01 08:00^sys_created_on<2021-12-02 08:00^active=ISNOTEMPTY
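One common pattern for this (a hedged sketch, not a specific Airbyte API): split the backfill into one slice per day, so each sync, and each retry, fetches a bounded date window instead of the whole table, and the window bounds are substituted into the query.

```python
from datetime import date, timedelta

def daily_slices(start, end):
    """Yield one {"from", "to"} window per day in [start, end)."""
    day = start
    while day < end:
        yield {"from": day.isoformat(), "to": (day + timedelta(days=1)).isoformat()}
        day += timedelta(days=1)

slices = list(daily_slices(date(2020, 1, 1), date(2020, 1, 4)))
```

In a CDK stream this kind of generator would typically back `stream_slices`, with `sys_created_on>={from}^sys_created_on<{to}` built from each slice.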
  • Alessandro Duico

    02/02/2022, 10:48 AM
    What is the proper way to set up a Stream when there are multiple clients whose data need to be fetched? I was thinking of having stream_state store a list[dict] with a different "last_fetched_record_date" for every customer. Is there a better way to achieve this?
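A hedged sketch of one alternative to the `list[dict]` idea: key the per-client cursors by client id in a dict. A dict merges cleanly on each state update and makes lookups O(1), whereas a list of dicts has to be scanned and deduplicated. Names here are illustrative.

```python
def update_state(state, client_id, record_date):
    """Return a new state dict with this client's cursor advanced if needed."""
    cursors = dict(state or {})
    # ISO-8601 date strings compare correctly as strings.
    cursors[client_id] = max(cursors.get(client_id, ""), record_date)
    return cursors

state = update_state({}, "client_a", "2022-01-15")
state = update_state(state, "client_b", "2022-01-20")
```

Each client would then typically become its own stream slice, with the slice reading its cursor out of this dict.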
  • Alessandro Duico

    02/02/2022, 3:15 PM
    Hello, I am trying to implement incremental streams or slices to improve data-recovery performance, since pulling everything in a single block takes a lot of resources and the AWS instance crashes. Following other sources, I set state_checkpoint_interval, but it keeps bringing back all the records in one go; it should be 100 by 100, I think.
    class ServicesnowApi(HttpStream):
        url_base = "https://xxx.service-now.com/api/now/v1/"
    
        # Set this as a noop.
        primary_key = None
        # Save the state every 100 records
        state_checkpoint_interval = 100
        page_size = 100
        cursor_field = "sys_updated_on"
    
        def __init__(self, limit: str, sys_created_from: str, **kwargs):
            super().__init__(**kwargs)
            # Here's where we set the variable from our input to pass it down to the source.
            self.limit = limit
            self.sys_created_from = sys_created_from
    
    
        def path(self, **kwargs) -> str:
            # This defines the path to the endpoint that we want to hit.
            limit = self.limit
            sys_created_from = self.sys_created_from
            return f"table/incident?sysparm_offset=0&sysparm_limit={limit}&sysparm_query=sys_created_on>={sys_created_from} 00:00^active=ISNOTEMPTY"
    
    
        def request_params(
                self,
                stream_state: Mapping[str, Any],
                stream_slice: Mapping[str, Any] = None,
                next_page_token: Mapping[str, Any] = None,
        ) -> MutableMapping[str, Any]:
            limit = self.limit
            sys_created_from = self.sys_created_from
            return {"limit": limit, "sys_created_from":sys_created_from}
    
    
        def parse_response(
                self,
                response: requests.Response,
                stream_state: Mapping[str, Any],
                stream_slice: Mapping[str, Any] = None,
                next_page_token: Mapping[str, Any] = None,
        ) -> Iterable[Mapping]:
            result = response.json()['result']
            return result
    
    
        def next_page_token(self, response: requests.Response) -> Optional[Mapping[str, Any]]:
            return None
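One likely explanation for the behaviour above: `state_checkpoint_interval` only controls how often STATE messages are emitted; it does not paginate requests. In the code shown, `next_page_token` always returns `None`, so the whole result set arrives in one response. A hedged sketch of the missing piece for an offset-based API like `sysparm_offset` (standalone function, illustrative names):

```python
def next_page_token(prev_offset, page, page_size=100):
    """Advance the offset by one page, or stop on the first short page."""
    if len(page) < page_size:
        return None  # a short page means this was the last one
    return {"offset": prev_offset + page_size}

# Full page of 100 records -> request the next page; short page -> stop.
first = next_page_token(0, list(range(100)))
last = next_page_token(100, list(range(37)))
```

In a CDK stream, the returned token would feed back into `request_params` (e.g. as `sysparm_offset`), so requests go out 100 records at a time and the checkpoint interval can actually take effect between pages.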
  • Daniel Eduardo Portugal Revilla

    02/02/2022, 6:49 PM
    Hi team, how are you? I need a Vertica destination connector but I can't find it. Is there any connector in development? If not, where can I find documentation to develop the connector?
  • Nathan Gille

    02/02/2022, 10:15 PM
    I'm building a custom source in Python and wanted to debug the google_sheets_source using my own credentials to get a better understanding of the process. I'm using the JSON key of the service account for authorization, but I keep getting the error

    {"type": "CONNECTION_STATUS", "connectionStatus": {"status": "FAILED", "message": "Please use valid credentials json file. Error: Invalid control character at: line 1 column 171 (char 170)"}}

    when I run the `check` command. It doesn't seem to like the `\n`s in the private key. Any advice?
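The "Invalid control character" error is what Python's JSON parser raises when a literal newline appears inside a quoted string, which happens when a service-account key is pasted with real line breaks inside the private key. A small sketch of the failure and the fix (escape the newlines as `\n`):

```python
import json

# A key pasted with a real newline inside the quoted value: invalid JSON.
bad = '{"private_key": "-----BEGIN\n-----END"}'

# Escaping the newline as the two characters backslash-n makes it parse.
good = bad.replace("\n", "\\n")
key = json.loads(good)["private_key"]
```

After parsing, the `\n` escape turns back into a real newline inside the key, which is what the Google client expects.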
  • Nathan Gille

    02/03/2022, 5:51 AM
    Hello Airbyte team! I have a problem deploying Airbyte with Docker Swarm mode. What should I do?
  • Famezilla Channel

    02/03/2022, 1:10 PM
    Hello, I would like to know whether someone has already started to work on a Consul (from HashiCorp) connector (source & destination) before jumping into it. 🙏 Thanks 🙇
  • Khristina Rustanovich

    02/03/2022, 3:56 PM
    Hello, I'm developing a connector and I have a tricky operation to do:
    • I have a stream that returns a list of objects, each one with an id and some statistic fields
    • The statistic fields are wrong, but I can get the right values by calling another endpoint
    • This other endpoint allows batching requests for multiple object ids
    I'm not sure how I could implement that in a simple way. Do you have any recommendations? Maybe a connector that faces similar issues, or some doc that could help me? Thanks 😁
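A hedged sketch of the shape this usually takes (illustrative helpers, not a specific connector): collect the ids from the first stream, batch them to the correction endpoint, then merge the corrected statistics back onto the records by id.

```python
def chunked(ids, size=50):
    """Split a list of ids into batches the second endpoint will accept."""
    for i in range(0, len(ids), size):
        yield ids[i:i + size]

def merge_corrections(records, fetch_batch, size=50):
    """fetch_batch(ids) -> {id: corrected_fields}; overlay onto the records."""
    by_id = {r["id"]: dict(r) for r in records}
    for batch in chunked(list(by_id), size):
        for obj_id, fields in fetch_batch(batch).items():
            by_id[obj_id].update(fields)
    return list(by_id.values())

# Illustrative usage with a fake batch endpoint:
records = [{"id": "a", "clicks": 0}, {"id": "b", "clicks": 0}]
corrected = merge_corrections(records, lambda ids: {i: {"clicks": 7} for i in ids})
```

In Airbyte terms, the parent stream's ids could feed the batch calls via stream slices, with the merged records being what the stream actually emits.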
  • Maxime edfeed

    02/06/2022, 11:37 PM
    We are experimenting with a long-living source connector and seem to be observing that the RECORD messages are not processed by the destination connector until the source has exited successfully. Is this the case? My understanding, coming from the Singer paradigm, was that the output of the "tap" could be piped into the "target" for processing live as it comes in (like `tap | target`). Is this how Airbyte works?
  • Jackson Clarke

    02/07/2022, 9:15 AM
    Hello, I am developing a custom connector using the Python CDK. Having implemented Source ABC, I got stuck with incremental sync and wanted to check how to do it. Should I implement AbstractSource rather than Source? Please suggest.