# help-connector-development

    Rachel RIZK

    03/10/2023, 8:20 AM
    Hey everyone 👋 I’m working on a PR to add missing dimensional columns for Bing Ads. However, after launching the acceptance tests in the comments (thanks @Sherif Nada 🙏), one integration test is failing: specifically a test implemented last month that compares expected records with actual ones. I think the actual ones are based on the Airbyte sandbox account and the values have drifted since the first implementation; am I right? If so, does that mean they need to be updated by someone from the team for every PR?

    Łukasz Aszyk

    03/10/2023, 11:11 AM
    Hi Airbyte CDK team. Thank you for developing the
    connector-builder
    in the UI. Very helpful feature. Is there a chance you could add an
    inject_into: URL
    option when handling pagination, please? I think it's a common case that these parameters are passed through the URL, so having that would help significantly. Thank you
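In case it helps in the meantime: recent versions of the declarative CDK appear to support a RequestPath page_token_option, which injects the next-page token into the request URL path rather than a query parameter or header. A rough sketch, where the cursor_value expression is illustrative and assumes the API returns a full next-page URL in response.links.next:

```yaml
paginator:
  type: DefaultPaginator
  pagination_strategy:
    type: CursorPagination
    # illustrative: assumes the API returns the next-page URL here
    cursor_value: "{{ response.links.next }}"
  page_token_option:
    # injects the token into the request path instead of a query parameter
    type: RequestPath
```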

    Łukasz Aszyk

    03/10/2023, 1:27 PM
    Also, I think this belongs in this channel: CDK connector acceptance test error. I've created the
    manifest.yaml
    file with the
    connector-builder
    and everything seems to work just fine, but I still got a few
    pydantic
    related errors. I've checked the other
    .json
    files that the parser could be validating against, but I'm still getting this. Has anyone seen and solved it before?

    Danish Raza

    03/10/2023, 3:52 PM
    hey team, quick question: is there a way/param that I can pass after
    --catalog
    to run a single stream only? For example, I have 4 streams inside
    configured_catalog.json
    but I want to run only one of them.
    python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
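As far as I know there is no flag on read for selecting a single stream; the usual workaround is to write a second catalog file containing only the stream you want and pass that to --catalog. A minimal sketch (the file names below are placeholders):

```python
import json

def filter_catalog(catalog: dict, stream_name: str) -> dict:
    """Return a copy of a configured catalog keeping only one stream."""
    streams = [
        s for s in catalog["streams"]
        if s["stream"]["name"] == stream_name
    ]
    if not streams:
        raise ValueError(f"stream {stream_name!r} not found in catalog")
    return {**catalog, "streams": streams}
```

Load integration_tests/configured_catalog.json, run it through filter_catalog, dump the result to e.g. integration_tests/one_stream_catalog.json, and point --catalog at that file instead.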

    Aazam Thakur

    03/11/2023, 11:03 AM
    Hi team, can anyone help me figure out why I'm getting this error when I run the unit test command in the Python CDK?

    VISHAL B

    03/13/2023, 7:37 AM
    Hi team, when I migrate data from MySQL to BigQuery, JSON fields are typecast to String. How can I solve this issue? Can I solve it using the CDK? Right now I am using Airbyte hosted on K8s.

    Josh Jeffries

    03/13/2023, 10:24 AM
    There seems to be an issue with the transformations, or with the documentation here: https://docs.airbyte.com/connector-development/config-based/understanding-the-yaml-file/record-selector. I am trying to use it to remove unnecessary fields/arrays from a data source, but I can't get it to work. Airbyte version (updated today): 0.42.0 (although I've been having the issue on 0.41 as well). CDK version (again updated today, but the issue occurred on the previous version too): 0.30.1. Sample from my yaml:
    record_selector:
            type: RecordSelector
            extractor:
              type: DpathExtractor
              field_path:
                - items
            transformations:
                - type: RemoveFields
                  field_pointers:
                    - [ "items", "links" ]
    My json record response
    {
      "userId": 860,
      "firstName": "__WebServiceUser_",
      "email": "",
      "deleted": true,
      "mentionName": "@__webserviceuser_",
      "createdAt": "2021-08-24T00:02:04Z",
      "updatedAt": "2021-08-24T00:02:04Z",
      "links": {
        "self": "<https://eu2api.jobadder.com/v2/users/860>",
        "userGroups": "<https://eu2api.jobadder.com/v2/users/860/usergroups>"
      }
    }
    This should have removed the links object, but it hasn't. I have tried various field_pointers, all with the same results. I also can't get the AddFields transformation to work.
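Two things worth checking, both hedged since I'm going from the declarative schema rather than your full manifest: transformations is, as far as I can tell, a stream-level field rather than a child of record_selector, so nested under the selector it may be silently ignored; and field_pointers are applied to each record after extraction, so once DpathExtractor has unwrapped items, the pointer should be ["links"], not ["items", "links"]. Roughly:

```yaml
my_stream:
  type: DeclarativeStream
  retriever:
    type: SimpleRetriever
    record_selector:
      type: RecordSelector
      extractor:
        type: DpathExtractor
        field_path:
          - items
  # stream-level, not under record_selector
  transformations:
    - type: RemoveFields
      field_pointers:
        # relative to each extracted record
        - ["links"]
```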

    Fred Manley

    03/13/2023, 1:58 PM
    Is the .NET CDK still an option? It's linked in the docs, but the repo hasn't been updated in two years and the CLI references Java or Python, not .NET.

    Elliot Trabac

    03/13/2023, 2:39 PM
    Hey team! With the low-code CDK, what happens when the partition_router is an incremental stream? Do we need to do something specific on the child stream?
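Not authoritative, but newer versions of the low-code CDK appear to add an incremental_dependency flag on ParentStreamConfig for exactly this case, so the parent's incremental cursor is kept in sync while the child reads. A sketch, with stream and field names as placeholders:

```yaml
partition_router:
  type: SubstreamPartitionRouter
  parent_stream_configs:
    - type: ParentStreamConfig
      stream: "#/definitions/parent_stream"
      parent_key: id
      partition_field: parent_id
      # ties the child's reads to the parent's incremental cursor
      incremental_dependency: true
```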

    Dave Tomkinson

    03/13/2023, 5:26 PM
    Cross posting this from airbyte-help https://airbytehq.slack.com/archives/C021JANJ6TY/p1678725203320869

    Andre Santos

    03/13/2023, 7:57 PM
    Hello everyone, I'm creating a custom connector for reading data from Salesforce Reports, using the Salesforce Reports and Analytics API. This API has no support for pagination and only returns the first 2000 rows of the report. I'm using the simple-salesforce library to communicate with it, and I had to write a class to implement the pagination strategy myself: I send the request to the report endpoint sorting the results by id, then keep sending requests for the next 2000 rows until I get a response with the 'all data' property, so I know there's no more data to retrieve. At the moment I see no use for the methods next_page_token, path, and the other hooks in HttpStream, yet it seems that when we run python main.py read, Airbyte uses those methods to make the whole thing work. I've tried to find samples or documentation that would make it clear how to adapt my code, but it's been hard to find. Can you share some tips to help me on my journey?
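For what it's worth, your loop maps onto the HttpStream hooks fairly directly: next_page_token looks at the last response and returns the cursor for the next request (or None to stop), and request_params merges that cursor in. A dependency-free sketch of the shape, where allData, idGreaterThan, sortBy, and the row layout are assumptions based on your description rather than the real Reports API:

```python
from typing import Any, Optional

PAGE_SIZE = 2000  # the Reports API caps each response at 2000 rows

def next_page_token(response_json: dict) -> Optional[dict]:
    """Mirrors HttpStream.next_page_token: return the cursor for the
    next request, or None when the report has been fully read."""
    # assumption: the response carries an 'allData' flag as you describe
    if response_json.get("allData", True):
        return None
    rows = response_json.get("rows", [])
    if not rows:
        return None
    # keyset pagination: remember the last id seen (results sorted by id)
    return {"last_id": rows[-1]["id"]}

def request_params(next_token: Optional[dict]) -> dict[str, Any]:
    """Mirrors HttpStream.request_params: filter past the last id seen."""
    params = {"sortBy": "id"}  # hypothetical parameter names
    if next_token:
        params["idGreaterThan"] = next_token["last_id"]
    return params
```

With those two hooks in place, the read loop behind python main.py read keeps requesting pages until next_page_token returns None; path can simply return the report endpoint.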

    Andre Santos

    03/13/2023, 7:59 PM
    I used the documentation at https://docs.airbyte.com/connector-development/tutorials/cdk-tutorial-python-http when creating my new source, and I could follow it until the 5th step. I started getting swamped in the 6th step.

    Fred Manley

    03/13/2023, 8:24 PM
    Is there a reason the python examples/tutorials were deleted from GitHub? It's made the speedrun tutorial (https://docs.airbyte.com/connector-development/tutorials/cdk-speedrun) less of a speedrun and more of an exercise in extreme annoyance. Did the referenced documents (json schemas) get moved anywhere?

    Bart Veenstra

    03/14/2023, 3:20 PM
    Hi all. I am creating a destination that can write parquet files to Azure Data Lake Gen2 folders, as those are way easier to process in Synapse. I'm also adding support for SAS keys, since Storage Account Keys are discouraged. However, I am stuck on the connector acceptance tests, as the connector can't find destination_config.json during startup.
    2023-03-14T16:14:49.249+0100 INFO Preparing command: docker run --rm --init -i -w /data/job --log-driver none --name destination-azure-data-lake-storage-gen2-write-0-0-yqfcb --network host -v /tmp/airbyte_tests/test9476898619965199338:/data -v /tmp/airbyte_tests/output3489329161128183704:/local -e USE_STREAM_CAPABLE_STATE=false -e FIELD_SELECTION_WORKSPACES= -e STRICT_COMPARISON_NORMALIZATION_WORKSPACES= -e APPLY_FIELD_SELECTION=false -e WORKER_CONNECTOR_IMAGE=airbyte/destination-azure-data-lake-storage-gen2:dev -e WORKER_JOB_ATTEMPT=0 -e AUTO_DETECT_SCHEMA=true -e WORKER_JOB_ID=0 -e STRICT_COMPARISON_NORMALIZATION_TAG=strict_comparison airbyte/destination-azure-data-lake-storage-gen2:dev write --config destination_config.json --catalog destination_catalog.json
    2023-03-14T16:14:51.295+0100 INFO 2023-03-14T15:14:50.847+0000 INFO integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json}
    2023-03-14T16:14:51.300+0100 INFO 2023-03-14T15:14:50.856+0000 INFO Running integration: io.airbyte.integrations.destination.azure_data_lake_storage_gen2.AzureDataLakeStorageGen2Destination
    2023-03-14T16:14:51.300+0100 INFO 2023-03-14T15:14:50.857+0000 INFO Command: WRITE
    2023-03-14T16:14:51.308+0100 INFO 2023-03-14T15:14:50.857+0000 INFO Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'}
    2023-03-14T16:14:51.335+0100 INFO 2023-03-14T15:14:50.878+0000 ERROR Something went wrong in the connector. See the logs for more details
    Is there a trick to using the secrets/conf.json for running the connector acceptance tests during local development?

    Adham Suliman

    03/14/2023, 5:13 PM
    Does anyone know of a way to insert the pagination options into the request_body_data as a variable for a graphql source? The documentation seems to only provide the ability to access these values by appending them to the end of an API request.
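One hedged pointer: the RequestOption used for page_token_option takes inject_into values of body_data or body_json in addition to request_parameter, which should let you place the cursor inside the GraphQL request body instead of the URL. A sketch, with the cursor path and the variables.after name purely illustrative for a typical GraphQL API:

```yaml
paginator:
  type: DefaultPaginator
  pagination_strategy:
    type: CursorPagination
    # illustrative: pull the endCursor out of a typical GraphQL response
    cursor_value: "{{ response.data.pageInfo.endCursor }}"
  page_token_option:
    type: RequestOption
    inject_into: body_json
    # illustrative: lands in the request body as variables.after
    field_name: "variables.after"
```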

    Łukasz Aszyk

    03/14/2023, 11:35 PM
    Hi community, I'm building a connector using the CDK against a pretty simple REST API. All seems to be set up correctly and I get the right responses in tests, but when running the acceptance test below, I stumble upon this issue (I'll paste it in the thread, for chat sanity):
    source-tidsbanken-api % python -m pytest integration_tests -p integration_tests.acceptance

    Ryan (Airbyte)

    03/15/2023, 3:30 PM
    Hey team, @Semih Korkmaz is facing an issue with the Connector Builder, specifically with the data type of number acting as int. Here is a link to the thread where the issue was raised, but @Semih Korkmaz feel free to add more context.

    Sami TAAISSAT

    03/16/2023, 1:03 PM
    Hello team, I'm migrating a connector from the low-code CDK to the Python CDK, with the goal of making a PR for a new connector we are using at my company. I have just one issue: in the low-code CDK there's an option to inject the next page URL, but the only examples I have found for the Python CDK inject a new parameter into the request parameters. The API I'm ingesting from specifies the next page URL in this field:
    json_response["links"]["next"]
    How can I use this URL directly in the Python CDK?
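One common pattern in the Python CDK, sketched under the assumption that links.next is an absolute URL: have next_page_token parse that URL into a path and query parameters, then return the path from path() and merge the params in request_params(). The parsing step is dependency-free:

```python
from typing import Optional
from urllib.parse import parse_qsl, urlparse

def next_page_token(response_json: dict) -> Optional[dict]:
    """Extract the next-page URL, as HttpStream.next_page_token would.
    Returns None when there is no further page."""
    next_url = (response_json.get("links") or {}).get("next")
    if not next_url:
        return None
    parsed = urlparse(next_url)
    return {
        "path": parsed.path.lstrip("/"),  # url_base is prepended by the CDK
        "params": dict(parse_qsl(parsed.query)),
    }
```

In the stream class, path() returns the token's "path" when a token is present (falling back to the first-page path), and request_params() returns the token's "params".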

    Karri Shivaharsha

    03/17/2023, 12:02 PM
    Hello Team, I have a couple of questions. Could you please help me? First, let me explain what I am trying to achieve. Detailed description: I pulled open-source Airbyte to develop a custom source for SAP HANA, and I was successful in implementing the required functions (check, read, discover). I verified them using python main.py check / discover / read, with secrets and a catalog during read. After that, I built the image of the custom source according to the README. From there I am stuck on a few problems: #1 How can I see my custom source in the Airbyte UI after developing it? Are there any next steps after implementing those functions, like configuring source definitions etc.? If so, is there any detailed documentation? #2 Do we need to configure AWS credentials in the .env file in the repo in order to connect to an S3 destination? #3 Do we have any docker command that reads the configured catalog and writes into local JSON or some other destination? Please help me with the above problems.

    Fred Manley

    03/17/2023, 3:01 PM
    About to push my first custom connector to "dev" and wondering what the best practice is here. Namely, I've developed the connector locally after pulling down Airbyte, and I can run it locally to my satisfaction from the UI. Now I have to get it to the Azure VM running Airbyte in my dev environment. A few questions there: 1. CI/CD, how do? Are other people copying and pasting from the folder the CLI created into a new folder and uploading just that to a Git repo? Is there a better developer experience than that? (presumably I'm not going to upload an entire fork of Airbyte) 2. Accessing the docker registry from a VM, how do? Namely simply logging in to azure from the VM running the containers doesn't work like it does locally, presumably because Airbyte is running in docker containers which aren't themselves logging into azure. Thanks!

    Dieter De Beuckeleer

    03/20/2023, 3:30 PM
    Heya! I really like the idea of the connector-builder-ui. I was trying it out, but I ran into a problem and was wondering if you can help me out. When creating a connector and then adding it to Airbyte, I stumbled into an error message:
    No properties node in stream schema
    . As I understand it, something is missing in the yml file generated by the connector builder. Is there a way to add that piece of information with the connector builder, or should I add it manually myself?
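Going only by the error text, that message usually means the stream's JSON schema has no top-level properties object. If the builder didn't infer one, you can declare it by hand in the yaml with an inline schema loader; the field names here are placeholders for your stream's actual columns:

```yaml
schema_loader:
  type: InlineSchemaLoader
  schema:
    $schema: "http://json-schema.org/draft-07/schema#"
    type: object
    # the missing piece: a top-level properties node
    properties:
      id:
        type: integer
      name:
        type: ["null", "string"]
```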

    Kevin Conseil

    03/20/2023, 3:36 PM
    Hi everyone, is there a Business Central connector for Airbyte? Or what is the best approach to retrieve tables from Business Central (cloud version)?

    Lenin Mishra

    03/21/2023, 6:03 AM
    How can I contribute a low-code CDK connector that I have developed to Airbyte?

    Shreepad Khandve

    03/23/2023, 10:59 AM
    Hello team, I am working on a low-code custom connector and I have the following question about the paginator: I am getting the following response from my Yotpo products API, and I want to iterate over all pages. Here is my paginator config:
    paginator:
          type: DefaultPaginator
          page_size: "#/definitions/page_size"
          limit_option:
            inject_into: "request_parameter"
            field_name: "count"
          page_token_option:
            type: RequestOption
            inject_into: "request_parameter"
            field_name: "page"
          pagination_strategy:
            type: "PageIncrement"
            page_size: 150
    But when I run the read command to see the results, I am not able to iterate over all pages. Can anyone help me understand the logic?
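Two hedged observations on the snippet: page_size given as the plain string "#/definitions/page_size" won't be resolved as a reference (a reference needs the $ref key), and in the versions of the declarative schema I've seen, the option that injects the page size is named page_size_option rather than limit_option. Also note that PageIncrement stops as soon as a page returns fewer records than page_size, so it's worth confirming the record selector actually extracts all the records per page. Roughly:

```yaml
paginator:
  type: DefaultPaginator
  page_size_option:            # named limit_option in some older versions
    type: RequestOption
    inject_into: request_parameter
    field_name: "count"
  page_token_option:
    type: RequestOption
    inject_into: request_parameter
    field_name: "page"
  pagination_strategy:
    type: PageIncrement
    page_size: 150             # must match what the API actually returns
```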

    Anton Marini

    03/23/2023, 7:12 PM
    Hi friends, I'm trying to fix a bug in the new weaviate connector.

    Anton Marini

    03/23/2023, 7:54 PM
    I've got Gradle set up and am trying to build the weaviate connector as-is, before my changes, via
    ./gradlew :airbyte-integrations:connectors:destination-weaviate:build
    and I get:
    > Task :airbyte-integrations:connectors:destination-weaviate:_unitTestCoverage FAILED
    [python] .venv/bin/python -m coverage run --data-file=unit_tests/.coverage.unitTest --rcfile=/Volumes/Documents/Repositories/airbyte/pyproject.toml -m pytest -s unit_tests -c /Volumes/Documents/Repositories/airbyte/pyproject.toml
             ============================= test session starts ==============================
             platform darwin -- Python 3.11.0, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /Volumes/Documents/Repositories/airbyte/airbyte-integrations/connectors/destination-weaviate/.venv/bin/python
             cachedir: .pytest_cache
             rootdir: /Volumes/Documents/Repositories/airbyte/airbyte-integrations/connectors/destination-weaviate/unit_tests, configfile: ../../../../pyproject.toml
             collecting ... collected 0 items / 1 error
             
             ==================================== ERRORS ====================================
             ________________________ ERROR collecting unit_test.py _________________________
             unit_tests/unit_test.py:8: in <module>
                 from destination_weaviate.client import Client
             destination_weaviate/__init__.py:6: in <module>
                 from .destination import DestinationWeaviate
             destination_weaviate/destination.py:9: in <module>
                 from airbyte_cdk import AirbyteLogger
             .venv/lib/python3.11/site-packages/airbyte_cdk/__init__.py:5: in <module>
                 from .connector import AirbyteSpec, Connector
             .venv/lib/python3.11/site-packages/airbyte_cdk/connector.py:14: in <module>
                 from airbyte_cdk.models import AirbyteConnectionStatus, ConnectorSpecification
             .venv/lib/python3.11/site-packages/airbyte_cdk/models/__init__.py:8: in <module>
                 from .airbyte_protocol import *
             .venv/lib/python3.11/site-packages/airbyte_cdk/models/airbyte_protocol.py:5: in <module>
                 from airbyte_protocol.models.airbyte_protocol import *
             .venv/lib/python3.11/site-packages/airbyte_protocol/models/__init__.py:3: in <module>
                 from .airbyte_protocol import *
             .venv/lib/python3.11/site-packages/airbyte_protocol/models/airbyte_protocol.py:374: in <module>
                 class AirbyteStateMessage(BaseModel):
             .venv/lib/python3.11/site-packages/pydantic/main.py:292: in __new__
                 cls.__signature__ = ClassAttribute('__signature__', generate_model_signature(cls.__init__, fields, config))
             .venv/lib/python3.11/site-packages/pydantic/utils.py:258: in generate_model_signature
                 merged_params[param_name] = Parameter(
             /Users/vade/miniforge3/envs/airbyte/lib/python3.11/inspect.py:2715: in __init__
                 raise ValueError('{!r} is not a valid parameter name'.format(name))
             E   ValueError: 'global' is not a valid parameter name
             =========================== short test summary info ============================
             ERROR unit_tests/unit_test.py - ValueError: 'global' is not a valid parameter...
             !!!!!!!!!!!!!!!!!!!! Interrupted: 1 error during collection !!!!!!!!!!!!!!!!!!!!
             =============================== 1 error in 0.83s ===============================
    
    FAILURE: Build failed with an exception.

    Anton Marini

    03/23/2023, 7:57 PM
    I've tried checking out from
    v0.42.0
    and for some reason this gradlew test is still failing

    Anton Marini

    03/23/2023, 8:12 PM
    ok

    Anton Marini

    03/23/2023, 8:13 PM
    I think I got it. I had to remove the local .gradle in the root of airbyte, which was built with venvs inherited from Python 3.11, which fucked it all up.

    Anton Marini

    03/23/2023, 8:13 PM
    and gradlew didn't respect the no-build-cache flag. Good lord. 😛