# help-connector-development
  • l

    Lenin Mishra

    06/20/2023, 6:48 AM
    Hello everyone! I am constantly having issues trying to read the invoices endpoint from Zoho Books. I am building it using the low-code CDK configuration. The setup is very simple and I am trying to authenticate the endpoint using OAuth. All the necessary configs are provided (please refer to the attached image)! However, I keep getting the error below
    Copy code
    File "/root/.pyenv/versions/3.9.11/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/requests_native_auth/abstract_oauth.py", line 33, in get_auth_header
        return {"Authorization": f"Bearer {self.get_access_token()}"}
      File "/root/.pyenv/versions/3.9.11/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/requests_native_auth/abstract_oauth.py", line 38, in get_access_token
        token, expires_in = self.refresh_access_token()
      File "/root/.pyenv/versions/3.9.11/lib/python3.9/site-packages/airbyte_cdk/sources/declarative/auth/oauth.py", line 116, in refresh_access_token
        return response_json[self.get_access_token_name()], response_json[self.get_expires_in_name()]
    KeyError: 'access_token'
    Now, when I run the same request in Postman I get all my results. It seems like something is going wrong with the generation of a new access token. For some reason, the token_refresh_endpoint is not returning a JSON with the access_token key. Can someone help me with this?
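Not a Builder fix in itself, but one way to see what the token endpoint actually returns is to call it outside the CDK and surface the whole response body. A minimal stdlib-only sketch (the URL assumes the US Zoho data center; Zoho uses region-specific accounts domains such as accounts.zoho.eu or accounts.zoho.in, and pointing at the wrong one is a common reason the response carries an error instead of an access_token):

```python
import json
import urllib.parse
import urllib.request

# Assumed US data-center endpoint; swap the domain for your Zoho region.
TOKEN_URL = "https://accounts.zoho.com/oauth/v2/token"

def extract_access_token(payload: dict) -> str:
    """Return the access token, or raise with the server's actual error body
    instead of the bare KeyError the CDK surfaces."""
    if "access_token" not in payload:
        raise RuntimeError(f"Token refresh failed, server said: {payload}")
    return payload["access_token"]

def refresh_access_token(client_id: str, client_secret: str, refresh_token: str) -> str:
    """POST the refresh grant and return the new access token."""
    data = urllib.parse.urlencode({
        "grant_type": "refresh_token",
        "client_id": client_id,
        "client_secret": client_secret,
        "refresh_token": refresh_token,
    }).encode()
    with urllib.request.urlopen(TOKEN_URL, data=data) as resp:
        return extract_access_token(json.load(resp))
```

If the body shown by the RuntimeError contains an error key rather than access_token, the likely culprit is the config (domain, scopes, or refresh token) rather than the CDK itself.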
  • s

    Slackbot

    06/20/2023, 8:51 AM
    This message was deleted.
  • c

    Chính Bùi Quang

    06/20/2023, 8:53 AM
    I want to set up getting data from the Bizfly app according to the following document: https://crm.bizfly.vn/apidoc/doc/#api-Help_Support-apiKey. The values in the header are encoded; for example, for the cb-access-sign field I had to write the Python connection code like this:
    Copy code
    time_now = str(int(time.time()))
    message = str(time_now).encode() + bizfly_project_token.encode()
    key = api_secret.encode()
    signature = hmac.new(key, message, hashlib.sha512).hexdigest()
    df = pd.DataFrame()
    url = 'https://api.bizfly.vn/crm/_api/base-table/find'
    headers = {
        'cb-access-key': access_token,
        'cb-project-token': bizfly_project_token,
        'cb-access-timestamp': time_now,
        'cb-access-sign': signature,
        'Content-Type': 'application/x-www-form-urlencoded'
    }
    payload = {
        "table": data_table,
        "limit": 1000,
        "skip": i,
        "output": "by-key",
        "query": query
    }
    How do I create a valid signature?
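Isolated from the request code above, the signature computation can be sketched and sanity-checked on its own (the credential values below are placeholders, not real Bizfly secrets):

```python
import hashlib
import hmac
import time

def sign_request(api_secret: str, project_token: str, timestamp: str) -> str:
    """Compute the cb-access-sign value: HMAC-SHA512 of timestamp + project
    token, keyed with the API secret, hex-encoded."""
    message = timestamp.encode() + project_token.encode()
    return hmac.new(api_secret.encode(), message, hashlib.sha512).hexdigest()

# Example with placeholder credentials.
ts = str(int(time.time()))
sig = sign_request("my-secret", "my-project-token", ts)
print(len(sig))  # a SHA-512 hex digest is always 128 characters
```

If the server still rejects the signature, the usual suspects are the timestamp drifting outside the server's tolerance window or the message being concatenated in a different order than the API expects.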
  • s

    Slackbot

    06/20/2023, 8:55 AM
    This message was deleted.
  • c

    Chính Bùi Quang

    06/20/2023, 8:57 AM
    How to use encode() and hashlib.sha512 in a param header in the Airbyte Builder? (edited)
  • c

    Chính Bùi Quang

    06/20/2023, 9:08 AM
    How to create an HMAC-SHA512 access signature in the Builder?
  • c

    Chidambara Ganapathy

    06/20/2023, 9:25 AM
    Hi Team, I am trying to build an AWS Cost Explorer connector using the CDK. This is the request syntax:
    Copy code
    response = client.get_cost_and_usage(
        TimePeriod={
            'Start': 'string',
            'End': 'string'
        },
        Granularity='DAILY'|'MONTHLY'|'HOURLY',
        Filter={
            'Or': [
                {'... recursive ...'},
            ],
            'And': [
                {'... recursive ...'},
            ],
            'Not': {'... recursive ...'},
            'Dimensions': {
                'Key': 'AZ'|'INSTANCE_TYPE'|'LINKED_ACCOUNT'|'LINKED_ACCOUNT_NAME'|'OPERATION'|'PURCHASE_TYPE'|'REGION'|'SERVICE'|'SERVICE_CODE'|'USAGE_TYPE'|'USAGE_TYPE_GROUP'|'RECORD_TYPE'|'OPERATING_SYSTEM'|'TENANCY'|'SCOPE'|'PLATFORM'|'SUBSCRIPTION_ID'|'LEGAL_ENTITY_NAME'|'DEPLOYMENT_OPTION'|'DATABASE_ENGINE'|'CACHE_ENGINE'|'INSTANCE_TYPE_FAMILY'|'BILLING_ENTITY'|'RESERVATION_ID'|'RESOURCE_ID'|'RIGHTSIZING_TYPE'|'SAVINGS_PLANS_TYPE'|'SAVINGS_PLAN_ARN'|'PAYMENT_OPTION'|'AGREEMENT_END_DATE_TIME_AFTER'|'AGREEMENT_END_DATE_TIME_BEFORE'|'INVOICING_ENTITY'|'ANOMALY_TOTAL_IMPACT_ABSOLUTE'|'ANOMALY_TOTAL_IMPACT_PERCENTAGE',
                'Values': [
                    'string',
                ],
                'MatchOptions': [
                    'EQUALS'|'ABSENT'|'STARTS_WITH'|'ENDS_WITH'|'CONTAINS'|'CASE_SENSITIVE'|'CASE_INSENSITIVE'|'GREATER_THAN_OR_EQUAL',
                ]
            },
            'Tags': {
                'Key': 'string',
                'Values': [
                    'string',
                ],
                'MatchOptions': [
                    'EQUALS'|'ABSENT'|'STARTS_WITH'|'ENDS_WITH'|'CONTAINS'|'CASE_SENSITIVE'|'CASE_INSENSITIVE'|'GREATER_THAN_OR_EQUAL',
                ]
            },
            'CostCategories': {
                'Key': 'string',
                'Values': [
                    'string',
                ],
                'MatchOptions': [
                    'EQUALS'|'ABSENT'|'STARTS_WITH'|'ENDS_WITH'|'CONTAINS'|'CASE_SENSITIVE'|'CASE_INSENSITIVE'|'GREATER_THAN_OR_EQUAL',
                ]
            }
        },
        Metrics=[
            'string',
        ],
        GroupBy=[
            {
                'Type': 'DIMENSION'|'TAG'|'COST_CATEGORY',
                'Key': 'string'
            },
        ],
        NextPageToken='string'
    )
    Can you please let me know how to define the schema for this? Thanks
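On the schema question: Airbyte stream schemas are JSON Schema documents. A hedged sketch for the records inside `ResultsByTime` of the `get_cost_and_usage` response, expressed as a Python dict (field names follow the boto3 response shape; trim or extend to the fields your stream actually emits):

```python
# Sketch of a JSON Schema for one ResultsByTime record; adjust to taste.
cost_and_usage_schema = {
    "$schema": "http://json-schema.org/draft-07/schema#",
    "type": "object",
    "properties": {
        "TimePeriod": {
            "type": "object",
            "properties": {
                "Start": {"type": "string", "format": "date"},
                "End": {"type": "string", "format": "date"},
            },
        },
        "Total": {
            "type": "object",
            # One entry per requested metric (e.g. "UnblendedCost").
            "additionalProperties": {
                "type": "object",
                "properties": {
                    "Amount": {"type": "string"},
                    "Unit": {"type": "string"},
                },
            },
        },
        "Groups": {
            "type": "array",
            "items": {
                "type": "object",
                "properties": {
                    "Keys": {"type": "array", "items": {"type": "string"}},
                    "Metrics": {"type": "object"},
                },
            },
        },
        "Estimated": {"type": "boolean"},
    },
}
```

Note that Cost Explorer returns monetary amounts as strings, so modelling `Amount` as `"string"` (rather than `"number"`) matches the raw API payload.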
  • m

    Mahesh Thirunavukarasu

    06/20/2023, 12:31 PM
    Hi. I need help adding a new stream to the QuickBooks connector. I am trying to add the QuickBooks CDC API to capture deleted invoice records, but the API throws an error in the CDK connector. It works in the Connector Builder UI, however the QuickBooks OAuth2 authentication type is not supported there. I'd appreciate any help on this topic.
  • c

    Chính Bùi Quang

    06/20/2023, 1:14 PM
    Hi everybody 😄 I am trying to set up a connection with Freshchat but I feel that the retrieved data is incomplete. Can anyone who has worked with Freshchat share their Builder setup with me?
  • o

    Octavia Squidington III

    06/20/2023, 1:45 PM
    🔥 Office Hours starts in 15 minutes 🔥 Topic and schedule posted in #C045VK5AF54. At 16:00 CEST / 10am EDT click here to join us on Zoom!
  • s

    Slackbot

    06/20/2023, 3:45 PM
    This message was deleted.
  • m

    Micky

    06/20/2023, 3:49 PM
    Hi, I have deployed Airbyte on AWS EC2 with CDC for a PostgreSQL database. If I drop the replication slot, then recreate it and reconfigure Airbyte, will the data in the destination (same database, same schema, same table name) be overwritten after an initial sync?
  • a

    Abdul Hameed

    06/20/2023, 3:55 PM
    Hi Team, I am trying to run generate.sh using npm and facing this error:
    Copy code
    airbyte-connector-generator@0.1.0 generate
    plop
    'plop' is not recognized as an internal or external command, operable program or batch file.
  • n

    Nohelia Merino

    06/20/2023, 8:35 PM
    @kapa.ai I am getting the following response error while trying to create a connector:
    Copy code
    {
      "message": "Bad Request",
      "_links": {
        "self": {
          "href": "/api/v1/source_definition_specifications/get",
          "templated": false
        }
      },
      "_embedded": {
        "errors": [
          {
            "message": "Required argument [SourceDefinitionIdWithWorkspaceId sourceDefinitionIdWithWorkspaceId] not specified",
            "path": "/sourceDefinitionIdWithWorkspaceId"
          }
        ]
      }
    }
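The error indicates the request body is missing the `sourceDefinitionIdWithWorkspaceId` argument. As a sketch, the body for `/api/v1/source_definition_specifications/get` presumably needs to carry both IDs; the shape below is inferred from the error message rather than taken from the API reference, and the UUIDs are placeholders:

```python
import json

# Placeholder UUIDs -- substitute the source definition ID and workspace ID
# from your own deployment.
request_body = {
    "sourceDefinitionId": "00000000-0000-0000-0000-000000000000",
    "workspaceId": "00000000-0000-0000-0000-000000000000",
}

print(json.dumps(request_body, indent=2))
```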
  • s

    Slackbot

    06/21/2023, 3:37 AM
    This message was deleted.
  • m

    Mahesh Thirunavukarasu

    06/21/2023, 8:13 AM
    Hello all, can we configure a stream to run only with incremental sync using the low-code CDK?
  • m

    Mạnh Hùng Phan

    06/21/2023, 9:27 AM
    Hi everyone, when will Airbyte support CDC in Oracle? Is there a roadmap available for this feature? Thank you.
  • a

    Andy Smith

    06/21/2023, 10:10 AM
    Hi, does the connector builder support an API that returns data in the following format (i.e. rows expressed as arrays and a column header section)?
    Copy code
    [
      {
        "reportData": {
          "columns": [
            "date",
            "my col 1",
            "my col 2"
          ],
          "rows": [
            [
              "2023-06-20T00:00:00",
              "value 1",
              "value 2"
            ]
          ]
        }
      }
    ]
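A record selector can point at `reportData.rows`, but pairing each row with the shared `columns` header takes a transformation. As a plain-Python sketch of that reshaping (outside the Builder; `to_records` is an illustrative helper, not a Builder feature):

```python
# The response shape from the question above.
response = [
    {
        "reportData": {
            "columns": ["date", "my col 1", "my col 2"],
            "rows": [["2023-06-20T00:00:00", "value 1", "value 2"]],
        }
    }
]

def to_records(block: dict) -> list[dict]:
    """Zip the shared column header onto every row, one dict per row."""
    cols = block["reportData"]["columns"]
    return [dict(zip(cols, row)) for row in block["reportData"]["rows"]]

records = [rec for block in response for rec in to_records(block)]
print(records)
# [{'date': '2023-06-20T00:00:00', 'my col 1': 'value 1', 'my col 2': 'value 2'}]
```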
  • c

    Chính Bùi Quang

    06/21/2023, 10:35 AM
    It's me again everyone :D I'm trying to get data from Facebook Marketing today, but I have more than 30 accounts 😞 With such a large number of accounts, is there a way to set up fetching all Facebook Marketing data in just one connection, like the Google Ads setup?
  • a

    Abdul Hameed

    06/21/2023, 6:55 PM
    @here Hi team, I was able to debug the source connector successfully and build the Docker image for the connector, but when running that Docker image I am facing this error
    Copy code
    "internal_message": "No command passed"
  • o

    Octavia Squidington III

    06/21/2023, 7:45 PM
    🔥 Office Hours starts in 15 minutes 🔥 Topic and schedule posted in #C045VK5AF54. At 1PM PDT click here to join us on Zoom!
  • c

    Chính Bùi Quang

    06/22/2023, 1:59 AM
    Hi @Joe Reuter, I still can't manage to fix the bug https://github.com/airbytehq/airbyte/issues/27494. But I found that it may be because the output format is not the same as the one I configured, so there is no streamState. Can you guide me on how to format the configuration in the image below? For example I have "2023-06-22T014821.186Z" and I want to configure it to "2023-06-22T014821.000Z"
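Zeroing out the millisecond part can be expressed with standard datetime format codes. A sketch assuming the timestamps use normal ISO-8601 separators (e.g. "2023-06-22T01:48:21.186Z"); adjust the format string if your actual values differ:

```python
from datetime import datetime

def truncate_millis(value: str) -> str:
    """Parse an ISO-8601 timestamp with milliseconds and re-emit it with .000."""
    dt = datetime.strptime(value, "%Y-%m-%dT%H:%M:%S.%fZ")
    return dt.strftime("%Y-%m-%dT%H:%M:%S.000Z")

print(truncate_millis("2023-06-22T01:48:21.186Z"))
# 2023-06-22T01:48:21.000Z
```

The same `%Y-%m-%dT%H:%M:%S.%fZ`-style codes are what low-code datetime format options generally accept, so the format string is the piece to carry over into the configuration.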
  • a

    Abdul Hameed

    06/22/2023, 7:46 AM
    I am getting this error when trying to add a connector using a Docker image: "Fetching connector failed. Try again"
  • j

    Janis Karimovs

    06/22/2023, 8:45 AM
    Hey everyone, I'm building a Podio source connector and using BigQuery as the destination. I'm currently retrieving data from 20 Podio apps, and all but 1 of the apps get successfully uploaded to BigQuery. The one app that fails gives me these errors:
    Copy code
    "Pickling client objects is explicitly not supported. Clients have non-trivial state that is local and unpickleable."
    Copy code
    "Failure Origin: normalization, Message: Normalization failed during the dbt run. This may indicate a problem with the data itself."
    I can't figure out why that 1 app fails, as I don't see any difference between this app and the rest of them. The data gets retrieved successfully from this app, and as far as I can tell something is happening on the BigQuery connector end. Any lead would be appreciated, thank you 🙏
  • a

    Abdul Hameed

    06/22/2023, 10:25 AM
    Unable to run the low-code connector image in Docker.
  • a

    Abdul Hameed

    06/22/2023, 10:40 AM
    Unable to run the low-code connector image in Docker; getting the error "message": "No command passed"
  • o

    Octavia Squidington III

    06/22/2023, 1:45 PM
    🔥 Office Hours starts in 15 minutes 🔥 Topic and schedule posted in #C045VK5AF54. At 16:00 CEST / 10am EDT click here to join us on Zoom!
  • a

    Alexander Schmidt

    06/22/2023, 2:34 PM
    Are there any plans to add incremental sync mode for "GET_V2_SETTLEMENT_REPORT_DATA_FLAT_FILE" in the Amazon Seller Partner Connector? (It's limited to 90 days, so a possibility to keep the data would be awesome)
  • c

    Conor O'Mara

    06/22/2023, 2:54 PM
    Is it possible to have multiple start dates and windows in a config.json for different streams? One thing we are seeing with GA4 is that data can take up to 48 hours to be processed. Could I configure two data pulls in Airbyte: 1. Incremental from start_date until today minus 5 days 2. Full overwrite for the last 5 days