# ask-ai

    Andres

    10/29/2024, 3:33 PM
    I'm implementing incremental sync for one of my streams so that not all of the data needs to be fetched, just the records that have been updated. But it seems Airbyte doesn't support the kind of timestamp value I have to rely on. This is what my API docs say: "The timestamp value returned has no relation to actual date or time. As such it cannot be converted to a date/time value. The timestamp is a rowversion value." Is it possible to use this value?
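
    For context, the low-code CDK's built-in incremental cursor (DatetimeBasedCursor) expects a real date/time value, so an opaque rowversion would likely need a custom cursor component. A minimal sketch, assuming a hypothetical RowVersionCursor class in the connector's Python components module; the names are illustrative, not a confirmed solution.

    # manifest.yaml (sketch): incremental sync backed by a custom cursor.
    # RowVersionCursor is hypothetical; it would track the highest rowversion
    # seen so far and inject it into the next request.
    incremental_sync:
      type: CustomIncrementalSync
      class_name: source_myapi.components.RowVersionCursor
      cursor_field: timestamp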

    ABHISHEK TRIPATHI

    10/29/2024, 4:51 PM
    Seeing the following error:
    io.airbyte.cdk.integrations.source.relationaldb.state.FailedRecordIteratorException: java.lang.RuntimeException: java.lang.RuntimeException: com.microsoft.sqlserver.jdbc.SQLServerException: An error occurred during the current command (Done status 0). Could not allocate space for object 'dbo.SORT temporary run storage: 140850374377472' in database 'tempdb' because the 'PRIMARY' filegroup is full due to lack of storage space or database files reaching the maximum allowed size. Note that UNLIMITED files are still limited to 16TB. Create the necessary space by dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup.

    Nicholas Machado

    10/29/2024, 4:59 PM
    @kapa.ai I'm creating a connector using YAML to get data from an endpoint. However, I am having problems obtaining all the fields that are returned by the original API response. Example below.
    RECORDS
    {
      "id": "28089818",
      "lead_id": "22364914",
      "owner_id": "37852",
      "owner_name": "Joãozinho",
      "cadence": "Wallet",
      "cadence_id": "45254",
      "created_date": "2024-10-25T180341.955Z",
      "start_date": "2025-01-13T030000.000Z",
      "end_date": "2024-10-25T181143.714Z",
      "status": "SWITCHED_CADENCE",
      "lead_origin_channel": "UNKNOWN",
      "conversion": "Industry"
    }
    RESPONSE
    {
      "id": "28089818",
      "lead_id": "22364914",
      "owner_id": "37852",
      "owner_name": "Joãozinho",
      "cadence": "Wallet",
      "cadence_id": "45254",
      "lead_base": null,
      "lead_base_id": null,
      "rd_conversion_date": null,
      "created_date": "2024-10-25T180341.955Z",
      "excluded_date": null,
      "start_date": "2025-01-13T030000.000Z",
      "scheduled_date": null,
      "end_date": "2024-10-25T181143.714Z",
      "first_activity_date": null,
      "last_activity_date": null,
      "status": "SWITCHED_CADENCE",
      "lost_reason": null,
      "lost_reason_id": null,
      "lead_origin_channel": "UNKNOWN",
      "lead_origin_source": null,
      "lead_origin_campaign": null,
      "conversion": "Industry"
    }
    Notice that in RESPONSE there are more fields than in RECORDS, and I need all response fields to be returned.
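
    For context, in a declarative connector there are two common places where fields get narrowed: the record selector's extraction path and the declared stream schema. A minimal sketch of both, with illustrative values; the actual cause depends on the connector's manifest.

    # manifest.yaml (sketch): keep the whole response object per record,
    # and declare a schema that does not drop undeclared fields.
    record_selector:
      type: RecordSelector
      extractor:
        type: DpathExtractor
        field_path: []            # empty path selects the full response object
    schema_loader:
      type: InlineSchemaLoader
      schema:
        $schema: http://json-schema.org/draft-07/schema#
        type: object
        additionalProperties: true   # tolerate fields not listed in the schema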

    Matviy Unzhakov

    10/29/2024, 5:16 PM
    @kapa.ai We have deployed Airbyte to an EKS cluster and are trying to use DynamoDB as a source. Airbyte uses an IRSA role instead of a user, so no AWS keys are present in the pod. It works well with Postgres, Snowflake, etc. However, with DynamoDB I get the error
    Unable to load credentials from any of the providers in the chain
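
    For reference, IRSA is normally wired up by annotating the Kubernetes service account the pods run as; whether the DynamoDB connector honors the default AWS credentials provider chain is a separate question. A values.yaml sketch, assuming the chart exposes a serviceAccount block; the role ARN is a placeholder.

    # values.yaml (sketch): attach an IAM role to the Airbyte service account
    serviceAccount:
      create: true
      annotations:
        eks.amazonaws.com/role-arn: arn:aws:iam::123456789012:role/airbyte-irsa  # placeholder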

    Sharon Lavie

    10/29/2024, 6:51 PM
    @kapa.ai I'm syncing data with a Postgres connector. Airbyte runs a
    select count(*)
    query before it starts fetching the data. Is it possible to disable this?

    Dana Williams

    10/29/2024, 7:06 PM
    how do I find my billing information in Airbyte?

    Brian Bolt

    10/29/2024, 9:15 PM
    Is it possible to have a master (parent) stream and a detail (substream), where the substream just looks up the details from the parent stream?
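
    In the low-code CDK this is what a SubstreamPartitionRouter expresses: the detail stream reads one partition per parent record. A minimal sketch with hypothetical stream and field names.

    # manifest.yaml (sketch): a detail stream keyed off a parent stream
    retriever:
      type: SimpleRetriever
      requester:
        type: HttpRequester
        path: "/orders/{{ stream_partition.order_id }}/items"   # hypothetical endpoint
      partition_router:
        type: SubstreamPartitionRouter
        parent_stream_configs:
          - type: ParentStreamConfig
            stream: "#/definitions/orders_stream"   # hypothetical parent stream
            parent_key: id                          # field read from each parent record
            partition_field: order_id               # name exposed to the child stream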

    Olivier Rousseau

    10/29/2024, 10:21 PM
    @kapa.ai - What is the method now to create custom connectors with a custom Python class that is not available in the connector builder, since we don't have access to the Airbyte code with an abctl install?

    Olivier Rousseau

    10/29/2024, 10:47 PM
    @kapa.ai - With Airbyte 1.0 and an abctl installation, is there a change in the process to use the Low-Code CDK? Can we still clone the Airbyte repository directly, run ./generate.sh, poetry run, poetry read, and airbyte-ci connectors build to create a connector with Airbyte 1.0, and push the image to Docker Hub?

    Olivier Rousseau

    10/29/2024, 10:53 PM
    @kapa.ai - Are there compatibility risks of custom connectors developed on a previous Airbyte version (0.8, for instance) not working with Airbyte 1.0?

    Kamal Tuteja

    10/30/2024, 12:51 AM
    @kapa.ai The QuickBooks connector is not working.

    Phạm Mạnh Hùng

    10/30/2024, 2:51 AM
    help me

    henryd

    10/30/2024, 3:09 AM
    @kapa.ai I have set up Airbyte on EC2 with Secrets Manager, but Airbyte is not creating any secrets when creating a connection.
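
    For reference, secret persistence is configured deployment-wide. A values.yaml sketch, assuming the chart's AWS Secrets Manager options; key names vary between chart versions, so treat this as illustrative.

    # values.yaml (sketch): route Airbyte's secret writes to AWS Secrets Manager
    global:
      secretsManager:
        type: awsSecretManager            # exact value may differ by chart version
        awsSecretManager:
          region: us-east-1               # placeholder
          authenticationType: credentials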

    henryd

    10/30/2024, 4:04 AM
    @kapa.ai Using Secrets Manager for Airbyte, the way it stores and retrieves secrets clutters Secrets Manager; there are too many. Nine secrets are created for just one connection.

    ryusei arai

    10/30/2024, 4:48 AM
    @kapa.ai I want to use the Helm chart to downgrade Airbyte to a lower version. Which pod image tag should I modify?
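
    Worth noting: a downgrade is usually done by pinning the chart version (for example, helm upgrade --install airbyte airbyte/airbyte --version <chart-version>) rather than editing individual pod image tags, since each chart release pins a matching set of images. If a single component's tag really must be overridden, a values.yaml sketch follows; component keys and tags are placeholders and vary by chart version.

    # values.yaml (sketch): per-component image tag override (use with care)
    server:
      image:
        tag: "x.y.z"   # placeholder
    worker:
      image:
        tag: "x.y.z"   # placeholder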

    Roberto Tolosa

    10/30/2024, 5:29 AM
    I'm trying to use the BigQuery source connector to load data into Snowflake, but I keep getting the error
    Warning from replication: Airbyte could not start the sync process. This may be due to insufficient system resources. Please check available resources and try again.
    I'm initially only trying to load a table with 200 rows, and my EC2 instance running Airbyte has 32 GB of RAM. This is on a fresh Airbyte install, so it doesn't feel like resources are the real issue. What could it be, and how can we fix this?

    henryd

    10/30/2024, 6:20 AM
    @kapa.ai If using S3 as the destination, where would the path prefix be set up? The documentation says the destination path is defined from the path prefix.
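
    For reference, in the S3 destination the prefix is part of the connector's own settings rather than the deployment config. A sketch of the relevant fields as they appear in the destination setup; all values are placeholders.

    # S3 destination settings (sketch): the bucket path acts as the prefix
    s3_bucket_name: my-company-data      # placeholder
    s3_bucket_path: airbyte/raw          # placeholder: this is the path prefix
    s3_bucket_region: us-east-1          # placeholder
    s3_path_format: "${NAMESPACE}/${STREAM_NAME}/${YEAR}_${MONTH}_${DAY}"  # optional layout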

    henryd

    10/30/2024, 6:31 AM
    @kapa.ai
    {
      "code": 401,
      "message": "Jwt issuer is not configured"
    }

    henryd

    10/30/2024, 6:43 AM
    @kapa.ai sourceType is empty when I get sources from the API. Is it because the sourceType is not defined here?
    connectionSpecification:
      $schema: http://json-schema.org/draft-07/schema#
      title: Exact Spec
      type: object
      required:
        - base_url
        - divisions
        - credentials
      properties:
        base_url:
          type: string
          title: Base URL
          description: Base URL of the Exact API. Defaults to Dutch version.
          default: "https://start.exactonline.nl"
        divisions:
          type: array
          minItems: 1
          title: Divisions
          description: List of division codes to extract. The division code refers to the administration in Exact.
        credentials:
          title: OAuth Credentials
          type: object
          required:
            - client_id
            - client_secret
            - access_token
            - refresh_token
            - token_expiry_date
          properties:
            access_token:
              type: string
              title: OAuth Access Token
              description: Access token from Exact.
              airbyte_secret: true
            refresh_token:
              type: string
              title: OAuth Refresh Token
              description: Refresh token from Exact.
              airbyte_secret: true
            token_expiry_date:
              type: string
              title: OAuth Token Expiry Date
              description: Timestamp when the token expires.
            client_id:
              type: string
              title: OAuth Client ID
              description: OAuth client_id, can be found at apps.exactonline.com.
            client_secret:
              type: string
              title: OAuth Client Secret
              description: OAuth client_secret, can be found at apps.exactonline.com.
              airbyte_secret: true

    Sarang Kanfade

    10/30/2024, 6:43 AM
    @kapa.ai I am using Airbyte to live-stream data from BigQuery to MSSQL, and the BigQuery table is a linked table from Google Sheets, but I am getting the error: 'BigQuery: Permission denied while getting Drive credentials.'

    Jonathan Golden

    10/30/2024, 6:46 AM
    Hey, I'm getting "Container airbyte-workload-api-server-container is waiting" for an OSS deployment in Google Cloud.

    Jonathan Golden

    10/30/2024, 7:51 AM
    How do I configure the mount path for the pods in the YAML configuration for a Helm chart deployment?

    Jonathan Golden

    10/30/2024, 7:53 AM
    How do I configure a Helm chart deployment to use GCS for its pods?
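
    A sketch of the storage block in values.yaml for GCS-backed logs, state, and workload output, assuming the 1.x chart layout; bucket names and credentials are placeholders, and key names may differ between chart versions.

    # values.yaml (sketch): use GCS for logs, state, and workload output
    global:
      storage:
        type: "GCS"
        bucket:
          log: my-airbyte-bucket            # placeholder
          state: my-airbyte-bucket          # placeholder
          workloadOutput: my-airbyte-bucket # placeholder
        gcs:
          projectId: my-gcp-project         # placeholder
          credentialsJson: ""               # service-account key, per the chart docs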

    Olivier Rousseau

    10/30/2024, 9:03 AM
    @kapa.ai - When I use the Low-Code CDK locally, how can I control which Airbyte version I am using to create a new connector?

    Kaustav Ghosh

    10/30/2024, 9:07 AM
    @kapa.ai Why is the readme.md for my custom connector not showing on the right while setting up the connection?

    Kaustav Ghosh

    10/30/2024, 9:08 AM
    @kapa.ai How do I build a custom connector?

    Alasdair Ellis

    10/30/2024, 9:26 AM
    @kapa.ai Can you give more info on this error? message='activity Heartbeat timeout', timeoutType=TIMEOUT_TYPE_HEARTBEAT

    Rens O

    10/30/2024, 10:15 AM
    @kapa.ai I am trying to make a custom connector which uses a form of cookie authentication. The login endpoint sends the cookie in the response headers, and I need to extract the information from these headers into my other stream. Is that possible?
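
    This is close to what the declarative SessionTokenAuthenticator covers: a login requester fetches a token once, and the token is attached to every stream request. A minimal sketch with placeholder URLs, fields, and paths; note that session_token_path reads from the login response body, so extracting a value from response headers may still require a custom component.

    # manifest.yaml (sketch): log in once, reuse the session token per request
    authenticator:
      type: SessionTokenAuthenticator
      login_requester:
        type: HttpRequester
        url_base: https://example.com       # placeholder
        path: /login
        http_method: POST
        request_body_json:
          username: "{{ config['username'] }}"
          password: "{{ config['password'] }}"
      session_token_path: ["token"]         # dpath into the login response body
      expiration_duration: PT1H             # re-login cadence (ISO 8601 duration)
      request_authentication:
        type: ApiKey
        inject_into:
          type: RequestOption
          inject_into: header
          field_name: Cookie                # placeholder header carrying the token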