Powered by https://linen.dev
# ask-ai

    Leo Salayog

    11/05/2024, 12:06 PM
    Can you fix this `{{ record['customField']['*']['id'] }}`? I want to get the `id` values from this JSON: `{"customField": [{"id": 123, "value": 124}, {"id": 323, "value": 34}]}`

    Fisakele Vuma

    11/05/2024, 12:14 PM
    @kapa.ai getting 500 error for temporal ui in helm chart version 1.1.1

    Nick Baumann

    11/05/2024, 12:18 PM
    In my no-code connector setup, channel values are passed from a parent stream to my stream to retrieve data. The stream id, however, is not written to my data, as it's not part of the response. I need the channel id for each record, which is why I tried adding it via transformations, but I can't seem to access the value correctly. How do I add a column that includes the called channel IDs from my parent stream to each record?

    Aazam Thakur

    11/05/2024, 12:36 PM
    @kapa.ai How do I use pyairbyte to load source-faker data into my cloud hosted postgresql?
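A rough sketch of how this could look with PyAirbyte, assuming its `PostgresCache` is used to land data in a cloud-hosted Postgres. All connection details below are placeholders, and the PyAirbyte imports are deferred into the function so nothing runs until it is called:

```python
# Placeholder connection details for the cloud-hosted Postgres instance.
pg_config = {
    "host": "your-instance.example.com",  # hypothetical hostname
    "port": 5432,
    "username": "airbyte",
    "password": "change-me",
    "database": "analytics",
}

def load_faker_to_postgres():
    # Deferred imports: this sketch only needs PyAirbyte when actually run.
    import airbyte as ab
    from airbyte.caches import PostgresCache

    source = ab.get_source(
        "source-faker",
        config={"count": 1_000},  # number of fake records to generate
        install_if_missing=True,
    )
    source.check()
    source.select_all_streams()

    # Reading into a PostgresCache writes the records to Postgres tables.
    cache = PostgresCache(**pg_config)
    source.read(cache=cache)
```

Calling `load_faker_to_postgres()` runs the sync; verify the `PostgresCache` parameters against the PyAirbyte version you have installed.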

    Tahir Ishaq

    11/05/2024, 1:03 PM
    @kapa.ai How can I configure Postgres as a source using PyAirbyte?
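A hedged sketch of reading from Postgres with PyAirbyte. The config keys follow the shape of the Postgres source connector spec, but every value (and the stream name) is a placeholder; the import is deferred so the sketch only needs PyAirbyte when run:

```python
# Placeholder config following the shape of the Postgres source spec.
postgres_source_config = {
    "host": "your-db.example.com",  # hypothetical hostname
    "port": 5432,
    "database": "mydb",
    "username": "reader",
    "password": "change-me",
    "schemas": ["public"],
}

def read_postgres_tables():
    import airbyte as ab  # deferred so the sketch imports only when run

    source = ab.get_source(
        "source-postgres",
        config=postgres_source_config,
        install_if_missing=True,
    )
    source.check()                        # validate connectivity first
    source.select_streams(["customers"])  # hypothetical table name
    result = source.read()                # reads into the default local cache
    for record in result["customers"]:
        print(record)
```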

    Nathaniel Koranteng

    11/05/2024, 1:10 PM
    I'm currently trying out Airbyte to see how it will suit our needs. I have the community version deployed on Kubernetes using the Helm chart. I can only get it to sync raw data, though I understand there is supposed to be a way to normalize the data using Airbyte. The problem is I can't see that option in the deployment. Is there something I am missing? Your assistance will be much appreciated. Source: Postgres. Destination: ClickHouse.

    Diogo Malheiro

    11/05/2024, 1:29 PM
    Filtering Options for Monday.com Source Connector: Extracting Items from Specific Boards
    Hello Airbyte community, I'm currently working on integrating Monday.com as a source in Airbyte, and I have a specific requirement regarding data extraction. I'm wondering about the available filtering options, particularly for the Items stream. My questions are:
    1. Is it possible to filter the data extraction from Monday.com to only pull Items from specific boards?
    2. If yes, how can this be configured in the Monday.com source connector?
    3. Are there any other filtering options available for the Items stream, such as by column values or date ranges?
    Any insights, documentation references, or examples would be greatly appreciated. Thank you in advance for your help!

    Joey Benamy

    11/05/2024, 2:10 PM
    @kapa.ai With the postgres source connector, when a schema has a column data type change from int to bigint, schema change is not detected in airbyte

    Andrea Brenna

    11/05/2024, 2:24 PM
    @kapa.ai Hi, I'm using Airbyte to replicate data from MySQL to BigQuery. I'm trying to destroy the Airbyte container in Docker and recreate it (including the same source and destination as in the previous container) in order to see whether the ingestion continues from the last point it had reached. The answer is NO, but I want to understand whether a way to do that exists. MySQL is configured with the CDC Update Method, and BigQuery uses Batched Standard Inserts as the Loading Method.

    Alasdair Ellis

    11/05/2024, 2:28 PM
    @kapa.ai can you send me an example helm values.yaml file
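A minimal illustrative `values.yaml` for the `airbyte/airbyte` Helm chart. Key names follow the chart's general layout, but they vary between chart versions, so treat this as a starting point rather than a known-good file:

```yaml
global:
  edition: community

# Bundled Postgres for Airbyte's own config database;
# set enabled: false and fill in externalDatabase.* to bring your own.
postgresql:
  enabled: true

worker:
  resources:
    requests:
      cpu: 500m
      memory: 1Gi
    limits:
      cpu: "1"
      memory: 2Gi

webapp:
  service:
    type: ClusterIP
```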

    Slackbot

    11/05/2024, 2:33 PM
    This message was deleted.

    Fabrizio Spini

    11/05/2024, 2:37 PM
    How can I manually set the connection state after recreating the airbyte abctl container to approximate the last sync point?
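One possibility, sketched against Airbyte's internal Config API: read the state of a healthy connection with the state-get endpoint, then write an adjusted copy back. The endpoint path, port, and exact payload shape vary by Airbyte version, so verify them against your deployment first; the connection id and state below are placeholders:

```shell
# Placeholder connection id and state; the payload shape must match what
# the state-get endpoint returns for your connection on your version.
curl -X POST http://localhost:8001/api/v1/state/create_or_update \
  -H "Content-Type: application/json" \
  -d '{
        "connectionId": "00000000-0000-0000-0000-000000000000",
        "connectionState": {
          "stateType": "STREAM",
          "streamState": []
        }
      }'
```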

    Justin Hocking

    11/05/2024, 2:41 PM
    How do I migrate the Postgres database from the airbyte-db pod to an external database?
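A rough sketch of one way to copy the data with `pg_dump`/`psql`. The pod name, namespace, credentials, database name, and external hostname are all placeholders, and the Helm values mentioned in the comments should be checked against your chart version:

```shell
# Dump the internal database out of the airbyte-db pod (names are placeholders).
kubectl exec -n airbyte airbyte-db-0 -- \
  pg_dump -U airbyte -d db-airbyte > airbyte-db.sql

# Restore into the external Postgres instance.
psql -h external-pg.example.com -U airbyte -d db-airbyte < airbyte-db.sql

# Then point the chart at the external database (externalDatabase.* values),
# disable the bundled postgresql, and upgrade the release.
```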

    Lubomyr Kachko

    11/05/2024, 3:14 PM
    @kapa.ai io.airbyte.cdk.integrations.source.relationaldb.state.FailedRecordIteratorException: java.lang.RuntimeException: java.lang.RuntimeException: org.apache.kafka.connect.errors.ConnectException: An exception occurred in the change event producer. This connector will be stopped. How do I fix this issue?

    Vasil Boshnakov

    11/05/2024, 3:34 PM
    @kapa.ai I have a Redshift source connector and a Redshift destination connector. When I create a connection, all my DATE and TIMESTAMP columns are converted to VARCHAR. How can I make Airbyte preserve these types?

    Diako

    11/05/2024, 4:27 PM
    @kapa.ai How do I adjust the kubernetes pod request size for airbyte jobs? Mine are running extremely slow right now.
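One common lever here is Airbyte's `JOB_MAIN_CONTAINER_*` settings, which set default resource requests and limits for job pods. A hedged sketch as Helm values — the env-var names are Airbyte's documented ones, but where they belong in `values.yaml` depends on your chart version, and the sizes below are examples only:

```yaml
worker:
  extraEnv:
    - name: JOB_MAIN_CONTAINER_CPU_REQUEST
      value: "1"
    - name: JOB_MAIN_CONTAINER_CPU_LIMIT
      value: "2"
    - name: JOB_MAIN_CONTAINER_MEMORY_REQUEST
      value: "2Gi"
    - name: JOB_MAIN_CONTAINER_MEMORY_LIMIT
      value: "4Gi"
```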

    Tobias Willi

    11/05/2024, 5:14 PM
    @kapa.ai Hey, I have a SharePoint connector with the following error:
    Traceback (most recent call last):
      File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/file_based/stream/default_file_based_stream.py", line 271, in _infer_schema
        base_schema = merge_schemas(base_schema, task.result())
      File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/file_based/stream/default_file_based_stream.py", line 281, in _infer_file_schema
        raise SchemaInferenceError(
    airbyte_cdk.sources.file_based.exceptions.SchemaInferenceError: Error inferring schema from files. Are the files valid? Contact Support if you need assistance.
    file=https://istainternational.sharepoint.com/sites/ista_pmext_deothistadatateam/Shared%20Documents/General/Raw Vault/Data Delivery Tribe/SAP Device Portfolio/Datamodell EQKT.xlsx format=filetype='csv' delimiter=',' quote_char='"' escape_char=None encoding='utf8' double_quote=True null_values=set() strings_can_be_null=True skip_rows_before_header=0 skip_rows_after_header=0 header_definition=CsvHeaderFromCsv(header_definition_type='From CSV') true_values={'t', 'yes', 'y', 'true', '1', 'on'} false_values={'false', 'f', 'no', 'off', '0', 'n'} inference_type=<InferenceType.NONE: 'None'> ignore_errors_on_fields_mismatch=False stream=test
    What glob should I use if I only want to look for a single CSV file within the General folder?
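For a file-based source, a glob can simply name one file exactly. A sketch of the stream settings (the filename is a placeholder; also note the traceback above shows an `.xlsx` file being parsed with `filetype='csv'`, so a glob ending in `.csv` would not match that file):

```yaml
streams:
  - name: test
    globs:
      - "General/my-single-file.csv"   # exact relative path, no wildcard needed
    format:
      filetype: csv
```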

    Diako

    11/05/2024, 5:27 PM
    @kapa.ai My job is loading 19,000 rows in 30 minutes. How do I debug to see the reason?

    Ethan Brown

    11/05/2024, 6:10 PM
    can I use the airbyte postgres destination with cockroachdb?

    Fabrizio Bulleri

    11/05/2024, 9:09 PM
    @kapa.ai set requests and limits for each connector

    Charles Bockelmann

    11/05/2024, 10:59 PM
    After updating Dagger and building, I am getting this error: pipelines.cli.dagger_run: The Dagger CLI version '0.13.7' does not match the expected version '0.13.3' (dagger_run.py:92). Installing Dagger CLI '0.13.3'...

    Charles Bockelmann

    11/06/2024, 12:56 AM
    Is it possible to set the favicon for a Python custom destination connector?

    Chính Bùi Quang

    11/06/2024, 1:49 AM
    I want to limit next_page_token to only run 5 times and then stop in the Builder on Airbyte. What should I do? @kapa.ai
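A sketch of a low-code paginator that stops early, assuming a cursor-style strategy whose `stop_condition` is evaluated against each response. The `response.page` field here is hypothetical and depends on what the API actually returns, so adapt the condition to a field your API exposes:

```yaml
paginator:
  type: DefaultPaginator
  pagination_strategy:
    type: CursorPagination
    cursor_value: "{{ response.next_page_token }}"
    stop_condition: "{{ response.page >= 5 }}"
```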

    Murilo Fugazzotto

    11/06/2024, 2:56 AM
    Failure in source: Could not infer schema as there are no rows in file.csv. If having an empty CSV file is expected, ignore this. Else, please contact Airbyte. It won't move forward to the following file; how can I ignore this file? @kapa.ai

    Elena Mascarenas Garcia

    11/06/2024, 6:53 AM
    How long does Airbyte hold the data it transfers?

    Elena Mascarenas Garcia

    11/06/2024, 6:55 AM
    Do you have an external ISO 27001 certification?

    Yusuke Yoshimura

    11/06/2024, 7:46 AM
    Can I use the connector builder to build a SOAP API connector?

    Naresh Kumar

    11/06/2024, 8:03 AM
    I have an incremental stream for sponsored_products_report_stream in the amazon_ads-to-postgres connection. A table named amazon_ads_raw__stream_sponsored_products_report_stream was created and is getting larger. Can it be trimmed without affecting subsequent syncs?

    Syed Hamza Raza Kazmi

    11/06/2024, 8:23 AM
    @kapa.ai, I am using Airbyte 0.399 and getting this error: io.airbyte.workers.exception.WorkerException: Failed to create pod for read step

    Henrik Rasmussen

    11/06/2024, 8:56 AM
    I'm trying to set up the Planhat source in Airbyte with Terraform:
    resource "airbyte_source_planhat" "src_planhat" {
      name          = "Planhat"
      workspace_id  = "1234"
    
      configuration = {
        api_token = "1234"
      }
    }
    But I keep getting an error like "The submitted value could not be found." I have distilled it down to the simplest example, and it works with other sources. Any tips on what it could be?