# ask-ai
  • ed

    09/02/2024, 11:51 AM
    is it possible to use airbyte metadata in an added transformation to the connector builder, for example to extract
    _airbyte_meta.sync_id
    and add it as a separate column?
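A hedged sketch of the idea behind this question (this is plain Python, not a built-in Connector Builder transformation; the helper name is invented): copy `_airbyte_meta.sync_id` from each raw record into its own top-level column in a post-processing step.

```python
# Hypothetical post-processing step (names follow the question; this
# is not a built-in Builder transformation): promote
# _airbyte_meta.sync_id from each raw record into its own column.
def add_sync_id_column(record: dict) -> dict:
    """Return a copy of the record with a top-level 'sync_id' field."""
    meta = record.get("_airbyte_meta") or {}
    return {**record, "sync_id": meta.get("sync_id")}

row = {"id": 1, "_airbyte_meta": {"sync_id": 42, "changes": []}}
print(add_sync_id_column(row)["sync_id"])  # 42
```

Records without `_airbyte_meta` simply get a `sync_id` of `None`, so the shape of the output stays uniform.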
  • Nimrod Rosen

    09/02/2024, 12:43 PM
    What are the default credentials for airbyte db?
  • Ananth Kumar

    09/02/2024, 12:55 PM
    #C01AHCD885S Issue with orchestrator pod not starting during sync in Airbyte
  • Tom Montgomery

    09/02/2024, 1:09 PM
    Hi @kapa.ai, I am attempting to install airbyte using
    abctl
    and would like to specify the version number as well as web app domain. I have attempted to do this by using
    abctl install --values values.yaml
    using the following
    values.yaml
    file:
  • Alexandre RG

    09/02/2024, 1:21 PM
    Hi, I'm trying to set the password and email with abctl, but the changes don't apply to my cluster and I can't log in. Thanks in advance
  • Junaid Razzaq

    09/02/2024, 1:36 PM
    Does Airbyte support in-flight compression?
  • Alexandre RG

    09/02/2024, 1:39 PM
    Encountered an issue deploying Airbyte: Pod: airbyte-abctl-workload-launcher-658cd6fbbc-f4qkp.17f170ddea955ffc Reason: Unhealthy Message: Readiness probe failed: HTTP probe fa
  • Julian Andersen

    09/02/2024, 1:43 PM
    I am receiving the following error when trying to set up Google Analytics 4 source: PermissionError: [Errno 13] Permission denied: '/config/connectionConfiguration.json'
  • Samuel Punzón Agudo

    09/02/2024, 1:44 PM
    I want to access Airbyte installed on a Google Cloud Platform instance (installed via abctl), but I get a 404 error when I try to access the instance
  • Ananth Kumar

    09/02/2024, 1:45 PM
    #C01AHCD885S Airbyte sync is successful in 5 seconds but the orchestrator was not starting at all.
  • ed

    09/02/2024, 2:04 PM
    After an incremental sync | append succeeds, a resync appends duplicates for that same day (but not for history). Normally, with the cursor up to date, it should ignore those records and not reload the same data
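One plausible mechanism behind this symptom can be sketched in plain Python (this is an illustration, not Airbyte source code): if the cursor comparison is inclusive, records whose cursor value equals the saved state are re-emitted on a resync, and in append mode they land as duplicates.

```python
# Illustration (not Airbyte source code): an inclusive cursor
# comparison re-emits records equal to the saved cursor, which in
# append mode shows up as same-day duplicates on a resync.
def incremental_read(records, cursor, inclusive=True):
    op = (lambda v: v >= cursor) if inclusive else (lambda v: v > cursor)
    return [r for r in records if op(r["updated_at"])]

rows = [{"id": 1, "updated_at": "2024-09-01"},
        {"id": 2, "updated_at": "2024-09-02"}]

# Cursor already at the latest value seen:
print(incremental_read(rows, "2024-09-02"))         # re-emits id 2
print(incremental_read(rows, "2024-09-02", False))  # []
```

An inclusive comparison is often deliberate (it avoids losing late records sharing the same timestamp), at the cost of occasional re-emission.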
  • Daniel Holleran

    09/02/2024, 2:10 PM
    @kapa.ai what does the following log from the temporal pod of my airbyte kubernetes deployment mean:
    Copy code
    {
      "level": "warn",
      "ts": "2024-09-02T14:08:53.242Z",
      "msg": "Unspecified task queue kind",
      "service": "frontend",
      "wf-task-queue-name": "CONNECTION_UPDATER",
      "wf-namespace": "default",
      "logging-call-at": "workflow_handler.go:3772"
    }
  • Thomas Vannier

    09/02/2024, 2:53 PM
    In the setup of the Databricks connector 3.2.2, what is the correct format of the Unity Catalog path? Do you have an example?
  • Karl Jose Buena

    09/02/2024, 3:11 PM
    @kapa.ai I've created a destination database using OSS' /destinations/create endpoint, passing a dynamic schema value. I can see the destination database is created, but when I check it in pgAdmin, the schema is not there, or not visible.
  • Mohamed Akram Lahdir

    09/02/2024, 4:18 PM
    hey @kapa.ai I just got multiple errors like this one: "io.airbyte.workers.exception.WorkerException: Could not find image: airbyte/source-hubspot:4.2.18 18", but I'm sure that the images are there when I run docker images
  • Karl Jose Buena

    09/02/2024, 4:25 PM
    @kapa.ai when setting the schema for the destination db, what's the format? Is this valid? 5_tqm9ds
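One relevant detail here: unquoted PostgreSQL identifiers must start with a letter or an underscore, so a schema name like `5_tqm9ds` would need double quotes wherever it is referenced. A quick illustrative check of the default unquoted-identifier rule:

```python
import re

# Unquoted PostgreSQL identifiers must begin with a letter or an
# underscore; later characters may be letters, digits, '_' or '$'.
# (Illustrative check of the default rule; quoted identifiers are
# far more permissive.)
def is_valid_unquoted(name: str) -> bool:
    return re.fullmatch(r"[A-Za-z_][A-Za-z0-9_$]*", name) is not None

print(is_valid_unquoted("5_tqm9ds"))   # False: starts with a digit
print(is_valid_unquoted("tqm9ds_5"))   # True
```

A letter-first or underscore-first variant of the name avoids the need for quoting entirely.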
  • Magali Acosta

    09/02/2024, 4:34 PM
    I have a connector in airbyte between Hubspot and Redshift. When a deal object is merged, Airbyte is not pulling it as merged
  • Rohit Chatterjee

    09/02/2024, 4:58 PM
    @kapa.ai i am running airbyte 0.50.44 using run-ab-platform.sh. now i want to upgrade to 0.58.0. do i need to migrate to kubernetes first or can i do that later?
  • Karl Jose Buena

    09/02/2024, 5:19 PM
    @kapa.ai what parameter can I pass when creating an Airbyte destination if I want it on a new schema?
  • Henrique Freitas Souza

    09/02/2024, 5:23 PM
    @kapa.ai i'm having an issue during the setup of an Elasticsearch connector. I'm getting the following error when trying to setup a connection with BigQuery:
    Copy code
    Internal message: java.lang.RuntimeException: io.airbyte.integrations.source.elasticsearch.UnsupportedDatatypeException: Cannot map unsupported data type to Airbyte data type: match_only_text
    Can you help me figure out what is wrong here? I'm on Airbyte 0.50.30 and the Elasticsearch connector is 0.1.2
  • Luis Gustavo

    09/02/2024, 5:43 PM
    @kapa.ai How can I manipulate the "page number" of the PageIncrement method?
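The pattern behind a page-increment strategy can be sketched in a few lines of plain Python (the start page and page size are the usual knobs; `fetch_page` is a stand-in for the actual HTTP call, not an Airbyte API):

```python
# Minimal page-number pagination loop: request page after page until
# an empty page comes back. start_page and page_size are the values
# you would typically tune.
def fetch_all(fetch_page, start_page=1, page_size=50):
    """Call fetch_page(page, page_size) until a page comes back empty."""
    page, results = start_page, []
    while True:
        batch = fetch_page(page, page_size)
        if not batch:
            return results
        results.extend(batch)
        page += 1

# Fake backend serving 120 records in pages:
data = list(range(120))

def fake_fetch(page, page_size):
    start = (page - 1) * page_size
    return data[start:start + page_size]

print(len(fetch_all(fake_fetch)))  # 120
```

Changing `start_page` skips earlier pages, which is the closest analogue to "manipulating" the page number in such a scheme.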
  • Karl Jose Buena

    09/02/2024, 6:01 PM
    @kapa.ai I have this error: 'Error response from daemon: configured logging driver does not support reading'
  • Mohamed Akram Lahdir

    09/02/2024, 6:09 PM
    @kapa.ai do you have a guide for migrating from run-ab-platform.sh to using abctl on an EC2 instance?
  • Willian Yoshio Iwamoto

    09/02/2024, 6:19 PM
    How long will it take?
    Copy code
    2024-09-02 18:14:34 destination > INFO i.a.i.d.r.SerializedBufferingStrategy(flushSingleBuffer):124 Flushing buffer of stream tickets (200 MB)
    2024-09-02 18:14:34 destination > INFO i.a.i.d.s.S3ConsumerFactory(lambda$flushBufferFunction$2):119 Flushing buffer for stream tickets (200 MB) to storage
    2024-09-02 18:14:34 destination > INFO i.a.i.d.r.BaseSerializedBuffer(flush):172 Finished writing data to 9a4b5627-eb71-4c97-9dfd-56189749a38a116953180937773815.jsonl (200 MB)
    2024-09-02 18:14:34 destination > INFO a.m.s.StreamTransferManager(getMultiPartOutputStreams):329 Initiated multipart upload to bhub-lakehouse-gcog12-prd-landing-zone/zendesk/tickets/2024_09_02_1725298592612_2.jsonl with full ID aJDgNYRY8Gp8vPuBHbZI.db2GmotLxbigpS8EoGZjFVGjjn5as3TYPga8NmRm6WlX6O.n73o2f6aB6JWcgsSWh8AGiddHPPGBiUG6MF5KwimoZqRbmwNAvh8dCPlJ3TKZAf6FEd2GXSxMRLt437spA--
    2024-09-02 18:14:35 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to bhub-lakehouse-gcog12-prd-landing-zone/zendesk/tickets/2024_09_02_1725298592612_2.jsonl with id aJDgNYRY8...t437spA--]: Finished uploading [Part number 2 containing 10.01 MB]
    2024-09-02 18:14:35 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to bhub-lakehouse-gcog12-prd-landing-zone/zendesk/tickets/2024_09_02_1725298592612_2.jsonl with id aJDgNYRY8...t437spA--]: Finished uploading [Part number 1 containing 10.01 MB]
    2024-09-02 18:14:35 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to bhub-lakehouse-gcog12-prd-landing-zone/zendesk/tickets/2024_09_02_1725298592612_2.jsonl with id aJDgNYRY8...t437spA--]: Finished uploading [Part number 4 containing 10.01 MB]
    2024-09-02 18:14:35 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to bhub-lakehouse-gcog12-prd-landing-zone/zendesk/tickets/2024_09_02_1725298592612_2.jsonl with id aJDgNYRY8...t437spA--]: Finished uploading [Part number 5 containing 10.01 MB]
    2024-09-02 18:14:35 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to bhub-lakehouse-gcog12-prd-landing-zone/zendesk/tickets/2024_09_02_1725298592612_2.jsonl with id aJDgNYRY8...t437spA--]: Finished uploading [Part number 3 containing 10.01 MB]
    2024-09-02 18:14:35 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to bhub-lakehouse-gcog12-prd-landing-zone/zendesk/tickets/2024_09_02_1725298592612_2.jsonl with id aJDgNYRY8...t437spA--]: Finished uploading [Part number 7 containing 10.01 MB]
    2024-09-02 18:14:35 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to bhub-lakehouse-gcog12-prd-landing-zone/zendesk/tickets/2024_09_02_1725298592612_2.jsonl with id aJDgNYRY8...t437spA--]: Finished uploading [Part number 6 containing 10.01 MB]
    2024-09-02 18:14:36 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to bhub-lakehouse-gcog12-prd-landing-zone/zendesk/tickets/2024_09_02_1725298592612_2.jsonl with id aJDgNYRY8...t437spA--]: Finished uploading [Part number 10 containing 10.01 MB]
    2024-09-02 18:14:36 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to bhub-lakehouse-gcog12-prd-landing-zone/zendesk/tickets/2024_09_02_1725298592612_2.jsonl with id aJDgNYRY8...t437spA--]: Finished uploading [Part number 9 containing 10.01 MB]
    2024-09-02 18:14:36 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to bhub-lakehouse-gcog12-prd-landing-zone/zendesk/tickets/2024_09_02_1725298592612_2.jsonl with id aJDgNYRY8...t437spA--]: Finished uploading [Part number 11 containing 10.01 MB]
    2024-09-02 18:14:36 destination > INFO a.m.s.MultiPartOutputStream(close):158 Called close() on [MultipartOutputStream for parts 1 - 10000]
    2024-09-02 18:14:36 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to bhub-lakehouse-gcog12-prd-landing-zone/zendesk/tickets/2024_09_02_1725298592612_2.jsonl with id aJDgNYRY8...t437spA--]: Finished uploading [Part number 8 containing 10.01 MB]
    2024-09-02 18:14:36 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to bhub-lakehouse-gcog12-prd-landing-zone/zendesk/tickets/2024_09_02_1725298592612_2.jsonl with id aJDgNYRY8...t437spA--]: Finished uploading [Part number 12 containing 10.01 MB]
    2024-09-02 18:14:36 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to bhub-lakehouse-gcog12-prd-landing-zone/zendesk/tickets/2024_09_02_1725298592612_2.jsonl with id aJDgNYRY8...t437spA--]: Finished uploading [Part number 15 containing 10.01 MB]
    2024-09-02 18:14:36 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to bhub-lakehouse-gcog12-prd-landing-zone/zendesk/tickets/2024_09_02_1725298592612_2.jsonl with id aJDgNYRY8...t437spA--]: Finished uploading [Part number 16 containing 10.01 MB]
    2024-09-02 18:14:36 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to bhub-lakehouse-gcog12-prd-landing-zone/zendesk/tickets/2024_09_02_1725298592612_2.jsonl with id aJDgNYRY8...t437spA--]: Finished uploading [Part number 18 containing 10.01 MB]
    2024-09-02 18:14:36 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to bhub-lakehouse-gcog12-prd-landing-zone/zendesk/tickets/2024_09_02_1725298592612_2.jsonl with id aJDgNYRY8...t437spA--]: Finished uploading [Part number 14 containing 10.01 MB]
    2024-09-02 18:14:36 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to bhub-lakehouse-gcog12-prd-landing-zone/zendesk/tickets/2024_09_02_1725298592612_2.jsonl with id aJDgNYRY8...t437spA--]: Finished uploading [Part number 13 containing 10.01 MB]
    2024-09-02 18:14:36 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to bhub-lakehouse-gcog12-prd-landing-zone/zendesk/tickets/2024_09_02_1725298592612_2.jsonl with id aJDgNYRY8...t437spA--]: Finished uploading [Part number 19 containing 10.01 MB]
    2024-09-02 18:14:36 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to bhub-lakehouse-gcog12-prd-landing-zone/zendesk/tickets/2024_09_02_1725298592612_2.jsonl with id aJDgNYRY8...t437spA--]: Finished uploading [Part number 17 containing 10.01 MB]
    2024-09-02 18:14:36 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to bhub-lakehouse-gcog12-prd-landing-zone/zendesk/tickets/2024_09_02_1725298592612_2.jsonl with id aJDgNYRY8...t437spA--]: Finished uploading [Part number 20 containing 9.86 MB]
    2024-09-02 18:14:36 destination > INFO a.m.s.StreamTransferManager(complete):395 [Manager uploading to bhub-lakehouse-gcog12-prd-landing-zone/zendesk/tickets/2024_09_02_1725298592612_2.jsonl with id aJDgNYRY8...t437spA--]: Completed
    2024-09-02 18:14:36 destination > INFO i.a.i.d.s.S3StorageOperations(loadDataIntoBucket):214 Uploaded buffer file to storage: 9a4b5627-eb71-4c97-9dfd-56189749a38a116953180937773815.jsonl -> zendesk/tickets/2024_09_02_1725298592612_2.jsonl (filename: 2024_09_02_1725298592612_2.jsonl)
    2024-09-02 18:14:36 destination > INFO i.a.i.d.s.S3StorageOperations(uploadRecordsToBucket):131 Successfully loaded records to stage zendesk/tickets/2024_09_02_1725298592612_ with 0 re-attempt(s)
    2024-09-02 18:14:36 destination > INFO i.a.i.d.r.FileBuffer(deleteFile):109 Deleting tempFile data 9a4b5627-eb71-4c97-9dfd-56189749a38a116953180937773815.jsonl
    2024-09-02 18:14:36 destination > INFO i.a.i.d.r.SerializedBufferingStrategy(flushSingleBuffer):128 Flushing completed for tickets
    2024-09-02 18:14:36 destination > INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$getOrCreateBuffer$0):109 Starting a new buffer for stream tickets (current state: 0 bytes in 0 buffers)
  • poornima Venkatesha

    09/02/2024, 7:44 PM
    @kapa.ai I want to use the Airbyte Builder to ingest data from an API. The API endpoint accepts only start and end dates and provides aggregated data; there is no possibility of grouping by date. However, I want to ingest past data grouped by month, for example. Is it possible to do such iteration in the Builder?
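The month-by-month iteration the question describes can be sketched in plain Python (an illustration only, assuming the API takes a start and an end date per call; in the Builder the analogous knobs would be the incremental sync's step/granularity settings):

```python
from datetime import date, timedelta

# Generate one (start, end) request window per calendar month,
# clipped to the overall ingestion range.
def month_windows(start: date, end: date):
    """Yield (window_start, window_end) pairs, one per calendar month."""
    cur = start
    while cur <= end:
        # First day of the month after `cur`:
        nxt = date(cur.year + (cur.month == 12), cur.month % 12 + 1, 1)
        yield cur, min(end, nxt - timedelta(days=1))
        cur = nxt

for lo, hi in month_windows(date(2024, 1, 15), date(2024, 3, 10)):
    print(lo, hi)
```

Each yielded pair becomes one API call, so the aggregated response is naturally scoped to a single month.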
  • Lucas Segers

    09/02/2024, 8:39 PM
    Trying to test a no-code Builder stream with a specific state. This is what the current connection state looks like:
    Copy code
    [
      {
        "streamDescriptor": {
          "name": "devolucoes"
        },
        "streamState": {
          "custom_data": "2024-05-21 21:57:01.687210+0000"
        }
      },
      {
        "streamDescriptor": {
          "name": "representantes"
        },
        "streamState": {
          "custom_data": "2024-05-21 21:57:05.440785+0000"
        }
      },
      {
        "streamDescriptor": {
          "name": "pedidos"
        },
        "streamState": {
          "custom_data": "2024-05-21 21:54:47.332217+0000"
        }
      }
    ]
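For quick inspection, a per-stream state list like the one above can be reshaped into a stream-to-cursor map with plain Python (an illustrative helper, not an Airbyte API; two of the entries shown are reused here):

```python
import json

# Reshape the per-stream state list into {stream_name: cursor} so it
# is easy to see which stream a test state targets.
state_json = '''[
  {"streamDescriptor": {"name": "devolucoes"},
   "streamState": {"custom_data": "2024-05-21 21:57:01.687210+0000"}},
  {"streamDescriptor": {"name": "pedidos"},
   "streamState": {"custom_data": "2024-05-21 21:54:47.332217+0000"}}
]'''

cursors = {entry["streamDescriptor"]["name"]: entry["streamState"]["custom_data"]
           for entry in json.loads(state_json)}
print(cursors["pedidos"])  # 2024-05-21 21:54:47.332217+0000
```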
  • Nathan Freystaetter

    09/02/2024, 9:01 PM
    Is it possible to export raw event level data from Google Analytics (GA4) instead of the aggregated report level streams?
  • Slackbot

    09/03/2024, 12:59 AM
    This message was deleted.
  • Alan Balcazar

    09/03/2024, 1:00 AM
    fix "internalMessage" : "Unable to start the destination", "externalMessage" : "Something went wrong during replication"