# ask-ai
    Oleksandr Pozden

    09/05/2024, 6:34 PM
    @kapa.ai Hi, how do I apply custom transformations in the latest versions of Airbyte?
    Gabriele Cacchioni

    09/05/2024, 6:45 PM
    @kapa.ai what spins up the orchestrator-repl-job pod?
    Gabriele Cacchioni

    09/05/2024, 6:56 PM
    @kapa.ai is it possible to specify the imagePullPolicy for the job's busybox image?
    Roberto Tolosa

    09/05/2024, 7:25 PM
    what's the easiest way to convert a Postman POST call to an Airbyte connector builder call?
    Vikas Bansal

    09/05/2024, 7:58 PM
    @kapa.ai I want to use Airbyte OSS to move data from multiple connections of the same source type into a single BigQuery destination. The final tables that Airbyte syncs don't have any identifier showing which connection moved the data. How can I reliably know which Airbyte connection placed which records in a table?
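One common workaround for the question above (a sketch, not an official Airbyte feature): configure a distinct stream prefix on each connection so every connection lands in its own table, then tag rows with a literal label when combining them. A minimal Python sketch that builds such a union query — the table names and prefix scheme are purely illustrative:

```python
# Sketch: combine per-connection tables (each connection configured with its
# own stream prefix) into one statement, tagging rows with a connection label.
# Table names below are hypothetical examples, not real Airbyte output.
def union_with_connection_tag(tables: dict) -> str:
    """tables maps a connection label -> that connection's destination table."""
    selects = [
        f"SELECT '{label}' AS airbyte_connection, * FROM `{table}`"
        for label, table in tables.items()
    ]
    return "\nUNION ALL\n".join(selects)

sql = union_with_connection_tag({
    "connection_a": "my_dataset.conn_a_orders",
    "connection_b": "my_dataset.conn_b_orders",
})
print(sql)
```

Running the generated statement in BigQuery (or saving it as a view) gives every row an airbyte_connection column without touching the synced tables themselves.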
    Joseph To

    09/05/2024, 7:59 PM
    @kapa.ai, why is the replication tab missing in OSS Airbyte version 0.64.1?
    Aasim ali

    09/05/2024, 9:27 PM
    org.apache.kafka.common.KafkaException: Failed to construct kafka consumer at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:825) at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:666) at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:614) at io.airbyte.integrations.source.kafka.format.JsonFormat.getConsumer(JsonFormat.java:52) at io.airbyte.integrations.source.kafka.format.JsonFormat.getTopicsToSubscribe(JsonFormat.java:91) at io.airbyte.integrations.source.kafka.format.JsonFormat.getStreams(JsonFormat.java:98) at io.airbyte.integrations.source.kafka.KafkaSource.discover(KafkaSource.java:43) at io.airbyte.cdk.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:159) at io.airbyte.cdk.integrations.base.IntegrationRunner.run(IntegrationRunner.java:125) at io.airbyte.integrations.source.kafka.KafkaSource.main(KafkaSource.java:61) Caused by: org.apache.kafka.common.errors.InvalidConfigurationException: enable.auto.commit cannot be set to true when default group id (null) is used. @kapa.ai
    Colin

    09/05/2024, 9:31 PM
    @kapa.ai Can you please explain how to access detailed logs when testing a destination in Airbyte? It always gives me a 504 error and I don’t know where to begin with diagnostics
    Colin

    09/05/2024, 10:06 PM
    @kapa.ai Can you explain what a connector error message of “State code: 08001; Message: The connection attempt failed.” is?
    Joseph To

    09/05/2024, 10:32 PM
    @kapa.ai, can I use http://localhost in webhook url?
    Du Trần

    09/06/2024, 3:20 AM
    @kapa.ai I'm using the MySQL source, but I see in the logs that the default fetch size is 5,000 records. I want to change it to 10,000 records. How can I change this config?
    Rens O

    09/06/2024, 4:38 AM
    My minio pod is giving an error "drive path full". What could be the cause of that and how to fix it?
    Patricio Villanueva

    09/06/2024, 6:59 AM
    Hi, I have a self-hosted Airbyte on Kubernetes, and a connection that is not using the full resources the pod has, and it is giving me timeouts.
    Chính Bùi Quang

    09/06/2024, 7:14 AM
    I want to adjust the timeout (airbyte_cdk.sources.streams.http.exceptions.UserDefinedBackoffException: Gateway timeout.) to 3 minutes while waiting for a response from the source. How can I do that? @kapa.ai
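The CDK lets HTTP streams tune retry behavior by overriding backoff/retry properties on the stream class, and the budget logic behind that is simple. A self-contained stdlib sketch of the idea — plain Python, not the actual airbyte_cdk API, all names illustrative — capping total backoff at a configurable budget such as 180 seconds:

```python
import time

def retry_with_backoff(fn, max_total_seconds=180.0, base_delay=1.0, factor=2.0):
    """Retry fn() with exponential backoff; give up (re-raising the last
    error) once the next wait would push total sleep past max_total_seconds."""
    waited = 0.0
    delay = base_delay
    while True:
        try:
            return fn()
        except Exception:
            if waited + delay > max_total_seconds:
                raise  # budget exhausted: surface the last error
            time.sleep(delay)
            waited += delay
            delay *= factor

# Tiny demo: a call that fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TimeoutError("gateway timeout")
    return "ok"

print(retry_with_backoff(flaky, base_delay=0.01))  # prints: ok
```

Mapping this onto the CDK means raising whatever per-stream retry/backoff limits your connector exposes until the total retry window covers the 3 minutes you need.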
    Lê Hữu Khuê

    09/06/2024, 7:42 AM
    Hi, I created an ETL flow from SQS to a Postgres DB, but I received an error like: i.a.c.i.LineGobbler(voidCall):149 - Fatal glibc error: CPU does not support x86-64-v2. Can someone help me, please?
    Luc Lagrange

    09/06/2024, 7:51 AM
    We are hosting Airbyte on Kubernetes, and this morning we received this error on two runs: message='io.temporal.serviceclient.CheckedExceptionWrapper: io.airbyte.workers.exception.WorkerException: Running the launcher replication-orchestrator failed', type='java.lang.RuntimeException', nonRetryable=false What is happening?
    Diogo Malheiro

    09/06/2024, 8:08 AM
    When trying to create a connection with Monday.com via API, im getting the following error: Configuration check failed Check failed because of an internal error Internal message: Check failed because of an internal error Failure origin: airbyte_platform io.temporal.failure.ActivityFailure: Activity with activityType='RunWithJobOutput' failed: 'Activity task failed'. scheduledEventId=12, startedEventId=13, activityId=834780a7-4e77-3c1c-8943-25795b7e7786, identity='1@391af1dfa34f', retryState=RETRY_STATE_MAXIMUM_ATTEMPTS_REACHED at java.base/java.lang.Thread.getStackTrace(Thread.java:2450) at io.temporal.internal.sync.ActivityStubBase.execute(ActivityStubBase.java:49) at io.temporal.internal.sync.ActivityInvocationHandler.lambda$getActivityFunc$0(ActivityInvocationHandler.java:83) at io.temporal.internal.sync.ActivityInvocationHandlerBase.invoke(ActivityInvocationHandlerBase.java:60) at jdk.proxy2/jdk.proxy2.$Proxy88.runWithJobOutput(Unknown Source) at io.airbyte.workers.temporal.check.connection.CheckConnectionWorkflowImpl.run(CheckConnectionWorkflowImpl.java:55) at CheckConnectionWorkflowImplProxy.run$accessor$I0Wy8491(Unknown Source) at CheckConnectionWorkflowImplProxy$auxiliary$0YlsUJzm.call(Unknown Source) at io.airbyte.micronaut.temporal.TemporalActivityStubInterceptor.execute(TemporalActivityStubInterceptor.java:79) at CheckConnectionWorkflowImplProxy.run(Unknown Source) at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:103) at java.base/java.lang.reflect.Method.invoke(Method.java:580) at io.temporal.internal.sync.POJOWorkflowImplementationFactory$POJOWorkflowImplementation$RootWorkflowInboundCallsInterceptor.execute(POJOWorkflowImplementationFactory.java:339) at io.temporal.internal.sync.POJOWorkflowImplementationFactory$POJOWorkflowImplementation.execute(POJOWorkflowImplementationFactory.java:314) at io.temporal.internal.sync.WorkflowExecutionHandler.runWorkflowMethod(WorkflowExecutionHandler.java:70) at 
io.temporal.internal.sync.SyncWorkflow.lambda$start$0(SyncWorkflow.java:135) at io.temporal.internal.sync.CancellationScopeImpl.run(CancellationScopeImpl.java:102) at io.temporal.internal.sync.WorkflowThreadImpl$RunnableWrapper.run(WorkflowThreadImpl.java:107) at io.temporal.worker.ActiveThreadReportingExecutor.lambda$submit$0(ActiveThreadReportingExecutor.java:53) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:572) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    Michael Eaton

    09/06/2024, 8:25 AM
    How do I remove a custom connector from the OSS Airbyte UI?
    Henrik Rasmussen

    09/06/2024, 8:27 AM
    I'm suddenly getting this error. I don't think I have changed anything that could be the reason: The above exception was caused by the following exception: pydantic.errors.PydanticUserError: TableMetadataSet is not fully defined; you should define TableColumn, then call TableMetadataSet.model_rebuild().
    Michael Eaton

    09/06/2024, 8:59 AM
    Can you give me some examples of ConnectorDefinition?
    Michael Eaton

    09/06/2024, 9:04 AM
    Where is the specification of ConnectorDefinition?
    Dani Toro

    09/06/2024, 9:13 AM
    I have a connection between Zendesk Support and a MySQL DB to export all tickets and comments. Comments is a JSON field with an array of the comment's attachments. In previous versions of the Zendesk Support source connector, this attachments array created a new table in MySQL with the data. Is there any way to unnest this array into a separate table?
    Quang Nguyen

    09/06/2024, 9:15 AM
    Warning from source: Command failed with error 13 (Unauthorized): 'not authorized on prod_small_sort to execute command { aggregate: 1, pipeline: [ { $changeStream: {} }, { $match: { ns.coll: { $in: [ "pre_sort_scan", "small_parcel_sort_layout", "small_parcel_sort_session_shipment", "small_parcel_sort_session", "small_parcel_sort_session_putaway_task", "small_parcel_statistic", "small_parcel_sort_layout_cart", "small_parcel_sort_session_cart", "small_parcel_sort_session_bin" ] } } } ], cursor: {}, $db: "prod_small_sort", $clusterTime: { clusterTime: Timestamp(1725613423, 3), signature: { hash: BinData(0, 5E03889B7FA5267D6ECB3CCC08A020560AD8833A), keyId: 7368024277552988161 } }, lsid: { id: UUID("df4ae1a9-a9a5-464f-817e-fc1f750dbb26") }, $readPreference: { mode: "secondaryPreferred" } }' on server axlehire-prod-us-west1-shard-00-01.vuags.mongodb.net:27017. The full response is {"ok": 0.0, "errmsg": "not authorized on prod_small_sort to execute command { aggregate: 1, pipeline: [ { $changeStream: {} }, { $match: { ns.coll: { $in: [ \"pre_sort_scan\", \"small_parcel_sort_layout\", \"small_parcel_sort_session_shipment\", \"small_parcel_sort_session\", \"small_parcel_sort_session_putaway_task\", \"small_parcel_statistic\", \"small_parcel_sort_layout_cart\", \"small_parcel_sort_session_cart\", \"small_parcel_sort_session_bin\" ] } } } ], cursor: {}, $db: \"prod_small_sort\", $clusterTime: { clusterTime: Timestamp(1725613423, 3), signature: { hash: BinData(0, 5E03889B7FA5267D6ECB3CCC08A020560AD8833A), keyId: 7368024277552988161 } }, lsid: { id: UUID(\"df4ae1a9-a9a5-464f-817e-fc1f750dbb26\") }, $readPreference: { mode: \"secondaryPreferred\" } }", "code": 13, "codeName": "Unauthorized", "$clusterTime": {"clusterTime": {"$timestamp": {"t": 1725613423, "i": 3}}, "signature": {"hash": {"$binary": {"base64": "XgOIm3+lJn1uyzzMCKAgVgrYgzo=", "subType": "00"}}, "keyId": 7368024277552988161}}, "operationTime": 
{"$timestamp": {"t": 1725613423, "i": 3}}}
    Julie Choong

    09/06/2024, 10:07 AM
    message='io.airbyte.workers.exception.WorkloadLauncherException: io.airbyte.workload.launcher.pipeline.stages.model.StageError: io.airbyte.workload.launcher.pods.KubeClientException: Init container of orchestrator pod failed to start within allotted timeout of 900 seconds. (Timed out waiting for [900000] milliseconds for [Pod] with name:[orchestrator-repl-job-3-attempt-0] in namespace [airbyte].) at io.airbyte.workload.launcher.pipeline.stages.model.Stage.apply(Stage.kt:46
    Murilo Belarmino

    09/06/2024, 10:32 AM
    I'm building a connector using low-code. My API endpoint requires end_date to be greater than start_date: if I want to request data for August 20, my start date must be August 20 but my end date should be August 21. How do I adjust step and granularity to get the desired behavior?
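For the low-code question above: in the declarative framework's DatetimeBasedCursor, step sets the window size and cursor_granularity sets the gap between one window's end and the next window's start. A hedged sketch — the field layout follows the declarative spec, but the exact values are my assumption about the desired behavior and untested: with step: P1D and cursor_granularity: PT0S, each slice's end_date should land one day after its start_date (e.g. start 2024-08-20, end 2024-08-21).

```yaml
incremental_sync:
  type: DatetimeBasedCursor
  cursor_field: updated_at
  datetime_format: "%Y-%m-%d"
  start_datetime: "{{ config['start_date'] }}"
  step: P1D                  # each request covers one day...
  cursor_granularity: PT0S   # ...and end = start + step, so end_date > start_date
  start_time_option:
    type: RequestOption
    inject_into: request_parameter
    field_name: start_date
  end_time_option:
    type: RequestOption
    inject_into: request_parameter
    field_name: end_date
```

If the API instead rejects a zero-length final window, bumping cursor_granularity to P1D (so end = start + step - granularity) is the usual counter-adjustment to experiment with.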
    Daniel Holleran

    09/06/2024, 10:56 AM
    @kapa.ai how do I edit the configuration of the flags.yaml in the Helm chart?
    Gowthami Muddam

    09/06/2024, 11:14 AM
    @kapa.ai when I am updating the destination with correct credentials, it is still throwing an error like this: An unknown error occurred. (HTTP 504). What might be the reason?
    Slackbot

    09/06/2024, 11:39 AM
    This message was deleted.
    kapa.ai

    09/06/2024, 11:57 AM
    In Airbyte, a catalog defines all the streams that a connector supports. If a stream is missing from the catalog, it could lead to issues during data synchronization. According to the [Airbyte Protocol](https://docs.airbyte.com/understanding-airbyte/airbyte-protocol#schema-mismatches), if a stream exists in the data store but not in the catalog, the source will never look for the data related to that stream and thus will not emit it. Similarly, if a stream exists in the catalog but not in the data store, the source will not find the data for that stream, but all other streams will sync as usual. In short, if the catalog is ever out of sync with the schema of the underlying data store, it should never block replication for data that is present. If you're encountering an error message like
    A stream status (public.transaction) has been detected for a stream not present in the catalog
    , it indicates that the stream is not found in the catalog. In such cases, you might need to check your catalog configuration or refresh the schema in replication settings. For example, in a [Github issue](https://github.com/airbytehq/airbyte/issues/40555), a contributor suggested checking if the stream is selected in the connection and also retrieving the catalog used in the job which fails with the error.
    select config from jobs where id = <JOB_ID>;
    Please note that the actual solution might vary depending on the specific circumstances and configurations of your data sources and destinations.
    Jordi Beunk

    09/06/2024, 12:25 PM
    @kapa.ai there are schema updates to one of my MySQL connections. When I select "review changes" I do see that 1 new stream is added, which is expected. However, when I save my changes, the newly added stream is not visible anymore, and I still have the option to review non-breaking schema changes.