# ask-ai
  • premsurawut
    10/28/2024, 6:30 AM
    I pulled the Postgres destination connector image and pushed it to my private registry, but it does not support the normalization transformation like the image I pulled it from. I pushed it without changing or adding anything. @kapa.ai
  • Syed Hamza Raza Kazmi
    10/28/2024, 6:49 AM
    I have set up Postgres-to-Postgres replication using the xmin method, and my replication strategy is Incremental Append. Now I want to switch to the CDC method. What will happen to the existing tables?
  • Arif Chaudhary
    10/28/2024, 8:19 AM
    What is the recommended AWS EC2 instance type for running Airbyte OSS version 1.1.0 via abctl? I’ve reviewed the documentation, but I feel the resource recommendations might not be accurate. Could anyone share which EC2 instance type has worked well for them with Airbyte OSS version 1.1.0?
  • Poorva
    10/28/2024, 11:04 AM
    @kapa.ai Airbyte downgrade to 0.58.0 on K8s is failing with Temporal in CrashLoopBackOff. The error is "Unable to start server. Error: could not build arguments for function "go.temporal.io/server/common/pprof".LifetimeHooks (/home/builder/temporal/common/pprof/fx.go:39): failed to build *pprof.PProfInitializerImpl: could not build arguments for function "go.temporal.io/server/common/pprof".NewInitializer (/home/builder/temporal/common/pprof/pprof.go:56): failed to build *config.PProf: received non-nil error from function "go.temporal.io/server/temporal".ServerOptionsProvider (/home/builder/temporal/temporal/fx.go:152): sql schema version compatibility check failed: pq: no pg_hba.conf entry for host "10.0.27.212", user "airbyte", database "temporal", no encryption"
  • Mor Iluz
    10/28/2024, 12:10 PM
    Using the Airbyte Cloud version and trying to connect the new Microsoft Entra ID connector: https://docs.airbyte.com/integrations/sources/microsoft-entra-id?_gl=1*kz99yi*_gcl_aw[…]MS4xNzMwMDMwMzY1LjE5ODE5MDE1NjAuMTczMDExNTE2Mi4xNzMwMTE1ODMx But after we've configured all the properties (client_id, client_secret) and set up the permissions of the application, we're getting a 400 Bad Request error in the 'Set up source' step: File "/usr/local/lib/python3.10/site-packages/airbyte_cdk/sources/streams/http/requests_native_auth/abstract_oauth.py", line 132, in _get_refresh_access_token_response response.raise_for_status() File "/usr/local/lib/python3.10/site-packages/requests/models.py", line 1024, in raise_for_status raise HTTPError(http_error_msg, response=self) requests.exceptions.HTTPError: 400 Client Error: Bad Request for url: https://login.microsoftonline.com/****/oauth2/v2.0/token What might be the issue?
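    One way to narrow this down is to test the app registration outside Airbyte. The sketch below is only an illustration, assuming the connector uses the OAuth2 client-credentials flow against the Microsoft identity platform v2.0 token endpoint; the tenant ID, client ID, client secret, and scope are placeholders to substitute:
        # Hedged illustration: call the token endpoint directly to check the app registration.
        # All values below are placeholders, and the Graph ".default" scope is an assumption
        # about what the connector requests.
        import requests

        TENANT_ID = "<tenant-id>"
        CLIENT_ID = "<client-id>"
        CLIENT_SECRET = "<client-secret>"

        resp = requests.post(
            f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
            data={
                "client_id": CLIENT_ID,
                "client_secret": CLIENT_SECRET,
                "grant_type": "client_credentials",
                "scope": "https://graph.microsoft.com/.default",
            },
            timeout=30,
        )
        print(resp.status_code)
        print(resp.json())  # a 400 body usually names the cause, e.g. invalid_client or invalid_scope
    If this call also returns 400, the error code in the response body typically points at the misconfiguration (wrong tenant, expired secret, missing admin consent) rather than anything Airbyte-specific.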
  • Tom De Kooning
    10/28/2024, 12:34 PM
    When will the XmlDecoder be released for Airbyte OSS?
  • satya
    10/28/2024, 12:35 PM
    #C01AHCD885S Is there any global setting for notifications instead of per workspace? Also, does the Airbyte Bitnami deployment support Kafka integration with respect to notifications?
  • Henrik Nilsson
    10/28/2024, 12:38 PM
    Using CDC-based replication on a Postgres source with tables that don't have a primary key, can I run the sync mode “Incremental, Append + Deduped” if the table is configured with REPLICA IDENTITY FULL?
  • Kaustav Ghosh
    10/28/2024, 12:42 PM
    @kapa.ai
    replication-orchestrator > readFromDestination: exception caught
    java.lang.NullPointerException: Cannot invoke "io.airbyte.protocol.models.AirbyteGlobalState.getStreamStates()" because the return value of "io.airbyte.protocol.models.AirbyteStateMessage.getGlobal()" is null
    	at io.airbyte.workers.internal.bookkeeping.ParallelStreamStatsTracker.updateDestinationStateStats(ParallelStreamStatsTracker.kt:130) ~[io.airbyte-airbyte-commons-worker-1.1.0.jar:?]
    	at io.airbyte.workers.internal.syncpersistence.SyncPersistenceImpl.updateDestinationStateStats(SyncPersistence.kt:322) ~[io.airbyte-airbyte-commons-worker-1.1.0.jar:?]
    	at io.airbyte.workers.internal.bookkeeping.AirbyteMessageTracker.acceptFromDestination(AirbyteMessageTracker.kt:65) ~[io.airbyte-airbyte-commons-worker-1.1.0.jar:?]
    	at io.airbyte.workers.general.ReplicationWorkerHelper.internalProcessMessageFromDestination(ReplicationWorkerHelper.kt:443) ~[io.airbyte-airbyte-commons-worker-1.1.0.jar:?]
    	at io.airbyte.workers.general.ReplicationWorkerHelper.processMessageFromDestination(ReplicationWorkerHelper.kt:317) ~[io.airbyte-airbyte-commons-worker-1.1.0.jar:?]
    	at io.airbyte.workers.general.BufferedReplicationWorker.readFromDestination(BufferedReplicationWorker.java:488) ~[io.airbyte-airbyte-commons-worker-1.1.0.jar:?]
    	at io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsync$2(BufferedReplicationWorker.java:215) ~[io.airbyte-airbyte-commons-worker-1.1.0.jar:?]
    	at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
    	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
    	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
    	at java.base/java.lang.Thread.run(Thread.java:1583) [?:?]
    2024-10-28 11:40:50 replication-orchestrator > readFromDestination: done. (writeToDestFailed:false, dest.isFinished:false)
    2024-10-28 11:40:50 replication-orchestrator > writeToDestination: exception caught
    java.lang.IllegalStateException: No exit code found.
    	at io.airbyte.workers.internal.ContainerIOHandle.getExitCode(ContainerIOHandle.kt:104) ~[io.airbyte-airbyte-commons-worker-1.1.0.jar:?]
    	at io.airbyte.workers.internal.LocalContainerAirbyteSource.getExitValue(LocalContainerAirbyteSource.kt:90) ~[io.airbyte-airbyte-commons-worker-1.1.0.jar:?]
    	at io.airbyte.workers.general.BufferedReplicationWorker.writeToDestination(BufferedReplicationWorker.java:440) ~[io.airbyte-airbyte-commons-worker-1.1.0.jar:?]
    	at io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsyncWithTimeout$5(BufferedReplicationWorker.java:243) ~[io.airbyte-airbyte-commons-worker-1.1.0.jar:?]
    	at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
    	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
    	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
    	at java.base/java.lang.Thread.run(Thread.java:1583) [?:?]
  • Henrik Nilsson
    10/28/2024, 12:52 PM
    The documentation on Postgres replication methods says “Both CDC and xmin are the most reliable methods of updating your data”. Can you expand on why Standard should be avoided?
  • Nanagoud
    10/28/2024, 12:54 PM
    What is the issue in this? https://github.com/airbytehq/airbyte/issues/47547
  • satya
    10/28/2024, 1:01 PM
    @kapa.ai Let's say I set a webhook URL for the successful-sync notification. On a successful sync, what does the JSON body sent by Airbyte look like? Does it include the connection ID for which the sync happened, and any additional details?
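    Rather than guessing at the payload schema, one way to see exactly what Airbyte posts is to point the notification webhook at a throwaway endpoint and log the body. A minimal sketch using only the standard library (host and port are arbitrary; nothing here is an Airbyte API):
        # Tiny local receiver that prints whatever JSON body is POSTed to it.
        # Point the success-notification webhook URL at http://<this-host>:8000/ to inspect the payload.
        import json
        from http.server import BaseHTTPRequestHandler, HTTPServer

        class WebhookDump(BaseHTTPRequestHandler):
            def do_POST(self):
                length = int(self.headers.get("Content-Length", 0))
                body = self.rfile.read(length)
                try:
                    print(json.dumps(json.loads(body), indent=2))  # pretty-print the notification payload
                except ValueError:
                    print(body.decode(errors="replace"))           # fall back to raw text if it is not JSON
                self.send_response(200)
                self.end_headers()

        HTTPServer(("0.0.0.0", 8000), WebhookDump).serve_forever()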
  • Alfred Joseph A
    10/28/2024, 1:44 PM
    How do I set up the external ID AWS_ASSUME_ROLE_EXTERNAL_ID in a locally deployed instance?
  • Alfred Joseph A
    10/28/2024, 1:45 PM
    How do I set up environment variables in a locally deployed instance?
  • Simon Veerman
    10/28/2024, 2:39 PM
    Hi @kapa.ai, I have an array-type user input. For each value in the input I need to run a stream: stream name "Orders", URL parameter: user input (account ID). For each account ID I need to run that stream sync, so I'm looking for something along the lines of a for loop, or something else. What would be my best way to deal with this?
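    In plain Python the intent is just a loop that issues one "Orders" request per account ID; in the low-code builder the same idea is normally expressed as a partition router over the array input rather than a hand-written loop. A rough sketch of the intent, with a hypothetical endpoint and response shape:
        # Plain-Python sketch of the desired behaviour: one "Orders" fetch per account ID.
        # The endpoint URL and response shape are hypothetical placeholders.
        import requests

        account_ids = ["111", "222", "333"]  # the array-type user input

        all_orders = []
        for account_id in account_ids:
            resp = requests.get(
                f"https://api.example.com/accounts/{account_id}/orders",  # hypothetical URL pattern
                timeout=30,
            )
            resp.raise_for_status()
            all_orders.extend(resp.json())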
  • Vasil Boshnakov
    10/28/2024, 2:44 PM
    @kapa.ai I'm trying to create a Redshift destination, but I'm getting the following error: Could not connect with provided SSH configuration. Error: getSQLState(...) must not be null
  • Matthew Mombrea
    10/28/2024, 2:54 PM
    @kapa.ai where on the filesystem does abctl install Airbyte?
  • Andres
    10/28/2024, 3:08 PM
    I’m defining cursor pagination for one of my streams. However, the response returns a ‘__next’ property containing the URL to fetch more elements, with the cursor already filled in. Do I need to provide the UID of the next cursor object, or can I just use this URL?
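    The generic pattern when an API hands back a ready-made next-page URL is to follow it verbatim until it is absent, rather than re-deriving a cursor token. A sketch of that pattern (the endpoint and result key are hypothetical; '__next' is taken from the question, and whether the builder's cursor-pagination option accepts a full URL is exactly what the thread needs to confirm):
        # Follow-the-URL pagination: keep requesting whatever URL the previous page supplies.
        import requests

        url = "https://api.example.com/elements"  # hypothetical first-page URL
        records = []
        while url:
            page = requests.get(url, timeout=30)
            page.raise_for_status()
            data = page.json()
            records.extend(data.get("results", []))  # assumed key holding the page's elements
            url = data.get("__next")                 # full URL for the next page, or None when exhausted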
  • Eric Gottschalk
    10/28/2024, 3:22 PM
    How do you authenticate a private gcloud Docker registry with an upgraded Kubernetes deployment?
  • Vasil Boshnakov
    10/28/2024, 3:35 PM
    @kapa.ai I'm using the OSS version deployed on an EC2 instance, and I'm trying to create a Redshift destination but I'm getting the following error: Could not connect with provided SSH configuration. Error: getSQLState(...) must not be null. Also, I'm using SSH tunnelling to access the web UI.
  • Yannick Sacherer
    10/28/2024, 3:42 PM
    @kapa.ai in which order will the http_methods be called?
  • Jake Duckworth
    10/28/2024, 3:52 PM
    When using the Airbyte SDK, I get an error when I call the function that returns a list of workspaces. Function:
    sdk.workspaces().listWorkspaces().call()
    Error:
    Exception in thread "main" java.lang.NoClassDefFoundError: org/openapitools/jackson/nullable/JsonNullable
    	at com.airbyte.api.utils.Utils.resolveOptionals(Utils.java:517)
    	at com.airbyte.api.utils.Security.parseBasicAuthScheme(Security.java:153)
    	at com.airbyte.api.utils.Security.parseSecurityScheme(Security.java:76)
    	at com.airbyte.api.utils.Security.configureSecurity(Security.java:42)
    	at com.airbyte.api.utils.Utils.configureSecurity(Utils.java:274)
    	at com.airbyte.api.Workspaces.listWorkspaces(Workspaces.java:484)
    	at com.airbyte.api.models.operations.ListWorkspacesRequestBuilder.call(ListWorkspacesRequestBuilder.java:37)
  • Matheus Dantas
    10/28/2024, 3:53 PM
    I am facing the following error when trying to test my low-code connector:
    The manifest version 5.6.0 is greater than the airbyte-cdk package version (1.8.0). Your manifest may contain features that are not in the current CDK version.
  • poornima Venkatesha
    10/28/2024, 4:29 PM
    @kapa.ai My current configuration for the update method on the source connection is set to “Detect Changes with Xmin System Column”. I want to change this to “Read Changes using Write-Ahead Log (CDC)”. What are the steps to do so?
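    On the Postgres side, switching to CDC involves enabling logical decoding and creating a replication slot and publication; the snippet below is only a sketch of those SQL prerequisites issued through psycopg2, with placeholder connection details, slot, publication, and table names. The Postgres source connector documentation is the authoritative step list, and after changing the update method a reset/resync of the affected streams is generally expected.
        # Sketch of the Postgres-side CDC prerequisites (requires wal_level = logical on the server).
        # Slot, publication, table, and connection values are placeholders.
        import psycopg2

        conn = psycopg2.connect("host=localhost dbname=mydb user=postgres password=<password>")
        conn.autocommit = True
        with conn.cursor() as cur:
            # Create a logical replication slot using the built-in pgoutput plugin.
            cur.execute("SELECT pg_create_logical_replication_slot('airbyte_slot', 'pgoutput');")
            # Publish the tables that the connection replicates.
            cur.execute("CREATE PUBLICATION airbyte_publication FOR TABLE public.my_table;")
        conn.close()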
  • Lucas Segers
    10/28/2024, 4:47 PM
    How can I find which "secrets" (stored in the secret table) are used by my source?
  • John Mizerany
    10/28/2024, 4:54 PM
    Can we disable table locking with the source PostgreSQL connector?
  • Mounika Naga
    10/28/2024, 5:22 PM
    @kapa.ai I'm not able to set s3_path_format to ${STREAM_NAME}/year=${YEAR}/month=${MONTH}/day=${DAY}/hour=${HOUR}/ using Terraform while creating an Airbyte destination. Any idea on this?
  • Chris Potter
    10/28/2024, 5:54 PM
    @kapa.ai - I'm receiving the following error message that doesn't make much sense to me. Do you have any more information on what might be causing this to occur? A stream status has been detected for a stream not present in the catalog
  • Chris Potter
    10/28/2024, 7:04 PM
    @kapa.ai - I'm trying to extract data from a relatively large BigQuery table, but am receiving the following message. Can you provide more information? java.lang.RuntimeException: java.lang.RuntimeException: com.google.cloud.bigquery.BigQueryException: Response too large to return. Consider specifying a destination table in your job configuration.
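    Outside Airbyte, the suggestion in that message ("specify a destination table in your job configuration") looks roughly like the sketch below with the official BigQuery client; project, dataset, and table names are placeholders, and whether the connector exposes an equivalent option is the open question for the thread:
        # Illustration of the error message's own suggestion: write large query results
        # to an explicit destination table instead of returning them inline.
        from google.cloud import bigquery

        client = bigquery.Client()
        job_config = bigquery.QueryJobConfig(
            destination="my-project.my_dataset.large_query_results",  # placeholder destination table
            write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
        )
        job = client.query("SELECT * FROM `my-project.my_dataset.large_table`", job_config=job_config)
        job.result()  # wait for completion; rows can then be read from the destination table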
  • Mounika Naga
    10/28/2024, 7:59 PM
    @kapa.ai
    Error: failure to invoke API
      with airbyte_connection.bigquery_connection,
      on connection.tf line 28, in resource "airbyte_connection" "bigquery_connection":
      28: resource "airbyte_connection" "bigquery_connection" {
    unknown status code returned: Status 500
    {"status":500,"type":"https://reference.airbyte.com/reference/errors","title":"unexpected-problem","detail":"An unexpected problem has occurred. If this is an error that needs to be addressed, please submit a pull request or github issue.","documentationUrl":null,"data":{"message":"Something went wrong in the connector. logs:Something went wrong in the connector. See the logs for more details."}}
    Can you help me resolve this?