# help-api-cli-orchestration
  • Justas Černas (08/25/2023, 11:34 AM)
    Hey all, I've created a source and destination using Terraform, but when I try to create a connection using them, Terraform gets a 500 Internal Server Error. The logs from the airbyte-api-server show:
    ERROR i.m.h.s.RouteExecutor(logException):444 - Unexpected error occurred: null
    java.lang.NullPointerException: null
            at io.airbyte.api.server.helpers.AirbyteCatalogHelper.hasStreamConfigurations(AirbyteCatalogHelper.kt:51) ~[io.airbyte-airbyte-api-server-0.50.21.jar:?]
            at io.airbyte.api.server.controllers.ConnectionsController.createConnection(ConnectionsController.kt:81) ~[io.airbyte-airbyte-api-server-0.50.21.jar:?]
            at io.airbyte.api.server.controllers.$ConnectionsController$Definition$Exec.dispatch(Unknown Source) ~[io.airbyte-airbyte-api-server-0.50.21.jar:?]
            at io.micronaut.context.AbstractExecutableMethodsDefinition$DispatchedExecutableMethod.invoke(AbstractExecutableMethodsDefinition.java:371) ~[micronaut-inject-3.9.4.jar:3.9.4]
            at io.micronaut.context.DefaultBeanContext$4.invoke(DefaultBeanContext.java:594) ~[micronaut-inject-3.9.4.jar:3.9.4]
            at io.micronaut.web.router.AbstractRouteMatch.execute(AbstractRouteMatch.java:303) ~[micronaut-router-3.9.4.jar:3.9.4]
            at io.micronaut.web.router.RouteMatch.execute(RouteMatch.java:111) ~[micronaut-router-3.9.4.jar:3.9.4]
            at io.micronaut.http.context.ServerRequestContext.with(ServerRequestContext.java:103) ~[micronaut-http-3.9.4.jar:3.9.4]
            at io.micronaut.http.server.RouteExecutor.lambda$executeRoute$14(RouteExecutor.java:659) ~[micronaut-http-server-3.9.4.jar:3.9.4]
            at reactor.core.publisher.FluxDeferContextual.subscribe(FluxDeferContextual.java:49) ~[reactor-core-3.5.5.jar:3.5.5]
            at reactor.core.publisher.Flux.subscribe(Flux.java:8671) ~[reactor-core-3.5.5.jar:3.5.5]
            at reactor.core.publisher.FluxFlatMap$FlatMapMain.onNext(FluxFlatMap.java:427) ~[reactor-core-3.5.5.jar:3.5.5]
            at io.micronaut.reactive.reactor.instrument.ReactorSubscriber.onNext(ReactorSubscriber.java:57) ~[micronaut-runtime-3.9.4.jar:3.9.4]
            at reactor.core.publisher.MonoCreate$DefaultMonoSink.success(MonoCreate.java:172) ~[reactor-core-3.5.5.jar:3.5.5]
            at io.micronaut.http.server.netty.RoutingInBoundHandler$4.doOnComplete(RoutingInBoundHandler.java:965) ~[micronaut-http-server-netty-3.9.4.jar:3.9.4]
            at io.micronaut.core.async.subscriber.CompletionAwareSubscriber.onComplete(CompletionAwareSubscriber.java:79) ~[micronaut-core-reactive-3.9.4.jar:3.9.4]
            at io.micronaut.http.server.netty.jackson.JsonContentProcessor$1.doOnComplete(JsonContentProcessor.java:136) ~[micronaut-http-server-netty-3.9.4.jar:3.9.4]
            at io.micronaut.core.async.subscriber.CompletionAwareSubscriber.onComplete(CompletionAwareSubscriber.java:79) ~[micronaut-core-reactive-3.9.4.jar:3.9.4]
            at java.util.Optional.ifPresent(Optional.java:178) ~[?:?]
            at io.micronaut.core.async.processor.SingleThreadedBufferingProcessor.doOnComplete(SingleThreadedBufferingProcessor.java:48) ~[micronaut-core-reactive-3.9.4.jar:3.9.4]
            at io.micronaut.jackson.core.parser.JacksonCoreProcessor.doOnComplete(JacksonCoreProcessor.java:94) ~[micronaut-jackson-core-3.9.4.jar:3.9.4]
            at io.micronaut.core.async.subscriber.SingleThreadedBufferingSubscriber.onComplete(SingleThreadedBufferingSubscriber.java:71) ~[micronaut-core-reactive-3.9.4.jar:3.9.4]
            at io.micronaut.http.server.netty.jackson.JsonContentProcessor.doOnComplete(JsonContentProcessor.java:161) ~[micronaut-http-server-netty-3.9.4.jar:3.9.4]
            at io.micronaut.core.async.subscriber.CompletionAwareSubscriber.onComplete(CompletionAwareSubscriber.java:79) ~[micronaut-core-reactive-3.9.4.jar:3.9.4]
            at io.micronaut.http.netty.reactive.HandlerPublisher.publishMessage(HandlerPublisher.java:383) ~[micronaut-http-netty-3.9.4.jar:3.9.4]
            at io.micronaut.http.netty.reactive.HandlerPublisher.flushBuffer(HandlerPublisher.java:470) ~[micronaut-http-netty-3.9.4.jar:3.9.4]
            at io.micronaut.http.netty.reactive.HandlerPublisher.publishMessageLater(HandlerPublisher.java:360) ~[micronaut-http-netty-3.9.4.jar:3.9.4]
            at io.micronaut.http.netty.reactive.HandlerPublisher.complete(HandlerPublisher.java:423) ~[micronaut-http-netty-3.9.4.jar:3.9.4]
            at io.micronaut.http.netty.reactive.HandlerPublisher.handlerRemoved(HandlerPublisher.java:418) ~[micronaut-http-netty-3.9.4.jar:3.9.4]
            at io.netty.channel.AbstractChannelHandlerContext.callHandlerRemoved(AbstractChannelHandlerContext.java:1122) ~[netty-transport-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.channel.DefaultChannelPipeline.callHandlerRemoved0(DefaultChannelPipeline.java:637) ~[netty-transport-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.channel.DefaultChannelPipeline.remove(DefaultChannelPipeline.java:477) ~[netty-transport-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.channel.DefaultChannelPipeline.remove(DefaultChannelPipeline.java:423) ~[netty-transport-4.1.94.Final.jar:4.1.94.Final]
            at io.micronaut.http.netty.stream.HttpStreamsHandler.removeHandlerIfActive(HttpStreamsHandler.java:512) ~[micronaut-http-netty-3.9.4.jar:3.9.4]
            at io.micronaut.http.netty.stream.HttpStreamsHandler.handleReadHttpContent(HttpStreamsHandler.java:320) ~[micronaut-http-netty-3.9.4.jar:3.9.4]
            at io.micronaut.http.netty.stream.HttpStreamsHandler.channelRead(HttpStreamsHandler.java:283) ~[micronaut-http-netty-3.9.4.jar:3.9.4]
            at io.micronaut.http.netty.stream.HttpStreamsServerHandler.channelRead(HttpStreamsServerHandler.java:134) ~[micronaut-http-netty-3.9.4.jar:3.9.4]
            at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442) ~[netty-transport-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) ~[netty-transport-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) ~[netty-transport-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:93) ~[netty-transport-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.handler.codec.http.websocketx.extensions.WebSocketServerExtensionHandler.channelRead(WebSocketServerExtensionHandler.java:88) ~[netty-codec-http-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442) ~[netty-transport-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) ~[netty-transport-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) ~[netty-transport-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) ~[netty-codec-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444) ~[netty-transport-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) ~[netty-transport-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) ~[netty-transport-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) ~[netty-codec-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.handler.codec.MessageToMessageCodec.channelRead(MessageToMessageCodec.java:111) ~[netty-codec-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442) ~[netty-transport-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) ~[netty-transport-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) ~[netty-transport-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:93) ~[netty-transport-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.handler.codec.http.HttpServerKeepAliveHandler.channelRead(HttpServerKeepAliveHandler.java:64) ~[netty-codec-http-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442) ~[netty-transport-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) ~[netty-transport-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) ~[netty-transport-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.handler.flow.FlowControlHandler.dequeue(FlowControlHandler.java:202) ~[netty-handler-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.handler.flow.FlowControlHandler.read(FlowControlHandler.java:139) ~[netty-handler-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.channel.AbstractChannelHandlerContext.invokeRead(AbstractChannelHandlerContext.java:837) ~[netty-transport-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.channel.AbstractChannelHandlerContext.read(AbstractChannelHandlerContext.java:814) ~[netty-transport-4.1.94.Final.jar:4.1.94.Final]
            at io.micronaut.http.netty.reactive.HandlerPublisher.requestDemand(HandlerPublisher.java:165) ~[micronaut-http-netty-3.9.4.jar:3.9.4]
            at io.micronaut.http.netty.stream.HttpStreamsHandler$2.requestDemand(HttpStreamsHandler.java:274) ~[micronaut-http-netty-3.9.4.jar:3.9.4]
            at io.micronaut.http.netty.reactive.HandlerPublisher$ChannelSubscription.receivedDemand(HandlerPublisher.java:556) ~[micronaut-http-netty-3.9.4.jar:3.9.4]
            at io.micronaut.http.netty.reactive.HandlerPublisher$ChannelSubscription.lambda$request$0(HandlerPublisher.java:494) ~[micronaut-http-netty-3.9.4.jar:3.9.4]
            at io.netty.util.concurrent.AbstractEventExecutor.runTask(AbstractEventExecutor.java:174) ~[netty-common-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:167) ~[netty-common-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:470) ~[netty-common-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:566) ~[netty-transport-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997) ~[netty-common-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) ~[netty-common-4.1.94.Final.jar:4.1.94.Final]
            at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) ~[netty-common-4.1.94.Final.jar:4.1.94.Final]
            at java.lang.Thread.run(Thread.java:1589) ~[?:?]
    Any directions on what might be causing this?
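    The NPE surfaces in AirbyteCatalogHelper.hasStreamConfigurations, which may suggest the create-connection request reached the server without any stream configurations. A minimal sketch of a connection that passes them explicitly, assuming the airbyte Terraform provider's airbyte_connection resource; the resource names and stream values below are placeholders, not taken from this thread:

    # Hypothetical sketch: pass an explicit stream configuration so the
    # API's hasStreamConfigurations check never receives a null catalog.
    resource "airbyte_connection" "example" {
      name           = "example-connection"
      source_id      = airbyte_source_postgres.example.source_id
      destination_id = airbyte_destination_bigquery.example.destination_id

      configurations = {
        streams = [
          {
            name      = "users"                  # a stream exposed by the source
            sync_mode = "full_refresh_overwrite" # pick a mode the stream supports
          }
        ]
      }
    }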
  • Sunil Jimenez (08/25/2023, 1:14 PM)
    Hello. http://portal.airbyte.com/ is having DNS problems. Is there another way of managing API credentials? @Bryce Groff (Airbyte)?
  • Eduardo Trevisani (08/28/2023, 9:16 PM)
    Hello! I’m having trouble deploying a Zendesk Support source using the Terraform provider. The problem seems to happen when using the source_zendesk_support_authentication_api_token nested field of the credentials field (available in the docs). When applying Terraform code that uses it, the API returns an unexpected 422 response code, with details that read:

    {"type":"https://reference.airbyte.com/reference/errors#unprocessable-entity","status":422,"title":"unprocessable-entity","detail":"The provided
    configuration does not fulfill the specification. Errors: json schema validation failed when comparing the data to the json schema. \nErrors: $.credentials:
    null found, object expected, $.credentials: null found, object expected "}
    It looks like the parser sets the credentials field to null when I’m using this field, as shown in the API payload that generated the above response:
    {
      "configuration": {
        "credentials": null,
        "ignore_pagination": true,
        "sourceType": "zendesk-support",
        "start_date": "2023-01-01T00:00:00Z",
        "subdomain": "[REDACTED]"
      },
      "name": "airbyte.zendesk-support",
      "workspaceId": "[REDACTED]"
    }
    When using a different field like source_zendesk_support_update_authentication_api_token (available in this doc), the error does not occur. However, although the source is created, the credentials are not, resulting in a source without any authentication fields, as seen in the attached screenshot. The Terraform code I’m using is:
    resource "airbyte_source_zendesk_support" "source_zendesk_support" {
      name          = "airbyte.zendesk-support"
      workspace_id  = var.workspace_id
    
      configuration = {
        source_type       = "zendesk-support"
        subdomain         = var.zendesk["subdomain"]
        start_date        = "2023-01-01T00:00:00Z"
        ignore_pagination = true
        credentials = {
          source_zendesk_support_authentication_api_token = {
            credentials = "api_token"
            api_token   = var.zendesk["api_token"]
            email       = var.zendesk["email"]
          }
        }
      }
    }
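    One way to isolate whether the provider or the API is nulling the credentials is to POST the intended payload directly. A sketch against an OSS instance (localhost:8006 and basic-auth defaults as used elsewhere in this channel; all bracketed values are placeholders):

    # Hypothetical check: if this request succeeds, the null credentials
    # object is being introduced client-side by the Terraform provider.
    curl -u airbyte:password -X POST http://localhost:8006/v1/sources \
      -H 'accept: application/json' \
      -H 'content-type: application/json' \
      -d '{
        "name": "airbyte.zendesk-support",
        "workspaceId": "<WORKSPACE_ID>",
        "configuration": {
          "sourceType": "zendesk-support",
          "subdomain": "<SUBDOMAIN>",
          "start_date": "2023-01-01T00:00:00Z",
          "ignore_pagination": true,
          "credentials": {
            "credentials": "api_token",
            "email": "<EMAIL>",
            "api_token": "<API_TOKEN>"
          }
        }
      }'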
  • Frank Mena (08/31/2023, 5:09 PM)
    curl -u airbyte:password --request GET \
      --url http://localhost:8000/v1/health \
      --header 'accept: application/json' \
      --header 'content-type: application/json'
  • Frank Mena (08/31/2023, 5:09 PM)
    I'm trying to get the API working. It's not.
  • Frank Mena (08/31/2023, 5:09 PM)
    The response is an HTML page instead of the health result.
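    For what it's worth, an HTML response on port 8000 usually means the request hit the Airbyte web UI rather than the API server; other messages in this channel reach the OSS API on port 8006, so a variant worth trying (same basic-auth credentials assumed):

    # Sketch: port 8000 serves the Airbyte web app; the airbyte-api-server
    # is commonly exposed on port 8006 in the OSS Docker Compose setup.
    curl -u airbyte:password --request GET \
      --url http://localhost:8006/v1/health \
      --header 'accept: application/json'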
  • Gaëtan Podevijn (09/01/2023, 8:45 AM)
    Hi everyone. Is it possible to use the Config API with a Cloud instance?
  • Thibaut B (09/04/2023, 10:40 AM)
    Hey @kapa.ai, what are the different values of the “status” field for a connection configuration in an octavia YAML?
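    For reference, connection status in Airbyte's configuration maps to the ConnectionStatus enum, which to my knowledge allows active, inactive, and deprecated. A sketch of where it sits in an octavia connection YAML (surrounding keys omitted):

    # Sketch of an octavia connection configuration fragment; "status" is
    # believed to accept one of: active | inactive | deprecated.
    configuration:
      status: active   # or: inactive, deprecated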
  • Christos Gkournelos (09/05/2023, 5:38 PM)
    Hey everyone!! I would like to load a custom-built source connector into my local Airbyte instance through the API for testing purposes. The documentation mentions that this is feasible. Could anyone here assist me?
  • Steven Schmatz (09/05/2023, 7:03 PM)
    I’m looking to set up Prefect and Airbyte Cloud together. Where can I find the host and port of the Airbyte Cloud server? I’m not self-hosting so I don’t have my own host/port. Sorry, I know this is a basic question, but I can’t seem to find this after 15 mins. Also I’m assuming for username + password I should just create a service account?
  • user (09/05/2023, 9:38 PM)
    Yes, there is a Prefect connector for Airbyte. It's called the AirbyteConnectionTask and it's an official integration task in the Prefect project. This task allows you to trigger Airbyte synchronization jobs from Prefect. Here is an example of how to use it:
    from datetime import timedelta

    from prefect.tasks.airbyte import AirbyteConnectionTask  # Prefect 1.x task library

    sync_airbyte_connection = AirbyteConnectionTask(
        max_retries=3, retry_delay=timedelta(seconds=10)
    )

    # The *_connection_id variables are assumed to be defined elsewhere.
    # Sync Airbyte GitHub data
    airbyte_github_sync = sync_airbyte_connection(
        airbyte_server_host="localhost",
        airbyte_server_port=8000,
        airbyte_api_version="v1",
        connection_id=airbyte_github_connection_id,
    )

    # Sync dbt GitHub data
    dbt_github_sync = sync_airbyte_connection(
        connection_id=dbt_github_connection_id,
        airbyte_server_host="localhost",
        airbyte_server_port=8000,
        airbyte_api_version="v1",
    )

    # Sync Prefect GitHub data
    prefect_github_sync = sync_airbyte_connection(
        airbyte_server_host="localhost",
        airbyte_server_port=8000,
        airbyte_api_version="v1",
        connection_id=prefect_github_connection_id,
    )
    You can find more information about this in the [Prefect Airbyte Task documentation](https://docs.prefect.io/api/latest/tasks/airbyte.html#airbyteconnectiontask) and the [Airbyte documentation](https://docs.airbyte.com/operator-guides/using-prefect-task). However, please note that the examples provided are for a local instance of Airbyte. If you're using Airbyte Cloud, you'll need to replace "localhost" and "8000" with the appropriate host and port for your Airbyte Cloud instance.
  • Brandon Freeman (09/06/2023, 12:47 PM)
    Continuously getting a 404 when hitting the localhost:8006/v1/jobs/{job_id} endpoint for valid `job_id`s in my v0.50.14 OSS implementation. Seems that others are experiencing the same with v0.50.24. Hitting /v1/jobs?status=running I'll get the expected output:
    {'data': [
        {
            'connectionId': 'aaeb5f07-76dc-4b0a-9498-40f26c4cdfe4',
            'duration': 'PT0S',
            'jobId': 4442,
            'jobType': 'sync',
            'startTime': '2023-09-05T22:15:43Z',
            'status': 'running'
        },
        {
            'connectionId': '52342a79-b9f7-4258-ad53-d067d0340c6e',
            'duration': 'PT0S',
            'jobId': 4439,
            'jobType': 'sync',
            'startTime': '2023-09-05T16:08:52Z',
            'status': 'running'
        }
    ]}
    Yet when I make the call to /v1/jobs/4442 or /v1/jobs/4439 I simply get a 404 Not Found response.
  • Chris Eik (09/07/2023, 4:20 PM)
    I'm trying to create a Snowflake destination via Terraform. The authorization method and password seem not to be set even though they are provided via Terraform; I tried the three options in the screenshot. Also attaching a screenshot of how the connection looks in the UI afterwards. I also can't find an option to set the Data Staging Method via Terraform. Manually selecting both and putting in a password works fine. Terraform provider 0.3.3 and Airbyte 0.50.27.
  • Carolina Buckler (09/08/2023, 6:06 PM)
    Is there a way to move a connection from one workspace to another without losing the job history?
  • Veeraswamy G (09/08/2023, 7:54 PM)
    Hi Team, We wanted to bring up a question regarding the Airbyte API's configuration. In our current setup, we have the capability to test both source and destination connections within the Configuration API. However, upon reviewing the Airbyte API documentation, we couldn't find a similar option. Could you please clarify whether there is an existing REST endpoint or any plans to introduce one in the future that would allow us to test the source or destination connection in the Airbyte API?
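    For context, the Configuration API check referenced above is exposed on OSS deployments as a check_connection endpoint; a hedged sketch of that call (host, credentials, and ID are placeholders):

    # Sketch of the Config API connection test mentioned in the question;
    # the question observes that the newer Airbyte API documents no equivalent.
    curl -u airbyte:password -X POST \
      http://localhost:8000/api/v1/sources/check_connection \
      -H 'content-type: application/json' \
      -d '{"sourceId": "<SOURCE_ID>"}'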
  • Jason Maddern (09/11/2023, 2:13 AM)
    Hi All, I'm looking to do a partial stream reset (as I only need to change one stream) as shown in the image below, but when doing so I get the error:
    ERROR i.a.w.i.EmptyAirbyteSource(start):101 - The state a legacy one but we are trying to do a partial update, this is not supported.
    Source: Oracle 0.4.0 (latest). Destination: Snowflake 3.1.2 (latest). Airbyte: 0.50.5 (which, agreed, needs an upgrade). Any ideas which component specifically isn't supporting this, so I can upgrade or change it?
  • Steven Murphy (09/11/2023, 2:40 PM)
    With an OSS deployment: can the Airbyte API be used to provision connections that utilise custom connectors?
  • Nguyen Dao (Bach) (09/11/2023, 5:58 PM)
    @kapa.ai With an OSS deployment by Docker Compose on a GCP VM, do I need to expose the Airbyte IP to the public in order to use Terraform to configure Airbyte objects?
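    One pattern that avoids exposing the VM publicly is to tunnel to it (SSH or IAP) and point the Terraform provider at the forwarded local port. A sketch, assuming the airbyte provider's server_url and basic-auth settings (attribute names not verified against a specific provider version; values are placeholders):

    # Sketch: reach the Airbyte API through a tunnel instead of a public IP,
    # e.g. after: gcloud compute ssh airbyte-vm -- -N -L 8006:localhost:8006
    provider "airbyte" {
      server_url = "http://localhost:8006/v1"
      username   = "airbyte"   # OSS basic-auth defaults; an assumption here
      password   = "password"
    }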
  • Vignesh Sivarajah (09/12/2023, 10:35 AM)
    Hi! New to Airbyte; the product looks very good! Is there any Kubernetes operator that allows managing connectors using Kubernetes manifests?
  • Louis Auneau (09/13/2023, 7:44 PM)
    Hello! I have an issue when creating my sources using Terraform. I get a similar issue in both airbyte_source_postgres and airbyte_source_zendesk_support with provider v0.3.4: some attributes are ignored and sent as null to the API, resulting in a failure during the terraform apply. Here are my two sources:
    resource "airbyte_source_zendesk_support" "******_zendesk_support" {
      name         = "Zendesk Support"
      workspace_id = var.workspace_id
    
      configuration = {
        credentials = {
          source_zendesk_support_authentication_api_token = {
            email       = "******"
            api_token   = var.zendesk_api_token
            credentials = "api_token"
          }
        }
        ignore_pagination = false
        source_type       = "zendesk-support"
        start_date        = "2023-05-01T00:00:00Z"
        subdomain         = "******"
      }
    }
    
    resource "airbyte_source_postgres" "******_postgres" {
      workspace_id = var.workspace_id
      name         = "CloudSQL Production"
    
      configuration = {
        host        = var.database_hostname
        port        = 5432
        database    = "******"
        username    = var.database_username
        password    = var.database_password
        source_type = "postgres"
    
        replication_method = {
          source_postgres_update_method_read_changes_using_write_ahead_log_cdc = {
            method = "CDC"
            publication = "******"
            replication_slot = "******"
          }
        }
    
        schemas = [******]
      }
    }
    When running the terraform apply, airbyte_source_zendesk_support.configuration.credentials and airbyte_source_postgres.configuration.replication_method are set to null. Do you have any idea why this happens to these specific blocks? They both seem to respect the schema. Thanks in advance! Louis
  • Vignesh Sivarajah (09/18/2023, 7:16 AM)
    Hi, I am running Airbyte on Kubernetes and wanted to know where I can find the API docs. I want to create a microservice that communicates with the Airbyte server to create source, destination, and connection objects with code. The official docs point to api.airbyte.com: https://reference.airbyte.com/reference/createsource
  • Amit Jaiswal (09/19/2023, 8:52 PM)
    Hello, I am having a problem with Slack notifications. I am currently using Airbyte OSS v0.50.7. From the UI, when I test a Slack webhook it says:
    Webhook test failed. Please verify that the webhook URL is valid
    When I hit the notification API using the following params:
    curl --location --request POST 'http://localhost:80002/api/v1/notifications/try' \
    --header 'Authorization: Basic ' \
    --header 'Content-Type: application/json' \
    --data-raw '{
        "notificationType": "slack",
        "sendOnSuccess": true,
        "sendOnFailure": true,
        "slackConfiguration": {
            "webhook":    "<https://hooks.slack.com/services/><EXTRA_CHARS>"
        },
        "customerioConfiguration": {}
    }'
    I get an empty response with a 200 status. I checked the server logs and see nothing there. Is there a way to fix this?
  • Rutul Saraiya (09/21/2023, 5:24 AM)
    Hello, I am using Airbyte OSS, installed locally using Docker. I am able to execute the list connections, list destinations, and list sources APIs. But when I try to create a source via http://localhost:8006/v1/sources with the following headers:
    Authorization:Basic xxxxxxxxxxxxxxx
    accept:application/json
    content-type:application/json
    and this request body:
    {
      "configuration": {
        "sourceType": "mysql",
        "port": 3306,
        "ssl_mode": {
          "mode": "preferred"
        },
        "replication_method": {
          "method": "STANDARD"
        },
        "tunnel_method": {
          "tunnel_method": "NO_TUNNEL"
        },
        "host": "xx.xx.xx.xx",
        "database": "sourcedb",
        "username": "root",
        "password": "xxxxx"
      },
      "name": "servermysql",
      "workspaceId": "544cc0ed-7f05-4949-9e60-2a814f90c035"
    }
    I always get the following response:
    {
      "type": "<https://reference.airbyte.com/reference/errors>",
      "title": "unexpected-problem",
      "status": 500,
      "detail": "An unexpected problem has occurred. If this is an error that needs to be addressed, please submit a pull request or github issue."
    }
    Can anyone advise on this issue?
  • Shubham (09/21/2023, 7:35 AM)
    Hello, I am using AirbyteTriggerSyncOperator to trigger syncs of my Airbyte jobs using Airflow. I want to trigger these tasks inside a DAG conditionally, i.e. if a condition is false I want to skip the task that triggers the sync job and move on with the rest of the DAG. How do I do it?
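    A common pattern for this is BranchPythonOperator: branch to either the sync task or a no-op, then rejoin with a trigger rule that tolerates the skipped branch. A minimal sketch, assuming a hypothetical condition read from the DAG run config and a placeholder connection ID:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.empty import EmptyOperator
    from airflow.operators.python import BranchPythonOperator
    from airflow.providers.airbyte.operators.airbyte import AirbyteTriggerSyncOperator

    def _choose(**context):
        # Hypothetical predicate; return the task_id of the branch to run.
        should_sync = context["dag_run"].conf.get("sync", True)
        return "trigger_airbyte_sync" if should_sync else "skip_sync"

    with DAG("conditional_airbyte", start_date=datetime(2023, 1, 1), schedule=None) as dag:
        branch = BranchPythonOperator(task_id="branch", python_callable=_choose)
        sync = AirbyteTriggerSyncOperator(
            task_id="trigger_airbyte_sync",
            airbyte_conn_id="airbyte_default",
            connection_id="<CONNECTION_ID>",  # placeholder
        )
        skip = EmptyOperator(task_id="skip_sync")
        # Rejoin: run if no upstream failed and at least one branch succeeded.
        join = EmptyOperator(task_id="join", trigger_rule="none_failed_min_one_success")

        branch >> [sync, skip] >> join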
  • Cyprien Barbault (09/21/2023, 9:50 AM)
    Hello, I'm trying to move my Airbyte config to the Terraform provider, but I can't manage to find where to define the normalization process for the connection. Did someone manage to make it work?
  • Cyprien Barbault (09/21/2023, 12:59 PM)
    Hey again, when deploying a big Salesforce <> Redshift connection, terraform fails with a timeout error. Is there a way to configure this?
    context deadline exceeded (Client.Timeout exceeded while awaiting headers)
  • Nguyen Dao (Bach) (09/25/2023, 11:15 PM)
    Hello, has anyone tried to create custom connectors using Terraform, without clicking manually in the UI?
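    The Terraform provider did not appear to cover custom connector definitions at this point; one workaround is the internal Config API's create-custom endpoint. A hedged sketch against an OSS instance (endpoint shape per the Config API spec; all values are placeholders):

    # Sketch: register a custom source definition via the internal Config API;
    # sources can then be created from this definition.
    curl -u airbyte:password -X POST \
      http://localhost:8000/api/v1/source_definitions/create_custom \
      -H 'content-type: application/json' \
      -d '{
        "workspaceId": "<WORKSPACE_ID>",
        "sourceDefinition": {
          "name": "my-custom-source",
          "dockerRepository": "myorg/source-custom",
          "dockerImageTag": "0.1.0",
          "documentationUrl": "https://example.com/docs"
        }
      }'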
  • Tim Clemans (09/25/2023, 11:36 PM)
    Hello, I am trying to use the Airbyte API in a script to create 4 different connections between 4 Postgres sources and the same BigQuery destination. Once I was able to successfully create all 4 connections, but I usually get one of these errors: data: 'stream timeout' or data: 'upstream request timeout'. I believe the create connection API call is failing. Airbyte OSS version: 0.50.30. Postgres source version: 3.1.8. BigQuery destination version: 2.0.16. Any ideas?
  • KENNEDY SITATI (09/26/2023, 7:43 AM)
    Hello, how do you specify only the list of tables you need to sync in the Airbyte Dagster integration using airbyte_sync_op.configured?
  • Anil Kumar (09/27/2023, 9:26 AM)
    Hello, I am working on a product where I want registered users to be able to connect their business profiles (Facebook, Snapchat) so we can fetch their analytics and store them in our database. Does Airbyte provide any API that we can use as a third-party API in our website to make this work? Please do reply.