# ask-community-for-troubleshooting
  • Johannes Müller (02/10/2023, 1:25 PM)
    Hi, how do I change the webserver's port in the docker compose setup? The docs say I have to change the .env file's WEBAPP_URL, which I did, but the webserver is still on the original port 8000:
    airbyte_change_port.mp4
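    For context, a hedged note: WEBAPP_URL generally only affects the links Airbyte generates, while the port Docker publishes comes from the ports mapping in docker-compose.yaml. A minimal sketch (the service name and container port depend on the Airbyte version, so treat this as a hypothetical fragment):
      webapp:
        ports:
          - "8001:80"   # publish host port 8001 instead of the default 8000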
  • Dusty Shapiro (02/10/2023, 1:35 PM)
    For K8s deployers: Is there a way to specify an automatic cleanup of pods upon completion?
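    For context, a hedged sketch of one generic approach (Airbyte's Kubernetes manifests have also shipped a pod-sweeper container for this; whether a given deploy includes it is an assumption): periodically sweep completed job pods in the Airbyte namespace, for example from a CronJob.
      kubectl delete pod -n airbyte --field-selector=status.phase==Succeeded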
  • Alexander Ettingshausen (02/10/2023, 1:36 PM)
    Hi there, we are currently trying to set up a connection from Amazon Ads to BigQuery with GCS Staging. The sync returns the following error right at the start:
    2023-02-10 122132 INFO i.a.v.j.JsonSchemaValidator(test):130 - JSON schema validation failed. errors: $.method: must be a constant value Standard
    2023-02-10 122132 INFO i.a.v.j.JsonSchemaValidator(test):130 - JSON schema validation failed. errors: $.credential.hmac_key_secret: object found, string expected, $.credential.hmac_key_access_id: object found, string expected
    I simply copied the HMAC key information into the input fields of the destination, so they should be correct. Has anyone successfully established a BigQuery destination with GCS Staging enabled?
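    For reference, a hedged sketch of the shape the validator appears to expect, with the field names inferred from the error paths above (the exact spec keys may differ by connector version); the point is that hmac_key_access_id and hmac_key_secret should be plain strings, not nested objects:
      "loading_method": {
        "method": "GCS Staging",
        "credential": {
          "credential_type": "HMAC_KEY",
          "hmac_key_access_id": "GOOG1EXAMPLEACCESSID",
          "hmac_key_secret": "exampleSecretValue"
        }
      }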
  • Justen Walker (02/10/2023, 2:16 PM)
    I'm following up on this PR #22521 to get it triaged/reviewed. Is there a channel for Airbyte dev and/or Kubernetes? It's a relatively minor addition that makes the Airbyte Helm chart usable with an external Postgres DB with SSL (i.e. AWS RDS).
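    For context, a hedged sketch of the kind of connection string such a setup needs; the ssl/sslmode parameters are standard PostgreSQL JDBC options, and the RDS hostname is a placeholder:
      DATABASE_URL=jdbc:postgresql://mydb.xxxxxxxx.us-east-1.rds.amazonaws.com:5432/airbyte?ssl=true&sslmode=require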
  • Dany Chepenko (02/10/2023, 3:50 PM)
    Hey folks, has anyone had an issue with Amplitude recently? The connector suddenly dropped without us changing a line of code.
  • Luiz Aléssio Cesa (02/10/2023, 5:03 PM)
    Hey friends, when replicating data using CDC, in cases where there are many updates in the tracked database but only a tiny number of them touch the table(s) and schema(s) the connector is capturing changes for, the connector reads from the database transaction log as usual but rarely emits change records. That means no offset updates are committed and the connector never gets an opportunity to send the latest retrieved LSN back to the database. The database then retains WAL files containing events the connector has already processed, which can cause storage problems. I've tried the "LSN commit behaviour: While reading Data" option, but the database's Oldest Replication Slot Lag kept growing and free storage kept shrinking. How would you recommend dealing with this?
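    For anyone watching this, a hedged sketch of a standard Postgres query for tracking how far a replication slot lags behind the current WAL position (the catalog views are standard; the slot name is whatever the connector created):
      SELECT slot_name,
             pg_size_pretty(pg_wal_lsn_diff(pg_current_wal_lsn(), confirmed_flush_lsn)) AS slot_lag
      FROM pg_replication_slots;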
  • Adrian Bakula (02/10/2023, 5:21 PM)
    Hello all, bump on this message: https://airbytehq.slack.com/archives/C021JANJ6TY/p1675179716253569. This was brought up by an auditing team on our side, so I would love to have an answer for this.
  • Sam Stoelinga (02/10/2023, 5:21 PM)
    Any docs on how to add an icon to a connector? Or a Python example connector that has done this? I'm looking at several destinations, but none of them seem to have an icon. CC @Marcos Marx (Airbyte) @Joe Reuter
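    For context, a hedged sketch of how connector icons were wired up around this time (file locations and field names may have moved since, and every value below is a placeholder): the SVG goes into the platform's shared icons directory, and the connector's entry in the definitions YAML references it by file name, roughly like:
      - name: My Destination
        destinationDefinitionId: 00000000-0000-0000-0000-000000000000
        dockerRepository: airbyte/destination-mydest
        dockerImageTag: 0.1.0
        icon: mydest.svg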
  • John Munson (02/10/2023, 6:43 PM)
    Good afternoon. I am wondering if anyone has had any success with configuring the Airbyte database to store its internal config in a schema other than public. This isn't specifically addressed in the docs, but using the .env file I was able to use the currentSchema parameter in the database connection string to accomplish this. Something like...
    DATABASE_URL=jdbc:postgresql://db:5432/airbyte?currentSchema=airbyte_metadata
    Airbyte successfully connects to the airbyte_metadata schema and bootstraps the necessary tables. The problem, though, seems to be that as Airbyte continues initializing/bootstrapping, there are things further downstream that still expect to find these tables in the public schema. Has anyone else come across this issue? Or does it seem that I have something configured wrong and this should totally be possible? Thanks! (By the way, if it wasn't clear, I am using Airbyte Open Source.)
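    A hedged sketch of one workaround worth testing (an assumption, not an officially supported configuration): make the non-public schema the Airbyte database user's default, so any component that ignores currentSchema still resolves the same tables.
      CREATE SCHEMA IF NOT EXISTS airbyte_metadata;
      ALTER ROLE airbyte SET search_path TO airbyte_metadata;  -- 'airbyte' is the assumed DB user name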
  • Sharath Chandra (02/10/2023, 6:53 PM)
    There has been an error; it says the source connection is broken. I see something like the screenshot below. For the remaining records that were emitted but not committed, will Airbyte try to load them in subsequent attempts?
  • Camilo Atencio (02/10/2023, 6:55 PM)
    Hi everyone, I'm experiencing issues when trying to use basic normalization. I'm using Airbyte 0.42.23, deployed on AWS Kubernetes. I'm trying a connection from e2e-testing to Snowflake with basic normalization enabled, and the process fails in the normalization step. The message is:
    Additional Failure Information: message='io.temporal.serviceclient.CheckedExceptionWrapper: java.util.concurrent.ExecutionException: io.airbyte.workers.exception.WorkerException: Normalization Failed.', type='java.lang.RuntimeException', nonRetryable=false
    It's not resource-related; it's the only job I run, and any other run without normalization succeeds. If you look at the logs, it seems like the normalization runs fine, and I actually see data in the table, so I'm not sure why the normalization is being marked as failed. Does anyone have any ideas?
    78c42552_eb23_4533_bf65_f2deca9608b2_logs_1737_txt
  • Wesley Gormley (02/10/2023, 7:04 PM)
    Is there a reason why SFTP Bulk is not included in the default connectors?
  • mohd shaikh (02/10/2023, 7:09 PM)
    Hi everyone! Has anyone tried connecting MongoDB Atlas as a source in Airbyte? I'm struggling with the connection URL. Any help would be appreciated.
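    For reference, a hedged sketch of the standard Atlas SRV connection string format (all values are placeholders; whether the connector takes the full URL or separate host/credential fields depends on the connector version):
      mongodb+srv://<user>:<password>@cluster0.abcde.mongodb.net/mydatabase?retryWrites=true&w=majority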
  • Marco de Nijs (02/10/2023, 7:55 PM)
    I have a (stupid) question about the pricing calculator. The first slider asks me to fill in the 'Data to replicate from your database sources'. Does that mean the full size of the database that I want airbyte to sync? Or just the amount of new data added each month? In other words, does the calculator keep incremental sync in mind or how do I use this?
  • Nag R (02/10/2023, 8:08 PM)
    When Airbyte sends a notification to the webhook (configured to receive sync notifications), does it post any data to the webhook?
  • Francisco Viera (02/10/2023, 9:04 PM)
    Hello guys, there is an issue with the Airbyte Helm chart: couldn't find key STRICT_COMPARISON_NORMALIZATION_TAG in ConfigMap airbyte/airbyte-airbyte-env: CreateContainerConfigError. Please approve this PR: https://github.com/airbytehq/airbyte/pull/22697
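    A hedged sketch of a possible stop-gap until the chart fix lands (an assumption, not an official workaround): add the missing key to the generated ConfigMap yourself and restart the affected pods; the empty value below is only a placeholder.
      kubectl -n airbyte patch configmap airbyte-airbyte-env \
        --type merge -p '{"data":{"STRICT_COMPARISON_NORMALIZATION_TAG":""}}'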
  • Jake Vernon (02/10/2023, 10:36 PM)
    Hi, I deployed Airbyte on DigitalOcean using Docker, and I'm trying to set up SSL and a domain to point to the container. Any good docs on this?
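    For context, a common pattern is to terminate TLS in a reverse proxy in front of the published webapp port; a hedged nginx sketch (the domain, certificate paths, and Let's Encrypt setup are assumptions, and the upstream assumes the default port 8000):
      server {
          listen 443 ssl;
          server_name airbyte.example.com;
          ssl_certificate     /etc/letsencrypt/live/airbyte.example.com/fullchain.pem;
          ssl_certificate_key /etc/letsencrypt/live/airbyte.example.com/privkey.pem;
          location / {
              proxy_pass http://localhost:8000;
              proxy_set_header Host $host;
          }
      }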
  • Jake Vernon (02/10/2023, 10:37 PM)
    I saw the webapp_url in the .env and I did configure that to use my domain, but that doesn't do anything.
  • Rytis Zolubas (02/11/2023, 9:26 AM)
    Hello! I am getting this error on a Docker deployment: Airbyte v0.40.19, Google Ads 0.2.4, Snowflake 0.4.40. Any ideas what is happening?
    2023-02-11 09:22:27 [1;31mERROR[m i.a.c.t.ConnectionManagerUtils(getWorkflowState):222 - Exception thrown while checking workflow state for connection id 5cbf7ac0-8a79-4394-95cb-d5f435c42668
    io.temporal.client.WorkflowServiceException: workflowId='connection_manager_5cbf7ac0-8a79-4394-95cb-d5f435c42668', runId='', workflowType='ConnectionManagerWorkflow'}
    	at io.temporal.internal.sync.WorkflowStubImpl.throwAsWorkflowFailureExceptionForQuery(WorkflowStubImpl.java:379) ~[temporal-sdk-1.17.0.jar:?]
    	at io.temporal.internal.sync.WorkflowStubImpl.query(WorkflowStubImpl.java:274) ~[temporal-sdk-1.17.0.jar:?]
    	at io.temporal.internal.sync.WorkflowInvocationHandler$SyncWorkflowInvocationHandler.queryWorkflow(WorkflowInvocationHandler.java:309) ~[temporal-sdk-1.17.0.jar:?]
    	at io.temporal.internal.sync.WorkflowInvocationHandler$SyncWorkflowInvocationHandler.invoke(WorkflowInvocationHandler.java:272) ~[temporal-sdk-1.17.0.jar:?]
    	at io.temporal.internal.sync.WorkflowInvocationHandler.invoke(WorkflowInvocationHandler.java:178) ~[temporal-sdk-1.17.0.jar:?]
    	at jdk.proxy2.$Proxy91.getState(Unknown Source) ~[?:?]
    	at io.airbyte.commons.temporal.ConnectionManagerUtils.getWorkflowState(ConnectionManagerUtils.java:220) ~[io.airbyte-airbyte-commons-temporal-0.40.19.jar:?]
    	at io.airbyte.commons.temporal.ConnectionManagerUtils.isWorkflowStateRunning(ConnectionManagerUtils.java:228) ~[io.airbyte-airbyte-commons-temporal-0.40.19.jar:?]
    	at io.airbyte.commons.temporal.TemporalClient.startNewManualSync(TemporalClient.java:226) ~[io.airbyte-airbyte-commons-temporal-0.40.19.jar:?]
    	at io.airbyte.server.scheduler.TemporalEventRunner.startNewManualSync(TemporalEventRunner.java:27) ~[io.airbyte-airbyte-server-0.40.19.jar:?]
    	at io.airbyte.server.handlers.SchedulerHandler.submitManualSyncToWorker(SchedulerHandler.java:417) ~[io.airbyte-airbyte-server-0.40.19.jar:?]
    	at io.airbyte.server.handlers.SchedulerHandler.syncConnection(SchedulerHandler.java:347) ~[io.airbyte-airbyte-server-0.40.19.jar:?]
    	at io.airbyte.server.apis.ConnectionApiController.lambda$syncConnection$7(ConnectionApiController.java:77) ~[io.airbyte-airbyte-server-0.40.19.jar:?]
    	at io.airbyte.server.apis.ApiHelper.execute(ApiHelper.java:17) ~[io.airbyte-airbyte-server-0.40.19.jar:?]
    	at io.airbyte.server.apis.ConnectionApiController.syncConnection(ConnectionApiController.java:77) ~[io.airbyte-airbyte-server-0.40.19.jar:?]
    	at jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:104) ~[?:?]
    	at java.lang.reflect.Method.invoke(Method.java:578) ~[?:?]
    	at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory.lambda$static$0(ResourceMethodInvocationHandlerFactory.java:52) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:124) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:167) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$TypeOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:219) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:79) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:469) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:391) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:80) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.server.ServerRuntime$1.run(ServerRuntime.java:253) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.internal.Errors$1.call(Errors.java:248) ~[jersey-common-2.31.jar:?]
    	at org.glassfish.jersey.internal.Errors$1.call(Errors.java:244) ~[jersey-common-2.31.jar:?]
    	at org.glassfish.jersey.internal.Errors.process(Errors.java:292) ~[jersey-common-2.31.jar:?]
    	at org.glassfish.jersey.internal.Errors.process(Errors.java:274) ~[jersey-common-2.31.jar:?]
    	at org.glassfish.jersey.internal.Errors.process(Errors.java:244) ~[jersey-common-2.31.jar:?]
    	at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:265) ~[jersey-common-2.31.jar:?]
    	at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:232) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:680) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.servlet.WebComponent.serviceImpl(WebComponent.java:394) ~[jersey-container-servlet-core-2.31.jar:?]
    	at org.glassfish.jersey.servlet.WebComponent.service(WebComponent.java:346) ~[jersey-container-servlet-core-2.31.jar:?]
    	at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:366) ~[jersey-container-servlet-core-2.31.jar:?]
    	at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:319) ~[jersey-container-servlet-core-2.31.jar:?]
    	at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:205) ~[jersey-container-servlet-core-2.31.jar:?]
    	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:763) ~[jetty-servlet-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:569) ~[jetty-servlet-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) ~[jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1377) ~[jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) ~[jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:507) ~[jetty-servlet-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) ~[jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1292) ~[jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) ~[jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) ~[jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.Server.handle(Server.java:501) ~[jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383) ~[jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:556) ~[jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375) ~[jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273) ~[jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) ~[jetty-io-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) ~[jetty-io-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) ~[jetty-io-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:336) ~[jetty-util-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:313) ~[jetty-util-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:171) ~[jetty-util-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:129) ~[jetty-util-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:375) ~[jetty-util-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806) ~[jetty-util-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938) ~[jetty-util-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at java.lang.Thread.run(Thread.java:1589) ~[?:?]
    Caused by: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 9.999985556s. [closed=[], open=[[remote_addr=airbyte-temporal/172.26.0.8:7233]]]
    	at io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:271) ~[grpc-stub-1.50.2.jar:1.50.2]
    	at io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:252) ~[grpc-stub-1.50.2.jar:1.50.2]
    	at io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:165) ~[grpc-stub-1.50.2.jar:1.50.2]
    	at io.temporal.api.workflowservice.v1.WorkflowServiceGrpc$WorkflowServiceBlockingStub.queryWorkflow(WorkflowServiceGrpc.java:4099) ~[temporal-serviceclient-1.17.0.jar:?]
    	at io.temporal.internal.client.external.GenericWorkflowClientImpl.lambda$query$9(GenericWorkflowClientImpl.java:206) ~[temporal-sdk-1.17.0.jar:?]
    	at io.temporal.internal.retryer.GrpcSyncRetryer.retry(GrpcSyncRetryer.java:67) ~[temporal-serviceclient-1.17.0.jar:?]
    	at io.temporal.internal.retryer.GrpcRetryer.retryWithResult(GrpcRetryer.java:60) ~[temporal-serviceclient-1.17.0.jar:?]
    	at io.temporal.internal.client.external.GenericWorkflowClientImpl.query(GenericWorkflowClientImpl.java:201) ~[temporal-sdk-1.17.0.jar:?]
    	at io.temporal.internal.client.RootWorkflowClientInvoker.query(RootWorkflowClientInvoker.java:140) ~[temporal-sdk-1.17.0.jar:?]
    	at io.temporal.internal.sync.WorkflowStubImpl.query(WorkflowStubImpl.java:270) ~[temporal-sdk-1.17.0.jar:?]
    	... 63 more
  • Praveenraaj K S (02/11/2023, 10:04 AM)
    Hi team, we have a MySQL (v1.0.18) to BigQuery (v1.2.9) connection for a one-time dump of a huge table containing over 170 crore (1.7 billion) records (Airbyte version 0.40.28). I haven't seen any errors during the syncing process, but after 22+ hours of syncing, with the log showing around 10 lakh (1 million) rows, data reading has been frozen for a very long time (around 6 hours, still no movement of data). Can someone help me resolve this?
  • Ankit Kumar (02/11/2023, 1:23 PM)
    Hi team @airbyte, I am getting an error while working with the Mongo connector. The connection tests failed. State code: 18; Message: Exception authenticating MongoCredential{mechanism=SCRAM-SHA-1, userName='itachi', source='admin', password=<hidden>, mechanismProperties=<hidden>}
    Logs:
    2023-02-11 130837 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):167 - completed source: class io.airbyte.integrations.source.mongodb.MongoDbSource
    2023-02-11 130838 INFO i.a.w.g.DefaultCheckConnectionWorker(run):114 - Check connection job received output: io.airbyte.config.StandardCheckConnectionOutput@5515c9e[status=failed,message=State code: 18; Message: Exception authenticating MongoCredential{mechanism=SCRAM-SHA-1, userName='itachi', source='admin', password=<hidden>, mechanismProperties=<hidden>}]
    2023-02-11 130838 INFO i.a.w.t.TemporalAttemptExecution(get):163 - Stopping cancellation check scheduling...
    2023-02-11 130838 INFO i.a.c.i.LineGobbler(voidCall):114 -
    2023-02-11 130838 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- END CHECK -----
    2023-02-11 130838 INFO i.a.c.i.LineGobbler(voidCall):114 -
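    State code 18 is MongoDB's generic authentication failure; a hedged sketch of verifying the same credentials and authentication database outside Airbyte first (the host and port are placeholders):
      mongosh "mongodb://your-host:27017/?authSource=admin" -u itachi -p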
  • Lior Chen (02/11/2023, 1:46 PM)
    Hello all, I'm getting errors in my Shopify connectors:
    2023-02-11 13:20:11 normalization > WARN: Unknown type for column source_url at abandoned_checkouts
    2023-02-11 13:20:11 normalization > WARN: Unknown type for column updated_at at abandoned_checkouts
    2023-02-11 13:20:11 normalization > WARN: Unknown type for column location_id at abandoned_checkouts
    2023-02-11 13:20:11 normalization > WARN: Unknown type for column source_name at abandoned_checkouts
    2023-02-11 13:20:11 normalization > WARN: Unknown type for column total_price at abandoned_checkouts
    2023-02-11 13:20:11 normalization > WARN: Unknown type for column completed_at at abandoned_checkouts
    2023-02-11 13:20:11 normalization > WARN: Unknown type for column landing_site at abandoned_checkouts
    2023-02-11 13:20:11 normalization > WARN: Unknown type for column total_weight at abandoned_checkouts
    2023-02-11 13:20:11 normalization > WARN: Unknown type for column referring_site at abandoned_checkouts
    .....
    which eventually cause errors during the dbt execution:
    2023-02-11 13:20:45 normalization >   001790 (42601): SQL compilation error:
    2023-02-11 13:20:45 normalization >   inconsistent data type for result columns for set operator input branches, expected VARCHAR(16777216), got VARIANT for expression [{2}] branch {3}
    2023-02-11 13:20:45 normalization > Database Error in model ABANDONED_CHECKOUTS_SCD (models/generated/airbyte_incremental/scd/ORG_6950_01GRBK7DJ1QN50C5F3GXAVWAWQ/ABANDONED_CHECKOUTS_SCD.sql)
    This is happening for basically all streams that synced successfully until now, and across similar connectors, so it doesn't seem like a local data type collision but rather a separate problem. Any idea, or suggestions on how to debug this?
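    For debugging, a hedged sketch of a Snowflake query to compare the declared types of the affected columns across the normalized tables and spot the VARCHAR-vs-VARIANT mismatch the error reports (the table and column names below are placeholders taken from the log above):
      SELECT table_schema, table_name, column_name, data_type
      FROM information_schema.columns
      WHERE table_name ILIKE 'ABANDONED_CHECKOUTS%'
        AND column_name IN ('SOURCE_URL', 'TOTAL_PRICE', 'LOCATION_ID');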
  • Chris (02/11/2023, 6:23 PM)
    Hello guys! I am using Airbyte Open Source on Google Compute Engine. Recently I noticed that my Airbyte sync fails because the disk space is full (apparently Airbyte keeps taking more and more space after each sync). I could get it to work again after deleting the workspace volume, but I thought it would be better to use an external database. So I have set up a PostgreSQL instance on Google Cloud SQL, but when I try to run docker compose up -d, the airbyte-bootloader container doesn't seem to get created properly. I followed the steps in this documentation:
    https://docs.airbyte.com/operator-guides/configuring-airbyte-db/
    https://cloud.google.com/sql/docs/postgres/connect-instance-compute-engine
    I am using the Private IP method and put my Compute Engine instance and Cloud SQL on the same network. I also changed .env:
    DATABASE_USER=username
    DATABASE_PASSWORD=password
    DATABASE_HOST=project-id:us-central1:airbyte-sql
    DATABASE_PORT:5432
    DATABASE_DB=airbyte
    DATABASE_URL=jdbc:postgresql://project-id:us-central1:airbyte-sql:5432/airbyte
    Does anyone know why it is not working, and know how to set it up properly?
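    A hedged sketch of what those entries usually look like with the Private IP method (an assumption: DATABASE_HOST should be the Cloud SQL instance's private IP address, while the project-id:region:instance connection name is only used by the Cloud SQL Auth Proxy):
      DATABASE_USER=username
      DATABASE_PASSWORD=password
      DATABASE_HOST=10.x.x.x
      DATABASE_PORT=5432
      DATABASE_DB=airbyte
      DATABASE_URL=jdbc:postgresql://10.x.x.x:5432/airbyte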
  • Martin Larsson (02/11/2023, 7:09 PM)
    I am working on a low-code source connector and I just ran a sync with some unwanted behavior. For testing I set up E2E as the destination and also tried syncing to BigQuery. When running with the E2E destination, it looks like I get the following message for each record, or maybe per page:
    INFO i.a.v.j.JsonSchemaValidator(test):130 - JSON schema validation failed. 
    errors: $: null found, string expected
    When I run the sync with BigQuery as the destination I get the same messages; the sync succeeds, but about half of the rows end up empty.
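    For context, a hedged guess at the usual cause (an assumption about this particular connector): if the API can legitimately return null for a field, the stream's declared JSON schema has to allow it, otherwise the validator logs "null found, string expected" for every such record. A minimal sketch, with my_field as a placeholder name:
      "properties": {
        "my_field": { "type": ["null", "string"] }
      }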
  • Dallin Bentley (02/12/2023, 12:40 AM)
    Hey all, I'm trying to run a connection from Mixpanel to Postgres. I was trying to test locally and ran into errors, so I wanted to test in the cloud version of Airbyte, but the sync has been hung up for about 3 hours now with nothing but this message in the logs:
    2023-02-11 22:00:04 WARN c.n.s.JsonMetaSchema(newValidator):278 - Unknown keyword example - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
    Trying to get all of the streams available for Mixpanel... I know that there are api rate limits and such, so wanted to double check and see if others have had issues. This is just a free Mixpanel plan as well.
  • Kaiming Wan (02/12/2023, 9:08 AM)
    I can't start Airbyte with the basic steps: 1. clone airbyte, 2. docker-compose up. I added the detailed info in a GitHub issue; can anyone give me some help?
  • Kaiming Wan (02/12/2023, 9:08 AM)
    https://github.com/airbytehq/airbyte/issues/22651
  • Paulo José Ianes Bernardo Filho (02/12/2023, 1:15 PM)
    Hello guys!! How can I manually set the S3 keys in my destination connector? I am receiving this error: "Retrying task after failure: The specified key does not exist", and it happens only on my machine.
  • Carolina Buckler (02/12/2023, 2:25 PM)
    Getting this error when trying to set up a connection to the Facebook API, but the Facebook Ads app page shows the app is indeed on v16.0. There are no other options for me to select, and I don't see an option to specify an API version in Airbyte.
    FacebookAPIException('Error: 2635, (#2635) Your app has been upgraded to version v16.0, please use this version or newer. This can be verified in the settings tab on the App Dashboard.')
  • Dusty Shapiro (02/12/2023, 3:25 PM)
    What's the best way to bump our Postgres normalization K8s pod's Python image from python:3.9.9-slim-bullseye to python:3.9.16-slim-bullseye? Our vulnerability detection software is throwing quite a few warnings for the former and recommends bumping to the latter. We're on K8s via Helm deploy. Thanks!