# ask-community-for-troubleshooting
  • Mehmet Berk Souksu

    03/01/2023, 9:29 AM
    Hello all, I started to use Airbyte with the Freshdesk source. What I could not figure out is that even though most of the streams have id and updated_at columns, I cannot use an incremental refresh. What is the reason for this? 😅
  • Bob De Schutter

    03/01/2023, 9:54 AM
    Hi, I have deployed Airbyte using docker compose on an Azure VM running Ubuntu 22.04 LTS. The Airbyte database is hosted on an Azure flexible PostgreSQL database. Everything seems to be working fine at first sight: I can access the Airbyte UI, but often while working in the UI an error pops up and we get the 'Oops! Something went wrong' message. I checked the server logs, and this is the error that we keep getting from time to time:
    2023-03-01 09:52:20 ERROR i.a.s.a.ApiHelper(execute):37 - Unexpected Exception
    java.lang.RuntimeException: Failed to fetch remote definitions
            at io.airbyte.config.init.RemoteDefinitionsProvider.getRemoteDefinitionCatalog(RemoteDefinitionsProvider.java:122) ~[io.airbyte.airbyte-config-init-0.41.0.jar:?]
            at io.airbyte.config.init.RemoteDefinitionsProvider.getSourceDefinitionsMap(RemoteDefinitionsProvider.java:65) ~[io.airbyte.airbyte-config-init-0.41.0.jar:?]
            at io.airbyte.config.init.RemoteDefinitionsProvider.getSourceDefinitions(RemoteDefinitionsProvider.java:91) ~[io.airbyte.airbyte-config-init-0.41.0.jar:?]
            at io.airbyte.commons.server.handlers.SourceDefinitionsHandler.getLatestSources(SourceDefinitionsHandler.java:155) ~[io.airbyte-airbyte-commons-server-0.41.0.jar:?]
            at io.airbyte.commons.server.handlers.SourceDefinitionsHandler.listLatestSourceDefinitions(SourceDefinitionsHandler.java:151) ~[io.airbyte-airbyte-commons-server-0.41.0.jar:?]
            at io.airbyte.server.apis.ApiHelper.execute(ApiHelper.java:23) ~[io.airbyte-airbyte-server-0.41.0.jar:?]
            at io.airbyte.server.apis.SourceDefinitionApiController.listLatestSourceDefinitions(SourceDefinitionApiController.java:96) ~[io.airbyte-airbyte-server-0.41.0.jar:?]
            at io.airbyte.server.apis.$SourceDefinitionApiController$Definition$Exec.dispatch(Unknown Source) ~[io.airbyte-airbyte-server-0.41.0.jar:?]
            at io.micronaut.context.AbstractExecutableMethodsDefinition$DispatchedExecutableMethod.invoke(AbstractExecutableMethodsDefinition.java:371) ~[micronaut-inject-3.8.5.jar:3.8.5]
            at io.micronaut.context.DefaultBeanContext$4.invoke(DefaultBeanContext.java:594) ~[micronaut-inject-3.8.5.jar:3.8.5]
            at io.micronaut.web.router.AbstractRouteMatch.execute(AbstractRouteMatch.java:246) ~[micronaut-router-3.8.5.jar:3.8.5]
            at io.micronaut.web.router.RouteMatch.execute(RouteMatch.java:111) ~[micronaut-router-3.8.5.jar:3.8.5]
            at io.micronaut.http.context.ServerRequestContext.with(ServerRequestContext.java:103) ~[micronaut-http-3.8.5.jar:3.8.5]
            at io.micronaut.http.server.RouteExecutor.lambda$executeRoute$14(RouteExecutor.java:659) ~[micronaut-http-server-3.8.5.jar:3.8.5]
            at reactor.core.publisher.FluxDeferContextual.subscribe(FluxDeferContextual.java:49) ~[reactor-core-3.5.0.jar:3.5.0]
            at reactor.core.publisher.InternalFluxOperator.subscribe(InternalFluxOperator.java:62) ~[reactor-core-3.5.0.jar:3.5.0]
            at reactor.core.publisher.FluxSubscribeOn$SubscribeOnSubscriber.run(FluxSubscribeOn.java:194) ~[reactor-core-3.5.0.jar:3.5.0]
            at io.micronaut.reactive.reactor.instrument.ReactorInstrumentation.lambda$init$0(ReactorInstrumentation.java:62) ~[micronaut-runtime-3.8.5.jar:3.8.5]
            at reactor.core.scheduler.WorkerTask.call(WorkerTask.java:84) ~[reactor-core-3.5.0.jar:3.5.0]
            at reactor.core.scheduler.WorkerTask.call(WorkerTask.java:37) ~[reactor-core-3.5.0.jar:3.5.0]
            at io.micronaut.scheduling.instrument.InvocationInstrumenterWrappedCallable.call(InvocationInstrumenterWrappedCallable.java:53) ~[micronaut-context-3.8.5.jar:3.8.5]
            at java.util.concurrent.FutureTask.run(FutureTask.java:317) ~[?:?]
            at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
            at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
            at java.lang.Thread.run(Thread.java:1589) ~[?:?]
    Caused by: java.net.http.HttpTimeoutException: request timed out
            at jdk.internal.net.http.HttpClientImpl.send(HttpClientImpl.java:844) ~[java.net.http:?]
            at jdk.internal.net.http.HttpClientFacade.send(HttpClientFacade.java:123) ~[java.net.http:?]
            at io.airbyte.config.init.RemoteDefinitionsProvider.getRemoteDefinitionCatalog(RemoteDefinitionsProvider.java:113) ~[io.airbyte.airbyte-config-init-0.41.0.jar:?]
            ... 24 more
    2023-03-01 09:52:20 ERROR i.a.s.e.UncaughtExceptionHandler(handle):28 - Uncaught exception
    java.lang.RuntimeException: Failed to fetch remote definitions
            at io.airbyte.config.init.RemoteDefinitionsProvider.getRemoteDefinitionCatalog(RemoteDefinitionsProvider.java:122) ~[io.airbyte.airbyte-config-init-0.41.0.jar:?]
            at io.airbyte.config.init.RemoteDefinitionsProvider.getSourceDefinitionsMap(RemoteDefinitionsProvider.java:65) ~[io.airbyte.airbyte-config-init-0.41.0.jar:?]
            at io.airbyte.config.init.RemoteDefinitionsProvider.getSourceDefinitions(RemoteDefinitionsProvider.java:91) ~[io.airbyte.airbyte-config-init-0.41.0.jar:?]
            at io.airbyte.commons.server.handlers.SourceDefinitionsHandler.getLatestSources(SourceDefinitionsHandler.java:155) ~[io.airbyte-airbyte-commons-server-0.41.0.jar:?]
            at io.airbyte.commons.server.handlers.SourceDefinitionsHandler.listLatestSourceDefinitions(SourceDefinitionsHandler.java:151) ~[io.airbyte-airbyte-commons-server-0.41.0.jar:?]
            at io.airbyte.server.apis.ApiHelper.execute(ApiHelper.java:23) ~[io.airbyte-airbyte-server-0.41.0.jar:?]
            at io.airbyte.server.apis.SourceDefinitionApiController.listLatestSourceDefinitions(SourceDefinitionApiController.java:96) ~[io.airbyte-airbyte-server-0.41.0.jar:?]
            at io.airbyte.server.apis.$SourceDefinitionApiController$Definition$Exec.dispatch(Unknown Source) ~[io.airbyte-airbyte-server-0.41.0.jar:?]
            at io.micronaut.context.AbstractExecutableMethodsDefinition$DispatchedExecutableMethod.invoke(AbstractExecutableMethodsDefinition.java:371) ~[micronaut-inject-3.8.5.jar:3.8.5]
            at io.micronaut.context.DefaultBeanContext$4.invoke(DefaultBeanContext.java:594) ~[micronaut-inject-3.8.5.jar:3.8.5]
            at io.micronaut.web.router.AbstractRouteMatch.execute(AbstractRouteMatch.java:246) ~[micronaut-router-3.8.5.jar:3.8.5]
            at io.micronaut.web.router.RouteMatch.execute(RouteMatch.java:111) ~[micronaut-router-3.8.5.jar:3.8.5]
            at io.micronaut.http.context.ServerRequestContext.with(ServerRequestContext.java:103) ~[micronaut-http-3.8.5.jar:3.8.5]
            at io.micronaut.http.server.RouteExecutor.lambda$executeRoute$14(RouteExecutor.java:659) ~[micronaut-http-server-3.8.5.jar:3.8.5]
            at reactor.core.publisher.FluxDeferContextual.subscribe(FluxDeferContextual.java:49) ~[reactor-core-3.5.0.jar:3.5.0]
            at reactor.core.publisher.InternalFluxOperator.subscribe(InternalFluxOperator.java:62) ~[reactor-core-3.5.0.jar:3.5.0]
            at reactor.core.publisher.FluxSubscribeOn$SubscribeOnSubscriber.run(FluxSubscribeOn.java:194) ~[reactor-core-3.5.0.jar:3.5.0]
            at io.micronaut.reactive.reactor.instrument.ReactorInstrumentation.lambda$init$0(ReactorInstrumentation.java:62) ~[micronaut-runtime-3.8.5.jar:3.8.5]
            at reactor.core.scheduler.WorkerTask.call(WorkerTask.java:84) ~[reactor-core-3.5.0.jar:3.5.0]
            at reactor.core.scheduler.WorkerTask.call(WorkerTask.java:37) ~[reactor-core-3.5.0.jar:3.5.0]
            at io.micronaut.scheduling.instrument.InvocationInstrumenterWrappedCallable.call(InvocationInstrumenterWrappedCallable.java:53) ~[micronaut-context-3.8.5.jar:3.8.5]
            at java.util.concurrent.FutureTask.run(FutureTask.java:317) ~[?:?]
            at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
            at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
            at java.lang.Thread.run(Thread.java:1589) ~[?:?]
    Caused by: java.net.http.HttpTimeoutException: request timed out
            at jdk.internal.net.http.HttpClientImpl.send(HttpClientImpl.java:844) ~[java.net.http:?]
            at jdk.internal.net.http.HttpClientFacade.send(HttpClientFacade.java:123) ~[java.net.http:?]
            at io.airbyte.config.init.RemoteDefinitionsProvider.getRemoteDefinitionCatalog(RemoteDefinitionsProvider.java:113) ~[io.airbyte.airbyte-config-init-0.41.0.jar:?]
            ... 24 more
    Does anyone have any idea what could be causing this?
  • laila ribke

    03/01/2023, 12:10 PM
    Hi all, I'm struggling with the Facebook Marketing app review for my open-source instance. I managed to set up a connection with the Facebook Marketing source via the amazing doc written by @CM, but I don't know which credentials I should provide (attached image). Also, the doc says to record and document the successful sync, but I don't see where to attach images/files..
  • Thiago Villani

    03/01/2023, 1:07 PM
    Hello, I have a SQL Server source database and a PostgreSQL destination. Because of its version, the source does not support CDC, so the replication mode I am using is Incremental Sync - Deduped History. The first sync takes a while, but if I make only one change in the table and redo the sync, it takes the same time. I understand that this is normal, because it will read the entire table. If I enable CDC on the source, will the second sync be much faster, or will it have the same behavior as Incremental Sync - Deduped History?
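A sketch may help frame the question. Without CDC, incremental modes read only the rows whose cursor column advanced past the saved state, so a second sync is usually only expensive when there is no usable cursor (or the time goes into normalization); CDC instead tails the database log. A minimal illustration of cursor-based incremental reads, with SQLite standing in for the source (the table and column names are made up for this example):

```python
import sqlite3

# Toy source table standing in for the SQL Server source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL, updated_at TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, 10.0, "2023-01-01"), (2, 20.0, "2023-01-02"), (3, 30.0, "2023-01-03")],
)

def incremental_read(conn, cursor_value):
    """Cursor-based incremental: only rows whose cursor column advanced past
    the stored state are read, so an unchanged table yields no rows."""
    return conn.execute(
        "SELECT id, amount, updated_at FROM orders WHERE updated_at > ? ORDER BY updated_at",
        (cursor_value,),
    ).fetchall()

first_sync = incremental_read(conn, "")           # initial sync: full table read
state = max(row[2] for row in first_sync)         # persist the high-water mark

# One row changes between syncs; only it exceeds the saved cursor.
conn.execute("UPDATE orders SET amount = 99.0, updated_at = '2023-01-04' WHERE id = 2")
second_sync = incremental_read(conn, state)
```

If the cursor column is indexed, the second query touches almost nothing even without CDC; with Deduped History, the dedup step at the destination can still take noticeable time regardless of how the source reads.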
  • Ernesto Costa

    03/01/2023, 2:01 PM
    Hi, I'm having trouble with the deployment of Airbyte following the steps on https://docs.airbyte.com/deploying-airbyte/local-deployment. I'm getting an error, which can be found here: https://discuss.airbyte.io/t/airbyte-webapp-not-starting/3348/7. It was supposedly already fixed, but that does not seem to be the case. Can you please help?
  • Chris Pinchot

    03/01/2023, 2:17 PM
    I am having an issue where Airbyte is telling me that bool values of 0 and 1 are not valid. It’s stopping my sync from PlanetScale -> BigQuery from completing. Any thoughts?
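For context on why 0/1 can trip validation: records are validated against the stream's JSON schema, and JSON's boolean type does not accept the integers 0/1 that MySQL-compatible sources such as PlanetScale emit for TINYINT(1)/BOOL columns — that is one common way this error arises. A hypothetical coercion (not Airbyte code) just to illustrate the mismatch:

```python
def coerce_booleans(record: dict, bool_fields: set) -> dict:
    """Map MySQL-style 0/1 integers to real booleans for fields the schema
    declares as boolean; leave every other field untouched."""
    return {
        key: bool(value) if key in bool_fields and value in (0, 1) else value
        for key, value in record.items()
    }

row = {"id": 7, "is_active": 1, "is_deleted": 0}
coerced = coerce_booleans(row, {"is_active", "is_deleted"})
```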
  • Sean Glynn

    03/01/2023, 2:27 PM
    Hi everyone, we're facing an issue while using Airbyte OSS on K8s. In a nutshell, all of our sync jobs (running 12/hour) come to a standstill and the Airbyte worker cannot spin up any more job pods. We get the following errors:
    2023-03-02 10:11:16 ERROR i.a.w.g.DefaultReplicationWorker(replicate):259 - Sync worker failed.
    io.airbyte.workers.exception.WorkerException: java.io.IOException: kubectl cp failed with exit code 1
    
    2023-03-01 10:10:50 WARN i.t.i.s.WorkflowExecuteRunnable(throwAndFailWorkflowExecution):134 - Workflow execution failure WorkflowId='sync_2465', RunId=9dbc128b-e826-4cf2-b86e-6daad66ea1e2, WorkflowType='SyncWorkflow'
    io.temporal.failure.ActivityFailure: scheduledEventId=51, startedEventId=52, activityType='Normalize', activityId='6be893fc-04a9-301c-b7ff-e921b5e0e2c8', identity='1@data-airbyte-worker-7dfdfd79f7-tq9q5', retryState=RETRY_STATE_MAXIMUM_ATTEMPTS_REACHED
    .....
    Log4j2Appender says: Completing future exceptionally...
    2023-03-01 10:10:50 INFO i.a.w.t.TemporalAttemptExecution(lambda$getWorkerThread$5):198 - Completing future exceptionally...
    io.airbyte.workers.exception.WorkerException: Normalization Failed.
    .....
    Log4j2Appender says: Normalization failed for job 2466.
    2023-03-01 10:34:56 ERROR i.a.w.g.DefaultReplicationWorker(replicate):259 - Sync worker failed.
    io.airbyte.workers.exception.WorkerException: Cannot invoke "java.lang.Integer.intValue()" because the return value of "io.airbyte.workers.process.KubePortManagerSingleton.take()" is null
    Once we restart the worker pod, things return to normal. We face this issue 1-2 times per day at different times. Are there any related open issues or suggestions on how to tune configurations to avoid this? Related issues:
    • This issue has been closed, but we are experiencing the same thing: https://github.com/airbytehq/airbyte/issues/10748#issuecomment-1069139006
    • We have deployed the newest Airbyte version, which includes the increased timeout fix mentioned here: https://github.com/airbytehq/airbyte/issues/22907
    Airbyte version:
    0.41.0
    I can confirm that the worker has sufficient resources (cpu/memory/disk) so I do not believe this is the issue.
  • Tim Josefsson

    03/01/2023, 3:09 PM
    Hey everyone! I've started to experiment with Airbyte, so naturally I fired up the docker-compose variant as described here https://docs.airbyte.com/quickstart/deploy-airbyte and added an S3 source without any problems. However, when I attempted to add my local ClickHouse (running on my laptop), I couldn't figure out how to create the connection. It seems the deployed containers don't have access to the host network (naturally), but how do I let them connect to the local ClickHouse? Using
    host.docker.internal
    didn't seem to do the trick, and if I try modifying the docker-compose.yaml as suggested here https://docs.airbyte.com/troubleshooting/new-connection/
    extra_hosts:
    - "host.docker.internal:host-gateway"
    I just get the following when trying to run
    ./run-ab-platform.sh
    (root) Additional property extra_hosts is not allowed
    So what's the proper way of getting this to work?
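The `(root) Additional property extra_hosts is not allowed` error usually means the snippet landed at the top level of docker-compose.yaml; `extra_hosts` is only valid nested under an individual service. A sketch of the placement (assuming the stock Airbyte compose file's service names):

```yaml
services:
  worker:
    # ... existing service definition ...
    extra_hosts:
      - "host.docker.internal:host-gateway"
```

With that in place, the connector's host field can point at host.docker.internal to reach ClickHouse on the laptop.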
  • Gregory Morgen

    03/01/2023, 4:15 PM
    Hello channel - I cannot initialize octavia; it doesn't seem to be connecting to my instance. I have performed a default helm install with no customization, and I need to export the config and import it on another Airbyte instance:
    helm install airbyte airbyte/airbyte
    I have to use octavia to export the Airbyte configuration since the option to export from the UI was removed. Octavia is not able to initialize:
    octavia --airbyte-url http://127.0.0.1:8000 --airbyte-username airbyte --airbyte-password password --workspace-id c21a4d1d-5eed-4aac-8dbd-8fd8373b8fa6 init
    Error: Could not reach your Airbyte instance, make sure the instance is up and running and network reachable: HTTPConnectionPool(host='127.0.0.1', port=8000): Max retries exceeded with url: /api/v1/health (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0xffff97fc1be0>: Failed to establish a new connection: [Errno 111] Connection refused'))
    simply curling the health endpoint is fine:
    curl http://127.0.0.1:8000/api/v1/health
    {"available":true}%
    I wonder if this is an auth issue - how do I find or reset the username and password in the case of a default helm install? Thank you for your assistance.
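One thing worth ruling out before auth: the octavia CLI is typically run inside its own Docker container (the recommended install wraps `docker run`), so 127.0.0.1 there refers to the container itself, not the host where the curl succeeded — which would produce exactly this "Connection refused" while the health endpoint works from the shell. A hedged variant to try, assuming Docker Desktop-style networking where host.docker.internal resolves to the host:

```shell
octavia --airbyte-url http://host.docker.internal:8000 \
        --airbyte-username airbyte --airbyte-password password \
        --workspace-id c21a4d1d-5eed-4aac-8dbd-8fd8373b8fa6 init
```

On Linux, adding `--add-host host.docker.internal:host-gateway` to the `docker run` wrapper achieves the same. The airbyte/password pair is only the proxy's default and may differ if your chart values override it.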
  • Dayten Sheffar

    03/01/2023, 4:56 PM
    Hello, I had a connection that was working/pulling data from the AppStore (
    sales_reports
    ) to a local postgres db. I reset the data but when I “refresh source schema” the table
    sales_reports
    disappears (so I can’t save the changes). I tried setting up a brand new source (i.e., AppStore 2), and all the default connection tests pass. When I use this duplicated source in a connection (again to Postgres), the table doesn’t even appear under “Activate the streams you want to sync”… I understand the AppStore connector is in alpha and may not receive support, but I'm hoping someone has solved this before. I can pull data from the AppStore with Python using the same access keys (but that defeats the point of having Airbyte if I have to orchestrate Python scripts anyway). Can I roll back my AppStore connector to a working version at least?
  • Andre Santos

    03/01/2023, 5:45 PM
    Hi folks, I have a question regarding extracting Salesforce report data using Airbyte. I have a connection extracting data from the Report Salesforce object successfully; however, the data extracted from this object is report metadata... https://developer.salesforce.com/docs/atlas.en-us.object_reference.meta/object_reference/sforce_api_objects_report.htm It seems to me that extracting the actual report data is not supported by the Airbyte Salesforce connector. Am I right?
  • Andrew Reeve

    03/01/2023, 5:48 PM
    Hello. Can a maintainer help with merging a PR to fix a source? The WooCommerce source has a bug with pagination that I have a PR to fix: https://github.com/airbytehq/airbyte/pull/23599 Ideally I can merge this into core instead of running my own version for now.
  • Jeff Jolicoeur

    03/01/2023, 6:05 PM
    Has anyone encountered a JDBC OAuth2 connection being unavailable when it is writing to the destination (Snowflake)? It seems to get the refresh token successfully (though I never see the log line output for that, just the log line that it started that process). The connection works, and I can see the refresh token response with an additional log line I added. airbyte: 0.41.0 destination-snowflake: 0.4.49
  • Carolina Buckler

    03/01/2023, 7:13 PM
    Upgraded to Airbyte 0.40.30, and am having issues with existing connectors related to
    Non-breaking schema updates detected.
    A few of the connectors (Bing Ads, Facebook Marketing, MySQL) don’t actually show the schema changes, and are now having data type issues and breaking streams. The Salesforce source connector does show a readout of the changes and has not failed.
  • user

    03/01/2023, 7:47 PM
    Hello Shawn Tucker, it's been a while without an update from us. Are you still having problems or did you find a solution?
  • Adrián Velasco

    03/01/2023, 7:47 PM
    hi there! I’m trying to build my own custom source/connector. I was already able to push it to my Docker Hub registry, and now I’d like to start using it in Airbyte. However, I cannot find the “+ New connector” button in the app. Where should I look? https://docs.airbyte.com/operator-guides/using-custom-connectors/#3-click-on--new-connector
  • Malik

    03/01/2023, 8:03 PM
    Hi there, I am trying to deploy Airbyte with Helm on EKS with Terraform, and I see the database volumes are failing to provision with this error:
    running PreBind plugin "VolumeBinding": binding volumes: timed out waiting for the condition
    Online I saw that I might need the EBS CSI driver add-on, but I did not find that in the Airbyte docs and was unsure if that is the right approach to solving this issue. Please advise.
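For what it's worth, on EKS 1.23+ the in-tree EBS provisioner no longer works, so PersistentVolumeClaims against the default gp2/gp3 StorageClass hang with exactly this `VolumeBinding` timeout until the aws-ebs-csi-driver add-on is installed; that is an EKS requirement rather than anything Airbyte-specific, which is likely why the Airbyte docs don't mention it. A Terraform sketch (the cluster and IAM role references are placeholders for your own resources):

```hcl
resource "aws_eks_addon" "ebs_csi" {
  cluster_name = aws_eks_cluster.this.name
  addon_name   = "aws-ebs-csi-driver"

  # The driver needs IAM permissions to manage EBS volumes; the usual
  # setup is an IRSA role with the AmazonEBSCSIDriverPolicy attached.
  service_account_role_arn = aws_iam_role.ebs_csi.arn
}
```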
  • user

    03/01/2023, 8:13 PM
    Hey there, sorry to hear that you're having an issue with Airbyte and your nginx configuration. Could you provide more information on the exact error you're receiving and the steps you're taking when you get it? It would also be helpful to know which version of Airbyte you're using. Thanks!
  • Ryan Taylor

    03/01/2023, 8:36 PM
    Hi. I am seeking help on why my sync failed; see the attached logs. The source is pulling data from a large MySQL table and writing to an S3 destination. I can see the entry "Exported 6500361 of 196414794 records for table 'dn3m.memberclaim' after 015231.658" at line 16974 of the log file. Shortly after, on line 17026 of the log, it looks like the first attempt fails, but there is no error message to indicate why the worker failed. How can I determine why the sync failed?
    airbyte-logs-20230301.txt
  • Ketan Mangukiya

    03/01/2023, 9:08 PM
    Hello team, I have connected Google Sheets with PostgreSQL. In my case, the Google Sheet name and prefix are too long, so what will my database table name be? Is there some logic that keeps specific characters of the sheet name? For example: my sheet name and prefix are both alan_talent_beyond_boundaries_tbb_employer_feedback_survey, and in my database the table name is alan_talent_beyond_b__loyer_feedback_survey. What is the logic for that? Thanks in advance 🙏
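The example looks like middle truncation: PostgreSQL caps identifiers at 63 bytes, and normalization shortens over-long names by keeping the head and tail of the name joined with `__`. A sketch that reproduces the observed table name — note the 43-character limit here is inferred from this one example, not a documented constant:

```python
def truncate_middle(name: str, max_len: int) -> str:
    """Keep the start and end of an over-long identifier, joined by '__'."""
    if len(name) <= max_len:
        return name
    keep = max_len - 2          # room left after the '__' separator
    left = keep // 2
    right = keep - left
    return name[:left] + "__" + name[-right:]

table = truncate_middle(
    "alan_talent_beyond_boundaries_tbb_employer_feedback_survey", 43
)
```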
  • yulia norenko

    03/01/2023, 10:10 PM
    Hello, I have a question: if I point two different Airbyte Shopify connections to the same destination in S3 (sync mode is Incremental | Append), would the Airbyte syncs still work properly without any data being accidentally deleted in S3? Thank you!
  • Santiago Estupiñan Romero

    03/01/2023, 11:16 PM
    Hi! Has anyone successfully connected the Zoho CRM source with a BigQuery destination? When I try to sync there is an error showing up that
    JSON schema validation failed
    . Then the sync fails and it says that there is an error in the normalization and that it cannot find the JSON schema. Does anyone know why this happens or how to fix it? Do I need to open an issue?
  • Sharath Chandra

    03/02/2023, 5:04 AM
    #C021JANJ6TY Source: Mixpanel (StartDate: 20-Feb-2023, EndDate: left blank). Destination: Redshift. I see the sync failed, but the records got inserted. Could someone please help? Attaching the error screenshot and log file too.
    d059fece_e814_41f4_a343_90b248c5e32f_logs_193_txt.txt
  • archna singh

    03/02/2023, 5:21 AM
    I am facing an issue while setting up a MySQL source with the below credentials.
  • Camilo Atencio Franco

    03/02/2023, 5:36 AM
    Hey all, I'm still having issues with normalization. According to the logs, everything within normalization ran successfully and it creates the table; however, I'm not able to see any records in the table itself. I'm using version 0.40.23, Postgres source 1.0.39 and Snowflake destination 0.4.44. https://airbytehq.slack.com/archives/C021JANJ6TY/p1676055325524329
  • Sergey Kedrov

    03/02/2023, 6:39 AM
    Hi! I have a question. It’s clear that Airbyte is designed for EL(T) scenarios. I’m curious: do users face scenarios where transforming data before persisting it gives advantages? Say, I know that for some columnar databases it makes sense to apply dictionaries, resolve IP addresses, enrich attributes before storing, etc. What advice does Airbyte provide for such cases?
  • Aman Kesharwani

    03/02/2023, 7:23 AM
    Hi all, I am trying to sync data from a Mongo source to an S3 destination with Parquet as the file format. The sync is failing with the below error; attaching the full error log as well for reference. I am using Airbyte open source version 0.39.35.
    2023-02-28 12:54:17 - Additional Failure Information: java.lang.NullPointerException: Array contains a null element at 0. Set parquet.avro.write-old-list-structure=false to turn on support for arrays with null elements.
  • Aman Kesharwani

    03/02/2023, 7:24 AM
    I am looking to understand where I can change this parameter
    parquet.avro.write-old-list-structure=false
    in the destination connector code, or whether there is any alternate approach to resolve this issue.
  • Thomas Gerber

    03/02/2023, 7:50 AM
    Folks, for those who want to quickly test their sources or connections without necessarily deploying an airbyte-server, you can use this open-source airbyte-local-cli. Enjoy!
  • Sebastian Brickel

    03/02/2023, 8:20 AM
    Hi team, this is related to Mailchimp connector issues 20814 and 20571. I was wondering if, as a dirty fix, it would be possible to modify the state of my running connection and simply change the stream_state to a more recent date, thereby ignoring the huge historic data load that is causing this issue? If so, how could one do that if Airbyte is running in Docker on GCP? Thanks
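On the mechanics of such a "dirty fix" (not official guidance, and worth backing up the Airbyte database first): connection state is stored as JSON, so the change amounts to rewriting one stream's cursor before the next sync. A sketch of the JSON edit with a purely hypothetical state shape — inspect your real state (e.g. in the state table of Airbyte's internal Postgres database) before changing anything:

```python
import json

# Hypothetical per-stream state blob; the real Mailchimp state fields may differ.
state = {
    "streams": [
        {
            "stream_descriptor": {"name": "email_activity"},
            "stream_state": {"timestamp": "2015-06-01T00:00:00Z"},
        }
    ]
}

def bump_cursor(state: dict, stream_name: str, new_cursor: str) -> dict:
    """Move one stream's cursor forward so the next sync skips old history."""
    updated = json.loads(json.dumps(state))  # cheap deep copy via round-trip
    for entry in updated["streams"]:
        if entry["stream_descriptor"]["name"] == stream_name:
            entry["stream_state"]["timestamp"] = new_cursor
    return updated

patched = bump_cursor(state, "email_activity", "2023-01-01T00:00:00Z")
```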