# ask-community-for-troubleshooting

    Dennis Hinnenkamp

    04/05/2022, 2:12 PM
    Hi everyone, is it possible for a SQL connector to display functions, in addition to views and tables, and to select them for data extraction? Unfortunately, our customer cannot activate CDC, and the amount of data in the individual tables is too large to always do a full refresh. I would be very grateful for any help or ideas!

    Sami RIAHI

    04/05/2022, 3:48 PM
    Hello! Recently I downloaded Airbyte on my local machine and made some modifications to the code (for example, I increased "setFetchSize" from 1K to 1M). I also increased JOB_MAIN_CONTAINER_MEMORY_REQUEST=20g and JOB_MAIN_CONTAINER_MEMORY_LIMIT=26g. After that I deployed Airbyte on Docker and launched the synchronization, but I do not see any changes in the logs. Can you please help me?
    ✅ 1
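For anyone making the same tweaks: in the Docker deployment, the memory overrides mentioned above normally go in the `.env` file next to `docker-compose.yml` (a sketch; the values mirror the message above, and note these variables are primarily honored when provisioning job containers):

```
# .env — job-container resource overrides picked up by docker-compose
JOB_MAIN_CONTAINER_MEMORY_REQUEST=20g
JOB_MAIN_CONTAINER_MEMORY_LIMIT=26g
```

Env changes only take effect after `docker-compose down && docker-compose up -d`, and a source-code change such as `setFetchSize` only takes effect if the modified connector image is rebuilt and the connector version in use actually points at that rebuilt image — otherwise the logs will look unchanged.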

    Yusuf Khan

    04/05/2022, 6:14 PM
    Hi, I'm following the instructions to install Airbyte locally, and when I run the docker-compose up command it says:
    ERROR:
            Can't find a suitable configuration file in this directory or any
            parent. Are you in the right directory?
    
            Supported filenames: docker-compose.yml, docker-compose.yaml, compose.yml, compose.yaml
    my docker-compose version is:
    docker-compose version 1.29.2, build 5becea4c
    and I just cloned the airbyte repository, so it should be the latest version on the main branch. Edit: and I have cd-ed into the airbyte directory after cloning.
    ✅ 1
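That error means Compose found none of its supported filenames in the working directory or any parent. A stdlib sketch of the same lookup, useful for sanity-checking which directory you are actually in (filenames are taken from the error message above; the walk-up behavior is an assumption about how Compose searches):

```python
from pathlib import Path
from typing import Optional

# Filenames docker-compose accepts, per the error message above
SUPPORTED = ("docker-compose.yml", "docker-compose.yaml", "compose.yml", "compose.yaml")

def find_compose_file(start: str = ".") -> Optional[Path]:
    """Walk from `start` up to the filesystem root and return the first
    compose file found, or None (the case that triggers the error)."""
    current = Path(start).resolve()
    for directory in (current, *current.parents):
        for name in SUPPORTED:
            candidate = directory / name
            if candidate.is_file():
                return candidate
    return None
```

If this returns None from the directory where you ran `docker-compose up`, you are not inside the cloned `airbyte` checkout (or the clone is incomplete).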

    Yusuf Khan

    04/05/2022, 8:57 PM
    Just put this on Stack Overflow, but I cannot figure out how to create my first connection using MSSQL as a source. I have it installed standalone on my machine, and I have Airbyte running on that same machine. I enabled TCP/IP from SQL Server's Configuration Manager, but I'm still getting this message:
    Could not connect with provided configuration. Error: Cannot create PoolableConnectionFactory (The TCP/IP connection to the host GAAZDSTDEV02D, port 1433 has failed. Error: "GAAZDSTDEV02D. Verify the connection properties. Make sure that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port. Make sure that TCP connections to the port are not blocked by a firewall.".)
    Even though I can see the service is running and I am able to connect through pymssql: https://stackoverflow.com/questions/71758242/connection-to-sql-server-from-airbyte-failing-cannot-create-poolableconnectionf
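One way to narrow this down is to test raw TCP reachability of the SQL Server port from the same place Airbyte runs, independent of any JDBC driver. A minimal stdlib sketch (the host `GAAZDSTDEV02D` and port 1433 come from the error above; note that from inside a Docker container, `localhost` means the container itself, so `host.docker.internal` may be needed on Docker Desktop):

```python
import socket

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a plain TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # e.g. port_open("GAAZDSTDEV02D", 1433) or port_open("host.docker.internal", 1433)
    print(port_open("localhost", 1433))
```

If this returns True from the host but False from inside the Airbyte worker container, the problem is container networking (or a firewall), not SQL Server itself.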

    Parth Gangrade

    04/06/2022, 12:06 PM
    I am moving data from MongoDB to BigQuery, and all the nested columns are there in the raw table, but in the final table the values are null. I tried the same thing in December; at that time the JSON objects came through in string format, but now they come through blank. I figured out that the JSON values which are of struct type are captured by the json_extract function and not by the json_extract_scalar function. How can I replace json_extract_scalar with json_extract in the code? Using MongoDB connector version 0.1.13.
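For readers hitting the same NULLs: `JSON_EXTRACT_SCALAR` returns NULL whenever the extracted value is an object or array rather than a scalar, which matches the symptom above. A rough Python analogy of the two behaviors (the function names mirror the BigQuery ones; this is an illustration, not the connector's code):

```python
import json
from typing import Optional

def json_extract(doc: str, key: str) -> Optional[str]:
    """Like BigQuery JSON_EXTRACT: returns the value at `key` re-serialized
    as JSON, whether it is a scalar, an object, or an array."""
    value = json.loads(doc).get(key)
    return None if value is None else json.dumps(value)

def json_extract_scalar(doc: str, key: str) -> Optional[str]:
    """Like BigQuery JSON_EXTRACT_SCALAR: returns the value only if it is a
    scalar; objects and arrays yield None -- hence the NULL columns."""
    value = json.loads(doc).get(key)
    if value is None or isinstance(value, (dict, list)):
        return None
    return str(value)

doc = '{"name": "a", "address": {"city": "x"}}'
print(json_extract(doc, "address"))         # the whole struct as a JSON string
print(json_extract_scalar(doc, "address"))  # None -- struct-typed value
```

The swap being asked about would live in the normalization-generated dbt SQL, replacing `json_extract_scalar` with `json_extract` for the struct-typed columns.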

    Yusuf Khan

    04/06/2022, 4:09 PM
    Deleted my last post and am adding this again, because I edited my post on Stack Overflow. I'm unable to get my first connection working in Airbyte. I have standalone SQL Server and my Airbyte container running on the same machine. I've enabled TCP/IP on my SQL Server. I've tested logging in via SQL Server Management Studio using the credentials I'm giving Airbyte, and I've also tested logging in and reading/writing with the same credentials via pymssql. I'm not sure how to isolate the issue further, and it looks like it's coming from Airbyte. I do see there's an issue in triage here: https://github.com/airbytehq/airbyte/issues/11124 but I'm not explicitly doing port forwarding, since my Airbyte container and SQL Server are on the same virtual machine. https://stackoverflow.com/questions/71758242/connection-to-sql-server-from-airbyte-failing-cannot-create-poolableconnectionf

    Samuel Rodríguez

    04/06/2022, 10:25 PM
    Caused by: io.airbyte.workers.WorkerException: Normalization Failed.
        at io.airbyte.workers.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:57) ~[io.airbyte-airbyte-workers-0.35.61-alpha.jar:?]
        at io.airbyte.workers.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:18) ~[io.airbyte-airbyte-workers-0.35.61-alpha.jar:?]
        at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.35.61-alpha.jar:?]

    Samuel Rodríguez

    04/06/2022, 10:26 PM
    I have a problem
    🙏 1

    Ayoade Abel Adegbite

    04/07/2022, 8:32 AM
    Hello, where should I start my learning from? Thank you

    Andrei Batomunkuev

    04/07/2022, 6:05 PM
    Hello! I have some concerns about deploying Airbyte to AWS. I'm choosing between having both Airbyte and the Postgres DB on a single instance or on separate instances. Which way is preferred?

    Yusuf Khan

    04/08/2022, 5:24 PM
    I've enabled CDC on my SQL Server DB and tables. I'm using Airbyte to load it into Snowflake. Is there a way I can have the replication be faster than every 5 minutes? Also, the sync mode is only showing me Full refresh | Append; I don't see the Incremental option for the source anymore, even though my Airbyte data source (SQL Server) was configured for CDC.
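On the scheduling question: beyond the built-in scheduler's minimum interval, Airbyte's OSS API exposes a manual-sync endpoint, so an external scheduler (cron, Airflow, etc.) can trigger syncs as often as desired. A stdlib sketch (the `localhost:8000` host and the connection ID are placeholders for your instance's values):

```python
import json
import urllib.request

def build_sync_request(connection_id: str,
                       host: str = "http://localhost:8000") -> urllib.request.Request:
    """Build the POST request that triggers a manual sync for one connection
    via Airbyte's /api/v1/connections/sync endpoint."""
    payload = json.dumps({"connectionId": connection_id}).encode("utf-8")
    return urllib.request.Request(
        url=f"{host}/api/v1/connections/sync",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

if __name__ == "__main__":
    # hypothetical connection ID -- copy yours from the connection's URL in the UI
    req = build_sync_request("00000000-0000-0000-0000-000000000000")
    with urllib.request.urlopen(req) as resp:  # run against your own instance
        print(resp.status)
```

Calling this from a cron job every minute effectively gives sub-5-minute replication, at the cost of back-to-back jobs if a sync runs long.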

    Kevin Salmeron

    04/08/2022, 8:53 PM
    Hello Community, I'm looking to enable a sign-in screen on my Airbyte instance and, if possible, a log console or plain-text auto-writing to track Airbyte activity. It is a security and compliance department request. Can somebody help me with a hint?
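At this version Airbyte OSS has no built-in login screen; a common pattern is to put a reverse proxy with HTTP basic auth in front of port 8000. A hedged nginx sketch (the server name, paths, and file locations are all assumptions for your environment):

```
# /etc/nginx/conf.d/airbyte.conf -- basic-auth proxy in front of the Airbyte UI
server {
    listen 80;
    server_name airbyte.example.com;              # assumption: your hostname

    auth_basic           "Airbyte";
    auth_basic_user_file /etc/nginx/.htpasswd;    # created with `htpasswd -c`

    location / {
        proxy_pass http://localhost:8000;
        proxy_set_header Host $host;
    }
}
```

For the activity-log requirement, the container logs can be captured continuously with `docker-compose logs -f` redirected to a file, or shipped with any standard Docker logging driver.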

    Rafael Auyer

    04/08/2022, 10:28 PM
    Hi! Is there a way to hard-code a higher batch size for the existing connections while this is not ready? I'm running on k8s with Helm, FYI: https://github.com/airbytehq/airbyte/issues/4314

    Bogdan Pirvu

    04/09/2022, 4:02 PM
    Hello! I’m trying to use Airbyte to ingest data from MongoDB (a managed service on DigitalOcean) into Snowflake. First I tried the Open Source version running on my laptop, but I get a
    java.net.UnknownHostException
    which I don’t understand (using MongoDB Compass I can connect to the same host), so I figured I’d give Airbyte Cloud a shot, since I assumed it’s better than the self-hosted version. However, with Airbyte Cloud I get exactly the same error. Can somebody please help? What’s also quite strange is that neither the Airbyte Open Source nor the Airbyte Cloud interface offers a way to add my cluster’s CA certificate to the connection settings. Without that it’s not possible to verify the TLS connection to my MongoDB server, which has TLS enabled. However, currently I’m not even getting that far, due to the
    java.net.UnknownHostException
    … What did work in the past (a few weeks ago) was replicating data from MongoDB with TLS disabled on the server, but disabling TLS is of course a no-go…
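`java.net.UnknownHostException` means hostname resolution failed before TLS is even attempted, which fits the Compass-works-but-Airbyte-fails pattern: one common cause (an assumption here, not confirmed by the message) is pasting the `mongodb+srv://` SRV hostname into a field that expects a plain host. A stdlib check that the exact host you entered resolves at all:

```python
import socket

def resolves(host: str) -> bool:
    """Return True if `host` resolves via DNS from this machine/container --
    the failure mode behind java.net.UnknownHostException."""
    try:
        socket.getaddrinfo(host, None)
        return True
    except socket.gaierror:
        return False

# e.g. resolves("my-cluster.mongo.ondigitalocean.com")  -- placeholder host
```

If the bare cluster hostname does not resolve but Compass connects, try the non-SRV (standard connection string) host names that the managed-database console usually lists alongside the SRV one.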

    Gil Luz

    04/10/2022, 12:29 PM
    Hi, great community. We just installed Airbyte on EC2, and I have a question: it seems that anyone who can reach port 8000 (or any other port I configure) can enter Airbyte. Is there an authentication feature in the tool, or should we develop something ourselves?

    Rajiv Thatipalli

    04/10/2022, 7:10 PM
    Not able to install Airbyte locally on my Windows machine; getting the below error:

    Rajiv Thatipalli

    04/10/2022, 7:10 PM
    io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: Deadline exceeded after 4.998586577s.
    airbyte-worker | at io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:262) ~[grpc-stub-1.44.1.jar:1.44.1]
    airbyte-worker | at io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:243) ~[grpc-stub-1.44.1.jar:1.44.1]
    airbyte-worker | at io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:156) ~[grpc-stub-1.44.1.jar:1.44.1]
    airbyte-worker | at io.grpc.health.v1.HealthGrpc$HealthBlockingStub.check(HealthGrpc.java:252) ~[grpc-services-1.44.1.jar:1.44.1]
    airbyte-worker | at io.temporal.serviceclient.WorkflowServiceStubsImpl.lambda$checkHealth$2(WorkflowServiceStubsImpl.java:286) ~[temporal-serviceclient-1.8.1.jar:?]
    airbyte-worker | at io.temporal.internal.retryer.GrpcSyncRetryer.retry(GrpcSyncRetryer.java:61) ~[temporal-serviceclient-1.8.1.jar:?]
    airbyte-worker | at io.temporal.internal.retryer.GrpcRetryer.retryWithResult(GrpcRetryer.java:51) ~[temporal-serviceclient-1.8.1.jar:?]
    airbyte-worker | at io.temporal.serviceclient.WorkflowServiceStubsImpl.checkHealth(WorkflowServiceStubsImpl.java:279) ~[temporal-serviceclient-1.8.1.jar:?]
    airbyte-worker | at io.temporal.serviceclient.WorkflowServiceStubsImpl.<init>(WorkflowServiceStubsImpl.java:186) ~[temporal-serviceclient-1.8.1.jar:?]
    airbyte-worker | at io.temporal.serviceclient.WorkflowServiceStubs.newInstance(WorkflowServiceStubs.java:51) ~[temporal-serviceclient-1.8.1.jar:?]
    airbyte-worker | at io.temporal.serviceclient.WorkflowServiceStubs.newInstance(WorkflowServiceStubs.java:41) ~[temporal-serviceclient-1.8.1.jar:?]
    airbyte-worker | at io.airbyte.workers.temporal.TemporalUtils.lambda$createTemporalService$0(TemporalUtils.java:61) ~[io.airbyte-airbyte-workers-0.35.65-alpha.jar:?]
    airbyte-worker | at io.airbyte.workers.temporal.TemporalUtils.getTemporalClientWhenConnected(TemporalUtils.java:190) [io.airbyte-airbyte-workers-0.35.65-alpha.jar:?]
    airbyte-worker | at io.airbyte.workers.temporal.TemporalUtils.createTemporalService(TemporalUtils.java:57) [io.airbyte-airb

    Arun Sivasankaran

    04/10/2022, 8:07 PM
    https://discuss.airbyte.io/t/help-setting-up-s3-source/539

    Arun Sivasankaran

    04/10/2022, 8:09 PM
    Wondering if someone can help me with S3 source configs. I am running into
    pyarrow.lib.ArrowInvalid: CSV parse error: Expected 1 columns, got 9
    when attempting to sync. I think it's related to my config, at least. Here is an example file in S3...
    {"userId":"615630dc81cd","sourceLanguage":"en","targetLanguage":"de","sourcePhrase":"a bit","sourceRoot":"bit","sourcePrefix":"a","targetPhrase":"ein Bisschen","incrementalViews":1,"eventName":"translationViewed"}
    {"userId":"615630dc81cd","sourceLanguage":"en","targetLanguage":"de","sourcePhrase":"job","sourceRoot":"job","targetPhrase":"Arbeit","incrementalViews":1,"eventName":"translationViewed"}
    {"userId":"615630dc81cd","sourceLanguage":"en","targetLanguage":"de","sourcePhrase":"information","sourceRoot":"information","targetPhrase":"Information","incrementalViews":1,"eventName":"translationViewed"}
    {"userId":"615630dc81cd","sourceLanguage":"en","targetLanguage":"de","sourcePhrase":"example","sourceRoot":"example","targetPhrase":"Beispiel","incrementalViews":1,"eventName":"translationViewed"}
    {"userId":"615630dc81cd","sourceLanguage":"en","targetLanguage":"de","sourcePhrase":"greatly","sourceRoot":"greatly","targetPhrase":"sehr","incrementalViews":1,"eventName":"translationViewed"}
    {"userId":"615630dc81cd","sourceLanguage":"en","targetLanguage":"de","sourcePhrase":"example","sourceRoot":"example","targetPhrase":"Beispiel","incrementalViews":0,"eventName":"translationViewed"}
    {"userId":"615630dc81cd","sourceLanguage":"en","targetLanguage":"de","sourcePhrase":"day","sourceRoot":"day","targetPhrase":"Tag","incrementalViews":1,"eventName":"translationViewed"}
    {"userId":"615630dc81cd","sourceLanguage":"en","targetLanguage":"de","sourcePhrase":"functionality","sourceRoot":"functionality","targetPhrase":"Funktionalität","incrementalViews":1,"eventName":"translationViewed"}
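The sample rows above are newline-delimited JSON (one object per line), not CSV — pyarrow's CSV reader sees the commas inside each object and reports a column-count mismatch, which matches the error. The likely fix is setting the S3 source's file format to `jsonl` rather than `csv`. The shape of the data, sketched in Python with two of the sample lines:

```python
import json

# Two lines copied from the sample file above: newline-delimited JSON, not CSV
sample = """\
{"userId":"615630dc81cd","sourceLanguage":"en","targetLanguage":"de","sourcePhrase":"a bit","sourceRoot":"bit","sourcePrefix":"a","targetPhrase":"ein Bisschen","incrementalViews":1,"eventName":"translationViewed"}
{"userId":"615630dc81cd","sourceLanguage":"en","targetLanguage":"de","sourcePhrase":"job","sourceRoot":"job","targetPhrase":"Arbeit","incrementalViews":1,"eventName":"translationViewed"}"""

# Each line is an independent JSON document: parse line by line, not as CSV
records = [json.loads(line) for line in sample.splitlines()]
print(len(records), records[0]["sourcePhrase"])  # 2 a bit
```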

    Dimitriy Ni

    04/11/2022, 1:05 PM
    Hey everyone. I am trying to access the internal Airbyte database. These instructions show how to connect with psql: https://docs.airbyte.com/operator-guides/configuring-airbyte-db/#accessing-the-default-database-located-in-docker-airbyte-db But I can't connect to this DB with a database tool (HeidiSQL or similar), although I use its Docker container IP 🤔 Any help or ideas?
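A likely reason an external tool like HeidiSQL can't reach it: the default `docker-compose.yml` does not publish the `airbyte-db` Postgres port to the host, so it is only reachable from other containers (the docs' `docker exec ... psql` path works because it runs inside the container). A sketch of an override that exposes the port — the `db` service name matches the default compose file at this version, but verify against yours, and only do this on a trusted network:

```yaml
# docker-compose.override.yml (sketch)
version: "3.7"
services:
  db:
    ports:
      - "5432:5432"   # publish airbyte-db's Postgres to the host
```

With that in place, connect on 127.0.0.1:5432 with the credentials from your `.env` (`DATABASE_USER`, `DATABASE_PASSWORD`, `DATABASE_DB`).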

    Louis-Marius Gendreau

    04/11/2022, 3:04 PM
    Hi folks, is there a way to configure Airbyte so it does not unnest repeated records? I want to do this with the Shopify to BigQuery connectors and handle it myself. Thanks!
    👍 1
    ❓ 1

    Egor Satiukov

    04/11/2022, 3:43 PM
    Hi everyone! I'm trying to use the Facebook Marketing connector. My FB app doesn't really need Advanced Access, but I do need it to work with Airbyte. I described this situation in a request for Advanced Access, but my request was denied (my app cannot be reviewed properly because it doesn't actually use FB features and permissions). Now I'm working with Airbyte using Standard Access, but my pipelines fail every time due to rate limits. Can anyone suggest what I should do?

    Daniele Ferrero

    04/11/2022, 3:44 PM
    Hi! I'm trying to start Airbyte on my Windows Docker, but I can't.

    Anton Nosovitsky

    04/12/2022, 9:33 AM
    hey there, I tried updating my instance and am seeing issues, could anyone help? 🙂
    airbyte-server      | 2022-04-12 09:31:28 INFO i.a.d.Databases(createPostgresDatabaseWithRetryTimeout):99 - Database available!
    airbyte-server      | 2022-04-12 09:31:28 INFO i.a.d.Databases(createPostgresDatabaseWithRetry):57 - Database available!
    airbyte-temporal    | {"level":"info","ts":"2022-04-12T09:31:28.811Z","msg":"none","service":"matching","component":"matching-engine","wf-task-queue-name":"/_sys/GET_SPEC/3","wf-task-queue-type":"Activity","lifecycle":"Starting","logging-call-at":"taskQueueManager.go:238"}
    airbyte-temporal    | {"level":"info","ts":"2022-04-12T09:31:28.811Z","msg":"none","service":"matching","component":"matching-engine","wf-task-queue-name":"/_sys/GET_SPEC/3","wf-task-queue-type":"Activity","lifecycle":"Started","logging-call-at":"taskQueueManager.go:242"}
    airbyte-temporal    | {"level":"info","ts":"2022-04-12T09:31:28.826Z","msg":"none","service":"matching","component":"matching-engine","wf-task-queue-name":"1@07b096598acc:23029e80-aa07-4c6f-aa67-81fbefda049e","wf-task-queue-type":"Workflow","lifecycle":"Starting","logging-call-at":"taskQueueManager.go:238"}
    airbyte-temporal    | {"level":"info","ts":"2022-04-12T09:31:28.827Z","msg":"none","service":"matching","component":"matching-engine","wf-task-queue-name":"1@07b096598acc:23029e80-aa07-4c6f-aa67-81fbefda049e","wf-task-queue-type":"Workflow","lifecycle":"Started","logging-call-at":"taskQueueManager.go:242"}
    airbyte-temporal    | {"level":"info","ts":"2022-04-12T09:31:28.828Z","msg":"none","service":"matching","component":"matching-engine","wf-task-queue-name":"/_sys/1@07b096598acc:23029e80-aa07-4c6f-aa67-81fbefda049e/2","wf-task-queue-type":"Workflow","lifecycle":"Starting","logging-call-at":"taskQueueManager.go:238"}
    airbyte-temporal    | {"level":"info","ts":"2022-04-12T09:31:28.829Z","msg":"none","service":"matching","component":"matching-engine","wf-task-queue-name":"/_sys/1@07b096598acc:23029e80-aa07-4c6f-aa67-81fbefda049e/2","wf-task-queue-type":"Workflow","lifecycle":"Started","logging-call-at":"taskQueueManager.go:242"}
    airbyte-temporal    | {"level":"info","ts":"2022-04-12T09:31:28.837Z","msg":"none","service":"matching","component":"matching-engine","wf-task-queue-name":"/_sys/1@07b096598acc:23029e80-aa07-4c6f-aa67-81fbefda049e/3","wf-task-queue-type":"Workflow","lifecycle":"Starting","logging-call-at":"taskQueueManager.go:238"}
    airbyte-temporal    | {"level":"info","ts":"2022-04-12T09:31:28.837Z","msg":"none","service":"matching","component":"matching-engine","wf-task-queue-name":"/_sys/1@07b096598acc:23029e80-aa07-4c6f-aa67-81fbefda049e/3","wf-task-queue-type":"Workflow","lifecycle":"Started","logging-call-at":"taskQueueManager.go:242"}
    airbyte-server      | 2022-04-12 09:31:28 INFO i.a.c.EnvConfigs(getEnvOrDefault):834 - Using default value for environment variable CONFIGS_DATABASE_INITIALIZATION_TIMEOUT_MS: '60000'
    airbyte-server      | 2022-04-12 09:31:28 INFO o.f.c.i.l.s.Slf4jLog(info):49 - Flyway Community Edition 7.14.0 by Redgate
    airbyte-server      | 2022-04-12 09:31:28 INFO o.f.c.i.l.s.Slf4jLog(info):49 - Database: jdbc:<postgresql://db:5432/airbyte> (PostgreSQL 13.6)
    airbyte-server      | 2022-04-12 09:31:29 ERROR i.a.s.ServerApp(main):249 - Server failed
    airbyte-server      | java.lang.NullPointerException: Cannot invoke "org.flywaydb.core.api.MigrationInfo.getVersion()" because the return value of "io.airbyte.db.instance.DatabaseMigrator.getLatestMigration()" is null
    airbyte-server      |   at io.airbyte.db.instance.MinimumFlywayMigrationVersionCheck.assertMigrations(MinimumFlywayMigrationVersionCheck.java:75) ~[io.airbyte.airbyte-db-lib-0.35.65-alpha.jar:?]
    airbyte-server      |   at io.airbyte.server.ServerApp.assertDatabasesReady(ServerApp.java:135) ~[io.airbyte-airbyte-server-0.35.65-alpha.jar:?]
    airbyte-server      |   at io.airbyte.server.ServerApp.getServer(ServerApp.java:160) ~[io.airbyte-airbyte-server-0.35.65-alpha.jar:?]
    airbyte-server      |   at io.airbyte.server.ServerApp.main(ServerApp.java:247) [io.airbyte-airbyte-server-0.35.65-alpha.jar:?]
    airbyte-server exited with code 1

    Alistair Wright

    04/12/2022, 8:34 PM
    Hi, is it possible to name a connection to make it easier to see which tables are being transferred, or is this a case where I am creating too many connections? I'm pulling some moderately large datasets, so I have tended to create a connection for each stream, under the assumption that if I have problems with one stream it won't stop the others from processing — should I instead be putting all streams in the same connection? I'm currently using the Docker instances to test out the system, where the UI version is displayed as 0.35.53-alpha, so this may be functionality that is in a later version. Thanks

    Larry Gormley

    04/12/2022, 10:01 PM
    While creating a destination connection on Postgres, it fails and tells me it cannot create a poolable connection. If this is a familiar issue to someone, I'd appreciate your good advice. Thanks! Here is the log:

    Larry Gormley

    04/12/2022, 10:01 PM
    2022-04-12 21:41:21 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - at sun.nio.ch.Net.pollConnect(Native Method) ~[?:?]
    2022-04-12 21:41:21 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - at sun.nio.ch.Net.pollConnectNow(Net.java:672) ~[?:?]
    2022-04-12 21:41:21 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - at sun.nio.ch.NioSocketImpl.timedFinishConnect(NioSocketImpl.java:549) ~[?:?]
    2022-04-12 21:41:21 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - at sun.nio.ch.NioSocketImpl.connect(NioSocketImpl.java:597) ~[?:?]
    2022-04-12 21:41:21 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:327) ~[?:?]
    2022-04-12 21:41:21 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - at java.net.Socket.connect(Socket.java:633) ~[?:?]
    2022-04-12 21:41:21 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - at org.postgresql.core.PGStream.createSocket(PGStream.java:231) ~[postgresql-42.2.18.jar:42.2.18]
    2022-04-12 21:41:21 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - at org.postgresql.core.PGStream.<init>(PGStream.java:95) ~[postgresql-42.2.18.jar:42.2.18]
    2022-04-12 21:41:21 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - at org.postgresql.core.v3.ConnectionFactoryImpl.tryConnect(ConnectionFactoryImpl.java:98) ~[postgresql-42.2.18.jar:42.2.18]
    2022-04-12 21:41:21 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:213) ~[postgresql-42.2.18.jar:42.2.18]
    2022-04-12 21:41:21 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:51) ~[postgresql-42.2.18.jar:42.2.18]
    2022-04-12 21:41:21 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:225) ~[postgresql-42.2.18.jar:42.2.18]
    2022-04-12 21:41:21 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - at org.postgresql.Driver.makeConnection(Driver.java:465) ~[postgresql-42.2.18.jar:42.2.18]
    2022-04-12 21:41:21 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - at org.postgresql.Driver.connect(Driver.java:264) ~[postgresql-42.2.18.jar:42.2.18]
    2022-04-12 21:41:21 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - at org.apache.commons.dbcp2.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:55) ~[commons-dbcp2-2.7.0.jar:2.7.0]
    2022-04-12 21:41:21 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - at org.apache.commons.dbcp2.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:355) ~[commons-dbcp2-2.7.0.jar:2.7.0]
    2022-04-12 21:41:21 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - at org.apache.commons.dbcp2.BasicDataSource.validateConnectionFactory(BasicDataSource.java:115) ~[commons-dbcp2-2.7.0.jar:2.7.0]
    2022-04-12 21:41:21 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - at org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:665) ~[commons-dbcp2-2.7.0.jar:2.7.0]
    2022-04-12 21:41:21 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - ... 13 more
    2022-04-12 21:41:22 INFO i.a.w.t.TemporalAttemptExecution(get):131 - Stopping cancellation check scheduling...

    Varun Rayen

    04/13/2022, 3:02 AM
    Hey all, I am trying to establish a sync between MongoDB and Elasticsearch, but I end up stuck at this message:
    writing 1 records in bulk operation
    In Elasticsearch the index is created, but no records have been inserted.

    Gil Luz

    04/13/2022, 7:09 AM
    Hi. We've installed Airbyte on EC2 and seen this message: "we collect data only for product improvements". We deliberately chose the EC2 installation to make sure the Airbyte machine won't be exposed, since it has access to sensitive data. Is there information that will go out from EC2 to Airbyte HQ?

    Malik Awais Khan

    04/13/2022, 10:08 AM
    hello, I have built a connection between MongoDB and Postgres SQL. Now I want to ask: where are the models created? I could not find them.