# ask-community-for-troubleshooting
  • Aman Kesharwani

    04/05/2023, 10:37 AM
    Hi Airbyte community, I am trying to set up Airbyte in an EKS cluster and have disabled MinIO through values.yaml, but the deployment is failing for airbyte-webapp. Digging deeper, I found the event log where it is unable to create the PV and PVC. Attaching the values.yaml file as well for reference:
    Copy code
    ExternalProvisioning   persistentvolumeclaim/airbyte-minio-pv-claim-airbyte-minio-0   waiting for a volume to be created, either by external provisioner "ebs.csi.aws.com" or manually created by system administrator
    values.yaml
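A note on the PVC event above: it usually means the EBS CSI driver add-on is not installed or cannot provision volumes, and the MinIO PVC can persist even when MinIO is disabled in values.yaml (PVCs created from a StatefulSet's volumeClaimTemplates are not deleted automatically). A minimal diagnostic sketch, assuming the kubernetes Python client, a configured kubeconfig, and an `airbyte` namespace:

```python
# Hedged diagnostic: list PVCs stuck unbound in the (assumed) "airbyte"
# namespace and show which storage class they are waiting on.
from kubernetes import client, config

config.load_kube_config()
core = client.CoreV1Api()

for pvc in core.list_namespaced_persistent_volume_claim("airbyte").items:
    if pvc.status.phase != "Bound":
        print(pvc.metadata.name, pvc.status.phase, pvc.spec.storage_class_name)
```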
  • VISHAL B

    04/05/2023, 10:38 AM
    Hello team, we have Airbyte 0.41.0 deployed on #k8s. When I migrate data from MySQL to BigQuery, JSON columns are being typecast to string, as mentioned in the Airbyte docs, but I need the JSON as-is in the destination. Please help ASAP!
  • Dipankar Kumar

    04/05/2023, 10:59 AM
    Hi team, I am new here and am not able to find Twitter in the source list, but as per the documentation it should be there. Please help me with this.
  • Aidan Fogarty

    04/05/2023, 11:56 AM
    Hi all 👋 We currently have Airbyte deployed on EKS and are looking to set up S3 for storing logs instead of the existing MinIO volume. Is it possible to configure access to the S3 bucket using a role-based EKS service account, or is it currently limited to needing an AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY? This has probably been asked before, so apologies in advance!
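On the question above: outside of Airbyte's own settings, the AWS SDK's default credential chain does support IRSA (IAM Roles for Service Accounts), picking up the web-identity token injected into an annotated pod with no static keys. A minimal sketch of what that looks like from inside a pod, assuming boto3 and a hypothetical bucket name; whether a given Airbyte version passes an empty key pair through to the SDK is version-dependent:

```python
import boto3

# With IRSA, the default credential chain resolves the injected web-identity
# token, so no AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY are configured.
s3 = boto3.client("s3")
resp = s3.list_objects_v2(Bucket="my-airbyte-logs", MaxKeys=1)  # hypothetical bucket
print(resp.get("KeyCount"))
```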
  • Bishan Singh

    04/05/2023, 12:06 PM
    Can anyone help me with how to make a custom dbt transformation? The source is MySQL and the destination is BigQuery.
  • Shreepad Khandve

    04/05/2023, 12:16 PM
    I'm getting the following error from docker compose up -d while composing on an EC2 instance:
    Copy code
    service "bootloader" didn't complete successfully: exit 255
  • Rachel RIZK

    04/05/2023, 1:31 PM
    Hello 👋 We are trying to upgrade Airbyte from 0.39.42-alpha to 0.42.1. However:
    • it looks like a regression appeared in the latest versions: the bootloader checks that semver is respected for custom connectors (even those deleted with tombstone=true), which is not expected
    • it's failing because we are using a custom ECR with specific non-semver naming
    • even if we wanted to use semver now, we're stuck because we deleted those source definitions, and the bootloader also checks the source definitions flagged as tombstone=true in actor_definition 😕
    Has anyone else encountered this problem? Do you see an obvious workaround I may have missed? Thanks for your help 🙏
    Copy code
    airbyte-bootloader        | 2023-04-05 12:10:55 ERROR i.a.b.Application(main):25 - Unable to bootstrap Airbyte environment.
    airbyte-bootloader        | java.lang.IllegalArgumentException: Invalid version string: my-custom-deleted-source_0.0.3
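One possible workaround for the bootloader check above, sketched under heavy assumptions (direct access to the configs database, a backup taken first, and an actor_definition schema matching this Airbyte version), is to rewrite the non-semver tags on tombstoned definitions so the version parse succeeds:

```python
import psycopg2  # assumes the configs DB is Postgres and reachable

conn = psycopg2.connect("dbname=airbyte user=docker password=docker host=localhost")  # hypothetical DSN
with conn, conn.cursor() as cur:
    cur.execute(
        r"""
        UPDATE actor_definition
           SET docker_image_tag = '0.0.1'  -- any semver-compliant placeholder
         WHERE tombstone = true
           AND docker_image_tag !~ '^[0-9]+\.[0-9]+\.[0-9]+'
        """
    )
```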
  • Victor Matheus Rodrigues de Carvalho

    04/05/2023, 2:13 PM
    Hello folks! I have Airbyte deployed on EKS and am trying to set up S3 as a source. I have specified the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY correctly (checked many times already) and am still getting the following error when testing:
    The connection tests failed.
    Internal Server Error: The AWS Access Key Id you provided does not exist in our records. (Service: S3, Status Code: 403, Request ID: RS709RYJPT1MKX5A, Extended Request ID: n074u40uX670fmx2fiKXjOthpU8l0ts8lZpS+deomOyX+n8HAg/xuskizMQ+prKr+0mxxLawv8Q=)
    What am I missing?
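A quick way to separate connector problems from credential problems in cases like the one above is to use the same key pair outside Airbyte; if this also returns an invalid-key error, the key itself (or its AWS account/partition) is the issue. A sketch assuming boto3, with placeholder credentials:

```python
import boto3

sts = boto3.client(
    "sts",
    aws_access_key_id="AKIAXXXXXXXXXXXXXXXX",      # placeholder: paste the same key id
    aws_secret_access_key="XXXXXXXXXXXXXXXXXXXX",  # placeholder: paste the same secret
)
print(sts.get_caller_identity())  # fails with InvalidClientTokenId if the key is wrong
```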
  • Reid Roman

    04/05/2023, 2:34 PM
    Hey, I'm noticing that the Slack source creates its stream_slices by fixed timestamp (one day per API call). My issue is that many of those calls will be sparse (fewer than the 1000-record page limit), especially within single channels. The alternative would be using Slack's returned cursor, letting Slack return a guaranteed 1000 records. Given Slack's rate limiting, the current structure substantially increases ingestion time from a cold start. But maybe state is easier to track with Airbyte-generated timestamps?
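For contrast with the fixed one-day slices described above, this is roughly what cursor-driven paging against Slack looks like; a sketch assuming the slack_sdk package and hypothetical token/channel values:

```python
from slack_sdk import WebClient

client = WebClient(token="xoxb-...")  # hypothetical bot token
cursor = None
while True:
    resp = client.conversations_history(channel="C0123456789", limit=1000, cursor=cursor)
    print(len(resp["messages"]))  # each call returns up to a full page
    cursor = (resp.get("response_metadata") or {}).get("next_cursor")
    if not cursor:
        break
```

The trade-off the message hints at is real: per-channel cursors are opaque, so incremental state is harder to express than a timestamp watermark.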
  • Faisal Anees

    04/05/2023, 2:39 PM
    Hi! Got a question about the Airbyte Cloud API. Does it support integration with GCP Secret Manager to store credentials (similar to the open-source offering)?
  • Kyle Cheung

    04/05/2023, 3:25 PM
    Hi all, I'm using the BambooHR connector with my API key. I'm able to pull in a custom field called hireDate via Postman; however, using the same API key and inputting the same field hireDate, I'm getting the following access-denied error. If I don't put anything into the custom fields input, then the connector works.
    Copy code
    The connection tests failed.
    
    CustomFieldsAccessDeniedError('Access to fields: hireDate - denied. Please check your access level.')
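One way to narrow down errors like the one above is to request the same custom field directly, since BambooHR enforces access levels per field and they can differ between endpoints. A sketch assuming the requests package, with a hypothetical subdomain and employee ID:

```python
import requests

resp = requests.get(
    "https://api.bamboohr.com/api/gateway.php/yourcompany/v1/employees/123",  # hypothetical subdomain/id
    params={"fields": "hireDate"},
    auth=("YOUR_API_KEY", "x"),  # BambooHR uses the API key as the basic-auth username
    headers={"Accept": "application/json"},
)
print(resp.status_code, resp.text)
```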
  • Jesus Rivero

    04/05/2023, 3:35 PM
    Hi all. I am trying to download server logs from Airbyte 0.40.25, and I am getting "Unable to download logs at this time.". I see that the airbyte-server pod throws the following exception. Does anyone know what can cause this?
    Copy code
    SEVERE: An I/O error has occurred while writing a response message entity to the container output stream.
    org.glassfish.jersey.server.internal.process.MappableException: org.eclipse.jetty.io.EofException
    	at org.glassfish.jersey.server.internal.MappableExceptionWrapperInterceptor.aroundWriteTo(MappableExceptionWrapperInterceptor.java:67)
    	at org.glassfish.jersey.message.internal.WriterInterceptorExecutor.proceed(WriterInterceptorExecutor.java:139)
    	at org.glassfish.jersey.message.internal.MessageBodyFactory.writeTo(MessageBodyFactory.java:1116)
    	at org.glassfish.jersey.server.ServerRuntime$Responder.writeResponse(ServerRuntime.java:638)
    	at org.glassfish.jersey.server.ServerRuntime$Responder.processResponse(ServerRuntime.java:371)
    	at org.glassfish.jersey.server.ServerRuntime$Responder.process(ServerRuntime.java:361)
    	at org.glassfish.jersey.server.ServerRuntime$1.run(ServerRuntime.java:256)
    	at org.glassfish.jersey.internal.Errors$1.call(Errors.java:248)
    	at org.glassfish.jersey.internal.Errors$1.call(Errors.java:244)
    	at org.glassfish.jersey.internal.Errors.process(Errors.java:292)
    	at org.glassfish.jersey.internal.Errors.process(Errors.java:274)
    	at org.glassfish.jersey.internal.Errors.process(Errors.java:244)
    	at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:265)
    	at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:232)
    	at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:680)
    	at org.glassfish.jersey.servlet.WebComponent.serviceImpl(WebComponent.java:394)
    	at org.glassfish.jersey.servlet.WebComponent.service(WebComponent.java:346)
    	at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:366)
    	at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:319)
    	at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:205)
    	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:763)
    	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:569)
    	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1377)
    	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:507)
    	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1292)
    	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    	at org.eclipse.jetty.server.Server.handle(Server.java:501)
    	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:556)
    	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)
    	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273)
    	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:336)
    	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:313)
    	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:171)
    	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:129)
    	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:375)
    	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)
    	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)
    	at java.base/java.lang.Thread.run(Thread.java:1589)
    Caused by: org.eclipse.jetty.io.EofException
    	at org.eclipse.jetty.io.ChannelEndPoint.flush(ChannelEndPoint.java:279)
    	at org.eclipse.jetty.io.WriteFlusher.flush(WriteFlusher.java:422)
    	at org.eclipse.jetty.io.WriteFlusher.write(WriteFlusher.java:277)
    	at org.eclipse.jetty.io.AbstractEndPoint.write(AbstractEndPoint.java:381)
    	at org.eclipse.jetty.server.HttpConnection$SendCallback.process(HttpConnection.java:826)
    	at org.eclipse.jetty.util.IteratingCallback.processing(IteratingCallback.java:241)
    	at org.eclipse.jetty.util.IteratingCallback.iterate(IteratingCallback.java:223)
    	at org.eclipse.jetty.server.HttpConnection.send(HttpConnection.java:544)
    	at org.eclipse.jetty.server.HttpChannel.sendResponse(HttpChannel.java:838)
    	at org.eclipse.jetty.server.HttpChannel.write(HttpChannel.java:910)
    	at org.eclipse.jetty.server.HttpOutput.channelWrite(HttpOutput.java:284)
    	at org.eclipse.jetty.server.HttpOutput.channelWrite(HttpOutput.java:268)
    	at org.eclipse.jetty.server.HttpOutput.write(HttpOutput.java:833)
    	at org.glassfish.jersey.servlet.internal.ResponseWriter$NonCloseableOutputStreamWrapper.write(ResponseWriter.java:301)
    	at org.glassfish.jersey.message.internal.CommittingOutputStream.write(CommittingOutputStream.java:200)
    	at org.glassfish.jersey.message.internal.WriterInterceptorExecutor$UnCloseableOutputStream.write(WriterInterceptorExecutor.java:276)
    	at org.glassfish.jersey.message.internal.ReaderWriter.writeTo(ReaderWriter.java:93)
    	at org.glassfish.jersey.message.internal.AbstractMessageReaderWriterProvider.writeTo(AbstractMessageReaderWriterProvider.java:56)
    	at org.glassfish.jersey.message.internal.FileProvider.writeTo(FileProvider.java:95)
    	at org.glassfish.jersey.message.internal.FileProvider.writeTo(FileProvider.java:44)
    	at org.glassfish.jersey.message.internal.WriterInterceptorExecutor$TerminalWriterInterceptor.invokeWriteTo(WriterInterceptorExecutor.java:242)
    	at org.glassfish.jersey.message.internal.WriterInterceptorExecutor$TerminalWriterInterceptor.aroundWriteTo(WriterInterceptorExecutor.java:227)
    	at org.glassfish.jersey.message.internal.WriterInterceptorExecutor.proceed(WriterInterceptorExecutor.java:139)
    	at org.glassfish.jersey.server.internal.JsonWithPaddingInterceptor.aroundWriteTo(JsonWithPaddingInterceptor.java:85)
    	at org.glassfish.jersey.message.internal.WriterInterceptorExecutor.proceed(WriterInterceptorExecutor.java:139)
	at org.glassfish.jersey.server.internal.MappableExceptionWrapperInterceptor.aroundWriteTo(MappableExceptionWrapperInterceptor.java:61)
	... 45 more
    Caused by: java.io.IOException: Broken pipe
    	at java.base/sun.nio.ch.FileDispatcherImpl.writev0(Native Method)
    	at java.base/sun.nio.ch.SocketDispatcher.writev(SocketDispatcher.java:66)
    	at java.base/sun.nio.ch.IOUtil.write(IOUtil.java:226)
    	at java.base/sun.nio.ch.IOUtil.write(IOUtil.java:157)
    	at java.base/sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:574)
    	at java.base/java.nio.channels.SocketChannel.write(SocketChannel.java:642)
    	at org.eclipse.jetty.io.ChannelEndPoint.flush(ChannelEndPoint.java:273)
  • Aidan Fogarty

    04/05/2023, 4:16 PM
    Hey 👋 We currently have Airbyte version 0.40.32 (still using kustomize). If we have S3 set up for logging and RDS as the external database, is the airbyte-volume-configs persistent volume still needed?
  • Dale Bradman

    04/05/2023, 4:22 PM
    👋 Is there anywhere to subscribe to the latest source connector updates? A channel/RSS feed or something?
  • Jack Reid

    04/05/2023, 4:23 PM
    Hey team 👋 I'm currently using Airbyte's LaunchDarkly connector to ingest data into Snowflake. According to LaunchDarkly's API docs, an _id field should be provided as part of the flags stream response (https://apidocs.launchdarkly.com/tag/Feature-flags#section/Sample-feature-flag-representation), but it is missing. Is this expected?
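A quick check for the missing field above is to hit the REST API directly and compare the keys with what the connector emits; if _id is present upstream, the gap is likely in the connector's declared schema. A sketch assuming requests, a hypothetical project key, and a hypothetical access token:

```python
import requests

resp = requests.get(
    "https://app.launchdarkly.com/api/v2/flags/default",  # "default" is a hypothetical project key
    headers={"Authorization": "api-xxxxxxxx"},            # hypothetical access token
)
print(sorted(resp.json()["items"][0].keys()))
```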
  • Kevin Ruprecht

    04/05/2023, 5:40 PM
    Hello - I am testing out OSS Airbyte in GCP to see if it will meet my team's needs. It's my first time setting it up. It's a simple test trying to move data from one table in Cloud SQL Postgres to BigQuery: 5,000 rows. I have verified that the compute service account has BigQuery permissions. I am still running into:
    The request signature we calculated does not match the signature you provided. Check your Google secret key and signing method. (Service: Amazon S3; Status Code: 403; Error Code: SignatureDoesNotMatch; Request ID: null; S3 Extended Request ID: null; Proxy: null)
    I have seen this error pop up here and there without details on a resolution. Help?
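The S3-style SignatureDoesNotMatch above typically comes from the BigQuery destination's GCS staging area, which authenticates with HMAC keys through GCS's S3-compatible endpoint; a mistyped or whitespace-padded HMAC secret is a common cause. A sketch of checking the HMAC pair in isolation, assuming boto3 and placeholder keys:

```python
import boto3

gcs = boto3.client(
    "s3",
    endpoint_url="https://storage.googleapis.com",  # GCS's S3-compatible XML API
    aws_access_key_id="GOOGXXXXXXXXXXXXXXXX",       # placeholder HMAC key id
    aws_secret_access_key="XXXXXXXXXXXXXXXX",       # placeholder HMAC secret
)
print([b["Name"] for b in gcs.list_buckets()["Buckets"]])
```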
  • Marcos Marx (Airbyte)

    04/05/2023, 6:01 PM
    Hello, we're starting another office hour! Join the huddle if you want to discuss Airbyte features or issues! See you in #C045VK5AF54
  • Mohammed Mogary

    04/05/2023, 7:31 PM
    I need to configure an Oracle Autonomous Database that uses a wallet as a source; I would appreciate any help with that. Thank you.
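For the wallet question above, it can help to verify the wallet itself before wiring it into any connector. A connection sketch assuming the python-oracledb package and an unzipped wallet directory; whether a given Airbyte Oracle source version accepts wallet-based connections is a separate question:

```python
import oracledb

conn = oracledb.connect(
    user="ADMIN",
    password="your-password",            # hypothetical
    dsn="mydb_high",                     # TNS alias from the wallet's tnsnames.ora
    config_dir="/path/to/unzipped/wallet",
    wallet_location="/path/to/unzipped/wallet",
    wallet_password="wallet-password",   # hypothetical
)
print(conn.version)
```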
  • Algis Setkus

    04/05/2023, 8:01 PM
    Hello all, I am trying to connect to a MySQL DB in AWS and am getting the following error:
    Copy code
    The connection tests failed.
    
    Message: HikariPool-1 - Connection is not available, request timed out after 60001ms.
    Below is the test log output. Thank you in advance for your help!
    Copy code
    2023-04-05 19:51:27 INFO i.a.c.i.LineGobbler(voidCall):149 - ----- START CHECK -----
    2023-04-05 19:51:27 INFO i.a.c.i.LineGobbler(voidCall):149 - 
    2023-04-05 19:51:27 INFO i.a.c.EnvConfigs(getEnvOrDefault):1222 - Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0'
    2023-04-05 19:51:27 INFO i.a.c.EnvConfigs(getEnvOrDefault):1222 - Using default value for environment variable SOCAT_KUBE_CPU_LIMIT: '2.0'
    2023-04-05 19:51:27 INFO i.a.c.EnvConfigs(getEnvOrDefault):1222 - Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1'
    2023-04-05 19:51:27 INFO i.a.c.EnvConfigs(getEnvOrDefault):1222 - Using default value for environment variable SOCAT_KUBE_CPU_REQUEST: '0.1'
    2023-04-05 19:51:27 INFO i.a.c.EnvConfigs(getEnvOrDefault):1222 - Using default value for environment variable LAUNCHDARKLY_KEY: ''
    2023-04-05 19:51:27 INFO i.a.c.EnvConfigs(getEnvOrDefault):1222 - Using default value for environment variable FEATURE_FLAG_CLIENT: ''
    2023-04-05 19:51:27 INFO i.a.c.i.LineGobbler(voidCall):149 - Checking if airbyte/source-mysql:2.0.0 exists...
    2023-04-05 19:51:27 INFO i.a.c.i.LineGobbler(voidCall):149 - airbyte/source-mysql:2.0.0 was found locally.
    2023-04-05 19:51:27 INFO i.a.w.p.DockerProcessFactory(create):130 - Creating docker container = source-mysql-check-5a4a9dd2-0779-4f3c-a787-02ab1ad1249d-0-ygnls with resources io.airbyte.config.ResourceRequirements@5e38f78a[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=] and allowedHosts null
    2023-04-05 19:51:27 INFO i.a.w.p.DockerProcessFactory(create):175 - Preparing command: docker run --rm --init -i -w /data/5a4a9dd2-0779-4f3c-a787-02ab1ad1249d/0 --log-driver none --name source-mysql-check-5a4a9dd2-0779-4f3c-a787-02ab1ad1249d-0-ygnls --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e STRICT_COMPARISON_NORMALIZATION_WORKSPACES= -e WORKER_CONNECTOR_IMAGE=airbyte/source-mysql:2.0.0 -e AUTO_DETECT_SCHEMA=true -e LAUNCHDARKLY_KEY= -e SOCAT_KUBE_CPU_REQUEST=0.1 -e STRICT_COMPARISON_NORMALIZATION_TAG=strict_comparison2 -e SOCAT_KUBE_CPU_LIMIT=2.0 -e USE_STREAM_CAPABLE_STATE=true -e FIELD_SELECTION_WORKSPACES= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e APPLY_FIELD_SELECTION=false -e WORKER_JOB_ATTEMPT=0 -e FEATURE_FLAG_CLIENT= -e AIRBYTE_VERSION=0.42.0 -e WORKER_JOB_ID=5a4a9dd2-0779-4f3c-a787-02ab1ad1249d airbyte/source-mysql:2.0.0 check --config source_config.json
    2023-04-05 19:51:27 INFO i.a.w.i.VersionedAirbyteStreamFactory(create):107 - Reading messages from protocol version 0.2.0
    2023-04-05 19:51:28 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - INFO i.a.i.s.m.MySqlSource(main):309 starting source: class io.airbyte.integrations.source.mysql.MySqlSource
    2023-04-05 19:51:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - INFO i.a.i.b.IntegrationCliParser(parseOptions):126 integration args: {check=null, config=source_config.json}
    2023-04-05 19:51:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - INFO i.a.i.b.IntegrationRunner(runInternal):105 Running integration: io.airbyte.integrations.base.ssh.SshWrappedSource
    2023-04-05 19:51:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - INFO i.a.i.b.IntegrationRunner(runInternal):106 Command: CHECK
    2023-04-05 19:51:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - INFO i.a.i.b.IntegrationRunner(runInternal):107 Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'}
    2023-04-05 19:51:29 WARN i.a.w.i.DefaultAirbyteStreamFactory(internalLog):165 - WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
    2023-04-05 19:51:29 WARN i.a.w.i.DefaultAirbyteStreamFactory(internalLog):165 - WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
    2023-04-05 19:51:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - INFO i.a.i.b.s.SshTunnel(getInstance):204 Starting connection with method: NO_TUNNEL
    2023-04-05 19:51:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - INFO c.z.h.HikariDataSource(<init>):80 HikariPool-1 - Starting...
    2023-04-05 19:51:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - INFO c.z.h.HikariDataSource(<init>):82 HikariPool-1 - Start completed.
    2023-04-05 19:52:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - INFO c.z.h.HikariDataSource(close):350 HikariPool-1 - Shutdown initiated...
    2023-04-05 19:52:30 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - INFO c.z.h.HikariDataSource(close):352 HikariPool-1 - Shutdown completed.
    2023-04-05 19:52:30 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - INFO i.a.i.b.IntegrationRunner(runInternal):182 Completed integration: io.airbyte.integrations.base.ssh.SshWrappedSource
    2023-04-05 19:52:30 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - INFO i.a.i.s.m.MySqlSource(main):311 completed source: class io.airbyte.integrations.source.mysql.MySqlSource
    2023-04-05 19:52:30 INFO i.a.w.g.DefaultCheckConnectionWorker(run):120 - Check connection job received output: io.airbyte.config.StandardCheckConnectionOutput@1e2d972f[status=failed,message=Message: HikariPool-1 - Connection is not available, request timed out after 60001ms.]
    2023-04-05 19:52:30 INFO i.a.w.t.TemporalAttemptExecution(get):169 - Stopping cancellation check scheduling...
    2023-04-05 19:52:30 INFO i.a.c.i.LineGobbler(voidCall):149 - 
    2023-04-05 19:52:30 INFO i.a.c.i.LineGobbler(voidCall):149 - ----- END CHECK -----
    2023-04-05 19:52:30 INFO i.a.c.i.LineGobbler(voidCall):149 -
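In the log above, the pool starts cleanly and then times out after 60s without an authentication error, which usually points at TCP connectivity (security group, VPC routing, or a private RDS endpoint) rather than credentials. A reachability probe from the machine running Airbyte, standard library only, with a hypothetical hostname:

```python
import socket

HOST = "mydb.xxxxxxxx.us-east-1.rds.amazonaws.com"  # hypothetical endpoint
with socket.create_connection((HOST, 3306), timeout=10) as s:
    print("TCP reachable:", s.getpeername())
```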
  • Mostafa Saeed

    04/05/2023, 9:04 PM
    Hi everyone, I'm trying to make a connection to MongoDB, and it shows an error while fetching the schema:
    'FieldPath field names may not start with '$'.' on server
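The server error above comes from MongoDB rejecting field paths that begin with '$' during aggregation, which schema discovery relies on. A small scan for offending top-level keys, assuming pymongo and hypothetical connection details (nested documents would need a recursive walk):

```python
from pymongo import MongoClient

coll = MongoClient("mongodb://localhost:27017")["mydb"]["mycoll"]  # hypothetical URI/names
for doc in coll.find(limit=100):  # sample only; adjust as needed
    bad = [key for key in doc if key.startswith("$")]
    if bad:
        print(doc.get("_id"), bad)
```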
  • Albert Wong

    04/05/2023, 9:25 PM
    I just learned about cursors. I have a large event table that fails to sync due to its size. Has anyone successfully set a cursor manually so that the initial table load is a small, manageable size, and can you provide some helpful pointers? I only need to capture deltas going forward, but I am unable to successfully load the table to bootstrap the sync process.
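One approach people take for the bootstrap problem above is to seed the connection's state so the first incremental sync starts from a recent cursor instead of full history. The state payload's shape varies by Airbyte version, so a cautious sketch (assuming the OSS internal API on a hypothetical host) reads the current state first:

```python
import requests

API = "http://localhost:8000/api/v1"  # hypothetical OSS host; may require basic auth
conn_id = "11111111-1111-1111-1111-111111111111"  # hypothetical connection id

state = requests.post(f"{API}/state/get", json={"connectionId": conn_id}).json()
print(state)
# An edited copy of this payload can be posted back to /state/create_or_update
# to set the cursor; verify both endpoints exist on your version first.
```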
  • aidan

    04/05/2023, 9:43 PM
    I have my own custom Python connector for Xero. I have seen that there is a Xero connector in Airbyte, which I tested previously; however, it did not have the Journals stream, which is needed to generate a general ledger (it has manual journals, which are not the same). I have looked at the code and see that there are a lot more streams defined in the source code, including Journals; however, they have not been fully added to the connector. I plan to modify the code and add them. I just want to know if there was a specific reason they were not added?
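For reference while modifying the connector discussed above, a new stream in a Python CDK source is roughly a class like the following; names and pagination here are a hypothetical sketch to be reconciled with the actual source-xero code, not a drop-in addition:

```python
from airbyte_cdk.sources.streams.http import HttpStream


class Journals(HttpStream):
    url_base = "https://api.xero.com/api.xro/2.0/"
    primary_key = "JournalID"

    def path(self, **kwargs) -> str:
        return "Journals"

    def next_page_token(self, response):
        return None  # simplified; the real endpoint pages with an offset query param

    def parse_response(self, response, **kwargs):
        yield from response.json().get("Journals", [])
```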
  • Santiago Estupiñan Romero

    04/05/2023, 10:24 PM
    Hi all! I'm currently using the AppsFlyer source, and they will block the usage of API Token V1 starting May 31. I was wondering if there is a team or members of Airbyte currently looking into upgrading the source to Token V2 (because I tried using that version and the source did not work).
  • Wilfredo Molina

    04/06/2023, 12:30 AM
    Metrics Question Thread
  • Brian Fertig

    04/06/2023, 1:04 AM
    Question: using the MSSQL source against Azure Synapse. The source is set up, but when trying to fetch the schema I am getting a failed-to-fetch error. I am trying to find information in the logs, but nothing. Any insights?
  • 高松拳人

    04/06/2023, 2:15 AM
    Is it possible to run Airbyte jobs from the command line? For example, we want to schedule Airbyte jobs in Prefect, which is available on Google Kubernetes Engine. Prefect and Airbyte are running in different pods.
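On the scheduling question above: Airbyte jobs are normally triggered over its HTTP API rather than a CLI, which works fine across pods. A minimal sketch assuming the OSS API is reachable from the Prefect pod, with a hypothetical service URL and connection ID; the prefect-airbyte collection wraps this same trigger-and-poll pattern if you'd rather not call the API directly:

```python
import requests

resp = requests.post(
    "http://airbyte-server-svc:8001/api/v1/connections/sync",  # hypothetical in-cluster URL
    json={"connectionId": "11111111-1111-1111-1111-111111111111"},  # hypothetical id
)
print(resp.json()["job"]["id"])  # job id to poll via /api/v1/jobs/get
```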
  • Tim O'Connell

    04/06/2023, 3:30 AM
    I'm trying to sign up for Airbyte Cloud, but I'm just getting a loading animation. Is cloud.airbyte.com down at the moment?
  • navod perera

    04/06/2023, 4:07 AM
    Hello team, I created a custom connector through Airbyte, and the connector sends a large amount of data. I'm trying to ingest just one table of a database, the biggest one, with 150 million rows and 38 GB in size. I want to insert this data into MySQL or Kafka, but I am getting the following error:
    Copy code
    2023-04-05 15:53:35 ERROR i.a.w.DefaultReplicationWorker(run):174 - Sync worker failed. java.util.concurrent.ExecutionException: io.airbyte.workers.DefaultReplicationWorker$SourceException: Source cannot be stopped!
    Do you have any advice for handling this amount of data?