# ask-community-for-troubleshooting
  • Bassem Ben Jemaa
    09/25/2022, 8:42 PM
    Is there a way to get the changes replicated into only one table, called Account, instead of three?
  • Tiri Georgiou
    09/26/2022, 7:33 AM
    Hi team, we are currently having an issue with the Salesforce connector. We notice that every time we sync the Case Salesforce object, there is a significant number of duplicates (up to 13 per case_id) in the source table (_airbyte_raw_case). The normalization stage takes care of deduplication in the transform stage, but the duplication of the raw data causes significant overhead when loading into a DWH. I was thinking of raising this as an issue on GH, but first want to make sure: 1. Is this expected behaviour? 2. Could this be a quick fix, and if so, does anybody know where in the codebase the issue might originate? Thanks
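    A minimal sketch for quantifying the duplication before filing the issue, assuming a Postgres-style warehouse where _airbyte_data is stored as JSONB and the Case primary key field is Id (both assumptions; adapt for your DWH):
    # Minimal duplicate check against the raw table. Assumptions: Postgres
    # DWH, _airbyte_data stored as JSONB, Case primary key field "Id".
    import psycopg2

    SQL = """
    SELECT _airbyte_data ->> 'Id' AS case_id, COUNT(*) AS copies
    FROM _airbyte_raw_case
    GROUP BY 1
    HAVING COUNT(*) > 1
    ORDER BY copies DESC;
    """

    with psycopg2.connect("dbname=warehouse user=airbyte") as conn:  # placeholder DSN
        with conn.cursor() as cur:
            cur.execute(SQL)
            for case_id, copies in cur.fetchall():
                print(case_id, copies)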
  • Maykon Lopes
    09/26/2022, 11:47 AM
    Hey all, does anyone know how, or have a tip on how, to clean the tmp datasets created by Airbyte in BigQuery during a sync?
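    A minimal cleanup sketch, assuming the leftovers are tables prefixed _airbyte_tmp_ inside the destination dataset; verify the prefix in your project before deleting anything:
    # Hypothetical cleanup of leftover Airbyte tmp tables in one dataset;
    # assumes the "_airbyte_tmp_" prefix, so double-check before deleting.
    from google.cloud import bigquery

    client = bigquery.Client()
    dataset_id = "my-project.my_dataset"  # placeholder

    for table in client.list_tables(dataset_id):
        if table.table_id.startswith("_airbyte_tmp_"):
            client.delete_table(table.reference, not_found_ok=True)
            print(f"deleted {table.table_id}")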
  • Chirag Gupta
    09/26/2022, 1:50 PM
    How can I use embed in Airbyte? Can anyone point me to a tutorial?
  • Resford Rouzer
    09/26/2022, 5:32 PM
    With Google Analytics, how do you pull in Goal data? I was wondering whether I need to create a custom report to pull that information.
  • AirbyteForumPost
    09/23/2022, 4:28 PM
    Postgres sync getting stuck at a specific point during sync [Troubleshooting] source-postgres
    • Is this your first time deploying Airbyte?: No
    • OS Version / Instance: Amazon Linux 2 / t3.large EC2 instance
    • Memory / Disk: 8 GB memory / 30 GB EBS volume
    • Deployment: Docker on EC2
    • Airbyte Version: v0.40.8
    • Source name/version: Postgres / 1.10.0
    • Destination name/version: S3 / 0.3.15
    • Step: The issue is happening during sync
    • Description: Postgres sync is getting stuck at a specific point every time…
  • Caio Henrique
    09/27/2022, 1:09 AM
    Hi! I'm trying to install Airbyte on Kubernetes, but I can't configure the source-mysql-check pod resources; they are always displayed empty:
    2022-09-27 01:05:09 INFO i.a.w.p.KubeProcessFactory(create):100 - Attempting to start pod = source-mysql-check-db3ddf73-2c06-4734-b9b3-566809c430c4-0-rpvnv for airbyte/source-mysql:0.6.14 with resources io.airbyte.config.ResourceRequirements@681e40af[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=]
    Could someone help me?
  • Akshay Baura
    09/27/2022, 8:41 AM
    Hi guys, I am researching ways to scale Airbyte. I see that it is not yet supported on ECS. Not wanting to go with EKS at this point, has anyone tried Airbyte on EC2 with an ASG? Is there any documentation around it?
  • Sanjar Baghchehsaraee
    09/27/2022, 12:32 PM
    Hi all, I have a question about this guide: https://docs.airbyte.com/integrations/sources/google-analytics-v4/?_ga=2.38548897.387982838.1664202739-240083929.1664202739 It is for connecting Google Analytics. I understand most parts, but I am confused about this one: 1. Go to the Service accounts page. 2. Click Create service account. I started to create a service account, but on step two (optional) I can select a role (see picture). Do I need to choose a role, or can I just leave it? If so, which role do I need to choose?
  • Kyle Rosenstein
    09/27/2022, 8:22 PM
    Hi everyone, I was wondering if anyone had ideas about this very obscure error I’m facing. I’m currently trying to export data from DocumentDB (using the MongoDB source) to Snowflake. I have the connection working properly, and data is being exported accordingly. However, it seems that the JSON/BSON field “identifier” within the documents is having its values replaced with empty strings. One example: a document I see in DocumentDB:
    {
      "_id": "<airbyte-id>",
      "active": true,
      "identifier": [
        {
          "system": "<some-link>",
          "value": "<some-value>"
        }
      ],
      "managingOrganization": {
        "identifier": {
          "system": "<some-link>",
          "value": "<some-value>"
        }
      },
      "meta": {
        "lastUpdated": "<some-date>",
        "source": "<some-link>"
      }
    }
    The same document in Snowflake:
    {
      "_id": "<airbyte-id>",
      "active": true,
      "identifier": "",
      "managingOrganization": {
        "identifier": ""
      },
      "meta": {
        "lastUpdated": "<some-date>",
        "source": "<some-link>"
      }
    }
    When checking the export logs, I see:
    WARN i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$5):339 - Schema validation errors found for stream <stream-name>. Error messages: [$.identifier is of an incorrect type. Expected it to be object, $.managingOrganization.identifier is of an incorrect type. Expected it to be object]
    For some reason, regardless of the data type, any field called “identifier” is being wiped and replaced with an empty string during export. I was able to double-check by duplicating these documents in the DocDB collection but renaming the “identifier” fields to something like “org_identifier”, and when I did that, the data was preserved just fine. This makes me think it is not a data type issue, but rather that the word “identifier” is either throwing Airbyte off or is some DocumentDB reserved keyword. Please let me know if you have seen anything like this, have any ideas, or know of ways to cast these fields during the connection so that I can either fix this for all existing records or get a better sense of whether this issue is coming from Airbyte or DocDB.
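    The WARN line points at a type mismatch: the discovered schema declares identifier as an object while the documents carry an array, and validation then blanks the field. A tiny repro of that mismatch with the jsonschema library, reusing the field names from the example above:
    # Repro of the mismatch hinted at by the WARN above: the declared
    # schema types "identifier" as an object, but the data is an array.
    import jsonschema

    declared = {
        "type": "object",
        "properties": {"identifier": {"type": "object"}},
    }
    doc = {"identifier": [{"system": "<some-link>", "value": "<some-value>"}]}

    try:
        jsonschema.validate(doc, declared)
    except jsonschema.ValidationError as err:
        print(err.message)  # ... is not of type 'object'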
  • Craig Condie
    09/27/2022, 10:59 PM
    I wrote a custom connector to pull data from an HTTP API as a source, using the requests library. This connector has to go 4 levels deep: for each record in level 1, it calls each record in level 2, and for each record of level 2, it calls each record of level 3, etc. Because of this, it takes probably an hour to go through all the calls it has to make, even though it only brings back about 1,200 records and 127 MB of data. I would like to think that if I could make the API calls asynchronous, it would speed up the process dramatically. I was thinking about using the httpx library. Maybe I was doing it wrong, but I tried to override the read_records method and create the async client there, and couldn't figure out how to get it to return the generator the rest of the CDK needs to run. Has anyone done something like this before, or is there documentation I missed that someone can point me to?
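    One possible shape for this, as a sketch rather than an official CDK pattern: fan the child calls out concurrently with httpx.AsyncClient, then bridge back into the synchronous generator the CDK expects via asyncio.run. The URL layout and slice fields below are hypothetical:
    # Sketch: concurrent child fetches inside read_records. The URL layout
    # and the "child_ids" slice field are hypothetical; read_records is a
    # method on your stream class, shown standalone for brevity.
    import asyncio
    from typing import Any, Iterable, Mapping

    import httpx

    async def _fetch_all(urls: list[str]) -> list[dict]:
        async with httpx.AsyncClient(timeout=30) as client:
            responses = await asyncio.gather(*(client.get(url) for url in urls))
            return [resp.json() for resp in responses]

    def read_records(self, sync_mode, cursor_field=None, stream_slice=None,
                     stream_state=None) -> Iterable[Mapping[str, Any]]:
        # Hypothetical: the parent stream put its child ids into the slice.
        urls = [f"{self.url_base}/level2/{child_id}"
                for child_id in stream_slice["child_ids"]]
        # asyncio.run executes the whole fan-out before yielding, which
        # preserves the generator contract the rest of the CDK relies on.
        yield from asyncio.run(_fetch_all(urls))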
  • Krisjan Oldekamp
    09/28/2022, 11:20 AM
    I see that there are one or two similar questions about running Airbyte on Google Cloud Run. The last answer by someone from the Airbyte team (about 10+ months ago) was that Google Cloud Run is not a recommended way to deploy Airbyte. Is this still the case? And if so, what's the reasoning behind it? Thanks!
  • Mycchaka Kleinbort
    09/28/2022, 1:10 PM
    Hi all, my first day using Airbyte 🙂 I'm looking to automate a sync between a REST API and Snowflake. The API is of the form https://{PROVIDER}.com/api/{ENDPOINT}/{ENTITY}?api_token={KEY} and returns JSON with the latest data for that entity. The "latest" data updates once a day. Is Airbyte a good tool for automating the pull of this data into Snowflake?
  • Tanmay Kulkarni
    09/28/2022, 1:13 PM
    Hi everyone! I have some data in a Postgres database (AWS Aurora). I want to extract some of it using a query and dump it into an S3 destination bucket. I know I can configure Postgres as a source and S3 as a destination. Is there a way in Airbyte to specify the query to be run at the source to extract the data? Like:
    SELECT department, count(distinct employeeId)
    FROM Employee
    GROUP BY department
    The result of this query would then be dumped into the S3 target.
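    The Postgres source syncs tables and views rather than arbitrary queries, so one common workaround is to wrap the query in a view on the source side and select that view as a stream. A sketch with hypothetical names:
    # Sketch: create a view over the aggregation so Airbyte can pick it
    # up as a stream. View name and DSN are hypothetical.
    import psycopg2

    DDL = """
    CREATE OR REPLACE VIEW department_headcount AS
    SELECT department, COUNT(DISTINCT employeeId) AS headcount
    FROM Employee
    GROUP BY department;
    """

    with psycopg2.connect("dbname=app user=airbyte") as conn:
        with conn.cursor() as cur:
            cur.execute(DDL)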
  • Bruno Agresta González
    09/28/2022, 5:09 PM
    Hi all, 1) I’m testing the tool; I recently downloaded Airbyte and ran it in my local environment. 2) I tried to create a new connection with GCP BigQuery, but I got an error. The config page never finished, and I found this error in the Docker logs:
    Error starting Micronaut server: Failed to inject value for parameter [dslContext] of method [configDatabase] of class: io.airbyte.db.Database
    Any ideas? Thanks 🙂
  • Giovani Freitas
    09/29/2022, 12:11 AM
    Hi guys! I have no experience with deploys, and I'm struggling to install Airbyte locally in the development version (https://docs.airbyte.com/contributing-to-airbyte/developing-locally/) to test a feature that is open (https://github.com/airbytehq/airbyte/pull/16032). I use Windows, so I installed Docker Desktop and WSL 2 following the steps in this tutorial (https://docs.airbyte.com/deploying-airbyte/local-deployment/). Obviously I stopped at step 3 of the "Deploy on Windows" section, and then went back to the dev version installation tutorial. When I run
    SUB_BUILD=PLATFORM ./gradlew build
    it starts to build, but after a few minutes it always fails:
    FAILURE: Build failed with an exception.
    
    * What went wrong:
    Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)
    I have no idea how to proceed. I searched a lot about this error, but because I don't have much knowledge of Docker, I'm pretty lost. Can anyone shed some light?
  • Chirag Gupta
    09/29/2022, 7:50 AM
    How can I set up my client IDs and get the consent URL in the development env? Where do I make that change?
  • Chirag Gupta
    09/29/2022, 7:52 AM
    And how do I enable the sign-in-with-Google button in the dev env? I want to set it up with my client IDs and secrets, so that my team members can simply sign in.
  • Dan Siegel
    09/29/2022, 12:18 PM
    • Is this your first time deploying Airbyte?: First on K8s
    • OS Version / Instance: EKS
    • Memory / Disk: 50 GB
    • Deployment: EKS
    • Airbyte Version: 0.40.9 - stable with resource limits
    Hi - we're trying to deploy via EKS. The webapp is coming up, but the Server and Worker pods are failing at initialization. Server:
    2022-09-29 00:11:20 INFO i.a.c.EnvConfigs(getEnvOrDefault):1096 - Using default value for environment variable AWS_ACCESS_KEY_ID: ''
    2022-09-29 00:11:20 INFO i.a.c.EnvConfigs(getEnvOrDefault):1096 - Using default value for environment variable AWS_SECRET_ACCESS_KEY: ''
    2022-09-29 00:11:20 INFO i.a.c.EnvConfigs(getEnvOrDefault):1096 - Using default value for environment variable SHOULD_RUN_SYNC_WORKFLOWS: 'true'
    2022-09-29 00:11:20 INFO i.a.c.EnvConfigs(getEnvOrDefault):1096 - Using default value for environment variable WORKER_PLANE: 'CONTROL_PLANE'
    2022-09-29 00:11:20 INFO i.a.c.EnvConfigs(getEnvOrDefault):1096 - Using default value for environment variable CONFIG_DATABASE_USER: 'airbyte'
    2022-09-29 00:11:20 INFO i.a.c.EnvConfigs(getEnvOrDefault):1096 - Using default value for environment variable CONFIG_DATABASE_PASSWORD: '**********'
    2022-09-29 00:11:20 INFO i.a.c.EnvConfigs(getEnvOrDefault):1096 - Using default value for environment variable CONFIG_DATABASE_URL: 'REMOVED for SLACK'
    2022-09-29 00:11:21 INFO c.z.h.HikariDataSource(<init>):80 - HikariPool-1 - Starting...
    2022-09-29 00:11:21 INFO c.z.h.HikariDataSource(<init>):82 - HikariPool-1 - Start completed.
    2022-09-29 00:11:21 INFO c.z.h.HikariDataSource(<init>):80 - HikariPool-2 - Starting...
    2022-09-29 00:11:21 INFO c.z.h.HikariDataSource(<init>):82 - HikariPool-2 - Start completed.
    2022-09-29 00:11:22 WARN c.n.s.JsonMetaSchema(newValidator):278 - Unknown keyword existingJavaType - you should define your own Meta Schema. If the keyword is irrelevant for valida
    2022-09-29 00:11:23 ERROR i.a.s.ServerApp(main):336 - Server failed
    java.lang.IllegalArgumentException: null
        at com.google.common.base.Preconditions.checkArgument(Preconditions.java:131) ~[guava-31.0.1-jre.jar:?]
        at io.airbyte.config.storage.DefaultS3ClientFactory.validateBase(DefaultS3ClientFactory.java:36) ~[io.airbyte.airbyte-config-config-models-0.40.9.jar:?]
        at io.airbyte.config.storage.MinioS3ClientFactory.validate(MinioS3ClientFactory.java:33) ~[io.airbyte.airbyte-config-config-models-0.40.9.jar:?]
        at io.airbyte.config.storage.MinioS3ClientFactory.<init>(MinioS3ClientFactory.java:27) ~[io.airbyte.airbyte-config-config-models-0.40.9.jar:?]
        at io.airbyte.config.helpers.CloudLogs.createCloudLogClient(CloudLogs.java:48) ~[io.airbyte.airbyte-config-config-models-0.40.9.jar:?]
        at io.airbyte.config.helpers.LogClientSingleton.createCloudClientIfNull(LogClientSingleton.java:164) ~[io.airbyte.airbyte-config-config-models-0.40.9.jar:?]
        at io.airbyte.config.helpers.LogClientSingleton.setWorkspaceMdc(LogClientSingleton.java:151) ~[io.airbyte.airbyte-config-config-models-0.40.9.jar:?]
        at io.airbyte.server.ServerApp.getServer(ServerApp.java:177) ~[io.airbyte-airbyte-server-0.40.9.jar:?]
        at io.airbyte.server.ServerApp.main(ServerApp.java:333) ~[io.airbyte-airbyte-server-0.40.9.jar:?]
    2022-09-29 00:11:23 INFO c.z.h.HikariDataSource(close):350 - HikariPool-1 - Shutdown initiated...
    2022-09-29 00:11:23 INFO c.z.h.HikariDataSource(close):352 - HikariPool-1 - Shutdown completed.
    2022-09-29 00:11:23 INFO c.z.h.HikariDataSource(close):350 - HikariPool-2 - Shutdown initiated...
    2022-09-29 00:11:23 INFO c.z.h.HikariDataSource(close):352 - HikariPool-2 - Shutdown completed.
    Worker Pod Log:
    ___    _      __          __
        /   |  (_)____/ /_  __  __/ /____
       / /| | / / ___/ __ \/ / / / __/ _ \
      / ___ |/ / /  / /_/ / /_/ / /_/  __/
     /_/  |_/_/_/  /_.___/\__, /\__/\___/
                         /____/
             : airbyte-workers :
     
       Micronaut (v3.6.3)
     
     2022-09-29 00:14:42 INFO i.m.c.e.DefaultEnvironment(<init>):159 - Established active environments: [k8s, cloud, ec2, control]
     2022-09-29 00:14:43 INFO c.z.h.HikariDataSource(<init>):71 - HikariPool-1 - Starting...
     2022-09-29 00:14:43 INFO c.z.h.HikariDataSource(<init>):73 - HikariPool-1 - Start completed.
     2022-09-29 00:14:43 INFO c.z.h.HikariDataSource(<init>):71 - HikariPool-2 - Starting...
     2022-09-29 00:14:43 INFO c.z.h.HikariDataSource(<init>):73 - HikariPool-2 - Start completed.
     2022-09-29 00:14:43 INFO i.m.l.PropertiesLoggingLevelsConfigurer(configureLogLevelForPrefix):107 - Setting log level 'DEBUG' for logger: 'io.airbyte.bootloader'
     2022-09-29 00:14:44 INFO i.a.w.c.DatabaseBeanFactory(configsDatabaseMigrationCheck):139 - Configs database configuration: removedfromslack
     2022-09-29 00:14:44 WARN i.a.a.TrackingClientSingleton(get):30 - Attempting to fetch an initialized track client. Initializing a default one.
     2022-09-29 00:14:44 INFO i.a.w.t.TemporalUtils(getTemporalClientWhenConnected):220 - Waiting for temporal server...
     2022-09-29 00:14:44 WARN i.a.w.t.TemporalUtils(getTemporalClientWhenConnected):231 - Waiting for namespace default to be initialized in temporal...
     2022-09-29 00:14:46 INFO i.t.s.WorkflowServiceStubsImpl(<init>):188 - Created GRPC client for channel: ManagedChannelOrphanWrapper{delegate=ManagedChannelImpl{logId=1, target=airby
     2022-09-29 00:14:51 INFO i.a.w.t.TemporalUtils(getTemporalClientWhenConnected):248 - Temporal namespace default initialized!
     2022-09-29 00:14:51 INFO i.a.c.EnvConfigs(getEnvOrDefault):1096 - Using default value for environment variable AWS_ACCESS_KEY_ID: ''
     2022-09-29 00:14:51 INFO i.a.c.EnvConfigs(getEnvOrDefault):1096 - Using default value for environment variable AWS_SECRET_ACCESS_KEY: ''
     2022-09-29 00:14:51 INFO i.a.c.EnvConfigs(getEnvOrDefault):1096 - Using default value for environment variable SHOULD_RUN_SYNC_WORKFLOWS: 'true'
     2022-09-29 00:14:51 INFO i.a.c.EnvConfigs(getEnvOrDefault):1096 - Using default value for environment variable WORKER_PLANE: 'CONTROL_PLANE'
     2022-09-29 00:14:51 WARN i.a.m.l.MetricClientFactory(getMetricClient):46 - MetricClient has not been initialized. Must call MetricClientFactory.CreateMetricClient before using Metr
     2022-09-29 00:14:52 INFO i.a.w.ApplicationInitializer(initializeCommonDependencies):174 - Initializing common worker dependencies.
     2022-09-29 00:14:52 INFO i.a.c.EnvConfigs(getEnvOrDefault):1096 - Using default value for environment variable METRIC_CLIENT: ''
     2022-09-29 00:14:52 INFO i.a.c.EnvConfigs(getEnvOrDefault):1096 - Using default value for environment variable METRIC_CLIENT: ''
     2022-09-29 00:14:52 WARN i.a.m.l.MetricClientFactory(initialize):74 - MetricClient was not recognized or not provided. Accepted values are `datadog` or `otel`.
     2022-09-29 00:14:52 ERROR i.m.r.Micronaut(handleStartupException):338 - Error starting Micronaut server: null
     java.lang.IllegalArgumentException: null
         at com.google.common.base.Preconditions.checkArgument(Preconditions.java:131) ~[guava-31.1-jre.jar:?]
         at io.airbyte.config.storage.DefaultS3ClientFactory.validateBase(DefaultS3ClientFactory.java:36) ~[io.airbyte.airbyte-config-config-models-0.40.9.jar:?]
         at io.airbyte.config.storage.MinioS3ClientFactory.validate(MinioS3ClientFactory.java:33) ~[io.airbyte.airbyte-config-config-models-0.40.9.jar:?]
         at io.airbyte.config.storage.MinioS3ClientFactory.<init>(MinioS3ClientFactory.java:27) ~[io.airbyte.airbyte-config-config-models-0.40.9.jar:?]
         at io.airbyte.config.helpers.CloudLogs.createCloudLogClient(CloudLogs.java:48) ~[io.airbyte.airbyte-config-config-models-0.40.9.jar:?]
         at io.airbyte.config.helpers.LogClientSingleton.createCloudClientIfNull(LogClientSingleton.java:164) ~[io.airbyte.airbyte-config-config-models-0.40.9.jar:?]
         at io.airbyte.config.helpers.LogClientSingleton.setWorkspaceMdc(LogClientSingleton.java:151) ~[io.airbyte.airbyte-config-config-models-0.40.9.jar:?]
         at io.airbyte.workers.ApplicationInitializer.initializeCommonDependencies(ApplicationInitializer.java:180) ~[io.airbyte-airbyte-workers-0.40.9.jar:?]
         at io.airbyte.workers.ApplicationInitializer.onApplicationEvent(ApplicationInitializer.java:153) ~[io.airbyte-airbyte-workers-0.40.9.jar:?]
         at io.airbyte.workers.ApplicationInitializer.onApplicationEvent(ApplicationInitializer.java:65) ~[io.airbyte-airbyte-workers-0.40.9.jar:?]
         at io.micronaut.context.event.ApplicationEventPublisherFactory.notifyEventListeners(ApplicationEventPublisherFactory.java:262) ~[micronaut-inject-3.6.3.jar:3.6.3]
         at io.micronaut.context.event.ApplicationEventPublisherFactory.access$200(ApplicationEventPublisherFactory.java:60) ~[micronaut-inject-3.6.3.jar:3.6.3]
         at io.micronaut.context.event.ApplicationEventPublisherFactory$2.publishEvent(ApplicationEventPublisherFactory.java:229) ~[micronaut-inject-3.6.3.jar:3.6.3]
         at io.micronaut.http.server.netty.NettyHttpServer.lambda$fireStartupEvents$15(NettyHttpServer.java:574) ~[micronaut-http-server-netty-3.6.3.jar:3.6.3]
         at java.util.Optional.ifPresent(Optional.java:178) ~[?:?]
         at io.micronaut.http.server.netty.NettyHttpServer.fireStartupEvents(NettyHttpServer.java:568) ~[micronaut-http-server-netty-3.6.3.jar:3.6.3]
         at io.micronaut.http.server.netty.NettyHttpServer.start(NettyHttpServer.java:297) ~[micronaut-http-server-netty-3.6.3.jar:3.6.3]
         at io.micronaut.http.server.netty.NettyHttpServer.start(NettyHttpServer.java:104) ~[micronaut-http-server-netty-3.6.3.jar:3.6.3]
         at io.micronaut.runtime.Micronaut.lambda$start$2(Micronaut.java:81) ~[micronaut-context-3.6.3.jar:3.6.3]
         at java.util.Optional.ifPresent(Optional.java:178) ~[?:?]
         at io.micronaut.runtime.Micronaut.start(Micronaut.java:79) ~[micronaut-context-3.6.3.jar:3.6.3]
         at io.micronaut.runtime.Micronaut.run(Micronaut.java:323) ~[micronaut-context-3.6.3.jar:3.6.3]
         at io.micronaut.runtime.Micronaut.run(Micronaut.java:309) ~[micronaut-context-3.6.3.jar:3.6.3]
         at io.airbyte.workers.Application.main(Application.java:12) ~[io.airbyte-airbyte-workers-0.40.9.jar:?]
  • Renan Rigo Calesso
    09/29/2022, 5:01 PM
    Hey everyone! I'm trying to send data from Postgres to BigQuery, and after a LONG time I received the following error:
    Caused by: com.google.cloud.bigquery.BigQueryException: Cannot query rows larger than 100MB limit.
    Do you know how I can solve it?
  • Markus Notti
    09/29/2022, 10:04 PM
    Hi! Just fooling around with Airbyte on k8s right now, and I'm wondering if it's possible/common to set up sources/sinks in code with k8s YAMLs, or if it's only possible through the UI. I haven't looked too deeply, but cursory googling suggests the UI is most common, and may be the only way.
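    Beyond the UI there are the Octavia CLI and the server's config API, so config-as-code is possible. A hedged sketch of creating a source over HTTP; the endpoint and body shape are my understanding, so verify against your server's API reference:
    # Hypothetical sketch: create a source through the Airbyte config API
    # instead of the UI. IDs and the connection config are placeholders.
    import requests

    payload = {
        "workspaceId": "<workspace-uuid>",
        "sourceDefinitionId": "<postgres-source-definition-uuid>",
        "name": "postgres-prod",
        "connectionConfiguration": {"host": "db.internal", "port": 5432},
    }
    resp = requests.post("http://localhost:8000/api/v1/sources/create", json=payload)
    resp.raise_for_status()
    print(resp.json()["sourceId"])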
  • Robert Put
    09/29/2022, 10:36 PM
    Is there a way to get more detailed logging for a connector? I have a few rows of data that are missing, and I'm trying to understand why. Postgres -> Snowflake. I'm testing Airbyte to replace Stitch, which was losing a few rows of the same table...
  • Erik Eppel
    09/29/2022, 10:38 PM
    Hi, folks! I just joined the Slack, so I'm definitely still finding my way around; if I'm not in the right place for my question, don't hesitate to point me to another channel. My question is likely very silly, but I cannot seem to find an answer to it anywhere (e.g. the forum, SO, etc.). I'm confused about the [Deploy Airbyte Open Source](https://docs.airbyte.com/deploying-airbyte/on-aws-ec2/) approach of first installing Docker and Docker Compose on a VM, then cloning the Airbyte repo, and finally running the code base with
    docker-compose up -d
    . It seems to me that the more straightforward approach would be to simply create your own repo with a Docker Compose file that effectively mirrors the one that ships with the codebase, but the total absence of any mention of this approach makes me suspect I'm overlooking something important. FYI, I've currently incorporated the docs for AWS into a Packer build that produces an AMI with all of the prerequisite technologies installed, so I'm not blocked by any means. I'm more anticipating the (possibly very obvious) question from my DevOps/SRE team about why I'm using Packer instead of just using Docker. Any and all information is greatly appreciated.
  • Jerri Comeau (Airbyte)
    09/30/2022, 2:55 PM
    For more on why there are big changes to the Slack channels you see, check out our blog post: https://airbyte.com/blog/the-airbyte-community-assistance-team-were-changing-things-up
  • Venkat Dasari
    09/30/2022, 3:51 PM
    How do I get data from a REST API? I can't find a source that says REST.
  • Opeyemi Daniel
    09/30/2022, 4:53 PM
    I am trying to create a MongoDB connection on Airbyte Cloud and I am getting this error:
    Could not connect with provided configuration. Error: Timed out after 30000 ms while waiting to connect. Client view of cluster state is {type=UNKNOWN, servers=[{address=:, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketWriteException: Exception sending message}, caused by {javax.net.ssl.SSLHandshakeException: Remote host terminated the handshake}, caused by {java.io.EOFException: SSL peer shut down incorrectly}}]
    What is the issue, please?
  • Steve Palm
    09/30/2022, 6:05 PM
    Maybe this needs to go to the forums, but seeing the activity here with channel renames etc. prompted me to update our install (Docker images) to the latest version. That was a mistake. The containers start, but the server doesn't respond, and I see messages in airbyte-server about a timeout waiting for the namespace default to be initialized in Temporal. The logs for airbyte-temporal show the namespace registration attempt and that it was already registered. Seems like a Docker connection issue or something? Strange.
  • Chasen Sherman
    09/30/2022, 6:36 PM
    Semi-silly question, but how can one remove a connector from Airbyte via the UI? I see an API for that:
    *POST* /v1/source_definitions/delete_custom
    but I can't figure out how to do this from the UI 😖
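    A hedged sketch of calling that endpoint directly while the UI path stays elusive; the request body shape is an assumption, so check the API reference:
    # Hypothetical call to the endpoint above; the body shape and the
    # /api prefix are assumptions, so confirm against the API reference.
    import requests

    resp = requests.post(
        "http://localhost:8000/api/v1/source_definitions/delete_custom",
        json={"sourceDefinitionId": "<definition-uuid>"},
    )
    resp.raise_for_status()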
  • Simon Thelin
    10/01/2022, 6:26 AM
    If I use incremental append and my cursor is updated_at, and I get a new row with PK id in a Postgres source, will it still append the row with the new id? And if I do incremental append with, say, id as the PK, will I then lose track of updated values, since the id could stay the same? Assuming this is not CDC.
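    A toy model of the cursor logic (not Airbyte internals) showing both cases: with updated_at as the cursor the new PK is appended, while with id as the cursor an in-place update is missed:
    # Toy model of incremental append: emit only rows whose cursor value
    # exceeds the saved state. This mimics the idea, not Airbyte's code.
    rows = [
        {"id": 3, "updated_at": "2022-10-01"},  # brand-new row
        {"id": 1, "updated_at": "2022-10-02"},  # existing PK, value updated
    ]

    def incremental(rows, cursor, state):
        return [row for row in rows if row[cursor] > state]

    # Cursor updated_at (state = last synced timestamp): both rows appear.
    print(incremental(rows, "updated_at", "2022-09-15"))
    # Cursor id (state = highest id seen so far, 2): id 3 arrives, but the
    # update to id 1 is silently skipped, so updates are lost with a PK cursor.
    print(incremental(rows, "id", 2))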