# ask-ai
  •

    Pedro Doria

    05/29/2023, 12:43 PM
    I'm using Airbyte on GKE and I noticed that the Temporal database is growing very quickly (I had to resize it because the application was having problems as a result). Is there any way to decrease the retention of Temporal data?
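For the Temporal retention question above: one commonly suggested approach (a sketch, not an official procedure) is to lower the retention period of the Temporal namespace with `tctl` from inside the `airbyte-temporal` pod. The deployment name and the `default` namespace are assumptions to verify in your cluster, and the retention value format depends on the tctl version (days, or a duration such as `168h`):

```
# Lower Temporal's retention for the "default" namespace to 7 days.
# Assumes the Temporal deployment is named airbyte-temporal and that
# tctl is available inside its container.
kubectl exec -it deploy/airbyte-temporal -- \
  tctl --namespace default namespace update --retention 7
```

Note that already-written history is only reclaimed as workflows age past the new retention window, so the database shrinks gradually rather than immediately.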
  •

    Benjamin Groves

    05/29/2023, 1:39 PM
    Hello everyone, I am trying to deploy Airbyte to my EKS cluster with Helm chart
    v0.45.22
    and connect it to an external Postgres database using an existing kubernetes secret. Here is my
    values.yaml
    externalDatabase:
   host: prod-data-platform.cluster-cumb1b60veam.eu-west-1.rds.amazonaws.com
       user: eterlast
       existingSecret: data-platform-external-secret
       existingSecretPasswordKey: DATABASE_PASSWORD
       database: airbyte
       port: 5432
    
    postgresql:
       enabled: false
    However, when I deploy this, I am getting this error from the
    airbyte-bootloader
    pod:
    Error: couldn't find key DATABASE_PASSWORD in Secret data-platform/data-platform-external-secret
    I have seen this issue discussed a few times in here, but regardless, it doesn't work for me. Does anyone have a concrete way of getting this to work? Looking into the Helm template for the secret.yaml on artifacthub.io, I noticed that the
    DATABASE_PASSWORD
    secret is wrapped in an
    if statement
    which blocks its creation:
    {{ if eq .Values.externalDatabase.existingSecret "" -}}
      DATABASE_PASSWORD: {{ .Values.externalDatabase.password | default .Values.postgresql.postgresqlPassword | quote }}
    {{ end -}}
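As the template excerpt above shows, the chart only renders its own `DATABASE_PASSWORD` when `externalDatabase.existingSecret` is empty, so the referenced secret must itself contain a key with that exact name. A minimal sketch of creating and verifying it (the `data-platform` namespace comes from the error message; the password is a placeholder):

```
kubectl -n data-platform create secret generic data-platform-external-secret \
  --from-literal=DATABASE_PASSWORD='<your-db-password>'

# Verify the key actually exists under that name:
kubectl -n data-platform get secret data-platform-external-secret \
  -o jsonpath='{.data.DATABASE_PASSWORD}' | base64 -d
```

If the secret is managed by an external tool (e.g. External Secrets), make sure the rendered key is literally `DATABASE_PASSWORD`, matching `existingSecretPasswordKey`.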
  •

    Octavia Squidington III

    05/29/2023, 7:45 PM
    🔥 Community Office Hours starts in 15 minutes 🔥 Topic and schedule posted in #C045VK5AF54. At 1pm PDT click here to join us on Zoom!
  •

    Pedro Doria

    05/29/2023, 11:12 PM
    How to set the SOCAT_KUBE_CPU_REQUEST variable on GKE/Helm?
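One way this is often done (a sketch; whether `worker.extraEnv` is honored depends on your chart version) is to pass the variable to the worker deployment via Helm values, since the worker is what launches the job pods:

```yaml
worker:
  extraEnv:
    - name: SOCAT_KUBE_CPU_REQUEST
      value: "0.05"
    - name: SOCAT_KUBE_CPU_LIMIT
      value: "0.1"
```

The values shown are arbitrary examples; new job pods pick them up after the worker pods restart.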
  •

    Quang Dang Vu Nhat

    05/30/2023, 3:29 AM
    Hello everyone, I have recently updated my local Airbyte with
    git pull origin master
    (which I have used many times before), but this time, when I boot up using
    ./run-ab-platform.sh
    , it returned the error
    airbyte-bootloader exited with code 255
    service "bootloader" didn't complete successfully: exit 255
    I then started over by pulling a fresh Airbyte repo, but the same error still occurs. Does anyone know about this issue? I'd appreciate your help.
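Exit code 255 by itself doesn't say much; a first debugging step (a sketch, not a guaranteed fix) is to read the bootloader's own log, where the real error usually is, and confirm the images match the pinned version:

```
docker logs airbyte-bootloader   # the underlying error is usually here
grep '^VERSION=' .env            # version the compose file will run
docker compose pull              # make sure local images match that version
```

Common causes include a database migration failure or a version mismatch between the pulled images and the `.env` `VERSION`.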
  •

    ratibor78

    05/30/2023, 12:41 PM
    Hello there 🙂 nice to meet you all
  •

    ratibor78

    05/30/2023, 12:42 PM
    Is it possible to download or migrate the Airbyte configuration, with tasks and so on, if I move the deployment to another EC2 instance, for example? Thx
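One common approach for a Docker deployment (a sketch; the volume names are the defaults from `.env` and may differ on your instance) is to archive Airbyte's Docker volumes and restore them on the new EC2 instance:

```
# On the old instance, after `docker compose down`:
for v in airbyte_db airbyte_data airbyte_workspace; do
  docker run --rm -v "$v":/volume -v "$PWD":/backup alpine \
    tar czf "/backup/$v.tgz" -C /volume .
done
# Copy the .tgz files to the new instance, unpack them into freshly
# created volumes of the same names, then start Airbyte there.
```

The `airbyte_db` volume holds the configuration (connections, sources, destinations) and job history, so restoring it preserves the setup.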
  •

    ratibor78

    05/30/2023, 12:58 PM
    Is it possible to have autoscaling with Airbyte?
  •

    Anchit

    05/30/2023, 6:37 PM
    I've deployed Airbyte to EKS and I'm publishing metrics to Prometheus using OpenTelemetry. I don't see metric tags for the connection ID that ran, or any info about the source/destination connectors. I don't see the workspace ID either. What would be the best way to add these custom metric tags?
  •

    Jake Kagan

    05/30/2023, 9:23 PM
    How do you guys restart Airbyte when you change
    .env
    ? Do you just do
    docker compose down
    ?
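For what it's worth, Compose only reads `.env` when containers are created, so a plain `restart` is not enough; the usual sequence (a sketch) is:

```
docker compose down   # stop and remove the containers (named volumes are kept)
docker compose up -d  # recreate them, picking up the new .env values
```

Data survives because it lives in the named volumes, not in the containers themselves.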
  •

    ratibor78

    05/31/2023, 6:02 AM
    What is the best practice for exposing the Airbyte web app from an EKS installation?
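A typical pattern is an Ingress in front of the webapp Service. A sketch, with several assumptions: the Service name (`airbyte-airbyte-webapp-svc`) depends on your Helm release name (check `kubectl get svc`), and the `alb` class assumes the AWS Load Balancer Controller is installed. Put real authentication in front of it; the default basic-auth is weak:

```yaml
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: airbyte
  annotations:
    kubernetes.io/ingress.class: alb   # assumption: AWS Load Balancer Controller
spec:
  rules:
    - host: airbyte.example.com        # placeholder hostname
      http:
        paths:
          - path: /
            pathType: Prefix
            backend:
              service:
                name: airbyte-airbyte-webapp-svc
                port:
                  number: 80
```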
  •

    Srikanth Sudhindra

    05/31/2023, 4:34 PM
    Hello, is there a plan to support Airbyte connections and config via Terraform? I don't see an official provider yet.
  •

    AurΓ©lien Tamisier

    05/31/2023, 4:36 PM
    Hi 👋 We are trying to set up the Sentry integration for Airbyte connectors, but fail to see the appropriate way to do so. We are running Airbyte
    v0.40.29
    on Kubernetes, using Helm charts
    v0.43.22-helm
    , and tried adding the following environment variables to the
    worker
    deployments:
    • SENTRY_DSN: <dsn>, as documented in:
      ◦ https://docs.airbyte.com/operator-guides/configuring-airbyte#worker
      ◦ https://docs.airbyte.com/operator-guides/sentry-integration/
    • JOB_ERROR_REPORTING_STRATEGY: sentry and JOB_ERROR_REPORTING_SENTRY_DSN: <dsn>, as hinted in:
      ◦ https://github.com/airbytehq/airbyte/issues/7957#issuecomment-1297672827
      ◦ https://github.com/airbytehq/airbyte/issues/18755
      ◦ https://airbytehq.slack.com/archives/C021JANJ6TY/p1681193093461609
      ◦ https://github.com/airbytehq/airbyte-platform/blob/main/airbyte-config/config-models/src/main/java/io/airbyte/config/EnvConfigs.java
      ◦ https://github.com/airbytehq/airbyte-platform/blob/main/airbyte-config/config-models/src/main/java/io/airbyte/config/Configs.java
    • adding SENTRY_ENVIRONMENT: <env> to the mix
      ◦ though this variable is not referenced anywhere in the Airbyte / Airbyte Platform codebase
    Here is the relevant excerpt from our `values.yaml`:
    worker:
      extraEnv:
        - name: JOB_ERROR_REPORTING_STRATEGY
          value: "SENTRY"
        - name: JOB_ERROR_REPORTING_SENTRY_DSN
          valueFrom:
            secretKeyRef:
              name: airbyte-sentry-dsn
              key: dsn
    The values are properly set on the
    worker
    deployments and pods, but they are not propagated by the
    worker
    when it schedules and instantiates
    job
    pods with the source and destination connector containers. Has anyone been able to set up Sentry with Airbyte running on Kubernetes?
  •

    Jake Kagan

    05/31/2023, 5:40 PM
    Can anyone describe how you have set up your process for editing files in Airbyte? Ideally, I would work locally and push to GitLab, and from there it would update the Airbyte instance. I would really appreciate insight into how you pull that off.
  •

    Octavia Squidington III

    05/31/2023, 7:45 PM
    🔥 Office Hours starts in 15 minutes 🔥 Topic and schedule posted in #C045VK5AF54. At 1PM PDT click here to join us on Zoom!
  •

    Simon Thelin

    06/01/2023, 7:28 AM
    Does it help to deploy more than one Temporal instance if workers fail to sync?
  •

    Simon Thelin

    06/01/2023, 7:55 AM
    I can't find any documentation on
    MAX_NOTIFY_WORKERS
    MAX_SYNC_WORKERS
    MAX_SPEC_WORKERS
    MAX_CHECK_WORKERS
    MAX_DISCOVER_WORKERS
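These variables cap how many Temporal activities of each kind (notify, sync, spec, check, discover) a single worker runs concurrently. A sketch of raising two of the caps via Helm values (the numbers are arbitrary examples, and whether `worker.extraEnv` is honored depends on the chart version):

```yaml
worker:
  extraEnv:
    - name: MAX_SYNC_WORKERS
      value: "10"
    - name: MAX_CHECK_WORKERS
      value: "5"
```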
  •

    Simon Thelin

    06/01/2023, 9:50 AM
    Has anyone got
    2023-06-01 08:44:08.841 UTC [22] PANIC:  could not locate a valid checkpoint record
    with the Airbyte Postgres?
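That PANIC usually means the write-ahead log is corrupted, often after an unclean shutdown or a full disk. Restoring from a backup is the safe fix; `pg_resetwal` is a last resort that can lose recent transactions. A heavily hedged sketch, assuming the default Docker deployment where the `airbyte_db` volume is mounted at the standard Postgres data directory (verify the volume name, mount path, and Postgres major version in your own compose file first):

```
# DANGER: pg_resetwal discards WAL and can lose recent transactions.
# Back up the airbyte_db volume before trying this.
docker compose stop db
docker run --rm -v airbyte_db:/var/lib/postgresql/data postgres:13 \
  su postgres -c 'pg_resetwal /var/lib/postgresql/data'
docker compose start db
```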
  •

    Octavia Squidington III

    06/01/2023, 1:45 PM
    🔥 Office Hours starts in 15 minutes 🔥 Topic and schedule posted in #C045VK5AF54. At 16:00 CEST / 10am EDT click here to join us on Zoom!
  •

    Pedro Doria

    06/01/2023, 1:54 PM
    Guys, I have a problem. I've been researching and trying to solve it myself, but I haven't made any progress. My scenario is as follows: I had a VM with Airbyte installed, but it wasn't keeping up (we run about 400 syncs per hour), so I opted to move it to a GKE cluster. My cluster allows a maximum of 4 nodes, and even without having migrated all the loads, utilization is already at 3 nodes. I saw that each job pod starts approximately 4 containers, and these containers have 100m CPU requests each, so each pod starts with 400m CPU of requests. Actual node utilization is very low, which leads me to believe I can reduce these requests considerably so the cluster doesn't end up oversized for nothing. I was able to edit the requests for the main container through values.yaml (I'm deploying with Helm), but the other containers (relay-stderr, relay-stdout, call-heartbeat-server) still request 100m EACH. I saw that the main container has a variable called SOCAT_KUBE_CPU_REQUEST with a value of 0.1, but I haven't been able to change its value yet. Anyway, has anyone been through this? Thanks.
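For the sidecar containers specifically, the platform reads separate variables from the worker's environment: `SIDECAR_KUBE_CPU_REQUEST`/`SIDECAR_KUBE_CPU_LIMIT` for the relay and heartbeat containers, and `SOCAT_KUBE_CPU_REQUEST`/`SOCAT_KUBE_CPU_LIMIT` for the socat container. A sketch of overriding them via Helm values (the numbers are placeholders, and support for these variables depends on your platform version):

```yaml
worker:
  extraEnv:
    - name: SIDECAR_KUBE_CPU_REQUEST   # relay-stdout/stderr, heartbeat sidecars
      value: "0.05"
    - name: SOCAT_KUBE_CPU_REQUEST     # socat sidecar
      value: "0.05"
```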
  •

    Simon Thelin

    06/01/2023, 5:17 PM
    What should I do if the Airbyte server restarts randomly every few hours for no apparent reason?
  •

    Huib

    06/01/2023, 7:34 PM
    Today I decided to upgrade Airbyte from 0.40 to 0.44.5 and I have run into two significant issues. Having fought with Airbyte for a year and never gotten it to work stably, I am quite ready to throw in the towel. Can you please, please put some of the effort you put into redesigning the UI into building a stable product?
  •

    Nico Gallinal

    06/02/2023, 5:46 AM
    I tried to use secure passwords in the Octavia CLI (for example, password: ${my_password}) and it does not work. Is it a known issue in the latest 0.44.4?
  •

    Omar Allouache

    06/02/2023, 11:17 AM
    Hello, I was struggling a lot with the temporal service. I solved this error, which does not show up in the ERROR logs:
    "level":"fatal","ts":"2023-06-02T10:01:51.660Z","msg":"ListenIP failed, unable to parse bindOnIP value","address":"192.168.128.4 192.168.144.2"
    I added these
    .env
    vars to the
    airbyte-temporal
    service in
    docker-compose
    :
    - BIND_ON_IP=0.0.0.0
    - TEMPORAL_BROADCAST_ADDRESS=127.0.0.1
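For anyone hitting the same error, the variables above go under the `airbyte-temporal` service's `environment` block in `docker-compose.yaml`; a sketch of the placement:

```yaml
airbyte-temporal:
  environment:
    - BIND_ON_IP=0.0.0.0
    - TEMPORAL_BROADCAST_ADDRESS=127.0.0.1
```

The error occurs when the container has more than one network interface and Temporal cannot pick a single IP to bind to; binding to `0.0.0.0` sidesteps that.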
  •

    kapa.ai

    06/02/2023, 3:42 PM
    To set up Airbyte on your local Windows machine with WSL2 and Ubuntu 22.04 LTS, follow these steps: 1. Enable hardware virtualization in your system's BIOS. 2. Install and enable WSL2. 3. Install Ubuntu 22.04 LTS and Windows Terminal. 4. Upgrade Python to version 3.9. 5. Install Git, Docker Desktop, and Docker Compose. 6. Enable Docker to run correctly on WSL by checking the "Use the WSL 2 based engine" and "Enable Kubernetes" checkboxes in the Docker settings. 7. Navigate to your Airbyte directory in the terminal and run
    docker-compose up
    to spin up all Airbyte containers. 8. After the Airbyte banner shows up in the log, navigate to localhost:8000 to access the Airbyte UI. For more detailed instructions, please refer to the Airbyte video tutorial on YouTube: [How To Setup Your Development Environment In Windows Using WSL For Airbyte](https://www.youtube.com/watch?v=CtuU9Oi-6O8) Additionally, you can follow the [Local Deployment](https://docs.airbyte.com/deploying-airbyte/local-deployment) guide in the Airbyte documentation for more information on setting up Airbyte locally.
  •

    Nick Mancini

    06/02/2023, 3:49 PM
    Hello! I want to test out Airbyte and am trying to install it on a local Windows machine running WSL v2 Ubuntu 22.04 LTS. I'm getting this message when running the ./run-ab-platform.sh script:
    .env: line 70: SYNC_JOB_INIT_RETRY_TIMEOUT_MINUTES:: command not found
    I can't seem to find this specific message anywhere. The closest thing I found mentioned a difference between Windows line endings and Unix line endings, so I tried running dos2unix on the script, but the issue persists. Can anyone point me in the right direction? Here's the verbose script output:
    XXXX@XXXXXXX:~/airbyte$ sudo ./run-ab-platform.sh -x
    ./run-ab-platform.sh:14: shift
    ./run-ab-platform.sh:14: tput cols
    ./run-ab-platform.sh:14: test 175 -ge 64
    ./run-ab-platform.sh:14: echo -e '\033[32m'
    
    [Airbyte ASCII-art banner echo lines omitted; mis-encoded in this capture]
    ./run-ab-platform.sh:14: echo -e '                                            Move Data'
                                                Move Data
    ./run-ab-platform.sh:14: echo -e '\033[0m'
    
    ./run-ab-platform.sh:14: sleep 1
    ./run-ab-platform.sh:14: docker compose version
    ./run-ab-platform.sh:14: Download
    ./run-ab-platform.sh:14: for file in $all_files
    ./run-ab-platform.sh:14: test -f docker-compose.yaml
    ./run-ab-platform.sh:14: find docker-compose.yaml -type f -mtime +60
    ./run-ab-platform.sh:14: test
    ./run-ab-platform.sh:14: echo -e '\033[94mfound docker-compose.yaml locally!\033[39m'
    found docker-compose.yaml locally!
    ./run-ab-platform.sh:14: for file in $all_files
    ./run-ab-platform.sh:14: test -f docker-compose.debug.yaml
    ./run-ab-platform.sh:14: find docker-compose.debug.yaml -type f -mtime +60
    ./run-ab-platform.sh:14: test
    ./run-ab-platform.sh:14: echo -e '\033[94mfound docker-compose.debug.yaml locally!\033[39m'
    found docker-compose.debug.yaml locally!
    ./run-ab-platform.sh:14: for file in $all_files
    ./run-ab-platform.sh:14: test -f .env
    ./run-ab-platform.sh:14: find .env -type f -mtime +60
    ./run-ab-platform.sh:14: test
    ./run-ab-platform.sh:14: echo -e '\033[94mfound .env locally!\033[39m'
    found .env locally!
    ./run-ab-platform.sh:14: for file in $all_files
    ./run-ab-platform.sh:14: test -f .env.dev
    ./run-ab-platform.sh:14: find .env.dev -type f -mtime +60
    ./run-ab-platform.sh:14: test
    ./run-ab-platform.sh:14: echo -e '\033[94mfound .env.dev locally!\033[39m'
    found .env.dev locally!
    ./run-ab-platform.sh:14: for file in $all_files
    ./run-ab-platform.sh:14: test -f flags.yml
    ./run-ab-platform.sh:14: find flags.yml -type f -mtime +60
    ./run-ab-platform.sh:14: test
    ./run-ab-platform.sh:14: echo -e '\033[94mfound flags.yml locally!\033[39m'
    found flags.yml locally!
    ./run-ab-platform.sh:14: for file in $dot_env $dot_env_dev
    ./run-ab-platform.sh:14: echo -e '\033[94mLoading Shell Variables from .env...\033[39m'
    Loading Shell Variables from .env...
    ./run-ab-platform.sh:14: source .env
    ./run-ab-platform.sh:14: VERSION=0.44.8
    ./run-ab-platform.sh:14: CONFIG_ROOT=/data
    ./run-ab-platform.sh:14: DATA_DOCKER_MOUNT=airbyte_data
    ./run-ab-platform.sh:14: DB_DOCKER_MOUNT=airbyte_db
    ./run-ab-platform.sh:14: WORKSPACE_ROOT=/tmp/workspace
    ./run-ab-platform.sh:14: WORKSPACE_DOCKER_MOUNT=airbyte_workspace
    ./run-ab-platform.sh:14: LOCAL_ROOT=/tmp/airbyte_local
    ./run-ab-platform.sh:14: LOCAL_DOCKER_MOUNT=/tmp/airbyte_local
    ./run-ab-platform.sh:14: HACK_LOCAL_ROOT_PARENT=/tmp
    ./run-ab-platform.sh:14: BASIC_AUTH_USERNAME=airbyte
    ./run-ab-platform.sh:14: BASIC_AUTH_PASSWORD=password
    ./run-ab-platform.sh:14: BASIC_AUTH_PROXY_TIMEOUT=900
    ./run-ab-platform.sh:14: DATABASE_USER=docker
    ./run-ab-platform.sh:14: DATABASE_PASSWORD=docker
    ./run-ab-platform.sh:14: DATABASE_HOST=db
    ./run-ab-platform.sh:14: DATABASE_PORT=5432
    ./run-ab-platform.sh:14: DATABASE_DB=airbyte
    ./run-ab-platform.sh:14: DATABASE_URL=jdbc:postgresql://db:5432/airbyte
    ./run-ab-platform.sh:14: JOBS_DATABASE_MINIMUM_FLYWAY_MIGRATION_VERSION=0.40.26.001
    ./run-ab-platform.sh:14: CONFIG_DATABASE_USER=
    ./run-ab-platform.sh:14: CONFIG_DATABASE_PASSWORD=
    ./run-ab-platform.sh:14: CONFIG_DATABASE_URL=
    ./run-ab-platform.sh:14: CONFIGS_DATABASE_MINIMUM_FLYWAY_MIGRATION_VERSION=0.40.23.002
    ./run-ab-platform.sh:14: TEMPORAL_HOST=airbyte-temporal:7233
    ./run-ab-platform.sh:14: INTERNAL_API_HOST=airbyte-server:8001
    ./run-ab-platform.sh:14: CONNECTOR_BUILDER_API_HOST=airbyte-connector-builder-server:80
    ./run-ab-platform.sh:14: WEBAPP_URL=http://localhost:8000/
    ./run-ab-platform.sh:14: CONNECTOR_BUILDER_API_URL=/connector-builder-api
    ./run-ab-platform.sh:14: SYNC_JOB_MAX_ATTEMPTS=3
    ./run-ab-platform.sh:14: SYNC_JOB_MAX_TIMEOUT_DAYS=3
    ./run-ab-platform.sh:14: SYNC_JOB_INIT_RETRY_TIMEOUT_MINUTES: 5
    .env: line 70: SYNC_JOB_INIT_RETRY_TIMEOUT_MINUTES:: command not found
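The last two lines of the trace point at the actual problem: line 70 of the `.env` uses YAML-style `SYNC_JOB_INIT_RETRY_TIMEOUT_MINUTES: 5`, but `.env` files sourced by the shell must use `KEY=value`; with a colon, the shell interprets `SYNC_JOB_INIT_RETRY_TIMEOUT_MINUTES:` as a command name, hence "command not found". A minimal reproduction (file names are illustrative):

```shell
# KEY=value sources cleanly; "KEY: value" is executed as a command.
printf 'GOOD=5\n' > /tmp/good.env
printf 'BAD: 5\n' > /tmp/bad.env
. /tmp/good.env && echo "GOOD=$GOOD"
( . /tmp/bad.env ) 2>&1 | grep -o 'not found'   # the shell tried to run "BAD:"
```

Editing line 70 of `.env` to `SYNC_JOB_INIT_RETRY_TIMEOUT_MINUTES=5` should let the script proceed.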
  •

    Caio CΓ©sar P. Ricciuti

    06/02/2023, 5:34 PM
    Hey all, not really a deploy question, but is the Airbyte API functional on
    self-hosted
    instances? I was looking into the docs and they only mention Airbyte Cloud... Has anyone used the API while hosting Airbyte yourself? TIA
  •

    Octavia Squidington III

    06/02/2023, 7:45 PM
    🔥 Community Office Hours starts in 15 minutes 🔥 At 1pm PDT click here to join us on Zoom!
  •

    Sharat Visweswara

    06/02/2023, 7:55 PM
    Hello, I have Airbyte deployed on Kubernetes using Plural.sh. How do I configure CPU and memory requests and limits for sync jobs?
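Regardless of how the platform was installed, sync-job pod resources are driven by `JOB_MAIN_CONTAINER_*` environment variables on the worker. A sketch via Helm-style values (the numbers are placeholders to tune; how Plural surfaces these values is an assumption to check in its configuration):

```yaml
worker:
  extraEnv:
    - name: JOB_MAIN_CONTAINER_CPU_REQUEST
      value: "250m"
    - name: JOB_MAIN_CONTAINER_CPU_LIMIT
      value: "1"
    - name: JOB_MAIN_CONTAINER_MEMORY_REQUEST
      value: "256Mi"
    - name: JOB_MAIN_CONTAINER_MEMORY_LIMIT
      value: "1Gi"
```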
  •

    King Ho

    06/05/2023, 6:53 AM
    Hi, if I have a pipeline that I want to migrate to another instance of Airbyte, but keep the already-synced data, what's the best way to do that? I have Airbyte running on Docker in a VM, with Hubspot as the source connector and BigQuery as the destination. Please advise!