# ask-community-for-troubleshooting
    Alistair Wright

    03/28/2022, 2:41 PM
    Hi, I'm looking at Airbyte to load some data from a third-party BigQuery warehouse and transform it for our use, since I can't do this inside the BigQuery instance itself. Is it possible to set up a connection so that it only extracts data later than a specified key, for example only data from 2021-01-01 onward, continuing as new data gets added?
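    This is what Airbyte's Incremental sync mode with a cursor field does. A minimal, framework-free sketch of the underlying pattern (the records and field names here are hypothetical, not Airbyte's internals):
    ```python
    # Generic cursor-based incremental extraction: each run only picks up
    # records whose cursor value is newer than the last saved state.
    # Records and field names below are hypothetical.

    def incremental_extract(records, cursor_field, state):
        """Return records newer than `state`, plus the updated state value."""
        new_records = [r for r in records if state is None or r[cursor_field] > state]
        new_state = max((r[cursor_field] for r in new_records), default=state)
        return new_records, new_state

    rows = [
        {"id": 1, "created_at": "2020-12-31"},
        {"id": 2, "created_at": "2021-03-15"},
        {"id": 3, "created_at": "2021-06-01"},
    ]

    # First run with an initial cursor of 2021-01-01 emits ids 2 and 3.
    batch, state = incremental_extract(rows, "created_at", "2021-01-01")

    # A later run resuming from the saved state emits only newly added rows.
    rows.append({"id": 4, "created_at": "2021-07-04"})
    batch2, state2 = incremental_extract(rows, "created_at", state)
    ```
    Note the strict `>` comparison: records equal to the saved cursor are skipped on the next run, which is why a monotonically increasing cursor column matters.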

    Oleg

    03/28/2022, 2:42 PM
    Hello, please tell me: is there a connector for transferring data from Google BigQuery to ClickHouse?

    Sania Zafar

    03/28/2022, 4:04 PM
    Hey All, is it possible to use Airbyte with Google Cloud Composer?

    Kevin Phan

    03/28/2022, 6:51 PM
    Airbyte seems to be taking the db name and prepending it to the table names. It also seems to be making a duplicate table with that naming convention. It's a connection between Postgres and Snowflake. Anyone encounter this before?

    Tino de Bruijn

    03/28/2022, 7:42 PM
    I tried the getting started guide with Docker, but the Airbyte server doesn't want to start: it gets stuck waiting for a namespace in Temporal.

    Tino de Bruijn

    03/28/2022, 7:42 PM
    I get:
    airbyte-server | 2022-03-20 17:08:22 WARN i.a.w.t.TemporalUtils(getTemporalClientWhenConnected):194 - Ignoring exception while trying to request Temporal namespaces:
    airbyte-server | io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: Deadline exceeded after 4.998910674s.

    Tino de Bruijn

    03/28/2022, 7:43 PM
    Temporal doesn't show any errors

    Tino de Bruijn

    03/28/2022, 7:43 PM
    I tried it with 0.35.58-alpha as well as with the latest .60

    Tino de Bruijn

    03/28/2022, 7:43 PM
    What am I doing wrong?

    William Phillips

    03/28/2022, 8:15 PM
    Any reason why Airbyte isn't on the Terraform registry?

    John (Airbyte)

    03/28/2022, 10:08 PM
    has renamed the channel from "getting-started" to "airbyte-for-beginners"

    Anuj Shirgaonkar

    03/29/2022, 6:03 AM
    Hello Community, we are in the initial phases of evaluating Airbyte OSS for our ELT needs. User attribution and security is a primary need for us. We use the Okta IDP and found the following resource on the security documentation page: https://developer.okta.com/blog/2018/08/28/nginx-auth-request However, we will be running Airbyte in our private network and would like to focus more on attributing actions (new connector addition, deletion, executions, etc.) to users. Is there any way to propagate user information from Okta to Airbyte workspaces? Alternatively, can we use multiple workspaces to keep track of user actions?

    Vaibhav Kumar

    03/29/2022, 6:27 AM
    Hello Everyone, can we use log-based replication in near real time (MySQL to MySQL) in Airbyte? As of now, the minimum batch time I can see is 5 minutes. I want to do something similar to StreamSets replication in real time.

    Suntx

    03/29/2022, 12:26 PM
    Hello, I am a beginner and unable to connect a file located in a personal OneDrive (shared and available to anyone with a link) as my data source. I added the source (a OneDrive sheet) successfully, but it fails to load the schema whenever I try to create a connection. Your guidance will be greatly appreciated.

    Oleg

    03/29/2022, 1:03 PM
    Hello! Please tell me if the Airbyte service can help with our task: we need to transfer data from Google BigQuery to ClickHouse. If yes, how long (a few hours or a few days) might it take to migrate a 1.8 TB table with 1,000,000,000 (1 billion) rows?

    Sujith Kumar.S

    03/29/2022, 1:26 PM
    Hello, I'm working as a data engineer in the finance domain; I'm new to Airbyte and here to explore more of it.

    Ramon Vermeulen

    03/29/2022, 2:31 PM
    If I use File as the destination on the local docker-compose Airbyte environment for testing, where will this file be written? I suppose to one of the Docker container runtimes? Or will it write to my local system?
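    For reference: in a default docker-compose deployment, sync logs show the flag `-v /tmp/airbyte_local:/local`, i.e. the container path /local is a mount of /tmp/airbyte_local on the host, so files written by the File destination do end up on the local system. A small sketch of that path mapping, assuming the default mount:
    ```python
    # Map a File-destination path (which lives under /local inside the
    # container) to the host path, assuming the default docker-compose
    # mount -v /tmp/airbyte_local:/local.

    def container_to_host_path(destination_path: str, host_mount: str = "/tmp/airbyte_local") -> str:
        prefix = "/local"
        if destination_path != prefix and not destination_path.startswith(prefix + "/"):
            raise ValueError("File destination paths must start with /local")
        return host_mount + destination_path[len(prefix):]

    print(container_to_host_path("/local/my_data/output.csv"))
    # -> /tmp/airbyte_local/my_data/output.csv
    ```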

    Mikhail Masyagin

    03/29/2022, 4:50 PM
    Hello! I created an incremental connection to S3. It creates a new file each time it runs. How can I get the name of the created file?

    Rajath Chandregowda

    03/29/2022, 4:51 PM
    Hi All, how can we use dbt with Airbyte? dbt is for transformation, but with Airbyte we are just extracting and loading the data. What is the benefit, and for what purpose is dbt used with Airbyte?

    Rafael Auyer

    03/29/2022, 4:55 PM
    Hi! I'm new to Airbyte. I have a Postgres CDC -> S3 connection running (it's a large database). One thing that is not clear to me is the difference between all the sync modes. The docs (also this) say that Incremental - Append Sync and Full Refresh Sync are supported for S3, but when I set up the connection, I can only choose Full Refresh | Append and Full Refresh | Overwrite, or I get the "The form is invalid. Please make sure that all fields are correct." error. Check the screenshot! What am I missing? Thanks!

    Jordan Fox

    03/29/2022, 6:48 PM
    Hey guys, question: one of my sources is a vendor-hosted Oracle database that only gives us read access to views, not tables. Is there any way to handle incremental sync in this situation?

    Rafael Auyer

    03/29/2022, 7:27 PM
    What causes some tables to have Incremental | Append as an option, and others not? Is this expected? The two screenshots are from the same connection (Postgres CDC -> S3 Parquet). If this is expected, I will need to create two connections for the same database, one with CDC and one without. For the ones that do not support Incremental | Append with CDC, I'll choose a cursor field.

    Jyothi

    03/30/2022, 3:48 AM
    @here: Airbyte version 0.35.61-alpha. I have a question: does Airbyte currently have a source connector that supports custom queries as a source (the custom query being defined at the Airbyte level, not at the source)?

    Sujith Kumar.S

    03/30/2022, 5:02 AM
    Hi Team, I'm setting up Airbyte locally on my Mac with a k8s deployment. While setting up a source connection with MSSQL I get the error below; can someone guide me on this?
    Log4j2Appender says: kubectl cp /tmp/cd621464-26b9-4286-9a96-221189d04df7/source_config.json default/source-mssql-sync-d1329155-996c-4ec8-87ce-d589b6a90cea-0-rqgdy:/config/source_config.json -c init
    2022-03-30 045601 INFO i.a.w.t.TemporalAttemptExecution(lambda$getWorkerThread$2):158 - Completing future exceptionally...
    io.airbyte.workers.WorkerException: Error while getting checking connection.
        at io.airbyte.workers.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:84) ~[io.airbyte-airbyte-workers-0.35.61-alpha.jar:?]
        at io.airbyte.workers.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:27) ~[io.airbyte-airbyte-workers-0.35.61-alpha.jar:?]
        at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.35.61-alpha.jar:?]
        at java.lang.Thread.run(Thread.java:833) [?:?]
    Caused by: io.airbyte.workers.WorkerException: java.io.IOException: Cannot run program "kubectl": error=0, Failed to exec spawn helper: pid: 137, exit value: 1
        at io.airbyte.workers.process.KubeProcessFactory.create(KubeProcessFactory.java:150) ~[io.airbyte-airbyte-workers-0.35.61-alpha.jar:?]
        at io.airbyte.workers.process.AirbyteIntegrationLauncher.check(AirbyteIntegrationLauncher.java:58) ~[io.airbyte-airbyte-workers-0.35.61-alpha.jar:?]
        at io.airbyte.workers.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:53) ~[io.airbyte-airbyte-workers-0.35.61-alpha.jar:?]
        ... 3 more
    Caused by: java.lang.RuntimeException: java.io.IOException: Cannot run program "kubectl": error=0, Failed to exec spawn helper: pid: 137, exit value: 1
        at io.airbyte.workers.process.KubePodProcess.copyFilesToKubeConfigVolume(KubePodProcess.java:284) ~[io.airbyte-airbyte-workers-0.35.61-alpha.jar:?]
        at io.airbyte.workers.process.KubePodProcess.<init>(KubePodProcess.java:514) ~[io.airbyte-airbyte-workers-0.35.61-alpha.jar:?]
        at io.airbyte.workers.process.KubeProcessFactory.create(KubeProcessFactory.java:146) ~[io.airbyte-airbyte-workers-0.35.61-alpha.jar:?]
        at io.airbyte.workers.process.AirbyteIntegrationLauncher.check(AirbyteIntegrationLauncher.java:58) ~[io.airbyte-airbyte-workers-0.35.61-alpha.jar:?]
        at io.airbyte.workers.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:53) ~[io.airbyte-airbyte-workers-0.35.61-alpha.jar:?]
        ... 3 more

    Raymond Te Hok

    03/30/2022, 10:02 AM
    hi there! I am trying to apply this tutorial, and I noticed that with version 0.35.59-alpha, the command to extract normalization files uses dbt=1.0.0:
    docker run --rm -i -v airbyte_workspace:/data -w /data/$NORMALIZE_WORKSPACE/normalize --network host --entrypoint /usr/local/bin/dbt airbyte/normalization debug --profiles-dir=. --project-dir=.
    09:55:47  Running with dbt=1.0.0
    But running the transformation in Airbyte fails with:
    2022-03-30 10:00:19 INFO i.a.c.i.LineGobbler(voidCall):82 - fishtownanalytics/dbt:0.19.1 was found locally.
    2022-03-30 10:00:19 INFO i.a.w.p.DockerProcessFactory(create):106 - Creating docker job ID: 17
    2022-03-30 10:00:19 INFO i.a.w.p.DockerProcessFactory(create):158 - Preparing command: docker run --rm --init -i -w /data/17/0/transform --log-driver none --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e AIRBYTE_VERSION=0.35.59-alpha --entrypoint /bin/bash fishtownanalytics/dbt:0.19.1 entrypoint.sh run
    2022-03-30 10:00:19 dbt > Running from /data/17/0/transform/git_repo
    2022-03-30 10:00:19 dbt > detected no config file for ssh, assuming ssh is off.
    2022-03-30 10:00:19 dbt > Running: dbt run --profiles-dir=/data/17/0/transform --project-dir=/data/17/0/transform/git_repo
    2022-03-30 10:00:21 dbt > Running with dbt=0.19.1
    2022-03-30 10:00:21 dbt > Encountered an error while reading the project:
    2022-03-30 10:00:21 dbt >   ERROR: Runtime Error
    2022-03-30 10:00:21 dbt >   at path []: Additional properties are not allowed ('dispatch', 'model-paths', 'packages-install-path', 'seed-paths' were unexpected)
    Is there a way to set the dbt version in airbyte/normalization, or is there a stable version to use? Thanks!
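    The "Additional properties are not allowed" error above is a dbt version mismatch: the project files were generated for dbt 1.0, but the transformation runs in a dbt 0.19.1 image, and dbt 1.0 renamed several dbt_project.yml keys that older versions reject. An illustrative fragment (values are hypothetical):
    ```yaml
    # dbt_project.yml keys that changed in dbt 1.0; a project using the
    # new names cannot be parsed by dbt 0.19.1, which expects the old ones.
    model-paths: ["models"]                 # pre-1.0 name: source-paths
    seed-paths: ["seeds"]                   # pre-1.0 name: data-paths
    packages-install-path: "dbt_packages"   # pre-1.0 name: modules-path
    ```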

    Ramon Vermeulen

    03/30/2022, 2:21 PM
    Hi there! I'm using logging.info in my Airbyte connector code (the request_params function), but my logs don't show up when running in the Airbyte environment. What am I doing wrong?

    luisa zuluaga

    03/30/2022, 2:44 PM
    Hi team, I had an issue with dbt in Snowflake: because of the CREATE OR REPLACE statement, it is not possible to restore all data with the marvelous Snowflake Time Travel function. Is there any possibility to change CREATE OR REPLACE to CREATE IF NOT EXISTS?

    Kevin Phan

    03/30/2022, 4:00 PM
    Are there any docs on parent/child streams? I don't see any.
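    For context on the question above: a parent/child (sub)stream is one whose requests are derived from another stream's records, e.g. one API call per parent record. A framework-free sketch of the pattern; all names and data here are hypothetical, not an Airbyte CDK API:
    ```python
    # Parent/child stream pattern: the child stream generates one request
    # "slice" per parent record (e.g. fetching comments per ticket).
    # All names and data below are hypothetical.

    PARENTS = [{"id": "p1"}, {"id": "p2"}]
    CHILDREN_BY_PARENT = {"p1": ["c1", "c2"], "p2": ["c3"]}

    def read_parent():
        # A real connector would page through GET /parents here.
        yield from PARENTS

    def child_slices():
        # One slice per parent record.
        for parent in read_parent():
            yield {"parent_id": parent["id"]}

    def read_child():
        # A real connector would issue GET /parents/{parent_id}/children
        # for each slice; here we look the children up locally.
        for slice_ in child_slices():
            for child in CHILDREN_BY_PARENT[slice_["parent_id"]]:
                yield {"parent_id": slice_["parent_id"], "child": child}

    records = list(read_child())
    ```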

    David Mattern

    03/30/2022, 8:53 PM
    Hello team. I am new to Airbyte. I am trying the generate source connector tutorial on Windows with Docker installed:
    airbyte/airbyte-integrations/connector-templates/generator
    I ran ./generate and received this error message:
    While trying to generate a connector, an error occurred on line 38 of generate.sh and the process aborted early.  This is probably a bug.
    That line in generate.sh is:
    docker run --rm -it --name airbyte-connector-bootstrap --user "$_UID:$_GID" -e HOME=/tmp -v "$(pwd)/../../../.":/airbyte airbyte/connector-bootstrap
    Does anyone know what could be the problem?