# contributing-to-airbyte

    Sree Charan

    01/18/2022, 5:25 PM
Thanks for this very helpful project. I'm looking for help with an issue I hit while following your docs on dbt transformations. As described in the docs, I ran the following command to debug/check the default dbt project created by Airbyte:

```
docker run --rm -i -v airbyte_workspace:/data -w /data/9/1/normalize --network host --entrypoint /usr/local/bin/dbt airbyte/normalization debug --profiles-dir=. --project-dir=.
```

However, it fails with the following error:

```
Running with dbt=0.21.1
dbt version: 0.21.1
python version: 3.8.12
python path: /usr/local/bin/python
os info: Linux-4.9.125-linuxkit-x86_64-with-glibc2.2.5
Using profiles.yml file at /data/9/1/normalize/profiles.yml
Using dbt_project.yml file at /data/9/1/normalize/dbt_project.yml

Configuration:
  profiles.yml file [ERROR invalid]
  dbt_project.yml file [OK found and valid]

Required dependencies:
 - git [OK found]

1 check failed:
Profile loading failed for the following reason:
Runtime Error
  Credentials in profile "normalize", target "prod" invalid: Runtime Error
    Could not find adapter type mysql!
```

Any help in resolving this issue would be of the utmost help. Thanks in advance!
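A possible lead, based on how Airbyte packages its dbt adapters (the exact image name is an assumption worth verifying on Docker Hub): the stock `airbyte/normalization` image only bundles a few warehouse adapters, and MySQL support ships in a separate image. Pointing the same debug command at that variant may resolve the missing adapter:

```
# minimal sketch: identical command, but using the MySQL-specific
# normalization image (airbyte/normalization-mysql is an assumption)
docker run --rm -i -v airbyte_workspace:/data -w /data/9/1/normalize \
  --network host --entrypoint /usr/local/bin/dbt \
  airbyte/normalization-mysql debug --profiles-dir=. --project-dir=.
```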

    Danny Dinges

    01/19/2022, 1:41 AM
When using Postgres CDC as a source, is it best to use a destination sync mode of Incremental + Deduped, or is something else better?

    kshitij chaurasiya

    01/19/2022, 6:05 PM
Hi all, I recently deployed Airbyte on my local machine (macOS on an M1). While trying to connect to a Postgres source or destination, I get the following error:
```
    2022-01-19 18:03:25 INFO i.a.w.t.TemporalAttemptExecution(get):118 - Docker volume job log path: /tmp/workspace/ee299288-f423-45c1-a9d4-b3d678894763/0/logs.log
    2022-01-19 18:03:25 INFO i.a.w.t.TemporalAttemptExecution(get):123 - Executing worker wrapper. Airbyte version: version not set
    2022-01-19 18:03:27 INFO i.a.w.t.TemporalAttemptExecution(lambda$getWorkerThread$2):177 - Completing future exceptionally...
    io.airbyte.workers.WorkerException: Error while getting checking connection.
    	at io.airbyte.workers.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:84) ~[io.airbyte-airbyte-workers-0.35.5-alpha.jar:?]
    	at io.airbyte.workers.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:27) ~[io.airbyte-airbyte-workers-0.35.5-alpha.jar:?]
    	at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:174) ~[io.airbyte-airbyte-workers-0.35.5-alpha.jar:?]
    	at java.lang.Thread.run(Thread.java:833) [?:?]
    Caused by: java.lang.RuntimeException: java.io.IOException: Cannot run program "/tmp/scripts11060788679633185033/image_exists.sh": error=0, Failed to exec spawn helper: pid: 129, exit value: 1
    	at io.airbyte.workers.process.DockerProcessFactory.checkImageExists(DockerProcessFactory.java:205) ~[io.airbyte-airbyte-workers-0.35.5-alpha.jar:?]
    	at io.airbyte.workers.process.DockerProcessFactory.create(DockerProcessFactory.java:96) ~[io.airbyte-airbyte-workers-0.35.5-alpha.jar:?]
    	at io.airbyte.workers.process.AirbyteIntegrationLauncher.check(AirbyteIntegrationLauncher.java:60) ~[io.airbyte-airbyte-workers-0.35.5-alpha.jar:?]
    	at io.airbyte.workers.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:53) ~[io.airbyte-airbyte-workers-0.35.5-alpha.jar:?]
    	... 3 more
    Caused by: java.io.IOException: error=0, Failed to exec spawn helper: pid: 129, exit value: 1
    	at java.lang.ProcessImpl.forkAndExec(Native Method) ~[?:?]
    	at java.lang.ProcessImpl.<init>(ProcessImpl.java:314) ~[?:?]
    	at java.lang.ProcessImpl.start(ProcessImpl.java:244) ~[?:?]
    	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1110) ~[?:?]
    	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1073) ~[?:?]
    	at io.airbyte.workers.process.DockerProcessFactory.checkImageExists(DockerProcessFactory.java:193) ~[io.airbyte-airbyte-workers-0.35.5-alpha.jar:?]
    	at io.airbyte.workers.process.DockerProcessFactory.create(DockerProcessFactory.java:96) ~[io.airbyte-airbyte-workers-0.35.5-alpha.jar:?]
    	at io.airbyte.workers.process.AirbyteIntegrationLauncher.check(AirbyteIntegrationLauncher.java:60) ~[io.airbyte-airbyte-workers-0.35.5-alpha.jar:?]
    	at io.airbyte.workers.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:53) ~[io.airbyte-airbyte-workers-0.35.5-alpha.jar:?]
    	... 3 more
    2022-01-19 18:03:27 INFO i.a.w.t.TemporalAttemptExecution(get):144 - Stopping cancellation check scheduling...
    2022-01-19 18:03:27 WARN i.t.i.s.POJOActivityTaskHandler(activityFailureToResult):363 - Activity failure. ActivityId=12a45b30-5c21-336e-8a78-dd18f28ee557, activityType=Run, attempt=1
    java.util.concurrent.ExecutionException: io.airbyte.workers.WorkerException: Error while getting checking connection.
    	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?]
    	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?]
    at io.airbyte.workers.temporal.TemporalAttemptExecution.get(TemporalAttemptExecution.java:142) ~[io.airbyte-airbyte-workers-0.35.5-alpha.jar:?]
    	at io.airbyte.workers.temporal.check.connection.CheckConnectionActivityImpl.run(CheckConnectionActivityImpl.java:81) ~[io.airbyte-airbyte-workers-0.35.5-alpha.jar:?]
    	at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
    	at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) ~[?:?]
    	at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
    	at java.lang.reflect.Method.invoke(Method.java:568) ~[?:?]
    	at io.temporal.internal.sync.POJOActivityTaskHandler$POJOActivityInboundCallsInterceptor.execute(POJOActivityTaskHandler.java:286) ~[temporal-sdk-1.6.0.jar:?]
    	at io.temporal.internal.sync.POJOActivityTaskHandler$POJOActivityImplementation.execute(POJOActivityTaskHandler.java:252) ~[temporal-sdk-1.6.0.jar:?]
    	at io.temporal.internal.sync.POJOActivityTaskHandler.handle(POJOActivityTaskHandler.java:209) ~[temporal-sdk-1.6.0.jar:?]
    	at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:193) ~[temporal-sdk-1.6.0.jar:?]
    	at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:151) ~[temporal-sdk-1.6.0.jar:?]
    	at io.temporal.internal.worker.PollTaskExecutor.lambda$process$0(PollTaskExecutor.java:73) ~[temporal-sdk-1.6.0.jar:?]
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
	at java.lang.Thread.run(Thread.java:833) [?:?]
```
I searched for this issue online and found a discussion on GitHub (https://github.com/airbytehq/airbyte/issues/2017) with this comment:
```
    h7kanna commented on 27 Nov 2021
    the above error is because of the airbyte worker not able to create a process on Mac M1
    
    Add this to the worker environment in docker-compose and try once
    
JAVA_OPTS: "-Djdk.lang.Process.launchMechanism=vfork"
```
I tried this fix, but the issue persists, and now the server restarts repeatedly. My machine's configuration is as follows. Exported env variables:
```
    export DOCKER_BUILD_PLATFORM=linux/arm64
    export DOCKER_BUILD_ARCH=arm64
    export ALPINE_IMAGE=arm64v8/alpine:3.14
    export POSTGRES_IMAGE=arm64v8/postgres:13-alpine
export JDK_VERSION=17
```
    Java version:
```
    kshitijchaurasiya@ckshitij:~/SandboxDev/new_airbyte/airbyte$ java --version
    java 17.0.2 2022-01-18 LTS
    Java(TM) SE Runtime Environment (build 17.0.2+8-LTS-86)
Java HotSpot(TM) 64-Bit Server VM (build 17.0.2+8-LTS-86, mixed mode, sharing)
```
Any help will be appreciated 🙂
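One way to apply h7kanna's suggestion without editing the main compose file, sketched under the assumption that the worker service is named `worker` in Airbyte's docker-compose.yaml (check your checkout before relying on this):

```
# write a compose override that injects the suggested JAVA_OPTS into the
# worker service; the service name 'worker' is an assumption
cat > docker-compose.override.yaml <<'EOF'
version: "3.7"
services:
  worker:
    environment:
      JAVA_OPTS: "-Djdk.lang.Process.launchMechanism=vfork"
EOF
docker-compose up -d
```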

    Gorkem Yurtseven

    01/19/2022, 7:59 PM
Hi Airbyte devs! I have been following the project for a while, but today I got the chance to do a deeper dive. Kudos to you all 💯

At first, when I saw that Airbyte uses Temporal, I expected all the connector code to be implemented and registered as Temporal activities (using the Temporal Python SDK), deployed independently in containers. I would have imagined the Airbyte Python CDK wrapping the third-party connector code as a Temporal activity. A Temporal workflow could then invoke connector activities (spec, check, discover, read), and they would be picked up by a Temporal worker, retried if they fail, be cancellable, etc. But it looks like Temporal activities are instead used to run containerized connector code, and the underlying connector code does not communicate with Temporal directly. I think for each TemporalAttemptExecution a new container is created and the connector code is executed inside the newly created container. If the connector code fails in the container, Temporal does not know about it, and therefore can't retry the activity, handle cancellations, or do the other nice things Temporal does. Also, some of the container/process communication needs to be reinvented, even though it comes for free with Temporal; for example, there now has to be manual logic to cancel a running connector process.

Did I get this right? This makes me think that maybe I am overestimating Temporal's benefits. Is there a backstory on why connector code is not implemented as Temporal activities? My guess would be that the Temporal Python SDK was not mature enough (maybe it still isn't) when Airbyte was built.

    Ariyo Kabir

    01/20/2022, 10:50 AM
Hi everyone, how can I use Airbyte to handle updated records from a source database? Scenario: I have three records in my source db, and I loaded them into my destination db in the first sync. Before the second sync, a record was updated in the source db, but it still has the same primary_id after the update. I noticed that this updated record was added as a new record in the destination db, so I now have two records with the same primary_id. How can I solve issues like this?
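This is the scenario the Incremental | Deduped + history sync mode is meant to cover: with a primary key configured, normalization keeps only the latest version of each row in the final table. As a rough illustration only (the table and column names below are made up; `_airbyte_emitted_at` is the bookkeeping column Airbyte adds to synced rows), the deduplication boils down to a query like this:

```
# illustration of keeping only the newest row per primary key, run
# against a hypothetical Postgres destination; names are placeholders
psql "$DESTINATION_URL" -c "
  SELECT DISTINCT ON (primary_id) *
  FROM my_table
  ORDER BY primary_id, _airbyte_emitted_at DESC;"
```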

    Moshe Mazuz

    01/20/2022, 11:17 AM
Hi everyone! Does anybody know if Airbyte can help us capture all the data changes in SQL Server (we use the Web edition, and CDC does not work in our version) and in MongoDB?

    Nick Miller

    01/20/2022, 6:01 PM
Hello! I’m trying to understand Incremental - Append. For typical SQL DBs it seems to reduce to `SELECT * FROM table WHERE cursor_column >= cursor_last_ran;`. I’ve previously built similar implementations with Postgres, and one issue I’ve encountered is that if there is a long-running transaction/insert, the `SELECT *` may finish and update `cursor_last_ran` before the long-running transaction finishes. A pretty common default is a column called `updated_at` with a trigger/default value of `NOW()`. In Postgres, if you’re running inserts/updates, the `updated_at` column is updated when the row is updated, but that isn’t visible to the rest of the database until the transaction is committed, which could potentially be many seconds/minutes/hours later. The problem is that the `updated_at` value stays the same after commit. Is this something Airbyte has an approach to handle?
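For context, one common mitigation for late-committing transactions (a sketch of the general technique, not a statement of how Airbyte's connectors behave) is to re-read a lookback window behind the saved cursor and let deduplication absorb the repeated rows:

```
# sketch of a cursor with a lookback window: rows whose commit landed
# late, but within the window, get picked up on the next run; replace
# the literal with your stored cursor value, and tune the interval
psql "$SOURCE_URL" -c "
  SELECT * FROM my_table
  WHERE updated_at >= TIMESTAMP '2022-01-20 18:00:00' - INTERVAL '1 hour';"
```

The trade-off is re-reading some rows on every run, which is why this pairs naturally with a deduped destination mode.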

    Wadii Zaim

    01/21/2022, 2:58 PM
Hi! I'm on Python 3.9.7 and trying to run `./gradlew format`, but I keep getting `Python call failed: .venv/bin/python -m flake8 . --config /Users/***/projects/airbyte/tools/python/.flake8` in my source directory. From previous threads I tried removing the .venv, but that doesn't work. Any ideas, please?
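One way to see the underlying flake8 errors directly, instead of through Gradle's wrapper (a sketch; the relative path to the flake8 config assumes you are inside a connector directory under airbyte-integrations/connectors):

```
# recreate the venv and run flake8 by hand to get the real error output
rm -rf .venv
python3 -m venv .venv
source .venv/bin/activate
pip install flake8
python -m flake8 . --config ../../../tools/python/.flake8
```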

    Guido Turturici

    01/21/2022, 7:12 PM
Hello everyone, we are trying to contribute to `source-bigcommerce` with the `catalog/products` stream, but we are struggling with the `acceptance-tests`. If we pull directly from the main branch and run the acceptance tests, we hit failures there without making any changes (see the PR). Do you have any idea whether this connector has pre-existing issues with those tests, or could it be something about our environment? Thanks in advance 🙂

    Jens

    01/25/2022, 8:16 AM
Hi, has anyone deployed Airbyte via docker-compose and integrated a simple reverse proxy for basic auth (nginx, traefik, ...)? It would be great if you could share the compose file.
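In case it helps as a starting point, here is an untested sketch of an nginx sidecar with basic auth in front of the Airbyte webapp. The service name `webapp` and its internal port 80 follow Airbyte's docker-compose.yaml, but treat both as assumptions to verify:

```
# basic-auth nginx config proxying to the webapp service (names assumed)
cat > nginx.conf <<'EOF'
events {}
http {
  server {
    listen 8080;
    location / {
      auth_basic "Airbyte";
      auth_basic_user_file /etc/nginx/.htpasswd;
      proxy_pass http://webapp:80;
    }
  }
}
EOF

# compose override adding the proxy alongside the existing services
cat > docker-compose.override.yaml <<'EOF'
version: "3.7"
services:
  proxy:
    image: nginx:alpine
    ports:
      - "8080:8080"
    volumes:
      - ./nginx.conf:/etc/nginx/nginx.conf:ro
      - ./.htpasswd:/etc/nginx/.htpasswd:ro
EOF

# create the credentials file (htpasswd comes from apache2-utils)
htpasswd -bc .htpasswd admin changeme
docker-compose up -d
```

You would also want to stop publishing the webapp's own port, so the proxy is the only way in.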

    kananindzya

    01/25/2022, 5:20 PM
Hi, does Airbyte have a Java HTTP client?

    Harsha Teja Kanna

    01/25/2022, 10:03 PM
Hi folks, I am working on a side project to integrate Airbyte and Apache Hudi (experimental). I previously shared it in a blog post, and the Hudi team gave me an opportunity to talk about it in their monthly community meeting. I want to present a few slides on why and what I was trying to do. Join in if this sounds interesting to anyone: https://lnkd.in/gxxtWiJV, tomorrow 7 AM PST.

    ramin

    01/26/2022, 8:39 AM
Is this your first time deploying Airbyte: Yes. OS Version / Instance: Windows 10. Memory / Disk: 1 TB SSD. Deployment: Docker. Airbyte Version: https://github.com/airbytehq/airbyte.git. Step: Python CDK Speedrun: Creating a Source. Description: When I try to run the ./generate.sh command, I get "While trying to generate a connector, an error occurred on line 38 of generate.sh and the process aborted early. This is probably a bug."
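Two things worth ruling out first, offered as guesses rather than a confirmed diagnosis, since generate.sh shells out to docker and shell scripts are sensitive to Windows line endings:

```
# guesses to rule out, not a confirmed fix:
docker info                      # 1. is the docker daemon reachable?
git config core.autocrlf input   # 2. avoid CRLF line endings in shell
                                 #    scripts (re-clone after changing)
```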

    Kelvin Omereshone

    02/02/2022, 9:29 AM
Hey everyone. I've been away for a while, and I was wondering what it would take to have https://fleetdm.com as a data source on Airbyte, and whether it's even possible. Any ideas would be great.

    kananindzya

    02/02/2022, 11:24 AM
Hi, I wanted to clarify: does Airbyte allow you to create more than 1,000,000 connections? Is a lot of hardware needed so that, for example, more than 1,000,000 jobs can run in parallel?

    Ameya Bapat

    02/03/2022, 7:39 AM
I am trying to see the actual query being run during an incremental sync. I added a log line after `airbyte-integrations/connectors/source-jdbc/src/main/java/io/airbyte/integrations/source/jdbc/AbstractJdbcSource.java:265`: `LOGGER.info("Actual query : {}", sql);`. I ran `./gradlew :airbyte-integrations:connectors:source-snowflake:build` and `./gradlew :airbyte-integrations:connectors:source-jdbc:build` and then `VERSION=dev docker-compose up`, but my LOGGER.info() statements do not show up in the job's log. I can see the preceding and following logger statements, but not mine. Am I missing any step?
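A likely gap (worth verifying against the contributor docs): rebuilding the jars does not change the connector image the platform actually runs, since the source executes from its published Docker image. Building the connector's dev image and pointing the connection at it would look roughly like this:

```
# build the connector's docker image tagged 'dev' (the airbyteDocker
# task name follows Airbyte's gradle setup; confirm it in your checkout)
./gradlew :airbyte-integrations:connectors:source-snowflake:airbyteDocker
# then set the source-snowflake connector version to 'dev' in the UI
# (Settings > Sources) so syncs pick up the rebuilt image
```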

    Arian Giles Garcia

    02/03/2022, 8:05 PM
Hi everyone, how are you? I'm trying to follow the contribution guide, but I'm running into an issue when running `SUB_BUILD=PLATFORM ./gradlew clean build`: it gets stuck and keeps printing the `Database is not ready yet. Please wait a moment, it might still be initializing...` message over and over. Do you know what I might be doing wrong? 🤔 Thank you very much! 🙂

    Naveen Sai Patnana

    02/07/2022, 4:17 PM
Hi team, we triggered 2 jobs where the source is Google Ads (airbyte/source-google-ads:0.1.20) and the destination is Snowflake (airbyte/destination-snowflake:0.3.14). All of a sudden the Airbyte server crashed with the error "DEADLINE_EXCEEDED: deadline exceeded after 69.999905719s". We have seen the related issues on GitHub, and we are running with the suggested configuration. It used to work fine in Airbyte version 0.29.22, but we are facing this issue after upgrading to 0.35.15. Airbyte Version: 0.35.15-alpha. Server details: Ubuntu 20.04.3 LTS, RAM: 16 GB, 4-core CPU, disk space 200 GB. I'm adding the job logs for better understanding:
```
2022-02-07 09:24:27 WARN i.t.i.r.GrpcSyncRetryer(retry):56 - Retrying after failure
io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 69.999905719s. [closed=[], open=[[remote_addr=airbyte-temporal/192.168.144.6:7233]]]
	at io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:262) ~[grpc-stub-1.42.1.jar:1.42.1]
	at io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:243) ~[grpc-stub-1.42.1.jar:1.42.1]
	at io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:156) ~[grpc-stub-1.42.1.jar:1.42.1]
	at io.temporal.api.workflowservice.v1.WorkflowServiceGrpc$WorkflowServiceBlockingStub.getWorkflowExecutionHistory(WorkflowServiceGrpc.java:2642) ~[temporal-serviceclient-1.6.0.jar:?]
	at io.temporal.internal.client.WorkflowClientLongPollHelper.lambda$getInstanceCloseEvent$0(WorkflowClientLongPollHelper.java:143) ~[temporal-sdk-1.6.0.jar:?]
	at io.temporal.internal.retryer.GrpcSyncRetryer.retry(GrpcSyncRetryer.java:61) ~[temporal-serviceclient-1.6.0.jar:?]
	at io.temporal.internal.retryer.GrpcRetryer.retryWithResult(GrpcRetryer.java:51) ~[temporal-serviceclient-1.6.0.jar:?]
	at io.temporal.internal.client.WorkflowClientLongPollHelper.getInstanceCloseEvent(WorkflowClientLongPollHelper.java:131) ~[temporal-sdk-1.6.0.jar:?]
	at io.temporal.internal.client.WorkflowClientLongPollHelper.getWorkflowExecutionResult(WorkflowClientLongPollHelper.java:72) ~[temporal-sdk-1.6.0.jar:?]
	at io.temporal.internal.client.RootWorkflowClientInvoker.getResult(RootWorkflowClientInvoker.java:93) ~[temporal-sdk-1.6.0.jar:?]
	at io.temporal.internal.sync.WorkflowStubImpl.getResult(WorkflowStubImpl.java:243) ~[temporal-sdk-1.6.0.jar:?]
	at io.temporal.internal.sync.WorkflowStubImpl.getResult(WorkflowStubImpl.java:225) ~[temporal-sdk-1.6.0.jar:?]
	at io.temporal.internal.sync.WorkflowInvocationHandler$SyncWorkflowInvocationHandler.startWorkflow(WorkflowInvocationHandler.java:315) ~[temporal-sdk-1.6.0.jar:?]
	at io.temporal.internal.sync.WorkflowInvocationHandler$SyncWorkflowInvocationHandler.invoke(WorkflowInvocationHandler.java:270) ~[temporal-sdk-1.6.0.jar:?]
	at io.temporal.internal.sync.WorkflowInvocationHandler.invoke(WorkflowInvocationHandler.java:178) ~[temporal-sdk-1.6.0.jar:?]
	at jdk.proxy2.$Proxy40.run(Unknown Source) ~[?:?]
	at io.airbyte.workers.temporal.TemporalClient.lambda$submitSync$3(TemporalClient.java:148) ~[io.airbyte-airbyte-workers-0.35.15-alpha.jar:?]
	at io.airbyte.workers.temporal.TemporalClient.execute(TemporalClient.java:439) ~[io.airbyte-airbyte-workers-0.35.15-alpha.jar:?]
	at io.airbyte.workers.temporal.TemporalClient.submitSync(TemporalClient.java:147) ~[io.airbyte-airbyte-workers-0.35.15-alpha.jar:?]
	at io.airbyte.workers.worker_run.TemporalWorkerRunFactory.lambda$createSupplier$0(TemporalWorkerRunFactory.java:83) ~[io.airbyte-airbyte-workers-0.35.15-alpha.jar:?]
	at io.airbyte.workers.worker_run.WorkerRun.call(WorkerRun.java:51) [io.airbyte-airbyte-workers-0.35.15-alpha.jar:?]
	at io.airbyte.workers.worker_run.WorkerRun.call(WorkerRun.java:22) [io.airbyte-airbyte-workers-0.35.15-alpha.jar:?]
	at io.airbyte.commons.concurrency.LifecycledCallable.execute(LifecycledCallable.java:94) [io.airbyte-airbyte-commons-0.35.15-alpha.jar:?]
	at io.airbyte.commons.concurrency.LifecycledCallable.call(LifecycledCallable.java:78) [io.airbyte-airbyte-commons-0.35.15-alpha.jar:?]
	at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
	at java.lang.Thread.run(Thread.java:833) [?:?]
```

    Raj

    02/07/2022, 5:30 PM
Hi all, after the upgrade to 0.35.x we have observed Airbyte using a huge amount of RAM. In the screenshots below, with no jobs running, Airbyte utilizes 10-11 GB of RAM on average. Is this expected? We had to upgrade from a 16 GB system to 32 GB, as 16 GB was no longer sufficient.

    Mohamed Magdy

    02/08/2022, 5:03 PM
Hi, is there any plan to support "webhooks" as a data source? For example, the SendGrid event webhook: https://docs.sendgrid.com/for-developers/tracking-events/event

    Will Sargent

    02/09/2022, 2:25 PM
Hi Airbyte team, I was trying to sign the CLA, but it gives me a blank screen. I backed out and gave permissions to the CLA Assistant, but I still get this:

    Blake Enyart

    02/12/2022, 4:06 AM
Hi there, how difficult would it be to jump in on some feature development for this? We have been tracking this epic for some time now and would love to contribute if there is an entry point that won't slow you down too much: https://github.com/airbytehq/airbyte/issues/6911

    Thanh Le

    02/12/2022, 11:00 AM
Hello, I have my connector's documentation at https://docs.airbyte.com/integrations/destinations/streamr. How can I add it to the connectors homepage at https://airbyte.com/connectors?

    Naveen Sai Patnana

    02/14/2022, 12:23 PM
Hi team, many of our jobs are failing frequently with the error attached in the logs. Even though we configured 3 retry attempts, each job runs only once. Could you please look into it? Airbyte Version: 0.35.15-alpha. Server details: Ubuntu 20.04.3 LTS, RAM: 32 GB, 8-core CPU, disk space 200 GB.

    Ethan Veres

    02/16/2022, 3:04 PM
How does Airbyte order the streams for sources? It seems like they're ordered by name, descending, but that strikes me as a bit naive, and there could be better ways to sort the streams. For example, Airbyte should call the parent stream before calling its substreams; that would make it more resilient, in my opinion. Can someone point me to the sorting code so I can try to come up with a different solution?

    Madhu Prabhakara

    02/20/2022, 8:18 PM
Hey everyone, I ran into this issue while integrating Smartsheets (https://github.com/airbytehq/airbyte/issues/8099). I wanted to compile the Airbyte code with the fix suggested in that thread, but I get the following error when I try to compile:

```
> Task :airbyte-integrations:connectors:source-smartsheets:flakeCheck FAILED
[python] .venv/bin/python -m flake8 . --config /home/ubuntu/airbyte/tools/python/.flake8

./source_smartsheets/source.py:112:17: F841 local variable 'columns' is assigned to but never used
FAILURE: Build failed with an exception.
```

Any clue what might be wrong? I am pretty new to this, so it's possible I am doing something terribly wrong...
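F841 just means flake8 found an assigned-but-unused variable, so the build gate is working as intended; the fix belongs in the patched source. A way to iterate on it locally without rerunning Gradle each time (paths assume the standard repo layout):

```
# run the same flake8 check the flakeCheck task runs, by hand
cd airbyte-integrations/connectors/source-smartsheets
source .venv/bin/activate
python -m flake8 . --config ../../../tools/python/.flake8
# then either delete the unused 'columns' assignment in
# source_smartsheets/source.py (line 112) or, if the assignment
# must stay, append '# noqa: F841' to that line
```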

    kananindzya

    02/21/2022, 11:44 AM
Hi guys, I want to clarify: is it possible to encrypt the data in the `actor` table? I just noticed that all credentials are stored as plain text in the `configuration` field.

    Keshav Agarwal

    02/24/2022, 3:35 AM
Hi, I'm trying to make a small contribution to ClickHouse normalization, following https://docs.airbyte.com/operator-guides/transformation-and-normalization/transformations-with-sql#export-plain-sql-files. When I run `docker cp airbyte-server:/tmp/workspace/${NORMALIZE_WORKSPACE}/build/run/airbyte_utils/models/generated/ models/` I get `Error: No such container:path: airbyte-server:/tmp/workspace/38450/0/build/run/airbyte_utils/models/generated/`. Am I following the right doc, or is there a mistake here?
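One way to narrow this down (a sketch; it assumes, as the same doc implies, that the workspace volume is mounted into the airbyte-server container): list what actually exists under the workspace before copying, since the generated models only appear after a sync with normalization has run for that job/attempt:

```
# inspect the workspace to confirm the job/attempt path really exists
docker exec airbyte-server ls /tmp/workspace/
docker exec airbyte-server ls /tmp/workspace/${NORMALIZE_WORKSPACE}/
docker exec airbyte-server ls /tmp/workspace/${NORMALIZE_WORKSPACE}/build/run/airbyte_utils/models/
```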

    Keshav Agarwal

    02/24/2022, 4:06 AM
And for https://docs.airbyte.com/operator-guides/transformation-and-normalization/transformations-with-dbt#validate-dbt-project-settings, when I execute `docker run --rm -i -v airbyte_workspace:/data -w /data/$NORMALIZE_WORKSPACE/normalize --network host --entrypoint /usr/local/bin/dbt airbyte/normalization debug --profiles-dir=. --project-dir=.` I get:
```
    Running with dbt=0.21.1
    dbt version: 0.21.1
    python version: 3.8.12
    python path: /usr/local/bin/python
    os info: Linux-5.4.0-99-generic-x86_64-with-glibc2.2.5
    Using profiles.yml file at /data/38450/0/normalize/profiles.yml
    Using dbt_project.yml file at /data/38450/0/normalize/dbt_project.yml
    
    Configuration:
      profiles.yml file [ERROR invalid]
      dbt_project.yml file [OK found and valid]
    
    Required dependencies:
     - git [OK found]
    
    1 check failed:
    Profile loading failed for the following reason:
    Runtime Error
      Credentials in profile "normalize", target "prod" invalid: Runtime Error
    Could not find adapter type clickhouse!
```
Is there anything I can do?
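As with the MySQL case earlier in this channel, this looks like the stock `airbyte/normalization` image simply not bundling the ClickHouse dbt adapter; Airbyte publishes database-specific normalization images (the exact name `airbyte/normalization-clickhouse` is an assumption to verify on Docker Hub). The same debug command against that image would be:

```
# same dbt debug invocation, pointed at the ClickHouse-specific image
docker run --rm -i -v airbyte_workspace:/data \
  -w /data/$NORMALIZE_WORKSPACE/normalize \
  --network host --entrypoint /usr/local/bin/dbt \
  airbyte/normalization-clickhouse debug --profiles-dir=. --project-dir=.
```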

    Pras

    02/24/2022, 6:45 AM
Hello, just to be clear on scaling Airbyte in Kubernetes/GKE, is this a fair understanding? (The arithmetic is worked through in the sketch below.)
• Bump MAX_X_WORKERS to a higher number to increase per-worker load, X being SPEC, CHECK, SYNC, and DISCOVER.
• Keep SUBMITTER_NUMBER_THREADS equal to or higher than the sum of all MAX_X_WORKERS times the worker replica count. For example, if each of those is configured to 20 and there are three workers running, keep this value at 240+.
• Expose container/temporal ports equal to at least the sum of all MAX_X_WORKERS. For example, if each of those is configured to 20, expose 80 ports, 9001 to 9080? The default only lists 30/40. What is the difference/relation between the containerPort lines in worker.yaml and the TEMPORAL_WORKER_PORTS env variable? One lists 30 and the other 40, so they do not seem to match.
• Assuming I will eventually reach the vertical limit per worker pod, after reaching full utilization of the running workers, just keep bumping the replica count to higher values (as we add more connections)?
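The second bullet's arithmetic in one place, as a minimal sketch (the variable names are copied from the question; whether these are exactly the env vars Airbyte reads should be checked against the scaling docs):

```
# 4 worker types x 20 slots each = 80 slots per worker replica
MAX_SPEC_WORKERS=20
MAX_CHECK_WORKERS=20
MAX_SYNC_WORKERS=20
MAX_DISCOVER_WORKERS=20
# with 3 replicas: 80 slots x 3 = 240, so by the reasoning above
# SUBMITTER_NUMBER_THREADS should be 240 or higher
SUBMITTER_NUMBER_THREADS=240
```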