# ask-community-for-troubleshooting
  • s

    Stuart Horgan

    11/24/2022, 4:01 PM
    Hi - docker question - can anyone tell me what to run to get airbyte local to pull in my nginx template changes in airbyte-proxy? I have tried
    docker-compose build
    and
    docker-compose up --build
    and neither seems to be doing anything. I can see in the code the chain of events that should produce the file /etc/nginx/nginx.conf: airbyte/airbyte-proxy/Dockerfile takes the template file and copies it to /etc/nginx/templates in the container. Then airbyte/airbyte-proxy/run.sh uses that template to create /etc/nginx/nginx.conf. At least that is how it looks from my limited understanding. But when I run docker-compose with the --build option and copy out the nginx folder to check using
    docker cp airbyte-proxy:/etc/nginx .
    , it's the original unchanged file. How do I get it to take my changes? Should I just hit the delete button next to the 'proxy' image in the Docker Desktop UI and then run docker-compose up? Very new to docker and just don't want to break everything by accident!
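    One thing that usually forces the change through (a sketch, assuming the proxy service is named airbyte-proxy in docker-compose.yaml): a plain docker-compose build can be satisfied entirely from the layer cache, so rebuilding just that service without the cache and recreating its container makes sure the edited template is copied in and re-rendered.
    Copy code
    # Rebuild only the proxy image, bypassing the layer cache
    docker-compose build --no-cache airbyte-proxy
    # Recreate the proxy container from the fresh image
    docker-compose up -d --force-recreate airbyte-proxy
    # Check the rendered config inside the running container
    docker exec airbyte-proxy cat /etc/nginx/nginx.conf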
    ✅ 1
    m
    j
    • 3
    • 9
  • j

    James Salmon

    11/24/2022, 5:39 PM
    Hi all, new to Airbyte and trying to install a standalone Docker container. However, after cloning the git repo, running docker-compose up / build just gives me a bunch of errors below. I feel like I am missing something obvious here?
    Copy code
    Traceback (most recent call last):
      File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 699, in urlopen
        httplib_response = self._make_request(
      File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 394, in _make_request
        conn.request(method, url, **httplib_request_kw)
      File "/usr/lib/python3.10/http/client.py", line 1282, in request
        self._send_request(method, url, body, headers, encode_chunked)
      File "/usr/lib/python3.10/http/client.py", line 1328, in _send_request
        self.endheaders(body, encode_chunked=encode_chunked)
      File "/usr/lib/python3.10/http/client.py", line 1277, in endheaders
        self._send_output(message_body, encode_chunked=encode_chunked)
      File "/usr/lib/python3.10/http/client.py", line 1037, in _send_output
        self.send(msg)
      File "/usr/lib/python3.10/http/client.py", line 975, in send
        self.connect()
      File "/usr/lib/python3/dist-packages/docker/transport/unixconn.py", line 30, in connect
        sock.connect(self.unix_socket)
    ConnectionRefusedError: [Errno 111] Connection refused

    During handling of the above exception, another exception occurred:

    Traceback (most recent call last):
      File "/usr/lib/python3/dist-packages/requests/adapters.py", line 439, in send
        resp = conn.urlopen(
      File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 755, in urlopen
        retries = retries.increment(
      File "/usr/lib/python3/dist-packages/urllib3/util/retry.py", line 532, in increment
        raise six.reraise(type(error), error, _stacktrace)
      File "/usr/lib/python3/dist-packages/six.py", line 718, in reraise
        raise value.with_traceback(tb)
      File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 699, in urlopen
        httplib_response = self._make_request(
      File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 394, in _make_request
        conn.request(method, url, **httplib_request_kw)
      File "/usr/lib/python3.10/http/client.py", line 1282, in request
        self._send_request(method, url, body, headers, encode_chunked)
      File "/usr/lib/python3.10/http/client.py", line 1328, in _send_request
        self.endheaders(body, encode_chunked=encode_chunked)
      File "/usr/lib/python3.10/http/client.py", line 1277, in endheaders
        self._send_output(message_body, encode_chunked=encode_chunked)
      File "/usr/lib/python3.10/http/client.py", line 1037, in _send_output
        self.send(msg)
      File "/usr/lib/python3.10/http/client.py", line 975, in send
        self.connect()
      File "/usr/lib/python3/dist-packages/docker/transport/unixconn.py", line 30, in connect
        sock.connect(self.unix_socket)
    urllib3.exceptions.ProtocolError: ('Connection aborted.', ConnectionRefusedError(111, 'Connection refused'))

    During handling of the above exception, another exception occurred:

    Traceback (most recent call last):
      File "/usr/lib/python3/dist-packages/docker/api/client.py", line 214, in _retrieve_server_version
        return self.version(api_version=False)["ApiVersion"]
      File "/usr/lib/python3/dist-packages/docker/api/daemon.py", line 181, in version
        return self._result(self._get(url), json=True)
      File "/usr/lib/python3/dist-packages/docker/utils/decorators.py", line 46, in inner
        return f(self, *args, **kwargs)
      File "/usr/lib/python3/dist-packages/docker/api/client.py", line 237, in _get
        return self.get(url, **self._set_request_timeout(kwargs))
      File "/usr/lib/python3/dist-packages/requests/sessions.py", line 555, in get
        return self.request('GET', url, **kwargs)
      File "/usr/lib/python3/dist-packages/requests/sessions.py", line 542, in request
        resp = self.send(prep, **send_kwargs)
      File "/usr/lib/python3/dist-packages/requests/sessions.py", line 655, in send
        r = adapter.send(request, **kwargs)
      File "/usr/lib/python3/dist-packages/requests/adapters.py", line 498, in send
        raise ConnectionError(err, request=request)
    requests.exceptions.ConnectionError: ('Connection aborted.', ConnectionRefusedError(111, 'Connection refused'))

    During handling of the above exception, another exception occurred:

    Traceback (most recent call last):
      File "/usr/bin/docker-compose", line 33, in <module>
        sys.exit(load_entry_point('docker-compose==1.29.2', 'console_scripts', 'docker-compose')())
      File "/usr/lib/python3/dist-packages/compose/cli/main.py", line 81, in main
        command_func()
      File "/usr/lib/python3/dist-packages/compose/cli/main.py", line 200, in perform_command
        project = project_from_options('.', options)
      File "/usr/lib/python3/dist-packages/compose/cli/command.py", line 60, in project_from_options
        return get_project(
      File "/usr/lib/python3/dist-packages/compose/cli/command.py", line 152, in get_project
        client = get_client(
      File "/usr/lib/python3/dist-packages/compose/cli/docker_client.py", line 41, in get_client
        client = docker_client(
      File "/usr/lib/python3/dist-packages/compose/cli/docker_client.py", line 170, in docker_client
        client = APIClient(use_ssh_client=not use_paramiko_ssh, **kwargs)
      File "/usr/lib/python3/dist-packages/docker/api/client.py", line 197, in __init__
        self._version = self._retrieve_server_version()
      File "/usr/lib/python3/dist-packages/docker/api/client.py", line 221, in _retrieve_server_version
        raise DockerException(
    docker.errors.DockerException: Error while fetching server API version: ('Connection aborted.', ConnectionRefusedError(111, 'Connection refused'))
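    For what it's worth, Errno 111 on the Docker unix socket usually means the Docker daemon itself isn't running (or the current user can't reach it), not an Airbyte problem. A quick check on a systemd-based Linux host might look like this:
    Copy code
    # Is the Docker daemon running?
    sudo systemctl status docker
    # Start it now (and enable it on boot) if it isn't
    sudo systemctl enable --now docker
    # Optional: let your user talk to the socket without sudo (log out/in afterwards)
    sudo usermod -aG docker $USER
    # Sanity check before retrying docker-compose up
    docker info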
    ☝️ 1
    s
    • 2
    • 4
  • g

    Gopinath Sekar

    11/24/2022, 7:09 PM
    Hello! I've been trying to do some custom transformations on Airbyte using dbt. Currently I am stuck with the error message 'Fatal: Invalid --project-dir flag. Not a dbt project. Missing dbt_project.yml file.' Has anyone faced this? Am I doing this wrong? Please help. Thanks in advance!
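    For context, dbt only treats a directory as a project when a dbt_project.yml sits at its root, and the --project-dir Airbyte passes to the custom transformation must point at that directory inside the git repo you configured. A minimal sketch of such a file (the project name and profile are placeholders and must match your setup; exact field names vary slightly between dbt versions):
    Copy code
    # run from the repo directory that --project-dir points at
    cat > dbt_project.yml <<'EOF'
    name: my_airbyte_transformations
    version: '1.0.0'
    config-version: 2
    profile: default
    model-paths: ["models"]
    EOF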
    m
    a
    • 3
    • 4
  • s

    Sean Soutar

    11/24/2022, 8:37 PM
    hi all! I've been trying to sync my Postgres DB to BigQuery using the GCS staging approach. I've encountered a "No timezone information found" error on my cursor column. It is of type
    timestamp
    in my Postgres DB. This matches what another user was experiencing in September of this year: https://github.com/airbytehq/airbyte/issues/17319 I created a small sub-table of the data I want to sync in Postgres and changed it to
    timestamp with time zone
    and it currently seems to be syncing fine using the BigQuery destination (though I switched to the beta denormalized version). I don't really want to have to change the data type for the main table as it is quite large. Has anyone else experienced this problem before?
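    If altering the main table ever becomes acceptable, the same cast used on the sub-table can be applied directly; a sketch with placeholder names (note this rewrites the column and will lock a large table while it runs):
    Copy code
    psql "$POSTGRES_URL" -c \
      "ALTER TABLE my_schema.my_table
         ALTER COLUMN updated_at TYPE timestamptz
         USING updated_at AT TIME ZONE 'UTC';"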
    s
    • 2
    • 8
  • f

    Fabiano Pena

    11/24/2022, 11:06 PM
    Hi Team, Is it possible to increase the parallelism by creating more than 1 pod per stream so we can speed up the loading? If not, is there another option for optimizing the job?
    s
    • 2
    • 3
  • r

    Rahul Borse

    11/25/2022, 6:19 AM
    Hi Team, my Postgres source tables have a primary key, but the Incremental | Deduped sync option is still not available. According to the documentation it should be there. Can someone help me with this?
    m
    • 2
    • 9
  • h

    Heine Bodekær

    11/25/2022, 7:43 AM
    Hi Team - We have been struggling for quite a while with stabilizing our Airbyte environment. We are trying to sync a massive Postgres to Snowflake. The Postgres is huge in terms of number of tables: the structure is tenant-based, which means tables are replicated for each tenant. We are using Octavia to set up sources and connections. We have split the connections up into tenant + subset of tables, meaning we have 200+ connections with 10-20 tables in each connection. This makes the interface very unresponsive no matter the size of the EC2 instance we are running on. Looking at the EC2 usage we are quite low on CPU and memory consumption, however the interface is still unresponsive and we keep getting "Oops! Something went wrong…" messages. This means we actually don't know if Airbyte is working or not. Looking at the target tables we don't see any updates. Currently, moving to a Kubernetes-based setup is not an option. Do you have any suggestions for addressing this issue?
    🔼 1
    u
    • 2
    • 3
  • r

    Rahul Borse

    11/25/2022, 12:08 PM
    Hi Team, as per the Airbyte roadmap below, the column selection implementation is planned to be finished in Q4. Can someone please let me know when this feature will be available, or share the timeline? We need to make a few decisions based on it and use it in our product. https://app.harvestr.io/roadmap/view/pQU6gdCyc/launch-week-roadmap?p=lymTZInr0
    • 1
    • 1
  • r

    Rahul Borse

    11/25/2022, 1:46 PM
    Hi Team, is there any way we can encrypt source columns from the Postgres connector? At least, could I get help understanding where in the source code I can make changes? If someone can help, please let me know.
    • 1
    • 1
  • a

    Alexander Schmidt

    11/25/2022, 4:16 PM
    Is there a reason you have container_name in your compose file? When deploying as a swarm I need to remove all container_name properties from the compose file (and rename server and db in compose and .env), or else nginx can't find the airbyte-server host.
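    For anyone hitting the same thing, one way to strip the properties without hand-editing the file (a sketch, assuming the mikefarah yq v4 CLI is installed; the output file name and stack name are arbitrary):
    Copy code
    # Drop every container_name before deploying the stack to swarm
    yq 'del(.services[].container_name)' docker-compose.yaml > docker-compose.swarm.yaml
    docker stack deploy -c docker-compose.swarm.yaml airbyte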
    m
    • 2
    • 1
  • j

    Jonathan Vieira

    11/25/2022, 5:51 PM
    Hey guys, just a quick question. Is Airbyte always collecting usage data from my deployment, or only if I have this option set? (I can't find anything in the docs.)
    j
    a
    c
    • 4
    • 4
  • k

    KalaSai

    11/25/2022, 6:47 PM
    Hi, I am very new to Airbyte. I am trying to set up Postgres-to-Snowflake replication and need to containerize the whole solution on EC2. Can someone please help me with the steps to achieve this? The QuickStart covers setting up a local application and creating connections, but I need to set up the CDC and replication on EC2 or somewhere similar. From reading through the docs and following a few videos, I think I would need: 1. Airbyte running in one pod - what are the steps to achieve this? 2. Pull the Snowflake destination image - pull, build with Gradle, and then build. 3. Set up a standalone Postgres image. I'm not understanding how to connect the dots? Greatly appreciate your help.
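    For reference, connector images are pulled automatically once a source or destination is configured in the UI, so on a plain EC2 box the core steps are usually just cloning the repo and starting the stack with Docker Compose; a sketch (ports and details may differ by Airbyte version):
    Copy code
    # On the EC2 instance, with Docker and docker-compose already installed
    git clone https://github.com/airbytehq/airbyte.git
    cd airbyte
    docker-compose up -d
    # The UI is then served on port 8000; the Postgres source, Snowflake
    # destination and the connection (including CDC) are configured there.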
    • 1
    • 1
  • m

    Manish Tomar

    11/25/2022, 9:23 PM
    Hi, even though I have specified the schema name in the Snowflake destination connector, Airbyte still inserts the data into the public schema instead of the specified one. Please help 🙏
    • 1
    • 1
  • b

    Bhavya Verma

    11/26/2022, 7:06 AM
    Hi guys, hope you are all doing well. I wanted to ask, since I'm not an expert on Kubernetes: is it possible to deploy Airbyte Open Source on Kubernetes on GCP? If yes, how should I go about it? Should I be using the GKE engine that GCP offers, and then finally have a redirect URL to it with a Google Sign-On authentication layer on top?
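    For anyone attempting this, GKE plus Airbyte's Helm chart is the usual route; a rough sketch (cluster name, zone and node sizing are placeholders, and the chart values will need tuning):
    Copy code
    # Create a small GKE cluster and fetch credentials
    gcloud container clusters create airbyte --zone us-central1-a --num-nodes 3
    gcloud container clusters get-credentials airbyte --zone us-central1-a
    # Install Airbyte from its Helm chart
    helm repo add airbyte https://airbytehq.github.io/helm-charts
    helm repo update
    helm install airbyte airbyte/airbyte
    An authentication layer (for example Google Sign-On via Identity-Aware Proxy) would then sit in front of whatever ingress or load balancer exposes the webapp.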
    f
    • 2
    • 3
  • g

    Gopinath Sekar

    11/26/2022, 9:17 AM
    Hi guys, I am currently trying to pull a table from Redshift, apply a dbt transformation to it, and push it into Snowflake. I have set up the replication for the required table and the sync is happening, but I don't see the transformation being applied. Any idea why? Can someone point out if I am doing something wrong?
    r
    • 2
    • 3
  • r

    Resford Rouzer

    11/26/2022, 6:12 PM
    Has anyone had an issue making a custom google analytics report where a metric name contains a space? I am using the GA4 Dimensions & Metrics Explorer https://ga-dev-tools.web.app/ga4/dimensions-metrics-explorer/ and when I use their query explorer it fails.
    • 1
    • 1
  • j

    junfeng pan

    11/27/2022, 4:19 AM
    I'm trying to use Airbyte to synchronize data from Shopify to Postgres. The basic raw data is fine, but when I use the normalized tabular data transformation it generates a lot of sub-tables such as price_set. I want to avoid having these JSON objects normalized into sub-tables - how should I set that up? Has anyone run into the same problem?
    🙏 1
    m
    • 2
    • 4
  • b

    Benedikt Buchert

    11/27/2022, 7:57 AM
    Hi, I'm having issues writing data to BigQuery from the Facebook Marketing source when using the GCS staging method. I have assigned the following permissions to the service account (see screenshot):
    • BigQuery Data Editor (was not sure from the provided documentation if it's needed when using GCS; found a hint on Slack)
    • BigQuery Job User (same note as above)
    • BigQuery User (same note as above)
    • Storage Object Admin
    I also generated an HMAC key + secret for the bucket as described here: https://airbytehq.github.io/integrations/destinations/bigquery/#recommended-using-a-google-cloud-storage-bucket and here: https://cloud.google.com/storage/docs/authentication/managing-hmackeys#create This confused me though:
    • Make sure the HMAC key is created for the BigQuery service account, and the service account has permission to access the GCS bucket and path.
    I additionally added the optional Service Account Key JSON (not sure what that would do), but I'm still getting permission errors.
    Copy code
    2022-11-27 07:43:14 - Additional Failure Information: com.amazonaws.services.s3.model.AmazonS3Exception: Forbidden (Service: Amazon S3; Status Code: 403; Error Code: 403 Forbidden; Request ID: null; S3 Extended Request ID: null; Proxy: null), S3 Extended Request ID: null
    Strangely, it is talking about Amazon S3. See the screenshot below. The connection test when saving the connection passes, though. When clicking "retest connection" I don't see an error or a success message either.
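    One note that may explain the wording: the destination's GCS staging path uses GCS's S3-compatible interoperability API with the HMAC key, which is why the error comes from an Amazon S3 client; a 403 there usually points at the HMAC key or bucket permissions rather than BigQuery. A quick way to cross-check (service account, project and bucket names are placeholders):
    Copy code
    # Create an HMAC key tied to the same service account the destination uses
    gsutil hmac create airbyte-sa@my-project.iam.gserviceaccount.com
    # List HMAC keys and confirm the access ID matches the destination config
    gsutil hmac list -p my-project
    # Confirm the service account can reach the staging bucket
    gsutil ls gs://my-airbyte-staging-bucket/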
    🦆 1
    ✅ 1
    y
    • 2
    • 6
  • m

    Miki Haiat

    11/27/2022, 12:07 PM
    Hi team, I'm syncing Facebook Ads to BigQuery using normalized tabular data. After the sync I can see many tables created in BQ. Are there any documents describing the new table schemas and fields?
    ✅ 1
    b
    • 2
    • 3
  • m

    Manish Tomar

    11/27/2022, 4:11 PM
    Please Help !
    • 1
    • 1
  • m

    Mazin Zreik

    11/27/2022, 8:12 PM
    Hi guys, I'm considering using Airbyte to move my RDS MySQL data into Redshift. My question is: is it possible to set the connection to use incremental sync without having duplicates (updating modified rows and appending only new ones)?
    • 1
    • 1
  • t

    Temidayo Azeez

    11/28/2022, 4:17 AM
    Help needed. This is my first time using Airbyte. I'm currently using this tutorial, and I have been having trouble setting up my source. I keep getting
    this connections test failed. Non-json response
    . Below are some of the log files. This is the link to the tutorial: https://airbyte.com/tutorials/real-time-data-analytics-pipeline
    Copy code
    airbyte-webapp                      | 172.20.0.10 - airbyte [28/Nov/2022:03:28:44 +0000] "GET /api/v1/health HTTP/1.0" 200 18 "<http://localhost:8000/workspaces/dca82ff5-4150-4056-841b-57eb99c7c045/source/new-source|http://localhost:8000/workspaces/dca82ff5-4150-4056-841b-57eb99c7c045/source/new-source>" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/107.0.0.0 Safari/537.36" "172.20.0.1"
    airbyte-proxy                       | 172.20.0.1 - airbyte [28/Nov/2022:03:28:44 +0000] "GET /api/v1/health HTTP/1.1" 200 18 "<http://localhost:8000/workspaces/dca82ff5-4150-4056-841b-57eb99c7c045/source/new-source|http://localhost:8000/workspaces/dca82ff5-4150-4056-841b-57eb99c7c045/source/new-source>" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/107.0.0.0 Safari/537.36"
    airbyte-worker                      | 2022-11-28 03:28:54 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - HikariPool-1 - Shutdown initiated...
    airbyte-worker                      | 2022-11-28 03:28:58 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - HikariPool-1 - Shutdown completed.
    airbyte-worker                      | 2022-11-28 03:28:58 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Completed integration: io.airbyte.integrations.base.ssh.SshWrappedSource
    airbyte-worker                      | 2022-11-28 03:28:58 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - completed source: class io.airbyte.integrations.source.mysql.MySqlSource
    airbyte-worker                      | 2022-11-28 03:29:00 INFO i.a.c.i.LineGobbler(voidCall):114 -
    airbyte-worker                      | 2022-11-28 03:29:00 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- END CHECK -----
    airbyte-worker                      | 2022-11-28 03:29:00 INFO i.a.c.i.LineGobbler(voidCall):114 -
    airbyte-server                      | 2022-11-28 03:29:01 INFO i.a.s.RequestLogger(filter):112 - REQ 172.20.0.7 POST 200 /api/v1/scheduler/sources/check_connection -
    {"connectionConfiguration":"REDACTED","workspaceId":"dca82ff5-4150-4056-841b-57eb99c7c045","sourceDefinitionId":"435bb9a5-7887-4809-aa58-28c27df0d7ad"}
    airbyte-server                      | 2022-11-28 03:29:04 INFO i.a.s.RequestLogger(filter):112 - REQ 172.20.0.7 GET 200 /api/v1/health
    airbyte-webapp                      | 172.20.0.10 - airbyte [28/Nov/2022:03:29:04 +0000] "GET /api/v1/health HTTP/1.0" 200 18 "<http://localhost:8000/workspaces/dca82ff5-4150-4056-841b-57eb99c7c045/source/new-source|http://localhost:8000/workspaces/dca82ff5-4150-4056-841b-57eb99c7c045/source/new-source>" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/107.0.0.0 Safari/537.36" "172.20.0.1"
    airbyte-proxy                       | 172.20.0.1 - airbyte [28/Nov/2022:03:29:04 +0000] "GET /api/v1/health HTTP/1.1" 200 18 "<http://localhost:8000/workspaces/dca82ff5-4150-4056-841b-57eb99c7c045/source/new-source|http://localhost:8000/workspaces/dca82ff5-4150-4056-841b-57eb99c7c045/source/new-source>" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/107.0.0.0 Safari/537.36"
    airbyte-cron                        | 2022-11-28 03:29:08 INFO i.a.c.s.DefinitionsUpdater(updateDefinitions):54 - Connector definitions update disabled.
    airbyte-server                      | 2022-11-28 03:29:23 INFO i.a.s.RequestLogger(filter):112 - REQ 172.20.0.7 GET 200 /api/v1/health
    airbyte-webapp                      | 172.20.0.10 - airbyte [28/Nov/2022:03:29:23 +0000] "GET /api/v1/health HTTP/1.0" 200 18 "<http://localhost:8000/workspaces/dca82ff5-4150-4056-841b-57eb99c7c045/source/new-source|http://localhost:8000/workspaces/dca82ff5-4150-4056-841b-57eb99c7c045/source/new-source>" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/107.0.0.0 Safari/537.36" "172.20.0.1"
    airbyte-proxy                       | 172.20.0.1 - airbyte [28/Nov/2022:03:29:23 +0000] "GET /api/v1/health HTTP/1.1" 200 18 "<http://localhost:8000/workspaces/dca82ff5-4150-4056-841b-57eb99c7c045/source/new-source|http://localhost:8000/workspaces/dca82ff5-4150-4056-841b-57eb99c7c045/source/new-source>" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/107.0.0.0 Safari/537.36"
    airbyte-temporal                    | {"level":"info","ts":"2022-11-28T03:29:25.328Z","msg":"none","service":"matching","component":"matching-engine","wf-task-queue-name":"/_sys/NOTIFY/3","wf-task-queue-type":"Workflow","lifecycle":"Starting","logging-call-at":"taskQueueManager.go:238"}
    airbyte-temporal                    | {"level":"info","ts":"2022-11-28T03:29:25.328Z","msg":"none","service":"matching","component":"matching-engine","wf-task-queue-name":"/_sys/NOTIFY/3","wf-task-queue-type":"Workflow","lifecycle":"Started","logging-call-at":"taskQueueManager.go:242"}
    s
    • 2
    • 4
  • r

    Rahul Borse

    11/28/2022, 5:50 AM
    Hi Team, once we set up Postgres as the source and S3 as the destination, while creating the connection we need to select the schemas and tables to write to the destination. The problem here is that if we select 2 tables, the destination creates a separate file for each table. Is there any possibility, while creating a connection, to write a custom query and merge these two tables into one file? If not, is there any plan to develop this functionality, or where in the source code could I make changes? Is it possible to make this kind of modification?
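    One workaround that avoids code changes, since the Postgres source can generally discover database views as streams (outside CDC): create a view that joins or unions the two tables and select only that view in the connection, so it lands as a single object in S3. A sketch with placeholder names:
    Copy code
    # Create a view in the source database, then pick it as the stream in Airbyte
    psql "$POSTGRES_URL" -c \
      "CREATE VIEW public.orders_with_customers AS
         SELECT o.*, c.name AS customer_name
         FROM public.orders o
         JOIN public.customers c ON c.id = o.customer_id;"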
    m
    • 2
    • 6
  • j

    Jaafar

    11/28/2022, 7:48 AM
    Hello. What are the help options when we need urgent help with Airbyte? I have posted this request in the discourse forum but it’s quite urgent, so I am willing to pay someone to help me fix it quickly.
    m
    • 2
    • 1
  • t

    Temidayo Azeez

    11/28/2022, 7:49 AM
    Is there anyone to help me out with my issue 👆
  • t

    Tmac Han

    11/28/2022, 8:45 AM
    Hello. I'm a developer from Databend (https://github.com/datafuselabs/databend), and I recently added a new destination (Databend Cloud, https://app.databend.com/databend/home) for Airbyte; the PR is https://github.com/airbytehq/airbyte/pull/19815. The unit tests and integration tests pass, so would someone be willing to review it? Thank you very much!
    • 1
    • 2
  • m

    Miki Haiat

    11/28/2022, 9:34 AM
    Hi team, I'm trying to sync the Facebook campaigns table to BigQuery and encountered this error
    Copy code
    Unhandled error while executing model.airbyte_utils.campaigns_stg
    Pickling client objects is explicitly not supported.
    Clients have non-trivial state that is local and unpickleable.
    Other tables are successfully synced
    • 1
    • 1
  • f

    Frank Kody

    11/28/2022, 9:40 AM
    Is this your first time deploying Airbyte: No
    OS Version / Instance: AWS EC2 Linux
    Deployment: Docker
    Airbyte Version: 0.40.18
    Step: AWS EC2 Linux
    Connection details: MySQL 1.0.12 -> Redshift 0.3.51
    Description: I've been running into an issue several times over the past few weeks (I posted about it once on this channel but got no response). In short, I am syncing a table that results in duplicates when COPY'd into Redshift. This has occurred when I have run Full Refresh syncs and when I've run Incremental refreshes, and it seems to occur randomly for different tables. Here is an example where I run an Incremental Append refresh for the first time on a table, but when it is written, the data is duplicated. This can be seen in the logs as well, where the sync summary claims that the correct number of rows was written to the table:
    Copy code
    {
      "streamName": "table_name",
      "stats": {
        "recordsEmitted": 428544,
        "bytesEmitted": 260633069,
        "recordsCommitted": 428544
      }
    }
    but if I look at the normalization step it shows the true number of rows inserted - 857086 (which is the data duplicated)
    Copy code
    2022-11-26 20:50:46 normalization > 17 of 117 OK created incremental model schema.table_name..................................................... [INSERT 0 857086 in 85.81s]
    Would be great to get some help troubleshooting what is going wrong.
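    One way to narrow down where the duplication happens is to check whether the raw staging table already holds each emitted record twice: Airbyte tags every raw record with _airbyte_ab_id, so duplicated _airbyte_ab_id values would point at the COPY/load step rather than normalization. A sketch with placeholder schema/table names (Redshift accepts psql connections):
    Copy code
    psql "$REDSHIFT_URL" -c \
      "SELECT _airbyte_ab_id, COUNT(*) AS copies
         FROM my_schema._airbyte_raw_table_name
        GROUP BY _airbyte_ab_id
       HAVING COUNT(*) > 1
        LIMIT 10;"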
    n
    • 2
    • 8
  • l

    Lihan Li

    11/28/2022, 10:28 AM
    Hi, I’m getting errors from Zendesk Support connector, please see full log below
    m
    • 2
    • 7
  • c

    Chandrashekhar S

    11/28/2022, 12:21 PM
    Hi, I'm trying to build a source HTTP connector. I added a sample API and am trying to do a local deployment. I'm getting this error when I run docker compose up: airbyte-worker and airbyte-cron are failing. How can I get rid of this error?
    • 1
    • 1