# contributing-to-airbyte
d
anyone seen errors similar to this?
```
airbyte-scheduler   | 2021-04-12 12:21:46 ERROR i.t.i.w.Poller(lambda$new$0):70 - {} - Failure in thread Activity Poller taskQueue="SYNC", namespace="default": 1
airbyte-scheduler   | io.grpc.StatusRuntimeException: INTERNAL: Not enough hosts to serve the request
airbyte-scheduler   | 	at io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:262) ~[grpc-stub-1.34.1.jar:1.34.1]
airbyte-scheduler   | 	at io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:243) ~[grpc-stub-1.34.1.jar:1.34.1]
airbyte-scheduler   | 	at io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:156) ~[grpc-stub-1.34.1.jar:1.34.1]
airbyte-scheduler   | 	at io.temporal.api.workflowservice.v1.WorkflowServiceGrpc$WorkflowServiceBlockingStub.pollActivityTaskQueue(WorkflowServiceGrpc.java:2696) ~[temporal-serviceclient-1.0.4.jar:?]
airbyte-scheduler   | 	at io.temporal.internal.worker.ActivityPollTask.poll(ActivityPollTask.java:105) ~[temporal-sdk-1.0.4.jar:?]
airbyte-scheduler   | 	at io.temporal.internal.worker.ActivityPollTask.poll(ActivityPollTask.java:39) ~[temporal-sdk-1.0.4.jar:?]
airbyte-scheduler   | 	at io.temporal.internal.worker.Poller$PollExecutionTask.run(Poller.java:265) ~[temporal-sdk-1.0.4.jar:?]
airbyte-scheduler   | 	at io.temporal.internal.worker.Poller$PollLoopTask.run(Poller.java:241) [temporal-sdk-1.0.4.jar:?]
airbyte-scheduler   | 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130) [?:?]
airbyte-scheduler   | 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:630) [?:?]
airbyte-scheduler   | 	at java.lang.Thread.run(Thread.java:832) [?:?]
```
It happens when I put load on my system, e.g. right now I'm running the acceptance tests as well as a full `./gradlew build`.
u
is your Temporal docker container running? (sometimes mine is down and I need to start it manually =p)
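You can check with something like this (just a sketch, assuming the default docker-compose deployment; the service is usually called `airbyte-temporal`, adjust if yours differs):
```sh
# List running containers whose name contains "temporal" (name assumed to be airbyte-temporal)
docker ps --filter "name=temporal"

# If it is missing or restarting, bring it back up from the Airbyte repo root
docker-compose up -d airbyte-temporal
```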
u
yes it is
u
the error message makes me think there are too many gRPC connections open
u
or maybe ports?
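If you want to rule those out, something along these lines might help (a sketch; Temporal's frontend serves gRPC on port 7233 by default, and I'm assuming the container is named `airbyte-temporal`):
```sh
# Which ports the Temporal container actually publishes (container name assumed)
docker port airbyte-temporal

# Is anything listening on Temporal's default gRPC port?
lsof -i :7233

# Rough count of open connections to that port, to see if it balloons under load
netstat -an | grep 7233 | wc -l
```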
u
can you check the Temporal container log? I didn't dive into this, but maybe the container logged the reason why it was shut down 😄
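Something like this should show whether Temporal itself fell over under load (again just a sketch, container name assumed to be `airbyte-temporal`):
```sh
# Tail the Temporal container logs and look for errors around the failure time
docker logs --tail 200 airbyte-temporal 2>&1 | grep -iE "error|fatal|hosts"

# The scheduler side of the same window, for comparison
docker logs --tail 200 airbyte-scheduler 2>&1 | grep -i "Not enough hosts"
```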
u
haven’t run into this before