Slackbot
03/06/2023, 5:50 AM

larme (shenyang)
03/06/2023, 7:17 AM
service.py example below:
import bentoml
from bentoml.io import JSON


class MyRunnable(bentoml.Runnable):
    SUPPORTED_RESOURCES = ("cpu",)
    SUPPORTS_CPU_MULTI_THREADING = True

    def __init__(self):
        # Warm up once per runner worker, when the runnable is instantiated.
        self.warmup()

    def warmup(self):
        print("warming up...")
        return 1

    @bentoml.Runnable.method(batchable=False)
    def echo(self, req: dict):
        return req


runner = bentoml.Runner(MyRunnable)
svc = bentoml.Service("my_demo", runners=[runner])


@svc.api(input=JSON(), output=JSON())
async def echo(dic):
    ret = await runner.echo.async_run(dic)
    return ret
This service.py runs the warmup code without any problem. I wonder if Predictor has something to do with the error. Could you share more details about this Predictor?
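For reference, a service like the one above is normally started with "bentoml serve service:svc", and each @svc.api function is exposed over HTTP at /<function name> (port 3000 by default). A minimal sketch of calling the echo endpoint from Python, assuming the file is named service.py and using an illustrative payload:

# In another terminal: bentoml serve service:svc --reload
import requests

resp = requests.post(
    "http://127.0.0.1:3000/echo",  # route name comes from the API function name
    json={"hello": "world"},       # illustrative payload; any JSON works here
)
print(resp.json())                 # the service should echo the same JSON back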
μ©μμ€
03/06/2023, 8:54 AM
import asyncio


class Predictor:
    def predict(self):
        ...
        loop = asyncio.new_event_loop()  # or asyncio.get_event_loop()
        asyncio.set_event_loop(loop)
        tasks = [
            self.async_func1(),
            self.async_func2(),
        ]
        (
            res1,
            res2,
        ) = loop.run_until_complete(
            asyncio.wait_for(asyncio.gather(*tasks), timeout=0.8)
        )
        ...
        return res
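One thing worth noting about this predict: if it ends up being called from a thread that already has a running asyncio event loop, run_until_complete raises a RuntimeError ("This event loop is already running" or "Cannot run the event loop while another loop is running"), which may be related to the error being discussed. A minimal sketch of a variant that avoids the nested loop by making predict a coroutine; the async_func1/async_func2 bodies below are stand-ins, since they are not shown above:

import asyncio


class AsyncPredictor:
    """Hypothetical rework of the Predictor above: predict is itself a coroutine."""

    async def async_func1(self):
        return "res1"  # stand-in for the real async work

    async def async_func2(self):
        return "res2"

    async def predict(self):
        tasks = [self.async_func1(), self.async_func2()]
        # Await the gathered tasks directly instead of creating a second
        # event loop with new_event_loop()/run_until_complete().
        res1, res2 = await asyncio.wait_for(asyncio.gather(*tasks), timeout=0.8)
        return res1, res2


# With no loop running (plain sync code), asyncio.run() drives the coroutine;
# from an async BentoML API function it can simply be awaited.
print(asyncio.run(AsyncPredictor().predict()))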
larme (shenyang)
03/06/2023, 9:22 AM

μ©μμ€
03/06/2023, 9:24 AM

larme (shenyang)
03/06/2023, 9:32 AM
If predict is doing Redis queries, then maybe it makes sense to do it asynchronously. I need more time to investigate this problem.

μ©μμ€
03/06/2023, 9:34 AM
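To illustrate the suggestion about doing the Redis queries asynchronously, a minimal sketch using redis.asyncio from redis-py 4.2+; the RedisPredictor class, the URL, and the key arguments are illustrative, not code from this thread:

import asyncio

import redis.asyncio as aioredis  # redis-py >= 4.2 ships an asyncio client


class RedisPredictor:
    """Hypothetical example: the two lookups become concurrent awaited Redis calls."""

    def __init__(self, url: str = "redis://localhost:6379"):
        self.client = aioredis.from_url(url)

    async def predict(self, key1: str, key2: str):
        # Both GETs run concurrently on the already-running event loop,
        # so no new_event_loop()/run_until_complete() is needed.
        res1, res2 = await asyncio.wait_for(
            asyncio.gather(self.client.get(key1), self.client.get(key2)),
            timeout=0.8,
        )
        return res1, res2

An async @svc.api function can then await RedisPredictor().predict(...) directly; whether the lookups should instead live inside the runner is the part that still needs investigation, as noted above.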