# ask-for-help
I have a question about deploying BentoML as a service. I have built a Docker image for MLflow, and I can send my models to it from the CLI or Python by specifying http://<ip>:5000. I want something similar for BentoML: a separate Docker container. Is that possible?
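For context, this is roughly the workflow I'm imagining on the BentoML side (a sketch; `my_service:latest` is a placeholder bento tag, and port 3000 is BentoML's default serving port):

```shell
# Build the bento from a bentofile.yaml, then package it as a Docker image
bentoml build
bentoml containerize my_service:latest

# Run the container as a standalone service, like the MLflow server on :5000
docker run -p 3000:3000 my_service:latest
```

So the question is whether clients can then talk to that container over HTTP the same way they talk to the MLflow server.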