# ask-for-help
a
You can use that Docker image as a base image, depending on your use case, in your
bentofile.yaml:
```yaml
docker:
  base_image: docker_image_here
```
One thing to note: if you are using this option, make sure Python is available in the image.
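For reference, a fuller bentofile.yaml built on a custom base image might look like the sketch below; the service entry point and the Flask dependency are assumptions for illustration, not part of the thread:

```yaml
service: "service.py:svc"   # assumed entry point, adjust to your service object
include:
  - "service.py"
python:
  packages:
    - flask                 # assumed dependency of the wrapped app
docker:
  base_image: docker_image_here
```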
k
I see, thank you! Will try it asap
What would the service.py need to look like then? I already have inference in the docker image, and even an endpoint using Flask.
```python
@app.route("/v1/analyse", methods=["POST"])
def analyse():
    """Analyses an image"""
    ...
    return flask.jsonify(out_json)
```
a
Can you give a rough ballpark of what that Flask service looks like?
k
Yes, sure. I think explaining what we are trying to achieve might help: we have some ready-made Docker images that respond in accordance with a company-defined API scheme (in JSON). We wanted to wrap them in bentos. Querying
IP:9000/v1/analyse
gives us a JSON with predictions (conforming to a certain schema). The Docker image is essentially ready to be deployed for classification, but we want to wrap it in a bento (and possibly start from a bento for all our future algos). The way I understand it, BentoML needs a
bentofile.yaml
, where I would put
```yaml
docker:
  base_image: NAME_OF_IMAGE
```
and then a
service.py
, which should call the model and route to it. Since we already have a Flask server, we would need to stop it with a new
startup.sh
script, then in
service.py
redefine the endpoint and call the model for the images.
```python
import os
import tempfile

import flask

app = flask.Flask(__name__, static_url_path="", static_folder="static")

@app.route("/v1/analyse", methods=["POST"])
def analyse():
    """
    Analyses an image

    Expects POST parameter 'image'
    """
    global model
    # get uploaded photos
    uploaded_files = flask.request.files.getlist("image")
    out_json = {
        # API response scheme
        # .......
        "predictions": []
    }
    with tempfile.TemporaryDirectory() as tmp:
        # store images temporarily, then read each image back
        # and do classification / prediction
        out_json["predictions"].append(data)
    return flask.jsonify(out_json)

if __name__ == "__main__":
    load_model(os.getenv("MODEL_FOLDER_PATH", "../model"))
    app.run(
        host=os.getenv("HOST", "0.0.0.0"),
        port=int(os.getenv("PORT", "9000")),
        use_reloader=False,
    )
```