Slackbot
02/27/2023, 12:10 PM

Aaron Pham
02/27/2023, 3:48 PM
In bentofile.yaml:

docker:
  base_image: docker_image_here

One thing to note: if you are using this option, make sure Python is available in the base image.

Krisztián Szabó
02/27/2023, 4:19 PM

Krisztián Szabó
02/27/2023, 9:34 PM
@app.route("/v1/analyse", methods=["POST"])
def analyse():
    """Analyses an image."""
    ...
    return flask.jsonify(out_json)
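For reference, a fuller bentofile.yaml using the base_image option Aaron mentions might look roughly like this; the service entry point, include list, and package list are illustrative assumptions, not something stated in the thread:

```yaml
service: "service:svc"   # hypothetical entry point: `svc` object in service.py
include:
  - "service.py"
docker:
  # the team's existing image, as discussed above;
  # per Aaron's note, Python must be available inside it
  base_image: docker_image_here
python:
  packages:
    - flask   # assumed dependency; adjust to the actual project
```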
Aaron Pham
02/27/2023, 10:57 PM

Krisztián Szabó
02/28/2023, 1:39 PM
IP:9000/v1/analyse gives us a JSON with predictions (conforming to a certain schema).
It is a docker image that is essentially ready to be deployed for classification, but we want to wrap it in a bento (and possibly start from a bento with all our future algos).
The way I understand it, bentoml needs a bentofile.yaml, where I would put

docker:
  base_image: NAME_OF_IMAGE

and then a service.py, which should call the model and route to it.
Since we already have a Flask server, we would need to stop it with a new startup.sh script, then in service.py redefine the endpoint and call the model for the images.

Krisztián Szabó
02/28/2023, 1:41 PM
app = flask.Flask(__name__, static_url_path="", static_folder="static")

@app.route("/v1/analyse", methods=["POST"])
def analyse():
    """
    Analyses an image
    Expects POST parameter 'image'
    """
    global model
    # get uploaded photos
    uploaded_files = flask.request.files.getlist("image")
    out_json = {
        # API response scheme
        # .......
        "predictions": []
    }
    with tempfile.TemporaryDirectory() as tmp:
        # store images temporarily
        # read image and do classification / prediction
        out_json["predictions"].append(data)
    return flask.jsonify(out_json)

if __name__ == "__main__":
    load_model(os.getenv("MODEL_FOLDER_PATH", "../model"))
    app.run(
        host=os.getenv("HOST", "0.0.0.0"),
        port=int(os.getenv("PORT", "9000")),
        use_reloader=False,
    )
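The service.py Krisztián describes, replacing the Flask endpoint above, could be sketched with the BentoML 1.x Service API roughly as follows. This is a minimal sketch, not the team's actual code: the service name, route, and prediction placeholder are assumptions, and the model loading/prediction logic is elided exactly as in the Flask snippet.

```python
import bentoml
from bentoml.io import Image, JSON

# hypothetical service name; referenced from bentofile.yaml as "service:svc"
svc = bentoml.Service("image_analyser")

@svc.api(input=Image(), output=JSON(), route="/v1/analyse")
def analyse(img):
    """Replaces the Flask /v1/analyse endpoint; receives one uploaded image."""
    predictions = []
    # run the model on `img` here and append results, e.g.:
    # predictions.append(model.predict(img))
    return {"predictions": predictions}
```

With this in place, `bentoml serve service:svc` starts the endpoint instead of the old startup.sh, so the Flask server no longer needs to run.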