# ask-for-help
You could absolutely use the same docker image and put your training code in your service file, but I guess it depends on how you'd want to get the resulting model out.
Oh! Great point! So you'd have a "service" that creates new models 🤣. That probably wasn't the intended use of service files, but hey, if it works. Running with it:
• On my laptop: if you mounted `-v ~/bentoml/models:/home/bento/models`, could you simply run a `bentoml.pytorch.save_model(model)` call to save the resulting model on the host machine?
• In the cloud: presumably you'd want a remote model registry, maybe MLflow + S3. Then use a `bentofile.yaml`-created Docker image, then upload the model after training.
Do both of those look like what you'd imagine?
Sure, that would absolutely work! I think it might be a little clunky, because obviously it's not what we designed the API for, but it should be just fine! In theory you should also be able to export and upload models from your service code using `bentoml.models.export`.