Slackbot
01/30/2023, 3:28 PM
Eric Riddoch
01/31/2023, 7:15 AM
bentoml.<framework>.save_model() saves the model artifact at $HOME/bentoml/models. But what if you're training in a cloud environment? Would you need to manually zip up that model artifact and save it somewhere? Or is there a cleaner integration with MLflow/Bento registry?
Sarthak Verma
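(Editor's note: the store location Eric describes is resolvable without BentoML at all. BentoML keeps saved models under $BENTOML_HOME/models, and BENTOML_HOME falls back to ~/bentoml when the environment variable is unset. A stdlib-only sketch; the helper name is illustrative:)

```python
import os
from pathlib import Path


def bentoml_model_store() -> Path:
    """Resolve the local BentoML model store directory.

    Saved models live under $BENTOML_HOME/models; BENTOML_HOME
    defaults to ~/bentoml when the env var is not set.
    """
    home = Path(os.environ.get("BENTOML_HOME", str(Path.home() / "bentoml")))
    return home / "models"


print(bentoml_model_store())
```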
01/31/2023, 7:16 AM
Eric Riddoch
01/31/2023, 7:26 AM
bento...save_model and mlflow.log_model are shown. https://github.com/bentoml/BentoML/blob/main/examples/mlflow/pytorch/mnist.py#L212-L250
Sarthak Verma
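(Editor's note: the linked mnist.py registers the trained model with both registries from the same training script. A minimal sketch of that dual-registration pattern using scikit-learn instead of PyTorch; the model and tag names are illustrative, and `bentoml.mlflow.import_model` is BentoML's bridge for models already logged to MLflow:)

```python
import bentoml
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

X, y = load_iris(return_X_y=True)
model = RandomForestClassifier(n_estimators=10).fit(X, y)

# 1) Save into the local BentoML model store ($BENTOML_HOME/models).
bento_model = bentoml.sklearn.save_model("iris_clf", model)
print(bento_model.tag)

# 2) Log the same artifact to MLflow, which ships it to the tracking
#    server's artifact store -- i.e. off the training machine.
with mlflow.start_run() as run:
    mlflow.sklearn.log_model(model, artifact_path="model")

# A model already logged to MLflow can later be pulled into the local
# BentoML store on the serving side, without re-running training:
# bentoml.mlflow.import_model("iris_clf", f"runs:/{run.info.run_id}/model")
```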
01/31/2023, 7:30 AM
Sarthak Verma
01/31/2023, 7:30 AM
Eric Riddoch
01/31/2023, 7:37 AM
Sarthak Verma
01/31/2023, 8:28 AM
Eric Riddoch
01/31/2023, 8:39 AM
model = joblib.load(...)
bentoml.sklearn.save_model("my_model", model)
or
mlflow.sklearn.log_model(model, "model")
(for example)
Jiang
01/31/2023, 9:08 AM
bentoml models export subcommand.
Jiang
01/31/2023, 9:09 AM
➜ bentoml models list
Tag                        Module           Size      Creation Time
iris_clf:ihiyxpvbbw2iqusu  bentoml.sklearn  5.99 KiB  2023-01-31 10:16:22
➜ bentoml models export iris_clf:ihiyxpvbbw2iqusu
Model(tag="iris_clf:ihiyxpvbbw2iqusu") exported to /home/agent/BentoML2/iris_clf-ihiyxpvbbw2iqusu.bentomodel.
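(Editor's note: export/import is the portable-file route for getting a model out of an ephemeral cloud training job: export to a single .bentomodel file, stage it in shared storage, and import into the serving machine's store. A sketch; the bucket name and local paths are purely illustrative:)

```shell
# On the training machine: export the model to one portable file...
bentoml models export iris_clf:ihiyxpvbbw2iqusu ./iris_clf.bentomodel
# ...and push it to shared storage (S3 here, as an example).
aws s3 cp ./iris_clf.bentomodel s3://my-model-bucket/iris_clf.bentomodel

# On the serving machine: pull the file and import it into the
# local BentoML model store.
aws s3 cp s3://my-model-bucket/iris_clf.bentomodel .
bentoml models import ./iris_clf.bentomodel
```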
Sarthak Verma
01/31/2023, 10:19 AM
Sarthak Verma
01/31/2023, 10:19 AM
Jiang
01/31/2023, 10:19 AM
Sarthak Verma
01/31/2023, 10:21 AM
Jiang
01/31/2023, 10:21 AM
Sarthak Verma
01/31/2023, 10:22 AM
Jiang
01/31/2023, 10:38 AM
Sarthak Verma
01/31/2023, 10:38 AM
Jiang
01/31/2023, 10:38 AM
Sarthak Verma
01/31/2023, 10:39 AM
Jiang
01/31/2023, 10:39 AM
Jiang
01/31/2023, 10:40 AM
bentoml containerize
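(Editor's note: `bentoml containerize` operates on a built bento rather than a bare model, so the usual sequence is build, containerize, run. A sketch assuming a bentofile.yaml in the project; the service name is illustrative:)

```shell
# Build a bento from the project's bentofile.yaml.
bentoml build

# Package the built bento into an OCI image
# (requires a local Docker daemon).
bentoml containerize iris_classifier:latest

# Run the image; BentoML services listen on port 3000 by default.
docker run --rm -p 3000:3000 iris_classifier:latest
```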
Jiang
01/31/2023, 10:40 AM
Sarthak Verma
01/31/2023, 10:41 AM
Jiang
01/31/2023, 10:42 AM
Sarthak Verma
01/31/2023, 10:43 AM
Jiang
01/31/2023, 10:43 AM