Slackbot
04/11/2023, 5:07 PM

Jim Rohrer
04/11/2023, 5:36 PM
PYTHONPATH=. python project/wf/services/save_model.py

Seungchan Lee
04/11/2023, 5:50 PM
{
  "asctime": "2023-04-11 17:47:53,414",
  "name": "flytekit",
  "levelname": "WARNING",
  "message": "FlyteSchema is deprecated, use Structured Dataset instead."
}
This is just a warning though - not sure why it's throwing an error?

Seungchan Lee
04/11/2023, 5:54 PM

Jim Rohrer
04/11/2023, 5:55 PM

Jim Rohrer
04/11/2023, 5:55 PM

Seungchan Lee
04/11/2023, 5:58 PM
import torch
import os
import typing
from flytekit import workflow
from project.wf.main import Hyperparameters
from project.wf.main import run_wf

_wf_outputs = typing.NamedTuple("WfOutputs", run_wf_0=torch.nn.modules.module.Module)

@workflow
def wf_40(_wf_args: Hyperparameters) -> _wf_outputs:
    run_wf_o0_ = run_wf(hp=_wf_args)
    return _wf_outputs(run_wf_o0_)
But that's just a warning - not sure why it would throw an error? Strange - I've never run into this issue before

Seungchan Lee
04/11/2023, 5:58 PM
save_model.py is just taking the already trained model that's saved locally:
import torch
import bentoml

with open("/userRepoData/taeefnajib/PyTorch-MNIST/sidetrek/models/7b12772142b14606286a26751d27b878.pt", "rb") as f:
    model = torch.load(f)

saved_model = bentoml.pytorch.save_model("example_model", model)

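A plausible explanation for the error discussed below, sketched rather than asserted: if the .pt file was written with torch.save(model, ...), it is a pickle of the whole nn.Module, and unpickling re-imports the class that defined it from the project package. The repo-root path in this sketch is a placeholder, not the project's real layout.

import sys

import bentoml
import torch

# torch.save(model, ...) pickles the entire nn.Module, including a reference to
# its class (something like project.wf.main.<SomeNet>). torch.load re-imports
# that class, so the "project" package must be importable here - otherwise it
# fails with ModuleNotFoundError: No module named 'project'.
sys.path.insert(0, "/path/to/repo/root")  # placeholder: wherever project/ lives

with open("/path/to/models/7b12772142b14606286a26751d27b878.pt", "rb") as f:
    model = torch.load(f)  # unpickling is the step that needs the project module

bento_model = bentoml.pytorch.save_model("example_model", model)
print(bento_model.tag)  # e.g. example_model:7rsqyxgysc6hk3uw
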
Seungchan Lee
04/11/2023, 5:59 PM
/bentoml? Why does bentoml care what other files exist in this case?

Seungchan Lee
04/11/2023, 6:01 PM

Jim Rohrer
04/11/2023, 6:31 PM

Seungchan Lee
04/11/2023, 6:32 PM

Seungchan Lee
04/11/2023, 6:33 PM

Seungchan Lee
04/11/2023, 6:33 PM

Jim Rohrer
04/11/2023, 6:34 PM

Jim Rohrer
04/11/2023, 6:34 PM

Seungchan Lee
04/11/2023, 6:35 PM

Seungchan Lee
04/11/2023, 6:35 PM

Jim Rohrer
04/11/2023, 6:36 PM

Seungchan Lee
04/11/2023, 6:36 PM

Seungchan Lee
04/11/2023, 6:37 PM
.pt file. Why would bentoml try to read the original project files?

Jim Rohrer
04/11/2023, 6:39 PM
~/bentoml/models - you should see a folder for your model there

Jim Rohrer
04/11/2023, 6:40 PM

Jim Rohrer
04/11/2023, 6:40 PM

Seungchan Lee
04/11/2023, 6:40 PM
name: example_model
version: 7rsqyxgysc6hk3uw
module: bentoml.pytorch
labels: {}
options:
  partial_kwargs: {}
metadata: {}
context:
  framework_name: torch
  framework_versions:
    torch: 2.0.0
  bentoml_version: 1.0.15
  python_version: 3.10.10
signatures:
  __call__:
    batchable: false
api_version: v1
creation_time: '2023-04-11T17:47:53.973669+00:00'

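For reference, the same metadata can be read back from the local model store programmatically; a minimal sketch, assuming BentoML 1.0.x and the model name used above:

import bentoml

# Resolve "latest" to the newest stored version under ~/bentoml/models
# (here that would be example_model:7rsqyxgysc6hk3uw) without loading weights.
bento_model = bentoml.models.get("example_model:latest")
print(bento_model.tag)   # example_model:7rsqyxgysc6hk3uw
print(bento_model.path)  # folder containing model.yaml and the serialized weights
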
Seungchan Lee
04/11/2023, 6:41 PM

Seungchan Lee
04/11/2023, 6:41 PM

Seungchan Lee
04/11/2023, 6:42 PM

Seungchan Lee
04/11/2023, 6:42 PM

Jim Rohrer
04/11/2023, 6:42 PM

Seungchan Lee
04/11/2023, 6:43 PM
module "project" not found error happened during save_model even though it's only using the saved local pt file?

Seungchan Lee
04/11/2023, 6:44 PM

Seungchan Lee
04/11/2023, 6:44 PM
pt file, shouldn't bentoml save_model still work?

Jim Rohrer
04/11/2023, 6:48 PM
project module

Jim Rohrer
04/11/2023, 6:49 PM
__init__.py file in your services directory? not 100% sure on that one

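As a related aside, a sketch of two loading styles and what each one requires, assuming PyTorch 2.x; Net is a hypothetical class name and the file names are placeholders. A state_dict load makes the dependency on the project class explicit, while a TorchScript export produces a file that torch.jit.load can open with no project code on sys.path at all.

import torch

from project.wf.main import Net  # hypothetical name for the trained network class

# Option 1: state_dict - the dependency on the class is explicit.
# (Saved earlier with: torch.save(model.state_dict(), "weights.pt"))
model = Net()
model.load_state_dict(torch.load("weights.pt"))
model.eval()

# Option 2: TorchScript - the exported file carries its own graph, so it loads
# without the "project" package being importable.
# (Exported earlier with: torch.jit.script(model).save("model_scripted.pt"))
scripted = torch.jit.load("model_scripted.pt")
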
Seungchan Lee
04/11/2023, 6:49 PM

Jim Rohrer
04/11/2023, 6:50 PM

Seungchan Lee
04/11/2023, 6:50 PM

Seungchan Lee
04/11/2023, 6:50 PM

Jim Rohrer
04/11/2023, 6:52 PM

Seungchan Lee
04/11/2023, 7:07 PM

Seungchan Lee
04/11/2023, 7:07 PM

Jiang
04/12/2023, 9:08 AM
Is bentoml models import what you ask?

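For context, the CLI pair is bentoml models export / bentoml models import; the Python equivalents are sketched below, assuming BentoML 1.0.x (the archive path is a placeholder):

import bentoml

# On the machine where the model was saved: write it out as a portable archive.
bentoml.models.export_model("example_model:7rsqyxgysc6hk3uw", "/tmp/example_model.bentomodel")

# On another machine or in a later CI step: bring it into the local model store,
# the same as `bentoml models import` on the CLI.
imported = bentoml.models.import_model("/tmp/example_model.bentomodel")
print(imported.tag)
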
Seungchan Lee
04/12/2023, 2:06 PM
save_model as part of a CI/CD automation, but in order to get the resulting bento model version generated by save_model, I have to print() to stdout and parse that, which is not great. Also, a simple warning prints to stderr, which makes things more brittle as it'll be treated as an error and fail to proceed to the next step.

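One way to avoid parsing stdout, sketched under the assumption that save_model.py looks like the snippet earlier in the thread (the model_tag.json filename is arbitrary): bentoml.pytorch.save_model returns a model object whose tag carries the generated version, so the script can write it to a file for the next pipeline step instead of printing it.

import json

import bentoml
import torch

with open("/path/to/models/7b12772142b14606286a26751d27b878.pt", "rb") as f:
    model = torch.load(f)

bento_model = bentoml.pytorch.save_model("example_model", model)

# tag.name is "example_model"; tag.version is the generated id (e.g. 7rsqyxgysc6hk3uw).
# Writing it to a file keeps the next CI step independent of whatever warnings
# land on stdout/stderr.
with open("model_tag.json", "w") as out:
    json.dump({"name": bento_model.tag.name, "version": bento_model.tag.version}, out)
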
Seungchan Lee
04/12/2023, 2:12 PM

Jim Rohrer
04/12/2023, 2:15 PM

Jim Rohrer
04/12/2023, 2:17 PM

Aaron Pham
04/12/2023, 9:31 PM

Jiang
04/13/2023, 1:37 AM
save_model is actually part of the SDK, not the command line. The difference here is that save_model saves a model object living in Python memory to a file, and therefore it must be executed as code at the end of the training pipeline. If we want to support a REST API as you mentioned, we would first need a common convention for sending in-memory Python objects over a REST API. However, as we all know, no such convention exists. If you have a private protocol, you can implement this REST server yourself, but it cannot be promoted to the community for everyone to use.
I'm not sure if I fully understand your question. Can you provide a more specific scenario to help me understand better?

Seungchan Lee
04/13/2023, 1:41 AM

Seungchan Lee
04/13/2023, 1:42 AM
save_model part. As for the motivation for wanting a REST API for CLI commands, it's for easier automation.

Seungchan Lee
04/13/2023, 1:43 AM

Seungchan Lee
04/13/2023, 1:44 AM

Jiang
04/13/2023, 1:44 AM