# announcements
c
just curious, are you mostly using BentoML's save-to-S3 feature?
```python
my_prediction_service.save("s3://...")
```
how do you load and serve the model saved in S3?
t
atm I'm not using Bento (only heard about it yesterday), but I have a Python app that downloads a pkl from S3, then I use `load_learner` from fastai to load the model
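For reference, a minimal sketch of that download-then-load pattern, assuming boto3 for the S3 download and fastai v2's `load_learner`; the bucket name, object key, and the `some_input` placeholder are hypothetical:

```python
from pathlib import Path

import boto3
from fastai.learner import load_learner  # fastai v2 import path

# Hypothetical bucket/key/local path, for illustration only
bucket, key = "my-models-bucket", "models/export.pkl"
local_path = Path("/tmp/export.pkl")

# Download the pickled Learner that was exported with learn.export()
s3 = boto3.client("s3")
s3.download_file(bucket, key, str(local_path))

# Rebuild the Learner for inference
learn = load_learner(local_path)
# pred, _, probs = learn.predict(some_input)  # some_input: whatever the model expects
```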
c
Got it - BentoML can actually make this type of workflow a lot easier to build. It can directly save to and load from S3
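A rough sketch of what that round trip could look like, based on the `save()` snippet quoted above; the `bentoml.load()` call, the bucket path, and the `predict` method name are assumptions rather than confirmed API details, so check the docs for your BentoML version:

```python
import bentoml

# `my_prediction_service` is the BentoService instance from the snippet above.
# Assumption: save() accepts an S3 URI and returns the saved bundle's location.
saved_path = my_prediction_service.save("s3://my-bucket/bento-bundles")

# Assumption: bentoml.load() can read a saved bundle back from that location.
svc = bentoml.load(saved_path)

# Call the service's inference API in-process; the method name depends on how
# the API was declared on the service (hypothetical `predict` here).
# result = svc.predict(input_data)
```

From there, serving would typically mean pointing the `bentoml serve` CLI at the saved bundle rather than writing a separate Python app to fetch the pkl yourself.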
t
great, thanks