# ask-for-help
s
This message was deleted.
b
Do you have existing code?
s
No sir, I'm currently evaluating software and I think BentoML will be a good fit. Example:
from sklearn.preprocessing import OrdinalEncoder

X_train = [["red"], ["green"], ["blue"]]  # toy categorical training data
enc = OrdinalEncoder()
enc.fit(X_train)
enc.transform(X_train)
I can pickle.dump this object, but I was wondering what the best approach would be to load it in a BentoML service, such as in a runner, etc.
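For reference, the persistence step I have in mind is just a plain pickle round trip, roughly like this sketch (the file name is only an example):

import pickle

# After fitting: persist the fitted encoder to disk
with open("ordinal_encoder.pkl", "wb") as f:
    pickle.dump(enc, f)

# Later (e.g. inside the service): load it back
with open("ordinal_encoder.pkl", "rb") as f:
    enc = pickle.load(f)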
b
ya that could work too
shouldn't be hard to write a runner and test it out on your own
s
Will I just load it globally and call it inside the service's predict method?
b
ya that could work. You can initialize the encoder within __init__:
def __init__(self):
    self.enc = OrdinalEncoder()
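Roughly, a full custom runner that loads the pickled encoder you mentioned and exposes it through a service could look like this (just a sketch, assuming the BentoML 1.x Runnable API; the class name, pickle path, service name, and JSON payload shape are all placeholders):

import pickle

import bentoml
from bentoml.io import JSON

class EncoderRunnable(bentoml.Runnable):
    SUPPORTED_RESOURCES = ("cpu",)
    SUPPORTS_CPU_MULTI_THREADING = True

    def __init__(self):
        # Load the encoder that was fitted and pickled at training time
        with open("ordinal_encoder.pkl", "rb") as f:
            self.enc = pickle.load(f)

    @bentoml.Runnable.method(batchable=False)
    def transform(self, rows):
        return self.enc.transform(rows)

encoder_runner = bentoml.Runner(EncoderRunnable, name="ordinal_encoder")
svc = bentoml.Service("encoder_demo", runners=[encoder_runner])

@svc.api(input=JSON(), output=JSON())
def predict(payload: dict) -> dict:
    # e.g. payload = {"rows": [["green"], ["red"]]}
    encoded = encoder_runner.transform.run(payload["rows"])
    return {"encoded": encoded.tolist()}

Loading in __init__ means the pickle is read once per runner worker instead of on every request.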
s
Sweet thank you
Something like that
Not exactly scikit-learn, but the same idea
s
Thank you, I appreciate the swift response.
😸
y
Extra 2 cents: for inference, you might not want to initialize a new OrdinalEncoder object each time and fit + transform, but rather use the one you created during training so that you don't run into new values. So it might be suitable to use custom_objects when you are saving the model to save the encoder at training time, then load it when you are creating BentoML services.
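For example, something along these lines (a sketch assuming the BentoML 1.x sklearn API; the model name, toy data, and custom_objects key are just placeholders):

import bentoml
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import OrdinalEncoder

# Toy training data: one categorical feature, binary target
X_raw = [["red"], ["green"], ["blue"], ["green"]]
y = [0, 1, 1, 0]

# Fit the encoder once, at training time
enc = OrdinalEncoder()
X = enc.fit_transform(X_raw)
model = LogisticRegression().fit(X, y)

# Save the model and the fitted encoder together
bentoml.sklearn.save_model(
    "my_classifier",
    model,
    custom_objects={"ordinal_encoder": enc},
)

# Later, in the service code: load both back
bento_model = bentoml.sklearn.get("my_classifier:latest")
encoder = bento_model.custom_objects["ordinal_encoder"]
model_runner = bento_model.to_runner()

That way the service uses the exact category-to-integer mapping learned at training time.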