# ask-for-help
b
Hello @Maximiliano Biga, can you share the output? Actually, can you rerun the serving command with the --debug flag?
bentoml serve service.py:svc --debug
m
2022-10-17T19:42:04-0300 [DEBUG] [cli] Importing service "service.py:svc" from working dir: "C:\Users\ABIGA\OneDrive - Pampa Energia\Documents\DS - ML Zoomcamp\7"
2022-10-17T19:42:04-0300 [INFO] [cli] NumExpr defaulting to 8 threads.
2022-10-17T19:42:05-0300 [DEBUG] [cli] Default runner method set to predict, it can be accessed both via runner.run and runner.predict.async_run
2022-10-17T19:42:05-0300 [DEBUG] [cli] 'credit_risk_classifier' imported from source: bentoml.Service(name="credit_risk_classifier", import_str="service:svc", working_dir="C:\Users\ABIGA\OneDrive - Pampa Energia\Documents\DS - ML Zoomcamp\7")
2022-10-17T19:42:05-0300 [INFO] [cli] Prometheus metrics for HTTP BentoServer from "service.py:svc" can be accessed at http://localhost:3000/metrics.
2022-10-17T19:42:06-0300 [INFO] [cli] Registering signals...
2022-10-17T19:42:07-0300 [INFO] [cli] Starting master on pid 16688
2022-10-17T19:42:07-0300 [DEBUG] [cli] Socket bound at 0.0.0.0:3000 - fd: 3000
2022-10-17T19:42:07-0300 [INFO] [cli] sockets started
2022-10-17T19:42:07-0300 [DEBUG] [cli] Initializing watchers
2022-10-17T19:42:07-0300 [DEBUG] [cli] cmd: C:\Users\ABIGA\Anaconda3\envs\ml-zoomcamp\python.exe
2022-10-17T19:42:07-0300 [DEBUG] [cli] args: ['-m', 'bentoml_cli.worker.http_dev_api_server', 'service.py:svc', '--fd', '$(circus.sockets._bento_api_server)', '--working-dir', 'C:\\Users\\ABIGA\\OneDrive - Pampa Energia\\Documents\\DS - ML Zoomcamp\\7', '--prometheus-dir', 'C:\\Users\\ABIGA\\bentoml\\prometheus_multiproc_dir']
2022-10-17T19:42:07-0300 [DEBUG] [cli] process args: ['C:\\Users\\ABIGA\\Anaconda3\\envs\\ml-zoomcamp\\python.exe', '-m', 'bentoml_cli.worker.http_dev_api_server', 'service.py:svc', '--fd', '3000', '--working-dir', 'C:\\Users\\ABIGA\\OneDrive - Pampa Energia\\Documents\\DS - ML Zoomcamp\\7', '--prometheus-dir', 'C:\\Users\\ABIGA\\bentoml\\prometheus_multiproc_dir']
2022-10-17T19:42:07-0300 [DEBUG] [cli] running dev_api_server process [pid 12896]
2022-10-17T19:42:07-0300 [INFO] [cli] Arbiter now waiting for commands
2022-10-17T19:42:07-0300 [INFO] [cli] dev_api_server started
2022-10-17T19:42:07-0300 [INFO] [cli] Starting development HTTP BentoServer from "service.py:svc" running on http://0.0.0.0:3000 (Press CTRL+C to quit)
2022-10-17T19:42:08-0300 [DEBUG] [dev_api_server] Importing service "service.py:svc" from working dir: "C:\Users\ABIGA\OneDrive - Pampa Energia\Documents\DS - ML Zoomcamp\7"
2022-10-17T19:42:09-0300 [INFO] [dev_api_server] NumExpr defaulting to 8 threads.
2022-10-17T19:42:10-0300 [DEBUG] [dev_api_server] Default runner method set to predict, it can be accessed both via runner.run and runner.predict.async_run
2022-10-17T19:42:10-0300 [DEBUG] [dev_api_server] 'credit_risk_classifier' imported from source: bentoml.Service(name="credit_risk_classifier", import_str="service:svc", working_dir="C:\Users\ABIGA\OneDrive - Pampa Energia\Documents\DS - ML Zoomcamp\7")
it never gets past that last debug line, it just hangs there
s
Would you be willing to share your service code with us so we can try to reproduce?
m
sure, it's a simple one
s
Thanks!
m
import bentoml
from bentoml.io import JSON

model_ref = bentoml.xgboost.get("credit_risk_model:hysjvi2olo7dkue2")
model_runner = model_ref.to_runner()

svc = bentoml.Service("credit_risk_classifier", runners=[model_runner])

@svc.api(input=JSON(), output=JSON())
def classify(application_data):
    prediction = model_runner.predict.run(application_data)
    return {"status": "Approved"}
hello! I just saved my model as a new one, changed the model tag in the service.py file, and it works! Something may have gone wrong with the save_model function the first time.
thank you all
s
Ah, ok! We'll see if we can create a better error message for your issue; something's clearly wrong with the way we're handling errors at the moment.
m
surely it also has to do with the fact that I am learning these things, hehe