# ask-for-help
c
Could you share your code that does the mounting?
o
```python
app = FastAPI()
svc.mount_asgi_app(app)

@app.get("/metadata")
def metadata():
    return {"name": churn_model.tag.name, "version": churn_model.tag.version}

@app.post("/predict/batch/api/v1")
async def predict_batch(file: UploadFile = File(...)):
    print(file.filename)
    # Handle the file only if it is a CSV
    if file.filename.endswith(".csv"):
        # Create a temporary file with the same name as the uploaded
        # CSV file to load the data into a pandas DataFrame
        with open(file.filename, "wb") as f:
            f.write(file.file.read())
        data = pd.read_csv(file.filename)
        print(data.shape)
        os.remove(file.filename)
        pred = await model_runner.make_prediction.async_run(data)
        # Return a JSON object containing the model predictions
        return {"Labels": pred}
        # return {"Labels": model.predict(data)}
    else:
        # Raise an HTTP 400 exception, indicating Bad Request
        # (you can learn more about HTTP response status codes here)
        raise HTTPException(status_code=400, detail="Invalid file format. Only CSV files accepted.")
```
I tried creating an API like this with BentoML, but the output I got was this:
j
Did you assign runners to the `svc` by `svc = Service(runners=[...])`?
o
Yes, I did.
c
@Oluwaseyi Gbadamosi did you start the server with `bentoml serve`? Or did you start the `app` with something like uvicorn directly?
o
I started it with uvicorn.
j
The API here is for mounting an ASGI app onto the BentoML Service, not the other way around. Thus we should start the BentoML Service rather than the `app` here.
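For illustration, a minimal sketch of that direction; the model tag `demo_model:latest`, the service name, and the `/ping` route are assumptions, not from this thread:
```python
import bentoml
from fastapi import FastAPI

# Load a saved model and build a runner for it (names are illustrative)
model = bentoml.picklable_model.get("demo_model:latest")
runner = model.to_runner()

# The BentoML Service is the top-level ASGI application
svc = bentoml.Service("demo_service", runners=[runner])

app = FastAPI()

@app.get("/ping")
def ping():
    return {"ok": True}

# Mount the FastAPI app INTO the Service; starting the Service with
# `bentoml serve my_service:svc` then serves both sets of routes.
svc.mount_asgi_app(app)
```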
o
Please can I see an example of what you mean, so I can know where I went wrong in my code? Or do we have another command to run the server?
j
Sorry for not describing the situation clearly.
The API here is for mounting an app onto a BentoML Service, but it seems that you want to mount a BentoML Service onto an existing FastAPI app.
Which is your original need? Or maybe you don't care and just want these things to work?
I can provide some examples once I get the answer.
o
I want things to work.
```python
import os

import bentoml
import pandas as pd
from fastapi import FastAPI, File, HTTPException, UploadFile

demo_model = bentoml.picklable_model.get(MODEL_NAME)

# Runner & Service
model_runner = bentoml.Runner(PyCaretRunnable, models=[demo_model])
svc = bentoml.Service(SERVICE_NAME, runners=[model_runner])

app = FastAPI()
svc.mount_asgi_app(app)

@app.get("/metadata")
def metadata():
    return {"name": demo_model.tag.name, "version": demo_model.tag.version}

@app.post("/predict/batch/api/v1")
async def predict_batch(file: UploadFile = File(...)):
    print(file.filename)
    # Handle the file only if it is a CSV
    if file.filename.endswith(".csv"):
        # Create a temporary file with the same name as the uploaded
        # CSV file to load the data into a pandas DataFrame
        with open(file.filename, "wb") as f:
            f.write(file.file.read())
        data = pd.read_csv(file.filename)
        print(data.shape)
        os.remove(file.filename)
        pred = await model_runner.make_prediction.async_run(data)
        # Return a JSON object containing the model predictions
        return {"Labels": pred}
        # return {"Labels": model.predict(data)}
    else:
        # Raise an HTTP 400 exception, indicating Bad Request
        # (you can learn more about HTTP response status codes here)
        raise HTTPException(status_code=400, detail="Invalid file format. Only CSV files accepted.")
```
j
Then we can just start the Bento Service as usual, following this: https://docs.bentoml.org/en/latest/tutorial.html#building-a-bento
TL;DR: if your source file is `my_service.py`, we could just run `bentoml serve my_service`. The FastAPI app is already included.
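For reference, once the server is up, the mounted upload route could be exercised with a small client like this; the CSV filename and BentoML's default port 3000 are assumptions:
```python
# Sketch of a test client for the mounted route; assumes the server
# was started with `bentoml serve my_service` and that data.csv exists.
import requests

with open("data.csv", "rb") as f:
    resp = requests.post(
        "http://127.0.0.1:3000/predict/batch/api/v1",
        files={"file": ("data.csv", f, "text/csv")},
    )
print(resp.status_code, resp.json())
```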
o
That is my code snippet
j
@Oluwaseyi Gbadamosi Just updated the reply
o
Okay, let me try it out.