Slackbot
03/03/2023, 3:51 AM

bentogpt
03/03/2023, 3:51 AM
server.py
run with bentoml serve ./service.py:svc --reload
import io

import bentoml
from bentoml.io import File, Text


class MyModelRunnable(bentoml.Runnable):
    SUPPORTS_CPU_MULTI_THREADING = True

    def __init__(self, model_name):
        self.model = None
        self.model_name = model_name

    @bentoml.Runnable.method(batchable=False)
    def transform_annotate(self, test_file):
        return "Annotated!"


test_runner = bentoml.Runner(
    MyModelRunnable,
    name="my_runner_1",
    runnable_init_params={
        "model_name": "test_model",
    },
)

svc = bentoml.Service("drug_reflector_service", runners=[test_runner])


@svc.api(input=File(), output=Text())
async def predict(test_file: io.BytesIO) -> str:
    return await test_runner.transform_annotate.async_run(test_file)
and a client running
import requests

res = requests.post(
    "http://0.0.0.0:3000/predict",
    files={"upload_file": open("test.txt", "rb")},
    headers={"content-type": "multipart/form-data"},
).text
but no matter what I seem to get an error:
Traceback (most recent call last):
File "/opt/conda/envs/saturn/lib/python3.9/site-packages/starlette/formparsers.py", line 171, in parse
boundary = params[b"boundary"]
KeyError: b'boundary'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/conda/envs/saturn/lib/python3.9/site-packages/starlette/requests.py", line 258, in _get_form
self._form = await multipart_parser.parse()
File "/opt/conda/envs/saturn/lib/python3.9/site-packages/starlette/formparsers.py", line 173, in parse
raise MultiPartException("Missing boundary in multipart.")
starlette.formparsers.MultiPartException: Missing boundary in multipart.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/conda/envs/saturn/lib/python3.9/site-packages/bentoml/_internal/server/http_app.py", line 311, in api_func
input_data = await api.input.from_http_request(request)
File "/opt/conda/envs/saturn/lib/python3.9/site-packages/bentoml/_internal/io_descriptors/file.py", line 222, in from_http_request
form = await request.form()
File "/opt/conda/envs/saturn/lib/python3.9/site-packages/starlette/requests.py", line 261, in _get_form
raise HTTPException(status_code=400, detail=exc.message)
I can't seem to get any files to upload without breaking. Does anyone have an idea?
A:
It looks like you are having an issue with the multipart form data in your request. Make sure that you are including the correct boundary in the request headers. You can also try using a different library for making the request, such as the requests library. Additionally, you can try using a different encoding type for the request, such as JSON or XML.
When using the requests library, you can use the following code to make the request:
import requests
requests.post(
    "http://0.0.0.0:3000/predict",
    headers={"content-type": "application/json"},
    data='[[5,4,3,2]]',
).text
Alternatively, you can use the curl command to make the request:
curl -X POST -H "Content-Type: application/json" \
    --data '[{"0":5,"1":4,"2":3,"3":2}]' http://0.0.0.0:3000/predict
For more information on using the requests library, please refer to the documentation here: https://requests.readthedocs.io/en/master
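The traceback itself hints at a simpler fix than switching encodings: starlette raises "Missing boundary in multipart." when the content-type header lacks the boundary parameter, and hand-writing headers = {"content-type": "multipart/form-data"} does exactly that. A minimal sketch of the client call, assuming the same endpoint and test.txt as above, that lets requests generate the header (boundary included) itself:

import requests

# No manual content-type header: when files= is used, requests builds the
# multipart body and sets "multipart/form-data; boundary=..." on its own.
res = requests.post(
    "http://0.0.0.0:3000/predict",
    files={"upload_file": open("test.txt", "rb")},
).text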
Powered by Bento Cloud.

David Kim
03/03/2023, 3:53 AM

bentogpt
03/03/2023, 3:53 AM

sauyon
03/03/2023, 3:58 AM
Have you tried downgrading to starlette==0.23.0 and bentoml==1.0.13?

bentogpt
03/03/2023, 3:59 AM
Have you tried downgrading to starlette==0.23.0 and bentoml==1.0.13?
A:
Yes, I can try downgrading to starlette==0.23.0 and bentoml==1.0.13. Additionally, I can also try explicitly providing a destination directory to MLflow so that I don't accidentally download into the root of the Bento model temp dir (using a model:/ url can cause this). I can also try using the public API download_artifacts introduced in MLflow 1.25, or for MLflow < 1.25, I can use the _download_artifact_from_uri function. Additionally, I can also specify the desired variant of BentoML in the python.packages field, such as "bentoml[grpc]", "bentoml[aws]", "bentoml[io-json]", "bentoml[io-image]", "bentoml[io-pandas]", "bentoml[tracing-jaeger]", "bentoml[tracing-zipkin]", or "bentoml[tracing-otlp]". Additionally, I can view and manage all saved models via the bentoml CLI command, serve the saved bentos by running the bentoml serve command, and generate
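For reference, the downgrade sauyon suggests can be pinned directly with pip; the exact versions come from the messages above:

pip install "starlette==0.23.0" "bentoml==1.0.13"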
Powered by Bento Cloud.

sauyon
03/03/2023, 4:02 AM

David Kim
03/03/2023, 4:02 AM

David Kim
03/03/2023, 4:03 AM

David Kim
03/03/2023, 4:38 AM

sauyon
03/03/2023, 4:38 AM

David Kim
03/03/2023, 4:42 AM
import requests

res = requests.post(
    "http://0.0.0.0:3000/predict",
    files={"file": ("test.txt", open("test.txt", "rb"), "application/octet-stream")},
    headers={},
).text
And I specified the mime type!

sauyon
03/03/2023, 4:43 AM

David Kim
03/03/2023, 4:43 AM
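The resolution above works because leaving the request-level headers empty lets requests compute the multipart content-type (boundary included), while the three-element tuple in files only sets the MIME type of the individual file part. One way to see this without a running server is to prepare the request and inspect the generated header; a sketch, with an in-memory payload standing in for test.txt:

import requests

# Build but don't send the request, so the generated headers can be inspected.
req = requests.Request(
    "POST",
    "http://0.0.0.0:3000/predict",
    files={"file": ("test.txt", b"hello", "application/octet-stream")},
).prepare()

# Prints e.g. "multipart/form-data; boundary=8d1f...": the boundary parameter
# is exactly what starlette's parser was missing in the failing requests.
print(req.headers["Content-Type"])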