# ask-for-help
b
Q: Hi, I'm trying to run a simple service, service.py, which I run with
bentoml serve ./service.py:svc --reload
import io

import bentoml
from bentoml.io import File, Text


class MyModelRunnable(bentoml.Runnable):
    SUPPORTS_CPU_MULTI_THREADING = True

    def __init__(self, model_name):
        self.model = None
        self.model_name = model_name

    @bentoml.Runnable.method(batchable=False)
    def transform_annotate(self, test_file):
        return "Annotated!"


test_runner = bentoml.Runner(
    MyModelRunnable,
    name="my_runner_1",
    runnable_init_params={
        "model_name": "test_model",
    },
)

svc = bentoml.Service("drug_reflector_service", runners=[test_runner])


@svc.api(input=File(), output=Text())
async def predict(test_file: io.BytesIO) -> str:
    return await test_runner.transform_annotate.async_run(test_file)
and a client running
import requests

res = requests.post(
    "http://0.0.0.0:3000/predict",
    files={"upload_file": open("test.txt", "rb")},
    headers={"content-type": "multipart/form-data"},
).text
but no matter what I seem to get an error
Traceback (most recent call last):
  File "/opt/conda/envs/saturn/lib/python3.9/site-packages/starlette/formparsers.py", line 171, in parse
    boundary = params[b"boundary"]
KeyError: b'boundary'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/conda/envs/saturn/lib/python3.9/site-packages/starlette/requests.py", line 258, in _get_form
    self._form = await multipart_parser.parse()
  File "/opt/conda/envs/saturn/lib/python3.9/site-packages/starlette/formparsers.py", line 173, in parse
    raise MultiPartException("Missing boundary in multipart.")
starlette.formparsers.MultiPartException: Missing boundary in multipart.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/conda/envs/saturn/lib/python3.9/site-packages/bentoml/_internal/server/http_app.py", line 311, in api_func
    input_data = await api.input.from_http_request(request)
  File "/opt/conda/envs/saturn/lib/python3.9/site-packages/bentoml/_internal/io_descriptors/file.py", line 222, in from_http_request
    form = await request.form()
  File "/opt/conda/envs/saturn/lib/python3.9/site-packages/starlette/requests.py", line 261, in _get_form
    raise HTTPException(status_code=400, detail=exc.message)
I can't seem to get any files to upload without breaking. Does anyone have an idea?
A: It looks like you are having an issue with the multipart form data in your request. Make sure that you are including the correct boundary in the request headers. You can also try using a different library for making the request, such as the requests library. Additionally, you can try using a different encoding type for the request, such as JSON or XML. When using the requests library, you can use the following code to make the request:
import requests

requests.post(
    "http://0.0.0.0:3000/predict",
    headers={"content-type": "application/json"},
    data='[[5,4,3,2]]',
).text
Alternatively, you can use the curl command to make the request:
curl -X POST -H "Content-Type: application/json" \
    --data '[{"0":5,"1":4,"2":3,"3":2}]' http://0.0.0.0:3000/predict
For more information on using the requests library, please refer to the documentation here: https://requests.readthedocs.io/en/master
Powered by Bento Cloud.
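(A side note for readers: the traceback above is reproducible with a minimal sketch. When files= is passed, requests builds a multipart body and a matching Content-Type header that carries a boundary parameter, but a manually supplied content-type header takes precedence, so the boundary is dropped and starlette rejects the request. The URL and file contents below are placeholders, not from the thread.)
import requests

# Minimal repro sketch: the hand-written content-type header wins over
# the multipart header requests would otherwise generate, so the
# boundary=... parameter never reaches the server.
resp = requests.post(
    "http://0.0.0.0:3000/predict",
    files={"upload_file": ("test.txt", b"dummy contents")},
    headers={"content-type": "multipart/form-data"},  # no boundary
)
print(resp.status_code)  # expect 400: "Missing boundary in multipart."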
d
This is not the answer I'm looking for
b
Q: This is not the answer I'm looking for
A: We apologize for the inconvenience. Please try refining your search terms or using different keywords to find the information you are looking for. You can also refer to the BentoML documentation overview page for more information on Ray integration, the API Reference page for more detailed information, the Advanced Guides page for more in-depth information on topics such as logging, performance, GPU, and security, or the Quick Search page for a more comprehensive search.
Sources: docs.bentoml.org/en/latest/integrations/ray.html, docs.bentoml.org/en/latest/documentation_overview.html, docs.bentoml.org/en/latest/api_reference.html, docs.bentoml.org/en/latest/guides/metrics.html, docs.bentoml.org/en/latest/guides/gpu.html
Powered by Bento Cloud.
s
Hm, sorry, I think there's a bug in the bot and it'll keep responding, but don't worry about that for now. Can you try downgrading to starlette==0.23.0 and bentoml==1.0.13?
👍 1
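(For anyone following along, a quick way to confirm a downgrade like this took effect, assuming both packages are importable in the serving environment; this sketch is illustrative, not from the thread.)
import bentoml
import starlette

# Both packages expose __version__, so a simple print confirms the pins.
print(bentoml.__version__)    # expect 1.0.13
print(starlette.__version__)  # expect 0.23.0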
b
Q: Hm, sorry, I think there's a bug in the bot and it'll keep responding, but don't worry about that for now. Can you try downgrading to starlette==0.23.0 and bentoml==1.0.13?
A: Yes, I can try downgrading to starlette==0.23.0 and bentoml==1.0.13. Additionally, I can also try explicitly providing a destination directory to MLflow so that I don't accidentally download into the root of the Bento model temp dir (using a model:/ url can cause this). I can also try using the public API download_artifacts introduced in MLflow 1.25, or for MLflow < 1.25, I can use the _download_artifact_from_uri function. Additionally, I can also specify the desired variant of BentoML in the python.packages field, such as "bentoml[grpc]", "bentoml[aws]", "bentoml[io-json]", "bentoml[io-image]", "bentoml[io-pandas]", "bentoml[tracing-jaeger]", "bentoml[tracing-zipkin]", or "bentoml[tracing-otlp]". Additionally, I can view and manage all saved models via the bentoml CLI command, serve the saved bentos by running the bentoml serve command, and generate
Powered by Bento Cloud.
s
Ok, I've turned the bot off 😅 Sorry about that!
🙏 1
d
Ah, thank you very much Sauyon!
I pip installed those pinned versions; I think I'm getting the same error.
Solved: I was setting the request headers manually.
s
Oh, that's great! What exactly was the problem?
d
This Stack Overflow post told me to avoid setting the headers manually, so I changed the client code to:
import requests

res = requests.post(
    "http://0.0.0.0:3000/predict",
    files={"file": ("test.txt", open("test.txt", "rb"), "application/octet-stream")},
    headers={},
).text
And I specified the MIME type!
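(For future readers, a minimal sketch of why leaving the header alone works: when requests encodes the multipart body itself, it generates a boundary and embeds it in the Content-Type header. The file name and contents here are placeholders.)
import requests

# Let requests build the multipart header; it includes the boundary.
prepared = requests.Request(
    "POST",
    "http://0.0.0.0:3000/predict",
    files={"file": ("test.txt", b"dummy contents", "application/octet-stream")},
).prepare()
print(prepared.headers["Content-Type"])
# e.g. multipart/form-data; boundary=9bf7186b... (randomly generated)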
s
Ah, I see! Good to know for the future, thanks!
d
😄 Thank you for your help as well!