# support-vasil
  • Bo

    08/05/2022, 2:22 AM
    Hey guys. Seems like we've been talking in DMs. Let's talk here so we can help you better.
  • Bo

    08/05/2022, 2:25 AM
    has renamed the channel from "support-runner_process" to "support-vasil"
  • Vasil Filipov

    08/05/2022, 2:29 AM
    Hello! So apart from the several model instances that I need to run in parallel on CPU, my other question would be: can I schedule big jobs with lots of images somehow, and not wait for them to complete over an open HTTP connection, but rather let them run in the background and check on their status occasionally? Something like BackgroundTasks in FastAPI.
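For context, the pattern Vasil is describing is FastAPI's BackgroundTasks: the endpoint returns immediately and the heavy work continues after the response is sent, with a separate endpoint for polling status. Below is a minimal sketch of that pattern under assumed names; the in-memory job store, the `/jobs` routes, and the `process_images` function are hypothetical illustrations, not part of BentoML or this thread.

```python
import uuid
from typing import List

from fastapi import BackgroundTasks, FastAPI

app = FastAPI()
jobs = {}  # hypothetical in-memory job-status store; use a real store in production

def process_images(job_id: str, image_paths: List[str]) -> None:
    # hypothetical long-running batch job over many images
    for path in image_paths:
        pass  # run the model on each image here
    jobs[job_id] = "done"

@app.post("/jobs")
async def submit_job(image_paths: List[str], background_tasks: BackgroundTasks):
    job_id = str(uuid.uuid4())
    jobs[job_id] = "running"
    # the response is returned immediately; the task runs after the response is sent
    background_tasks.add_task(process_images, job_id, image_paths)
    return {"job_id": job_id}

@app.get("/jobs/{job_id}")
async def job_status(job_id: str):
    return {"status": jobs.get(job_id, "unknown")}
```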
  • Bo

    08/05/2022, 2:37 AM
    I see, so you are looking to run things async?
  • Bo

    08/05/2022, 2:38 AM
    I think what you are looking for is an async endpoint. Right now we don't have that.
  • Bo

    08/05/2022, 2:38 AM
    Is that a blocker for you?
  • Bo

    08/05/2022, 2:38 AM
    Right now you can run multiple models in parallel
  • Bo

    08/05/2022, 2:39 AM
    By using my_runner.predict.async_run
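For reference, here is a minimal sketch of what Bo is pointing at: a BentoML 1.0 service whose endpoint awaits `runner.predict.async_run` so requests can be dispatched to runner workers concurrently. The framework (sklearn), the model tag `my_model:latest`, the service name, and the payload shape are assumptions for illustration.

```python
import bentoml
from bentoml.io import JSON

# Assumed: a model saved to the local model store as "my_model:latest"
# (the sklearn framework here is illustrative; use the framework your model was saved with).
runner = bentoml.sklearn.get("my_model:latest").to_runner()

svc = bentoml.Service("my_service", runners=[runner])

@svc.api(input=JSON(), output=JSON())
async def predict(payload):
    # async_run awaits the runner without blocking the event loop,
    # so many requests can be in flight across the runner workers at once
    result = await runner.predict.async_run(payload["data"])
    return {"result": result.tolist()}
```

In BentoML 1.0 the runner executes in its own worker process(es), separate from the API server, so awaiting `async_run` from an async endpoint is what lets several instances of the same model serve requests in parallel; the exact worker count depends on runner configuration.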
  • Vasil Filipov

    08/05/2022, 2:42 AM
    Yes, it needs to be async, and I need to run multiple instances of the same model
  • Vasil Filipov

    08/05/2022, 2:47 AM
    And having something similar to jobs running in the background, as described in the last section of the answer here - https://stackoverflow.com/questions/63169865/how-to-do-multiprocessing-in-fastapi - would be amazing
  • Bo

    08/05/2022, 2:48 AM
    Adding some people here who are more expert at this than me