# ask-for-help
l
Hi Sarthak, we don't have documentation about integrating with Jenkins yet. On this page https://docs.bentoml.org/en/latest/guides/ci.html we have some pointers to the relevant topics. Could you share with us how you would like to integrate with Jenkins?
s
I will have the source code in GitHub and will try to build a CI/CD pipeline that creates a prediction service using BentoML and Jenkins.
Any help or documentation regarding this would be highly appreciated.
l
What's the deployment target? AWS EC2?
Does the source code contain training code? So the pipeline will train a new model for each commit?
s
The deployment target is building a Docker image on every new commit and pushing that image to Azure Container Registry. The source code will contain the usual BentoML service.py and YAML file, the same ones we use to create a bento from the model pkl file and then containerize that bento.
l
May I ask how you want to manage model versioning? For example, if you train a new version of the model and want to use it in production, how do you trigger the CI/CD pipeline to bundle the new model and produce a new Docker image? Do you have a remote model registry and keep the version inside the service.py repository?
s
I am using an Azure VM. I have installed Jenkins on top of it. So every time a new model is trained by committing new code, a new bento with the latest tag will be stored on the VM. This is my thought process.
l
If I understand correctly, that means all your training code and BentoML code (service.py etc.) is in the same repository. Every time you commit new training code, the pipeline should:
1. train a new model and save it to disk
2. build a new bento using the new model and the BentoML-related code in the repository
3. maybe use `bentoml containerize` to build a Docker image and push it to a registry

Is this what you want to achieve?
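The three steps above could be sketched as a shell script (a minimal sketch; `train.py`, the service name `iris_classifier`, and the registry URL are assumptions for illustration, not details from this thread):

```shell
#!/usr/bin/env bash
set -euo pipefail

# 1. Train a new model; assume train.py saves it to the local
#    BentoML model store at the end (e.g. via a save_model call).
python train.py

# 2. Build a new bento from service.py and bentofile.yaml in the
#    repository; BentoML assigns the bento a fresh version tag.
bentoml build

# 3. Containerize the latest bento and push the image
#    (registry URL is a placeholder).
bentoml containerize iris_classifier:latest -t myregistry.azurecr.io/iris_classifier:latest
docker push myregistry.azurecr.io/iris_classifier:latest
```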
s
Exactly, larme! This is the thought process. The only thing is that instead of a local disk I have to use an Azure VM!
l
I think the workflow is that you push the commit to GitHub, which then triggers the Jenkins pipeline in the Azure VM (maybe using a webhook). The pipeline will do steps 1, 2, and 3 mentioned above in the Azure VM and push a Docker image to the registry.
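Wired into Jenkins, this workflow might look like the following declarative Jenkinsfile (a hedged sketch, not taken from any official BentoML integration; the stage layout, `train.py`, the service name, and the registry URL are all assumptions):

```groovy
// Jenkinsfile — triggered by a GitHub webhook pointing at the
// Jenkins instance running on the Azure VM.
pipeline {
    agent any
    stages {
        stage('Train') {
            steps {
                sh 'python train.py'   // hypothetical training script
            }
        }
        stage('Build bento') {
            steps {
                sh 'bentoml build'
            }
        }
        stage('Containerize and push') {
            steps {
                sh 'bentoml containerize iris_classifier:latest -t myregistry.azurecr.io/iris_classifier:latest'
                sh 'docker push myregistry.azurecr.io/iris_classifier:latest'
            }
        }
    }
}
```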
s
How do I define a Jenkins pipeline for that? Is there any documentation for it, or any blog that has done the same? Any help would be highly appreciated!
l
As I mentioned before we don't have Jenkins integration docs yet. But I can try to work out an example this week. I will ping you back with my findings.
And thanks for the information you provided. It's very helpful for us to improve our docs!
s
Sure, larme! I will eagerly look forward to it, as it is critical for my project! 👍
l
Hi Sarthak, please check this repo: https://github.com/larme/bentoml-jenkins-simple I hope it provides some useful information for your integration.
s
GR8... Thanks a lot, larme!
l
Hi @Sarthak Verma, how's your project going? I have a question about your use case. If your training pipeline is triggered by training code changes, how do you train a new model if the code does not change but the training data changes?
s
Hi @larme, I am working on it. I have been able to push a Docker image to ACR. Actually, I don't have to build a model; my project involves deploying pickle files to a prediction service. So if a new pickle file is uploaded, I would like my pipeline to be triggered.
@larme, can bentos be overwritten?
What if I change my code in the repo and try to run the pipeline again with the same name? In that case, would the bentos be overwritten?
l
Every time you save a bento without specifying the version, BentoML should generate a new version for the saved bento, so the bento won't be overwritten.
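To illustrate the idea (this is not BentoML's actual version-generation algorithm, just a sketch of the behavior: each unversioned save gets a fresh unique tag, while an explicit version would be reused):

```python
import uuid
from typing import Optional

def new_bento_tag(name: str, version: Optional[str] = None) -> str:
    """Return a bento-style "name:version" tag.

    When no version is given, generate a fresh unique one, so repeated
    saves never collide. This mimics the behavior described above, not
    BentoML's real implementation.
    """
    if version is None:
        version = uuid.uuid4().hex[:12]  # unique per save
    return f"{name}:{version}"

tag_a = new_bento_tag("iris_classifier")
tag_b = new_bento_tag("iris_classifier")
# Two unversioned saves yield different tags: neither overwrites the other.
assert tag_a != tag_b

# An explicit version is kept as-is (and would overwrite an existing bento).
assert new_bento_tag("iris_classifier", "v1") == "iris_classifier:v1"
```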
s
Hi larme, can BentoML be integrated with Azure Pipelines?
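(Since BentoML is just a Python library plus a CLI, any CI system that can run shell steps should be able to build and containerize bentos. An azure-pipelines.yml doing the same three steps might look like the sketch below; the pool image, `train.py`, the service name, and the registry URL are assumptions, not details from this thread.)

```yaml
trigger:
  - main

pool:
  vmImage: ubuntu-latest

steps:
  - script: pip install bentoml
    displayName: Install BentoML
  - script: python train.py          # hypothetical training script
    displayName: Train model
  - script: bentoml build
    displayName: Build bento
  - script: |
      bentoml containerize iris_classifier:latest -t myregistry.azurecr.io/iris_classifier:latest
      docker push myregistry.azurecr.io/iris_classifier:latest
    displayName: Containerize and push
```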