# ask-for-help
f
version: '3.7'
services:
  bento:
    image: garbage_classification_service:nnarakeap2ra6wew
    ports:
      - 3000:3000
    networks:
      - backend
  streamlit:
    image: streamlit_app:v1
    command: bash -c "streamlit run streamlit_app.py"
    ports:
      - 8501:8501
    environment:
      - API_ENDPOINT=bento:3000/predict
    depends_on:
      - bento
    networks:
      - backend
networks:
  backend:
    driver: bridge
import io
import os

import numpy as np
import requests
import streamlit as st
from PIL import Image

url = 'http://127.0.0.1:3000/predict'
#url = os.getenv('API_ENDPOINT', 'http://127.0.0.1:3000/predict')
#url = 'http://garbage_classification_service:nnarakeap2ra6wew:3000/predict'

# Create the header page content
st.title("Garbage Classification App")
st.markdown(
    "### Classify your garbage images between 'cardboard', 'glass', 'metal', 'paper', 'plastic' and 'trash'",
    unsafe_allow_html=True,
)

# Upload a cover image
with open("garbage.jpg", "rb") as f:
    st.image(f.read(), use_column_width=True)
st.text("Upload your image.")


def predict(img):
    """
    A function that sends a prediction request to the API and returns the garbage class.
    """
    # Convert the bytes image to a NumPy array
    bytes_image = img.getvalue()
    numpy_image_array = np.array(Image.open(io.BytesIO(bytes_image)))

    # Send the image to the API
    response = requests.post(
        url,
        headers={"content-type": "text/plain"},
        data=str(numpy_image_array),
    )
    if response.status_code == 200:
        return response.text
    else:
        raise Exception("Status: {}".format(response.status_code))


# two image input components: one for file uploads and another for a web-cam input
def main():
    img_file = st.file_uploader("Upload an image", type=["jpg", "png"])
    if img_file is not None:
        with st.spinner("Predicting..."):
            prediction = predict(img_file)
        st.success(f"Your image is {prediction}")

    camera_input = st.camera_input("Or take a picture")
    if camera_input is not None:
        with st.spinner("Predicting..."):
            prediction = predict(camera_input)
        st.success(f"Your image is {prediction}")


if __name__ == "__main__":
    main()
Hope this is understandable ... please let me know if you need more or different info. Thanks!
j
can you try with the URL
http://bento:3000
?
if it's still not working, add a curl service to check that the bento is visible from the network:
curl:
    image: curlimages/curl
    command: http://bento:3000
    depends_on:
      - bento
    networks:
      - backend
and then try docker compose run curl again after the bento is up and running
f
Hi @jjmachan, thanks so much for your answer.
http://bento:3000
is not reachable in the browser
adding and running the curl part gives the following output:
Starting streamlit-bentoml_bento_1 ... done
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>BentoML Prediction Service</title>
<!-- Google Tag Manager -->
<script>(function(w,d,s,l,i){w[l]=w[l]||[];w[l].push({'gtm.start':
new Date().getTime(),event:'gtm.js'});var f=d.getElementsByTagName(s)[0],
j=d.createElement(s),dl=l!='dataLayer'?'&l='+l:'';j.async=true;j.src=
'https://www.googletagmanager.com/gtm.js?id='+i+dl;f.parentNode.insertBefore(j,f);
})(window,document,'script','dataLayer','GTM-WNPGWRM');</script>
<!-- End Google Tag Manager -->
<link rel="stylesheet" type="text/css" href="./static_content/swagger-ui.css" />
<link rel="stylesheet" type="text/css" href="./static_content/index.css" />
<link rel="icon" type="image/png" href="./static_content/favicon-32x32.png" sizes="32x32" />
<link rel="icon" type="image/png" href="./static_content/favicon-96x96.png" sizes="96x96" />
</head>
<body>
<!-- Google Tag Manager (noscript) -->
<noscript><iframe src="https://www.googletagmanager.com/ns.html?id=GTM-WNPGWRM"
height="0" width="0" style="display:none;visibility:hidden"></iframe></noscript>
<!-- End Google Tag Manager (noscript) -->
<div id="swagger-ui"></div>
<script src="./static_content/swagger-ui-bundle.js" charset="UTF-8"> </script>
<script src="./static_content/swagger-ui-standalone-preset.js" charset="UTF-8"> </script>
<script src="./static_content/swagger-initializer.js" charset="UTF-8"> </script>
<div class="version">
<div class="version-section"><a href="https://github.com/bentoml/BentoML" class="github-corner" aria-label="Powered by BentoML">
<svg width="80" height="80" viewBox="0 0 250 250" style="fill:#151513; color:#fff; position: absolute; top: 0; border: 0; right: 0;" aria-hidden="true">
<path d="M0,0 L115,115 L130,115 L142,142 L250,250 L250,0 Z"></path>
<path d="M128.3,109.0 C113.8,99.7 119.0,89.6 119.0,89.6 C122.0,82.7 120.5,78.6 120.5,78.6 C119.2,72.0 123.4,76.3 123.4,76.3 C127.3,80.9 125.5,87.3 125.5,87.3 C122.9,97.6 130.6,101.9 134.4,103.2" fill="currentColor" style="transform-origin: 130px 106px;" class="octo-arm"></path>
<path d="M115.0,115.0 C114.9,115.1 118.7,116.5 119.8,115.4 L133.7,101.6 C136.9,99.2 139.9,98.4 142.2,98.6 C133.8,88.0 127.5,74.4 143.8,58.0 C148.5,53.4 154.0,51.2 159.7,51.0 C160.3,49.4 163.2,43.6 171.4,40.1 C171.4,40.1 176.1,42.5 178.8,56.2 C183.1,58.6 187.2,61.8 190.9,65.4 C194.5,69.0 197.7,73.2 200.1,77.6 C213.8,80.2 216.3,84.9 216.3,84.9 C212.7,93.1 206.9,96.0 205.4,96.6 C205.1,102.4 203.0,107.8 198.3,112.5 C181.9,128.9 168.3,122.5 157.7,114.1 C157.9,116.9 156.7,120.9 152.7,124.9 L141.0,136.5 C139.8,137.7 141.6,141.9 141.8,141.8 Z" fill="currentColor" class="octo-body"></path>
</svg></a>
</div>
</div>
</body>
</html>
I'm not really sure what this tells me. Any help is appreciated. Thanks!
PS: when running
docker-compose up
the curl service gives the following message:
curl: (7) Failed to connect to bento port 3000 after 0 ms: Couldn't connect to server
j
so the containers are connected. docker compose sets up its own network when the network driver is
bridge
(ref: Networking overview | Docker Documentation). Did you try changing the URL to
bento:3000
in the streamlit app? (ref: Networking in Compose | Docker Documentation)
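Picking up the commented-out `os.getenv` line already in the script above, one way to make the same code work both locally and inside compose is to resolve the endpoint from the environment at startup and fall back to localhost. A sketch; the variable name `API_ENDPOINT` comes from the compose file in this thread, the default value is an assumption:

```python
import os

# Resolve the prediction endpoint at startup. Inside docker compose, the
# streamlit service sets API_ENDPOINT (e.g. http://bento:3000/predict,
# using the compose service name as the hostname); when the app is run
# outside compose the variable is unset and we fall back to localhost.
url = os.getenv("API_ENDPOINT", "http://127.0.0.1:3000/predict")
```

With this, the same image works unchanged in both environments; only the compose file decides where requests go.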
f
Hi @jjmachan, thanks again so much for thinking about my problem. I finally got it running. I just changed one line in the docker-compose.yaml. In the streamlit service I changed the API_ENDPOINT to
API_ENDPOINT=http://bento:3000/predict
and it seems to work. I'm not sure why this is necessary though ...
However, now I always get the same prediction from my model (which is not the case when I use it without bentoml). Any ideas about that? Thanks!
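As for why the change is necessary: on a user-defined bridge network, compose's embedded DNS resolves each service name to its container, so `bento` is the hostname the streamlit container must use (`127.0.0.1` would point at the streamlit container itself), and `requests` needs the explicit `http://` scheme. A sketch of the relevant service entry with the working value:

```yaml
  streamlit:
    image: streamlit_app:v1
    environment:
      # "bento" is the compose service name; it resolves via compose's
      # embedded DNS on the shared backend network. requests needs the
      # http:// scheme to be present in the URL.
      - API_ENDPOINT=http://bento:3000/predict
    depends_on:
      - bento
    networks:
      - backend
```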
j
can you change the network driver to
host
and verify the bento manually with the Swagger UI. Once the driver is changed it should be available via localhost. Do refer to the doc I mentioned; they do a much better job at explaining how networking is set up inside docker compose.
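If you want to try the host-networking route suggested above, note that in compose it is usually configured per service via `network_mode` rather than as a network driver, and it only works on Linux. A sketch, using the image name from earlier in this thread:

```yaml
services:
  bento:
    image: garbage_classification_service:nnarakeap2ra6wew
    # Share the host's network stack (Linux only). The service is then
    # reachable at http://localhost:3000, so the Swagger UI can be opened
    # directly in a browser. "ports:" mappings have no effect in this mode.
    network_mode: host
```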