Slackbot
06/08/2023, 9:10 AM

Jiang
06/08/2023, 10:34 AM

Aaron Pham
06/08/2023, 10:50 AM

Vivien Robert
06/08/2023, 2:34 PM
bentoml_logger = logging.getLogger("bentoml")
@svc.api(...)
async def predict(x):
    [...]
    logger.info("blablabla")
    [...]
    return ...
I also have the Jaeger all-in-one Docker container running alongside the service, with tracing enabled over gRPC.
However, when I make a request to the service, the trace only contains 4 spans:
• one for the whole predict function
• one named /predict http receive
• one named /predict http send, presumably for the response headers
• one named /predict http send, presumably for the response body
I would like the log emitted by logger.info("blablabla") to be part of the trace. The text version of the logs does contain the trace ID, i.e. blablabla (trace=...), but it doesn't show up in the trace in Jaeger's interface.
Does this make sense?
I have seen that BentoML already uses OpenTelemetry instrumentation tooling, but from what I've seen in the logging docs I would have expected it to work for the BentoML logger as well.
Thank you!
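
A minimal sketch of one workaround, assuming the standard OpenTelemetry API is importable inside the service: attach the message to the current request span as an event, which Jaeger shows under the span's "Logs" section. The log_as_span_event helper name is made up for illustration and is not a BentoML API.

from opentelemetry import trace

def log_as_span_event(message: str) -> None:
    # Attach the message to whatever span is currently active
    # (get_current_span() returns a non-recording span if there is none).
    span = trace.get_current_span()
    if span.is_recording():
        span.add_event(message, {"log.level": "INFO"})

# Inside the API function, instead of (or in addition to) logger.info("blablabla"):
#     log_as_span_event("blablabla")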
Aaron Pham
06/26/2023, 10:42 AM
bentoml_logger is reserved specifically for BentoML internals. To include tracing from within the API function call, we would probably need to add more support with respect to logs.
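
If the goal is to keep using plain logging calls, one possible bridge (a sketch, not an existing BentoML feature) is a logging.Handler that copies each record onto the currently active span as an event, attached to an application logger rather than the reserved "bentoml" one. SpanEventHandler and the "my_service" logger name are assumptions for illustration.

import logging

from opentelemetry import trace

class SpanEventHandler(logging.Handler):
    """Copy each log record onto the currently active OpenTelemetry span as an event."""

    def emit(self, record: logging.LogRecord) -> None:
        span = trace.get_current_span()
        if span.is_recording():
            span.add_event(
                record.getMessage(),
                {"log.level": record.levelname, "log.logger": record.name},
            )

# Attach to an application logger (not the reserved "bentoml" logger).
logger = logging.getLogger("my_service")
logger.setLevel(logging.INFO)
logger.addHandler(SpanEventHandler())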