# ask-for-help
s
Thanks for the insightful post, @Eric Riddoch. It seems like you are integrating with New Relic through the Python Agent. I haven't read deeply into the implementation, but it appears the instrumentation is injected automatically into the ASGI application when the application is started with `newrelic-admin run-program uvicorn`. Treating BentoML as a simple ASGI application will not work out of the box: BentoML's runner architecture spawns multiple processes to optimize system resource utilization, so instead of a single uvicorn process, BentoML consists of multiple ASGI processes. I will need to look into New Relic further to investigate a better integration. Since BentoML is fully compliant with the OpenTelemetry and Prometheus standards, I'm hopeful that New Relic has proper support for them.
I'm not seeing any evidence of errors, even though most of my requests were to the `/newrelic-error` endpoint.
I think this is due to New Relic not recognizing `bentoml.exceptions.InternalServerError`.
There is no trace ID in the response headers; with FastAPI, the response headers include the trace ID. FastAPI also knows how to handle the case where a trace ID is present in the request headers.
As discussed in an earlier thread, I think a better approach is to pass through the caller's trace context, so the caller knows the trace ID implicitly when errors happen. I think the gist of all these issues is the Python Agent. Again, I believe a better integration with New Relic is just around the corner.
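Trace-context passthrough typically follows the W3C Trace Context standard: the caller sends a `traceparent` request header, and the server continues that trace instead of starting a new one, so the caller already knows the trace ID. A minimal sketch of parsing that header (the field layout is from the W3C spec; the helper name and return shape here are just for illustration):

```python
def parse_traceparent(header: str) -> dict:
    """Parse a W3C `traceparent` header: 'version-traceid-spanid-flags'.

    The field layout is defined by the W3C Trace Context spec; this
    helper name and return shape are illustrative, not a real API.
    """
    version, trace_id, parent_span_id, flags = header.split("-")
    if len(trace_id) != 32 or len(parent_span_id) != 16:
        raise ValueError("malformed traceparent header")
    return {
        "version": version,
        "trace_id": trace_id,            # the ID the caller can correlate on
        "parent_span_id": parent_span_id,
        "sampled": int(flags, 16) & 0x01 == 1,
    }

ctx = parse_traceparent("00-4bf92f3577b34da6a3ce929d0e0e4736-00f067aa0ba902b7-01")
print(ctx["trace_id"])  # 4bf92f3577b34da6a3ce929d0e0e4736
```

Because the caller generated that trace ID itself, it can look up the failed request in its tracing backend even when the server's error response carries no extra headers.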
e
If we could take advantage of Bento's already existing setup with prometheus and OpenTelemetry, that would be awesome!
s
New Relic supports OpenTelemetry Protocol for exporting trace data. In BentoML, you can customize the OTLP exporter in the Tracing configuration with New Relic’s endpoint.
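A sketch of what that tracing configuration might look like. The exact field names depend on your BentoML version, the endpoint is New Relic's documented OTLP ingest address as I understand it, and the license key is a placeholder; check BentoML's Tracing configuration docs and New Relic's OTLP docs for the authoritative values:

```yaml
# BentoML configuration (e.g. bentoml_configuration.yaml) — field names
# follow recent BentoML docs and may differ across versions.
tracing:
  exporter_type: otlp
  sample_rate: 1.0
  otlp:
    protocol: grpc
    endpoint: https://otlp.nr-data.net:4317   # New Relic OTLP endpoint (assumption)
    # New Relic authenticates OTLP requests via an api-key header
    # carrying your license key (placeholder value below):
    headers: "api-key=YOUR_NEW_RELIC_LICENSE_KEY"
```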
New Relic supports exporting Prometheus metrics through their Prometheus OpenMetrics integration for Docker.
Happy to work closer on a POC if you’d like to explore further. 🙂
e
I'd love that! I'm liking the sound of the OpenMetrics integration for Docker. If that's a subprocess we can run in the container that hits the `/metrics` endpoint on a schedule and publishes those metrics to New Relic, that would be perfect.
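For what it's worth, New Relic's OpenMetrics integration ships as a prebuilt container that does the scraping and publishing itself, so you'd likely run it as a sidecar rather than write this by hand. Still, the scrape-and-publish loop described above can be sketched as (function names and the `publish` callback are placeholders, not a real New Relic API):

```python
import time
import urllib.request

def parse_prometheus_text(text: str) -> dict:
    """Parse Prometheus exposition text into {metric: value}.

    Keeps labels as part of the key, e.g. 'bar{x="y"}'. Illustrative
    only; real scrapers also handle timestamps, histograms, etc.
    """
    samples = {}
    for line in text.splitlines():
        if not line or line.startswith("#"):
            continue  # skip blank lines and HELP/TYPE metadata
        name, _, value = line.rpartition(" ")
        try:
            samples[name] = float(value)
        except ValueError:
            continue
    return samples

def scrape_metrics(url: str) -> dict:
    """Fetch a /metrics page and parse it."""
    text = urllib.request.urlopen(url).read().decode()
    return parse_prometheus_text(text)

def scrape_forever(url: str, publish, interval_s: float = 15.0):
    """Naive scrape loop: fetch, publish, sleep. `publish` stands in
    for whatever forwards samples to New Relic."""
    while True:
        publish(scrape_metrics(url))
        time.sleep(interval_s)
```

In practice the integration container also handles batching, retries, and authentication, which is why running it as a sidecar is probably the easier path.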
Setting up the OpenTelemetry one is exciting, too. I'm not an OTel expert, so I'm not sure how that integration would go.
We could sign up for a trial New Relic account and put a trivial, dockerized sample project in an open-source repo, maybe under the BentoML organization.