Exporting metrics on a default port + path? #32
Replies: 5 comments 9 replies
-
It seems like a lot of work, because on top of the variance in OT/OM/Prom default settings, there are at least two other sources of divergence:
Since there's no real way to handle those differences at runtime (unless the language is very dynamic), that puts a lot of pressure on the code-generation/macro part of AM to implement all the combinations for every use case, in my opinion.
-
Okay, I think I want to propose using OpenTelemetry's port 9464 by default. That was originally the port used for the OTel JS implementation specifically, but they intentionally decided to use that same port for all implementations of the OTel Prometheus exporter (open-telemetry/opentelemetry-specification#1021). Prometheus has a big list of ports used by different exporters; theoretically, we could pick a number there. But I think you could make a case that autometrics isn't an exporter in its own right. Rather, it's an opinionated layer on top of OpenTelemetry, so it makes sense to just export the metrics it creates in the normal OpenTelemetry way.
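To make the proposal concrete, here is a minimal sketch of what "export on 9464 by default, but overridable" could look like. This is stdlib Python only; `render_metrics`, `serve`, and the `METRICS_PORT` environment variable are hypothetical names for illustration, not an actual autometrics API.

```python
# Sketch: serving metrics on the OTel Prometheus exporter's default port.
# Only the number 9464 comes from the spec discussion above; everything
# else here is a made-up illustration.
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

DEFAULT_PORT = 9464  # OpenTelemetry Prometheus exporter default

def render_metrics() -> bytes:
    # Placeholder: a real library would serialize its metric registry here.
    return b"# TYPE function_calls_count counter\nfunction_calls_count 1\n"

class MetricsHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path != "/metrics":
            self.send_error(404)
            return
        body = render_metrics()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain; version=0.0.4")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

def serve(port: int = DEFAULT_PORT) -> HTTPServer:
    # An env-var override keeps the default convenient without hard-coding
    # the port; passing 0 lets the OS pick a free port (handy for tests).
    port = int(os.environ.get("METRICS_PORT", port))
    return HTTPServer(("127.0.0.1", port), MetricsHandler)
```

The default stays aligned with the OTel convention, while anyone who needs a different port can override it without code changes.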
-
My $0.02:
-
Another issue with running this on a predefined port is that it makes it very tricky to run multiple services at the same time, especially if they start the server by default. We can make it easier for users by providing the handler logic and letting them hook it up to their own web server. The ease of use and quality of this does depend on the language. Regarding usage with …
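The collision is easy to demonstrate: with a fixed default port, only the first service to start can bind it. A stdlib-only sketch (no autometrics API implied; port 0 is used so the example stays runnable anywhere):

```python
# Sketch of the port-collision problem with a predefined default port.
import socket
from typing import Optional

def try_bind(port: int) -> Optional[socket.socket]:
    """Return a bound, listening socket, or None if the port is taken."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        s.bind(("127.0.0.1", port))
        s.listen()
        return s
    except OSError:  # EADDRINUSE
        s.close()
        return None

# First "service" grabs a port (0 = let the OS choose one that's free).
first = try_bind(0)
port = first.getsockname()[1]
# A second "service" configured with the same default port loses the race.
second = try_bind(port)
```

If both services started a metrics server by default on the same port, the second one would fail at startup (or silently not expose metrics, depending on how the library handles the error).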
-
I think the general consensus is that it's a bit odd for a library to start a server just by being included, not to mention all the edge cases around the same port being used, or even different versions of the same library within one Rust application. Library authors are encouraged to make it easy to expose the metrics by encapsulating the logic for various popular web frameworks, but they should not start any servers by default.
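The "ship the handler, not the server" pattern could look roughly like this: the library exposes plain functions, and the user mounts them in whatever server they already run. A WSGI sketch, assuming hypothetical names (`render_metrics`, `metrics_wsgi_app` are illustrations, not a real autometrics API):

```python
# Sketch: expose a handler that users mount themselves; no port is opened
# by the library. Names here are hypothetical.

def render_metrics() -> bytes:
    # Placeholder for serializing the library's metric registry.
    return b"# TYPE function_calls_count counter\nfunction_calls_count 1\n"

def metrics_wsgi_app(environ, start_response):
    """A WSGI adapter: usable from any WSGI-capable framework or server."""
    body = render_metrics()
    start_response(
        "200 OK",
        [("Content-Type", "text/plain; version=0.0.4"),
         ("Content-Length", str(len(body)))],
    )
    return [body]
```

A user who already runs a web server just mounts this under `/metrics` on their existing port; framework-specific adapters (for the popular frameworks mentioned above) would be thin wrappers around the same function.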
-
Should autometrics libraries automatically or optionally start a server that exports metrics on a default port + path?
Prometheus and OpenTelemetry libraries in different languages take mixed approaches to this: some expect you to add the listener yourself, while others do it automatically.
Standardizing this across languages would help reduce the configuration needed for any additional tooling we build around autometrics. For example, we could ship a default Prometheus config and/or config files for hosted Prometheus services like Fly.io's.
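As a sketch of what that default config could be, a single scrape stanza would suffice (the job name is made up, and the target assumes the OpenTelemetry Prometheus exporter's default port 9464 mentioned elsewhere in this thread):

```yaml
# Hypothetical default Prometheus scrape config for autometrics services.
scrape_configs:
  - job_name: autometrics
    static_configs:
      - targets: ["localhost:9464"]
```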
That said, we might want to make this optional (though I'm not sure whether it should be opt-in or opt-out). I personally find it surprising that a library I import would open a listener on a port without my explicitly telling it to.
Unfortunately, if we do this, this is another place where OpenTelemetry, OpenMetrics, and Prometheus diverge 🤦.