I'm using the new Observation API in Spring Boot 3.2.2. When I create an Observation.Context I supply my own high-cardinality values (e.g. requestId, conversationId), and I assumed these would be added as tags and published with the metric that is pushed to Elastic via the ElasticMeterRegistry. However, when I looked at the source for io.micrometer.core.instrument.observation.DefaultMeterObservationHandler, I can see it only creates tags for the low-cardinality values from the Observation.Context. I suspect the reason for this is how registration with the MeterRegistry works:
As the JavaDoc for the io.micrometer.core.instrument.Counter.Builder.register method states, a new counter is returned only if a counter with the same name and tag values does not already exist, because each registry is guaranteed to only create one counter for the same combination of name and tags.
Therefore, if tags were used for high-cardinality values, the MeterRegistry could end up creating a new meter for every distinct value, and you would get a memory leak. Jonatan Ivanov (Spring Engineering/Micrometer) discusses this in his post https://develotters.com/posts/high-cardinality/. I think others have asked for a feature to add tags dynamically, for example this approach: https://dzone.com/articles/spring-boot-metrics-with-dynamic-tag-values.
If this is the case, then what's the point of having high-cardinality values in the Observation.Context at all? And is there any way to publish these high-cardinality values as extra context around the metric? I think Jonatan Ivanov is suggesting this kind of information shouldn't be in the metric itself but in the logging. That seems like a big drawback of Micrometer if you can't add this extra contextual info, and if it must live in the application logging, how can you link your metric to your log statement?
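For reference, here is roughly what I'm doing (simplified; the observation name, the channel key-value, and the class/method names are just placeholders for my real code):

```java
import io.micrometer.observation.Observation;
import io.micrometer.observation.ObservationRegistry;
import org.springframework.stereotype.Service;

@Service
public class OrderService {

    // Spring Boot's auto-configured ObservationRegistry; in my setup the
    // DefaultMeterObservationHandler publishes to the MeterRegistry
    // (ElasticMeterRegistry in my case).
    private final ObservationRegistry observationRegistry;

    public OrderService(ObservationRegistry observationRegistry) {
        this.observationRegistry = observationRegistry;
    }

    public void process(String requestId, String conversationId) {
        Observation.createNotStarted("order.processing", observationRegistry)
                .lowCardinalityKeyValue("channel", "web")                  // bounded set of values
                .highCardinalityKeyValue("requestId", requestId)           // unbounded, per request
                .highCardinalityKeyValue("conversationId", conversationId) // unbounded, per request
                .observe(this::doProcess);
    }

    private void doProcess() {
        // business logic
    }
}
```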
Question 1
I'm not 100% sure I get this, but being able to attach tags dynamically is not the same as attaching high-cardinality data. You can do the former in multiple ways (see https://github.com/micrometer-metrics/micrometer/pull/4097), but you should not do the latter; my blog post calls out why:
This is not unique to Micrometer; it is true for every metric library and metric backend. The point of being able to add high-cardinality data to an Observation is being able to use it everywhere other than metrics: tracing, logging, etc. The Observation API is not just an extra layer that creates metrics for you; it is an API you can use to produce whatever output you want from your observations, and metrics are only one of those outputs.
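To make that concrete, here is a minimal, self-contained sketch (the names checkout, region and requestId are made up for illustration): with only the DefaultMeterObservationHandler registered, the meters are tagged with the low-cardinality key-value only, while the high-cardinality one stays on the Observation.Context where other handlers (tracing, logging, etc.) can pick it up.

```java
import io.micrometer.core.instrument.observation.DefaultMeterObservationHandler;
import io.micrometer.core.instrument.simple.SimpleMeterRegistry;
import io.micrometer.observation.Observation;
import io.micrometer.observation.ObservationRegistry;

public class ObservationDemo {

    public static void main(String[] args) {
        SimpleMeterRegistry meterRegistry = new SimpleMeterRegistry();
        ObservationRegistry observationRegistry = ObservationRegistry.create();

        // The handler that turns observations into meters uses only the
        // low-cardinality key-values as tags.
        observationRegistry.observationConfig()
                .observationHandler(new DefaultMeterObservationHandler(meterRegistry));

        Observation.createNotStarted("checkout", observationRegistry)
                .lowCardinalityKeyValue("region", "eu")           // becomes a meter tag
                .highCardinalityKeyValue("requestId", "req-4711") // stays on the context only
                .observe(() -> { /* the work being observed */ });

        // The meters created for "checkout" carry region=eu (plus the handler's own
        // tags) but no requestId; requestId is still available to any other handler
        // registered on the same ObservationRegistry.
        meterRegistry.getMeters().forEach(meter -> System.out.println(meter.getId()));
    }
}
```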
Question 2
There is: you can write your own ObservationHandler, and where DefaultMeterObservationHandler attaches the low-cardinality tags, you can attach all of them. You need to face the consequences above, though: if you do this and your data is truly high cardinality, your JVM will run out of heap and your metrics backend will run out of memory/disk space.
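A rough sketch of such a handler (simplified compared to what DefaultMeterObservationHandler actually does; the class name is made up), registered via observationRegistry.observationConfig().observationHandler(...) in place of the default handler:

```java
import io.micrometer.common.KeyValue;
import io.micrometer.core.instrument.MeterRegistry;
import io.micrometer.core.instrument.Tags;
import io.micrometer.core.instrument.Timer;
import io.micrometer.observation.Observation;
import io.micrometer.observation.ObservationHandler;

// Sketch only: tags the timer with BOTH low- and high-cardinality key-values.
// Only safe if the "high-cardinality" values are in fact bounded.
public class AllKeyValuesMeterObservationHandler implements ObservationHandler<Observation.Context> {

    private final MeterRegistry meterRegistry;

    public AllKeyValuesMeterObservationHandler(MeterRegistry meterRegistry) {
        this.meterRegistry = meterRegistry;
    }

    @Override
    public void onStart(Observation.Context context) {
        // Start timing and stash the sample on the context for onStop.
        context.put(Timer.Sample.class, Timer.start(meterRegistry));
    }

    @Override
    public void onStop(Observation.Context context) {
        Tags tags = Tags.empty();
        for (KeyValue kv : context.getLowCardinalityKeyValues()) {
            tags = tags.and(kv.getKey(), kv.getValue());
        }
        for (KeyValue kv : context.getHighCardinalityKeyValues()) {
            // This is the part DefaultMeterObservationHandler deliberately does not do.
            tags = tags.and(kv.getKey(), kv.getValue());
        }
        Timer.Sample sample = context.getRequired(Timer.Sample.class);
        sample.stop(Timer.builder(context.getName()).tags(tags).register(meterRegistry));
    }

    @Override
    public boolean supportsContext(Observation.Context context) {
        return true;
    }
}
```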
Question/Statement 3

I was suggesting using a signal that can handle high-cardinality data; logging is just one of them, but not the only one.
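For example, if logging is the signal you choose, an ObservationHandler can copy the high-cardinality key-values into the SLF4J MDC while the observation's scope is open on a thread, so log statements made inside observe()/scoped() carry requestId and conversationId without any meter being tagged with them. A hand-rolled sketch (not something that ships with Micrometer):

```java
import io.micrometer.common.KeyValue;
import io.micrometer.observation.Observation;
import io.micrometer.observation.ObservationHandler;
import org.slf4j.MDC;

// Sketch: expose high-cardinality key-values to logging instead of metrics.
public class HighCardinalityMdcObservationHandler implements ObservationHandler<Observation.Context> {

    @Override
    public void onScopeOpened(Observation.Context context) {
        for (KeyValue kv : context.getHighCardinalityKeyValues()) {
            MDC.put(kv.getKey(), kv.getValue());
        }
    }

    @Override
    public void onScopeClosed(Observation.Context context) {
        for (KeyValue kv : context.getHighCardinalityKeyValues()) {
            MDC.remove(kv.getKey());
        }
    }

    @Override
    public boolean supportsContext(Observation.Context context) {
        return true;
    }
}
```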
Question 3.5
As I mentioned above, this is a property of metrics, not something unique to Micrometer. I also gave a hint at the end of the post about how to correlate metrics to other signals. I haven't written a blog post about this, but you can see it in action in one of my talks: https://www.youtube.com/watch?v=HQHuFnKvk_U#t=42m46s (I recommend watching the whole talk to better understand what is happening in the section I linked, and to see how to move between the other signals.)