
[Request] Add support for stream_consumer_max_offset_lag metric from RabbitMQ Prometheus exporter #19961


Open · pad-master82 opened this issue Mar 29, 2025 · 0 comments
Summary

A new Prometheus metric has been added to RabbitMQ in rabbitmq-server#12765: `stream_consumer_max_offset_lag`.

This metric is available starting with RabbitMQ 4.0.5, and it represents the maximum offset lag per stream consumer.

Why it matters

In stream-based workloads, tracking consumer lag is critical for:

Identifying slow or stalled consumers

Ensuring real-time processing requirements are met

Troubleshooting throughput bottlenecks
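
To make the "lag" in question concrete: for a stream consumer, offset lag is the distance between the last offset written to the stream and the offset the consumer has reached. The sketch below is purely illustrative (the function and threshold names are made up, not part of RabbitMQ or the integration) and shows how such a gauge would be used to flag slow or stalled consumers:

```python
def offset_lag(last_stream_offset: int, consumer_offset: int) -> int:
    """How many stream entries the consumer is behind the stream head."""
    return max(last_stream_offset - consumer_offset, 0)


def find_lagging(consumer_offsets: dict[str, int],
                 last_stream_offset: int,
                 threshold: int) -> list[str]:
    """Names of consumers whose lag meets or exceeds the threshold."""
    return [
        name
        for name, offset in consumer_offsets.items()
        if offset_lag(last_stream_offset, offset) >= threshold
    ]
```

For example, with the stream head at offset 1000, `find_lagging({"fast": 999, "stalled": 100}, 1000, 500)` flags only `"stalled"` (lag 900 vs. lag 1 for the other consumer).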

This metric is especially important for teams using RabbitMQ Streams, which is a newer messaging paradigm in RabbitMQ.

Proposal

If possible, it would be great to support this metric in the RabbitMQ OpenMetrics integration.

From looking at the code, it may be as simple as adding the following entry to the `_GAUGES` mapping in `metrics.py`:

```python
_GAUGES = {
    ...
    "rabbitmq_stream_consumer_max_offset_lag": "stream.consumer_max_offset_lag",
}
```
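
Before wiring the metric into the integration, it is worth confirming that the node actually exposes it; the `rabbitmq_prometheus` plugin serves the text exposition format at `/metrics` (port 15692 by default). A minimal sketch of checking for the gauge in a scraped payload; the parser is a simplification of the full exposition format, and the sample payload is fabricated for illustration:

```python
def extract_gauge(exposition: str, metric: str) -> dict[str, float]:
    """Map each labelled sample of `metric` in a Prometheus text
    exposition to its value. Sample lines look like
    `name{label="value",...} 42`; `#`-prefixed lines are comments."""
    samples = {}
    for line in exposition.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        name_and_labels, _, value = line.rpartition(" ")
        if name_and_labels.split("{", 1)[0] == metric:
            samples[name_and_labels] = float(value)
    return samples


# Fabricated sample payload in the exposition format:
SAMPLE = """\
# TYPE rabbitmq_stream_consumer_max_offset_lag gauge
rabbitmq_stream_consumer_max_offset_lag{queue="events",vhost="/"} 1234
rabbitmq_global_messages_received_total 42
"""
```

Against `SAMPLE`, `extract_gauge(SAMPLE, "rabbitmq_stream_consumer_max_offset_lag")` returns one labelled sample with value 1234.0; in a live check you would fetch `http://localhost:15692/metrics` and pass the response body in instead.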

References

Prometheus metric: stream_consumer_max_offset_lag

Added in: RabbitMQ 4.0.5 (PR #12765)

Related documentation: RabbitMQ Streams

@pad-master82 pad-master82 changed the title [Request] Maybe add support for stream_consumer_max_offset_lag metric from RabbitMQ Prometheus exporter [Request] Add support for stream_consumer_max_offset_lag metric from RabbitMQ Prometheus exporter Mar 29, 2025
@iliakur iliakur self-assigned this Apr 28, 2025