OTEL Metric Support #1511
Labels: enhancement (New feature or request)
Comments
kujjwal02 created an issue (Arize-ai/openinference#1511)

Hi Team,

This library is great for setting up auto-instrumentation for LLM libraries. From what I have seen, it looks like we are only using the span feature from OpenTelemetry. It would be great if we also supported OTEL Metrics (https://opentelemetry.io/docs/specs/otel/metrics/) to keep counters for total input tokens, output tokens, cached tokens, the number of LLM calls, etc.
@kujjwal02 - this is good feedback, and in general we agree that for these instrumentors to be robust for observability they need to generate metrics too. However, we are currently focused on tracing, as it is the biggest pain point our users face right now. If you have feedback on what types of metrics you'd like to see, please keep us informed! PRs are also welcome if you would like to add some :)