[Bug]: vllm profiling result contains invalid utf-8 code #19043
Labels: bug (Something isn't working)
Comments
What version of vLLM are you using? Have you tried the latest version?

Hi @chaunceyjiang, thanks for replying! I ran it on v0.8.4, so it is not the latest version. I ran into this problem when doing offline inference; that PR seems to fix it for online serving only? Let me try it on the latest build, though.
Your current environment
The output of `python collect_env.py`
🐛 Describe the bug
The profiling result for meta-llama/Llama-4-Maverick-17B-128E-Instruct-FP8 contains invalid UTF-8 code.
Code to run the profile:
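The original snippet was not captured when this page was archived. Below is a minimal sketch of what an offline profiling run looks like, assuming vLLM's torch-profiler integration enabled via the `VLLM_TORCH_PROFILER_DIR` environment variable; the output directory, prompt, sampling parameters, and `tensor_parallel_size` are placeholders, not taken from the report.

```python
import os

# Assumption: profiling is enabled through vLLM's torch-profiler integration.
# Traces are written under this directory between start_profile()/stop_profile().
os.environ["VLLM_TORCH_PROFILER_DIR"] = "/tmp/vllm_profile"

from vllm import LLM, SamplingParams

# Model name is the one from the report; tensor_parallel_size is a placeholder.
llm = LLM(
    model="meta-llama/Llama-4-Maverick-17B-128E-Instruct-FP8",
    tensor_parallel_size=8,
)

sampling_params = SamplingParams(temperature=0.0, max_tokens=64)

llm.start_profile()
outputs = llm.generate(["Hello, my name is"], sampling_params)
llm.stop_profile()
```

Presumably the trace file written under `VLLM_TORCH_PROFILER_DIR` by a run like this is where the invalid UTF-8 bytes show up.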
Invalid result:
Before submitting a new issue...