
Issues: vllm-project/vllm

[RFC]: Deprecating vLLM V0
#18571 opened May 22, 2025 by WoosukKwon
Open · 30 comments
[Roadmap] vLLM Roadmap Q2 2025
#15735 opened Mar 29, 2025 by simon-mo
Open · 15 comments

Issues list

[Bug]: Deepseek-R1 with DEP16 hangs after KV cache allocation (label: bug)
#19101 opened Jun 3, 2025 by ptarasiewiczNV
[Bug]: Unable to Run W4A16 GPTQ Quantized Models (label: bug)
#19098 opened Jun 3, 2025 by mchambrec
[Bug]: ModuleNotFoundError: No module named 'pandas' (label: bug)
#19078 opened Jun 3, 2025 by ZhangShuaiyi
[Bug]: CUDA error: unknown error when running vllm serve on WSL2 Ubuntu 22.04 (label: bug)
#19077 opened Jun 3, 2025 by ezioasche
[Feature]: Metal support (label: feature request)
#19073 opened Jun 3, 2025 by otarkhan
[Bug]: vllm.third_party.pynvml.NVMLError_InvalidArgument: Invalid Argument (label: bug)
#19071 opened Jun 3, 2025 by tengdecheng
[Bug]: Device selection broken in v0.9 (label: bug)
#19069 opened Jun 3, 2025 by derfred
[RFC]: Drop CUDA 11.8 Support (label: RFC)
#19061 opened Jun 3, 2025 by simon-mo
[Bug]: Hermes tool parser streaming output error with Qwen3 (label: bug)
#19056 opened Jun 3, 2025 by LiuLi1998
[Bug]: vLLM 0.9 image produces gibberish output (labels: bug, rocm)
#19052 opened Jun 3, 2025 by azjam78910
[Bug]: System Memory OOM after upgrading to v0.9.0.1 (label: bug)
#19048 opened Jun 3, 2025 by ly0koS
[Usage]: TorchDispatchMode does not work for vLLM (label: usage)
#19044 opened Jun 3, 2025 by helunwencser
[Bug]: vLLM profiling result contains invalid UTF-8 code (label: bug)
#19043 opened Jun 3, 2025 by helunwencser
[Usage]: How to load multiple models in one vLLM process? (label: usage)
#19042 opened Jun 3, 2025 by 541543262
[Bug]: 100% CPU usage when idle: while loop in acquire_read pegs the CPU (label: bug)
#19036 opened Jun 2, 2025 by MathieuBordere
[Bug]: Dual A6000 Pros not working, arch 120 (label: bug)
#19025 opened Jun 2, 2025 by vladrad