
[Bug] When serving deepseek-r1 on a vLLM backend, the <think> tag may be missing. How should MaxKB adapt to this? #2320

Open
nooneaaaa opened this issue Feb 18, 2025 · 5 comments

Comments

@nooneaaaa

Contact Information

365501366@qq.com

MaxKB Version

v1.10.0-lts (build at 2025-02-10T17:05, commit: 50f2c96)

Problem Description

These two screenshots show the answer as returned through MaxKB and the response from querying the vLLM backend directly. The opening tag is indeed missing in both, so this looks like a vLLM problem.

[Screenshot: answer as rendered through MaxKB]

[Screenshot: raw response from the vLLM backend]
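As context for the symptom described above: DeepSeek-R1 chat templates commonly append the opening `<think>` to the prompt itself, so the completion starts mid-thought and only a closing `</think>` appears in the output. A minimal client-side normalization sketch (an illustration of that behavior, not MaxKB's actual fix):

```python
def normalize_think_tags(text: str) -> str:
    """Prepend the opening <think> tag when only the closing tag is present."""
    if "</think>" in text and not text.lstrip().startswith("<think>"):
        return "<think>" + text
    return text

# The symptom from this report: the reply starts mid-thought.
assert normalize_think_tags("reasoning...</think>answer") == "<think>reasoning...</think>answer"
# Already well-formed output is left untouched.
assert normalize_think_tags("<think>reasoning</think>answer").count("<think>") == 1
```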

Steps to Reproduce

100% reproducible when using vLLM as the backend.
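For instance, a reproduction sketch against vLLM's OpenAI-compatible endpoint; the base URL and served model name below are placeholders, substitute your own deployment's values:

```python
from openai import OpenAI

# Assumed local vLLM deployment; vLLM exposes an OpenAI-compatible API.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

resp = client.chat.completions.create(
    model="deepseek-r1",  # hypothetical served model name
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
)
text = resp.choices[0].message.content or ""

# Symptom from this report: the closing tag arrives without the opening one.
print("starts with <think>:", text.startswith("<think>"))
print("contains </think>: ", "</think>" in text)
```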

The expected correct result

No response

Related log output

Additional Information

No response

@Shenguobin0102

Hi, a similar issue has already been fixed; please upgrade to v1.10.1-lts and try again.


@nooneaaaa
Author

> Hi, a similar issue has already been fixed; please upgrade to v1.10.1-lts and try again.

Is there something that needs to be configured? I still see the same problem after upgrading.

[Screenshot: response after upgrading, still missing the opening <think> tag]
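One server-side avenue worth checking (an assumption about the deployment, not a fix confirmed in this thread): vLLM releases from around this time ship a reasoning parser for DeepSeek-R1, enabled at launch with flags along the lines of `vllm serve <model> --enable-reasoning --reasoning-parser deepseek_r1` (check your version's docs), which moves the chain of thought into a separate `reasoning_content` field instead of inline tags:

```python
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")
resp = client.chat.completions.create(
    model="deepseek-r1",  # hypothetical served model name
    messages=[{"role": "user", "content": "1 + 1 = ?"}],
)
msg = resp.choices[0].message
# With the parser enabled, vLLM populates `reasoning_content`; fall back
# gracefully on servers without it.
print("reasoning:", getattr(msg, "reasoning_content", None))
print("answer:  ", msg.content)
```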

@shaohuzhang1
Copy link
Contributor

Bot detected the issue body's language is not English, translate it automatically. 👯👭🏻🧑‍🤝‍🧑👫🧑🏿‍🤝‍🧑🏻👩🏾‍🤝‍👨🏿👬🏿


Hello, similar issues have been fixed, please upgrade to v1.10.1-lts version to try

How to set it up? I still have the same problem after upgrading.
Image

@jijujiu

jijujiu commented Feb 19, 2025

Maybe you should open an issue for vLLM.
