
Commit 89e9a6e

ywang96 authored and Alvant committed
[Hotfix][VLM] Fixing max position embeddings for Pixtral (vllm-project#8399)
Signed-off-by: Alvant <alvasian@yandex.ru>
1 parent 77908bb commit 89e9a6e

File tree

1 file changed (+2 −0 lines changed)


vllm/transformers_utils/config.py

Lines changed: 2 additions & 0 deletions
@@ -206,6 +206,8 @@ def recurse_elems(elem: Any):
     config_dict["tie_word_embeddings"] = config_dict.get(
         "tie_embeddings", False)
     config_dict["max_seq_len"] = config_dict.get("max_seq_len", 128_000)
+    config_dict["max_position_embeddings"] = config_dict.get(
+        "max_position_embeddings", 128_000)
 
     if config_dict.get("moe") is not None:
         config_dict["architectures"] = ["MixtralForCausalLM"]
