Replies: 1 comment
-
For me, "wrong number of tensors; expected 292, got 291" means the model you loaded is not in the correct format: the tensor count changed for Llama 3.1. (If you test llama-cpp-python v0.2.83 against Llama 3.1, it gives the reverse error: expected 291, got 292.) You can try another GGUF source, or download the HF version of the model and convert it to GGUF yourself to compare the results.
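To see which side of the mismatch you are on, you can read the tensor count straight out of the file header before involving llama-cpp-python at all. This is a minimal sketch assuming the standard GGUF v2/v3 header layout (4-byte magic "GGUF", uint32 version, uint64 tensor count, uint64 metadata-KV count, all little-endian); the function name is my own:

```python
import struct

def gguf_tensor_count(path):
    """Read the declared tensor count from a GGUF file header.

    Assumes GGUF v2/v3 layout (little-endian):
    4-byte magic "GGUF", uint32 version,
    uint64 tensor_count, uint64 metadata_kv_count.
    """
    with open(path, "rb") as f:
        header = f.read(24)
    magic, version, n_tensors, n_kv = struct.unpack("<4sIQQ", header)
    if magic != b"GGUF":
        raise ValueError(f"not a GGUF file: {path!r}")
    return n_tensors
```

If this reports 291 for your file while your llama-cpp-python build expects 292 (or vice versa), the file and the library disagree about the model layout, and either re-converting the model or upgrading the library should resolve it.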
-
I am using llama-cpp-python. I am able to load the model Meta-Llama-3-8B-Instruct-Q6_K.gguf. I have downloaded the latest model, Meta-Llama-3.1-8B-Instruct-Q5_K_M.gguf, but when using it I get the error below:
llm_load_tensors: ggml ctx size = 0.27 MiB
llama_model_load: error loading model: done_getting_tensors: wrong number of tensors; expected 292, got 291
llama_load_model_from_file: failed to load model
2024-07-31 10:45:12.086 Uncaught app exception
Traceback (most recent call last):
File "D:\pdf_to_json\env\Lib\site-packages\streamlit\runtime\scriptrunner\exec_code.py", line 75, in exec_func_with_error_handling
result = func()
^^^^^^
File "D:\pdf_to_json\env\Lib\site-packages\streamlit\runtime\scriptrunner\script_runner.py", line 574, in code_to_exec
exec(code, module.__dict__)
File "D:\pdf_to_json\ui1.py", line 232, in
main()
File "D:\pdf_to_json\ui1.py", line 228, in main
result = MetaFieldDataExtractionITI(file_path, fields_list, file_extension)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\pdf_to_json\ui1.py", line 193, in MetaFieldDataExtractionITI
llm = Llama(model_path=r"D:\pdf_to_json\models\Meta-Llama-3.1-8B-Instruct-Q5_K_M.gguf", n_ctx=4096, n_gpu_layers=-1)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\pdf_to_json\env\Lib\site-packages\llama_cpp\llama.py", line 358, in init
self._model = self._stack.enter_context(contextlib.closing(_LlamaModel(
^^^^^^^^^^^^
File "D:\pdf_to_json\env\Lib\site-packages\llama_cpp_internals.py", line 54, in init
raise ValueError(f"Failed to load model from file: {path_model}")
ValueError: Failed to load model from file: D:\pdf_to_json\models\Meta-Llama-3.1-8B-Instruct-Q5_K_M.gguf