Description
Python -VV
from mistral_inference.transformer import Transformer
from mistral_inference.generate import generate
from mistral_common.tokens.tokenizers.mistral import MistralTokenizer
from mistral_common.protocol.instruct.messages import UserMessage
from mistral_common.protocol.instruct.request import ChatCompletionRequest
mistral_models_path = "path/to/mistral-nemo-instruct"  # hypothetical placeholder; point this at your local model folder
tokenizer = MistralTokenizer.from_file(f"{mistral_models_path}/tekken.json")
model = Transformer.from_folder(mistral_models_path)
prompt = "How expensive would it be to ask a window cleaner to clean all windows in Paris. Make a reasonable guess in US Dollar."
completion_request = ChatCompletionRequest(messages=[UserMessage(content=prompt)])
tokens = tokenizer.encode_chat_completion(completion_request).tokens
out_tokens, _ = generate([tokens], model, max_tokens=64, temperature=0.35, eos_id=tokenizer.instruct_tokenizer.tokenizer.eos_id)
result = tokenizer.decode(out_tokens[0])
print(result)
Pip Freeze
---------------------------------------------------------------------------
ModuleNotFoundError Traceback (most recent call last)
Input In [3], in <cell line: 1>()
----> 1 from mistral_inference.transformer import Transformer
2 from mistral_inference.generate import generate
4 from mistral_common.tokens.tokenizers.mistral import MistralTokenizer
ModuleNotFoundError: No module named 'mistral_inference.transformer'
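One way to narrow this down (a sketch added here, not from the original report): check which submodules the installed `mistral_inference` actually exposes. My assumption is that `mistral_inference.transformer` only exists in newer releases, while older releases shipped the `Transformer` class under `mistral_inference.model`, so a stale install would produce exactly this traceback.

```python
import importlib.util


def has_module(name: str) -> bool:
    """Check whether a module can be located without fully importing it."""
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        # Raised when a parent package (e.g. mistral_inference itself) is absent.
        return False


# Probe both the current and the presumed older module layout.
for mod in ("mistral_inference",
            "mistral_inference.transformer",
            "mistral_inference.model"):
    print(f"{mod}: {'found' if has_module(mod) else 'missing'}")
```

If `mistral_inference` is found but `mistral_inference.transformer` is missing, that points at an outdated install rather than a broken environment.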
Reproduction Steps
I used mistral_inference with Mistral-Nemo and hit this error.
Expected Behavior
The snippet should run as documented at https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407
Additional Context
No response
Suggested Solutions
No response
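An editor-added suggestion, under the assumption that the traceback comes from an older mistral_inference release that did not yet ship the `mistral_inference.transformer` module: upgrading the package may resolve the import error.

```shell
# Assumption: a pre-1.x / stale install lacks mistral_inference.transformer.
pip install --upgrade mistral_inference
pip show mistral_inference  # confirm the installed version afterwards
```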