-
Hello @samt07! I'm here to help you solve bugs, answer questions, and become a contributor while you wait for a human maintainer. Let's tackle that issue together! To resolve the …

… By following these steps, you should be able to resolve the error and ensure that …
-
As you can see in the code I shared, I'm not storing string values in chat_history explicitly. I can't figure out where it is being saved as a string when I call invoke.
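For reference, a minimal standalone check (same memory settings as in my code) suggests the memory itself stores message objects, not strings:

```python
from langchain.memory import ConversationBufferMemory

# With return_messages=True the buffer holds HumanMessage/AIMessage
# objects rather than one flattened string.
memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
memory.save_context({"question": "hi"}, {"answer": "hello"})

print(memory.chat_memory.messages)       # [HumanMessage(...), AIMessage(...)]
print(memory.load_memory_variables({}))  # {'chat_history': [HumanMessage(...), ...]}
```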
-
```python
from langchain.callbacks import StdOutCallbackHandler
from langchain.chains import ConversationalRetrievalChain
from langchain.memory import ConversationBufferMemory
from langchain_openai import ChatOpenAI

# MODEL, vectorstore, and custom_prompt are defined earlier in my code
llm = ChatOpenAI(temperature=0.7, model_name=MODEL)

# return_messages=True so the memory returns message objects, not a string
memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
retriever = vectorstore.as_retriever()

conversation_chain = ConversationalRetrievalChain.from_llm(
    llm=llm,
    retriever=retriever,
    memory=memory,
    callbacks=[StdOutCallbackHandler()],
    combine_docs_chain_kwargs={"prompt": custom_prompt},
)

query = "do you offer rotation services?"
print(memory.load_memory_variables({}))
result = conversation_chain.invoke({"question": query})
answer = result["answer"]
print("\nAnswer:", answer)
```
I am getting a ValueError from this during invoke. My custom_prompt is defined as below:
```python
from langchain.prompts import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    MessagesPlaceholder,
    SystemMessagePromptTemplate,
)

custom_prompt = ChatPromptTemplate.from_messages([
    SystemMessagePromptTemplate.from_template(
        "You are a helpful assistant for an auto body shop. Use the context "
        "and chat history to help answer user questions about services, "
        "durations, and pricing. If you don't know the answer, just politely "
        "say that you don't know, don't try to make up an answer."
    ),
    MessagesPlaceholder(variable_name="chat_history"),
    HumanMessagePromptTemplate.from_template(
        "Context:\n{context}\n\nQuestion: {question}"
    ),
])
```
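If the ValueError says something like "variable chat_history should be a list of base messages", my guess is that ConversationalRetrievalChain's default get_chat_history flattens the stored messages into a single string before the combine_docs prompt sees them, so MessagesPlaceholder rejects the value. A sketch of a possible workaround, assuming that's the cause, is to pass an identity function for get_chat_history so the messages stay a list:

```python
# Possible workaround (assuming MessagesPlaceholder is receiving a string):
# keep chat history as a list of messages instead of letting the chain
# flatten it with its default get_chat_history.
conversation_chain = ConversationalRetrievalChain.from_llm(
    llm=llm,
    retriever=retriever,
    memory=memory,
    callbacks=[StdOutCallbackHandler()],
    combine_docs_chain_kwargs={"prompt": custom_prompt},
    get_chat_history=lambda messages: messages,  # pass messages through unchanged
)
```

The trade-off is that the condense-question prompt then also receives the raw message list (which it just str()-formats); the alternative would be to drop MessagesPlaceholder from custom_prompt and use a plain {chat_history} string variable instead.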