
Commit 5736a25

gabrielmbmb and plaguss authored
Apply suggestions from code review
Co-authored-by: Agus <agustin@argilla.io>
1 parent e52ae3f commit 5736a25

File tree

1 file changed: +2 −2 lines changed
  • src/distilabel/steps/tasks/magpie

Diff for: src/distilabel/steps/tasks/magpie/base.py

@@ -31,7 +31,7 @@
 MAGPIE_MULTI_TURN_SYSTEM_PROMPT = (
     "You are a helpful AI assistant. The user will engage in a multi-round conversation"
-    " with you,asking initial questions and following up with additional related questions."
+    " with you, asking initial questions and following up with additional related questions."
     " Your goal is to provide thorough, relevant and insightful responses to help the user"
     " with their queries."
 )
@@ -177,7 +177,7 @@ class Magpie(Task, MagpieBase):
     fine-tuned LLMs. As they were fine-tuned using a chat template composed by a user message
     and a desired assistant output, the instruct fine-tuned LLM learns that after the pre-query
     or pre-instruct tokens comes an instruction. If these pre-query tokens are sent to the
-    LLM without any user message, then the LLM will continue generating tokens as it was
+    LLM without any user message, then the LLM will continue generating tokens as if it was
     the user. This trick allows "extracting" instructions from the instruct fine-tuned LLM.
     After this instruct is generated, it can be sent again to the LLM to generate this time
     an assistant response. This process can be repeated N times allowing to build a multi-turn
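The pre-query trick this docstring describes can be sketched as follows. This is a minimal illustration, not distilabel's actual implementation: the Llama-3-style template tokens and the `build_magpie_prompt` helper are assumptions made for the example.

```python
# Illustrative sketch of the Magpie pre-query trick (not distilabel's API).
# The chat-template tokens below follow the Llama-3 instruct format.

PRE_QUERY_SUFFIX = "<|start_header_id|>user<|end_header_id|>\n\n"

def build_magpie_prompt(system_prompt: str = "") -> str:
    """Build a prompt that ends right after the pre-query tokens.

    Because no user message follows, an instruct-tuned LLM completes the
    prompt as if it were the user, i.e. it generates an instruction that
    can then be "extracted" and answered in a second call.
    """
    prompt = "<|begin_of_text|>"
    if system_prompt:
        prompt += (
            "<|start_header_id|>system<|end_header_id|>\n\n"
            f"{system_prompt}<|eot_id|>"
        )
    # Stop here: the next tokens the LLM emits play the role of the user turn.
    return prompt + PRE_QUERY_SUFFIX

prompt = build_magpie_prompt("You are a helpful AI assistant.")
```

Feeding `prompt` to an instruct-tuned LLM yields a synthetic user instruction; appending that instruction plus the assistant pre-query tokens and calling the LLM again yields the assistant response, and repeating the cycle builds a multi-turn conversation.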

Comments (0)