R1 not putting out the full model response with transformers pipeline
#21 by rcgalbo
# Use a pipeline as a high-level helper
from transformers import pipeline

messages = [
    {"role": "user", "content": "Reason about the number 0?"},
]
pipe = pipeline("text-generation", model="deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B")
pipe(messages)
output:
----------
[{'generated_text': [{'role': 'user', 'content': 'Reason about the number 0?'},
{'role': 'assistant',
'content': 'To determine the value of 0, I first consider the definition of zero as the absence of any'}]}]
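This looks like generation stopping at the pipeline's default length limit rather than the model failing: no max_new_tokens is passed, so generate() falls back to a small default budget. A minimal sketch (assuming that is the cause, and picking 1024 as an arbitrary budget) that passes the limit explicitly:

# Sketch: same pipeline call, but with an explicit generation budget.
from transformers import pipeline

messages = [
    {"role": "user", "content": "Reason about the number 0?"},
]
pipe = pipeline("text-generation", model="deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B")
# max_new_tokens is forwarded to model.generate(); 1024 is an arbitrary budget
# chosen to leave room for the reasoning trace plus the final answer.
result = pipe(messages, max_new_tokens=1024)
print(result[0]["generated_text"][-1]["content"])

With an explicit budget, the assistant content should contain the full reasoning followed by the final answer instead of being cut off mid-sentence.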
+1
I also see the assistant content is missing the opening <think> tag.
My script reproduces the OP's issue: https://github.com/guynich/deepseek_transformers
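On the missing <think> tag: one possible (unverified) explanation is that the distilled R1 chat template already appends the opening <think> to the prompt, so the model never emits it and it never appears in the returned assistant content. A quick sketch to check by inspecting the rendered prompt:

# Sketch: check whether the opening <think> tag is injected by the chat
# template (and therefore belongs to the prompt, not the generated text).
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B")
prompt = tok.apply_chat_template(
    [{"role": "user", "content": "Reason about the number 0?"}],
    tokenize=False,
    add_generation_prompt=True,
)
print(repr(prompt))  # if this ends with "<think>", the tag is part of the prompt

If the rendered prompt does end with <think>, the output can be parsed by treating the tag as already opened (or by prepending it to the returned content).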