Duration: 27s
Input Tokens: 78968
Output Tokens: 262
Cost: $0.00
Context
Input
In the blog post about the new OpenAI-compatible endpoint with Opper, there is an example where we use deepseek-r1-distill-llama-70b. What is the content (question) passed to the model?
Expected output
What is the capital of France? Please reverse the name before answering.
Model output
The content (question) passed to the model deepseek-r1-distill-llama-70b in the example from the blog post is:
"What is the capital of France? Please reverse the name before answering."
This prompt is provided as the user message in an OpenAI-compatible API call using the Opper endpoint and the DeepSeek model. The example code shows that this question is sent to the model, and that the model is expected both to answer the question and to reverse the name in its answer.
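As a minimal sketch of what such a request looks like, the snippet below builds the chat-completions payload an OpenAI-compatible client would send. The payload shape follows the standard OpenAI chat API; the endpoint URL is omitted and the exact model identifier is taken from this log, so treat both as assumptions rather than the blog post's verbatim code.

```python
import json

# Question from the blog post example
question = "What is the capital of France? Please reverse the name before answering."

# Payload shape for an OpenAI-compatible /chat/completions request.
# The model identifier is taken from this eval record; the real call would
# POST this body to Opper's OpenAI-compatible endpoint with an API key.
payload = {
    "model": "deepseek-r1-distill-llama-70b",
    "messages": [{"role": "user", "content": question}],
}

print(json.dumps(payload, indent=2))
```

With the `openai` Python client, the same payload would typically be sent via `client.chat.completions.create(...)` after pointing the client's `base_url` at the compatible endpoint.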