Duration: 18s
Input Tokens: 89721
Output Tokens: 127
Cost: $0.00
Context
Input
In the blog post about the new OpenAI-compatible endpoint with Opper, there is an example that uses deepseek-r1-distill-llama-70b. What is the content (question) passed to the model?
Expected output
What is the capital of France? Please reverse the name before answering.
Model output
In the example, the question/content passed to the model was: "What is the capital of France? Please reverse the name before answering."
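For context, a minimal sketch of how such an example is typically wired up: the question above is sent as the user message in a standard chat completions call against an OpenAI-compatible endpoint. This is not taken from the blog post itself; the base URL and API-key environment variable below are assumptions for illustration only.

```python
# Hypothetical sketch: sending the example question to
# deepseek-r1-distill-llama-70b via an OpenAI-compatible endpoint.
# The base_url and OPPER_API_KEY variable are assumptions, not confirmed
# by the record above.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.opper.ai/compat/openai",  # assumed endpoint URL
    api_key=os.environ["OPPER_API_KEY"],            # assumed env var name
)

response = client.chat.completions.create(
    model="deepseek-r1-distill-llama-70b",
    messages=[
        {
            "role": "user",
            "content": (
                "What is the capital of France? "
                "Please reverse the name before answering."
            ),
        }
    ],
)

print(response.choices[0].message.content)
```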