Duration: 12s
Input Tokens: 89813
Output Tokens: 256
Cost: $0.00
Context
Input
In the blog post about the New OpenAI-compatible endpoint with Opper, there is an example where we use deepseek-r1-distill-llama-70b. What is the content (question) passed to the model?
Expected output
What is the capital of France? Please reverse the name before answering.
Model output
In the blog post about the New OpenAI-compatible endpoint with Opper, the example using deepseek-r1-distill-llama-70b passes the following question to the model:
"What is the capital of France? Please reverse the name before answering."
This question appears in both the TypeScript and Python examples shown in the blog post, where they demonstrate how to use the Opper API as a drop-in replacement for OpenAI's API.
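To make the record concrete, here is a minimal sketch of the kind of request such an example would send. The question string and model name come from this record; the Opper base URL and route are assumptions following the OpenAI chat-completions convention, not details confirmed here. Only Python's standard library is used so the payload itself stays visible.

```python
import json
import urllib.request

# Hypothetical base URL; check Opper's documentation for the real endpoint.
OPPER_BASE_URL = "https://api.opper.ai/compat/openai"


def build_payload() -> dict:
    """Chat-completions payload carrying the example question from the post."""
    return {
        "model": "deepseek-r1-distill-llama-70b",
        "messages": [
            {
                "role": "user",
                "content": (
                    "What is the capital of France? "
                    "Please reverse the name before answering."
                ),
            }
        ],
    }


def send(api_key: str) -> dict:
    """POST the payload to an OpenAI-compatible /chat/completions route."""
    req = urllib.request.Request(
        f"{OPPER_BASE_URL}/chat/completions",
        data=json.dumps(build_payload()).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

In practice the blog post's examples would use the OpenAI SDKs (Python or TypeScript) with a custom `base_url`, which is what makes Opper a drop-in replacement; the raw-HTTP form above just shows the same request shape explicitly.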