Duration: 14s
Input Tokens: 89722
Output Tokens: 307
Cost: $0.00
Context

Input:
In the blogpost about the New OpenAI compatible endpoint with Opper. There is an example where we use deepseek-r1-distill-llama-70b. What is the content (question) passed to the model?
Expected output:
What is the capital of France? Please reverse the name before answering.
Model output:
In the blogpost about the New OpenAI compatible endpoint with Opper, the example using deepseek-r1-distill-llama-70b passes the following content (question) to the model:
"What is the capital of France? Please reverse the name before answering."
This is shown in both the Python and Node.js examples where they use the model "groq/deepseek-r1-distill-llama-70b" through Opper's OpenAI-compatible endpoint.