Keep Your LLM Apps Running: How Opper Handles Fallbacks and Aliases
By Mattias Lundell
Reliable LLM Calls with Opper
With Opper, you get access to hundreds of LLMs from all major providers, all through a single API. Out of the box, Opper gives your applications stability and flexibility that other providers don’t.
Two features make your LLM calls more dependable while keeping your code simple:
- Automatic fallbacks → when one model fails, Opper tries the next.
- Organization-scoped aliases → a friendly name that maps to an ordered list of models.
The result? Higher success rates, fewer fire-drills, and painless migrations when models or vendors change.
How Fallbacks Work
The model field can be:
- a single model (simple)
- a list of models (with built-in fallback order)
- an alias (centralized, team-friendly, and flexible)
For structured outputs, Opper also retries JSON/XML parsing several times per model before moving on. That means higher success rates—without extra client logic.
Example Flow
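Conceptually, the fallback loop looks like the sketch below. This is a simplified illustration, not Opper's actual implementation; the retry count is an assumption, and invoke_model stands in for whatever provider call Opper performs server-side.

# Simplified sketch of the fallback flow (illustration only, not Opper's implementation).
# invoke_model stands in for the provider call; the parse step uses JSON for simplicity.
import json
from typing import Any, Callable

PARSE_RETRIES = 3  # assumed number of structured-output parse retries per model


class AllModelsFailed(Exception):
    """Raised when every model in the fallback chain has failed."""


def call_with_fallback(
    models: list[str],
    payload: dict[str, Any],
    invoke_model: Callable[[str, dict[str, Any]], str],
) -> dict[str, Any]:
    errors: list[tuple[str, int, Exception]] = []
    for model in models:
        for attempt in range(PARSE_RETRIES):
            try:
                raw = invoke_model(model, payload)    # provider call; may fail on outage or rate limit
            except Exception as exc:
                errors.append((model, attempt, exc))
                break                                 # provider error: move on to the next model
            try:
                return json.loads(raw)                # structured-output parse
            except json.JSONDecodeError as exc:
                errors.append((model, attempt, exc))  # parse error: retry the same model
    raise AllModelsFailed(errors)

Both kinds of retry, parse retries within a model and fallback across models, happen on Opper's side, so your client sends a single request.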
Three Ways to Call
- Single model (no fallback)
{
  "name": "summarize",
  "instructions": "Summarize the following text",
  "model": "openai/gpt-4o",
  "input": {"text": "..."}
}
- List of models (built-in fallback)
{
  "name": "summarize",
  "instructions": "Summarize the following text",
  "model": [
    {"name": "openai/gpt-4o", "options": {"temperature": 0.3}},
    {"name": "anthropic/claude-3-5-sonnet", "options": {"temperature": 0.3}}
  ],
  "input": {"text": "..."}
}
- Alias (team-friendly and centrally managed)
Create an alias like sonnet4 that expands to:
aws/claude-sonnet-4-eu → anthropic/claude-sonnet-4 → gcp/claude-sonnet-4
curl -X POST https://api.opper.ai/v2/models/aliases \
  -H "Authorization: Bearer YOUR_OPPER_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "sonnet4",
    "fallback_models": [
      "aws/claude-sonnet-4-eu",
      "anthropic/claude-sonnet-4",
      "gcp/claude-sonnet-4"
    ],
    "description": "Preferred stack: sonnet eu AWS → anthropic → gcp"
  }'
Then call it directly:
{
  "name": "summarize",
  "instructions": "Summarize the following text",
  "model": "sonnet4",
  "input": {"text": "..."}
}
Options and headers carry through automatically—so you configure once and reuse across all models in the alias.
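To send any of these payloads, you make a single HTTP request and let Opper handle the rest. Here is a minimal sketch using Python's requests library; the /v2/call path is an assumption for this example, so check the Opper API reference for the exact call endpoint.

# Minimal sketch: send a call payload to Opper over HTTP.
# The /v2/call path is assumed for illustration; consult the Opper API reference.
import os
import requests

OPPER_API_KEY = os.environ["OPPER_API_KEY"]
CALL_URL = "https://api.opper.ai/v2/call"  # assumed endpoint path

payload = {
    "name": "summarize",
    "instructions": "Summarize the following text",
    "model": "sonnet4",  # the alias defined above; fallbacks run server-side
    "input": {"text": "..."},
}

response = requests.post(
    CALL_URL,
    headers={"Authorization": f"Bearer {OPPER_API_KEY}"},
    json=payload,
    timeout=60,
)
response.raise_for_status()
print(response.json())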
Why It Helps
- More successful calls → fewer failed requests for your users.
- Easier operations → no more late-night pages when a provider rate-limits or goes down.
- Safer upgrades → swap models behind an alias without touching app code.
- Cleaner code → declare preferences once; let Opper handle reliability.
Custom + Vendor Models
Aliases can mix your custom models with vendor models. If a custom model is unavailable, Opper just moves to the next model in sequence—no downtime.
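For example, an alias can put a custom model first and fall back to vendor models. The sketch below uses the same aliases endpoint shown earlier; the alias name and the custom model name are hypothetical.

# Sketch: register an alias that mixes a (hypothetical) custom model with vendor models.
import os
import requests

OPPER_API_KEY = os.environ["OPPER_API_KEY"]
ALIASES_URL = "https://api.opper.ai/v2/models/aliases"

alias = {
    "name": "summarize-prod",           # hypothetical alias name
    "fallback_models": [
        "my-org/custom-summarizer",     # hypothetical custom model
        "openai/gpt-4o",                # vendor fallback
        "anthropic/claude-3-5-sonnet",  # vendor fallback
    ],
    "description": "Custom model first, vendor models as fallback",
}

response = requests.post(
    ALIASES_URL,
    headers={"Authorization": f"Bearer {OPPER_API_KEY}"},
    json=alias,
    timeout=30,
)
response.raise_for_status()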
Built-in Observability
As with any other operation using the Opper API, every attempt is fully traced:
- Which model ran
- Structured parse retries (JSON/XML)
- Errors, if any
Use these insights to fine-tune alias order, swap models, or confirm reliability improvements.
Keep Shipping, We’ll Handle the Rest
Point your app at a single alias and stop worrying about outages, migrations, or rewrites. Opper takes care of the fallbacks—so your LLM features stay reliable, day and night.