Gemma 4 26B MoE

by Google

Google's efficient Mixture-of-Experts model with 26B total parameters, of which only 3.8B are active per token. Apache 2.0 licensed.
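The total-versus-active distinction is the defining property of a Mixture-of-Experts model: a gate scores every expert for each token and only the top-k experts run. The sketch below illustrates this; the expert count, per-expert size, and shared-parameter size are hypothetical numbers chosen only to roughly reproduce the 26B-total / 3.8B-active headline figures, not Gemma's actual architecture.

```python
# Illustrative MoE routing sketch. All sizes below are hypothetical,
# picked to roughly match the 26B-total / 3.8B-active headline.

def select_experts(gate_scores, k=2):
    """Return the indices of the k highest-scoring experts for one token."""
    ranked = sorted(range(len(gate_scores)),
                    key=lambda i: gate_scores[i], reverse=True)
    return ranked[:k]

NUM_EXPERTS = 16             # hypothetical
PARAMS_PER_EXPERT = 1.586e9  # hypothetical
SHARED_PARAMS = 0.63e9       # attention, embeddings, etc., run for every token

total_params = SHARED_PARAMS + NUM_EXPERTS * PARAMS_PER_EXPERT
active_params = SHARED_PARAMS + 2 * PARAMS_PER_EXPERT  # k=2 experts per token

print(f"total:  {total_params / 1e9:.1f}B parameters")
print(f"active: {active_params / 1e9:.1f}B parameters per token")
```

Because only the selected experts execute, inference cost scales with the active parameter count rather than the total, which is why a 26B MoE can serve at roughly the cost of a ~4B dense model.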

Capabilities: Reasoning, Vision, Tools, Structured output, Audio, Video
Context window: 256K
Max output: 8K
Input price: /1M
Output price: /1M

Available routes

Gemma 4 26B MoE runs on one route through the Opper gateway. Each route has its own data-handling posture.

Provider: Gemini
Region: US
Zero data retention: None
Training: Conditional
Logging: Abuse monitoring (window not specified in public docs)
Contract: PAYG
Input price: /1M
Output price: /1M

Human review: Abuse-flagged only
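Gateways like this typically expose an OpenAI-compatible chat endpoint. The sketch below builds such a request payload under that assumption; the model identifier, field names, and the idea of clamping to the 8K output cap are illustrative assumptions, not documented Opper values.

```python
# Hypothetical request-building sketch for an OpenAI-compatible chat
# endpoint behind a gateway. The model identifier and field names are
# assumptions for illustration, not documented Opper values.
import json

MAX_OUTPUT_TOKENS = 8192  # the 8K output cap from the spec table above

def build_chat_request(prompt, model="google/gemma-4-26b-moe",
                       max_tokens=MAX_OUTPUT_TOKENS):
    """Build a chat-completion payload, capped at the model's max output."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": min(max_tokens, MAX_OUTPUT_TOKENS),
    }

payload = build_chat_request("Summarize this route's data-handling posture.")
print(json.dumps(payload, indent=2))
```

Clamping `max_tokens` client-side avoids a rejected request when a caller asks for more than the model's 8K output limit.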

