Multimodal capabilities
Model Architectures & Benchmarking
- mistral-3-large
- Architecture: mixture-of-experts (MoE) with 675B parameters (routing pattern sketched below).
- Licensing: open-source (Apache 2.0).
- Competitive Landscape: Benchmarked against DeepSeek V3 and kimi-k2.
- Classification: State-of-the-art non-reasoning model.
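A minimal sketch of the top-k MoE routing pattern the architecture note refers to. The expert count, layer dimensions, and top-k value here are illustrative assumptions, not published mistral-3-large details.

```python
# Minimal top-k mixture-of-experts (MoE) routing sketch.
# Dimensions, expert count, and TOP_K are toy/assumed values for illustration.
import numpy as np

rng = np.random.default_rng(0)

D_MODEL, D_FF = 64, 256      # assumed toy model/feed-forward dimensions
N_EXPERTS, TOP_K = 8, 2      # assumed routing configuration

# Each expert is an independent feed-forward block (two weight matrices).
experts = [
    (rng.standard_normal((D_MODEL, D_FF)) * 0.02,
     rng.standard_normal((D_FF, D_MODEL)) * 0.02)
    for _ in range(N_EXPERTS)
]
router_w = rng.standard_normal((D_MODEL, N_EXPERTS)) * 0.02  # gating weights


def moe_forward(x: np.ndarray) -> np.ndarray:
    """Route each token to its top-k experts and mix their outputs.

    x: (n_tokens, D_MODEL) activations; returns the same shape.
    """
    logits = x @ router_w                                # (n_tokens, N_EXPERTS)
    top_idx = np.argsort(logits, axis=-1)[:, -TOP_K:]    # chosen experts per token
    # Softmax over only the selected experts' logits.
    top_logits = np.take_along_axis(logits, top_idx, axis=-1)
    gates = np.exp(top_logits - top_logits.max(axis=-1, keepdims=True))
    gates /= gates.sum(axis=-1, keepdims=True)

    out = np.zeros_like(x)
    for tok in range(x.shape[0]):
        for k in range(TOP_K):
            w_in, w_out = experts[top_idx[tok, k]]
            hidden = np.maximum(x[tok] @ w_in, 0.0)      # ReLU feed-forward
            out[tok] += gates[tok, k] * (hidden @ w_out)
    return out


tokens = rng.standard_normal((4, D_MODEL))
print(moe_forward(tokens).shape)  # (4, 64): only TOP_K of N_EXPERTS run per token
```

The point of the pattern: only TOP_K of N_EXPERTS feed-forward blocks execute for each token, which is how an MoE model's total parameter count can be far larger than its per-token compute.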
Source Notes
- 2026-04-14: “But OpenClaw is expensive…”