NemoClaw Knowledge Wiki


Apr 22, 2026 · 1 min read

  • ai
  • mistral
  • llm
  • moe
  • mistral-3
  • ai-models
  • mixture-of-experts
  • apache-2-0

Mistral 3

Backlink: 2026 04 14 Mistral 3 ai models

Overview

This page covers the Mistral 3 family of AI models from Mistral AI.

mistral-3-large

  • Architecture: 675-billion parameter mixture-of-experts (MoE).
  • License: Apache 2.0.
  • Capabilities: Validated for coding, creative writing, and logic-based challenges.
  • Access: Deployable via OpenRouter.
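
The mixture-of-experts (MoE) architecture noted above activates only a small number of expert subnetworks per token, so compute cost scales with the number of selected experts rather than total parameters. A minimal sketch of top-k expert routing (illustrative only; the function name, shapes, and gating details are assumptions, not Mistral's actual implementation):

```python
import numpy as np

def moe_forward(x, gate_w, experts, k=2):
    """Route one token through the top-k experts of an MoE layer.

    x: (d,) token activation
    gate_w: (d, n_experts) router weight matrix
    experts: list of callables, each mapping (d,) -> (d,)
    """
    logits = x @ gate_w                       # one router score per expert
    top = np.argsort(logits)[-k:]             # indices of the k best-scoring experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                  # softmax over only the selected experts
    # Only the k chosen experts execute; the rest of the network stays idle.
    return sum(w * experts[i](x) for w, i in zip(weights, top))

# Toy usage: three "experts" that just scale the input by different factors.
x = np.ones(4)
gate_w = np.eye(4, 3)
experts = [lambda v, c=c: v * c for c in (1.0, 2.0, 3.0)]
out = moe_forward(x, gate_w, experts, k=2)
```

In a real MoE model each expert is a full feed-forward block and routing happens per token per layer, but the gate-select-combine pattern is the same.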

Sources

  • https://www.youtube.com/watch?v=WZzQNNdZ7vk
  • https://www.youtube.com/watch?v=IoTy1EDg330



Created with Quartz v4.5.2 © 2026
