Nemotron-3 Family

NVIDIA’s open-source family of AI models, available in three sizes and built on a Mixture-of-Experts (MoE) architecture:

  • Nano: 30B total parameters (3B active)
  • Super: 100B total parameters (10B active)
  • Ultra: 500B total parameters (50B active)
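The total-versus-active split above is the defining property of MoE models: every token is routed to only a small subset of experts, so the compute per token tracks the active count, not the total. A minimal sketch of this idea (toy layer, illustrative parameter sizes, all names hypothetical and not from any NVIDIA implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

class ToyMoE:
    """Toy Mixture-of-Experts layer: many experts, few active per token."""

    def __init__(self, d_model=16, d_ff=32, n_experts=10, top_k=1):
        self.router = rng.standard_normal((d_model, n_experts))
        # Each expert is a small two-layer MLP.
        self.w_in = rng.standard_normal((n_experts, d_model, d_ff))
        self.w_out = rng.standard_normal((n_experts, d_ff, d_model))
        self.top_k = top_k

    def total_params(self):
        # All experts, whether or not they are used for a given token.
        return self.router.size + self.w_in.size + self.w_out.size

    def active_params(self):
        # Per token, only the router plus top_k experts do any work.
        per_expert = self.w_in[0].size + self.w_out[0].size
        return self.router.size + self.top_k * per_expert

    def forward(self, x):
        logits = x @ self.router                   # score each expert
        chosen = np.argsort(logits)[-self.top_k:]  # keep the top-k experts
        weights = np.exp(logits[chosen])
        weights /= weights.sum()                   # softmax over chosen experts
        out = np.zeros_like(x)
        for w, e in zip(weights, chosen):
            h = np.maximum(x @ self.w_in[e], 0.0)  # ReLU
            out += w * (h @ self.w_out[e])
        return out

moe = ToyMoE()
y = moe.forward(rng.standard_normal(16))
print(moe.total_params(), moe.active_params())
```

With these toy dimensions the layer holds 10,400 parameters in total but touches only 1,184 per token, roughly the 10:1 ratio the Nemotron-3 sizes above exhibit (e.g. 30B total, 3B active).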

For a detailed review and performance test, see the Gary Explains channel's video on Nemotron 3.

2026-04-14: Gary Explains channel, Nemotron 3.