total parameters
The total count of trainable weights in a machine learning model’s architecture, representing its theoretical capacity to store information. This metric differs from activated parameters (the subset actually used per token during inference, relevant for mixture-of-experts models).
Examples
- Kimi K2 (Moonshot AI): 1 trillion total parameters (32 billion activated parameters)
- DeepSeek V4 (DeepSeek): open-source suite of large language models
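The gap between total and activated parameters comes from mixture-of-experts routing: every expert's weights count toward the total, but only the top-k experts selected per token count as activated. A minimal sketch with hypothetical numbers (not the actual configuration of any listed model):

```python
# Hypothetical MoE parameter accounting: all expert weights count toward
# total parameters; only the top-k routed experts count as activated.

def moe_param_counts(n_experts: int, params_per_expert: int,
                     top_k: int, shared_params: int) -> tuple[int, int]:
    """Return (total, activated) parameter counts for a simple MoE model.

    shared_params covers weights used on every token (attention layers,
    embeddings, etc.); expert weights are counted per routing decision.
    """
    total = shared_params + n_experts * params_per_expert
    activated = shared_params + top_k * params_per_expert
    return total, activated

# Illustrative (made-up) configuration: 64 experts of 15B parameters each,
# 2 experts routed per token, 40B shared parameters.
total, activated = moe_param_counts(
    n_experts=64,
    params_per_expert=15_000_000_000,
    top_k=2,
    shared_params=40_000_000_000,
)
print(f"total: {total:,}  activated: {activated:,}")
# → total: 1,000,000,000,000  activated: 70,000,000,000
```

With these toy numbers, a model can advertise a 1-trillion-parameter capacity while each token only exercises 70 billion parameters' worth of compute, which is why the two metrics are reported separately.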
Backlink: 2026 04 14 Kimi K2 Prompt Engineering; 2026 04 24 DeepSeek V4 Next Gen Open Source LLM Performance and Efficiency Analysis
Source Notes
- 2026-04-14: [[lab-notes/2026-04-14-Optimizing-AI-Costs-and-Privacy-with-Local-Open-Source-Models-and-Hybr|“But OpenClaw is expensive…”]]
- 2026-04-24: [[lab-notes/2026-04-24-DeepSeek-V4-Next-Gen-Open-Source-LLM-Performance-and-Efficiency-Analysis|DeepSeek V4: Next-Gen Open-Source LLM Performance and Efficiency Analysis]]