NemoClaw Knowledge Wiki


Apr 30, 2026 · 1 min read

  • concept
  • 4-bit-quantization
  • fp4
  • large-language-models
  • llm-training

4-Bit Floating Point (FP4) Training

Source Notes

  • 2026-04-14: "How does 4bit quantisation work", Julia Turc, https://www.youtube.com/watch?v=-cRedoYETzQ. The video discusses the evolution and challenges of training large language models (LLMs) …
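The core idea behind FP4 quantization can be sketched in a few lines. This is a hypothetical illustration, not taken from the video: it assumes the common E2M1 FP4 format (sign bit, two exponent bits, one mantissa bit), whose representable magnitudes are {0, 0.5, 1, 1.5, 2, 3, 4, 6}, combined with per-block absolute-max scaling as used in block-scaled formats.

```python
# Positive magnitudes representable in FP4 E2M1 (plus sign and zero).
FP4_GRID = [0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0]

def quantize_fp4(values, block_size=16):
    """Simulate FP4 quantization with per-block absmax scaling.

    Each block of `block_size` values shares one scale factor chosen so
    that the block's largest magnitude maps to 6.0, the largest FP4 value.
    Returns the dequantized values (quantize, then scale back).
    """
    out = []
    for start in range(0, len(values), block_size):
        block = values[start:start + block_size]
        amax = max(abs(v) for v in block) or 1.0  # avoid division by zero
        scale = amax / 6.0
        for v in block:
            target = v / scale
            # Round to the nearest representable magnitude, keep the sign.
            mag = min(FP4_GRID, key=lambda g: abs(g - abs(target)))
            out.append((mag if target >= 0 else -mag) * scale)
    return out
```

For example, `quantize_fp4([6.0, 2.4])` snaps 2.4 to 2.0, since the FP4 grid has no point between 2 and 3; the quantization error grows for values far from the block's maximum, which is why small block sizes are typically preferred.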



Created with Quartz v4.5.2 © 2026
