NemoClaw Knowledge Wiki


Apr 27, 2026 · 1 min read

  • ai
  • machine-learning
  • llm
  • foundation-models
  • large-scale-models
  • transformer-architecture

Foundation Model

Large-scale models trained on massive datasets that can be adapted to a wide range of downstream tasks.
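The adaptation idea can be sketched in code. Below is a minimal, purely illustrative toy (not a real foundation model): the "pretrained" encoder is a frozen random feature map, and adapting to a downstream task means training only a small task-specific head on top of its features.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen "pretrained" encoder: a fixed random projection + nonlinearity.
# In a real foundation model this would be a large network trained on
# massive data; here it stands in for frozen, reusable features.
W_frozen = rng.normal(size=(2, 16))

def encode(x):
    """Foundation-model features; never updated during adaptation."""
    return np.tanh(x @ W_frozen)

# Downstream task: a simple binary classification problem.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

H = encode(X)             # features from the frozen encoder
w = np.zeros(H.shape[1])  # linear task head: the only trainable part

# Adapt: fit a logistic-regression head by gradient descent.
for _ in range(500):
    p = 1 / (1 + np.exp(-(H @ w)))
    w -= 0.5 * H.T @ (p - y) / len(y)

accuracy = np.mean(((1 / (1 + np.exp(-(H @ w)))) > 0.5) == y)
```

The same pattern underlies practical adaptation methods (linear probing, fine-tuning, adapters): a large pretrained backbone is reused, and only a comparatively small amount of task-specific training is needed.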

Recent Developments

  • Jamba 1.7 (AI21 Labs):
    • Uses a hybrid SSM-Transformer architecture.
    • Supports a 256k-token context window.
    • Available in Mini and Large variants.

Related

  • 2026 04 14 256k context window LLM


