Foundation Model
A large-scale model trained on a massive dataset that can be adapted to a wide range of downstream tasks.
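The "adapted to downstream tasks" part can be sketched with a toy linear probe: a frozen, pretrained encoder supplies features, and only a small head is trained on the new task. Everything below is illustrative (a random projection stands in for the pretrained model); it is not how any real foundation model is built.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a frozen, pretrained "foundation" encoder: a fixed projection
# followed by a nonlinearity. (Purely illustrative; a real foundation model
# would be a large pretrained network such as an LLM.)
W_frozen = rng.normal(size=(16, 32)) / np.sqrt(16)

def encode(x):
    # Features from the "pretrained" backbone; never updated downstream.
    return np.tanh(x @ W_frozen)

# A toy downstream task: binary labels that depend on the first two inputs.
X = rng.normal(size=(200, 16))
y = np.sign(X[:, 0] + X[:, 1])

# Adaptation step: train only a small linear head on the frozen features
# (a "linear probe"), here via ridge-regularized least squares.
F = encode(X)
head = np.linalg.solve(F.T @ F + 1e-2 * np.eye(F.shape[1]), F.T @ y)

accuracy = (np.sign(F @ head) == y).mean()
```

The same pattern scales up: full fine-tuning updates the backbone too, while parameter-efficient methods (adapters, LoRA) sit between the two extremes.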
Recent Developments
- Jamba 1.7 (AI21 Labs):
  - Uses a hybrid SSM-Transformer architecture.
  - Supports a 256K-token context window.
  - Available in Mini and Large variants.
Related
- 2026 04 14 256k context window LLM