Jamba

Jamba is a large language model (LLM) developed by AI21 Labs, featuring a hybrid SSM-Transformer architecture and a 256k context window. Released in 2026, it is a foundation model optimized for long-context processing.

  • Release: Newly released by AI21 Labs (2026).
  • Flavors: Jamba Mini 1.7, Jamba Large 1.7.
  • Architecture: Hybrid SSM-Transformer foundation model (combining a state-space model with a Transformer).
  • Key Feature: The 256k context window enables processing of extremely long inputs (see the usage sketch below).

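Below is a minimal sketch of loading one of these checkpoints with the Hugging Face transformers library. The model ID is an assumption for illustration; check AI21's Hugging Face pages for the exact name and hardware requirements.

    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Assumed model ID; verify against AI21's published listing.
    model_id = "ai21labs/AI21-Jamba-Mini-1.7"

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        device_map="auto",   # spread layers across available GPUs/CPU
        torch_dtype="auto",  # use the dtype stored in the checkpoint
    )

    # The 256k-token window allows very long prompts, e.g. whole documents.
    prompt = "Summarize the following report:\n" + open("report.txt").read()
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=256)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

The hybrid SSM-Transformer design does not change the calling code: the state-space and attention layers are internal to the architecture, so the standard generate API applies.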