
AI21: Jamba 1.6 Large

Input: text
Output: text
Released: Mar 13, 2025 · Updated: Mar 28, 2025

AI21 Jamba Large 1.6 is a high-performance hybrid foundation model that combines State Space Model (Mamba) layers with Transformer attention. Developed by AI21, it excels at extremely long-context handling (256K tokens), delivers strong inference efficiency (up to 2.5x faster than comparable models), and supports structured JSON output and tool use. The model has 94 billion active parameters (398 billion total), supports ExpertsInt8 quantization, and offers multilingual proficiency in English, Spanish, French, Portuguese, Italian, Dutch, German, Arabic, and Hebrew.

Usage of this model is subject to the Jamba Open Model License.

256,000 Token Context

Process and analyze large documents and conversations.
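Whether a given document fits in the 256K-token window depends on tokenization, but a rough estimate is possible with the common heuristic of about four characters per token. A minimal sketch (the 4-chars-per-token ratio is an assumption, not a property of Jamba's tokenizer):

```python
def fits_in_context(text: str, context_tokens: int = 256_000,
                    chars_per_token: float = 4.0) -> bool:
    """Rough check whether a document fits in a 256K-token context
    window, using the common ~4-characters-per-token heuristic.
    Actual token counts vary by tokenizer, language, and content."""
    estimated_tokens = len(text) / chars_per_token
    return estimated_tokens <= context_tokens

# Under this heuristic, a 1 MB plain-text document is ~250K tokens,
# so it squeezes into the window; a 2 MB document does not.
print(fits_in_context("x" * 1_000_000))  # True
print(fits_in_context("x" * 2_000_000))  # False
```

For production use, count tokens with the model's actual tokenizer rather than a character heuristic.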

Advanced Coding

Improved capabilities in front-end development and full-stack updates.

Agentic Workflows

Autonomously navigate multi-step processes with improved reliability.
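Tool use in agentic workflows is typically driven by a JSON tool definition the model can choose to invoke. A minimal sketch in the widely used OpenAI-style function-tool format, which many chat APIs with tool-use support accept; the `get_weather` tool and its parameters are hypothetical, and AI21's exact accepted schema should be checked against their API documentation:

```python
import json

# Hypothetical tool definition in the common OpenAI-style format:
# the model sees the name, description, and JSON Schema parameters,
# and can emit a structured call to this tool mid-conversation.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

print(json.dumps(get_weather_tool, indent=2))
```

The structured-JSON-output capability mentioned above pairs naturally with this: the model's tool-call arguments arrive as JSON matching the declared schema, so the caller can parse and dispatch them reliably.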

Available On

Provider | Model ID | Context | Max Output | Input Cost | Output Cost | Throughput | Latency
AI21     | ai21     | 256K    | 4K         | $2.00/M    | $8.00/M     | 62.5 t/s   | 936 ms
Standard Pricing
Input Tokens:  $0.000002 per 1K tokens ($2.00 per million)
Output Tokens: $0.000008 per 1K tokens ($8.00 per million)
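From the listed rates, the cost of a request is a simple linear combination of input and output token counts. A short sketch (token counts in the example are illustrative):

```python
# Listed rates: $2.00 per million input tokens, $8.00 per million output tokens.
INPUT_COST_PER_M = 2.00
OUTPUT_COST_PER_M = 8.00

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Dollar cost of one request at the listed per-million-token rates."""
    return (input_tokens * INPUT_COST_PER_M
            + output_tokens * OUTPUT_COST_PER_M) / 1_000_000

# Example: a 100K-token prompt with a 4K-token (maximum) completion.
print(f"${request_cost(100_000, 4_000):.4f}")  # $0.2320
```

Note that long-context workloads are dominated by input cost: even at four times the per-token rate, the 4K output above contributes only $0.032 of the $0.232 total.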
