
AI21: Jamba Mini 1.6

Input: text
Output: text
Released: Mar 13, 2025 · Updated: Mar 28, 2025

AI21 Jamba Mini 1.6 is a hybrid foundation model combining State Space Models (Mamba) with Transformer attention mechanisms. With 12 billion active parameters (52 billion total), this model excels in extremely long-context tasks (up to 256K tokens) and achieves superior inference efficiency, outperforming comparable open models on tasks such as retrieval-augmented generation (RAG) and grounded question answering. Jamba Mini 1.6 supports multilingual tasks across English, Spanish, French, Portuguese, Italian, Dutch, German, Arabic, and Hebrew, along with structured JSON output and tool-use capabilities.
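As a sketch of how the structured JSON output mentioned above might be requested, the snippet below builds an OpenAI-style chat-completions payload. The endpoint path, the model ID (`jamba-mini`), and the `response_format` field are assumptions based on common chat-API conventions, not confirmed details from this page; consult AI21's API documentation for the actual interface.

```python
# Sketch: requesting structured JSON output from Jamba Mini 1.6.
# The endpoint URL, model ID, and "response_format" field are ASSUMPTIONS
# modeled on common OpenAI-style chat APIs -- verify against AI21's docs.
import json

API_URL = "https://api.ai21.com/studio/v1/chat/completions"  # assumed endpoint

def build_json_request(prompt: str) -> dict:
    """Build a chat-completions payload that asks the model for JSON output."""
    return {
        "model": "jamba-mini",  # assumed model ID
        "messages": [{"role": "user", "content": prompt}],
        "response_format": {"type": "json_object"},  # request structured JSON
    }

payload = build_json_request(
    "List three EU capitals as a JSON array under the key 'capitals'."
)
print(json.dumps(payload, indent=2))
# Actually sending the request would require an API key, e.g.:
# requests.post(API_URL, json=payload,
#               headers={"Authorization": "Bearer <API key>"})
```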

Usage of this model is subject to the Jamba Open Model License.

256,000 Token Context

Process and analyze large documents and conversations.

Advanced Coding

Improved capabilities in front-end development and full-stack updates.

Agentic Workflows

Autonomously navigate multi-step processes with improved reliability.

Available On

Provider | Model ID | Context | Max Output | Input Cost | Output Cost | Throughput | Latency
AI21 | ai21 | 256K | 4K | $0.20/M | $0.40/M | 207.9 t/s | 460 ms
Standard Pricing

Input Tokens: $0.0002 per 1K tokens ($0.20 per million)

Output Tokens: $0.0004 per 1K tokens ($0.40 per million)
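To make the listed rates concrete, the small helper below estimates the cost of a single request by multiplying token counts by the per-token rates derived from the table above ($0.20 and $0.40 per million tokens). The helper is illustrative only, not part of any SDK.

```python
# Estimate per-request cost for Jamba Mini 1.6 at the listed rates:
# $0.20 per million input tokens, $0.40 per million output tokens.
INPUT_COST_PER_TOKEN = 0.20 / 1_000_000
OUTPUT_COST_PER_TOKEN = 0.40 / 1_000_000

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated cost in USD for one request."""
    return input_tokens * INPUT_COST_PER_TOKEN + output_tokens * OUTPUT_COST_PER_TOKEN

# Example: a 200K-token document plus a 2K-token answer.
print(f"${estimate_cost(200_000, 2_000):.4f}")  # → $0.0408
```

At these rates, even a request that nearly fills the 256K context stays well under a dime.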
