
Dolphin 2.9.2 Mixtral 8x22B 🐬

Mistral
Input: text
Output: text
Released: Jun 8, 2024
Updated: Mar 28, 2025

Dolphin 2.9.2 is designed for instruction following, conversation, and coding. It is a fine-tune of Mixtral 8x22B Instruct. The base model supports a 64k context length, and the fine-tune was trained with a 16k sequence length using ChatML templates.
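Since the model was fine-tuned on ChatML templates, prompts should follow that format. A minimal sketch of building a ChatML prompt string by hand (the helper name `chatml_prompt` is illustrative, not part of any official SDK):

```python
# Sketch: assemble a ChatML-formatted prompt, the template family
# this model was fine-tuned with. The <|im_start|>/<|im_end|>
# markers delimit each role's turn.
def chatml_prompt(system: str, user: str) -> str:
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        "<|im_start|>assistant\n"  # left open so the model completes it
    )

prompt = chatml_prompt(
    "You are Dolphin, a helpful assistant.",
    "Write a haiku about the sea.",
)
print(prompt)
```

In practice, most inference libraries apply this template for you from a structured message list; the sketch only shows the wire format the model expects.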

This model is a successor to Dolphin Mixtral 8x7B.

The model is uncensored: alignment and bias mitigations were stripped from its training data, so it requires an external alignment layer for ethical deployment. Users are cautioned to use this highly compliant model responsibly, as detailed in the blog post on uncensored models at erichartford.com/uncensored-models.

#moe #uncensored

16,000 Token Context

Process and analyze large documents and conversations.

Advanced Coding

Improved capabilities in front-end development and full-stack updates.

Agentic Workflows

Autonomously navigate multi-step processes with improved reliability.

Available On

Provider: NovitaAI
Model ID: novitaAi
Context: 16K
Max Output: 8K
Input Cost: $0.90/M
Output Cost: $0.90/M
Throughput: 13.0 t/s
Latency: 2198 ms
Standard Pricing

Input Tokens
$0.0009 per 1K tokens ($0.90 per 1M)

Output Tokens
$0.0009 per 1K tokens ($0.90 per 1M)
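At a flat $0.90 per million tokens for both input and output, the cost of a request is straightforward to estimate. A minimal sketch (the function name `request_cost` is illustrative):

```python
# Sketch: estimate request cost at the listed NovitaAI rate of
# $0.90 per 1M tokens, applied to both input and output tokens.
PRICE_PER_MILLION = 0.90

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the cost in USD for one request."""
    return (input_tokens + output_tokens) * PRICE_PER_MILLION / 1_000_000

# e.g. a 12,000-token prompt that produces a 2,000-token reply:
print(f"${request_cost(12_000, 2_000):.4f}")  # → $0.0126
```

Because input and output are priced identically here, only the total token count matters; on providers with asymmetric pricing the two terms would use different rates.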
