
DeepSeek-Coder-V2

Other
Input: text
Output: text
Released: May 14, 2024
Updated: Mar 28, 2025

DeepSeek-Coder-V2 is an open-source Mixture-of-Experts (MoE) code language model. It is further pre-trained from an intermediate checkpoint of DeepSeek-V2 with an additional 6 trillion tokens.

The original V1 model was trained from scratch on 2T tokens, with a composition of 87% code and 13% natural language in both English and Chinese. It was pre-trained on a project-level code corpus, employing an extra fill-in-the-blank (infilling) task.
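At inference time, the infilling objective is exposed through sentinel tokens that wrap a code prefix, a hole, and a suffix. Below is a minimal sketch of building such a prompt; the sentinel strings follow the DeepSeek-Coder repository's README, but treat them as assumptions and verify them against the tokenizer of the checkpoint you actually deploy.

```python
# A minimal fill-in-the-middle (infilling) prompt builder.
# Sentinel strings are assumptions taken from the DeepSeek-Coder README;
# verify them against your checkpoint's tokenizer before use.
FIM_BEGIN = "<｜fim▁begin｜>"
FIM_HOLE = "<｜fim▁hole｜>"
FIM_END = "<｜fim▁end｜>"

def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Ask the model to generate the code that belongs between prefix and suffix."""
    return f"{FIM_BEGIN}{prefix}{FIM_HOLE}{suffix}{FIM_END}"

prompt = build_fim_prompt(
    prefix="def quicksort(arr):\n    if len(arr) <= 1:\n        return arr\n",
    suffix="\n    return quicksort(left) + [pivot] + quicksort(right)\n",
)
print(prompt)
```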

128,000 Token Context

Process and analyze large documents and conversations.

Advanced Coding

Improved capabilities in front-end development and full-stack updates.

Agentic Workflows

Autonomously navigate multi-step processes with improved reliability.

Available On

Provider         | Model ID       | Context | Max Output | Input Cost | Output Cost | Throughput | Latency
Nebius AI Studio | nebiusAiStudio | 128K    | -          | $0.04/M    | $0.12/M     | 122.0 t/s  | 234 ms
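Since Nebius AI Studio exposes an OpenAI-compatible endpoint, a call can look like the sketch below. The base URL and model ID here are assumptions for illustration; take the exact values from the provider's documentation.

```python
# Hedged sketch: querying DeepSeek-Coder-V2 through an OpenAI-compatible API.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.studio.nebius.ai/v1/",  # assumed endpoint; verify
    api_key="YOUR_NEBIUS_API_KEY",
)

response = client.chat.completions.create(
    model="deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct",  # assumed model ID; verify
    messages=[{"role": "user", "content": "Write an iterative binary search in Python."}],
    max_tokens=512,
)
print(response.choices[0].message.content)
```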
Standard Pricing
Input Tokens: $0.04 per 1M tokens ($0.00000004 per token)
Output Tokens: $0.12 per 1M tokens ($0.00000012 per token)
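To make the rates concrete, a request's cost is a linear combination of its input and output token counts. A small sketch of that arithmetic:

```python
# Cost estimation at the listed rates:
# $0.04 per 1M input tokens, $0.12 per 1M output tokens.
INPUT_COST_PER_M = 0.04
OUTPUT_COST_PER_M = 0.12

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the request cost in US dollars."""
    return (input_tokens * INPUT_COST_PER_M
            + output_tokens * OUTPUT_COST_PER_M) / 1_000_000

# Example: a 100K-token document summarized into a 2K-token answer:
# 100_000 * $0.04/M + 2_000 * $0.12/M = $0.004 + $0.00024 = $0.00424
print(f"${estimate_cost(100_000, 2_000):.5f}")
```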
