
Microsoft: Phi-3.5 Mini 128K Instruct

Input: text
Output: text
Released: Aug 21, 2024 | Updated: Mar 28, 2025

Phi-3.5 models are lightweight, state-of-the-art open models. They were trained on the Phi-3 datasets, which include both synthetic data and filtered, publicly available website data, with a focus on high-quality, reasoning-dense properties. Phi-3.5 Mini has 3.8B parameters and is a dense decoder-only transformer that uses the same tokenizer as Phi-3 Mini.

The models underwent a rigorous enhancement process, incorporating supervised fine-tuning, proximal policy optimization, and direct preference optimization to ensure precise instruction adherence and robust safety measures. When assessed against benchmarks that test common sense, language understanding, math, code, long context, and logical reasoning, Phi-3.5 models showed robust, state-of-the-art performance among models with fewer than 13 billion parameters.
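As a rough illustration of the model described above, the sketch below loads a Phi-3.5 Mini checkpoint with Hugging Face transformers and generates a chat completion. The checkpoint name, dtype, and generation settings are assumptions for illustration, not details taken from this page.

```python
# Minimal sketch: run Phi-3.5 Mini locally with Hugging Face transformers.
# "microsoft/Phi-3.5-mini-instruct" is an assumed checkpoint name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3.5-mini-instruct"  # assumed Hugging Face checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 3.8B dense decoder-only model
    device_map="auto",
)

# Build a chat prompt with the tokenizer's chat template
# (same tokenizer family as Phi-3 Mini).
messages = [{"role": "user", "content": "Summarize this document in three bullet points."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```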

131,072 Token Context

Process and analyze large documents and conversations.

Advanced Coding

Improved capabilities in front-end development and full-stack updates.

Agentic Workflows

Autonomously navigate multi-step processes with improved reliability.

Available On

Provider           Model ID         Context   Max Output   Input Cost   Output Cost   Throughput   Latency
Nebius AI Studio   nebiusAiStudio   131K      -            $0.03/M      $0.09/M       54.2 t/s     303 ms
Azure              azure            128K      -            $0.10/M      $0.10/M       17.1 t/s     345 ms
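For reference, here is a hedged sketch of calling the model through an OpenAI-compatible chat-completions endpoint such as those exposed by the providers above. The base URL, model slug, and API key below are placeholders/assumptions; substitute the values from your provider's documentation.

```python
# Minimal sketch: query the model via an OpenAI-compatible endpoint.
# Base URL and model slug are assumed placeholders, not values from this page.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.studio.nebius.ai/v1/",  # assumed provider endpoint
    api_key="YOUR_API_KEY",                       # placeholder
)

response = client.chat.completions.create(
    model="microsoft/Phi-3.5-mini-instruct",  # placeholder model slug
    messages=[
        {"role": "user", "content": "Outline a plan for refactoring a React front end."}
    ],
    max_tokens=512,
)
print(response.choices[0].message.content)
```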
Standard Pricing

Input Tokens
$0.03 per 1M tokens ($0.00000003 per token)

Output Tokens
$0.09 per 1M tokens ($0.00000009 per token)
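As a worked example of this pricing, the snippet below estimates the cost of a single long-context request; the token counts are illustrative.

```python
# Worked example of the standard pricing above.
INPUT_PRICE_PER_TOKEN = 0.00000003   # $0.03 per 1M input tokens
OUTPUT_PRICE_PER_TOKEN = 0.00000009  # $0.09 per 1M output tokens

input_tokens = 100_000   # e.g. a large document within the 131,072-token window
output_tokens = 2_000    # illustrative response length

cost = input_tokens * INPUT_PRICE_PER_TOKEN + output_tokens * OUTPUT_PRICE_PER_TOKEN
print(f"Estimated cost: ${cost:.6f}")  # -> Estimated cost: $0.003180
```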
