
Liquid: LFM 40B MoE

Category: Other
Input: text
Output: text
Released: Sep 30, 2024 · Updated: Mar 28, 2025

Liquid's 40.3B Mixture of Experts (MoE) model. Liquid Foundation Models (LFMs) are large neural networks built with computational units rooted in dynamical systems.

LFMs are general-purpose AI models that can be used to model any kind of sequential data, including video, audio, text, time series, and signals.

See the launch announcement for benchmarks and more info.

32,768 Token Context

Process and analyze large documents and conversations.

Advanced Coding

Improved capabilities in front-end development and full-stack updates.

Agentic Workflows

Autonomously navigate multi-step processes with improved reliability.

Available On

| Provider | Model ID | Context | Max Output | Input Cost | Output Cost | Throughput | Latency |
|----------|----------|---------|------------|------------|-------------|------------|---------|
| Liquid   | liquid   | 33K     | -          | $0.15/M    | $0.15/M     | 176.9 t/s  | 299 ms  |
| Lambda   | lambda   | 66K     | 66K        | $0.15/M    | $0.15/M     | 80.8 t/s   | 185 ms  |
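Providers that serve models at flat per-token rates like these typically expose an OpenAI-compatible chat completions API. The sketch below shows what a call might look like under that assumption; the base URL, API key placeholder, and model slug are illustrative, not values confirmed by this listing.

```python
from openai import OpenAI

# Hypothetical OpenAI-compatible endpoint. The base_url and model slug
# are placeholders; substitute the values your chosen provider documents.
client = OpenAI(
    base_url="https://api.example-provider.com/v1",  # assumption
    api_key="YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="liquid/lfm-40b",  # assumed slug; check your provider's model list
    messages=[
        {"role": "user", "content": "Summarize the key points of this document: ..."},
    ],
    max_tokens=512,
)

print(response.choices[0].message.content)
```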
Standard Pricing

Input Tokens: $0.15 per 1M tokens ($0.00015 per 1K tokens)

Output Tokens: $0.15 per 1M tokens ($0.00015 per 1K tokens)
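At a flat rate, a request's cost is just total tokens times the per-token price. A minimal sketch of that arithmetic, using made-up example token counts:

```python
# Flat rate from the listing: $0.15 per 1M tokens for both input and output.
RATE_PER_TOKEN = 0.15 / 1_000_000  # = $0.00000015 per token

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated cost in USD for one request at the listed flat rate."""
    return (input_tokens + output_tokens) * RATE_PER_TOKEN

# Example: a prompt that nearly fills the 32,768-token context window
# plus a 1,024-token completion costs about half a cent.
print(f"${request_cost(32_000, 1_024):.6f}")  # -> $0.004954
```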
