Liquid AI Releases Compact Reasoning Model for Devices
Liquid AI released LFM2.5-1.2B-Thinking, a 1.17-billion-parameter reasoning model that runs in under 900 MB of RAM and runs about twice as fast as comparably sized models. Designed for small devices, it performs competitively on reasoning benchmarks and is well suited to agents that orchestrate tools, extract data, or run local workflows without cloud compute. Find all the details in The Batch ⬇️ https://hubs.la/Q045gVzd0

Liquid AI’s Small Reasoning Model Mixes Attention With Convolutional Layers for Efficiency