Finding signal on Twitter is more difficult than it used to be. We curate the best tweets on topics like AI, startups, and product development every weekday so you can focus on what matters.

Liquid AI Releases Compact Reasoning Model for Devices

DeepLearning.AI

Liquid AI released LFM2.5-1.2B-Thinking, a 1.17-billion-parameter reasoning model that runs in under 900 MB of RAM and operates about twice as fast as similar models. Designed for small devices, it performs competitively on reasoning benchmarks and is suited to agents that orchestrate tools, extract data, or run local workflows without cloud compute. Find all the details in The Batch ⬇️ https://hubs.la/Q045gVzd0
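The announcement doesn't state the quantization format, but the memory claim can be sanity-checked with back-of-envelope arithmetic (assuming weights dominate the footprint): fitting 1.17 billion parameters in under 900 MB works out to roughly 6 bits per weight, i.e. a sub-8-bit format.

```python
# Back-of-envelope check of the "1.17B parameters in under 900 MB" claim.
# Assumes weights dominate the footprint (activations and KV cache ignored).
params = 1.17e9          # parameter count from the announcement
budget_bytes = 900e6     # stated RAM ceiling

bytes_per_param = budget_bytes / params   # ~0.77 bytes per weight
bits_per_param = bytes_per_param * 8      # ~6.2 bits per weight

print(f"{bits_per_param:.1f} bits/param")  # → 6.2 bits/param
```

For comparison, fp16 weights alone would need about 2.3 GB and int8 about 1.2 GB, so the stated footprint only fits with aggressive quantization.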

Liquid AI’s Small Reasoning Model Mixes Attention With Convolutional Layers for Efficiency

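The linked headline says the model mixes attention with convolutional layers for efficiency. The precise block design isn't reproduced here, but a minimal NumPy sketch (illustrative only, not Liquid AI's implementation) shows the two token-mixing primitives side by side and why the convolutional one is cheaper: each position mixes only a fixed window of k past tokens instead of attending to all of them.

```python
import numpy as np

rng = np.random.default_rng(0)
seq_len, d_model, k = 8, 4, 3
x = rng.standard_normal((seq_len, d_model))

def causal_depthwise_conv(x, kernel):
    """Short causal per-channel convolution: position t mixes tokens t-k+1..t.
    Cost is O(seq_len * k) per channel, vs O(seq_len^2) for attention."""
    k = kernel.shape[0]
    padded = np.vstack([np.zeros((k - 1, x.shape[1])), x])  # left-pad: no lookahead
    return np.stack([np.sum(padded[t:t + k] * kernel, axis=0)
                     for t in range(x.shape[0])])

def causal_self_attention(x):
    """Single-head softmax attention with a causal mask (queries=keys=values=x)."""
    scores = x @ x.T / np.sqrt(x.shape[1])
    mask = np.tril(np.ones((len(x), len(x)), dtype=bool))
    scores = np.where(mask, scores, -np.inf)                # block future positions
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ x

kernel = rng.standard_normal((k, d_model))
y_conv = causal_depthwise_conv(x, kernel)
y_attn = causal_self_attention(x)

# Causality check: perturbing the last token must not change earlier outputs.
x2 = x.copy(); x2[-1] += 1.0
assert np.allclose(causal_depthwise_conv(x2, kernel)[:-1], y_conv[:-1])
assert np.allclose(causal_self_attention(x2)[:-1], y_attn[:-1])
```

Interleaving blocks like these lets a model keep a few attention layers for long-range retrieval while doing most token mixing with the cheaper fixed-window operator, which is the general trade-off the headline points at.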
We doomscroll, you upskill.

slop ⇢ substance ⇢ signal