Finding signal on Twitter is more difficult than it used to be. We curate the best tweets on topics like AI, startups, and product development every weekday so you can focus on what matters.

Qwen3.5 with Linear Attention and Sparse MoE Design

The new @Alibaba_Qwen Qwen3.5-397B-A17B is live on OpenRouter now! This multimodal model uses a hybrid architecture combining linear attention with sparse MoE for higher inference efficiency. It is available both as the open-weights release and as Qwen3.5 Plus with an extended 1M-token context.
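The "397B-A17B" naming reflects the sparse MoE idea: the model holds 397B total parameters but activates only about 17B per token, because a router sends each token to just a few experts. As an illustrative sketch (not Qwen's actual implementation; all shapes, the `k=2` choice, and the function names here are assumptions for demonstration), top-k expert routing looks roughly like this:

```python
import numpy as np

def sparse_moe_layer(x, gate_w, expert_ws, k=2):
    """Illustrative top-k sparse MoE routing: each token is processed by
    only k of the available experts, so per-token compute scales with k
    rather than with the total expert count."""
    logits = x @ gate_w                          # (tokens, n_experts) router scores
    topk = np.argsort(logits, axis=-1)[:, -k:]   # indices of the k best experts per token
    sel = np.take_along_axis(logits, topk, axis=-1)
    weights = np.exp(sel - sel.max(-1, keepdims=True))  # softmax over selected experts only
    weights /= weights.sum(-1, keepdims=True)
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        for j, e in enumerate(topk[t]):
            # only k expert matmuls per token, regardless of n_experts
            out[t] += weights[t, j] * (x[t] @ expert_ws[e])
    return out

rng = np.random.default_rng(0)
d, n_experts, tokens = 8, 16, 4
x = rng.standard_normal((tokens, d))
gate_w = rng.standard_normal((d, n_experts))
expert_ws = rng.standard_normal((n_experts, d, d))
y = sparse_moe_layer(x, gate_w, expert_ws, k=2)
print(y.shape)
```

With 16 experts and k=2, each token touches only 2 expert weight matrices; scaling the expert count grows capacity without growing per-token compute, which is the efficiency argument behind designs like this.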
