
Poolside releases Laguna XS.2 and M.1, its first agentic coding models, including an open-weight release

2026-04-29 01:17

Poolside shipped the first two models in its Laguna family on April 28: Laguna M.1 (225B total / 23B active parameters, 72.5% on SWE-bench Verified, 46.9% on SWE-bench Pro) and Laguna XS.2 (33B total / 3B active, 68.2% on SWE-bench Verified), the latter released as open weights under Apache 2.0 and runnable locally via Ollama on a Mac with 36 GB of RAM. Both models were trained from scratch on 30 trillion tokens; a distributed Muon optimizer reached training loss equivalent to AdamW's in roughly 15% fewer steps, and an asynchronous RL pipeline decouples inference from training, keeping GPUs busy during long-horizon tasks. Poolside also released pool, a terminal-based coding agent, and Shimmer, a cloud development environment for web apps and APIs; both are built on the same infrastructure used for the models' RL training.
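For context on the optimizer mentioned above: in its publicly described form, Muon replaces a weight matrix's raw momentum update with an approximately orthogonalized version, computed with a few Newton-Schulz iterations. The sketch below illustrates the general technique on a single matrix and is an assumption about standard Muon, not Poolside's distributed implementation; the coefficients and hyperparameters are the commonly cited defaults.

```python
import numpy as np

def newton_schulz(G, steps=5):
    """Approximately orthogonalize G, pushing its singular values toward 1
    via the quintic Newton-Schulz iteration used in standard Muon."""
    a, b, c = 3.4445, -4.7750, 2.0315  # commonly used quintic coefficients
    X = G / (np.linalg.norm(G) + 1e-7)  # scale so the spectral norm is <= 1
    transposed = X.shape[0] > X.shape[1]
    if transposed:
        X = X.T  # iterate in the wide orientation for efficiency
    for _ in range(steps):
        A = X @ X.T
        B = b * A + c * (A @ A)
        X = a * X + B @ X
    return X.T if transposed else X

def muon_step(W, grad, buf, lr=0.02, beta=0.95):
    """One Muon update: accumulate momentum, then take an orthogonalized step."""
    buf = beta * buf + grad          # momentum buffer
    W = W - lr * newton_schulz(buf)  # orthogonalized update direction
    return W, buf
```

Orthogonalizing the update equalizes step sizes across the update's singular directions, which is the property usually credited for Muon reaching a given training loss in fewer steps than AdamW.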
