Moonshot’s Kimi K2.5 is ‘open,’ 595GB, and built for agent swarms — Reddit wants a smaller one

Jan 29, 2026 | Technology

Two days after releasing what analysts call the most powerful open-source AI model ever created, researchers from China’s Moonshot AI logged onto Reddit to face a restless audience. The Beijing-based startup had reason to show up. Kimi K2.5 had just landed headlines about closing the gap with American AI giants and testing the limits of U.S. chip export controls. But the developers waiting on r/LocalLLaMA, a forum where engineers trade advice on running powerful language models on everything from a single consumer GPU to a small rack of prosumer hardware, had a different concern. They wanted to know when they could actually use it.

The three-hour Ask Me Anything session became an unexpectedly candid window into frontier AI development in 2026 — not the polished version that appears in corporate blogs, but the messy reality of debugging failures, managing personality drift, and confronting a fundamental tension that defines open-source AI today. Moonshot had published the model’s weights for anyone to download and customize. The file runs roughly 595 gigabytes. For most of the developers in the thread, that openness remained theoretical.

Three Moonshot team members participated under the usernames ComfortableAsk4494, zxytim, and ppwwyyxx. Over approximately 187 comments, they fielded questions about architecture, training methodology, and the philosophical puzzle of what gives an AI model its “soul.” They also offered a picture of where the next round of progress will come from — and it wasn’t simply “more parameters.”

Developers asked for smaller models they can actually run, and Moonshot acknowledged it has a problem

The very first wave of questions treated Kimi K2.5 less like a breakthrough and more like a logistics headache. One user asked bluntly why Moonshot wasn’t creating smaller models alongside the flagship. “Small sizes like 8B, 32B, 70B are great spots for the intelligence density,” they wrote.
Another said huge models had become difficult to celebrate because many developers simply couldn’t run them. A third pointed to American competitors as size targets, requesting coder-focused variants that could fit on modest GPUs.

Moonshot’s team didn’t announce a smaller model on the spot. But it acknowledged the demand in terms that suggested the complaint was familiar. “Requests well received …
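The size complaints come down to simple arithmetic: a dense model’s weight file is roughly its parameter count times the bytes stored per parameter. As an illustrative back-of-envelope sketch (the formula and the quantization levels are common community rules of thumb, not figures from Moonshot or the thread), here is what the sizes developers asked for would need:

```python
def weight_gb(params_billions: float, bits_per_param: int) -> float:
    """Approximate weight-file size in GB for a dense model.

    Rule of thumb: parameters x bytes-per-parameter. Ignores KV cache,
    activations, and runtime overhead, so real VRAM needs run higher.
    """
    return params_billions * 1e9 * (bits_per_param / 8) / 1e9

# The sizes requested in the thread, at full fp16 and at a typical
# 4-bit quantization used for consumer hardware:
for params in (8, 32, 70):
    fp16 = weight_gb(params, 16)
    q4 = weight_gb(params, 4)
    print(f"{params}B model: ~{fp16:.0f} GB at fp16, ~{q4:.0f} GB at 4-bit")
```

By this arithmetic an 8B model quantized to 4 bits fits in about 4 GB, within reach of a single consumer GPU, while a 595 GB weight file like Kimi K2.5’s is out of range for almost any individual setup — which is why the thread treated the release as a logistics headache rather than something to run.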
