Arcee aims to reboot U.S. open source AI with new Trinity models released under Apache 2.0

Dec 1, 2025 | Technology

For much of 2025, the frontier of open-weight language models has been defined not in Silicon Valley or New York City, but in Beijing and Hangzhou. Chinese research labs including Alibaba’s Qwen, DeepSeek, Moonshot and Baidu have rapidly set the pace in developing large-scale, open Mixture-of-Experts (MoE) models, often with permissive licenses and leading benchmark performance. While OpenAI fielded its own open source, general-purpose LLMs this summer as well, gpt-oss-20B and gpt-oss-120B, their uptake has been slowed by the number of equally or better performing alternatives.

Now, one small U.S. company is pushing back. Today, Arcee AI announced the release of Trinity Mini and Trinity Nano Preview, the first two models in its new “Trinity” family: an open-weight MoE model suite fully trained in the United States. Users can try the former directly in a chatbot format on Arcee’s new website, chat.arcee.ai, and developers can download both models from Hugging Face to run, modify, and fine-tune them as they like, all for free under an enterprise-friendly Apache 2.0 license. While small compared to the largest frontier models, these releases represent a rare attempt by a U.S. startup to build end-to-end open-weight models at scale: trained from scratch, on American infrastructure, using a U.S.-curated dataset pipeline.

“I’m experiencing a combination of extreme pride in my team and crippling exhaustion, so I’m struggling to put into words just how excited I am to have these models out,” wrote Arcee Chief Technology Officer (CTO) Lucas Atkins in a post on the social network X (formerly Twitter). “Especially Mini.”

A third model, Trinity Large, is already in training: a 420B-parameter model with 13B active parameters per token, scheduled to launch in January 2026.

“We want to add something that has been missing in that picture,” Atkins wrote in the Trinity launch manifesto published on Arcee’s website.
“A serious open weight model family trained end to end in America… that businesses and developers can actually own.”

From Small Models to Scaled Ambition

The Trinity project marks a turning point for Arcee AI, which until now has been known for its compact, enterprise-focused models. The company has raised $29.5 million in funding to date, including a $24 million Series A in 2024 led by Emergence Capital, and its previous releases include AFM-4.5B, a compact instruct-tuned model released in mid-2025, and SuperNova, an earli …
