San Francisco-based Arcee AI has released Trinity-Large-Thinking, a 399-billion-parameter text-based AI model. The startup, which operates with a team of roughly 30 people, published the model under the permissive Apache 2.0 license, allowing full commercial use and customization.
Arcee positions the model as a sovereign American alternative to Chinese-developed open-source AI, aiming to give enterprise developers a high-performance reasoning engine. The company says the model's benchmark performance rivals that of leading proprietary models at a lower cost.
The company spent $20 million on the model's 33-day training run, an outlay representing nearly half of Arcee's total funding. Its investors include Microsoft's M12 venture fund, Samsung, and Wipro.