Open Source

Xiaomi Releases a One-Trillion Parameter Open-Source Model, Pushing the Boundaries of Open AI Development

By The Tech Room Editorial Team

Chinese technology giant Xiaomi surprised the global AI community by releasing a one-trillion parameter large language model under a permissive open-source license. The model, trained on a mixture of Chinese and English data, represents the largest openly available LLM to date and rivals the performance of closed frontier models on several multilingual benchmarks. Xiaomi's release intensifies the ongoing debate about whether open-source or closed development will ultimately prevail in artificial intelligence. The model's sheer scale requires significant infrastructure to run at full precision, but the company also released quantized variants that can operate on more accessible hardware. The move positions Xiaomi as a serious player in the global AI race and demonstrates that state-of-the-art deep learning research is no longer the exclusive domain of American companies. Researchers and enterprises worldwide have begun fine-tuning the model for specialized applications.

The technical specifications of Xiaomi's model reveal an ambitious training effort. The model was trained on approximately 15 trillion tokens of multilingual data, with Chinese and English each accounting for roughly 40% of the training corpus and the remaining 20% covering Japanese, Korean, French, German, and Spanish. Xiaomi reported using a cluster of over 10,000 NVIDIA H100 GPUs for training, at an estimated cost exceeding $200 million. Despite the massive scale, the model's performance on English-language benchmarks falls slightly below GPT-5.4 and Claude Opus 4.6, but it leads all models on Chinese-language tasks by a significant margin, including complex reasoning, classical Chinese literature comprehension, and technical translation. The quantized 4-bit variant, which can run on a single server with 8 consumer GPUs, retains approximately 92% of the full model's capability, making it accessible to university research labs and startups that cannot afford enterprise-grade infrastructure.
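The memory savings from 4-bit quantization are straightforward to sketch. The back-of-the-envelope calculation below is illustrative only: it assumes a dense model and counts weight storage alone, ignoring activations, KV cache, and any mixture-of-experts sparsity that would reduce how many parameters are active at once (Xiaomi's published specifications are not reflected here).

```python
# Illustrative weight-memory arithmetic for a one-trillion-parameter model.
# Assumption: dense weights only; activations, KV cache, and any
# mixture-of-experts sparsity are ignored. Not official Xiaomi figures.

def weight_memory_gb(n_params: float, bits_per_param: float) -> float:
    """Gigabytes needed to hold the model weights alone."""
    return n_params * bits_per_param / 8 / 1e9

N = 1e12  # one trillion parameters

fp16 = weight_memory_gb(N, 16)  # 16-bit precision, as commonly served
int4 = weight_memory_gb(N, 4)   # the 4-bit quantized variant

print(f"fp16 weights: {fp16:,.0f} GB")   # 2,000 GB
print(f"4-bit weights: {int4:,.0f} GB")  # 500 GB
print(f"reduction: {fp16 / int4:.0f}x")  # 4x
```

The 4x reduction in weight storage is the core reason quantized variants can move a model of this class from data-center hardware toward far cheaper multi-GPU servers.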

The geopolitical implications of Xiaomi's release are substantial. U.S. policymakers who advocated for export controls on advanced AI chips as a way to slow Chinese AI development must now reckon with the fact that Chinese labs continue to produce world-class models despite hardware restrictions. Xiaomi's open-source approach also creates a strategic challenge for Western companies: by making the model freely available, Xiaomi ensures that developers worldwide can build on Chinese AI infrastructure, potentially shifting the center of gravity of the open-source AI ecosystem toward Chinese-origin models. Over 200,000 developers downloaded the model within its first week on Hugging Face, and specialized fine-tuned versions have already emerged for healthcare, legal, and financial applications. The release has prompted renewed calls from European and American AI labs for more aggressive open-source strategies to maintain competitive influence in the global AI ecosystem.

Sources

Xiaomi, Hugging Face, Wired

The Tech Room Editorial Team

Expert analysis covering semiconductors, AI, and gaming.
