Ant Group, the Jack Ma-backed fintech affiliate of Alibaba, says it has cut the cost of training AI models by roughly 20% by leveraging Chinese-made semiconductors. Amid ongoing US export restrictions on advanced chips, Ant has turned to domestic alternatives, including processors from Huawei and Alibaba, to power its artificial intelligence ambitions.
According to internal sources, these homegrown chips deliver performance on par with Nvidia’s H800 series, which US export regulations bar from sale to Chinese firms.
Unlocking Efficiency with Mixture of Experts (MoE)
The key to Ant’s breakthrough lies in the Mixture of Experts (MoE) machine-learning architecture. Rather than activating an entire monolithic model for every input, MoE routes each token to a small set of specialized “expert” sub-models, much like handing a problem to the relevant specialists on a team. The result is a training process that consumes far less compute per token.
Robin Yu, CTO at Shengshang Tech Co., likened the MoE concept to martial arts: “If you find one point of attack to beat the world’s best kung fu master, you can still say you beat them—which is why real-world application is important.”
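In concrete terms, the sketch below illustrates the routing idea in PyTorch: a small gating network scores the experts for each token, and only the top-scoring few are actually run, so most parameters sit idle for any given input. The layer sizes, expert count, and top-k value here are illustrative assumptions, not details of Ant’s Ling models.

```python
# Minimal Mixture-of-Experts layer (illustrative sketch, not Ant's actual architecture).
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, d_model=64, d_hidden=256, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(d_model, n_experts)  # router: scores each expert per token
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(),
                          nn.Linear(d_hidden, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):                              # x: (n_tokens, d_model)
        weights, idx = self.gate(x).topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)           # normalize the top-k scores
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e                  # tokens routed to expert e in slot k
                if mask.any():
                    out[mask] += weights[mask, k:k+1] * expert(x[mask])
        return out

layer = MoELayer()
tokens = torch.randn(10, 64)
print(layer(tokens).shape)  # torch.Size([10, 64]); only 2 of 8 experts ran per token
```

Because only a handful of experts run per token, compute scales with the active experts rather than the full parameter count, which is where the efficiency gains come from.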
By applying MoE and optimized hardware setups, Ant slashed training costs for 1 trillion tokens from 6.35 million yuan ($880,000) to just 5.1 million yuan.
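As a back-of-the-envelope check, the reported figures do line up with the roughly 20% saving cited above:

```python
# Reported training cost per 1 trillion tokens, in yuan (figures from the article).
baseline, optimized = 6_350_000, 5_100_000
print(f"Saving: {(baseline - optimized) / baseline:.1%}")  # Saving: 19.7%
```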
Meet Ling-Lite and Ling-Plus: China’s Response to GPT
Ant has also rolled out two major large language models (LLMs):
- Ling-Lite: A 16.8-billion-parameter model that outperforms Meta’s LLaMA on select English-language benchmarks
- Ling-Plus: A 290-billion-parameter model, among the largest in the Chinese market
For comparison, OpenAI’s GPT-4.5 is estimated to have 1.8 trillion parameters, and DeepSeek-R1 has 671 billion. Despite being far smaller, Ant’s models have shown superior results on Chinese-language tasks, and both are now open source, a move aimed at attracting community adoption.
Moving Away from Nvidia
While Nvidia chips are still used in some projects, Ant has increasingly incorporated AMD processors and Chinese chips into its AI training. This shift underscores a broader movement among Chinese tech firms seeking independence from Western suppliers amid tightening export controls.
Robert Lea, senior analyst at Bloomberg Intelligence, noted, “If verified, Ant’s claims suggest China is rapidly approaching AI self-sufficiency through cost-effective and computationally efficient models.”
Real-World Deployment in Healthcare and Finance
Ant Group’s AI advancements aren’t confined to labs. The firm has launched practical applications in various sectors:
- Healthcare: Through its acquisition of Haodf.com, Ant developed an “AI Doctor Assistant” that helps 290,000 doctors with medical records and diagnostics.
- Finance: Its “Maxiaocai” AI service provides digital financial consulting.
- Consumer Apps: “Zhixiaobao,” an AI-powered lifestyle assistant, offers smart suggestions for everyday needs.
Additionally, Ant has introduced AI-driven healthcare systems currently operating in seven hospitals across cities such as Beijing and Shanghai.
A Challenge to Nvidia’s Vision
Ant’s efficiency-driven approach contrasts with Nvidia CEO Jensen Huang’s belief that AI development requires ever more powerful, and more expensive, hardware. Nvidia’s roadmap emphasizes bigger, faster GPUs with denser processing capabilities.
However, Ant’s focus on leaner, more efficient models may offer a viable alternative, especially in an environment constrained by export limits. The company did acknowledge challenges during development: training was unstable at times, and even minor changes to the model or hardware could cause error rates to spike.
Still, the ability to achieve high performance at lower cost using domestic chips signals a pivotal moment for China’s AI ecosystem. If the momentum continues, it may redefine how AI is trained—and who leads the race.