2025-07-23 01:03:07

Musk: 230,000 GPUs, including 30,000 GB200s, are being used to train Grok in a single supercomputing cluster called Colossus 1. At Colossus 2, the first batch of 550,000 GB200s and GB300s will also come online for training in a few weeks. As Jensen Huang has said, xAI's speed is unmatched.