Elon Musk · Jul 23 at 01:04
The @xAI goal is 50 million units of H100-equivalent AI compute (but with much better power efficiency) online within 5 years
Incredible scale
Elon Musk · Jul 23 at 00:54
230k GPUs, including 30k GB200s, are operational for training Grok @xAI in a single supercluster called Colossus 1 (inference is done by our cloud providers). At Colossus 2, the first batch of 550k GB200s & GB300s, also for training, start going online in a few weeks. As Jensen Huang has stated, @xAI is unmatched in speed. It’s not even close.
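The numbers in these posts invite a back-of-envelope comparison against the 50 million H100-equivalent goal. A minimal sketch, assuming illustrative per-GPU equivalence factors (the ~2.5x and ~3x multipliers and the even GB200/GB300 split are assumptions, not figures from the posts):

```python
# Back-of-envelope H100-equivalents for the clusters described above.
# Equivalence factors are ASSUMPTIONS for illustration only.
H100_EQUIV = {
    "H100": 1.0,
    "GB200": 2.5,  # assumed: one Blackwell-class GPU ~2.5x an H100 for training
    "GB300": 3.0,  # assumed
}

# "230k GPUs, including 30k GB200s" -> remainder treated as H100-class
colossus_1 = {"H100": 200_000, "GB200": 30_000}
# "first batch of 550k GB200s & GB300s" -> assumed even split
colossus_2_batch_1 = {"GB200": 275_000, "GB300": 275_000}

def h100_equivalents(cluster):
    """Sum each GPU count weighted by its assumed H100-equivalence factor."""
    return sum(count * H100_EQUIV[gpu] for gpu, count in cluster.items())

total = h100_equivalents(colossus_1) + h100_equivalents(colossus_2_batch_1)
print(f"~{total / 1e6:.2f}M H100-equivalents vs the 50M five-year goal")
# -> ~1.79M under these assumed factors, i.e. well under 4% of the stated goal
```

Under these assumed factors, even Colossus 1 plus the first Colossus 2 batch lands around 1.8M H100-equivalents, which illustrates how far the 5-year target of 50M reaches beyond the hardware described here.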