50 million Nvidia GPUs for Elon Musk's supercomputer

xAI CEO Elon Musk plans to spend lavishly on training his artificial intelligence, Grok.

Elon Musk (Tesla, X, SpaceX…) recently showed how seriously he takes artificial intelligence by describing the infrastructure behind the training of Grok, the AI assistant developed by xAI, his company dedicated to artificial intelligence.

Colossus 2, the supercomputer soon to be at work behind the Grok AI, on the road to 50 million Nvidia chips

Colossus 1, the supercomputer that continuously trains and improves Grok, is made up of no fewer than 230,000 graphics processing units (GPUs), including 30,000 Nvidia GB200s. Colossus 2 should come online within a few weeks, built from some 550,000 GB200 and GB300 GPUs. And this is only the beginning: within five years, Elon Musk wants his supercomputer to reach the equivalent of 50 million Nvidia H100 GPUs, for a total of 200 exaFLOPS of compute, 20 times more than the fastest supercomputer in the world today.
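
For a sense of how such headline figures are put together, here is a minimal back-of-the-envelope sketch: aggregate compute is simply the GPU count multiplied by an assumed per-GPU throughput. The 100 TFLOPS value used below is a hypothetical placeholder, not a figure from the article; real per-GPU throughput varies by an order of magnitude depending on numeric precision (FP8, FP16, FP64) and workload.

```python
# Back-of-the-envelope estimate: aggregate compute = GPU count x per-GPU throughput.
# The per-GPU figure passed in the example is an assumption for illustration only.

def aggregate_exaflops(gpu_count: int, tflops_per_gpu: float) -> float:
    """Total throughput in exaFLOPS, given per-GPU throughput in TFLOPS."""
    return gpu_count * tflops_per_gpu / 1e6  # 1 exaFLOPS = 1,000,000 TFLOPS

# Assumed numbers: 230,000 GPUs at 100 TFLOPS each comes to about 23 exaFLOPS.
print(aggregate_exaflops(230_000, 100.0))  # -> 23.0
```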
