He believes artificial general intelligence (AGI) could surpass human intelligence as early as next year, or by 2026, but training it would require vast numbers of processors and, in turn, enormous amounts of electricity.
Musk’s company xAI is currently working on the second version of its Grok language model. Training Grok v.2 required 20,000 NVIDIA H100 GPUs, and Musk expects future versions to demand even more: approximately 100,000 NVIDIA chips for Grok v.3.
According to the billionaire, two main factors are holding back the development of artificial intelligence: the shortage of powerful processors and the shortage of electricity. A single H100 draws around 700 W at peak load, so 100,000 of these GPUs consume an impressive 70 MW.
Factoring in servers and cooling, a data center with 100,000 NVIDIA H100s can draw roughly 100 MW, comparable to the electricity consumption of a small city.
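The figures above are simple arithmetic, and a quick sketch makes them easy to verify. The overhead multiplier below (a PUE-style factor of about 1.43) is an illustrative assumption chosen to match the article's ~100 MW estimate, not a number from the source:

```python
# Back-of-the-envelope check of the power figures quoted above.
H100_PEAK_W = 700    # peak draw of one NVIDIA H100, per the article
NUM_GPUS = 100_000   # projected GPU count for Grok v.3 training

# GPU power alone: 700 W x 100,000 = 70,000,000 W = 70 MW
gpu_power_mw = H100_PEAK_W * NUM_GPUS / 1_000_000

# Assumed overhead for servers, networking and cooling
# (~1.43 is a hypothetical PUE-like multiplier, not from the article).
OVERHEAD_FACTOR = 1.43
total_power_mw = gpu_power_mw * OVERHEAD_FACTOR

print(f"GPUs alone: {gpu_power_mw:.0f} MW")       # 70 MW
print(f"With overhead: {total_power_mw:.0f} MW")  # ~100 MW
```

With the GPUs alone at 70 MW, any realistic facility overhead pushes the total into the neighborhood of 100 MW, which is where the small-city comparison comes from.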
Source: Ferra
