xAI has a permit to install 240 MWe of gas generators, specifically fifteen Solar SMT-130 units, which will allow it to double the GPU count to 400,000. The 200,000 new chips will likely be Nvidia B200s. This should be about 6-7 Zettaflops of compute, roughly 11 times the 100K-H100 compute used for pre-training Grok 3.
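A rough sanity check of the Zettaflops and "~11 times Grok 3" figures can be sketched as follows. The per-GPU throughput numbers here are assumptions for illustration, not from the permit filing: roughly 4 PFLOPS for an H100 and roughly 20 PFLOPS for a B200 at low precision with sparsity (vendor peak numbers vary widely by number format).

```python
# Back-of-envelope estimate of cluster compute.
# ASSUMED per-GPU peak throughput (PFLOPS, low precision with sparsity);
# these are illustrative figures, not from the article or Nvidia filings.
PFLOPS_H100 = 4
PFLOPS_B200 = 20

# 200K existing H100-class GPUs plus 200K new B200s.
cluster_pflops = 200_000 * PFLOPS_H100 + 200_000 * PFLOPS_B200
cluster_zflops = cluster_pflops / 1e6          # 1 ZFLOPS = 1e6 PFLOPS

# Grok 3 pre-training baseline: 100K H100s.
baseline_pflops = 100_000 * PFLOPS_H100
ratio = cluster_pflops / baseline_pflops

print(cluster_zflops)   # ~4.8 ZFLOPS
print(ratio)            # ~12x the Grok 3 baseline
```

Under these assumed per-GPU figures the fleet lands near 5 Zettaflops and about 12× the Grok 3 baseline, in the same ballpark as the article's 6-7 ZFLOPS and 11× claims; the exact multiplier depends heavily on which precision format one counts.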
Fifteen of the 16 MWe generators provide 240 MW. This would nearly double the power there from 250 MW to 490 MW, which is sufficient for 400,000 GPUs.
This filing aligns with xAI's ongoing efforts to expand its infrastructure: the supercomputer's capacity reportedly doubled to 200,000 GPUs by December 2024, with plans to scale further toward 1 million GPUs.
Brian Wang is a Futurist Thought Leader and a popular Science blogger with 1 million readers per month. His blog Nextbigfuture.com is ranked the #1 Science News Blog. It covers many disruptive technologies and trends including Space, Robotics, Artificial Intelligence, Medicine, Anti-aging Biotechnology, and Nanotechnology.
Known for identifying cutting-edge technologies, he is currently a Co-Founder of a startup and fundraiser for high-potential early-stage companies. He is the Head of Research for Allocations for deep technology investments and an Angel Investor at Space Angels.
A frequent speaker at corporations, he has been a TEDx speaker, a Singularity University speaker and a guest on numerous interviews for radio and podcasts. He is open to public speaking and advising engagements.