The EU talks big when it comes to the green transition. While decarbonization has met significant political backlash in several sectors, one area where it appears to be succeeding is the energy efficiency of its EuroHPC supercomputers.
The European joint venture's first exascale supercomputer, JUPITER, is currently being built at the Jülich Research Center in North Rhine-Westphalia. Its first precursor module is called JEDI (of course, in our opinion, this news should have been published on May 4th) and was installed in April.
In addition to giving nerds the ultimate acronym, the Jupiter Exascale Development Instrument just took the top spot on the Green500 list of the world's most energy-efficient supercomputers.
Data centers consume enormous amounts of energy, and the demand for computing power will only grow as AI training and inference consume more and more resources. According to Arm CEO Rene Haas, AI could account for as much electricity consumption as India by 2030.

It is therefore essential that the supercomputer projects announced in recent years – the machines needed to train state-of-the-art AI – are held accountable for the amount of energy they require to operate.
Another HPC victory for Nvidia
The Green500 is a ranking derived from the Top500 list, which is published every six months. While the latter ranks systems purely on performance, the former ranks them on energy efficiency. In another triumph for Nvidia, almost all of the leading systems on the Green500 are built primarily around its graphics processing units, or GPUs.
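In rough terms, the Green500 metric is sustained performance divided by power draw, expressed in gigaflops per watt. The sketch below illustrates that calculation with entirely made-up numbers, not actual Green500 figures:

```python
# Hypothetical illustration of the Green500 efficiency metric:
# sustained performance divided by power draw, in gigaflops per watt.
# The numbers below are invented for illustration only.

def gflops_per_watt(rmax_tflops: float, power_kw: float) -> float:
    """Convert sustained performance (teraflops) and power draw (kilowatts)
    into an efficiency figure in gigaflops per watt."""
    gflops = rmax_tflops * 1_000  # 1 teraflop = 1,000 gigaflops
    watts = power_kw * 1_000      # 1 kilowatt = 1,000 watts
    return gflops / watts

# A made-up 4,500-teraflop system drawing 120 kW:
print(gflops_per_watt(4_500, 120))  # 37.5
```

This is why cooling matters so much for the ranking: energy spent on chillers and fans counts against the denominator without adding any flops to the numerator.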
JEDI is based on Nvidia's GH200 Grace Hopper superchip, which combines the company's GPU and CPU architectures. It also uses direct warm-water liquid cooling, part of the Eviden BullSequana XH3000 architecture, which requires significantly less energy than traditional air cooling.
When completed, the JUPITER system will feature 24,000 Nvidia Grace Hopper chips across 125 BullSequana XH3000 racks and will exceed the one-exaflop threshold – that is, a computing capacity of one quintillion (a 1 followed by 18 zeros) floating point operations per second.
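To get a feel for the scale of that number, here is a back-of-the-envelope calculation (the world-population figure is a round approximation): if every person on Earth performed one calculation per second, how long would it take humanity to match one second of exascale computing?

```python
# One exaflop = 10**18 floating-point operations per second.
EXAFLOP = 10**18

# Approximate world population, for illustration.
people = 8_000_000_000

# Seconds of everyone-computes-once-per-second needed to match
# a single second of an exascale machine.
seconds = EXAFLOP / people
years = seconds / (365 * 24 * 3600)
print(f"{years:.0f} years")  # prints "4 years"
```

In other words, one second of JUPITER's work would take all of humanity roughly four years of non-stop arithmetic.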
For 8-bit calculations, the precision most commonly used in AI training, performance will rise to well over 70 exaflops. That would make the system, which is set to enter general operation in early 2025, the most powerful AI system in the world – at least until the next one comes online, because "the future is always in motion."