Nvidia announces next GPU generation, Blackwell, to be used by OpenAI, Google, Amazon, Microsoft, and Meta.
The company claimed that the new GPU architecture would allow customers to build and run real-time generative AI on trillion-parameter large language models at up to 25 times lower cost and energy consumption than its predecessor, the Hopper series.
Amazon, Google, Meta, Microsoft, Oracle Cloud, and OpenAI are among the companies that confirmed they will deploy Blackwell GPUs later this year.
Blackwell is named in honor of David Harold Blackwell, a mathematician who specialized in game theory and statistics, and the first Black scholar inducted into the National Academy of Sciences.
“For three decades we’ve pursued accelerated computing, with the goal of enabling transformative breakthroughs like deep learning and AI,” said Jensen Huang, founder and CEO of Nvidia.
“Generative AI is the defining technology of our time. Blackwell GPUs are the engine to power this new industrial revolution. Working with the most dynamic companies in the world, we will realize the promise of AI for every industry.”
Blackwell architecture GPUs are manufactured using a custom-built 4NP TSMC process, with two reticle-limit GPU dies connected by a 10TB/s chip-to-chip link into a single, unified GPU.
The GPU has 208 billion transistors, up from 80 billion in the Hopper series, and is twice the physical size of the Hopper GPU.
Blackwell also includes a second-generation Transformer Engine and new 4-bit floating point (FP4) AI inference capabilities.
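To make "4-bit floating point inference" concrete, the sketch below quantizes a small weight array onto an FP4 (E2M1: 1 sign, 2 exponent, 1 mantissa bit) grid with a per-tensor scale. This is only an illustration of the general technique; it is not Nvidia's implementation, and the function names and grid handling here are assumptions for the example.

```python
# Illustrative sketch of 4-bit floating point (FP4, E2M1) weight quantization.
# Not Nvidia's implementation; it only shows what an FP4 value grid looks like
# and how weights can be snapped to it with a per-tensor scale.
import numpy as np

# The eight non-negative magnitudes representable in E2M1
FP4_GRID = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0])

def quantize_fp4(weights):
    """Snap each weight to the nearest FP4 value after per-tensor scaling."""
    scale = np.max(np.abs(weights)) / FP4_GRID[-1]  # map the largest |weight| to 6.0
    scaled = weights / scale
    signs = np.sign(scaled)
    # index of the nearest representable magnitude for each element
    idx = np.abs(np.abs(scaled)[:, None] - FP4_GRID[None, :]).argmin(axis=1)
    return signs * FP4_GRID[idx], scale

weights = np.array([0.02, -0.8, 0.31, 1.2, -0.05])
q, scale = quantize_fp4(weights)
print(q * scale)  # dequantized approximation of the original weights
```

Each weight is stored in 4 bits plus a shared scale factor, which is why low-precision formats like this cut memory and bandwidth so sharply during inference.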
Read more: datacenterdynamics.com