NVIDIA announced its next-generation Blackwell AI chips and software during its developer conference in San Jose on Monday, aiming to extend its lead in the AI sector. The Blackwell series, with its first chip named GB200, is set to ship later this year and promises significant performance gains for AI applications. The launch comes as NVIDIA's revenue and profits have surged, driven in part by the AI boom set off by ChatGPT. The Blackwell chips, which feature a transformer engine optimized for AI models like ChatGPT, arrive while companies are still scrambling to acquire NVIDIA's current Hopper H100 GPUs.
“Hopper is fantastic, but we need bigger GPUs,” NVIDIA CEO Jensen Huang said during his keynote at the conference. “Blackwell’s not a chip, it’s the name of a platform.”
NVIDIA also introduced NIM (NVIDIA Inference Microservices), new software that makes it easier to deploy AI models on its GPUs, strengthening the company's value proposition beyond hardware. With the GB200 delivering 20 petaflops of AI performance versus 4 petaflops for the H100, NVIDIA continues to push the envelope in processing power while positioning itself as a comprehensive platform provider for AI development and deployment, a strategy that echoes tech giants like Microsoft and Apple.
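NIM packages pretrained models as containerized inference services that applications call over a standard API. As a rough, non-authoritative sketch of what that deployment model could look like in practice, the snippet below assumes a NIM container already running locally and exposing an OpenAI-compatible chat endpoint; the URL, API key, and model name are hypothetical placeholders, not details from the announcement.

```python
# Illustrative sketch only: assumes a NIM container is already running and
# serves an OpenAI-compatible API. The endpoint URL, API key, and model name
# below are placeholders, not confirmed details.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # hypothetical local NIM endpoint
    api_key="not-needed-locally",         # placeholder; a hosted endpoint would need a real key
)

response = client.chat.completions.create(
    model="meta/llama3-8b-instruct",      # hypothetical model identifier served by the container
    messages=[{"role": "user", "content": "Summarize NVIDIA's Blackwell announcement."}],
)
print(response.choices[0].message.content)
```

The point of this packaging, as pitched at the conference, is that developers write against one familiar interface while NVIDIA handles optimizing the model to run on its own GPUs.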