Nvidia announces GB200 Blackwell AI chip
Nvidia has unveiled its latest innovation, the GB200 Blackwell AI chip. The new processor promises a major performance upgrade for AI companies, delivering 20 petaflops of AI performance compared with the previous generation. Equipped with a transformer engine built specifically for transformer-based AI, the technology that powers models like ChatGPT, the Blackwell GPU should let AI companies train bigger and more intricate models. The GB200 will be sold as a standalone chip and as part of server clusters, with access also available through cloud services from major tech giants including Amazon, Google, and Microsoft. The implications of this advancement are vast, signaling a new era in AI capabilities.
Key Highlights:
Nvidia has announced its next generation of artificial intelligence chips, named Blackwell. The first chip in the series, the GB200, is set to hit the market later this year and promises significant improvements in AI performance. Nvidia CEO Jensen Huang stressed the need for bigger GPUs, signaling a shift toward more powerful chips to meet the growing demands of AI companies.
Performance Upgrade:
The GB200 Blackwell chip delivers a remarkable boost in AI performance: 20 petaflops, up from 4 petaflops for the previous-generation H100, a fivefold increase. The added processing power will let AI companies train larger and more complex models, potentially unlocking new capabilities in the AI landscape.
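As a quick sanity check on those headline figures, the claimed generational jump works out as follows (note that Nvidia's quoted petaflops numbers may use different numeric precisions per chip generation, so the ratio is a headline comparison rather than a like-for-like benchmark):

```python
# Headline AI throughput figures from Nvidia's announcement, in petaflops.
# Treat the resulting ratio as a marketing headline, not a measured benchmark.
GB200_PFLOPS = 20
H100_PFLOPS = 4

speedup = GB200_PFLOPS / H100_PFLOPS
print(f"GB200 vs. H100 headline speedup: {speedup:.0f}x")  # prints "5x"
```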
Industry Impact:
Major tech players including Amazon, Google, Microsoft, and Oracle are set to offer access to the GB200 chip through their cloud services. Nvidia's push to evolve from a chip supplier into a platform provider reflects the company's strategic aim to support and enable AI development at scale.
Cost and Deployment:
While Nvidia has not disclosed pricing for the GB200 chip or the systems built around it, analysts estimated the previous H100 chip at between $25,000 and $40,000 per unit. Nvidia's new NIM (Nvidia Inference Microservice) software aims to streamline the deployment of AI models on older Nvidia GPUs, making it cost-effective for companies to use their existing hardware for inference workloads.
Software Integration:
Through collaborations with partners such as Microsoft and Hugging Face, Nvidia ensures that AI models are optimized to run on its chips. The NIM software simplifies the deployment process, letting developers run models efficiently on Nvidia-based servers or GPU-equipped laptops.
Overall, Nvidia’s announcement of the GB200 Blackwell AI chip marks a significant step towards advancing AI capabilities and supporting the evolving needs of the industry.