What NVIDIA Didn’t Say

submitted by
Style Pass
2024-03-29 05:30:02

We were captivated by Jensen Huang’s opening keynote last week at the NVIDIA GTC. He did a masterful job, mixing in humor, a sly Taylor Swift reference, Michael Dell flyovers, and sci-fi references (Star Trek’s opening sequence and a callback to Silent Running’s droids).

And, of course, the Blackwell announcement. Let’s dig into that. What did Jensen say, and what didn’t he say? Here’s what we heard.  

Jensen spent the first part of his talk describing how creating simulations and “digital twins” in the NVIDIA-hosted omniverse can create wonderful new solutions. Many of these solutions, though, require ever-larger models and real-time performance in inference. Which leads to a problem: compute. General computing has “run out of steam.” We need a new approach.

What is this new approach? Big. Blackwell big: 20 petaflops (up to 40 petaflops at FP4), 208 billion transistors, multi-trillion parameter large language models (LLMs). To paraphrase Tiny Elvis, that chip is huge.  
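To get a rough sense of what those numbers buy, here is a hedged back-of-envelope sketch (our arithmetic, not NVIDIA's): a common rule of thumb is about 2 FLOPs per parameter per generated token, so a hypothetical 2-trillion-parameter model on a single 20-petaflop chip would top out near a few thousand tokens per second, assuming perfect utilization.

```python
# Back-of-envelope decode throughput for a Blackwell-class chip.
# All figures below are illustrative assumptions, not measured numbers:
#   - ~2 FLOPs per parameter per generated token (standard rule of thumb)
#   - a hypothetical 2-trillion-parameter model
#   - perfect utilization of 20 petaflops (real systems achieve far less)
params = 2e12                      # 2 trillion parameters (assumed)
flops_per_token = 2 * params       # ~4e12 FLOPs per token
chip_flops = 20e15                 # 20 petaflops, per the keynote
tokens_per_sec = chip_flops / flops_per_token
print(f"~{tokens_per_sec:.0f} tokens/s (theoretical upper bound)")
```

This is an upper bound only; memory bandwidth and batching dominate real decode throughput, which is exactly why ever-larger models keep pushing the compute problem Jensen described.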
