NVIDIA’s Jensen Huang Believes AI Reasoning Requires 100x More Computing Power

In a recent CNBC interview following NVIDIA’s quarterly earnings report, CEO Jensen Huang discussed the growing demand for AI computing resources and the technological shift toward reasoning AI models.

Huang explained how the AI landscape is rapidly evolving beyond basic large language models toward more sophisticated reasoning capabilities: “This is the time when AI is thinking to itself before it answers the question, instead of just immediately generating an answer, they’ll reason about it, maybe break it down step by step.”

The computing implications of this shift are enormous. According to Huang: “The amount of computation necessary to do that reasoning process is a hundred times more than what we used to do.”
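Huang’s “hundred times” figure can be motivated with simple arithmetic: a transformer’s inference cost grows roughly in proportion to the number of tokens it generates, so a model that emits a long step-by-step reasoning trace before its answer does proportionally more work. The sketch below is a back-of-envelope illustration only; the parameter count and token counts are hypothetical assumptions, not figures from the interview.

```python
# Back-of-envelope sketch (illustrative, not NVIDIA data): forward-pass
# compute for a transformer is roughly 2 FLOPs per parameter per token,
# so total inference compute scales ~linearly with tokens generated.

def inference_flops(params: float, tokens: int) -> float:
    """Approximate forward-pass FLOPs: ~2 * parameters * tokens."""
    return 2 * params * tokens

PARAMS = 70e9              # hypothetical 70B-parameter model (assumption)
DIRECT_TOKENS = 200        # tokens for an immediate answer (assumption)
REASONED_TOKENS = 20_000   # tokens including a reasoning trace (assumption)

direct = inference_flops(PARAMS, DIRECT_TOKENS)
reasoned = inference_flops(PARAMS, REASONED_TOKENS)
print(f"Compute multiplier: {reasoned / direct:.0f}x")  # → 100x
```

With these assumed numbers, a 100x-longer output yields a 100x compute multiplier, which is the rough shape of the scaling Huang describes.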

Huang detailed three phases in AI development: pre-training (comparable to learning basic knowledge), post-training (involving reinforcement learning and feedback), and inference (the reasoning process). The innovation happening in post-training and the computational demands of reasoning models are driving unprecedented demand for NVIDIA’s hardware.

When discussing these reasoning models, Huang mentioned several examples currently in the market: “ChatGPT-4 is an example of that. Grok 3 reasoning is an example of that. So all of these reasoning AI models now need a lot more compute than what we used to.”

The CEO remains optimistic about NVIDIA’s growth trajectory, noting that capital investment for data centers is significantly higher than last year, with new startups constantly emerging that require immediate computing resources. With Blackwell chips shipping and new data centers coming online, Huang believes the company is well-positioned at what he calls “just the beginning of the reasoning AI era.”

AI Insider