Exclusive: Nvidia CEO says his AI chips are improving faster than Moore's Law

Nvidia CEO Jensen Huang says the performance of his company's AI chips is advancing faster than the historical pace set by Moore's Law, the benchmark that has driven computing progress for decades.

“Our systems are advancing faster than Moore's Law,” Huang said in an interview with TechCrunch on Tuesday morning, hours after he delivered a keynote address to a crowd of 10,000 at CES in Las Vegas.

Coined in 1965 by Intel co-founder Gordon Moore, Moore's Law predicted that the number of transistors on computer chips would roughly double each year, essentially doubling the performance of those chips. This prediction largely panned out, producing decades of rapid gains in capability and falling costs.

In recent years, Moore's Law has slowed down. However, Huang claims that Nvidia's AI chips are advancing at their own accelerated pace.

“We can develop architectures, chips, systems, libraries and algorithms at the same time,” Huang said. “If you do that, you can move faster than Moore's Law, because you can innovate across the whole stack.”

The bold claim from Nvidia's CEO comes at a time when many are questioning whether AI's progress has stalled. Leading AI labs – such as Google, OpenAI, and Anthropic – use Nvidia's AI chips to train and run their AI models, and advances in these chips will likely translate to further advances in AI model capabilities.

Huang rejects the notion that AI progress is slowing down. Instead, he claims there are now three active AI scaling laws: pre-training, the initial phase in which AI models learn patterns from large amounts of data; post-training, which fine-tunes an AI model's answers using methods such as human feedback; and test-time compute, which occurs during the inference phase and gives an AI model more time to “think” after each question.

“Moore's Law was so important in the history of computing because it drove down the cost of computing,” Huang told TechCrunch. “The same thing is going to happen with inference: as we drive up performance, the cost of inference is going to come down.”

(Of course, Nvidia has become the most valuable company in the world by riding the AI boom, so it benefits Huang to say so.)

Nvidia CEO Jensen Huang holding up the GB200 NVL72 like a shield (Image credit: Nvidia)

Nvidia's H100 used to be the chip of choice for tech companies looking to train AI models, but now that tech companies are focusing more on inference, some have questioned whether Nvidia's pricey chips will stay on top.

AI models that use test-time compute are expensive to run today. There are concerns that OpenAI's o3 model, which uses a scaled-up version of test-time compute, will be too expensive for most people to use. For example, OpenAI spent roughly $20 per task running o3 to achieve human-level scores on a test of general intelligence. For comparison, a ChatGPT Plus subscription costs $20 for a full month of use.

Huang held up Nvidia's latest datacenter superchip, the GB200 NVL72, on stage like a shield during Monday's keynote. The chip is 30 to 40x faster at running AI inference workloads than Nvidia's previous best-selling chip, the H100. Huang says this performance jump means AI reasoning models like OpenAI's o3, which use significant amounts of compute during the inference phase, will become cheaper over time.

Huang says he's focused on building more performant chips overall, and that more performant chips lead to lower prices in the long run.

“The direct and immediate solution for test-time compute, both in performance and affordability, is to increase our computing capability,” Huang told TechCrunch. He noted that, in the long run, AI reasoning models could be used to create better data for the pre-training and post-training of AI models.

We've certainly seen the price of AI models drop over the past year, in part due to computing breakthroughs from hardware companies like Nvidia. Huang says this is a trend he expects to continue with AI reasoning models, even though the first versions we've seen from OpenAI have been rather expensive.

More broadly, Huang claims his AI chips today are 1,000 times better than the ones Nvidia made 10 years ago. That's a much faster pace than the standard set by Moore's Law, and one Huang says he sees no sign of stopping anytime soon.


