OPINION: The ‘AI PC’ race is ignoring the true AI leader: Nvidia.
“Nvidia has a history.”
You can’t take 10 steps at CES, the annual consumer-electronics show in Las Vegas, without hearing about AI. Intel INTC, AMD AMD, Qualcomm QCOM and Microsoft MSFT are all making their pitch for the “AI PC.”
Intel garnered the biggest buzz at the event, hosting a press conference and keynote to highlight its latest consumer chip, the Intel Core Ultra. Meanwhile, AMD released a pre-recorded “special address” to talk about its advancements in the AI PC space, including details on its new Ryzen 8040-series chips with integrated neural processing units (NPUs) for AI acceleration. AMD also showed new benchmarks comparing these chips with Intel’s Core Ultra offerings. As expected, the comparisons favor AMD, both in AI results and in integrated-graphics performance, and are meant to dissuade the industry from thinking that Intel’s latest chips have shaken up AMD’s position.
But one name has been strangely missing from the discussion since the “AI PC” push began in the middle of last year: Nvidia NVDA.
The company is hoping to change that by putting its own energy into marketing and positioning AI on PCs. After all, Nvidia has a history here. It was the hardware vendor that introduced CUDA, the programming platform that made its graphics processing units (GPUs) broadly programmable, paving the way for the AI market that exists today and building a software ecosystem that can compete with anyone in the industry.
With Nvidia having reached a $1 trillion valuation thanks to the growth of the AI market in data centers (it sells GPUs costing tens of thousands of dollars each to the companies that train the largest and most important AI models), what does the company get out of focusing its efforts on this AI PC race? The most basic benefit is selling more GPUs into the consumer space, into systems that don’t already include a discrete GPU as part of the standard build. Gaming laptops and gaming PCs have always had graphics cards built in, but mainstream devices tend to exclude them for cost reasons. If Nvidia can make the case that true AI PCs feature GeForce GPUs, that would lead to more sales at a variety of price points.
Other benefits include maintaining Nvidia GPU chips as the foundation for the next generation of AI applications and reassuring the industry and investors that the advent of NPUs will not lead to fundamental changes in the AI computing market.
The Nvidia GPU angle in AI PCs is interesting because of the raw performance. Integrated NPUs on Intel platforms deliver 10 TOPS (tera operations per second, the standard way to express AI performance), while high-end GeForce GPUs can deliver over 800 TOPS. Clearly, an 80x improvement in AI computing resources means a much greater ability to build innovative and revolutionary AI applications. Even Nvidia’s mainstream discrete GPUs will offer several times more computing power than NPUs this year.
And these GPUs aren’t limited to desktop computers; they go into laptops, too. That means a laptop “AI PC” doesn’t have to be powered solely by an Intel Core Ultra; it can also include a high-performance GPU for the most demanding AI applications.
Graphics chips have been the foundation of AI application development from the beginning, and the generative-AI wave that has skyrocketed AI’s popularity with consumers also runs best on Nvidia hardware today. Local Stable Diffusion tools that generate images from text target Nvidia GPUs by default, but require careful tuning and specialized software modules to run effectively on Intel or AMD NPUs.
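To make that concrete, here is a minimal sketch of how such a local text-to-image tool typically lands on an Nvidia GPU. It assumes the open-source Hugging Face diffusers library; the model ID and prompt are illustrative choices, not details from Nvidia or this article.

```python
# Minimal sketch: a local Stable Diffusion pipeline that defaults to the
# Nvidia GPU via CUDA. The model ID and prompt are illustrative only.
import torch
from diffusers import StableDiffusionPipeline

# Load a pretrained text-to-image model (weights download on first run).
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,  # half precision fits consumer GPU memory
)

# "cuda" sends the work to the Nvidia GPU; running the same model on an
# Intel or AMD NPU instead requires a vendor-specific runtime and tuning.
pipe = pipe.to("cuda")

image = pipe("a snow-covered mountain at sunrise").images[0]
image.save("output.png")
```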
Nvidia showed how it sees the world of AI on PCs with a few demos at CES. The first was a service, built with a company called Convai, that changes how game developers create games and how gamers interact with non-player characters in games and virtual worlds. Basically, the implementation lets game developers use large language models, in the vein of ChatGPT, to create virtual characters with lifelike personalities by adding bits of information about the character’s background, traits, likes and dislikes. Gamers can then speak into a microphone; another AI model converts that voice into text and sends it to the game character, much like a modern AI-powered chatbot, and the character’s responses are translated back into voice and animation in the game.
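The article describes that loop only at a high level; the sketch below is a hypothetical outline of a single speech-to-LLM-to-speech turn. The CharacterProfile class and the transcribe, run_local_llm and synthesize_voice stubs are placeholders for whatever models a developer wires in; they are not Convai’s or Nvidia’s actual APIs.

```python
# Hypothetical sketch of the NPC conversation loop described above.
# The dataclass and stub functions are placeholders, not a real API.
from dataclasses import dataclass

@dataclass
class CharacterProfile:
    name: str
    backstory: str      # background the developer writes for the character
    traits: list[str]   # personality traits, likes and dislikes

def transcribe(mic_audio: bytes) -> str:
    """Speech-to-text model: the player's microphone audio becomes text."""
    raise NotImplementedError  # e.g. a local or cloud speech-recognition model

def run_local_llm(prompt: str) -> str:
    """Placeholder for a language model running on a local GPU or in the cloud."""
    raise NotImplementedError

def synthesize_voice(reply_text: str) -> bytes:
    """Text-to-speech plus animation cues handed back to the game engine."""
    raise NotImplementedError

def npc_turn(profile: CharacterProfile, mic_audio: bytes) -> bytes:
    """One conversational turn: player audio in, spoken NPC reply out."""
    player_text = transcribe(mic_audio)
    prompt = (
        f"You are {profile.name}. Backstory: {profile.backstory}. "
        f"Traits: {', '.join(profile.traits)}.\n"
        f"The player says: {player_text}\nStay in character and reply:"
    )
    reply_text = run_local_llm(prompt)
    return synthesize_voice(reply_text)
```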
I saw several people interacting with this demo, challenging the AI character with various scenarios and questions. The solution worked impressively well and quickly, enabling real-time conversations with game- and context-aware AI characters. This AI computing happens partly on local gaming machines and their GPUs, and partly in the cloud on Nvidia’s own collection of GPUs. That is truly the best-case scenario for Nvidia.
Another demo used the GPU power of a desktop system to build a personalized, ChatGPT-like assistant that runs open-source language models and points at folders full of personal documents, papers, articles and more. This additional data “fine-tunes” the AI model and lets users converse with the chatbot, or ask it questions, based on that data, including personal emails and previous posts. It’s just a tech demo and not ready for general release, but it is one of the promises of the AI PC, and in this case it all runs on Nvidia GPUs.
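Nvidia didn’t disclose how the demo is built, so the sketch below is a hypothetical outline of the general pattern it implies: index a folder of documents, pull the passages most relevant to a question, and feed them to a locally running open-source model. The naive keyword retrieval and the run_local_llm placeholder are assumptions for illustration, not the demo’s actual implementation.

```python
# Hypothetical sketch of a "chat with your own documents" assistant.
# Retrieval here is naive keyword overlap; a real tool would use embeddings,
# and run_local_llm is a placeholder for a GPU-backed open-source model.
from pathlib import Path

def load_chunks(folder: str, chunk_size: int = 800) -> list[str]:
    """Read every .txt file in the folder and split it into small chunks."""
    chunks: list[str] = []
    for path in Path(folder).glob("*.txt"):
        text = path.read_text(errors="ignore")
        chunks += [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]
    return chunks

def top_chunks(question: str, chunks: list[str], k: int = 3) -> list[str]:
    """Rank chunks by how many question words they share (toy retrieval)."""
    words = set(question.lower().split())
    ranked = sorted(chunks, key=lambda c: len(words & set(c.lower().split())), reverse=True)
    return ranked[:k]

def run_local_llm(prompt: str) -> str:
    """Placeholder for a local open-source language model running on the GPU."""
    raise NotImplementedError

def ask(question: str, folder: str) -> str:
    """Answer a question using only the user's own documents as context."""
    context = "\n---\n".join(top_chunks(question, load_chunks(folder)))
    prompt = (
        f"Answer using only this context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )
    return run_local_llm(prompt)
```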
Of course, there are trade-offs. In most cases, an Nvidia GeForce GPU in a laptop or desktop system uses significantly more power than the NPU integrated into a chip like the Intel Core Ultra. But for AI tasks that demand fast output, the performance of a discrete GPU will get those tasks done sooner.
I have no doubt that AI will change the way we interact with and use our PCs, for the better, and faster than many people think. There are a variety of solutions to make this possible, from low-power integrated NPUs in the latest laptop chips from Intel, AMD and Qualcomm, to high-performance GPUs from Nvidia and AMD, to cloud- and edge-connected computing. All of these options will blend together to deliver the best consumer experience. But talking about the “AI PC” revolution arriving on our doorstep without Nvidia is a pretty big mistake.
Ryan Shrout is chairman of Signal65 and founder of Shrout Research. Follow him on X @ryanshrout. Shrout has provided consulting services to AMD, Qualcomm, Intel, Arm Holdings, Micron Technology, Nvidia and others. Shrout owns shares of Intel.
More: These big tech stocks are expected to account for the largest share of the AI market in 2024.