Hardware and Hyperscalers are the Picks and Shovels of the AI Gold Rush
Nvidia’s share price has been trending north all year, and then on May 24th it went vertical, creating the kind of mythical hockey stick curve so beloved by gimlet-eyed start-up pitch-writers. Not bad for a 30-year-old business that was already worth more than half a trillion dollars a day earlier.
When its share price rose by 24 percent after its most recent quarterly earnings report, the dramatic surge reflected more than just market approval for a strong quarter. The company that first built its wealth on providing systems for the global gaming industry finds itself poised to benefit from the surging interest in AI, and especially generative AI.
It’s a great reminder that success is much less linear and predictable than we like to imagine. Nvidia’s early growth was fueled by its invention of the graphics processing unit in 1999, which it commercialised in its GeForce cards for PCs and consoles. It rode the growth of the global gaming industry into the $200bn sector it is today – an industry five times larger than the movie industry and more than seven times larger than the global music industry.
However, as Peter Yang, editor of the Creator Economy Substack newsletter and a product lead at Roblox, noted on Twitter this month, “GPUs are just one part of the story. Nvidia launched the CUDA computing platform in 2007 to make it easier for people to program GPUs. This made it much easier for AI players to adopt Nvidia's chips versus competitors. It turns out that GPUs are also well suited for the data processing and model training demands of generative AI.”
That’s because while central processing units (CPUs) can be used for some simpler AI tasks, they are becoming less and less useful as AI advances, according to the Center for Security and Emerging Technology. A 2020 paper by Saif M. Khan and Alexander Mann noted, “AI chips include graphics processing units (GPUs), field-programmable gate arrays (FPGAs), and application-specific integrated circuits (ASICs) that are specialised for AI.”
Three years after that paper was released it seems the market has suddenly caught on to the fact that Nvidia’s chips are uniquely – for now at least – well suited for organisations wanting to train AI models due to their large memory.
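To see why memory capacity matters so much for training, consider a rough back-of-envelope sketch. The figures below are illustrative rules of thumb, not numbers from the article: mixed-precision training with the Adam optimiser is commonly estimated at around 16 bytes of GPU memory per model parameter.

```python
# Illustrative memory estimate for training a large language model.
# Rule of thumb for mixed-precision Adam training: ~16 bytes per parameter
# (fp16 weights: 2, fp16 gradients: 2, fp32 master weights: 4,
#  two fp32 Adam optimiser moments: 8).
BYTES_PER_PARAM_TRAINING = 16
BYTES_PER_PARAM_INFERENCE = 2  # fp16 weights only

def training_memory_gb(n_params: float) -> float:
    """Approximate GPU memory (GB) needed to hold model state during training."""
    return n_params * BYTES_PER_PARAM_TRAINING / 1e9

# A hypothetical 7-billion-parameter model:
print(f"{training_memory_gb(7e9):.0f} GB")  # 112 GB -> spans multiple 80 GB GPUs
```

Even before counting activations and data batches, a mid-sized model outgrows any single consumer card – which is why high-memory data centre GPUs command such a premium.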
At the start of the generative AI gold rush, Nvidia finds itself selling the picks and shovels.
Nvidia’s founder and CEO Jensen Huang is making the most of his moment in the sun, pointing out after the latest results – which showed profits of $2bn on revenues of $7bn – that “the world’s $1 trillion data centre is nearly populated entirely by CPUs today…it’s growing of course but over the last four years, call it $1tn worth of infrastructure installed, and it’s all completely based on CPUs and dumb NICs. It’s basically unaccelerated.”
As a result of its latest surge, Nvidia is now worth $960bn – making it significantly larger than Meta and Tesla and moving it much closer to membership of the trillion-dollar club where Alphabet and Apple now reside.
Nvidia is just the most obvious hardware winner today, but even businesses like Micron Technology, which have struggled in recent years, look set to thrive.
Micron released a shocker of a result on March 28, recording its worst quarterly loss ever ($2.31bn) – an outcome that could hardly offer a starker contrast with Nvidia.
Yet investor sentiment around even a result that saw quarterly revenues more than halve, from $7.79bn a year ago to $3.69bn, was buttressed by the potential of generative AI to turn the company’s fortunes around.
In a call with analysts to discuss the quarter, President and CEO Sanjay Mehrotra said that despite significant near-term challenges, he believes that the memory and storage total addressable market will grow to a new record in calendar 2025.
And he said it “will continue to outpace the growth of the semiconductor industry thereafter.”
Generative AI will ride to the rescue, he believes: “Recent developments in artificial intelligence (AI) provide an exciting prelude to the transformational capabilities of large language models, such as ChatGPT, which require significant amounts of memory and storage to operate. We are only in the very early stages of the widespread deployment of these AI technologies and potential exponential growth in their commercial use cases.”
Mehrotra said: “As more applications of this technology proliferate, we will see training workloads in the data centre supplemented with widespread inference capabilities in the data centre as well as in end user devices — all of which will drive significant growth in memory and storage consumption.”
With the generative AI market set to grow from a mere US$8.2bn in 2021 to $126.5bn by 2031 (a CAGR of roughly 32 percent), data centre owners and hyperscalers likewise understand the likely effect on demand for their services. In much the same way that mobile technology unlocked huge value in the developing world by providing access to information and services, generative AI promises radical change.
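Those market figures hang together, as a quick check of the implied compound annual growth rate shows (a sketch using only the numbers cited above):

```python
# Check the compound annual growth rate (CAGR) implied by the cited
# market sizes: $8.2bn in 2021 growing to $126.5bn by 2031.
start, end, years = 8.2, 126.5, 10

# CAGR = (end / start)^(1/years) - 1
cagr = (end / start) ** (1 / years) - 1
print(f"CAGR: {cagr:.1%}")  # 31.5% -- roughly the 32 percent cited
```

A fifteenfold market expansion over a decade is the scale of demand growth that data centre operators are planning around.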
Speaking at Data Centre Live in London earlier this month, Michael Ortiz, CEO of Layer9 Data Centres, told delegates, “I would like to say that AI and ChatGPT and all these derivative forms are an equaliser. It democratises data. It allows people in emerging markets – people who otherwise wouldn't have access to things like medical advice, legal advice, accounting – a portal to be able to source data and be able to educate themselves.”
All that extra demand will also force changes to how data centres work, according to Chris Street, managing director of Data Centres, Asia Pacific, JLL, in an interview with Techwire Asia.
“From a technical perspective, we are seeing increased power densities for IT systems that support AI applications. The surge in computing power brings challenges to existing data centres, especially older facilities not designed for these types of applications.”
It is little wonder then that hyperscalers are investing heavily in generative AI start-ups. According to PitchBook in its March 2023 Vertical Snapshot: Generative AI report, “Cloud providers … are both making significant acquisitions and striking generous partnerships to align themselves with the future of creative software.” While they reference Microsoft’s $10bn financing commitment for OpenAI and Alphabet’s VC megadeals for Anthropic and Cohere, they also note that Meta, Spotify, and Apple are investing significant amounts of capital as well.
“These investments come after a relatively quiet period for hyperscaler acquisition activity in AI, demonstrating that tech giants now require urgency to prevent disruption.”
(Image by DALL-E 2: “hardware and hyperscalers are the picks and shovels of the AI gold rush”).