CPUs have long dominated the semiconductor landscape, but GPUs are staging a quiet insurrection: the advent of artificial intelligence is propelling them to exponential growth far beyond the traditional gaming market.

Manufacturers of graphics processing units (GPUs) are in luck. GPU sales are on the rise and expected to keep increasing, boosted by rocketing demand from a widening range of economic segments, including automotive, electronics, finance, media and entertainment, healthcare and industrial. Enterprises in all these sectors are hoping to leverage artificial intelligence (AI) to widen their competitive advantages and accelerate innovation and new product development. Over the next decade, the GPU market is projected to be the hottest segment of the semiconductor industry, according to forecasters who predict it will expand at a sizzling 33.5 percent compound annual growth rate between 2022 and 2030.

If the forecasts, from Reports Insights and other sources, prove accurate, GPU sales may by 2030 represent a whopping 45 percent of the $1 trillion estimated value of the entire semiconductor industry. The researchers estimate sales of GPUs will increase to $450 billion by the end of this decade, from just $44.7 billion in 2022. This will intensify competition in the segment, although observers said they expect Nvidia Corp. to remain the market leader. The company’s market share in the total GPU segment, which includes PCs and gaming, is currently estimated at more than 80 percent, with Advanced Micro Devices in second place and Intel Corp. a distant third, according to figures from Jon Peddie Research. New players are crowding into the sector, lured by promises of rapid growth and profitability.
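The headline figures are mutually consistent, as a quick compound-growth calculation shows. The sketch below is purely illustrative (the formula and inputs come from the forecasts cited above, not from the researchers’ own models):

```python
# Compound annual growth: future = present * (1 + rate) ** years
base_2022 = 44.7      # GPU market in 2022, $ billions (Reports Insights)
cagr = 0.335          # forecast 33.5 percent compound annual growth rate
years = 2030 - 2022   # eight-year horizon

projected_2030 = base_2022 * (1 + cagr) ** years
print(f"Projected 2030 GPU market: ${projected_2030:.0f}B")

# Implied share of an estimated $1 trillion semiconductor industry
share = projected_2030 / 1000
print(f"Share of semiconductor industry: {share:.0%}")
```

Compounding $44.7 billion at 33.5 percent for eight years lands at roughly $451 billion, matching the $450 billion projection and the quoted 45 percent share of a $1 trillion industry.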

A slowdown in demand for GPUs is not expected anytime soon due to the expansion of interest from new economic segments. Researchers at Spherical Insights said they see the market continuing to grow through 2032, rising to $594.2 billion. AI, cloud and edge computing are driving the market expansion, they said. “The growth is driven by the continuous demand for computing devices, servers, and networking equipment across various industries,” said the Spherical Insights analysts, in a statement. “The expansion of emerging technologies, including artificial intelligence, Internet of Things (IoT), and edge computing, further fuels the demand for robust hardware infrastructure. Additionally, the need for reliable hardware components in data centers and the growing adoption of cloud services also contribute to the expected growth of the hardware segment.”

GPUs have been around for decades, but they are attracting fresh interest now, as Jon Peddie, a graphics industry veteran and analyst, points out in a newly published book, “The History of the GPU – New Developments.” In it, Peddie notes that the areas where GPUs are used have expanded to include “supercomputers, PCs, smartphones and tablets, wearables, game consoles and handhelds, TVs and every type of vehicle.” For decades, however, GPUs toiled in the shadow of their more famous rival, the central processing unit (CPU), championed by companies like Intel, which notes on its website that the CPU “is suited to a wide variety of workloads, especially those for which latency or per-core performance are important,” adding this “makes it uniquely well equipped for jobs ranging from serial computing to running databases.”

But even Intel sees opportunities in the GPU market and is a growing player in the segment. The company regularly unveils new GPU products, many of them aimed at the AI market, as it pushes for a bigger role in AI. “Today, GPUs run a growing number of workloads, such as deep learning and artificial intelligence (AI). A GPU or other accelerators are ideal for deep learning training with neural network layers or on massive sets of certain data, like 2D images,” Intel noted in a statement on its website. “Deep learning algorithms were adapted to use a GPU-accelerated approach. With acceleration, these algorithms gain a significant boost in performance and bring the training time of real-world problems to a feasible and viable range. The combination of CPU and GPU, along with sufficient RAM, offers a great testbed for deep learning and AI.”

AI boost

GPUs have invaded even the data center market, where CPUs once reigned supreme. In fact, GPUs are suddenly hogging the spotlight, courtesy of the world’s huge interest in everything related to artificial intelligence, according to market observers. Once confined to the gaming world, GPUs are increasingly being integrated into other applications. They are now used as an alternative to CPUs in areas where they can “meet the requirements of demanding computations as well as manage cost savings on hardware and electricity,” according to research aggregator Reports Insights. “The growing demand for cross-platform gaming results in the large adoption of GPUs in terms of high computability and performance on various devices and operating systems,” the researchers added. “Thus, the current status of GPUs is in terms of continuous advancements and improvements due to regular launches of the latest models by companies such as Nvidia, AMD and Intel.”

The race to inject GPUs into AI applications began more than 10 years ago, as companies like Nvidia searched for new ways to gain a bigger share of the semiconductor market. Industry executives were promoting a new form of data processing they termed “accelerated computing,” which they saw as a way to loosen Intel Corp.’s hard grip on the processor market. Nvidia called AI the “final frontier” of the computing world, where opportunities in “deep learning” and research simulations could be further explored by researchers and businesses. That frontier was breached years ago when Nvidia began pushing the concept of GPU-accelerated computing, according to company executives.

In a presentation in 2016 at New York University, Jensen Huang, chairman and CEO of Nvidia, explained why he thought GPUs were better suited for advancing the attainment of AI-level programming. “Computer programs contain commands that are largely executed sequentially,” Huang said. “Deep learning is a fundamentally new software model where billions of software-neurons and trillions of connections are trained, in parallel. Running DNN [Deep Neural Networks] algorithms and learning from examples, the computer is essentially writing its own software. This radically different software model needs a new computer platform to run efficiently. Accelerated computing is an ideal approach and the GPU is the ideal processor.”
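The contrast Huang draws between sequential commands and massively parallel training can be sketched in a few lines. The snippet below is an illustration only (it is not Nvidia’s code, and NumPy’s vectorized math stands in for a GPU kernel): it computes one dense neural-network layer first as a sequential loop of multiply-adds, the classic CPU style, and then as a single matrix-vector product whose independent operations are exactly what a GPU spreads across thousands of cores.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(512)          # input activations
W = rng.standard_normal((256, 512))   # weights of one dense layer

# Sequential style: one output neuron at a time, one multiply-add at a time.
out_seq = np.zeros(256)
for i in range(256):
    for j in range(512):
        out_seq[i] += W[i, j] * x[j]

# Parallel style: the same layer as one matrix-vector product. Every
# multiply-add is independent of the others, so the work can be executed
# simultaneously rather than in program order.
out_par = W @ x

# Both formulations produce the same result; only the execution model differs.
assert np.allclose(out_seq, out_par)
```

The point of the example is structural: because the result does not depend on the order of the individual operations, the computation maps naturally onto parallel hardware, which is the property Huang argues makes the GPU “the ideal processor” for deep learning.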

While development work and testing continued, it took several more years for GPU-centric deep learning to become known beyond researchers and academics to general consumers. The explosive introduction last year of ChatGPT and other generative AI innovations has now increased awareness of the technology and exposed its utility in areas beyond academia and research. It has also drawn greater attention to the role GPU vendors play in developing the technology and in running data centers, according to analysts.

Competition grows

The GPU market first contracted at the beginning of the century, when the segment went through a round of consolidation that reduced the number of players. Peddie noted in an earlier book, “The History of the GPU – Eras and Environment,” that the number of suppliers in the segment “peaked” in 1998, even as demand for the processors, especially in the PC segment, continued to rise. AMD and Nvidia led the consolidation of the segment. AMD purchased ATI Technologies Inc. in 2006 and added Xilinx to its list of acquisitions in 2022. Although Xilinx was not a supplier of GPUs, its FPGA operation complemented AMD’s offerings in adaptive computing, as CEO Lisa Su said in a statement announcing the closing of the transaction in early 2022.

“The acquisition of Xilinx brings together a highly complementary set of products, customers and markets combined with differentiated IP and world-class talent to create the industry’s high-performance and adaptive computing leader,” Su said. “Xilinx offers industry-leading FPGAs, adaptive SoCs, AI engines and software expertise that enable AMD to offer the strongest portfolio of high-performance and adaptive computing solutions in the industry and capture a larger share of the approximately $135 billion market opportunity we see across cloud, edge and intelligent devices.” 

Nvidia too has been active in the M&A market. The company has made more than 20 acquisitions since it was founded in 1993. In 2002, it purchased the intellectual assets of rival graphics chipmaker 3dfx and added Hybrid Graphics in 2006. Its biggest purchase was the $6.9 billion acquisition of Mellanox Technologies, first announced in 2019. The transaction closed in April 2020 and added critical high-performance computing technology to the company’s offerings. Many of Nvidia’s other acquisitions also brought significant software innovations that were incorporated into its current product portfolio. 

“With Mellanox, the new NVIDIA has end-to-end technologies from AI computing to networking, full-stack offerings from processors to software, and significant scale to advance next-generation data centers,” said Nvidia’s Huang, in a statement announcing the completion of the Mellanox purchase in April 2020. “Our combined expertise, supported by a rich ecosystem of partners, will meet the challenge of surging global demand for consumer internet services, and the application of AI and accelerated data science from cloud to edge to robotics.”

The three-horse GPU race among AMD, ATI and Nvidia ended with AMD’s purchase of ATI, but other fronts in the GPU war have since opened, with Intel now one of the major contenders in the sector. Other companies are also gunning for a share of the market, offering GPU IP and software, especially for mobile handset applications; these include Imagination Technologies and many new entrants from China. The expected growth spurt and the high valuation accorded to Nvidia, which recently crossed the $1 trillion market capitalization level, could prompt additional players to establish a presence in the GPU market, according to Reports Insights.

“The highly competitive market of graphics processors involves established market players such as Intel, Nvidia and AMD,” the researcher said. “Such market players continuously innovate and expand their market shares through technological advancements due to increased demand for high performance, power efficiency, and increased functionality of GPUs. Price competition and shifting consumer preferences are also significant factors that influence the overall market’s competitiveness.”

Future of GPUs

The huge sales growth predicted for GPUs will keep the segment active for a long time and attract new players as well as additional investments from current market players. Demand for GPUs is expected to remain strong for the foreseeable future, helped by growing interest from companies that are developing their own deep learning AI accelerators, according to observers. Data center and cloud services companies are also active in this segment, and their purchases will help fuel demand for the processors for years to come. Demand is also rising from the automotive, industrial and medical markets, in addition to the traditional gaming and entertainment segments, sources said.

“These technologies rely on powerful GPUs for processing and rendering graphics, analyzing sensor data, and enabling intelligent decision-making,” said Spherical Insights in its report. “The automotive industry’s focus on enhancing user experiences, improving safety, and developing autonomous vehicles creates a strong demand for GPUs.”

www.mouser.com