Booming AI Market Sparks Surge in Server and Storage Component Revenues
The data centre component market is undergoing a significant shift: hyperscalers, with their custom solutions, are rapidly gaining ground, according to the latest Dell'Oro Group report.
NVIDIA led all vendors in data centre component revenues in 2024, with GPU systems built on its Hopper platform serving as the primary growth driver. Samsung and SK Hynix trailed NVIDIA over the same period.
The year 2024 also saw a record USD 244 billion in server and storage component revenues, driven primarily by GPUs, custom accelerators, HBM, and storage solutions tailored for AI workloads. Revenues from GPUs, custom accelerators, HBM, high-capacity SSDs, and NICs for back-end networks are projected to post strong double-digit growth in 2025.
The leading US cloud service providers, including Google, Amazon, and Microsoft, significantly increased their adoption of accelerated servers in 2024. AMD has also gained traction with major cloud service providers, particularly with its Instinct platform. The CPU market is projected to grow by 45% in 2025, marking a recovery.
Smart NIC and DPU revenues increased by 77% in 2024, driven by strong deployment of Ethernet adapters in AI clusters. Pricing for memory and storage in the general-purpose server market improved throughout the year, indicating a strong rebound.
The server and storage systems component market is forecast to grow over 40% in 2025. The report forecasts demand for server and storage semiconductors and components based on shipments to hyperscale cloud service providers and the rest of the market.
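For a rough sense of scale, the sketch below combines the two figures cited above: the record USD 244 billion 2024 total and the "over 40%" growth forecast for 2025. It is an illustration only; the report does not state a 2025 dollar figure, and applying the lower bound of the forecast to the full 2024 base is an assumption.

```python
# Rough, illustrative projection of 2025 server and storage component revenue.
# Assumes the "over 40%" forecast applies to the full USD 244 B 2024 base;
# the actual 2025 figure is not published in the report.
revenue_2024_usd_bn = 244   # record 2024 revenue, USD billions
growth_2025 = 0.40          # lower bound of the forecast growth rate

revenue_2025_usd_bn = revenue_2024_usd_bn * (1 + growth_2025)
print(f"Implied 2025 revenue: at least ~USD {revenue_2025_usd_bn:.0f} B")
# -> Implied 2025 revenue: at least ~USD 342 B
```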
The report tracks revenue, unit and capacity shipments, unit and capacity pricing, and market share of major semiconductor and component manufacturers from 2018 onward. It includes demand forecasts for technologies such as CPUs, accelerators (GPUs, FPGAs, and custom AI ASICs), Ethernet and InfiniBand NICs, HDDs, and NAND/SSDs.
Going forward, the top US cloud providers will likely rely on NVIDIA and AMD GPUs alongside their own custom AI ASICs for efficient AI processing. These companies are increasingly deploying custom AI ASICs, such as Google's TPU and Amazon's Inferentia, to tailor their cloud infrastructure to AI workloads.
The Dell'Oro Group's Data Center IT Semiconductors and Components Quarterly Report is a comprehensive resource for understanding the trends and forecasts in this rapidly evolving market.