AI servers estimated to account for 65% of server market output value in 2024
According to the latest "AI Server Industry Analysis Report" from TrendForce, demand for high-end AI servers from large CSPs (cloud service providers) and brand customers remains strong in 2024. As CoWoS supplier TSMC and HBM (high-bandwidth memory) manufacturers such as SK hynix, Samsung, and Micron Technology gradually expand production, the supply shortage has eased considerably since the second quarter of this year. The lead time for NVIDIA's flagship H100 has dropped from the previous 40-50 weeks to under 16 weeks. TrendForce therefore estimates that AI server shipments will grow nearly 20% in the second quarter and has revised annual shipments up to 1.67 million units, an annual growth rate of 41.5%.
TrendForce said that large CSPs' budgets this year continue to focus on AI server procurement, crowding out growth in general-purpose servers; compared with the high growth rate of AI servers, general server shipments will increase only 1.9%. AI servers are expected to account for 12.2% of overall server shipments, up about 3.4 percentage points from 2023. Measured by output value, AI servers' contribution to revenue growth is even more pronounced than that of general servers: AI server output value is estimated to reach US$187 billion in 2024, a growth rate of 69%, accounting for as much as 65% of overall server market value.
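As a rough sanity check, the shipment and output-value shares above imply sizes for the overall server market. The sketch below is back-of-envelope arithmetic using only figures stated in the text; the implied totals are my own derivation, not numbers from the report.

```python
# Figures stated in the text (2024 estimates)
ai_output_value_usd_bn = 187   # AI server output value, US$ billion
ai_value_share = 0.65          # AI servers' share of total server output value
ai_shipments_mn = 1.67         # AI server shipments, million units
ai_shipment_share = 0.122      # AI servers' share of total server shipments

# Implied totals for the overall server market (derived, not from the report)
total_value_usd_bn = ai_output_value_usd_bn / ai_value_share
total_shipments_mn = ai_shipments_mn / ai_shipment_share

print(f"Implied total server output value: ~${total_value_usd_bn:.0f}B")   # ~$288B
print(f"Implied total server shipments: ~{total_shipments_mn:.1f}M units") # ~13.7M
```

The contrast between a 12.2% shipment share and a 65% value share reflects how much more expensive AI servers are per unit than general-purpose servers.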
By AI chip type, North American CSPs (such as AWS and Meta) continue to expand their self-developed ASICs (application-specific integrated circuits), while Chinese players such as Alibaba, Baidu, and Huawei are also actively expanding their own ASIC solutions, pushing the share of ASIC servers in the overall AI server market up to 26% in 2024; mainstream GPU-equipped AI servers account for about 71%.
In terms of AI chip supplier share, looking at GPU-equipped AI servers alone, NVIDIA holds the highest market share at nearly 90%, while AMD's share is only about 8%. However, counting all AI chips in AI servers, including GPUs, ASICs, and FPGAs (field-programmable gate arrays), NVIDIA's market share this year is about 64%.
According to TrendForce's survey, market demand for high-end AI servers will remain strong in 2025, especially as NVIDIA's next-generation Blackwell platform (including GB200 and B100/B200) replaces Hopper as the market mainstream, which will also drive CoWoS and HBM demand. For NVIDIA's B100, the chip size will be double that of the H100, consuming more CoWoS capacity. The CoWoS production capacity of main supplier TSMC is estimated to reach 550K-600K by the end of 2025, a growth rate of nearly 80%. As for HBM usage, the mainstream H100 in 2024 is equipped with 80GB of HBM3, while by 2025 main chips such as NVIDIA's Blackwell Ultra and AMD's MI350 will carry 288GB of HBM3e, more than tripling per-unit usage; with continued strong AI server demand, overall HBM supply is expected to double in 2025.
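The "more than three times" per-unit HBM growth claim can be verified directly from the two capacity figures given in the text; the short sketch below does only that arithmetic.

```python
# Per-unit HBM capacity figures stated in the text
hbm_2024_gb = 80    # H100 with HBM3 in 2024
hbm_2025_gb = 288   # Blackwell Ultra / MI350 with HBM3e in 2025

growth_factor = hbm_2025_gb / hbm_2024_gb
print(f"Per-unit HBM capacity grows {growth_factor:.1f}x")  # 3.6x, i.e. more than three times
```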