Recently, global research firm TrendForce held a seminar titled “AI Era: Global Semiconductor Landscape – 2025 Technology Industry Forecast.” The seminar covered various high-tech industries, including wafer foundry, HBM, NAND Flash, AI PMIC, AI servers, panel-level packaging, liquid cooling, and AI PCs.
The rapid development of the artificial intelligence (AI) industry is a powerful force reshaping the industrial landscape, with broad and profound effects across many other fields. TrendForce's forecast examines how AI specifically affects the semiconductor industry and the new development trends it will drive.
1. Forecasting AI Industry Growth in the Wafer Foundry Market by 2025
Fueled by strong demand for high-performance computing chips over the past two years, high-performance applications have become the biggest driver of advanced processes and the overall wafer foundry industry. Starting in 2025, in addition to AI chip suppliers and CSPs developing their own chips, memory suppliers are seeking collaborations with advanced-process foundry partners to better adapt HBM and logic chips to high-performance requirements. The entire upstream and downstream ecosystem, including IP, design services, and backend testing, has become a necessary resource in the AI arms race, shifting the focus beyond advanced frontend process technologies alone.
In analyzing the overall wafer foundry industry, aside from opportunities in advanced processes, AI-driven demand for power management may bring new vitality to the historically stable mature processes. How the wafer foundry industry transforms under Cloud AI and Edge AI developments by 2025 is also a key focus. Various applications are expected to conclude their two-year inventory correction cycle in 2024. TrendForce estimates that the global wafer foundry industry's output value will grow about 20% in 2025, with TSMC continuing to lead, while foundries other than TSMC may also see nearly 12% annual growth.
2. Strong AI PMIC Demand Boosts Mature Process Development
Given ongoing global economic uncertainties, the primary growth driver for the wafer foundry industry in 2024 will come from AI server-related chips, and this influence is expected to keep advanced-process capacity utilization high through 2025. However, the recovery of mature processes at 28nm and above is relatively slow: average capacity utilization is estimated to rise by only 5 to 10 percentage points by 2025, reaching about 80%, while the average utilization rate for 8-inch wafers is around 75%, so new growth drivers are needed to fill the capacity gap.
Notably, as AI chips continue to evolve, their thermal design power (TDP) keeps rising. For instance, NVIDIA's A100 has a maximum TDP of about 400W, which rises to 700W for the H100, and the next-generation Blackwell series is expected to exceed 1,000W. Higher TDPs require more power ICs to manage power delivery, reduce energy losses, and improve overall efficiency. TrendForce estimates that demand for the Smart Power Stages (SPS) used with AI GPUs will grow sharply, by 2 to 3 times, from 2023 to 2025, becoming a new driver for mature process capacity.
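To make that scaling concrete, the rough sketch below estimates how many power stages a GPU's core voltage rail might need as TDP rises. The core-rail voltage, per-stage current rating, and derating factor are illustrative assumptions rather than TrendForce or NVIDIA figures; only the TDP values come from the text above.

```python
import math

# Back-of-the-envelope estimate of smart power stage (SPS) count versus GPU TDP.
# CORE_VOLTAGE_V, SPS_CURRENT_A, and DERATING are illustrative assumptions;
# only the TDP values are taken from the article.
CORE_VOLTAGE_V = 0.8   # assumed GPU core rail voltage
SPS_CURRENT_A = 70.0   # assumed continuous current rating per power stage
DERATING = 0.8         # assumed design margin (stages run below rated current)

def estimated_sps_count(tdp_watts: float) -> int:
    """Roughly how many power stages a given TDP implies for the core rail."""
    load_current_a = tdp_watts / CORE_VOLTAGE_V
    usable_per_stage_a = SPS_CURRENT_A * DERATING
    return math.ceil(load_current_a / usable_per_stage_a)

for name, tdp in [("A100", 400), ("H100", 700), ("Blackwell-class", 1000)]:
    print(f"{name}: ~{tdp} W TDP -> roughly {estimated_sps_count(tdp)} power stages")
```

Under these assumptions the stage count grows roughly 2 to 3 times from an A100-class part to a 1,000W-class part, consistent with the SPS demand growth cited above.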
3. AI Drives Liquid Cooling Technology into New Markets
In recent years, major U.S. cloud providers such as Google, AWS, and Microsoft have been actively building new data centers worldwide and accelerating AI server deployments. As chip performance improves, TDP rises significantly; NVIDIA's new GB200 NVL72 rack, for example, has a TDP of around 140kW and requires liquid cooling to manage heat effectively. Initially, a liquid-to-air (L2A) approach will be mainstream. With the official rollout of GB200 racks, the penetration rate of liquid cooling for AI chips is expected to rise from 11% in 2024 to 24% in 2025.
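A quick heat-balance sketch shows why a rack at this power level strains air cooling. The air properties are standard values and the 15K allowable air temperature rise is an assumption for illustration; only the ~140kW rack figure comes from the text.

```python
# Rough heat-balance check: airflow needed to remove ~140 kW from one rack.
# The 15 K inlet-to-outlet temperature rise is an assumed value for illustration.
RACK_POWER_W = 140_000   # GB200 NVL72 rack-level TDP cited above
AIR_DENSITY = 1.2        # kg/m^3, air near room temperature
AIR_CP = 1005.0          # J/(kg*K), specific heat of air
DELTA_T = 15.0           # K, assumed allowable air temperature rise

mass_flow = RACK_POWER_W / (AIR_CP * DELTA_T)   # kg/s of air required
volume_flow = mass_flow / AIR_DENSITY           # m^3/s
cfm = volume_flow * 2118.88                     # cubic feet per minute

print(f"Required airflow: {volume_flow:.1f} m^3/s (~{cfm:,.0f} CFM) per rack")
```

Under these assumptions the requirement works out to well over ten thousand CFM per rack, far more than conventional rack-level air cooling delivers, which is a simple illustration of why liquid cooling becomes necessary at these densities.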
Additionally, among cloud providers developing high-end AI ASICs, Google is the most proactive in adopting liquid cooling, while other providers still rely mainly on air cooling. As governments and regulatory bodies worldwide place growing emphasis on ESG (Environmental, Social, and Governance) principles, the transition from air to liquid cooling is expected to accelerate, with penetration rates climbing year by year. This is prompting power supply manufacturers, cooling solution providers, and system integrators to compete in the AI liquid cooling market, creating a new competitive landscape.
4. Key Developments in the Global AI Server Market and Supply Chain by 2025
Looking at global AI market dynamics, strong demand from CSPs and brand clients building out AI infrastructure is projected to drive 42% year-over-year growth in global AI server shipments (including servers equipped with GPUs, FPGAs, and ASICs) in 2024. In 2025, driven by cloud provider demand and sovereign cloud needs, AI server shipments are expected to grow about 28%, lifting AI servers' share of the overall server market to nearly 15%. In the medium to long term, as cloud AI training and inference services advance, that share is expected to approach 19% by 2027.
Among major AI chip suppliers, NVIDIA's share of the AI GPU market is expected to approach 90% in 2024. In the first half of 2025, the new Blackwell platform is projected to ramp up and become the mainstream of NVIDIA's high-end GPU shipments, with solutions ranging from pure GPUs (e.g., the B200) to the GB200, which integrates the Grace CPU, to meet diverse client needs. Other players such as AMD, Intel, and the CSPs are actively developing next-generation AI chips, which will strengthen shipment growth momentum in 2025 and further drive a roughly twofold increase in CoWoS and HBM volumes.
5. HBM Market Challenges and Future Outlook
The HBM market remains in a high-growth phase, driven by the continuous deployment of AI servers; as GPU performance and memory capacity are upgraded, HBM has become an essential component. HBM specifications and capacities keep rising, with NVIDIA's Blackwell platform adopting 192GB of HBM3e memory and AMD's MI325 reaching 288GB. However, HBM is difficult to manufacture and its yields still have considerable room for improvement, which significantly raises overall production costs; average prices are roughly three to five times those of conventional DRAM products. As HBM3e enters mass production and capacity gradually expands, its revenue contribution is expected to grow quarter by quarter.
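As a concrete illustration of how those package capacities add up, the short sketch below derives total HBM capacity from stack count and stack height. The 24Gb-per-die density and the specific 8-Hi and 12-Hi configurations are common industry figures used here as assumptions; only the 192GB and 288GB totals come from the text.

```python
# How per-package HBM capacity adds up from stack count and stack height.
# The 24 Gb HBM3e die density and the 8-Hi / 12-Hi configurations are assumptions
# for illustration; only the 192 GB and 288 GB totals appear in the article.

def hbm_capacity_gb(stacks: int, dies_per_stack: int, die_gbit: int = 24) -> float:
    """Total HBM capacity in GB for a given stack count and stack height."""
    return stacks * dies_per_stack * die_gbit / 8  # divide by 8 bits per byte

print(hbm_capacity_gb(stacks=8, dies_per_stack=8))   # 192.0 GB, e.g. eight 8-Hi stacks
print(hbm_capacity_gb(stacks=8, dies_per_stack=12))  # 288.0 GB, e.g. eight 12-Hi stacks
```

Taller stacks and denser dies drive the capacity growth, but each additional die in a stack also compounds the yield challenge noted above.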
6. NAND Flash Market Analysis: Opportunities and Challenges
After suffering significant losses in 2023, NAND Flash suppliers have adopted a more conservative approach to capital expenditure. Meanwhile, demand for DRAM and HBM products benefiting from the AI wave is expected to crowd out investment in NAND Flash in 2025, easing the previously severe oversupply.
As AI technology rapidly advances, the NAND Flash market is undergoing unprecedented transformations. The growing demand for high-speed, high-capacity storage for AI applications is fueling robust growth in the Enterprise SSD (eSSD) market.
7. Panel-Level Packaging and AI Chip Integration Possibilities
FOPLP product applications can be divided into three main categories: PMICs and RF ICs, consumer CPUs and GPUs, and AI GPUs. PMICs and RF ICs use chip-first technology and are primarily developed by OSAT players, with IDMs and panel makers entering the market to expand production scale. Consumer CPUs and GPUs adopt chip-last technology, developed by OSAT players with existing production experience and capacity, with mass production expected no earlier than 2026. AI GPUs are led by wafer foundries using chip-last technology, aiming to extend from wafer-level to panel-level packaging, with mass production expected no earlier than 2027. However, FOPLP development faces challenges, including technical bottlenecks and a diversity of panel sizes that could dilute research and development capacity. In addition, players tend to prioritize investment in FOWLP capacity, where costs are easier to recoup, before allocating resources to FOPLP.
8. Current Status and Future of AI PCs: Building a Foundation for Killer Applications
As AI technology progresses, AI features are expected to become standard in laptops. TrendForce predicts that by 2025, the market penetration rate of AI laptops will reach 21.7%, and by 2029, nearly 80% of laptops will be equipped with AI technology.
In the future, user experiences will become more intuitive through voice recognition and natural language processing, and AI-driven data analysis can provide more accurate business insights to help companies make better decisions. Although current AI applications are largely familiar use cases that still rely heavily on cloud services, the market potential remains promising as the technology matures, acceptance grows, and demand for related products increases. Breakthrough AI applications could bring new opportunities to the mature and stable laptop industry, giving consumers more valuable choices.