Gen AI Is Writing A New Credit Story For Tech Giants

The generative AI market is poised for explosive growth. The debut of ChatGPT in late 2022 revealed the potential of generative artificial intelligence (AI). Like other transformative technologies such as the internet, generative AI will require large upfront investment to develop and to commercialize.

We see clear winners and potential followers for at least the next three years. Yet the long-term implications are less certain, and depend on tech firms' adaptability to the evolving AI market, with properly paced and timed investments, good execution, and financial discipline.

From processing units to virtual assistants [chart image omitted]

Implications For Rated Tech Firms Over The Next Three Years

Early beneficiaries of this forecast investment will include:

  • Foundry leader Taiwan Semiconductor Manufacturing Co. Ltd. (TSMC);
  • Fabless chip designers NVIDIA Corp. and Broadcom Inc.;
  • Memory chip producers SK Hynix Inc. and Samsung Electronics Co. Ltd.;
  • Networking equipment provider Cisco Systems Inc.; and
  • Electronic manufacturing service (EMS) firm Foxconn Industrial Internet Co. Ltd. (Fii).

All these firms have some technological or capability leadership in the generative AI value chain. They also have ample financial resources to invest further in and expand their AI-related technologies and capabilities.

We also believe server vendors, including Lenovo Group Ltd., Dell Technologies Inc., and Hewlett Packard Enterprise Co. (HPE), will see incremental sales of generative AI servers. This is despite competition from original design manufacturers (ODMs) and the additional effort needed to recalibrate their product portfolios, which are now heavily exposed to conventional servers.

Table 1

Rating implications of generative AI are largely positive
Tech subsectors | Rated firms | Preliminary assessment on rating implication from generative AI
Fabless chip designer | NVIDIA Corp. (A+/Stable/A-1+) | Positive
Fabless chip designer | Broadcom Inc. (BBB-/Watch Pos/A-3) | Positive
Foundry | Taiwan Semiconductor Manufacturing Co. Ltd. (TSMC, AA-/Stable/--) | Positive
Memory chip | SK Hynix Inc. (BBB-/Negative/--) | Positive
Memory chip | Samsung Electronics Co. Ltd. (AA-/Stable/A-1+) | Modestly positive
Memory chip | Micron Technology Inc. (BBB-/Stable/--) | Neutral
Networking | Cisco Systems Inc. (AA-/Stable/A-1+) | Positive
Server vendors | Hewlett Packard Enterprise Co. (BBB/Stable/A-2) | Modestly positive
Server vendors | Dell Technologies Inc. (BBB/Stable/--) | Modestly positive
Server vendors | Lenovo Group Ltd. (BBB/Stable/--) | Modestly positive*
EMS/ODM | Foxconn Industrial Internet Co. Ltd. (A-/Stable/--) | Positive
*Our base case assumes that the recent U.S. controls on tech exports will not disrupt Lenovo's generative AI server business, so long as production and sales are in unrestricted overseas markets. EMS--Electronic manufacturing service. ODM--Original design manufacturer.

Rated tech firms with more to gain from generative AI are usually those at the upstream level (e.g., semiconductor producers) and those at the downstream level (e.g., hyperscale cloud service providers, or CSPs), as opposed to midstream producers (e.g., those involved in passive components and server assembly). This divergence stems from the technological strengths of semiconductor providers and end-demand concentration among CSPs, which can easily afford hefty generative AI investments.

Challenges Over The Longer Term

The longer-term implications for tech firms are less certain. The inchoate state of the generative AI market means that technologies, supply chains, and end demand are fluid and can change rapidly. Hence, the ability to stay relevant and responsive to ongoing changes with competitive technology and product offerings is critical to tech firms' market position. This requires companies to accurately pace and time AI-related investments, with good execution and financial discipline.

Investing too much too early would strand assets and waste resources. This may in turn affect key credit measures, especially for asset-intensive segments such as semiconductors. Industry-wide overinvestment could even exacerbate the high volatility of technology markets.

On the other hand, companies that fail to invest sufficiently and at the right time risk ceding technological leadership, losing market share, and seeing profitability erode as their technology becomes obsolete.

There are also external risks, such as trade tensions and restrictions. The U.S., for instance, has tightened export controls on advanced AI chips and semiconductor components to countries including China. The effects of such a policy may be immaterial for the next 12-18 months. But they could limit the growth prospects of the generative AI market over the longer term, especially given our estimate that China will make up 10%-15% of end demand, as measured by AI server procurement. (See "More Risks Ahead For Global Technology Companies As U.S. Restrictions Tighten," published Nov. 9, 2023, on RatingsDirect.)

Overall, the longer-term prospects of the generative AI market, after this early wave of investments over the next three to four years, depend on commercialization of the technology. Development of AI-enabled end applications and the opportunities to monetize them are key determinants.

Chart 2 [image omitted]

Generating Investment

Investments in generative AI will be a key growth driver for IT spending. International Data Corp. (IDC) forecasts that global investment into generative AI solutions (including software, infrastructure, and IT services) will rise to US$16 billion in 2023 and top US$143 billion by 2027.
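
As a rough illustration of the growth rate these IDC figures imply, the sketch below uses only the two endpoints and assumes smooth annual compounding (a simplification for illustration, not part of IDC's forecast):

```python
# Implied compound annual growth rate (CAGR) from IDC's generative AI spending
# forecast: US$16 bil. in 2023 rising to about US$143 bil. by 2027.
# Endpoints are from the article; smooth compounding is assumed for illustration.
start, end = 16.0, 143.0      # US$ bil., 2023 and 2027
periods = 2027 - 2023         # four compounding periods
cagr = (end / start) ** (1 / periods) - 1
print(f"Implied CAGR, 2023-2027: {cagr:.1%}")   # roughly 73% a year
```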

Chart 3 [image omitted]

From Infrastructure Investment To Peripheral Hardware

The bulk of investments should flow into infrastructure as generative AI training grows in scale and algorithmic complexity. Training involves feeding raw data into large language models (LLMs), which learn patterns from it. This entails vast amounts of advanced computing power.

Chart 4 [image omitted]

Generative AI infrastructure comprises a range of hardware categories (e.g., AI processors, networking equipment, and high-bandwidth memory). It can be 10x-30x more expensive than the conventional IT infrastructure for general-purpose data centers.

Beyond two years, we expect more investment will extend to peripheral tech hardware to support AI inference, whereby live data is applied to trained models to make predictions and conduct real-world tasks.

How We Assess Tech Subsectors

Foundries

TSMC   An expansion in the volume of processor orders for generative AI should help solidify TSMC's market leadership and diversify the foundry's revenue stream. The uptrend will boost chip fabrication and advanced semiconductor packaging--a technology that houses and integrates various semiconductor contents.

TSMC is well positioned to capture such demand, given its leadership in process technology, strong production yields, and comprehensive integrated circuit (IC) design and packaging offerings (e.g., CoWoS, or Chip on Wafer on Substrate). The foundry forecasts sales of AI-related processors will increase at a compound annual growth rate of 50% over the next five years. Under this forecast, their revenue contribution will increase to about 10%-13% from 6%. We also expect TSMC to double its CoWoS capacity by the end of 2024, given surging demand.

Meanwhile, soft non-AI end demand, which still made up more than 90% of TSMC's revenue in the first half of 2023, remains an overhang. We expect TSMC's EBITDA margin to moderate to 67.3% in 2023, mainly because of lower capacity utilization, and to recover to 69.5% in 2024, compared with 69% in 2022. This, in tandem with potential delays in overseas expansion, could drag TSMC's capex to US$32 billion in 2023 and US$30 billion in 2024, from US$35 billion in 2022. We estimate TSMC's free cash flow will be a negative new Taiwan dollar (NT$) 60 billion in 2023 but reverse to a positive NT$300 billion in 2024.

Longer term, as TSMC stays committed to expanding capacity for advanced process nodes, it may find it tough to sustain utilization and profitability if generative AI end-demand does not grow as we expect. The recent tightening of U.S. export controls on semiconductors is one such test.

Chart 5 [image omitted]

EMS

Fii  China-based Foxconn Industrial Internet Co. Ltd. (Fii) is a subsidiary of Taipei-headquartered Hon Hai Group, the world's largest EMS provider by revenue and market share. Fii should see benefits to its sales, product mix, and profitability as the generative AI boom boosts demand for AI servers. These new products enjoy a price premium over conventional servers due to customization and complex engineering.

Fii's engineering expertise, network of global production plants, and stable relationships with CSPs should enable it to capture this opportunity. Fii is the only EMS firm capable of providing full solutions for server production. It controls more than 70% of the global market for server graphics processing unit (GPU) modules and 50% of GPU baseboard manufacturing, the profitable upstream of the generative AI server value chain. We expect the company to expand downstream, such as into server assembly under the ODM-direct production model.

We forecast the gross margin of Fii's cloud-computing segment to reach 5%-6% in 2025, from 4.0% in 2022. This is owing to a change in product mix, with increasing sales of AI servers and modules. We consequently forecast Fii's overall EBITDA margin to increase to more than 5.5% in three years from 5.1% in 2022. The cloud-computing segment accounted for 42% of Fii's revenue in 2022.
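
The mix arithmetic behind that margin view can be sketched roughly. The illustration below assumes, purely as a simplification, that the 1-2 percentage-point gross-margin gain in the cloud-computing segment passes through broadly one-for-one to the group EBITDA margin and that the segment keeps its roughly 42% revenue weight:

```python
# Rough mix arithmetic for Fii's margin uplift (illustrative only).
# Segment weight and margin figures are from the article; the one-for-one
# pass-through from segment gross margin to group EBITDA margin is assumed.
segment_weight = 0.42                  # cloud computing's share of 2022 revenue
margin_gain = (0.01, 0.02)             # segment gross margin: 4.0% -> 5%-6%
uplift = [segment_weight * g for g in margin_gain]
print(f"Blended-margin uplift: {uplift[0]:.1%} to {uplift[1]:.1%}")
# About 0.4-0.8 percentage point, broadly consistent with the move from a
# 5.1% EBITDA margin in 2022 to more than 5.5% within three years.
```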

The price premium and profitability upside embedded in generative AI servers could diminish over time if rivals can ramp up competitive supplies. Such a development would help CSPs in their bid to diversify their supply chain and increase their competitiveness.

Chart 6 [image omitted]

Hon Hai  The benefits from Fii's generative AI server business are moderate for Hon Hai on a consolidated basis. This business would only make up a small portion of Hon Hai's revenue over the next two to three years (4%-5% in 2022). Key segments of the group (such as computing and smartphone assembly) could still face challenges from weak end demand and Apple Inc.'s attempts to diversify suppliers to contain supply chain and geopolitical risks. Furthermore, Hon Hai may struggle to rapidly grow its electric vehicle (EV) business, given that its existing order backlog and project pipelines are with EV start-ups. We now expect Hon Hai's revenue to drop by 8%-10% in 2023 and recover by 3%-5% in 2024.

Memory chip producers

SK Hynix, Samsung   Demand for generative AI servers, which have high memory density, may spur memory usage, particularly of DRAM (Dynamic Random Access Memory). Furthermore, generative AI servers adopt advanced DRAM solutions, including HBM3 (High Bandwidth Memory, a stacked memory technology that contains several DRAM modules) and DDR5 (Double Data Rate 5). These chips are seeing pent-up demand, sell at a notable premium, and carry much higher operating margins than regular memory chips. TrendForce forecasts that global HBM revenue could more than double in 2024 to US$8.9 billion from its 2023 level, owing to a doubling in HBM bit supply and the high average selling price of HBM3.

Table 2

Evolving AI computing entails larger memory capacity
Category | Conventional server | AI server | Future AI server
Server DRAM content | 500~600 GB | 1.2~1.7 TB | 2.2~2.7 TB
Server SSD content | 4.1 TB | 4.1 TB | 8 TB
HBM usage | -- | 320~640 GB | 512~1,024 GB
TB--Terabyte. GB--Gigabyte. Sources: TrendForce (April 2023); S&P Global Ratings.
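
As a rough illustration of the step-up in memory content, the sketch below compares the DRAM ranges in Table 2; the ranges are TrendForce's, while comparing the range endpoints is an added, illustrative calculation:

```python
# DRAM content multiples implied by Table 2 (TrendForce, April 2023).
dram_gb = {
    "conventional server": (500, 600),
    "AI server": (1200, 1700),
    "future AI server": (2200, 2700),
}
base_lo, base_hi = dram_gb["conventional server"]
for name in ("AI server", "future AI server"):
    lo, hi = dram_gb[name]
    # widest plausible range: low AI content vs. high base, and vice versa
    print(f"{name}: {lo / base_hi:.1f}x-{hi / base_lo:.1f}x the DRAM of a conventional server")
# AI server: ~2.0x-3.4x; future AI server: ~3.7x-5.4x
```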

Chart 7 [image omitted]

We view the Korean memory makers SK Hynix and Samsung as better positioned than their U.S. peer Micron over the next two to three years to benefit from the generative AI boom. SK Hynix was the first mover in mass-producing HBM3 and was the sole HBM3 supplier to NVIDIA as of the third quarter of 2023. Solid sales of HBM3 and DDR5 fueled a turnaround of SK Hynix's DRAM business in the third quarter of 2023 after two quarters of losses. We expect Samsung to ramp up its HBM3 chip supply in the fourth quarter of 2023. We see SK Hynix and Samsung as likely key providers of AI server memory chips from 2024, implying upside to their product mix and profitability.

Micron  Micron trails its competitors in the generative AI race. We don't expect it to achieve meaningful HBM3 revenues until the first quarter of 2024. A late start, even if only by a few months, suggests Micron will face additional tests in trying to rapidly grow its market share. This is because early movers may have already agreed on certain supplies with key customers such as NVIDIA. Despite that, we still expect Micron to grab some share of the market because NVIDIA wants supplier diversity as the business grows. Given Micron's heavier reliance on DRAM versus NAND, we believe this high-growth, high-margin AI business will add to Micron's overall financial performance. However, we still expect it to lag its peers over the next two to three years.

Fabless chip designers

NVIDIA  We upgraded NVIDIA on June 5, 2023, to 'A+' from 'A' to reflect its stronger market position as the biggest beneficiary of the rapid generative AI investments by cloud providers and enterprises. We now view NVIDIA's market opportunity from generative AI as immediate and massive, and forecast its data center segment will triple in scale over the next four years, through fiscal 2027 (ending January 2027). This is because cloud providers are prioritizing GPU spending to enable generative AI and large language models.
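
For context, a tripling over four fiscal years implies annual growth of a little over 30%, as the minimal calculation below shows (assuming smooth compounding, a simplification for illustration):

```python
# Annual growth implied by NVIDIA's data center segment tripling in scale
# over four years through fiscal 2027 (the tripling is from the article;
# smooth compounding is assumed for illustration).
multiple, years = 3.0, 4
implied_growth = multiple ** (1 / years) - 1
print(f"Implied annual growth: {implied_growth:.1%}")   # about 31.6%
```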

NVIDIA has a meaningful competitive edge over its rivals. This stems from its design skills across computing and networking silicon, leading process technology through TSMC, and a full-stack ecosystem of software and services to enable deployment of AI-related models. NVIDIA enjoys high barriers to entry, given the increasing technological complexity of GPUs and the company's significant research and development spending (US$7 billion in fiscal 2023, or 27% of revenue). A reflection of this is its more than 90% share of the GPU market.

NVIDIA does, however, face emerging tests. The massive growth opportunity in generative AI and the high price of NVIDIA's solutions have driven customers to increasingly invest in their own generative AI solutions or seek alternative supplies. Examples are Amazon's self-developed AI chips Trainium and Inferentia, and Google's Tensor Processing Unit. Advanced Micro Devices Inc., a competitor of NVIDIA, is also catching up with heavy investment and is a credible secondary supplier.

Although NVIDIA has clear leadership in hardware for complex "training" applications, it could face higher competition in inference applications beyond two to three years. Proliferating AI inference use cases bring new opportunities for competitors to provide highly customized solutions. As is the case in most technological evolutions, we expect NVIDIA to gradually cede market share and for its profit margin to compress as credible alternatives emerge.

Broadcom  Generative AI computing should drive Broadcom's sales growth in fiscal 2024 and beyond and provide a solid offset to its more volatile non-AI semiconductor businesses. Continued generative AI infrastructure buildout by hyperscalers should boost demand for Broadcom's high-speed networking and custom application-specific integrated circuit (ASIC) solutions. We expect the company's AI-related revenues will amount to 20%-25% of its semiconductor solutions segment sales. This base-case forecast may not fully capture the growth potential of the AI-related business, given some uncertainty over order flow beyond fiscal 2024.

We expect shipments of Broadcom's AI-related products to grow to about US$4.5 billion in fiscal 2023 from about US$2.5 billion in fiscal 2022. This should help drive the company's revenue up by 8% year on year. Such growth is a moderation from the stronger rises of the past two years but remains solid amid unfavorable industry and macroeconomic conditions, which are squeezing non-AI product sales.
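
As a rough cross-check of how much of that 8% growth the AI ramp alone could explain, the sketch below pairs the AI shipment figures with an assumed fiscal 2022 revenue base of about US$33 billion (an assumption for illustration; this report does not state Broadcom's total revenue):

```python
# Share of Broadcom's ~8% revenue growth attributable to AI products
# (illustrative; the ~US$33 bil. fiscal 2022 revenue base is assumed).
ai_fy22, ai_fy23 = 2.5, 4.5     # US$ bil., AI-related product shipments
revenue_fy22 = 33.0             # US$ bil., assumed fiscal 2022 revenue base
contribution = (ai_fy23 - ai_fy22) / revenue_fy22
print(f"AI contribution to growth: about {contribution:.1%} of revenue")
# Roughly 6 of the ~8 percentage points of year-on-year growth, under that assumption.
```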

Networking equipment providers

Cisco, NVIDIA  InfiniBand gained significant traction in the early stages of AI networking buildouts because of its low latency, which generative AI workloads require. However, we believe its advantage stems mainly from the fact that it is embedded in NVIDIA's systems. We believe ethernet switches will eventually take share from InfiniBand in AI data centers because of their larger existing ecosystem, power efficiency, and scalability.

Additionally, as generative AI extends from training to inferencing, the adoption of ethernet networking equipment for these workloads should grow.

Arista (unrated) is a major provider of ethernet networking switches to hyperscalers such as Microsoft and Meta and is aggressively growing its presence in enterprise data centers. Cisco forecasts the total addressable market for AI/ML (machine learning) switching will grow to US$8.5 billion in 2027 from US$2.1 billion in 2023. This compares with Cisco's US$57 billion in revenue in fiscal 2023 (ended July). Ethernet accounts for about 25% of that market in 2023; Cisco expects this share to rise to up to 75% by 2027.
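
Combining Cisco's TAM forecast with the ethernet share estimates gives a sense of the ethernet-specific opportunity. The figures below come from the paragraph above; multiplying them and assuming smooth compounding is an added, illustrative calculation:

```python
# Ethernet slice of the AI/ML switching TAM implied by the figures above.
tam_2023, tam_2027 = 2.1, 8.5                # US$ bil.
eth_share_2023, eth_share_2027 = 0.25, 0.75  # using the top of "up to 75%"
eth_2023 = tam_2023 * eth_share_2023         # ~US$0.5 bil.
eth_2027 = tam_2027 * eth_share_2027         # ~US$6.4 bil.
cagr = (eth_2027 / eth_2023) ** (1 / 4) - 1
print(f"Ethernet AI/ML switching: US${eth_2023:.1f} bil. -> US${eth_2027:.1f} bil. "
      f"(~{cagr:.0%} a year)")
```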

While we view this opportunity as incremental, rather than transformative, over the next few years, Arista's and Cisco's early lead in ethernet networking equipment for generative AI workloads gives them a catalyst for growth. We view favorably the adoption of ethernet networking equipment for generative AI workloads, but we anticipate InfiniBand will remain a competitive solution, given the likelihood of NVIDIA integrating its GPU modules with InfiniBand in its future product roadmaps.

Server vendors

Lenovo, Dell, HPE  The expanding addressable market for generative AI will create solid demand for AI servers. Spending on this new hardware may crowd out some traditional data center investments in certain instances, but we view the overall AI-related opportunity as one that will add to the general server market rather than take away from it. Compared with dominant players in some tech subsectors, server vendors need to make extra effort to profit from the generative AI boom. Efforts include recalibrating product portfolios that are heavily exposed to enterprise and general servers, as well as managing competition from large ODM-direct manufacturers such as Fii, Quanta Computer Inc., and Wiwynn Corp.

Nevertheless, this secular trend should favor Lenovo's Infrastructure Solutions Group (ISG; 15% of total revenues in fiscal 2023). AI-related infrastructure accounts for about 20% of the segment's annual sales, or roughly US$2 billion, and we believe such AI-centric revenues will grow robustly in line with gains in market share. Helping to enable this are Lenovo's focus and investments in its fast-growing ODM+ business (hardware solutions for cloud service providers) and its global manufacturing presence. The company has unveiled a US$1 billion investment in AI-related solutions over the next three years. With such investments, we forecast Lenovo's adjusted net debt-to-EBITDA ratio at about 0.7x-0.8x over this period.
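
As a quick consistency check of those segment figures, the sketch below assumes group revenue of roughly US$62 billion for fiscal 2023 (an assumption for illustration; this report cites only the percentages):

```python
# Cross-check of the Lenovo ISG figures (illustrative; the ~US$62 bil. group
# revenue for fiscal 2023 is an assumption, not stated in the article).
group_revenue = 62.0        # US$ bil., assumed fiscal 2023 group revenue
isg_share = 0.15            # ISG at ~15% of total revenues
ai_share_of_isg = 0.20      # ~20% of ISG sales from AI-related infrastructure
ai_revenue = group_revenue * isg_share * ai_share_of_isg
print(f"Implied AI-related infrastructure sales: ~US${ai_revenue:.1f} bil.")
# About US$1.9 bil., in line with the roughly US$2 billion cited above.
```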

Our current base case for Lenovo assumes that the latest U.S. tech restrictions will not sever its access to advanced chips for server production, as long as it supplies overseas markets. However, because the company is headquartered in both Beijing, China, and North Carolina, U.S., it remains uncertain whether and how the U.S. restrictions will affect it. Restricted access to advanced chips could hinder the company's diversification away from PCs. See "More Risks Ahead For Global Technology Companies As U.S. Restrictions Tighten," Nov. 9, 2023.

Dell recently raised its ISG revenue growth target based largely on expected strong demand from generative AI. Dell reported US$2 billion in AI server backlog, mostly from tier-2 cloud providers, in its last earnings call. Over the next three to five years, Dell should generate consistent AI-related demand from its installed base of enterprise customers as they seek to use generative AI to train models on sensitive data in their own data centers, behind the firewall. While hyperscalers are ahead of enterprises when it comes to spending on generative AI, we believe enterprises will maintain a hybrid cloud approach, especially when it comes to proprietary data.

HPE also reported strong generative AI orders exiting fiscal 2023. Much like Dell, it should reap additional revenue opportunities in fiscal 2024, given its strong positioning in high-performance computing (supercomputers). Its early start in the as-a-service model (HPE GreenLake) should also provide AI-related opportunities as enterprise customers rent computing power through HPE.

The generative AI race will be a long one. Keeping pace with changes in complexity and seizing on the opportunities will be crucial.

Editor: Lex Hall

Digital design: Halie Mustow

Related Research

(Editor's note: This is the final of a three-part series, and follows our Nov. 2 publication, "China's Chip 'Moon Shot'—The Response To Restrictions" and our Nov. 9 publication, "More Risks Ahead For Global Technology Companies As U.S. Restrictions Tighten.")

This report does not constitute a rating action.

Primary Credit Analyst: Hins Li, Hong Kong + 852 2533 3587;
hins.li@spglobal.com
Secondary Contacts: David T Tsui, CFA, CPA, San Francisco + 1 415-371-5063;
david.tsui@spglobal.com
Andrew Chang, San Francisco + 1 (415) 371 5043;
andrew.chang@spglobal.com
David L Hsu, Taipei +886-2-2175-6828;
david.hsu@spglobal.com
Clifford Waits Kurz, CFA, Hong Kong + 852 2533 3534;
clifford.kurz@spglobal.com
Ji Cheong, Hong Kong +852 25333505;
ji.cheong@spglobal.com

No content (including ratings, credit-related analyses and data, valuations, model, software, or other application or output therefrom) or any part thereof (Content) may be modified, reverse engineered, reproduced, or distributed in any form by any means, or stored in a database or retrieval system, without the prior written permission of Standard & Poor’s Financial Services LLC or its affiliates (collectively, S&P). The Content shall not be used for any unlawful or unauthorized purposes. S&P and any third-party providers, as well as their directors, officers, shareholders, employees, or agents (collectively S&P Parties) do not guarantee the accuracy, completeness, timeliness, or availability of the Content. S&P Parties are not responsible for any errors or omissions (negligent or otherwise), regardless of the cause, for the results obtained from the use of the Content, or for the security or maintenance of any data input by the user. The Content is provided on an “as is” basis. S&P PARTIES DISCLAIM ANY AND ALL EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, ANY WARRANTIES OF MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE OR USE, FREEDOM FROM BUGS, SOFTWARE ERRORS OR DEFECTS, THAT THE CONTENT’S FUNCTIONING WILL BE UNINTERRUPTED, OR THAT THE CONTENT WILL OPERATE WITH ANY SOFTWARE OR HARDWARE CONFIGURATION. In no event shall S&P Parties be liable to any party for any direct, indirect, incidental, exemplary, compensatory, punitive, special or consequential damages, costs, expenses, legal fees, or losses (including, without limitation, lost income or lost profits and opportunity costs or losses caused by negligence) in connection with any use of the Content even if advised of the possibility of such damages.

Credit-related and other analyses, including ratings, and statements in the Content are statements of opinion as of the date they are expressed and not statements of fact. S&P’s opinions, analyses, and rating acknowledgment decisions (described below) are not recommendations to purchase, hold, or sell any securities or to make any investment decisions, and do not address the suitability of any security. S&P assumes no obligation to update the Content following publication in any form or format. The Content should not be relied on and is not a substitute for the skill, judgment, and experience of the user, its management, employees, advisors, and/or clients when making investment and other business decisions. S&P does not act as a fiduciary or an investment advisor except where registered as such. While S&P has obtained information from sources it believes to be reliable, S&P does not perform an audit and undertakes no duty of due diligence or independent verification of any information it receives. Rating-related publications may be published for a variety of reasons that are not necessarily dependent on action by rating committees, including, but not limited to, the publication of a periodic update on a credit rating and related analyses.

To the extent that regulatory authorities allow a rating agency to acknowledge in one jurisdiction a rating issued in another jurisdiction for certain regulatory purposes, S&P reserves the right to assign, withdraw, or suspend such acknowledgement at any time and in its sole discretion. S&P Parties disclaim any duty whatsoever arising out of the assignment, withdrawal, or suspension of an acknowledgment as well as any liability for any damage alleged to have been suffered on account thereof.

S&P keeps certain activities of its business units separate from each other in order to preserve the independence and objectivity of their respective activities. As a result, certain business units of S&P may have information that is not available to other S&P business units. S&P has established policies and procedures to maintain the confidentiality of certain nonpublic information received in connection with each analytical process.

S&P may receive compensation for its ratings and certain analyses, normally from issuers or underwriters of securities or from obligors. S&P reserves the right to disseminate its opinions and analyses. S&P's public ratings and analyses are made available on its Web sites, www.spglobal.com/ratings (free of charge), and www.ratingsdirect.com (subscription), and may be distributed through other means, including via S&P publications and third-party redistributors. Additional information about our ratings fees is available at www.spglobal.com/usratingsfees.

 
