
Micron’s AI Drive: Is It Overtaking Nvidia in the Memory Chip Market?

Artificial intelligence (AI) has transformed major industries including healthcare, finance, retail, automotive, and manufacturing. NVIDIA Corporation (NVDA) has been at the forefront of advancing AI with its graphics processing units (GPUs). These GPUs are essential for training large language models (LLMs) such as the ones behind OpenAI’s ChatGPT, and the company’s sales and earnings have grown tremendously.

As a result, NVDA’s stock price has risen nearly 148% over the past six months and more than 205% over the past year. This exceptional performance has pushed Nvidia’s market capitalization above $3 trillion, making it the second most valuable company in the United States.

But another prominent semiconductor company, Micron Technology, Inc. (MU), known for its innovative memory and storage solutions, is experiencing remarkable growth as AI adoption rapidly expands.

We explore how the ongoing AI boom is fueling Micron’s impressive growth and assess whether Micron can overtake Nvidia in the memory chip market.

Micron’s Solid Q3 Financials and Optimistic Outlook

MU reported revenue of $6.81 billion for its fiscal third quarter ended May 30, 2024, beating analysts’ expectations of $6.67 billion and up from $5.82 billion in the previous quarter and $3.75 billion in the same period last year. Strong AI demand and solid execution drove exceptional revenue growth that exceeded the company’s guidance range for the quarter.

Micron’s non-GAAP gross profit was $1.92 billion, up from $1.16 billion in the prior quarter and a loss of $603 million in the third quarter of fiscal 2023. Non-GAAP operating income was $941 million, up from $204 million in the prior quarter and a loss of $1.47 billion in the same period of fiscal 2023.

Additionally, the company reported non-GAAP net income and earnings per share of $702 million and $0.62, respectively, compared to a net loss of $1.57 billion and a loss per share of $1.43 in the same quarter last year. EPS beat the consensus estimate of $0.53.

MU’s adjusted free cash flow was $425 million, compared to negative $29 million in the prior quarter and negative $1.36 billion in the same quarter of fiscal 2023. The company ended the quarter with cash, liquid investments, and restricted cash of $9.22 billion.

“We are expanding share in high-margin products like high-bandwidth memory (HBM), and our data center SSD revenue reached a record high, demonstrating the strength of our AI product portfolio across DRAM and NAND. We look forward to the expanding AI-based opportunity ahead and are well-positioned to achieve significant revenue growth in fiscal 2025,” said Sanjay Mehrotra, president and CEO of Micron Technology.

Micron expects revenue of $7.6 billion, plus or minus $200 million, for the fourth quarter of fiscal 2024. The midpoint of that guidance range ($7.6 billion) represents a roughly 90% increase over the same period last year. Non-GAAP gross margin is expected to be 34.5%, plus or minus 1%. The company also expects non-GAAP earnings per share of $1.08, plus or minus $0.08, reversing a loss of $1.07 per share in the same period last year.
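As a rough illustration of that guidance arithmetic, here is a minimal sketch in Python using only the figures quoted above; the year-ago quarter revenue shown is implied by the roughly 90% growth figure rather than reported in this article.

# Minimal sketch of the guidance arithmetic quoted above.
# Inputs come from the guidance itself; the year-ago quarter revenue is
# implied by the ~90% growth rate, not reported in this article.
midpoint_revenue = 7.6                        # $ billions, guided midpoint
implied_prior_year = midpoint_revenue / 1.9   # ~90% YoY growth implies ~$4.0 billion
eps_low, eps_high = 1.08 - 0.08, 1.08 + 0.08  # guided non-GAAP EPS range
print(f"Implied year-ago quarter revenue: ~${implied_prior_year:.1f} billion")
print(f"Guided non-GAAP EPS range: ${eps_low:.2f} to ${eps_high:.2f}")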

A Crucial Role in the AI Ecosystem

The success of MU in the AI ecosystem is largely driven by high-bandwidth memory (HBM) chips, which are essential for high-performance computing (HPC), GPUs, AI, and other data-intensive applications. These chips provide fast and efficient memory access to process large amounts of data quickly.

Micron sold more than $100 million worth of its HBM3E chips in the third quarter alone. The company also expects HBM3E revenue to grow from “hundreds of millions of dollars” in fiscal 2024 to “billions of dollars” in fiscal 2025.

Earlier this year, the company began mass production of its HBM3E solution for use in Nvidia’s latest AI chips. Micron’s 24GB 8H HBM3E will be part of the NVIDIA H200 Tensor Core GPU.

In addition, Micron’s DRAM (dynamic random access memory) and NAND flash memory are critical components in AI applications. In June, MU began sampling its next-generation GDDR7 graphics memory for AI, gaming, and HPC workloads. Leveraging Micron’s 1β (1-beta) DRAM technology and an advanced architecture, GDDR7 delivers 32 Gb/s high-performance memory in a power-optimized design.

On May 1, the company achieved an industry milestone by completing validation and shipment of its 128GB DDR5 32Gb-based server DRAM, addressing the growing demand for extreme speed and capacity in memory-intensive generative AI applications. The 128GB DDR5 RDIMM memory, powered by Micron’s 1β technology, delivers more than 45% higher bit density, up to 22% better energy efficiency, and up to 16% lower latency than competing 3DS TSV (through-silicon via) products.

AI-Driven Demand for Smartphones, PCs, and Data Centers

AI is driving strong demand for memory chips across a range of sectors, including smartphones, personal computers (PCs), and data centers. In a recent earnings conference call, Micron executives noted that AI-enabled PCs are expected to have 40% to 80% more DRAM content and larger storage capacities than today’s PCs. Likewise, this year’s AI-enabled smartphones will feature 50% to 100% more DRAM than last year’s flagship models.

This trend points to a bright future for the global memory chip market. According to a report from The Business Research Company, the market is expected to reach $130.42 billion by 2028, growing at a CAGR of 6.9%.

Micron’s Competitive Advantage and Attractive Valuation Relative to Nvidia

MU is expected to outpace Nvidia’s growth next fiscal year, even though NVDA’s revenue is projected to roughly double from $60.9 billion last fiscal year to about $120 billion this year. Micron’s revenue could grow about 50% year-over-year next fiscal year, outpacing Nvidia’s projected growth rate of 33.7%.

In terms of forward non-GAAP P/E (FY2), MU is currently trading at 13.76x, 60.9% lower than NVDA’s 35.18x. MU’s forward EV/Sales and EV/EBITDA multiples of 5.98x and 16.44x are also well below NVDA’s 26.04x and 40.56x, respectively. MU’s trailing-12-month price-to-book multiple is likewise significantly lower at 3.28x, compared to NVDA’s 64.15x.
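As a quick sanity check on the quoted discount, the following minimal Python sketch recomputes the relative-valuation gap from the multiples cited above; the figures are this article’s snapshot, not live market data.

# Minimal sketch: recompute MU's valuation discount to NVDA from the
# multiples quoted above (article snapshot, not live market data).
def discount_pct(mu_multiple, nvda_multiple):
    # How much cheaper MU trades than NVDA on a given multiple, in percent.
    return (1 - mu_multiple / nvda_multiple) * 100

multiples = {
    "Forward non-GAAP P/E (FY2)": (13.76, 35.18),
    "Forward EV/Sales":           (5.98, 26.04),
    "Forward EV/EBITDA":          (16.44, 40.56),
    "Trailing P/B":               (3.28, 64.15),
}
for name, (mu, nvda) in multiples.items():
    print(f"{name}: MU {mu}x vs NVDA {nvda}x -> {discount_pct(mu, nvda):.1f}% discount")
# The P/E line prints a 60.9% discount, matching the figure cited above.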

Therefore, Micron is an attractive investment opportunity for those looking to enter the AI-based memory chip market at a reasonable price.

Conclusion

MU is experiencing significant growth thanks to the AI boom, with impressive Q3 financial results and a strong outlook for the coming quarters. The company holds a strategic position in the AI-driven memory chip market, where its HBM3E chips are essential for high-performance computing and data-intensive applications. This has enabled Micron to capitalize on surging AI demand across a range of segments, including smartphones, PCs, and data centers.

On June 27, Goldman Sachs analyst Toshiya Hari maintained a Buy rating on MU and raised the price target from $138 to $158. Goldman Sachs’ position indicates strong confidence in Micron’s long-term prospects, particularly the buildout of AI computing capacity and the company’s strategic initiatives in the memory market.

Additionally, Rosenblatt Securities reaffirmed its Buy rating on Micron Technology stock with an unchanged price target of $225. The firm’s optimistic outlook is driven by expectations of robust financial performance that will beat analyst estimates, fueled by progress in AI and HBM development.

Compared to Nvidia, Micron offers solid growth potential at a more reasonable valuation. Despite Nvidia’s dominant position in AI and data center segments and its outstanding stock performance, Micron’s revenue growth is expected to outpace Nvidia’s next year, driven by its expanding AI product portfolio and increasing market share in high-margin memory products.

For investors looking to capitalize on the AI revolution, Micron presents an attractive opportunity based on its solid financial performance, innovative product offerings, and competitive edge in the memory chip market.
