Every piece of modern tech, whether it’s a supercomputer or a smartphone, depends on two essential functions: logic and memory.
Logic chips act as the brain, running instructions and processing data.
Memory chips store that data, temporarily with DRAM or long-term with NAND.
Recently we looked at Intel’s deal with NVIDIA in the logic space. On the memory side, the AI boom is creating just as much momentum, and Micron’s latest earnings give a clear picture of it.
1. Micron’s new business units
Micron, a U.S.-based integrated device manufacturer, designs and produces memory chips. Starting in Q4 FY25 (August), it reorganized into four business units that line up with how customers actually buy chips:
Cloud Memory – $4.5B revenue (+214% Y/Y). The AI driver. This segment covers large cloud providers and all HBM sales. It’s also the highest-margin business at 48%. HBM alone contributed nearly $2B in the quarter, putting it on pace for about $8B annually.
Quick refresher on HBM: instead of spreading memory out like a warehouse, HBM stacks DRAM vertically, like a skyscraper next to the processor. That design delivers huge bandwidth in a compact footprint, which is perfect for AI, but it’s harder and more expensive to make (a rough bandwidth comparison follows after this list).
Core Data Center – $1.6B revenue (-23% Y/Y). Memory and storage for traditional servers (Dell, HP, etc.).
Mobile and Client – $3.8B revenue (+25% Y/Y). Memory and storage for phones, PCs, and other personal devices.
Automotive and Embedded – $1.4B revenue (+17% Y/Y). Chips for cars, industrial machines, and consumer electronics.
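To put the bandwidth claim from the HBM refresher in rough numbers, here is a minimal back-of-the-envelope sketch in Python. Peak bandwidth scales with interface width times per-pin data rate; the widths and data rates below are approximate, publicly quoted ballpark figures used purely for illustration, not Micron-specific specifications.

```python
# Rough peak bandwidth: bus width (bits) x data rate (Gb/s per pin) / 8 -> GB/s.
# Figures are approximate industry ballpark numbers, for illustration only.

def bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak bandwidth in GB/s from interface width and per-pin data rate."""
    return bus_width_bits * gbps_per_pin / 8

# One HBM3E stack: ~1024-bit interface at roughly 9.2 Gb/s per pin
hbm3e_stack = bandwidth_gb_s(1024, 9.2)   # ~1,180 GB/s per stack

# One conventional DDR5 module (64-bit channel) at 6.4 Gb/s per pin
ddr5_module = bandwidth_gb_s(64, 6.4)     # ~51 GB/s per module

print(f"HBM3E stack : ~{hbm3e_stack:,.0f} GB/s")
print(f"DDR5 module : ~{ddr5_module:,.0f} GB/s")
# A roughly 20x per-package gap is why stacking DRAM next to the processor
# matters so much for AI accelerators, despite the packaging cost.
```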
This new structure makes it easier to see how AI-driven memory is performing, while smoothing out some of the volatility in results. The only drawback: year-over-year comparisons will be tricky until Micron publishes historical restatements.
Key numbers
Q4 FY25 (August): Revenue: $11.3B (+46% Y/Y), about $200M above estimates
Gross margin: 45% (+9pp Y/Y); non-GAAP EPS: $3.03, beating by $0.17
Q1 FY26 guidance (November): Revenue: ~$12.5B, ~$700M above consensus
Gross margin: >50%; non-GAAP EPS: ~$3.75, beating consensus by ~$0.71 (a quick arithmetic check follows below)
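For anyone who wants to sanity-check the headline figures, here is a minimal arithmetic sketch. The consensus numbers are simply backed out of the reported beats; treat them as approximations for illustration.

```python
# Back out implied figures from the reported growth rates and beats (approximate).

q4_fy25_revenue = 11.3   # $B, reported
yoy_growth = 0.46        # +46% Y/Y
revenue_beat = 0.2       # ~$200M above estimates, in $B

implied_q4_fy24 = q4_fy25_revenue / (1 + yoy_growth)      # ~$7.7B a year earlier
implied_q4_consensus = q4_fy25_revenue - revenue_beat     # ~$11.1B consensus

q1_fy26_guide = 12.5         # $B, guidance
guide_above_consensus = 0.7  # ~$700M above consensus, in $B
implied_q1_consensus = q1_fy26_guide - guide_above_consensus  # ~$11.8B

print(f"Implied Q4 FY24 revenue  : ~${implied_q4_fy24:.1f}B")
print(f"Implied Q4 FY25 consensus: ~${implied_q4_consensus:.1f}B")
print(f"Implied Q1 FY26 consensus: ~${implied_q1_consensus:.1f}B")
```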
Despite strong results, the market reaction was muted. Micron’s stock has already nearly doubled in 2025, so even solid beats were taken as just “good enough.” That says a lot about how much optimism is already priced in.
2. The HBM-driven recovery
DRAM (including HBM) has bounced back. After peaking in 2022 and bottoming in mid-2023, trailing 12-month DRAM revenue has climbed back to about $29B, fueled by the HBM ramp and DDR5 demand.
NAND is finding its footing. Revenue has recovered to around $9B LTM, with help from enterprise SSDs and early AI PC adoption.
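A quick note on the “trailing 12-month” (LTM) framing used above: it is simply a rolling sum of the last four quarters, which smooths out single-quarter noise. A minimal sketch with placeholder quarterly figures (not Micron’s actual segment data):

```python
# LTM revenue = sum of the most recent four quarters (placeholder figures, illustration only).
quarterly_dram_revenue = [5.8, 6.6, 7.6, 9.0]   # $B, oldest to newest

ltm_dram_revenue = sum(quarterly_dram_revenue[-4:])
print(f"LTM DRAM revenue: ~${ltm_dram_revenue:.0f}B")   # ~$29B with these placeholders
```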
Why it matters
As high bandwidth memory (HBM) makes up a larger share of sales, DRAM revenue grows even faster thanks to higher average selling prices. NAND is also stabilizing, helped by enterprise SSDs and stronger client SSD demand, which should smooth out the usual swings in the cycle.
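The mix-shift effect is easiest to see with a toy model. Here is a minimal sketch assuming an illustrative HBM price premium and shipment mix (not disclosed Micron figures), showing how a rising HBM share lifts blended DRAM revenue even if total bits shipped stay flat:

```python
# Toy mix-shift model: same total bits, higher HBM share -> higher blended revenue.
# The premium and mix percentages below are illustrative assumptions.

def blended_revenue(total_bits: float, hbm_share: float,
                    std_price: float = 1.0, hbm_premium: float = 3.0) -> float:
    """Revenue when a fraction of bits sells at a multiple of the standard per-bit price."""
    hbm_bits = total_bits * hbm_share
    std_bits = total_bits * (1 - hbm_share)
    return std_bits * std_price + hbm_bits * std_price * hbm_premium

last_year = blended_revenue(100.0, hbm_share=0.05)   # 5% of bits sold as HBM
this_year = blended_revenue(100.0, hbm_share=0.20)   # 20% of bits sold as HBM

growth = this_year / last_year - 1
print(f"Revenue growth from mix shift alone: {growth:.0%}")   # ~ +27% with these assumptions
```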
In short, the memory up-cycle has arrived, led by AI-driven demand for HBM. Micron’s new reporting structure makes this shift more transparent and shows a clearer path to higher profitability. With HBM in tight supply, gross margins are already pushing past 50%, a level that was hard to imagine in past cycles.
Micron’s earnings confirm that memory remains the limiting factor for AI systems.
Key quotes from management
CEO Sanjay Mehrotra on momentum:
“The combined revenue from HBM, high capacity DIMMs, and LP server DRAM reached $10B, more than 5× last year”
AI memory is no longer experimental; it has become a core business line. It’s also lifting the broader data center segment, adding an extra tailwind to Micron’s growth.
On competition:
“We are pleased to note that our HBM share is on track to grow again and be in line with our overall DRAM share in this calendar Q3, delivering on our targets that we have discussed for several quarters now”
Micron and SK Hynix have pulled ahead of Samsung in the HBM race by getting newer products to market faster.
On U.S. positioning:
“As the only US-based memory manufacturer, Micron is uniquely positioned to capitalize on the AI opportunity ahead”
That plays well with hyperscalers, who see value in supply-chain diversification.
On HBM customer base:
“Our HBM customer base has expanded and now includes six customers. We have pricing agreements with almost all customers for a vast majority of our HBM3E supply in calendar 2026. We are in active discussions with customers on the specifications and volumes for HBM4, and we expect to conclude agreements to sell out the remainder of our total HBM calendar 2026 supply in the coming months”
Micron grew its HBM customer count from four to six in just one quarter and is essentially sold out of premium supply more than a year in advance. That kind of demand shows this isn’t just a short blip; it’s a real up-cycle.
CFO Mark Murphy on durability:
“These supply-demand factors are there—we believe they’re durable. On the demand side, data center spend continues to increase. On the supply side, customer inventory levels are healthy. Our supply is lean. Our DRAM inventories are below target”
Murphy is pointing to a lasting profitability cycle. Still, the company avoided giving specifics on whether today’s elevated HBM margins will hold.
What to watch
HBM revenue: Already at a run-rate of ~$8B. Track whether growth comes from higher volume or better pricing (a simple price/volume decomposition follows after this list).
Customization: Next-gen HBM4E (expected 2027) may include custom versions with TSMC, opening the door to stickier, higher-margin products.
Production and pricing: Packaging throughput remains the biggest bottleneck. NAND pricing discipline will also be important.
CapEx: Micron is signaling higher investment for FY26, a bullish sign if execution holds up.
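On the first bullet above, revenue growth splits mechanically into a volume (bits shipped) effect and a price (ASP) effect. Here is a minimal sketch with hypothetical growth rates, just to show the mechanics:

```python
# revenue = bits x ASP, so (1 + revenue_growth) = (1 + bit_growth) x (1 + asp_growth).
# The growth rates below are hypothetical, for illustration only.

bit_growth = 0.30   # assumed growth in bits shipped
asp_growth = 0.25   # assumed growth in average selling price

revenue_growth = (1 + bit_growth) * (1 + asp_growth) - 1
print(f"Implied revenue growth: {revenue_growth:.1%}")   # 62.5% with these assumptions

# Going the other way: back out ASP growth from a reported revenue growth figure.
reported_revenue_growth = 0.46
implied_asp_growth = (1 + reported_revenue_growth) / (1 + bit_growth) - 1
print(f"Implied ASP growth at 30% bit growth: {implied_asp_growth:.1%}")   # ~12.3%
```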
What about the risks?
Execution: Higher CapEx means Micron has to deliver on yields and throughput.
Supply-demand imbalances: HBM yields and cleanroom capacity are tight; any hiccup could ripple through the industry.
Pricing pressure: HBM3E pricing may soften in 2026 as large buyers lock in long-term contracts, though Micron says most of next year’s supply is already committed.
MoonMaster & Investor lens
Two clear narratives stand out:
Bull case: This is the first sustained up-cycle since 2018, with AI fueling broad-based demand. HBM is sold out, the customer count is expanding, and the HBM4 ramp is already in motion. Momentum could run longer than usual.
Bear case: HBM margins may come down next year, and Micron’s valuation already reflects a lot of optimism. At these levels, the risk/reward is less compelling.
The shift to high-margin HBM is real. The data center business now makes up more than half of revenue, reshaping Micron’s profile. This could raise the industry’s profitability floor and smooth the sharp cycles of the past. But cycles never disappear entirely. As always, it’s risky to assume “this time is different.”