April 12, 2026
bigtech money

Samsung's Memory Division Just Out-Earned Amazon, Meta, and Microsoft Combined. Nobody in SV Is Talking About It.


What launched / what broke

Nvidia launched the modern AI boom with CUDA and, in 2022, the H100. What broke was the story that software and American founders would capture nearly all the value. Samsung's memory division reported preliminary operating profit of 57.2 trillion won for Q1 2026, approximately 41 billion dollars at current exchange rates and an eightfold increase year-over-year. That figure exceeds the combined quarterly profits of Amazon Web Services, Meta's advertising business, and Microsoft's cloud segment. Revenue jumped 68 percent year-over-year. HBM demand from frontier-model training drove DRAM prices up more than 30 percent; SK Hynix posted similar records, and its stock rose 13 percent in a day. Foxconn's AI server revenue surged 30 percent, lifting its Q1 revenue to approximately 66.6 billion dollars. Every serious AI model requires massive HBM volumes during both training and inference, and Samsung controls roughly 30 percent of world supply. Physical fab capacity, not software cleverness, now sets the ceiling.
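The headline numbers are easy to sanity-check. A minimal sketch, assuming an illustrative exchange rate of roughly 1,400 KRW per USD (not a quoted market rate):

```python
# Sanity-check of the headline conversion. The exchange rate is an
# illustrative assumption, not a quoted figure.
KRW_PER_USD = 1400

op_profit_krw = 57.2e12                       # 57.2 trillion won, Q1 2026 operating profit
op_profit_usd = op_profit_krw / KRW_PER_USD
print(f"Operating profit: ${op_profit_usd / 1e9:.1f}B")        # ~ $40.9B

# "Eightfold increase year-over-year" implies roughly:
prior_year_usd = op_profit_usd / 8
print(f"Implied year-ago quarter: ${prior_year_usd / 1e9:.1f}B")  # ~ $5.1B
```

The ~$40.9B result lands within rounding distance of the reported "approximately 41 billion dollars," which is consistent with a won in the high-1300s to 1400 range against the dollar.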

The AI story was sold as a software margin era. The reality is a Korean conglomerate's cyclical memory business extracting richer profits than any hyperscaler or AI lab, because HBM supply is constrained by physics, not by code.

What Nobody at the Company Can Say

US tech media cannot say the clearest winner of the AI wave is neither a Silicon Valley startup nor an American company. They cannot say the returns are flowing to engineers optimizing deposition processes in Suwon rather than prompt engineers in San Francisco. The unsayable fact is that this is supply-chain arithmetic, not a morality play about innovation ecosystems or founder genius. American venture capital cannot admit that its portfolio companies are structurally taxed by a foreign manufacturer with pricing power. Samsung is not riding the AI wave. It is the wave.

The Engineer Who Quit

The departure signal is the movement of technical talent. Multiple senior process engineers from advanced-packaging teams at major US semiconductor firms have taken roles in Samsung's HBM stack group in the past year, drawn by compensation increases of 30 to 40 percent and the chance to work on the actual bottleneck. Several researchers from tier-one AI labs followed, citing compensation, ownership of the real constraint, and exhaustion with incremental model tweaks that still fail without more memory. A 1 percent improvement in HBM yield is worth more than almost any algorithmic gain at current scale. The best technical talent is quietly migrating toward the margin. The quitters have concluded that atoms are outbidding bits.

Who Pays

AI labs — every training run, today
Spot prices on HBM modules have quadrupled since 2023. Every training run writes a large check to Samsung.

Hyperscalers (Amazon, Meta, Microsoft, Google) — ongoing, per GPU rack purchased
Nvidia systems carry a cost structure that includes Samsung's margins. Their unit economics on AI services are permanently compressed.

American tech workers — slow burn, through equity vesting cycles
Company equity grows slower than it should because value is leaking offshore to Korean shareholders. The tax is invisible in most P&L statements.

Dead Pool Watch

Pure-software AI companies without proprietary hardware access sit in the dead pool. Their burn rates assume memory will commoditize; it will not. Traditional DRAM vendors that missed the HBM transition are already irrelevant. Intel continues its slow decline: without a memory moat or a foundry success, it has no answer to this shift. Watch any listed AI infrastructure name whose gross margins compress more than 800 basis points over the next two quarters. That will be the tell.
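The 800-basis-point tell is mechanical to check from two quarters of filings. A minimal sketch with invented figures (the company and numbers are hypothetical, for illustration only):

```python
# Gross-margin compression in basis points for a hypothetical AI-infrastructure
# vendor whose input costs (HBM-heavy systems) rise faster than revenue.
def gross_margin_bps(revenue: float, cogs: float) -> float:
    """Gross margin expressed in basis points (1% = 100 bps)."""
    return (revenue - cogs) / revenue * 10_000

# Invented figures: revenue flat at $10,000M; cost of goods up 12% on memory pricing.
q1 = gross_margin_bps(revenue=10_000, cogs=6_000)   # 40.0% gross margin -> 4000 bps
q3 = gross_margin_bps(revenue=10_000, cogs=6_720)   # 32.8% gross margin -> 3280 bps

compression = q1 - q3
print(f"Compression: {compression:.0f} bps")  # 720 bps: close to, but under, the 800 bps tell
```

In this invented case a 12 percent rise in input costs against flat revenue produces 720 bps of compression; crossing 800 bps in two quarters implies memory costs eating well over a tenth of the cost base.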

In 6 Months

Samsung reports another record quarter near $50B operating profit. HBM4 ramps while shortages persist.

Signal: Nvidia signs longer-term supply agreements with Samsung and SK Hynix that include capacity reservation fees.

US onshoring memory announcements produce press releases and no actual capacity. Several AI labs delay model launches citing input costs.

Signal: An AI lab's earnings call or blog post mentions 'compute constraints' that trace back to memory availability, not GPU availability.

Korean memory stocks rise 30-50%. Several major US AI names trade sideways.

Signal: Samsung's market cap overtakes a major US tech company's by year-end.

What Would Change This

A genuine breakthrough in alternative memory technologies that scales outside current HBM manufacturing physics would change the bottom line. So would a sudden plateau in model scaling where additional parameters deliver no meaningful gains and demand collapses. Sustained China-subsidized HBM flooding the market at below cost would erode pricing power within two years. None of these scenarios look probable inside 18 months. Cleanroom capacity remains the binding constraint. Until multiple new fabs reach volume production in 2028 or later, Samsung continues to collect the tax. The memory layer owns the margin structure of the AI era. All other narratives are secondary.

Sources

Wccftech — Lead story: Samsung's memory division profit exceeds combined AWS, Meta ads, and Microsoft cloud quarterly results.
Seoul Economic Daily — Primary Korean financial press coverage. Brokerages raising price targets to 360,000 KRW. Record Q1 2026 profit of 57.2 trillion won.
Blocks and Files — Technical: HBM demand driving the 8x year-over-year profit surge. Revenue up 68% to 133 trillion won ($96B). Memory prices driven by AI training demand.
Tech Insider — Foxconn Q1 2026: $66.6B revenue on 30% AI server surge. March alone up 45.6%. Confirms hardware supply chain is the real winner.
