
Tech Is Firing People and Hiring Machines: How to Invest in the AI Arms Race
Something strange is happening in the tech industry right now. Companies are laying off workers at an accelerating pace while simultaneously pouring record amounts of money into artificial intelligence. Prediction markets put an 87.1% probability on tech layoffs accelerating beyond current levels, even as a fierce three-way race for AI dominance heats up between Anthropic (50% chance of having the best AI model by December 2026), Google (28%), and OpenAI (just 10%, and falling). This isn't a contradiction. It's a pattern economists call creative destruction, and it's one of the clearest investment signals in years.
Think of it like an owner renovating a restaurant. They fire the old kitchen staff, tear out the countertops, and spend a fortune on a brand-new setup. From the outside, payroll shrinks and expenses spike at the same time. That's what the tech sector is doing right now: cutting non-AI headcount aggressively to fund the AI arms race.
The prediction market data paints a broader picture of caution in the tech world beyond AI. A SpaceX IPO has only a 5% chance of happening by May 2026 and just 25.5% by June, suggesting private capital markets remain tight and selective. Tesla's Optimus humanoid robot has only an 18% chance of going on sale by year-end. And the Musk vs. Altman lawsuit, a legal battle that could reshape AI governance, sits at just a 37% probability of a ruling in Musk's favor. The takeaway: capital is flowing narrowly and specifically into AI compute, not broadly into ambitious tech moonshots.
The Gold Rush Logic: Sell Shovels, Not Pans
During the California Gold Rush, the people who got reliably rich weren't the miners. They were the ones selling pickaxes, denim jeans, and wheelbarrows. The same logic applies here. It doesn't matter whether Anthropic, Google, or OpenAI wins the AI race if you own the companies that supply all three of them.
This is the infrastructure thesis, and the creative destruction cycle playing out in tech makes it especially powerful. Here's how the self-reinforcing loop works:
- Companies decide AI is existential to their future.
- They slash non-AI headcount to free up budget (the 87.1% layoff acceleration).
- That freed-up capital flows into AI compute: GPUs, networking, data centers, power.
- More AI compute creates better AI models, which justifies more investment.
- Competitors see the results and repeat steps 1 through 4.
Every turn of this cycle sends money to the same set of infrastructure companies. Let's walk through the entire stack.
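The loop above can be sketched as a toy simulation. All of the numbers here (starting budgets, the 10% cut rate, the number of turns) are hypothetical illustrations chosen for the example, not figures from the article or forecasts:

```python
# Toy simulation of the reinvestment loop described above.
# Every figure is a hypothetical illustration, not a forecast.

def run_cycle(payroll: float, compute_budget: float,
              cut_rate: float = 0.10) -> tuple[float, float]:
    """One turn of the loop: cut non-AI headcount, redirect savings to compute."""
    savings = payroll * cut_rate      # step 2: slash non-AI headcount
    payroll -= savings
    compute_budget += savings         # step 3: freed-up capital flows into compute
    return payroll, compute_budget

payroll, compute = 100.0, 20.0        # hypothetical starting budgets ($bn)
for turn in range(4):                 # steps 4-5: results justify repeating
    payroll, compute = run_cycle(payroll, compute)
    print(f"turn {turn + 1}: payroll={payroll:.1f}, compute={compute:.1f}")
```

The point of the sketch is the shape of the curve, not the numbers: total spending is conserved, but each turn shifts a larger cumulative share of it from people to infrastructure, which is exactly why the same set of suppliers gets paid on every pass.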
The AI Infrastructure Stack
The Chip Makers
NVDA is the most direct beneficiary. NVIDIA supplies the GPUs that Anthropic, OpenAI, Google DeepMind, and everyone else trains their models on. Whether Anthropic holds its 50% lead or Google closes the gap with its 28% odds, every token trained and every inference run flows through NVIDIA silicon. The company commands over 80% market share in AI training chips, and the CUDA software ecosystem acts like a moat that makes switching painful for developers. Confidence: 78%. But the valuation is already stretched at roughly 30x forward revenue, pricing in a level of perfection that leaves little room for stumbles.
AVGO (Broadcom) is the quieter but arguably more interesting pick. Broadcom designs custom AI accelerators for Google (the chips inside Google's TPUs) and Meta, and it dominates the networking silicon that connects GPU clusters together. AI training clusters need thousands of chips talking to each other at incredible speeds, and Broadcom's networking ASICs and Ethernet switches make that possible. With an infrastructure relevance score of 85 out of 100, this is one of the purest "shovel seller" plays available. Confidence: 80%.
The Chip Equipment Makers (Two Levels Upstream)
ASML is the deepest upstream play you can make. ASML is the sole manufacturer of EUV lithography machines, the equipment needed to etch the impossibly small circuits on advanced AI chips. TSMC, Samsung, and Intel all need ASML's machines. There is no competitor. This is a literal monopoly. If the AI arms race requires more advanced chips, it requires more ASML machines. Infrastructure relevance: 88 out of 100. Confidence: 75%.
AMAT (Applied Materials) and LRCX (Lam Research) round out the equipment layer. Applied Materials makes the deposition, etching, and inspection equipment that fabs use to manufacture chips. Lam Research has outsized exposure to HBM, or High Bandwidth Memory, the specialized memory stacks that sit directly on NVIDIA GPUs. HBM is currently one of the biggest bottlenecks in AI chip production, which makes Lam's position especially valuable. Confidence: 74% and 71% respectively.
The Network Plumbing
ANET (Arista Networks) makes the high-speed switches that form the connective tissue of AI data centers. AI training requires massive amounts of data moving between GPUs simultaneously, and Arista's 400G and 800G switches handle that traffic. Here's a useful detail from the prediction market pattern: the 87% layoff probability actually helps Arista. Companies cut people, not networking infrastructure. Confidence: 76%.
Power and Cooling (The Physical Layer Nobody Talks About)
AI data centers consume staggering amounts of electricity. VRT (Vertiv) provides the power distribution, cooling, and UPS systems that keep those data centers running. This is a true picks-and-shovels play. Every AI player needs Vertiv's products regardless of who builds the best model. Infrastructure relevance: 82. Confidence: 77%.
VST (Vistra) and CEG (Constellation Energy) represent the actual electricity generation layer. Constellation is the largest nuclear power operator in the US and has signed landmark agreements to supply data centers, including a deal with Microsoft to restart capacity at Three Mile Island. Nuclear power is uniquely attractive for AI data centers because it runs around the clock, emits no carbon, and can support the massive baseload requirements of AI training clusters. Vistra operates natural gas, nuclear, and solar generation in deregulated markets. Confidence: 70% and 69% respectively.
ETN (Eaton) provides electrical switchgear, transformers, and power management equipment for data center construction. It's less of a pure play than Vertiv since it serves many end markets, but that diversification also provides downside protection. Confidence: 70%.
Data Center Real Estate
EQIX (Equinix) is the world's largest colocation data center operator. As AI models move from training to deployment, inference happens in colocation facilities close to enterprise customers, and Equinix owns those facilities. The catch is that it's structured as a REIT, which makes it sensitive to interest rates, and hyperscalers are increasingly building their own campuses. Confidence: 63%.
The Competitors Themselves
GOOGL at 28% probability for best AI model represents compelling value. Google has DeepMind, massive compute infrastructure, proprietary TPU chips, and the distribution advantages of Search, Android, and Google Cloud. The creative destruction cycle actually helps Google because they can redirect resources from legacy projects to AI while maintaining cash flow from advertising. Confidence: 72%.
AMZN benefits through AWS, which is Anthropic's primary cloud partner backed by Amazon's $4 billion-plus investment. AWS collects infrastructure rent from AI workloads regardless of which model wins. The 87% layoff acceleration also helps Amazon's cost structure as tech talent costs normalize. Confidence: 72%.
MSFT is the more complicated case. Microsoft has invested over $13 billion in OpenAI, but prediction markets show OpenAI's probability dropping (down 3.5 percentage points in 24 hours, now at just 10% for best AI). If Anthropic truly displaces OpenAI, Microsoft's massive bet faces impairment risk. Azure still has value as infrastructure, but the narrative that drove Microsoft's AI premium in 2023 and 2024 may be unwinding.  Confidence: 62%.
The Risks (Read These Carefully)
The biggest single risk across this entire thesis is that AI investment turns out to be a bubble. If companies collectively decide that AI spending isn't generating returns, the entire infrastructure stack gets hit. NVIDIA would face the most severe multiple compression. Fab equipment orders could evaporate. Data center buildouts would slow.
More specific risks include:
- Valuation. Nearly every name on this list is already trading at premium multiples that price in significant AI growth. If that growth disappoints even modestly, stocks can fall hard without the thesis being wrong, just late.
- Custom silicon. Google's TPUs, Amazon's Trainium chips, and Meta's in-house designs could erode NVIDIA's dominance over time.
- Export controls. US restrictions on advanced chip and equipment sales to China reduce the addressable market for NVIDIA, ASML, Applied Materials, and Lam Research.
- Cyclicality. Semiconductor equipment is notoriously boom-and-bust. Hyperscaler spending comes in lumps, creating volatile quarters for companies like Arista.
- Power constraints. Nuclear plant lifetime extensions require regulatory approval, and power grid limitations in key markets like Northern Virginia could delay data center projects.
- Interest rates. Capital-intensive infrastructure companies, especially REITs like Equinix and utilities like Constellation, are sensitive to the rate environment.
One name deserves special mention: Super Micro Computer (SMCI) was considered as an AI server infrastructure play, but accounting irregularities, auditor changes, and delayed SEC filings make it uninvestable despite a sound business model. Governance risk matters.
Why This Matters for Your Money
If you have a 401(k) or index fund that tracks the S&P 500, you already own many of these companies. But the prediction market data reveals something important: broad tech indices may be deceptively stable right now. The mega-caps pouring money into AI are masking the pain of layoffs and contraction across the rest of the sector. Your tech allocation is likely more concentrated in the AI arms race than you realize.
The creative destruction pattern also has implications beyond your portfolio. If companies are replacing human workers with AI compute at the rate these markets suggest, the labor market shifts in ways that affect everyone. The 87.1% layoff probability isn't just a trading signal. It's a signal about the economy: your grocery bills, your neighbor's job prospects, and the skills that will matter in five years.
The investment playbook here is straightforward in concept even if the details are complex: own the infrastructure that every AI competitor needs, stay diversified across the stack from chips to power, and size positions modestly given the stretched valuations. The Gold Rush rewarded the shovel sellers. This one probably will too.
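One way to make "size positions modestly and stay diversified" concrete is to turn the confidence scores quoted throughout the article into rough portfolio weights with a per-name cap. The tickers and scores below come from the text; the sizing rule itself (proportional weighting with a hypothetical 12% cap) is an illustrative example, not a recommendation:

```python
# Illustrative only: convert the article's confidence scores into rough
# portfolio weights, capping any single name to limit concentration.
# The 12% cap and the sizing rule are hypothetical, not advice.

confidence = {
    "NVDA": 78, "AVGO": 80, "ASML": 75, "AMAT": 74, "LRCX": 71,
    "ANET": 76, "VRT": 77, "VST": 70, "CEG": 69, "ETN": 70,
    "EQIX": 63, "GOOGL": 72, "AMZN": 72, "MSFT": 62,
}

def size_positions(scores: dict[str, int], cap: float = 0.12) -> dict[str, float]:
    """Weight proportionally to confidence, then cap and renormalize once."""
    total = sum(scores.values())
    weights = {t: s / total for t, s in scores.items()}
    capped = {t: min(w, cap) for t, w in weights.items()}
    norm = sum(capped.values())
    return {t: w / norm for t, w in capped.items()}

weights = size_positions(confidence)
for ticker, w in sorted(weights.items(), key=lambda kv: -kv[1]):
    print(f"{ticker}: {w:.1%}")
```

With this many names, no single position ends up dominating, which is the practical meaning of owning the whole stack rather than betting on one shovel seller.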
Analysis based on prediction market data as of April 2, 2026. This is not investment advice.