
THE BRRR’s BOTTOM LINE

Editor’s Note: We just released our monthly update to our list of stocks most likely to 5X. Check it out here:

https://www.thebrrr.com/upside-ranks


The macro backdrop was supposed to be toxic for stocks: crude still above $100, Hormuz disruption, March PCE at +3.5% year over year, core PCE at +3.2%, ISM prices paid screaming 84.6, and a Fed meeting that looked less like a boring hold and more like a committee fracture.

On Wednesday, the Fed held at 3.50%–3.75%, but the vote split mattered: one official wanted an immediate cut, while three objected to language implying cuts remain more likely than hikes.

Traders briefly priced up to a 25% chance of a hike over the next year.

Then earnings showed up and ruined the doom party. Reuters had the S&P 500 closing Friday at 7,230.12 and the Nasdaq at 25,114.44, both record closing highs and both logging a sixth straight weekly gain.

LSEG/Reuters had Q1 S&P earnings growth running 27.8% year over year, with 83% of reporters beating EPS and 78% beating revenue. That is not a market ignoring macro. That is a market deciding earnings proof and AI infrastructure ROI are strong enough to overpower the macro fear tape for now.

The Fed angle got nastier because Powell’s last meeting as chair was not really a clean handoff.

He is reportedly staying on the Board after his chair term ends, which keeps him inside the institution as Kevin Warsh prepares to take over. That turns the next phase into more than a rates debate.

It is Fed independence, White House pressure, oil inflation, and a divided committee all shoved into the same blender.

The labor market did not give the Fed an escape hatch either. Initial claims fell to 189,000 for the week ending April 25, continuing claims sat around 1.79 million, and unemployment was around 4.3%.

That is not boom-time labor strength, but it is also not recessionary enough to justify panic cuts while gasoline is ripping. The Fed can talk patience with a straight face. Markets can hate that later.

Oil is the swing factor because it is no longer just a commodity chart; it is the thing deciding whether inflation stays tactical or becomes policy poison. Reuters/Kpler’s Hormuz number remains ridiculous: only seven ships crossed versus the usual 125–140, and none were carrying oil for the global market.

OPEC+ added a 188,000 bpd June quota increase, but quota barrels are not the same thing as deliverable barrels when the chokepoint is impaired. Saudi March output was 7.76 million bpd versus a June quota of 10.291 million. Paper barrels do not sail through a blocked strait.

And yet, the market ripped because the AI capex cycle is now showing up as more than a stock-market hallucination.

Q1 GDP grew +2.0% annualized versus +2.3% expected, but the composition was the story: equipment investment rose 17.2%, intellectual products rose 13%, and business investment added more than one point to GDP while consumer spending slowed to +1.6%.

The consumer is coughing. Data centers are doing chest day.

This is why the week was not generic AI euphoria and not generic stagflation. It was a selective proof-of-payback tape.

Alphabet passed because Google Cloud grew 63%, enterprise AI sales were up eightfold, backlog was near $460 billion, and TPUs are becoming an external product lane.

Amazon mostly passed: AWS growth accelerated, even if the capex bill kept the applause polite.

Microsoft showed real adoption with Azure up 40% and 20 million paid M365 Copilot users.

Meta raised 2026 capex to $125–$145 billion, sold $25 billion of bonds, and the market immediately asked where the receipt was.

That is the new filter: AI capex is no longer automatically bullish. Productive capex gets rewarded. Vague superintelligence capex gets discounted. The market is not buying every AI press release with a pulse anymore.

It wants cloud acceleration, backlog, enterprise adoption, power access, memory scarcity, custom silicon leverage, or some credible path from spending to margin.

The geopolitics are not separate from this. Ukraine hitting Russian oil infrastructure and shadow-fleet tankers is the same regime as Iran weaponizing Hormuz: energy infrastructure is now the battlefield.

So the right read is selective, not euphoric. AI is carrying the tape, but oil is holding the Fed hostage.

The market still wants the productivity trade. It just wants proof before it pays.

BRRR Premium: Owning The AI Supercycle

The paid edge this week is that AI stopped trading like a story and started trading like an infrastructure audit. The reaction to Alphabet, Amazon, Microsoft, and Meta earnings proves investors are not rewarding every company that says “AI” anymore.

It is grading the buildout by receipts: cloud growth, backlog, utilization, capex discipline, power access, networking capacity, custom silicon leverage, and whether the spend has a credible path back to revenue.

That is why Alphabet ripped, Amazon was tolerated, Meta got punished, and photonics suddenly started acting like the next bottleneck trade. The AI boom is still real. The free-money phase of the narrative is not.

Alphabet passed the receipt test better than anyone. Q1 revenue rose 22% year over year to $109.9 billion, operating income was $39.7 billion, and Google Cloud was the cleanest AI monetization proof point of the week: $20.0 billion of revenue, +63% year over year, $6.6 billion of operating income, and a cloud operating margin near 32.9%.

The detail that mattered most was backlog: management said Google Cloud backlog was over $460 billion, nearly doubling quarter over quarter. That is what investors wanted to see. Not just “we are spending on AI,” but “customers are signing contracts large enough to justify the spend.” The market reaction matched the receipt: GOOGL finished the week +12.0% and closed +10.0% the day after earnings.

Amazon showed the strongest cloud scale, but the cash-flow debate is now impossible to ignore. Q1 revenue rose 17% year over year to $181.5 billion, AWS revenue rose 28% to $37.6 billion, and AWS operating income hit $14.2 billion, implying a 37.7% segment margin. Andy Jassy said AWS is growing at its fastest pace in 15 quarters, Amazon’s chips business has topped a $20 billion revenue run-rate, and Bedrock customer spend grew 170% quarter over quarter. That is real.

But the other side of the ledger is also real: Amazon’s trailing twelve-month free cash flow fell to just $1.2 billion, driven by a $59.3 billion year-over-year increase in property and equipment purchases net of proceeds and incentives, primarily reflecting AI investment.

That is why AMZN’s stock reaction was muted: +1.6% for the week, +0.8% post-earnings close-to-close. Investors like the AWS acceleration. They are not giving unlimited credit for the capex bill.
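The segment margins quoted in these two writeups fall straight out of the reported figures. A minimal sketch, using the rounded numbers from this issue (so the results land within a few tenths of the reported margins):

```python
def segment_margin(operating_income_b: float, revenue_b: float) -> float:
    """Operating margin for a reported segment, in percent."""
    return 100 * operating_income_b / revenue_b

# Figures as quoted above, in billions of dollars. Both inputs are
# rounded, which is why the outputs sit a hair off the reported margins.
print(f"Google Cloud: {segment_margin(6.6, 20.0):.1f}%")   # reported ~32.9%
print(f"AWS:         {segment_margin(14.2, 37.6):.1f}%")   # reported ~37.7%
```

The gap between the computed and reported figures is pure rounding in the quoted revenue and income numbers, not a discrepancy in the filings.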

Meta is the warning label for this phase of the AI trade. The core business was not weak. Q1 revenue rose 33% year over year to $56.3 billion, operating income was $22.9 billion, operating margin was 41%, Family daily active people reached 3.56 billion, ad impressions rose 19%, and average price per ad rose 12%.

In a normal tape, that is a victory lap. Instead, the stock fell 9.8% for the week and 8.6% post-earnings close-to-close because management raised 2026 capex guidance to $125 billion–$145 billion, up from $115 billion–$135 billion.

Meta is still an elite operating business, but the market is asking a different question now: how much of this AI spending becomes measurable revenue, and how much becomes a giant open-ended infrastructure obligation with a “superintelligence” label slapped on it?

Microsoft sits in the middle of the debate: real AI adoption, but still a capex-duration story. The freshest disclosed adoption number was strong: Microsoft said it had more than 20 million paid Microsoft 365 Copilot seats. That matters because Copilot is one of the few enterprise AI products with a direct per-seat monetization path.

But the investor question is not just whether Copilot adoption is real. It is whether Azure AI demand, OpenAI infrastructure commitments, and internal AI workloads can absorb the capital intensity fast enough to protect returns. This is the same filter hitting the rest of the group: AI software revenue is great, but the market is now underwriting the data centers, chips, power, and networking required to deliver it.

Photonics is the science of using light (photons) to do jobs we usually do with electricity (electrons), like sending information or performing computations. Think of it like swapping copper wires for tiny laser beams shooting through glass fibers or special chips.

For AI, this matters because training and running huge models requires moving massive amounts of data between thousands of GPUs, and electrical wires are starting to hit physical limits on speed and heat.

Light is much faster, generates far less heat, and can carry many signals at once on different wavelengths, which is huge when your data center is burning megawatts just shuffling numbers around.

Companies are betting that "co-packaged optics" and even all-optical computing chips could be the only way to keep scaling AI without melting the grid.
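The wavelength angle is easy to see in numbers. A quick sketch, assuming the common (not universal) case where a high-speed module aggregates several parallel lanes; the lane counts and per-lane rates below are typical configurations, not a spec for any particular product:

```python
def aggregate_gbps(lanes: int, gbps_per_lane: int) -> int:
    """Total module throughput from parallel optical lanes, in Gb/s."""
    return lanes * gbps_per_lane

# Typical (illustrative) configurations:
print(aggregate_gbps(8, 100))  # an 800G module as 8 x 100 Gb/s lanes
print(aggregate_gbps(8, 200))  # a 1.6T module as 8 x 200 Gb/s lanes
```

The scaling path the article references (800G to 1.6T and beyond) comes from pushing both dimensions at once: more lanes per module and faster signaling per lane.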

On April 2, photonics darling AAOI announced a new $71 million 800G transceiver order from a major hyperscaler, bringing that customer’s orders to $124 million since mid-March and more than doubling its existing backlog from that customer.

AAOI also said it shipped the first 10,000 units of an 800G order to another hyperscale data-center customer.

On April 17, AAOI announced a Houston-area expansion to 900,000 square feet, with a target of up to 700,000 combined 800G and 1.6T transceivers per month and roughly 350% laser fab capacity expansion by year-end 2027. Then on April 29, AAOI received a $20.85 million Texas Semiconductor Innovation Fund grant tied to a 210,000-square-foot Sugar Land manufacturing facility and 500+ expected jobs.

That is why the photonics trade has legs: it is no longer only “AI will need optics.” It is orders, shipments, factories, grants, and capacity targets.

The actual bottleneck is bandwidth per watt. The old AI trade was compute scarcity: GPUs. Then it became memory bandwidth scarcity: HBM and advanced packaging. The next choke point is network bandwidth and energy efficiency: 800G/1.6T transceivers, lasers, silicon photonics, co-packaged optics, optical circuit switching, switches, cables, test equipment, and fiber.

Once clusters move from thousands to tens of thousands of accelerators, the question changes from “can I buy chips?” to “can the chips talk fast enough without melting the power budget?” This is why photonics is moving up the stack. It is not a telecom side quest anymore. It is becoming part of AI factory architecture.

Lumentum and Coherent are the strategic read-throughs, but the setup is different from AAOI. Lumentum is the cleaner “laser scarcity” expression: Q2 FY2026 revenue was $665.5 million, up 65.5% year over year, with 42.5% non-GAAP gross margin, 25.2% non-GAAP operating margin, and components revenue up 68.3%. Coherent is the broader platform: materials, lasers, InP, silicon photonics, pluggables, and roadmap exposure across 1.6T, 3.2T, and 12.8T+ architectures.

The market is paying for both as strategic AI networking suppliers, not old-cycle optical component vendors. The risk is valuation: recent prices put LITE and COHR at nosebleed reported P/E multiples around 279x and 320x, while AAOI remains loss-making on trailing EPS. The thesis is real. The stocks are no longer cheap.

The second-order AI trade is broadening beyond Nvidia, but not away from Nvidia. Nvidia is still the center of gravity, but the hyperscalers are not going to donate their margins forever. Amazon is pushing Trainium, Google is externalizing TPUs, Microsoft has Maia, and every large buyer has the same strategic incentive: reduce dependence where possible, keep Nvidia where necessary, and capture more of the AI stack internally.

That does not kill Nvidia. It changes the shape of the profit pool. The alpha question becomes: which suppliers become unavoidable as the stack fragments? The answer is probably not “every AI app.” It is power, memory, packaging, networking, optical components, custom silicon, cooling, and data-center finance.

The power layer still looks underappreciated relative to the chip layer. AI data centers are not SaaS assets. They are industrial cities with GPUs inside. The constraints are interconnection queues, substations, transformers, cooling systems, firm power contracts, permitting, land, and local political tolerance.

This is why the AI capex debate keeps getting bigger: the spend is not only chips. It is the entire physical plant required to keep those chips utilized. If you want to know where the next bottleneck emerges, follow the layer that is hardest to deploy quickly. GPUs can be allocated. Power infrastructure has to be permitted, built, and connected.

Inference is where the bill eventually comes due. Training gets the headlines, but inference is where AI economics either compound or break. Cloud providers can monetize usage directly. Enterprise software companies can monetize seats. Advertising businesses monetize indirectly through engagement, targeting, and creative throughput.

Consumer AI apps need subscriptions, ads, or enterprise distribution. The infrastructure side is already being capitalized as if utilization will be enormous. The next phase is proving revenue per token, margin per workload, and payback period per data center.
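That last sentence is just unit economics. A toy sketch of the payback math, with entirely hypothetical numbers (the capex, revenue, and margin figures below are made up for shape only, not drawn from any company in this issue):

```python
def payback_years(capex_b: float, annual_revenue_b: float,
                  cash_margin: float) -> float:
    """Years to recoup data-center capex from the cash margin it generates."""
    return capex_b / (annual_revenue_b * cash_margin)

# Hypothetical data center: $10B build, $4B/yr of AI revenue, 50% cash margin.
print(round(payback_years(10.0, 4.0, 0.50), 1))  # 5.0 years
```

The point of the exercise: at a fixed capex number, the payback period is hostage to utilization-driven revenue and margin per workload, which is exactly why the market is now demanding those disclosures before paying for the buildout.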

The premium takeaway: AI is still the dominant productivity trade, but the edge has moved from “own the obvious winners” to “map the bottlenecks before they show up in consensus.”

This week’s map is clear: Alphabet has the cleanest cloud/backlog receipt; Amazon has real AWS acceleration but cash-flow pressure; Meta has a great core business but a credibility gap on AI payback; Microsoft has paid Copilot adoption but still needs capex absorption; AAOI has the freshest order/capacity catalyst; LITE is Nvidia-backed laser scarcity; COHR is the broad optical platform. The AI trade is not dead. It is getting more selective, more industrial, and much more demanding about proof.

Ghost

Your agent needs more than 2 projects

You prompt. The agent builds. Then it asks for a database.

Ghost is Postgres made for this. Spin one up in seconds. Fork it like a branch. Delete it when you're done. Pay nothing when it's idle.

Your agent gets full SQL, MCP support, and as many databases as it needs. No dashboards. No provisioning. No forgotten dev databases draining your card at month end.

Build a weekend app. Fork the schema three different ways. Throw two of them out. Ghost doesn't care. The next prompt can spin up a fresh one.

You're already vibe-coding the app. Stop wiring up the backend.

Unlimited databases. Unlimited forks. 100 compute hours a month. 1 TB of storage. Free.


Got feedback? Follow the writer on Twitter @frank_locascio and send a message.

The BRRR is meant for informational purposes only. It is not investment advice. Please consult with your investment, tax, or legal advisor before making any investment decisions.
