
In Today’s Issue:

⚡ Anthropic commits a staggering $200 billion to Google Cloud

🧠 Zyphra releases a compact model claiming frontier-level reasoning

🚦 NVIDIA opens its MRC networking protocol

🚀 Anthropic rents SpaceX's entire Colossus 1 supercomputer

🐟 A five-year study links omega-3 supplements to faster cognitive decline

And more AI goodness…

Dear Readers,

Superintelligence is a living product, and we're always refining it.

Starting today, you'll notice leaner news summaries designed to give you a sharper overview without the filler.

And this Sunday marks a first: we're launching Intelligence from the Community, a new weekly edition featuring essays and analyses written by talented voices from our readership. If you've been reading Superintelligence and thinking, "I have something to say," this is your stage!

⚡ The Signal

The AI compute race is no longer about who builds the most; it's about who can actually use what they've built.

Anthropic is stacking infrastructure deals at a pace that would make a sovereign blush: a reported $200 billion commitment to Google, a full takeover of Musk's Colossus 1 cluster, and partnerships now spanning Amazon, Microsoft, and FluidStack. But the real story isn't the spending. It's that xAI ran its world-record GPU fleet at just 11 percent utilization, turning Musk's supercomputer into a rental property for the very company he once called dangerous. Meanwhile, NVIDIA is open-sourcing its networking protocol and OpenAI is warning that the true bottleneck has shifted from chips to the fabric connecting them. The emerging logic is clear: raw hardware is becoming a commodity. The winners will be the teams that can keep massive clusters fed, synchronized, and productively occupied.

All the best,

💸 Anthropic’s Huge Google Cloud Bet

Anthropic has reportedly committed $200 billion over five years to Google for chips and cloud access, a deal that shows how AI startups are locking themselves into vast infrastructure spending just to keep scaling. The report frames the AI boom as increasingly circular: cloud giants fund AI companies, then collect enormous contracts back from them, while rising server, data center, and memory costs raise questions about how sustainable this race really is.

👉 tl;dr: Anthropic’s reported $200 billion Google deal highlights how the AI boom is being powered by massive, costly cloud and chip commitments.

🧠 Tiny Model Claims Big Leap

Zyphra says its new ZAYA1-8B model delivers unusually high reasoning power for its size, using under 1 billion active parameters while competing with much larger open-weight and proprietary systems on math, coding, and reasoning benchmarks. The sharper claim is not just the model’s size, but its full-stack bet: AMD-only training infrastructure, new architectural choices, large-scale RL, and a test-time compute method called Markovian RSA that appears to boost hard math performance through parallel reasoning and recursive aggregation.

👉 tl;dr: Zyphra is positioning ZAYA1-8B as a compact open model that extracts frontier-level reasoning performance through architecture, post-training, and test-time compute rather than raw parameter scale.
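Zyphra hasn't shared the actual Markovian RSA code here, but the general shape of "parallel reasoning plus recursive aggregation" at test time looks roughly like the sketch below. The `generate` and `aggregate` functions are placeholders standing in for real model calls, not Zyphra's method:

```python
import random

def generate(prompt: str, seed: int) -> str:
    # Placeholder for a model call that samples one reasoning trace.
    random.seed(seed)
    return f"candidate answer {random.randint(0, 9)}"

def aggregate(prompt: str, candidates: list[str]) -> str:
    # Placeholder for a model call that reads several candidate traces
    # and distills them into a single, stronger answer (here: majority vote).
    return max(set(candidates), key=candidates.count)

def test_time_reasoning(prompt: str, width: int = 4, rounds: int = 2) -> str:
    # Sample several reasoning traces in parallel, then recursively
    # aggregate them, feeding each round's aggregate into the next.
    summary = ""
    context = prompt
    for _ in range(rounds):
        candidates = [generate(context, seed=i) for i in range(width)]
        summary = aggregate(context, candidates)
        context = f"{prompt}\n\nBest answer so far: {summary}"
    return summary

print(test_time_reasoning("What is 17 * 24?"))
```

The point is the pattern, spending extra inference compute on many traces and then merging them, rather than any particular implementation.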

🚦 NVIDIA Opens AI Networking Protocol

NVIDIA says its Spectrum-X Ethernet platform is moving deeper into gigascale AI infrastructure with Multipath Reliable Connection (MRC), a new RDMA transport protocol designed to spread traffic across multiple network paths, reduce congestion, and recover quickly from failures. The sharp point is openness: MRC was proven on NVIDIA hardware with OpenAI, Microsoft, and Oracle, but is now being released through the Open Compute Project, positioning Spectrum-X as both a performance play and a standards-setting push for AI factories.

👉 tl;dr: NVIDIA is opening its MRC networking protocol after production use in major AI clusters, aiming to make large-scale AI training faster, more resilient, and less vulnerable to network slowdowns.
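MRC itself is an RDMA transport that lives in NICs and switches, so nothing below is the real protocol; it's just a toy sketch of the core idea, spraying one flow across several paths and steering around a failed link:

```python
# Conceptual illustration only, not MRC: spread a flow's packets across
# multiple healthy paths and keep sending when one of them fails.

paths = {"path_a": True, "path_b": True, "path_c": True}  # True = healthy

def send(packet_id: int) -> str:
    healthy = [p for p, up in paths.items() if up]
    if not healthy:
        raise RuntimeError("all paths down")
    chosen = healthy[packet_id % len(healthy)]  # spread packets round-robin
    return f"packet {packet_id} -> {chosen}"

for i in range(3):
    print(send(i))

paths["path_b"] = False  # simulate a link failure
for i in range(3, 6):
    print(send(i))        # the flow keeps moving on the surviving paths
```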

Before you buy a supplement, ask AI to check the evidence.

Why it helps: Today's issue covers a study linking omega-3 to faster cognitive decline, not slower. Health advice online often lags behind the research. AI can help you spot the gap before you spend money on something that might not help, or might even hurt.

Try this:"I'm thinking about taking [supplement] for [goal]. Summarize what the most recent clinical evidence actually says. Include any major studies that found no benefit or potential harm. Keep it simple."

You're not replacing your doctor. You're showing up better informed.

🎬 Watch This

Why AI Needs a New Kind of Supercomputer Network - The OpenAI Podcast Ep. 18

Why it’s worth your time:

Frontier AI is no longer just a GPU story. This episode explains why the real bottleneck is increasingly the network that keeps thousands of chips synchronized during training.

Best bit:

The strongest part is the explanation of Multipath Reliable Connection, a new protocol OpenAI developed with AMD, Broadcom, Intel, Microsoft, and NVIDIA to route around failures and keep massive GPU clusters moving in lockstep.

Watch if you care about:

AI infrastructure / frontier model training / supercomputers / NVIDIA / the future of compute

During a fireside chat on Wednesday, Anthropic CEO Dario Amodei said his AI company had 80x year-over-year growth in revenue and usage in the first quarter.

Amodei added, half-joking, that he hopes this doesn't continue because that level of hyper-growth is "too hard to handle." It might be better to have a "mere 10x" growth, he quipped.

Anthropic Rents Musk's Supercomputer

The Takeaway

👉 Anthropic secured the entire Colossus 1 facility from SpaceX, adding 220,000+ NVIDIA GPUs and 300 megawatts of capacity, immediately doubling Claude Code rate limits and raising API limits for Opus models.

👉 xAI was running its massive GPU fleet at just 11% utilization, compared to 40%+ at Meta and Google, making the rental to a direct competitor a pragmatic business decision rather than an ideological one.

👉 Musk dissolved xAI as a standalone entity, rebranding it SpaceXAI under the SpaceX umbrella; all 11 original co-founders have departed, and training has moved to Colossus 2.

👉 The deal positions SpaceX as an AI infrastructure provider ahead of its planned IPO this summer, while Anthropic stacks compute partnerships now totaling agreements with Amazon (5 GW), Google/Broadcom (5 GW), Microsoft/NVIDIA ($30B Azure), and FluidStack ($50B).

The company Elon Musk once called "evil" is now his biggest GPU tenant. Anthropic just signed a deal to rent the entire Colossus 1 data center from SpaceX, gaining access to more than 220,000 NVIDIA GPUs and over 300 megawatts of compute capacity within the month. The reason? xAI's model FLOPs utilization (MFU) was running around 11 percent, way below the 40 percent achieved by rivals.
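For context on what that 11 percent actually measures: MFU is the compute a training run really performs divided by the hardware's theoretical peak. A back-of-the-envelope sketch of the math, with every number illustrative rather than xAI's real figures:

```python
# Back-of-the-envelope MFU (model FLOPs utilization). Every number below
# is illustrative, not xAI's actual workload or hardware.

params = 70e9              # model parameters
tokens_per_sec = 250_000   # training throughput, tokens per second
num_gpus = 1_000
peak_flops_per_gpu = 1e15  # assumed ~1 PFLOP/s peak per GPU

# Standard approximation: ~6 FLOPs per parameter per token for a dense
# transformer's forward + backward pass.
achieved_flops_per_sec = 6 * params * tokens_per_sec
peak_flops_per_sec = num_gpus * peak_flops_per_gpu

mfu = achieved_flops_per_sec / peak_flops_per_sec
print(f"MFU = {mfu:.1%}")  # about 10.5% with these illustrative numbers
```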

In plain terms: Musk built the world's biggest AI supercomputer, but could barely use a fraction of it. Meanwhile, demand for Claude is surging, and Anthropic is hungry for every GPU it can get. The timing is remarkable: on the same day, Musk announced that xAI will be dissolved as a standalone company and fully integrated into SpaceX under the name SpaceXAI. All 11 original xAI co-founders have already departed. Anthropic immediately used the new capacity to double Claude Code rate limits and raise API limits for Opus models.

The company is also exploring orbital AI compute with SpaceX, pushing data centers into space. Could the biggest irony in AI, competitors becoming infrastructure partners, signal a new era of pragmatic collaboration?

Why it matters: This deal reveals how quickly the AI compute landscape is shifting. Companies that cannot efficiently use their hardware are becoming landlords for those who can, fundamentally reshaping competitive dynamics in the industry.

Sources:
🔗 https://www.theinformation.com/newsletters/ai-agenda/xai-shows-hard-use-lot-gpus?rc=bfliih

🔗 https://www.anthropic.com/news/higher-limits-spacex

Attio is the AI CRM for high-growth teams.

Connect your email, calls, product data and more, and Attio instantly builds your CRM with enriched data and complete context. Whether you’re running product-led growth or enterprise sales, Attio adapts to your unique GTM motion.

Then Ask Attio to plan your next move.

Run deep web research on prospects. Update your pipeline as you work. Find customers and draft outreach emails. Powered by Universal Context, Attio's intelligence layer, Attio searches, updates, and creates across your data to accelerate your workflow.

Ask more from your CRM.

The chart: Terminal-Bench 2.1 released. GPT-5.5 with Codex leads the new leaderboard, while Claude Opus 4.6 with Claude Code posts the biggest upward revision after the audit, gaining +12.0 percentage points.

The lesson: Coding-agent benchmarks are no longer just leaderboard theater. As coding agents become economically important, benchmark hygiene directly shapes how we judge real progress.

The caveat: Terminal-Bench 2.1 corrected issues in 28 of 89 tasks, roughly 30% of the benchmark. The rankings mostly survived, but absolute scores shifted by up to 12 points, a reminder that benchmark scores are only as reliable as the tasks underneath them.

Omega-3 Linked to Cognitive Decline

⚡ Bottom line: A five-year study found omega-3 supplements were linked to faster cognitive decline in older adults, not slower.

💡 Why it matters: Millions take fish oil for brain health, but this data suggests the popular supplement may do more harm than good in some people.

🔎 What it means: Personalized, data-driven supplementation could replace the blanket recommendation that omega-3 is universally protective for aging brains.

Millions of older adults pop fish oil capsules every morning, convinced they're protecting their brains. A new five-year study just flipped that assumption on its head. Researchers analyzed data from the Alzheimer's Disease Neuroimaging Initiative (ADNI), comparing 273 omega-3 users with 546 matched non-users, carefully controlling for age, genetics, and diagnosis. The result was striking: participants who took omega-3 supplements showed faster cognitive decline than those who didn't. Even more fascinating, the decline wasn't linked to classic Alzheimer's markers like plaques, tangles, or gray matter loss.

Instead, the culprit appears to be reduced brain glucose metabolism, a proxy for synaptic dysfunction. The researchers suspect that DHA, the star fatty acid in fish oil, may actually be too chemically fragile for aging brains, triggering oxidative stress in mitochondria rather than protecting them. Commercially available fish oil is especially prone to oxidation, which could explain why earlier clinical trials using purified EPA and DHA found neutral results while this real-world study found harm. This is not the final word, but it's a loud wake-up call: supplementation is not one-size-fits-all, and more isn't always better.

Built for builders. Not buzzwords. San José 2026

500+ speakers. 18 content tracks. Workshops, masterclasses, and the people actually shipping the tools you use every day. WeAreDevelopers World Congress — September 23–25. Use code GITPUSH26 for 10% off.
