
Layer Relationships

The active alliances, the live competitions, and the dynamics worth watching in the AI stack.

As of 2026-05-08

Aligned

SpaceXAI · X platform

holding · 2026-04-08 to 2026-05-07

SpaceX dissolved xAI into SpaceXAI on May 6, 2026; the Grok-through-X distribution relationship is unchanged in structure.

SpaceX dissolved xAI as a separate company into SpaceXAI on May 6. Grok still distributes through the X platform, giving SpaceXAI a large consumer channel without relying on third-party cloud marketplaces.

SpaceXAI's distribution advantage through X is intact, but the ownership consolidation under SpaceX makes Musk-controlled infrastructure the single backstop for both model development and compute.

As of May 7

Competing for the same ground

OpenAI · Anthropic

strengthening · 2026-04-08 to 2026-05-08

Anthropic embedded finance agents natively into Microsoft Office on May 8, the same day OpenAI shipped voice API models — each lab deepening a different distribution surface on the same cycle.

This cycle

Anthropic shipped ten finance agent templates running natively inside Microsoft Office on May 8, adding a direct enterprise runtime to the competitive front that OpenAI's self-serve ads opened on May 6.

OpenAI and Anthropic are competing on three tracks: capability releases, distribution runtime, and monetization model. OpenAI opened self-serve ad inventory in ChatGPT on May 6; Anthropic has publicly committed to keeping Claude ad-free. Anthropic shipped finance agent templates that run natively inside Microsoft Office on May 8; OpenAI shipped three voice API models the same day — each lab deepening a different distribution surface.

A release or channel deal from either lab can reset model routing and pricing assumptions in the same week; the runtime and monetization divergence now shapes where enterprise buyers are willing to standardize.

As of May 8
Microsoft · Google

strengthening · 2026-04-08 to 2026-05-07

Google launched a vertically integrated agent-platform-plus-TPU stack at Cloud Next 2026 and committed $40 billion to Anthropic in April, while Microsoft ended its OpenAI exclusive license on April 27.

Microsoft and Google are competing for enterprise AI platform control. Google's Cloud Next 2026 launch bundled the Gemini agent platform with eighth-generation TPUs, adding a silicon-procurement dimension to the contest. The divergence in posture — Microsoft loosening its model ties, Google deepening vertical integration — is widening.

Cloud contract choices now carry a hardware and agent-runtime commitment; a decision to use Google's platform implies accelerator and runtime lock-in alongside the model choice.

As of May 6
OpenAI · Google DeepMind

holding · 2026-04-08 to 2026-05-08

Meta Muse Spark scored 58% on Humanity's Last Exam on May 8 — matching the performance tier of GPT Pro and Gemini Deep Think — without a direct capability shift between OpenAI and DeepMind themselves.

OpenAI and Google DeepMind are competing on reasoning and multimodal model quality. Meta Superintelligence Labs entered the frontier reasoning tier on May 8 with Muse Spark at 58% on Humanity's Last Exam, meaning the two-way benchmark contest now has a third credible entrant. GPT and Gemini-era releases still appear as close substitutes in many enterprise evaluations.

Teams should benchmark all three vendors — OpenAI, Google DeepMind, and Meta — because the frontier reasoning tier expanded this cycle.

As of May 8
Cursor · GitHub Copilot

holding · 2026-04-08 to 2026-05-06

SpaceX disclosed a $60B acquisition option for Cursor on April 26, 2026, injecting a potential ownership change into the competitive dynamic; no update since.

Cursor and GitHub Copilot are competing for the same developer workflow surface. Cursor has strong individual pull, while Copilot keeps enterprise distribution through Microsoft and GitHub contracts.

Teams can end up with split coding-assistant standards unless engineering and procurement align early.

As of May 2

Watching

OpenAI · Microsoft

weakening · 2026-04-08 to 2026-05-06

Microsoft ended its exclusive OpenAI license on April 27 and OpenAI finalized the DeployCo JV with TPG on May 5 — each a concrete step decoupling OpenAI's enterprise reach from Azure.

Microsoft and OpenAI ended exclusive licensing on April 27. Microsoft keeps a nonexclusive license through 2032. OpenAI now routes enterprise distribution through Azure, Amazon Bedrock, and the TPG-backed DeployCo JV.

Teams can negotiate multi-cloud model access, but integration defaults still differ by cloud contract.

As of May 5
Anthropic · Amazon

holding · 2026-04-08 to 2026-05-06

Amazon committed $25 billion to Anthropic in April 2026, the largest AI infrastructure deal reported to that date; no changes to partnership terms in the four weeks to May 6.

Amazon has committed up to $25 billion to Anthropic. Bedrock now carries Anthropic and OpenAI models in one lane. The partnership is still core, but platform exclusivity has weakened.

AWS teams still get fast Claude access, but model leverage now depends on Bedrock portfolio shifts.

As of May 2
Anthropic · Google

holding · 2026-04-08 to 2026-05-06

Google committed $40 billion to Anthropic in April 2026 with $10 billion immediately; no material change to partnership terms observed in the four weeks to May 6.

Google has committed up to $40 billion to Anthropic, $10 billion of it immediately. Google still funds Claude capacity while building Gemini. The tie is financial support plus direct model competition.

Teams should treat this as capital alignment, not a durable product alignment.

As of May 2
Model providers · NVIDIA

weakening · 2026-04-08 to 2026-05-08

Meta and Broadcom published a four-generation MTIA roadmap through 2027 on May 8, making Meta the second large buyer — after Google with TPU — on a published multi-year path away from NVIDIA merchant silicon.

This cycle

Meta and Broadcom published a four-generation MTIA silicon roadmap on May 8, targeting 25x FLOPS growth through 2027 — the clearest signal yet that NVIDIA merchant dependency at hyperscaler scale is on a published multi-year exit path.

Most frontier training and inference still depend on NVIDIA supply. Anthropic routes 220,000 NVIDIA GPUs via SpaceX's Colossus 1, confirming hardware dependency persists through rival-operated channels. Meta and Broadcom published a four-generation MTIA roadmap on May 8 through 2027, the most public multi-year commitment to an alternative silicon path from a hyperscaler-scale buyer to date.

A supply or pricing shock at NVIDIA hits model providers regardless of which entity operates the cluster — but two large buyers are now on published timelines to reduce that exposure.

As of May 8
AI providers · Regulators

holding · 2026-04-08 to 2026-05-06

India's MeitY secretary disclosed direct talks with Anthropic over Mythos on April 28, 2026, extending regulatory oversight to three jurisdictions; no new regulatory action in the four weeks to May 6.

Provider roadmaps now track regulatory response windows across regions; Mythos oversight spans at least three jurisdictions: the US, UK, and India.

Teams shipping globally should expect uneven feature access and slower cross-region rollout parity.

As of May 2
Open-weight models · API providers

strengthening · 2026-04-08 to 2026-05-08

DeepSeek V4 confirmed on Hugging Face on May 7 with million-token context and new RL sandbox; China's Big Fund is in talks to lead a $45 billion first round — capability and capital are advancing together.

Open-weight capability keeps improving while API pricing pressure increases. Hugging Face reports Chinese model lineages now hold 41% of Hub downloads. DeepSeek V4 ships with million-token context, hybrid attention, and a new RL sandbox — backed by a state semiconductor fund in talks to lead a $45 billion first round.

Buyers should revisit self-hosted versus API economics; open-weight quality, usage share, and state-backed capital depth are all rising in the same cycle.

As of May 7
Foundation model providers · Application developers

strengthening · 2026-04-08 to 2026-05-08

Anthropic embedded in Microsoft Office, Perplexity opened the Mac agent to all users, and xAI launched Grok Connectors on May 8 — three model providers each claiming a different platform runtime on the same cycle.

This cycle

Anthropic, Perplexity, and xAI each embedded into a distinct platform runtime on May 8, shifting the competitive dynamic from model capability to runtime ownership as the primary differentiation axis.

Foundation model providers keep expanding direct product surfaces into the application layer. Anthropic and OpenAI operate managed enterprise services channels targeting segments previously served by consulting firms. On May 8, Anthropic, Perplexity, and xAI each embedded into a different platform runtime on the same day — Microsoft Office, the Mac, and third-party app connectors — giving model providers direct distribution that bypasses independent application developers.

Application teams should track overlap risk when their model supplier launches features near their product boundary — and now must also watch which platform runtimes those suppliers are embedding into.

As of May 8