Every Wednesday, Signal Pro members get a step-by-step AI workflow they can apply immediately. No fluff, just practical guides to upskill you and your team. If you’re only reading the Sunday issue, you’re getting half the picture. Upgrade to Pro today.

AI Highlights

My top-3 picks of AI news this week.

1. Gemini Gets to Work

Google announced a coordinated wave of Gemini updates this week, headlined by global file generation that lets users create ready-to-download Docs, Sheets, Slides, and PDFs without copying, pasting, or reformatting.
2. Claude’s Big Pull

Anthropic spent the week turning Claude into the hub for professional work, pulling nine creative apps into the chat and launching a new Enterprise security product within 48 hours.
3. OpenAI Unchained

OpenAI spent the week redrawing its biggest cloud relationship and shipping developer infrastructure that puts the frontier conversation back in play.
Content I Enjoyed

When Labour Becomes a Manufacturing Problem

Figure just took its BotQ humanoid line from 1 robot a day to 1 robot an hour, a 24x throughput jump in 120 days. End-of-line yield is above 80% and improving weekly. The battery line is running at 99.3% first-pass yield. Each Figure 03 now passes 80+ functional verification tests, including thousands of squats, presses, and jogs before sign-off. This is humanoid manufacturing starting to behave like commodity electronics.

Wright’s Law, the empirical observation that unit cost falls by a fixed percentage with every doubling of cumulative production, has held across solar panels, lithium-ion cells, and silicon for decades. A 24x ramp compresses several doublings into a single quarter, which collapses the cost curve.

That hardware curve is meeting collapsing inference costs head-on. Epoch AI’s most recent measurement puts the price of equivalent LLM performance halving roughly every two months, with declines of up to 900x per year on certain benchmarks. The “brain” inside a humanoid is rapidly converging on its energy floor. Hardware and software are reducing to a single output: a unit of physical work whose marginal cost decomposes into amortised steel, inference, and electricity, with no biological constraint anywhere in the stack.

China is running the same experiment from different angles. RobotEra is now deploying its L7 humanoid in the thousands across China Post and SF Express logistics centres, achieving up to 85% of human-level efficiency 24/7. Across a full day, that is roughly 2.5x the output of a single human shift. Unitree just launched a dual-arm humanoid at $4,290, with a 20,000-unit shipment target for 2026 and a starting price already below the annual cost of minimum-wage labour in most OECD countries.

For two centuries, the marginal cost of labour was set by biology and education: how quickly humans could be born, raised, and trained. That floor is being replaced by a manufacturing curve.
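The arithmetic behind these claims is easy to check in a few lines of Python. This is a sketch, not Figure's or Epoch AI's own model: the 20% learning rate is an illustrative assumption in the range historically observed for solar panels, and the doubling count assumes cumulative output scales with the throughput ramp. The inference and shift-output figures follow directly from the numbers quoted above.

```python
import math

def wrights_law_cost(c0: float, learning_rate: float, doublings: float) -> float:
    """Unit cost after some number of cumulative-production doublings.

    Wright's Law: cost falls by a fixed fraction (the learning rate)
    with every doubling of cumulative units produced.
    """
    return c0 * (1 - learning_rate) ** doublings

# A 24x ramp compresses log2(24) ~= 4.6 doublings into one quarter
# (assuming cumulative output scales with the throughput jump).
doublings = math.log2(24)

# 20% learning rate is an ILLUSTRATIVE assumption, not a Figure number.
relative_cost = wrights_law_cost(1.0, 0.20, doublings)  # cost falls to ~36%

# Inference side: halving every 2 months compounds to 2**6 = 64x per year
# (the "up to 900x" figure applies only to certain benchmarks).
yearly_inference_decline = 2 ** (12 / 2)

# RobotEra claim: 85% of human efficiency, 24/7, vs one 8-hour human shift.
robot_vs_shift = 0.85 * 24 / 8  # ~2.55x the output of a single shift
```

Each line reproduces a claim in the text: ~4.6 doublings in a quarter, unit cost falling to roughly a third, a 64x baseline annual decline in inference cost, and the 2.5x shift-output figure.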
Every robot rolling off the BotQ line is also a data-collection unit for the next policy version of Helix, which means throughput is now compounding in capability as well as cost.

Idea I Learned

OpenAI’s Own Numbers Are the Problem

This chart from the Wall Street Journal caught my attention this week. It’s built from OpenAI’s own projections. Training costs alone are forecast to exceed total revenue for three consecutive years, from 2026 to 2028, before dropping to roughly 30% of revenue by 2030. That’s an awful lot of faith to ask public market investors to take.

Last October, Sam Altman cited a $1.4 trillion figure for OpenAI’s future compute commitments. CFO Sarah Friar quietly walked the number back to investors at $600 billion through 2030. The WSJ reported last week that she has since told colleagues she’s worried the company can’t pay for those contracts if growth doesn’t accelerate.

Anthropic, on the other hand, is a quieter story. Annualised revenue hit $30 billion in March, ahead of OpenAI’s $25 billion. Eight of the Fortune 10 pay for Claude. Over 1,000 enterprises spend more than $1 million a year on it. Claude Code alone runs at $2.5 billion annualised and holds 54% of the AI coding tool market. Seven of every ten new enterprise customers now pick Anthropic.

So how does this translate to the IPO calendar? Anthropic is targeting an October 2026 listing at a $400–500 billion valuation. OpenAI had been aiming for Q4 2026 too, but Friar has privately suggested waiting until 2027. Internal controls aren’t built out, and revenue is missing targets. Morningstar sees mid-2027 as the realistic window.

OpenAI bet on consumer scale and ate the costs to get there. ChatGPT image generation and Sora 2 drove huge usage spikes in 2025; both have since faded, and Sora 2 is discontinued. GPT-5.5 topped benchmarks at launch, but growth has reportedly flattened.
Anthropic stayed focused on enterprise, owned the coding category, and kept gross margins healthier despite its own compute strain. Banks have told both companies that whoever lists first defines the comparable for frontier AI. If Anthropic prices on cleaner unit economics, OpenAI walks in a year later having to explain why its capital intensity is roughly twice as high for similar revenue, and why three years of its own projections show training costs above the top line. The “buy everything” compute strategy looked like the playbook in 2024. As we move through 2026 and the idea of going public becomes ever more real, it now looks like a balance sheet problem.

Quote to Share

Alexis Ohanian on Google’s AI compute dominance:

The numbers come from Epoch AI’s latest analysis of global chip ownership, which landed days before Alphabet’s Q1 earnings. Google Cloud grew 63% last quarter to $20 billion. Azure grew 40%, whilst AWS grew 28%. The cloud backlog nearly doubled in 90 days to $462 billion in contracted future revenue. Google CEO Sundar Pichai told investors Google is “compute constrained” and that cloud revenue would have been higher if they could meet demand.

Now look at the stack. Google owns the chips, the models, and the cloud. Everyone else rents at least one layer. Microsoft pays OpenAI. Amazon pays Anthropic. Both buy Nvidia GPUs. Anthropic runs much of its training on Google TPUs, paying the company whose model it competes with.

Alphabet’s 2026 capex is now $180–190 billion, with 2027 already guided higher. Owning the stack creates Google’s enduring advantage, and it then becomes simple maths to determine who will be left standing in a decade’s time once the capex wars have played out.

Source: Alexis Ohanian on X

Question to Ponder

“As AI agents become more capable of seeing and acting across our screens, where do you see the future of human-computer interaction heading?”

Interfaces are about to look very different.
Right now, we type into chat windows, screenshot what’s on our screens, and paste the images into Claude or ChatGPT to give it context. That’s an awful lot of friction.

A good example of where things are heading is Clicky. It’s pitched as the simplest interface in the world to talk to AI and spawn agents. It interacts with native Apple Notes, Calendar, and Reminders, builds Mac apps, and runs research for you. The interesting bit is that it can see your screen. That single capability removes the friction: it already sees what you see. You speak, and it then guides your cursor. For something like video editing in DaVinci Resolve, where the learning curve is brutal, it’s like having an expert hand-hold you through the workflow.

Now, I don’t think the chat window will disappear. For iterative work like drafting a document, going back and forth in a chat still wins. But for navigating software, voice-first, screen-aware computing changes everything. The interface of the future is an assistant that watches your screen, listens to your voice, and moves with you through whatever you’re trying to do.

Already a subscriber? Get your whole team on board. Signal Pro group subscriptions give everyone access to weekly AI workflows and tutorials, practical upskilling that pays for itself. It’s the kind of thing L&D budgets were made for. Share this with your manager today.
💡 If you enjoyed this issue, share it with a friend.
Invite your friends and earn rewards

If you enjoy The Signal, share it with your friends and earn rewards when they subscribe.