Talent wars, data lockdowns, and the $7K reality check
Meta tossed another casual billion into its endless hiring spree (on top of that $14.3B Scale AI shopping trip). We checked the latest hires, and sadly, none of them were Soham Parekh. Was this because Meta’s not a YC company? Guess we'll never know.
Speaking of, do you think it's superintelligent to have an MIT dropout manage a Turing Award winner who came up with the very notion of convolutional neural networks? Purely rhetorical, of course. We can't wait for that awkward all-hands meeting when Alexandr Wang asks Yann LeCun to explain to the class why labeling datasets isn't exactly "scaling."
Meanwhile, Cursor's "unlimited" plan turned out to be about as bottomless as Mission District mimosas on a Saturday - running dry before anyone could even get tipsy. Turns out if you want the real deal in AI, you're going to have to cough up way more cash - and it’s not just for the reason you think.
Key takeaways
Overall, the AI gold rush is maturing rapidly, exposing economic realities, intensifying battles over data, reshaping tech talent dynamics, and challenging companies to deliver real value beyond hype.
🚀 Industry updates
Meta's AI talent war, small victories, big questions
Update on Meta's $100M recruitment saga
In a surprising twist to our previous coverage, Meta has managed to secure 11 OpenAI researchers despite the widely reported failures of their $100 million talent acquisition strategy. While this represents a modest victory after months of unsuccessful poaching attempts, it raises questions about whether Meta's brute-force approach is finally yielding results or if these hires represent outliers willing to trade mission for massive compensation.
The bigger story may be Meta's pivot toward building infrastructure rather than just buying talent, bringing Scale AI CEO Alexandr Wang into the fold to lead their "superintelligence" initiatives, with Wang now managing Yann LeCun. Rather than competing head-to-head for researchers, Meta appears to be betting on data infrastructure and compute scale as their path to AGI.
Scale AI CEO Alexandr Wang announces partnership with Meta for superintelligence research. (Source)
What's changed since the last issue:
What hasn't changed:
The talent war reveals an uncomfortable truth for Meta: in the race to AGI, you need more than deep pockets. You need believers. As one industry insider put it: "Meta is trying to buy a revolution, but revolutions aren't for sale."
36 new unicorns in 2025 - AI dominates
The billion-dollar club is getting crowded: the first half of 2025 has minted 36 new unicorns, with standout valuations:
The megadeals:
The surprises:
Key takeaway: While AI companies dominate the list, the real story is diversity. From Medicare guides to restaurant software, 2025's message is clear: solve a real problem with rapid growth, and unicorn status awaits—AI optional.
Microsoft scales back AI chip ambitions amid development delays
Microsoft is revising its AI chip roadmap through 2028, opting for less ambitious designs after missing key deadlines. The company's latest Maia 200 chip slipped from 2025 to 2026, prompting a strategic shift: instead of designing new chips from scratch annually, Microsoft will now link multiple existing chips together (like the planned Maia 280, which combines two Braga chips). The most ambitious chip, Clea (intended to match Nvidia's performance) has been pushed beyond 2028. Despite spending billions as Nvidia's largest customer last year, Microsoft remains committed to reducing its dependence, targeting 20-30% better performance per watt than Nvidia's 2027 offerings. The reality check: Building competitive AI chips is harder than even tech giants anticipated.
Used EV batteries power new AI data centers
Redwood Materials, the battery recycling startup from Tesla co-founder JB Straubel, just opened an AI data center in Nevada powered entirely by repurposed EV batteries and solar panels. The facility, built with Crusoe (which is also developing OpenAI's Texas data center), runs 2,000 older-gen Nvidia chips using solar by day and recycled batteries by night. The clever pivot: Instead of immediately recycling EV batteries, Redwood extracts their remaining power capacity first, a business Straubel says could generate more revenue than their core recycling operations. As AI's energy demands explode, turning automotive waste into computing power might be the circular economy solution nobody saw coming.
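For a sense of scale, here is a back-of-envelope sketch of what it takes to carry a 2,000-GPU site through the night on retired EV packs. Every figure below (GPU power draw, facility overhead, night length, usable pack capacity) is an illustrative assumption, not a reported Redwood or Crusoe spec:

```python
# Rough sizing of an EV-battery-backed AI data center.
# All constants are assumptions for illustration only.

GPU_COUNT = 2_000        # older-gen Nvidia chips, per the article
WATTS_PER_GPU = 500      # assumed draw for an older-gen datacenter GPU
OVERHEAD = 1.4           # assumed multiplier for cooling/networking/losses
NIGHT_HOURS = 12         # assumed hours on batteries instead of solar
EV_PACK_KWH = 40         # assumed usable capacity left in one retired pack

load_kw = GPU_COUNT * WATTS_PER_GPU * OVERHEAD / 1_000
energy_kwh = load_kw * NIGHT_HOURS
packs_needed = energy_kwh / EV_PACK_KWH

print(f"Night-time load:   {load_kw:,.0f} kW")
print(f"Energy per night:  {energy_kwh:,.0f} kWh")
print(f"EV packs required: {packs_needed:,.0f}")
```

Under these illustrative numbers, a few hundred retired packs would carry the whole site through the night, which is why extracting remaining capacity before recycling pencils out as a business.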
The great AI data lockdown
Cloudflare and enterprise giants erect barriers as the battle for AI training data intensifies
The AI industry's insatiable hunger for data is triggering a widespread backlash. Cloudflare now blocks AI crawlers by default, offering website owners a one-click "toll booth" to keep AI bots from scraping their content. Meanwhile, enterprise software giants are waging their own data wars, throttling API access to prevent AI startups from building competitive products.
The enterprise battleground: Atlassian tried to buy AI search startup Glean for multiple billions in 2023. When rejected, they launched a competing product (Rovo) and started throttling Glean's API access in February, citing "unusual increases in API usage." Salesforce blocked Glean from storing Slack data. Notion is "re-evaluating" third-party data access. The pattern is clear: incumbents are using data access as a weapon against AI-powered competitors.
Why this matters:
As one executive put it: "It's a bright white line that cloud software companies should never cross, limiting customer access to their own data." Yet that line is being crossed everywhere. The AI gold rush has turned data from a shared resource into a strategic weapon. Welcome to the new normal.
OpenAI launches $10M minimum AI consulting service
OpenAI is now offering enterprise consulting services with a $10 million entry fee, hiring "forward-deployed engineers" (many from Palantir) to customize AI models for corporations and governments. Early clients include the Pentagon ($200M contract) and Grab (automated street mapping). The move positions OpenAI as a direct competitor to Palantir and Accenture, signaling ambitions beyond just selling API access to owning the full enterprise AI implementation stack.
Cursor's $7K reality check
The hidden costs of AI coding assistants. (Source)
The AI coding assistant bubble just got its first major reality check. Cursor, the popular AI-powered code editor, learned a painful lesson about the word "unlimited" when developers who paid $7,225 for yearly subscriptions watched their teams exhaust 500 requests in a single day, requests that previously lasted 20-25 days under the old Pro plan.
The issue emerged when Cursor transitioned from "500 guaranteed requests" to "unlimited with rate limits", a change that, counterintuitively, resulted in more restrictions for power users. The developer community quickly noticed, with Reddit and X threads documenting the unexpected limitations.
But the deeper challenge here reflects a broader industry pattern: the economics of AI assistants are trickier than they appear. As frontier models become more capable, they often become more verbose and expensive per request. Cursor, like many AI-first companies, is navigating the difficult balance between user expectations and the reality of LLM costs.
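To make that tension concrete, here is a hedged back-of-envelope on serving costs. Every price and token count below is an illustrative assumption, not Cursor's or any model provider's actual rate:

```python
# Why "unlimited" AI coding plans strain: a cost sketch.
# All prices and token counts are illustrative assumptions.

INPUT_PRICE_PER_MTOK = 3.00    # assumed $ per 1M input tokens, frontier model
OUTPUT_PRICE_PER_MTOK = 15.00  # assumed $ per 1M output tokens
TOKENS_IN_PER_REQ = 8_000      # assumed context sent per coding request
TOKENS_OUT_PER_REQ = 2_000     # assumed completion length

cost_per_request = (
    TOKENS_IN_PER_REQ / 1e6 * INPUT_PRICE_PER_MTOK
    + TOKENS_OUT_PER_REQ / 1e6 * OUTPUT_PRICE_PER_MTOK
)

# A power user burning 500 requests a day, 22 working days a month:
monthly_cost = cost_per_request * 500 * 22

print(f"Cost per request: ${cost_per_request:.3f}")
print(f"Monthly cost:     ${monthly_cost:,.2f}")
```

Set against a typical ~$20-per-seat monthly plan (again, an assumption), even these modest numbers put a 500-request-a-day power user at an order of magnitude more in API cost than they pay in subscription fees.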
The timing is particularly challenging given the competitive landscape. With GitHub Copilot, Claude Code, and open-source alternatives like Roo Code and Cline readily available, developers have options. As one user noted: "The moat is gone." When switching costs approach zero and trust wavers, even the best product experience struggles to retain users.
This incident highlights a critical lesson for AI-powered tools: transparency around limitations and pricing changes matters as much as the technology itself. In a market where intelligence is supposedly getting cheaper, companies are discovering that delivering on "unlimited" AI is far more complex than it initially seemed.
The AI tool economics reality:
Cursor's CEO issued rapid damage control: full refunds, apologetic blog posts, promises to "do better." But the damage was done. Once developers start Googling "Cursor vs...", you've already lost them.
This incident exposes a fundamental tension in the AI tools market: Is AI actually cheap, or are we just in the honeymoon phase? As companies realize that every "AI-powered" feature is essentially a costly API call to OpenAI or Anthropic, the economics of "unlimited" become unsustainable. Cursor's stumble might just be the canary in the coal mine for an industry built on subsidizing expensive AI with venture capital.
📄 Paper spotlights
Parallels between the VLA model post-training and human motor learning
The paper addresses a critical challenge in robotics: while VLA models have shown impressive generalization by leveraging pre-training on large-scale datasets, they still exhibit significant performance gaps when deployed for specific manipulation tasks. This limitation stems from the fundamental differences between the controlled conditions of pre-training and the complexity of real-world robotic applications.
The human motor learning analogy
The authors establish a compelling parallel between VLA model training and human motor skill acquisition:
Pre-training stage
Post-training stage
This biological inspiration provides a structured framework for understanding why post-training is essential and how it can be systematically approached.
Notable approaches include A3VLM, which integrates affordance representations directly into the model, and methods like RoboSpatial that enhance 3D perception capabilities through visual question answering.
Enhancing environmental perception. (Source)
Applying systems engineering and satellite earth observation for SDG15 monitoring in Ghana
The United Nations' 2030 Agenda for Sustainable Development established 17 Goals with 169 Targets and 232 quantitative Indicators. SDG15 ("Life on Land") focuses on protecting terrestrial ecosystems, with specific indicators including:
Ghana, like many nations, faces challenges in compiling comprehensive data for SDG reporting. Traditional ground-based surveys are resource-intensive and may not capture the full spatial and temporal dynamics of environmental change. This study addresses these limitations by developing satellite-based methodologies tailored to Ghana's specific context.
Land cover change analysis (2015-2022)
The study revealed significant land use transitions across Ghana:
The EVDT modeling framework. (Source)
The power of EVDT lies in its pragmatic approach, ensuring that technological solutions are grounded in environmental realities, address human vulnerabilities, and support informed decision-making.
💙 Projects we loved over the last two weeks
🧠 Liquid AI's LFM-1.3B-Math: Researchers Tim Seyde and Rohin Manvi transformed a general 1.3B chat model into a math reasoning powerhouse that matches DeepSeek-R1-Distill. Key breakthrough: best-in-class performance at 4K tokens (crucial for edge deployment) achieved through 4.5M samples, supervised fine-tuning (14% → 60%), and GRPO reinforcement learning to shorten responses while maintaining accuracy.
🏦 Trading agents: This multi-agent framework by Yijia Xiao et al. simulates real trading firms with seven specialized LLM agents working collaboratively. Unlike traditional single-agent approaches, it mirrors actual trading teams with analysts, researchers, traders, and risk managers using structured communication protocols to avoid the "telephone game" effect of degraded information. The system strategically pairs quick-thinking models (gpt-4o-mini) for data retrieval with deep-thinking models (o1-preview) for complex reasoning. Testing on major tech stocks (Apple, Nvidia, Microsoft, Meta, Google) from January-March 2024 achieved 23.21% minimum cumulative returns and 24.90% annualized returns, outperforming best baselines by 6.1%. Particularly impressive: 26% returns on volatile $AAPL in just three months.
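The division of labor described above can be sketched as a tiny pipeline of role-specialized agents passing structured reports rather than free-form chat. The roles, report fields, and stub policies below are illustrative only; the actual framework wires fast- and deep-thinking LLM calls into each role:

```python
# Minimal sketch of the structured multi-agent pattern (illustrative,
# not the TradingAgents implementation). Each agent consumes typed
# Reports from upstream agents and emits one of its own, so information
# isn't degraded hop to hop.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Report:          # structured message exchanged between agents
    author: str
    ticker: str
    summary: str
    signal: str        # "buy" / "hold" / "sell"

def make_agent(name: str, policy: Callable[[list[Report]], Report]):
    """Wrap a decision policy (a stand-in for an LLM call) as an agent."""
    def run(inbox: list[Report]) -> Report:
        report = policy(inbox)
        report.author = name
        return report
    return run

# Stub policies standing in for LLM calls: a quick model would handle
# analysis, a deep-reasoning model the final decision.
analyst = make_agent("analyst", lambda _: Report("", "AAPL", "earnings beat", "buy"))
risk = make_agent(
    "risk_manager",
    lambda inbox: Report("", inbox[0].ticker,
                         f"checked: {inbox[0].summary}",
                         inbox[0].signal if inbox[0].signal != "sell" else "hold"),
)
trader = make_agent(
    "trader",
    lambda inbox: Report("", inbox[0].ticker, "final decision", inbox[0].signal),
)

# Pipeline: analyst -> risk manager -> trader.
a = analyst([])
r = risk([a])
decision = trader([r])
print(decision.author, decision.ticker, decision.signal)
```

Because every hop is a typed `Report`, downstream agents see exactly the fields upstream agents produced, which is the structured-communication idea behind avoiding the "telephone game" effect.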
🤖 AutoGluon-assistant: AWS's zero-code ML interface solves tabular problems via natural language. It outperforms 74% of human practitioners in Kaggle competitions and secured a top-10 finish in the $75,000 AutoML Grand Prix. Built on AutoGluon 1.0, it posts a 75% win-rate improvement and, with just 5 minutes of training, beats other systems given a full hour.
💡 Discussions worth reading
Companies spend more on Cursor, less on engineers
The AI coding revolution is reshaping startup economics in stark ways. Gumroad has become the poster child, slashing its engineering team from 40 to just 12 developers while spending only $2,500-$5,000 monthly on AI tools like Cursor—a fraction of eliminated engineer salaries.
The impact is brutal: engineer salaries have dropped from $300,000 to $250,000 as AI handles bulk coding. CEO Sahil Lavingia reports that engineers now primarily review AI-generated code rather than write it, shifting their role from creator to curator. This massively benefits Cursor (and Anthropic, whose models power it) as companies achieve similar output with dramatically smaller teams.
Meanwhile, the talent war intensifies elsewhere: Apple just lost Ruoming Pang, its top AI models executive, to Meta's new superintelligence team, another blow to Apple's struggling AI efforts as Meta continues its aggressive hiring spree.
LeCun challenges the AlphaFold origin story
Meta's Chief AI scientist, Yann LeCun, sparked debate by questioning the narrative around AlphaFold's revolutionary breakthrough, arguing that DeepMind's achievement didn't emerge in isolation. In a LinkedIn post, LeCun traced protein structure prediction using neural networks back decades, crediting Pierre Baldi at UC Irvine for using recurrent nets to predict protein contact maps in the 2000s "long before deep learning became cool."
He highlighted overlooked contributions, including Daniel Cremers' team's work and Meta's own FAIR Protein group led by Alex Rives, which pioneered self-supervised pre-trained transformers for protein structure prediction (ESMFold). That team later formed EvolutionaryScale, a startup competing in the protein folding space. He traced the field's origins to the 1990s Snowbird Workshop with pioneers like Anders Krogh, Richard Durbin, and David Haussler, noting a 2001 MIT Press book, "Bioinformatics: The Machine Learning Approach," that documented the early state of the art.
Coding assistants prove surprisingly sticky
New research from Indagari reveals AI coding assistants achieving retention rates that rival general-purpose chatbots, with professional developer tools dramatically outperforming consumer-focused options.
This comes amid explosive AI adoption: Ramp's AI Index shows 42% of U.S. businesses now pay for AI tools (April 2025), up from 6.2% in January 2023 - nearly 7x growth, and far above the government's official adoption estimate of 9.4%.
Overall adoption rate. (Source)
Elite performers (70%+ retention):
Moderate performers:
Struggling segment:
Multi-tool adoption rising: Over 20% of coding tool users maintain multiple subscriptions simultaneously, indicating tools are becoming complementary rather than substitutes. This suggests the market is maturing toward specialized use cases rather than winner-take-all dynamics.
💰 Money moving in AI and data
$3.92 trillion market cap: Nvidia briefly hit a market value of $3.92 trillion on Thursday, putting it on track to become the most valuable company in history and surpassing Apple's record. Analysts believe Nvidia could become the first company to reach $4 trillion this summer, with Loop Capital setting a $250 price target that would equate to a roughly $6 trillion market value.
$4 billion annual revenue pace: Anthropic has reached $4 billion in annualized revenue, up nearly four times from the start of the year, as the company's surge is largely driven by selling AI models as a service to other companies, with code generation being a key driver. The meteoric rise positions Anthropic as potentially the fastest-growing SaaS company ever seen by VCs, though it still trails OpenAI's projected $12 billion total revenue for 2025.
$100 million valuation: Applied Compute, a pre-launch startup founded by three former OpenAI technical staffers, raised $20 million at a $100 million valuation in a round led by Benchmark partner Victor Lazarte. The reinforcement learning startup, founded by recent Stanford graduates Rhythm Garg, Linden Li, and Yash Patil, has already fielded interest from investors at valuations of $200 million and even $500 million.
$60 million Series D: Clearspeed secured $60 million in Series D funding led by Align Private Capital, bringing the company's total funding to $110 million. The voice-based risk assessment company consistently yields more than 30X return on investment for insurers by cutting claims handling time in half and increasing immediate payments by 40%. Former CIA Director General David Petraeus joined as a multi-round investor.
$34 million seed round: Wonderful raised $34 million led by Index Ventures to accelerate deployment of its AI customer service agents in non-English-speaking markets across Europe, Asia, and the Middle East. Founded in 2025 in Israel, the company is already powering hundreds of thousands of customer interactions for market-leading enterprises, including Bezeq and Maccabi Health Services, targeting a $200 billion global opportunity largely ignored by US-focused AI solutions.
$34 million dedicated YC fund: Y Combinator alum Kulveer Taggar launched Phosphor Capital, a venture firm dedicated solely to investing in YC companies, raising $34 million across two funds with backing from YC CEO Garry Tan. The fund capitalizes on YC's track record, where 6% of companies become unicorns, with Taggar particularly excited by opportunities in young AI startups under Tan's leadership.