Picking Teams in AI
Image: three friendly robots in different coloured basketball jerseys on a basketball court, cartoon, via Midjourney

Yesterday, Databricks announced its intent to acquire Mosaic for $1.3b. Perhaps not coincidentally, Snowflake announced a deepened partnership with Nvidia to offer customers models & training on Nvidia’s NeMo platform.

Clouds are picking teams in one of the most important dislocations in software.

Cloud ↔️ LLM Infrastructure

Microsoft ↔️ OpenAI

Snowflake ↔️ Nvidia

Databricks ↔️ Mosaic

Google ↔️ Anthropic

Oracle ↔️ Cohere

Amazon ↔️ HuggingFace

Microsoft has invested over $10b, plus significant development effort, to work with OpenAI. In addition, Microsoft & Snowflake announced a deeper AI go-to-market partnership.

Snowflake’s partnership with Nvidia positions Snowflake’s cloud as a broader infrastructure platform.

Databricks, whose business revolves around Spark operating in customer environments, has announced plans to acquire Mosaic, a vertically integrated model training & management system that runs on similar workloads.

Google has invested hundreds of millions into Anthropic, complementing its efforts with Google Brain.

Oracle has paralleled Microsoft’s OpenAI partnership with its own around Cohere, investing in the company & seeking to build products for Oracle’s cloud with it.

Amazon has announced HuggingFace LLMs on its SageMaker product, embracing the open-source community.

Cloudflare, which has seen tremendous interest in its R2 storage product for model training because of its lower-cost storage, had partnered with Mosaic. It’s unclear how the Databricks acquisition might change that relationship.

Cloud infrastructure players are picking teams within the infrastructure layer. Most major cloud players have picked an LLM partner & perhaps will choose multiple.

For startups building LLM-based applications & infrastructure, this alters the calculus of selecting a cloud. Five years ago, many startups defaulted to AWS for the generous credits, broad catalog, & rapid pace of innovation.

LLM-enabled apps require customer data to train, propelling data security to the top of the list for most enterprise buyers. Startups may begin to pick clouds based on the class of customer they want to reach, the security promises the underlying platforms provide, & then the available algorithms & cost.

Access to particular models may be a consideration, but given the rapid advances in open source, that advantage will likely erode over time.
