Build Better AI with Clarifai: Week 1 of AI Developer Series
This week's AI in 5 newsletter is special!
We’re kicking off a 6-week masterclass that will help you easily build and deploy production-grade AI. 💪
Over the next 6 weeks, we will explore key topics like AI Agents, the Model Context Protocol (MCP), deploying custom models on dedicated hardware, and more. Every week, we’ll focus on a specific topic in detail, providing code snippets, example projects, tutorials, and workflows to help you build solid AI solutions.
Here’s what we’ll cover today:
- Clarifai’s OpenAI-compatible API
- Using it with the OpenAI SDK, the Vercel AI SDK, and LiteLLM
- Dev tip: building agents faster with OpenAI-compatible endpoints
Let’s dive in! 👇
Add Models to Your AI Stack with an OpenAI-Compatible API! 🚀
You can now run over 100 open-source and third-party models, including LLMs, vision models, and audio models, using the familiar OpenAI API format. This makes it easier to plug models into your existing tools and workflows without starting from scratch.
Clarifai supports an OpenAI-compatible endpoint, which means you can send standard OpenAI-style requests and have them translated into Clarifai API calls under the hood.
This compatibility lets you reuse your existing OpenAI client code, switch models by changing only the model ID, and keep your current tools and workflows intact.
Now let’s look at how to use this endpoint with different libraries like the OpenAI SDK, LiteLLM, and the Vercel AI SDK:
1. OpenAI Library
Use the OpenAI SDK to run major open-source or commercial models hosted on Clarifai without changing your codebase.
By simply updating the base URL and using your Clarifai Personal Access Token (PAT), you can send OpenAI-style requests to access over 100 models.
Whether you're working in Python or JavaScript, the OpenAI client supports chat completions, streaming responses, and tool calling against Clarifai-hosted models.
If you're already familiar with the OpenAI SDK, you can use your existing setup to access these diverse models and start building your own AI-powered applications right away.
Check out the code snippet below to get started and learn more here.
2. Vercel AI SDK
More and more developers are building AI apps with TypeScript and full-stack tools, and the Vercel AI SDK is becoming a core part of that stack.
The SDK works with OpenAI-compatible providers, so you can connect to Clarifai’s endpoint to generate text, stream responses, and call tools in your TypeScript projects.
Whether you’re building APIs, integrating with a frontend, or working across both, this setup lets you stay in your stack while adding powerful model capabilities.
In the code snippet below, you’ll see an example of how to access the latest Claude Sonnet 4 model from the community using the Vercel AI SDK.
Check out more examples here.
3. LiteLLM
LiteLLM gives developers a simple way to run inference across multiple LLM providers through a single interface, and it supports OpenAI-compatible APIs out of the box.
You can connect it to Clarifai’s OpenAI-compatible endpoint to chat, stream responses, and call tools with minimal setup. Just update the base URL and add your Clarifai Personal Access Token (PAT).
Find more examples here.
Dev Tip: 📌
Build Agents Faster with OpenAI-Compatible Endpoints
Frameworks like CrewAI, LangGraph, and Google’s ADK are built to help you create autonomous agents that can plan, reason, use tools, and complete tasks.
Most of these frameworks are designed around OpenAI-style APIs for LLM integration, so you can easily connect models from Clarifai without changing your agent code.
Just set the base URL and your Personal Access Token (PAT) to start building and running agentic systems.
Below is an example showing how to integrate DeepSeek-R1 with Google’s ADK in just a few lines of code.
That’s it for today’s issue. Hope you enjoyed it!
Stay tuned for the next one, where we’ll cover how to build AI agents from scratch.
Happy building!
Clarifai Team