Wrapping an Existing API with MCP: How to Expose Your Current APIs to LLMs
Open Spaces is Gun.io’s field guide to the tools our engineers actually ship with. In this second installment, building on Part 1, “Model Context Protocol (MCP)—The Missing Layer Between AI and Your Apps,” veteran full-stack architect Tim Kleier shows how you can unlock AI muscle without touching your underlying stack.
Have a plain-vanilla REST endpoint—say, one that files support tickets? Strap on MCP and suddenly Claude, ChatGPT, or any LLM agent can create, update, and triage tickets through simple prompts. Tim walks through wrapping a Node-based ticket service, exposing an /mcp endpoint, and watching a live LLM open a ticket for “Mark in Milwaukee” in real time. Code snippets are inline; the full repo lives at github.com/upgrade-solutions/mcp-api-wrapper.
If you’ve been waiting for a practical way to let AI act on your data—not just talk about it—start here.
Our Scenario
Let’s imagine we work for a SaaS company with a custom system for creating support tickets. We want to hook it up to Claude so our customer support team can interact with the API through prompts. This is our end goal:
We give Claude a prompt and it employs the create_ticket MCP tool, which connects to our support API to create a support ticket for Mark. Before we dive into the code to see how the MCP tool is exposed alongside our regular API, let’s first talk about MCP tools.
What Are MCP Tools?
MCP defines a standard way for large language models (LLMs) to interact with external systems via tools. An MCP tool is a description of an action a model can take. Each MCP tool has these basic characteristics: a name the model uses to invoke it, a human-readable description of what it does, an input schema defining the parameters it accepts, and a handler function that performs the action.
Rather than creating a separate MCP server just for tools, you can expose your MCP tool definitions at a dedicated endpoint within your existing API. This keeps things simple and lets LLMs discover the capabilities dynamically.
Here’s the MCP definition of our tool (create_ticket) in a Node context:
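The exact code lives in the repo; as a rough sketch, assuming the official TypeScript SDK (@modelcontextprotocol/sdk), Zod for the schema, and a hypothetical createTicket() helper shared with the REST layer, it might look like this:

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";
import { createTicket } from "./tickets.js"; // hypothetical helper shared with the REST API

// The MCP server that holds our tool definitions
const server = new McpServer({ name: "support-api", version: "1.0.0" });

// create_ticket: a name, a description, an input schema, and a handler
server.tool(
  "create_ticket",
  "Create a support ticket for a customer issue",
  {
    customer_name: z.string().describe("Name of the customer reporting the issue"),
    issue: z.string().describe("Description of the problem"),
  },
  async ({ customer_name, issue }) => {
    const ticket = await createTicket({ customer_name, issue });
    return {
      content: [{ type: "text", text: `Created ticket #${ticket.id} for ${customer_name}` }],
    };
  }
);
```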
We first provide a name and description for the tool, then define the input schema. Finally, we define the function for processing a tool request. This is the corresponding API endpoint using Express:
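Again, the repo has the real thing; a minimal sketch, assuming a /tickets route and the same createTicket() helper, looks something like this:

```typescript
import express from "express";
import { createTicket } from "./tickets.js"; // same helper the MCP tool calls

const app = express();
app.use(express.json());

// POST /tickets: the existing REST endpoint for creating a support ticket
app.post("/tickets", async (req, res) => {
  try {
    const { customer_name, issue } = req.body;
    const ticket = await createTicket({ customer_name, issue });
    res.status(201).json(ticket);
  } catch (err) {
    res.status(500).json({ error: "Failed to create ticket" });
  }
});
```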
The key point of overlap here is createTicket(). Our API endpoint calls that function to create a support ticket, and our MCP tool does exactly the same.
MCP Endpoint
Now that we’ve seen the side-by-side comparison of our ticket creation functionality in an API endpoint and an MCP tool, let’s take a look at how the MCP “server” is really just exposed as an endpoint in our API.
In our tool declaration above, we added a tool to the server object, and here we expose that server over HTTP (via a streamable HTTP transport) to the outside world.
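Sketched out, and assuming the Express app and McpServer from the earlier snippets, the wiring might look like this (a stateless transport keeps the example simple; a production setup would likely manage sessions):

```typescript
import { StreamableHTTPServerTransport } from "@modelcontextprotocol/sdk/server/streamableHttp.js";

// POST /mcp: expose the MCP server alongside the existing REST routes.
// `app` and `server` are the Express app and McpServer defined earlier.
app.post("/mcp", async (req, res) => {
  const transport = new StreamableHTTPServerTransport({
    sessionIdGenerator: undefined, // stateless: no session tracking
  });
  res.on("close", () => transport.close());
  await server.connect(transport);
  await transport.handleRequest(req, res, req.body);
});

app.listen(3000, () => console.log("Support API (REST + MCP) listening on :3000"));
```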
This /mcp endpoint provides a list of available tools for MCP hosts and clients to interact with. Using a tool like MCP Inspector, we can see that the list of tools includes create_ticket, along with the parameters it accepts.
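If you haven’t used it before, the Inspector runs locally via npx @modelcontextprotocol/inspector; point it at your server’s /mcp URL (e.g., http://localhost:3000/mcp if you used the port from the sketch above).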
Calling an MCP-wrapped API
Now that we’ve verified our MCP endpoint is working, we can point Claude to the running API.
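The exact setup depends on your Claude client; one common approach (an assumption here, not necessarily how the repo’s demo is wired) is Claude Desktop’s claude_desktop_config.json, using the mcp-remote bridge to reach a streamable HTTP server:

```json
{
  "mcpServers": {
    "support-api": {
      "command": "npx",
      "args": ["mcp-remote", "http://localhost:3000/mcp"]
    }
  }
}
```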
After restarting Claude, the create_ticket tool from our support-api MCP server will be available, and we can easily prompt Claude to help us create a support ticket.
Final Thoughts
As we’ve discussed in this article, you can hook up LLMs to your existing API without much additional code. MCP gives you a thin abstraction layer that lets LLMs safely and intelligently interact with your systems—starting with the APIs you already have.
Whether you’re improving customer support, automating IT workflows, or streamlining internal operations, wrapping your API with MCP can empower your workforce with new AI tooling.
Ready to bridge your existing systems with AI? The engineers who can implement solutions like this are in high demand. If you’re looking to hire developers with MCP and AI integration experience, Gun.io’s network includes the specialists who can turn your current APIs into AI-powered tools.