Building a Standalone MCP Server: Bridging Legacy and Modern Systems with AI

Open Spaces is Gun.io's field guide to the tools our engineers actually ship with. In this third installment, veteran full-stack architect Tim Kleier shows how you can unlock AI muscle without touching your underlying stack. This builds on Part 1: Model Context Protocol (MCP)—The Missing Layer Between AI and Your Apps, and Part 2: Wrapping an Existing API with MCP: How to Expose Your Current APIs to LLMs.

What if AI could seamlessly query both modern APIs and legacy databases—without your team rebuilding a thing?

That’s exactly what the Model Context Protocol (MCP) enables. In a previous article, Wrapping an Existing API with MCP, we connected to a customer support API. In this guide, we’ll build a standalone MCP server that exposes a legacy database to large language models (LLMs).

We’ll continue to use our customer support scenario to walk through how MCP servers work. For an intro to MCP, check out Model Context Protocol: The Missing Layer Between AI and Your Apps.

The Use Case: Investigating Old Tickets with AI

Two years ago, your company switched to a new customer support platform. The new system offers a clean REST API. But the old system? It’s just a dusty Postgres database.

Now imagine a B2B customer submits a ticket:

“Hey, why is our payment $499/month now? We agreed to $350/month when we signed on three years ago.”

Support reps would normally have to contact a developer to query the legacy database. But with an MCP server in place, an LLM can do it for them.

Why Use a Standalone MCP Server?

While MCP tools can be embedded in an existing app (as we showed here), some orgs prefer to keep AI tooling separate:

  • Modularity – A dedicated AI layer avoids polluting core app code.

  • Cross-system Orchestration – A standalone MCP server can wrap multiple systems into a unified toolset.

  • Scale – You can run the MCP server as a scalable service behind an agent gateway.

Let’s see one in action.

Step 1: Set Up the MCP Server

Creating an MCP server is pretty simple, and the official docs include a quickstart. To connect to our legacy support system, we’ll use a PostgreSQL connector limited to read-only queries.

Below are some of the key code snippets, and the full source code can be found here: https://github.com/upgrade-solutions/mcp-server-postgres

As you can see, the code primarily involves basic MCP server setup, a PostgreSQL connection, registering the query tool, and query execution.
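The critical piece is making sure the query tool really is read-only. Here's a rough sketch of such a guard in plain Python (independent of any MCP SDK; the function name and rules are illustrative, not taken from the linked repo):

```python
# Illustrative read-only guard for an MCP "query" tool.
# The statement rules here are a sketch; the real server may differ.

READ_ONLY_PREFIXES = ("select", "with", "explain")

def assert_read_only(sql: str) -> str:
    """Return the statement if it looks read-only; raise otherwise."""
    stmt = sql.strip().rstrip(";").strip()
    if ";" in stmt:
        # Crude defense against "SELECT 1; DROP TABLE tickets"
        raise ValueError("multiple statements are not allowed")
    if not stmt.lower().startswith(READ_ONLY_PREFIXES):
        raise ValueError("only read-only queries are allowed")
    return stmt
```

For belt-and-braces safety, the server can also run each query inside a transaction opened with `BEGIN TRANSACTION READ ONLY` and roll it back afterward, so Postgres itself rejects any write that slips past the string check.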

To use the Postgres MCP server with Claude Desktop, you’ll need to register it in your claude_desktop_config.json file.
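A minimal entry might look like the following (the server name, command, path, and connection string are all placeholders — substitute the values for your own build and database):

```json
{
  "mcpServers": {
    "legacy-support-db": {
      "command": "node",
      "args": ["/path/to/mcp-server-postgres/dist/index.js"],
      "env": {
        "DATABASE_URL": "postgresql://readonly_user:password@localhost:5432/legacy_support"
      }
    }
  }
}
```

Using a dedicated read-only database role in the connection string adds a second layer of protection beyond the server's own query checks.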

Step 2: Use It from an LLM Agent

Now that the MCP server is ready and Claude has registered it, we can find that old ticket about grandfathered pricing. 

Not only was Claude able to find the record, it did so when given a slightly incorrect email address. This is because it can execute multiple read-only queries until it finds the data we’re looking for. Incredibly powerful!
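To see why a fuzzy follow-up query succeeds where an exact match fails, here's a small simulation of the two attempts (using an in-memory sqlite3 database as a stand-in for the legacy Postgres instance; the schema and email addresses are invented):

```python
import sqlite3

# In-memory stand-in for the legacy support database (schema is invented).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tickets (id INTEGER, customer_email TEXT, subject TEXT)")
conn.execute(
    "INSERT INTO tickets VALUES (1, 'billing@acme-corp.com', 'Grandfathered pricing: $350/month')"
)

# Attempt 1: the exact (slightly wrong) address the rep supplied finds nothing.
wrong = conn.execute(
    "SELECT subject FROM tickets WHERE customer_email = ?", ("billing@acme.com",)
).fetchall()

# Attempt 2: the agent broadens the search on its own, just as Claude did.
# (Postgres would use ILIKE; sqlite's LIKE is already case-insensitive for ASCII.)
fuzzy = conn.execute(
    "SELECT subject FROM tickets WHERE customer_email LIKE ?", ("%acme%",)
).fetchall()

print(wrong)  # []
print(fuzzy)  # [('Grandfathered pricing: $350/month',)]
```

Because every query is read-only, the agent can safely keep reformulating until it finds the record — something a hand-written integration would rarely do.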

Closing Thoughts

As AI agents grow more capable, they need structured access to real data—not just chat history. MCP offers the missing abstraction for that.

Standalone MCP servers let you build flexible, language-agnostic interfaces on top of your existing systems. Whether it’s a brand-new REST API or a crusty legacy database, the model doesn’t care—as long as it has tools to work with.

And that’s the future with MCP: a seamless bridge between systems old and new.
