Building an Intelligent AI System for Interrogating Data with Plain English

I was speaking with a client about a control panel I'd recently delivered. It was a fairly sophisticated piece of work - drawing in live data feeds from hundreds of systems and devices, and presenting it all in one place. Everything was live, responsive, and designed to give them the oversight and control that they needed.

During that conversation, they asked something that sparked the idea for this build:

"How do I read the data? It's very important to be able to find insights from it."

They weren’t a developer. They didn’t know SQL. I showed them the database, but it meant nothing to them. They were a decision-maker, though. What if I could give them easy access to insights without the need to go through a data team, developer or analyst?

This would be a tool that could:

  • Connect to a MySQL database

  • Understand the schema

  • Let users ask plain English questions

  • Return not just the raw data, but a useful, contextual summary of the results

From Idea to Execution

With that goal in mind, I put together a plan for a simple, but powerful AI-driven tool, which would be a sort of "AI SQL whisperer". The core premise was this:

You talk to the system in natural language. It figures out the schema, writes the query, runs it safely, and explains the result like a helpful analyst.

Here's how I approached the build:

Building the System

Frontend scaffolding was done using Lovable.dev. I wanted to get a modern, clean interface up and running quickly, and like past projects, it would be something that I could iterate on. I've used Bolt.new previously, but sadly, it couldn't handle the scope this time.

From there, I moved the project into Cursor, which handled much of the backend build-out. I had Cursor generate initial service layers, connect to the MySQL database, and manage the API endpoints. Then I layered in AI capabilities.

On the backend, the main stack included:

  • Node.js / Express for API routes

  • SQLite for storing schema and conversation threads locally

  • MySQL for external database access

  • OpenAI Responses API (using GPT-4o), plus file storage, for natural language processing, SQL generation, and result summarisation

The frontend was powered by React + TypeScript, with UI components drawn from shadcn/ui. I focused on building a tight, split-panel layout:

  • Left pane: schema explorer

  • Right pane: AI chat interface

Every question the user asks is processed in real time. The AI generates a safe SQL query (enforced with validation rules, no INSERTs or DELETEs), runs it, and then analyses the result set to generate a human-readable response.
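The flow above can be sketched as a small pipeline, with the AI and database calls injected as dependencies. This is a minimal illustration, not the actual service layer: the names `generateSql`, `runQuery`, and `summarise` are assumptions for the sketch.

```typescript
// Illustrative ask→answer pipeline. The dependency names are
// hypothetical; a real build would wire in the OpenAI client and
// the MySQL connection here.
type Deps = {
  generateSql: (question: string, schema: string) => Promise<string>;
  runQuery: (sql: string) => Promise<Record<string, unknown>[]>;
  summarise: (question: string, rows: Record<string, unknown>[]) => Promise<string>;
};

async function answerQuestion(
  question: string,
  schema: string,
  deps: Deps
): Promise<string> {
  const sql = await deps.generateSql(question, schema);

  // Reject anything that is not a single read-only SELECT statement.
  if (!/^\s*select\b/i.test(sql) || sql.includes(";")) {
    throw new Error("Unsafe query rejected");
  }

  const rows = await deps.runQuery(sql);
  return deps.summarise(question, rows);
}
```

Injecting the dependencies like this also makes the pipeline easy to test with stubs, without touching a live database or the OpenAI API.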

Safety and Control

A baseline set of security and safety measures was baked into the system:

  • All queries are strictly read-only (SELECT-only)

  • LIMIT 100 is enforced automatically (to prevent massive result sets)

  • Disallowed keywords and semicolon chaining are blocked

  • Sensitive fields are automatically stripped from outputs

It was important that users couldn’t accidentally (or intentionally) harm the data. No dropping of databases please!
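The guardrails above can be sketched as a pair of small functions: one that validates and caps the generated SQL, and one that strips sensitive columns from the result rows. The keyword list, field names, and function names are illustrative assumptions, not the production code.

```typescript
// Hypothetical guardrail sketch: SELECT-only, no statement chaining,
// blocked write/DDL keywords, an enforced LIMIT, and sensitive-field
// stripping. The BLOCKED and SENSITIVE lists are examples only.
const BLOCKED = /\b(insert|update|delete|drop|alter|truncate|grant)\b/i;
const SENSITIVE = new Set(["password", "password_hash", "api_key"]);

function validateQuery(sql: string): string {
  const trimmed = sql.trim().replace(/;+\s*$/, ""); // drop trailing semicolons
  if (!/^select\b/i.test(trimmed)) throw new Error("Only SELECT queries are allowed");
  if (trimmed.includes(";")) throw new Error("Statement chaining is blocked");
  if (BLOCKED.test(trimmed)) throw new Error("Disallowed keyword in query");
  // Enforce a row cap if the model didn't add one itself.
  return /\blimit\s+\d+\s*$/i.test(trimmed) ? trimmed : `${trimmed} LIMIT 100`;
}

function stripSensitive(
  rows: Record<string, unknown>[]
): Record<string, unknown>[] {
  return rows.map((row) =>
    Object.fromEntries(
      Object.entries(row).filter(([key]) => !SENSITIVE.has(key.toLowerCase()))
    )
  );
}
```

Validating the text of the query is a belt-and-braces measure; in production you would also connect with a read-only database user so the database itself enforces the same rule.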

Understanding Schema for Smarter Queries

One of the more impressive aspects of this system is how the AI handles database structure. Because the AI is given the full schema - including all tables, columns, data types, and relationships - it effectively builds a mental model of how everything connects.

The AI is able to infer how the tables relate to one another - understanding foreign keys, one-to-many and many-to-many relationships, and how data flows through the system. It uses this understanding to determine which tables need to be joined, how to structure those joins, and which fields are relevant to the user's question. As a result, it's able to generate reasonably complex SQL queries that span multiple tables using accurate joins.

There’s no hardcoded logic and there are no fixed templates. Each query is dynamically constructed in real time, driven entirely by the AI’s comprehension of both the user’s intent and the structure of the schema. The quality of the queries it produces is genuinely impressive - even though I expected it to perform well, it still surprised me.

This makes it a genuinely intelligent tool for exploring real-world databases in a way that would normally require knowledge of SQL at the very minimum.

The Result: Data Access for Everyone

After just a day of building, testing, and refining, the system was fully operational. It could:

  • Connect to any MySQL schema

  • Extract and summarise its structure in Markdown

  • Maintain conversation context across threads

  • Let users ask questions and get smart, natural explanations of what the data says
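The schema extraction step can be illustrated with a small formatter that turns extracted table metadata into the Markdown summary the system stores. The `Table` shape here is an assumption for the sketch; a real implementation would populate it from `information_schema` queries against MySQL.

```typescript
// Illustrative schema→Markdown summariser. The Table/Column types are
// hypothetical; real metadata would come from information_schema.
type Column = { name: string; type: string; isPrimary?: boolean };
type Table = { name: string; columns: Column[] };

function schemaToMarkdown(tables: Table[]): string {
  return tables
    .map((table) => {
      const cols = table.columns
        .map((c) => `- \`${c.name}\` (${c.type})${c.isPrimary ? " **PK**" : ""}`)
        .join("\n");
      return `## ${table.name}\n${cols}`;
    })
    .join("\n\n");
}
```

A summary in this shape is also convenient to hand straight to the model as context, since it is compact and unambiguous about names and types.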

I built a quick demo database based on an ecommerce use case for the system to interrogate. Let me show you what the system looks like:

  • Schema summary + AI response interface

  • AI answering the question "who has made the highest value order?"

  • AI answering the question "What does this database do?"

  • AI interpreting a vague query and explaining something interesting that it found in the data

Why This Matters

Most organisations have a lot of data stored in SQL databases. But the majority of staff can't access that data directly - usually because it's only accessible to the systems that create it.

This project has the potential to make that data accessible to anyone. It turns raw, relational databases into something that can be explored conversationally. The AI helps retrieve the data and explain what it means.

This is where AI can add some real value: It becomes a bridge between human intention and technical execution.

Beyond MVP

While the current version is fully functional, the next steps might be:

  • Data visualisation: Auto-generating charts based on results

  • Voice input: Letting users ask questions out loud

  • Export tools: Turn queries + results into report PDFs

  • Multi-user support: With authentication and saved sessions

I'd be surprised if this didn't already exist as a SaaS platform - it's a use case that would be valuable to almost any organisation.

Final Thoughts

While I've built quite a few complex systems using AI, this project took shape and became genuinely useful a lot faster than even I expected.

If you're working with data, thinking about internal tooling, or trying to make information more accessible to your team - this might be something worth exploring.

As always, happy to share insights or help brainstorm how something like this could work for your organisation.
