New AI Framework Makes Small Language Models Mighty for Databases

MATS enables cost-effective, private Text2SQL solutions using compact AI models.

A new framework called MATS allows Small Language Models (SLMs) to perform Text2SQL tasks as effectively as larger, more expensive models. This innovation addresses privacy and cost concerns, making advanced database interaction more accessible for businesses.

By Mark Ellison

December 23, 2025

4 min read

Key Facts

  • MATS is a novel Text2SQL framework designed for Small Language Models (SLMs).
  • It addresses privacy and cost concerns associated with Large Language Models (LLMs).
  • MATS uses a multi-agent mechanism and reinforcement learning for improved performance.
  • The framework achieves accuracy on par with large-scale LLMs using significantly fewer parameters.
  • MATS can be deployed on a single-GPU server.

Why You Care

Ever wished you could just talk to your database and get answers, without needing a coding expert? What if privacy and cost were no longer barriers? A new AI framework called MATS (a multi-agent Text2SQL framework) is making this a reality for businesses, according to the announcement. It could dramatically change how you interact with your company’s data, making access faster and more intuitive.

What Actually Happened

Researchers have introduced MATS, a novel Text2SQL framework designed specifically for Small Language Models (SLMs), as detailed in the blog post. Text2SQL is the challenging task of converting natural-language questions into database queries (SQL). While Large Language Models (LLMs) excel at this, their cost and privacy implications often prevent companies from using them. MATS addresses these limitations by enabling SLMs, which are openly available and can be hosted in-house, to achieve comparable performance. This means businesses can deploy Text2SQL solutions without relying on external, expensive LLM services, the team revealed.
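To make the task concrete, here is a minimal sketch of the Text2SQL pipeline shape: a question and a schema go in, a SQL query comes out, and the query runs against the database. The translator below is a hand-written stub standing in for a model, not part of MATS itself.

```python
import sqlite3

def text2sql_stub(question: str, schema: str) -> str:
    # A real system would prompt an SLM with the question and the schema;
    # this stub returns a fixed query purely for demonstration.
    return "SELECT name FROM products WHERE sales > 100;"

# Tiny in-memory database to execute the generated query against.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (name TEXT, sales INTEGER)")
conn.executemany("INSERT INTO products VALUES (?, ?)",
                 [("widget", 250), ("gadget", 40), ("gizmo", 130)])

sql = text2sql_stub("Which products sold more than 100 units?",
                    "products(name, sales)")
rows = [r[0] for r in conn.execute(sql)]
print(rows)  # ['widget', 'gizmo']
```

The key property is that the generated query is ordinary SQL, so the surrounding database tooling needs no changes.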

Why This Matters to You

This new MATS framework has significant practical implications for your business and how you manage data. It offers an approach to the common dilemma of wanting AI capabilities without the hefty price tag or the data security risks. Imagine being able to ask complex questions about your sales figures or inventory levels in plain English, and have your database instantly provide the exact data you need. This could empower non-technical staff to access insights previously locked behind SQL expertise.

Benefits of MATS for Your Business:

  • Cost-Effectiveness: Reduces reliance on expensive large-scale LLMs.
  • Enhanced Privacy: Allows in-house hosting of SLMs, keeping your data secure.
  • Accessibility: Democratizes data access for non-technical users.
  • Competitive Performance: Achieves accuracy on par with much larger models.

For example, consider a small e-commerce business. Instead of hiring a dedicated data analyst for every query, your marketing team could simply type, “Show me all products with more than 100 sales in the last quarter.” MATS, powered by an SLM, would translate this into the correct SQL query and retrieve the results. How much faster could your team make data-driven decisions if they could just ask for the information they needed? The research shows that MATS maintains “competitive performance despite a limited LLM size.”
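As an illustration, here is one plausible SQL translation of that plain-English request, executed against a toy sales table. The schema, column names, and the hard-coded quarter boundary are assumptions for the sake of the example, not output from MATS.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (product TEXT, qty INTEGER, sold_on TEXT)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("widget", 150, "2025-11-02"),
    ("gadget", 60,  "2025-10-15"),
    ("gizmo",  120, "2025-06-01"),  # outside the assumed quarter
])

# "Show me all products with more than 100 sales in the last quarter",
# with "last quarter" pinned to a fixed start date for reproducibility.
sql = """
SELECT product, SUM(qty) AS total
FROM sales
WHERE sold_on >= '2025-10-01'
GROUP BY product
HAVING total > 100
"""
rows = conn.execute(sql).fetchall()
print(rows)  # [('widget', 150)]
```

A real deployment would resolve phrases like "last quarter" against the current date, but the translated query itself stays this simple.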

The Surprising Finding

Here’s the twist: traditionally, it was assumed that only massive, computationally intensive Large Language Models (LLMs) could handle the complexities of Text2SQL tasks effectively. However, the study finds that MATS, utilizing SLMs, yields accuracy that is “on-par with large-scale LLMs when using significantly fewer parameters.” This defies the conventional wisdom that bigger always means better in the world of AI language models. The framework achieves this by employing a multi-agent mechanism, where specialized auxiliary agents collaborate, reducing the individual workload on the SLM, as mentioned in the release. This intelligent division of labor, combined with reinforcement learning from execution feedback, allows smaller models to punch above their weight.
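The division of labor described above can be sketched as a small loop: one agent drafts a query, the database executes it, and a reviser repairs it using the execution error as feedback. The agent roles and names here are assumptions for illustration, not the actual MATS architecture, and both agents are hard-coded stubs rather than model calls.

```python
import sqlite3

def drafter(question: str) -> str:
    # Deliberately buggy first draft (wrong column name) so the
    # execution-feedback path below is exercised.
    return "SELECT title FROM products WHERE sales > 100"

def reviser(sql: str, error: str) -> str:
    # A real reviser would re-prompt the SLM with the error message;
    # here the single repair is hard-coded for demonstration.
    return sql.replace("title", "name") if "title" in error else sql

def run_with_feedback(conn, question, max_tries=3):
    sql = drafter(question)
    for _ in range(max_tries):
        try:
            return [r[0] for r in conn.execute(sql)]
        except sqlite3.OperationalError as e:
            sql = reviser(sql, str(e))  # feed the DB error back in
    raise RuntimeError("could not produce a valid query")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (name TEXT, sales INTEGER)")
conn.executemany("INSERT INTO products VALUES (?, ?)",
                 [("widget", 250), ("gadget", 40)])

result = run_with_feedback(conn, "Which products sold more than 100 units?")
print(result)  # ['widget']
```

The design point is that the database itself supplies the training signal: a failed execution is cheap to detect and gives the smaller model a concrete error to correct.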

What Happens Next

The introduction of MATS suggests a future where AI tools are more accessible and customizable. We could see initial deployments of the framework in specialized enterprise applications within the next 6-12 months. Businesses might begin integrating MATS-powered SLMs into their internal data analytics platforms. For example, a financial institution could use such a system to let portfolio managers query market data in natural language, directly from their secure internal systems. Our advice for readers is to start exploring how in-house AI solutions can address your specific data interaction needs. The technical report explains that MATS, deployed on a single-GPU server, is already demonstrating its capabilities. This indicates a lower barrier to entry for companies wanting to harness AI for database management.
