Manticore: Blending AI Models for Better Language AI

New framework automates hybrid AI architecture design, making powerful language models easier to build and customize.

A new framework called Manticore automates the creation of hybrid AI architectures, combining existing pretrained models like GPT and Mamba. This innovation simplifies the complex process of designing advanced language models, offering improved performance and customization for various natural language tasks. It promises to make cutting-edge AI more accessible and efficient for developers.

By Mark Ellison

October 1, 2025

4 min read

Key Facts

  • Manticore is a new framework for automating hybrid AI architecture design.
  • It reuses pretrained models like GPT and Mamba to create new hybrids.
  • The framework uses simple projectors to translate features between different pretrained blocks.
  • Manticore hybrids match manually designed hybrids and outperform individual models on various NLP tasks.
  • The research was submitted on June 2, 2024, with a revised version on September 30, 2025.

Why You Care

Ever wonder why building AI models feels so complex and time-consuming? Imagine you could mix and match the best parts of different AI brains to create something even smarter. That’s precisely what a new framework called Manticore aims to do for large language models (LLMs). This innovation could change how you interact with AI, making it more tailored and efficient.

What Actually Happened

A team of researchers, including Nicholas Roberts and Samuel Guo, has introduced Manticore, a novel framework designed to automate the creation of hybrid AI architectures, as detailed in their paper. Currently, designing these complex hybrid models requires extensive manual effort and expert knowledge. What’s more, new hybrid models typically need to be trained from scratch, which is a resource-intensive process. Manticore tackles these challenges by allowing developers to reuse existing pretrained models, like those from the GPT series and Mamba, to construct new hybrid architectures. The framework incorporates projectors, simple components that translate features between different pretrained blocks so they can be integrated, the research shows.
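The projector idea can be pictured with a minimal sketch. This is not the actual Manticore implementation: the dimensions, the stand-in `gpt_block`/`mamba_block` functions (random linear maps here, in place of frozen pretrained blocks), and the fixed mixing weight are all illustrative assumptions. The point is only that a simple linear projector can map each block's features into a shared dimension so their outputs can be combined:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: a "GPT-like" block with hidden size 768 and a
# "Mamba-like" block with hidden size 512 (illustrative values only).
D_GPT, D_MAMBA, D_HYBRID = 768, 512, 640

def gpt_block(x):
    """Stand-in for a frozen pretrained transformer block (random map here)."""
    W = rng.standard_normal((x.shape[-1], D_GPT)) * 0.02
    return x @ W

def mamba_block(x):
    """Stand-in for a frozen pretrained state-space block (random map here)."""
    W = rng.standard_normal((x.shape[-1], D_MAMBA)) * 0.02
    return x @ W

# Projectors: simple linear maps that translate each block's features
# into a shared hybrid dimension so the two branches can be mixed.
P_gpt = rng.standard_normal((D_GPT, D_HYBRID)) * 0.02
P_mamba = rng.standard_normal((D_MAMBA, D_HYBRID)) * 0.02

def hybrid_block(x, alpha=0.5):
    """Blend the two projected branch outputs with mixing weight alpha."""
    return alpha * (gpt_block(x) @ P_gpt) + (1 - alpha) * (mamba_block(x) @ P_mamba)

x = rng.standard_normal((4, D_HYBRID))  # a batch of 4 feature vectors
out = hybrid_block(x)
print(out.shape)  # (4, 640)
```

In a real system the projectors would be small trained layers and the pretrained blocks would stay frozen; the sketch only shows why a shared output dimension makes the blocks composable.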

Why This Matters to You

This development means you no longer need to start from zero when building a specialized AI. Think of it as building with LEGOs instead of carving each piece from wood. For example, if your company needs an AI that excels at both creative writing (like GPT) and efficient long-context understanding (like Mamba), Manticore could help you combine these strengths without immense training costs. This approach enables faster development cycles and more targeted AI solutions for your specific needs.

Key Benefits of Manticore:

  • Faster Development: Reuses existing pretrained models, reducing training time.
  • Enhanced Performance: Combines strengths of different AI architectures.
  • Greater Flexibility: Allows customization of AI capabilities for specific tasks.
  • Reduced Manual Effort: Automates complex design processes.

How might this framework change the way you approach your next AI project? The team revealed that Manticore allows for “LM selection without training multiple models, the construction of pretrained hybrids from existing pretrained models, and the ability to program pretrained hybrids to have certain capabilities.” This means you can pick and choose the best components, making your AI more capable and efficient. Your ability to create specialized AI just got a significant boost.
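The “LM selection without training multiple models” capability can be pictured as learning mixture weights over candidate blocks and reading off the dominant branch. The sketch below is a loose illustration, not Manticore’s actual mechanism: the logits, branch names, and softmax-based selection are all assumed for the example.

```python
import numpy as np

# Hypothetical learned architecture logits, one per candidate pretrained block.
logits = np.array([1.2, -0.3])  # e.g. [gpt_branch, mamba_branch]

# Softmax turns the logits into mixture weights that sum to 1.
weights = np.exp(logits) / np.exp(logits).sum()

# "Selection" then amounts to reading off the branch with the largest weight,
# rather than training and comparing separate full models.
candidates = ["gpt_branch", "mamba_branch"]
selected = candidates[int(np.argmax(weights))]
print(weights.round(3), "->", selected)  # [0.818 0.182] -> gpt_branch
```

The appeal of this style of selection is that a single training run over the mixed architecture stands in for training each candidate model separately.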

The Surprising Finding

What’s truly remarkable about Manticore is its ability to create high-performing hybrid models by combining already pretrained components, rather than starting fresh. This challenges the conventional wisdom that complex hybrid AI architectures always demand a full, expensive retraining cycle. The study finds that “Manticore hybrids match existing manually designed hybrids, achieve strong performance on Long Range Arena, and improve on pretrained transformers and state space models on various natural language tasks.” This outcome is surprising because it suggests that simply blending existing pretrained components can yield results comparable to, or even better than, bespoke, hand-crafted designs. It overturns the assumption that deep, ground-up architectural design is always necessary for superior performance.

What Happens Next

Looking ahead, Manticore could significantly accelerate AI development, and initial applications might emerge within the next 12-18 months, perhaps by late 2025 or early 2026. For example, imagine a content creation system using a Manticore-powered AI to generate highly nuanced articles by combining a text generation model with a factual verification model, producing more accurate and engaging content. For developers, the actionable advice is to explore how existing pretrained models can be modularly combined for your projects. The industry implications are vast, potentially democratizing access to AI capabilities by lowering the barrier to entry for custom model creation. This framework could usher in a new era of ‘component-based’ AI development, making advanced AI more accessible to everyone.
