Multiverse Computing Brings Compact AI to Your Devices

New API and app aim to make powerful AI models run directly on your phone, reducing reliance on the cloud.

Multiverse Computing has launched an API portal and an app, CompactifAI, to make compressed AI models more accessible. This initiative allows smaller, efficient AI models to run on personal devices, offering enhanced privacy and potentially lower operational costs for businesses. It marks a significant step towards 'AI on the edge' by reducing dependence on external cloud infrastructure.

By Sarah Kline

March 20, 2026

4 min read

Key Facts

  • Multiverse Computing has launched an app (CompactifAI) and an API portal for its compressed AI models.
  • The company has compressed models from major AI labs including OpenAI, Meta, DeepSeek, and Mistral AI.
  • CompactifAI allows AI to run directly on user devices, enhancing privacy and reducing cloud reliance.
  • The app automatically switches to cloud-based models if a device lacks sufficient RAM and storage.
  • Businesses are the primary target for this technology due to potential for lower compute costs.

Why You Care

Ever worry about your personal data floating around in the cloud when you use AI? Or perhaps you’re frustrated by slow AI responses or hefty subscription fees. What if AI could live directly on your device, without needing a constant internet connection or sending your information elsewhere? This is precisely what Multiverse Computing is working to make a reality, and it could change how you interact with artificial intelligence every day.

What Actually Happened

Multiverse Computing, a Spanish startup, has recently made a significant move to bring its compressed AI models to a wider audience. The company has launched both an app, called CompactifAI, and an API portal, according to the announcement. The API portal acts as a gateway, letting developers access and build applications on top of these smaller, more efficient models. The team revealed that it has successfully compressed models from major AI labs, including OpenAI, Meta, DeepSeek, and Mistral AI, shrinking them enough to run on hardware with far less memory and compute.

For end-users, the CompactifAI app offers a taste of “AI on the edge.” This means the AI processing happens directly on your device. Your data doesn’t leave your phone, which significantly boosts privacy. This approach also eliminates the need for a constant internet connection, making AI accessible even offline.

Why This Matters to You

Imagine having an AI assistant that understands your queries instantly, without sending your voice or text to a remote server. This is the promise of Multiverse Computing’s approach. By running AI models directly on your device, you gain more control over your data and experience faster interactions. Your privacy is enhanced because your information stays local.

However, there’s a practical consideration for users. Your mobile device needs sufficient RAM and storage to run these models effectively. If your device, like many older iPhones, doesn’t meet these requirements, the app will automatically switch to cloud-based models. This routing is managed by a system Multiverse has named Ash Nazg, as mentioned in the release. While convenient, this cloud fallback means you lose the privacy benefits of local processing.
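Multiverse has not published how its Ash Nazg router makes this decision. A minimal sketch of such a capability check, with all names, thresholds, and data structures hypothetical, might look like:

```python
from dataclasses import dataclass

@dataclass
class DeviceSpecs:
    ram_gb: float
    free_storage_gb: float

@dataclass
class ModelRequirements:
    min_ram_gb: float
    min_storage_gb: float

def choose_backend(device: DeviceSpecs, model: ModelRequirements) -> str:
    """Route to on-device inference when the hardware can hold the
    model; otherwise fall back to a cloud-hosted version."""
    if (device.ram_gb >= model.min_ram_gb
            and device.free_storage_gb >= model.min_storage_gb):
        return "on-device"   # private: data never leaves the phone
    return "cloud"           # fallback: convenient, but data travels

# Example: a compressed model needing 4 GB of RAM and 3 GB of storage
model = ModelRequirements(min_ram_gb=4.0, min_storage_gb=3.0)
print(choose_backend(DeviceSpecs(ram_gb=8.0, free_storage_gb=64.0), model))  # on-device
print(choose_backend(DeviceSpecs(ram_gb=3.0, free_storage_gb=64.0), model))  # cloud
```

The key design point is that the fallback is automatic and invisible to the user, which is exactly why the privacy trade-off described above is easy to miss.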

Think of it as having a personal AI chef in your kitchen versus ordering takeout. The personal chef (on-device AI) knows your preferences intimately and keeps everything private. Ordering takeout (cloud-based AI) is convenient, but your data travels. What kind of AI experience do you prefer for your daily tasks?

As CEO Enrique Lizaso stated, “The CompactifAI API portal [now] gives developers direct access to compressed models with the transparency and control needed to run them in production.” This direct access empowers developers to build new applications with these efficient models. Businesses are particularly interested in this system due to the potential for lower compute costs.
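The announcement does not document the portal's request format. Assuming it follows the OpenAI-compatible chat schema that many model-hosting APIs use (the endpoint URL and model name below are placeholders, not real CompactifAI identifiers), a developer's request body could be assembled like this:

```python
import json

# Hypothetical endpoint and model name -- the real CompactifAI portal
# defines its own; consult its documentation before use.
API_URL = "https://api.example.com/v1/chat/completions"

def build_request(model: str, prompt: str, max_tokens: int = 256) -> str:
    """Serialize a chat-completion request in the OpenAI-compatible
    format commonly accepted by model-hosting APIs."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }
    return json.dumps(payload)

body = build_request("compressed-llama-example",
                     "Summarize edge AI in one line.")
print(body)
```

From a developer's perspective, the draw is that the same request shape can target a much cheaper compressed model, which is where the compute savings for businesses come from.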

Here’s a look at the benefits of compressed AI models:

| Feature | On-Device AI (Compressed Models) | Cloud-Based AI (Traditional LLMs) |
| --- | --- | --- |
| Data Privacy | High (data stays local) | Lower (data sent to cloud) |
| Internet Need | Minimal/None | Constant |
| Response Speed | Faster | Dependent on network latency |
| Compute Costs | Potentially lower for businesses | Can be high |
| Device Storage | Requires sufficient RAM/storage | Minimal device requirements |

The Surprising Finding

Perhaps the most unexpected revelation from this launch is the sheer demand for AI efficiency, which is driving companies to explore alternatives to massive cloud-based models. While the CompactifAI app has seen limited public adoption—Sensor Tower data shows only 1,000 downloads for the Android version as of March 19, 2026—the real focus is clearly on businesses. This challenges the common assumption that mass consumer adoption is the primary metric for AI success. Instead, the company reports that its main target audience is enterprises seeking to reduce operational expenses and enhance data security. This indicates a shift in priorities within the AI landscape, moving beyond raw computational power to practical, cost-effective deployment.

What Happens Next

Multiverse Computing’s focus on businesses suggests we will see more enterprise-level applications emerge in the coming quarters. Developers can now experiment with the API portal, potentially leading to new products by late 2026 or early 2027. For example, imagine a manufacturing plant using on-device AI to monitor equipment in real-time, without sending sensitive operational data to external servers. This could significantly improve efficiency and security.

The industry implications are substantial. As the company reports, lower compute costs are a major draw for enterprises considering smaller models, which could lead to broader adoption of edge AI solutions across many sectors. Our advice for readers is to keep an eye on your device’s specifications: if you’re considering a new phone or tablet, prioritize models with ample RAM and storage to take advantage of these emerging on-device AI capabilities. This technology could fundamentally alter how businesses manage their AI infrastructure and how you experience smart devices.
