Mirai Boosts On-Device AI: Faster Models for Your Phone

Reface and Prisma founders launch Mirai, optimizing AI inference directly on consumer hardware.

Mirai, a new London-based startup, is tackling the challenge of running complex AI models directly on your smartphone. Founded by the creators of Reface and Prisma, Mirai aims to make on-device AI faster and more efficient, reducing reliance on costly cloud infrastructure.

By Katie Rowan

February 20, 2026

3 min read

Key Facts

  • Mirai was founded by Dima Shevts (Reface co-founder) and Alexey Moiseenkov (Prisma co-founder).
  • Mirai secured a $10 million seed round led by Uncork Capital.
  • The company's engine, built in Rust, can increase AI model generation speed by up to 37%.
  • Mirai's current focus is on optimizing text and voice AI models for on-device inference.
  • The company plans to expand support to vision models and Android devices in the future.

Why You Care

Ever wonder why your phone struggles with AI features, or why some apps require an internet connection for seemingly simple tasks? What if your device could handle AI locally, without sending your data to the cloud? This is precisely what Mirai, a new London-based startup, aims to achieve. They are working to bring AI directly to your smartphone, making it faster and more private.

What Actually Happened

Mirai, a 14-person technical team, recently emerged to address an essential gap in the AI landscape. While much of the AI conversation centers on massive cloud data centers, Mirai focuses on improving on-device model inference – that is, running AI models directly on your phone or tablet. The company, backed by a $10 million seed round led by Uncork Capital, was founded last year by Dima Shevts and Alexey Moiseenkov, as mentioned in the release. Both co-founders have a strong background in building consumer applications: Shevts co-founded the popular face-swapping app Reface, and Moiseenkov was the CEO and co-founder of Prisma, a viral AI filter app, the company reports. Their combined experience in consumer apps drives Mirai’s mission to enable complex AI tasks on mobile devices.

Why This Matters to You

Mirai is developing a framework to help AI models perform better on your devices. They have built an inference engine specifically for Apple Silicon, which significantly optimizes on-device throughput (the amount of data processed over time). Imagine using an AI writing assistant that summarizes documents instantly on your iPad, without any lag or privacy concerns. The company reports that its upcoming SDK (Software Development Kit) will allow developers to integrate this runtime with just a few lines of code. “One of the visions why we started the company was that we wanted to give developers, like this Stripe-like, eight lines of code [integration] experience,” Shevts stated, emphasizing ease of use for developers.
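Mirai’s SDK has not been publicly released, so the following is purely a hypothetical sketch of what a “few lines of code” integration could look like. The `MiraiKit` module, the `MiraiEngine` type, and every method name below are invented for illustration and do not reflect the actual API:

```
import MiraiKit  // hypothetical module name

// Load a model file bundled with the app (hypothetical API)
let modelURL = Bundle.main.url(forResource: "summarizer",
                               withExtension: "mirai")!
let engine = try MiraiEngine(modelURL: modelURL)

// Run inference entirely on-device: no network call,
// and the document text never leaves the phone
let summary = try await engine.generate(
    prompt: "Summarize: \(documentText)")
```

The point of a sketch like this is the shape of the developer experience the founders describe: load a model, call it, done — comparable to how Stripe reduced payments integration to a handful of lines.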

Mirai’s Impact on Your Apps

  • Speed: Cloud-based AI can experience latency due to the network; Mirai’s on-device AI offers near-instant processing.
  • Privacy: Cloud-based AI often sends data to external servers; with on-device AI, data stays on your device.
  • Cost: Cloud-based AI relies on cloud compute, which can be costly; on-device AI reduces cloud costs for developers.
  • Offline Use: Cloud-based AI has limited functionality without internet; on-device AI works fully offline.

This means your future apps could be faster, more private, and work seamlessly offline. Think of it as having an AI supercomputer in your pocket. What kind of AI features would you love to see run perfectly on your device, even without an internet connection?

The Surprising Finding

Here’s an interesting twist: Mirai’s engine, built in Rust, can bump up a model’s generation speed by up to 37%, the company claims. This is quite surprising given the common assumption that on-device AI is inherently slower and less capable than cloud-based solutions. The team revealed that while tuning models for different platforms, they intentionally avoid altering model weights (the parameters that define how an AI model works). This ensures there is no loss in the quality of the AI’s output, a crucial detail often overlooked when optimizing for speed. This approach challenges the notion that speed improvements must come at the expense of accuracy or quality.

What Happens Next

Mirai’s current focus is on improving text and voice modalities on its system, as mentioned in the release. However, the team plans to support vision capabilities in the future. They are actively working with frontier model providers (companies creating AI models) to optimize these models for edge use (running on devices). The company is also in talks with various chipmakers, indicating future hardware integrations. Later, Mirai intends to bring its engine to Android devices, expanding its reach significantly. The team also aims to release on-device benchmarks, which will allow model makers to test and compare on-device performance, fostering innovation across the industry. For you, this means a future where your devices are smarter, more responsive, and more capable than ever before.
