Why You Care
Ever wonder why some AI features on your phone feel slow or drain your battery quickly? What if AI could run directly on your device, instantly and efficiently? A new player, Mirai, is stepping into this arena. The company aims to make on-device AI a practical reality for everyone. This could mean faster, more private, and more affordable AI experiences for you.
What Actually Happened
Mirai, a London-based startup, has emerged with a mission to improve on-device AI model inference. The company was founded by Dima Shevts and Alexey Moiseenkov, both known for building consumer apps. Shevts co-founded the popular face-swapping app Reface, as mentioned in the release. Moiseenkov was the CEO and co-founder of Prisma, a viral AI filter app from the last decade, the company reports. Their combined experience in consumer tech is now focused on solving a key challenge in AI development. Mirai has secured a $10 million seed round led by Uncork Capital, according to the announcement. The 14-person technical team is currently developing a framework to help AI models perform better on devices. Its initial focus is an inference engine built specifically for Apple Silicon, as detailed in the blog post.
Why This Matters to You
Mirai’s work could significantly impact your daily tech experience. Imagine your favorite apps running AI tasks without relying on distant cloud servers. This means greater privacy and potentially lower costs for developers, which could translate to more affordable or feature-rich apps for you. The company plans to offer an SDK (Software Development Kit) that allows developers to integrate its runtime with just a few lines of code, the team revealed.
“One of the visions why we started the company was that we wanted to give developers, like this Stripe-like, eight lines of code [integration] experience,” Shevts said. “You basically go to our system, integrate the key, and start working with summarization, classification, or whatever your use case is.”
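To picture what a “Stripe-like, eight lines of code” integration might feel like, here is a minimal sketch in Python. Every name in it (`OnDeviceEngine`, `summarize`, the key) is invented for illustration; Mirai has not published its actual API, and the stub below just truncates text rather than running a real local model.

```python
# Hypothetical sketch of a few-lines-of-code on-device SDK integration.
# All class and method names are invented; this is NOT Mirai's real API.

class OnDeviceEngine:
    """Stand-in for an on-device inference runtime."""

    def __init__(self, api_key: str):
        self.api_key = api_key  # in the real flow, "integrate the key"

    def summarize(self, text: str, max_words: int = 12) -> str:
        # Placeholder logic: a real engine would run a local model;
        # here we simply truncate to show the call shape.
        words = text.split()
        summary = " ".join(words[:max_words])
        return summary + ("…" if len(words) > max_words else "")

# The promised developer experience: create the engine, call a use case.
engine = OnDeviceEngine(api_key="YOUR_KEY")
print(engine.summarize(
    "On-device inference keeps data local and avoids network latency."
))
```

The point of the sketch is the surface area, not the internals: one constructor taking a key, then task-level calls such as summarization or classification, with all computation staying on the device.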
Think of it as bringing the AI processing power closer to you. For example, your phone’s camera could instantly analyze and enhance photos using complex AI, all without sending your images to the cloud. This could lead to much faster results and better data security. How might on-device AI change the apps you use every day?
| Feature | Current Cloud AI | Mirai’s On-Device AI |
| --- | --- | --- |
| Processing Speed | Can be delayed by network latency | Near-instant, as processing happens locally |
| Data Privacy | Data often sent to remote servers | Data stays on your device, enhancing privacy |
| Cost for Devs | High operational costs for cloud infrastructure | Potentially lower costs due to reduced cloud usage |
| Offline Access | Limited functionality without internet | Full AI functionality even without connectivity |
The Surprising Finding
While much of the AI conversation centers on massive cloud data centers, Mirai’s founders identified a crucial missing piece. The common assumption is that AI must live in the cloud. However, Shevts noted that “everybody speaks about cloud, about servers, about AGI coming. But the missing piece is on-device [AI] for consumer hardware.” This is a twist on the prevailing narrative. The startup built its engine in Rust, a programming language known for its performance. This choice can bump up a model’s generation speed by up to 37%, they claim. This means significant performance gains are possible even on consumer hardware. It challenges the idea that on-device AI is inherently less capable or slower than its cloud counterparts.
What Happens Next
Mirai is currently focused on improving text and voice modalities on its platform, the company reports, with plans to support vision capabilities in the future. The team is actively engaging with frontier model providers to fine-tune their models for edge use, and is in discussions with various chipmakers. Later, Mirai intends to expand its engine to support Android devices, broadening its reach significantly. For example, imagine a voice assistant on your phone that understands complex commands instantly, even offline, thanks to this kind of engine. Mirai also plans to release on-device benchmarks, which will let model makers test performance directly on consumer hardware and provide valuable data for the entire industry. Developers should keep an eye on Mirai’s upcoming SDK for potential integration into their apps, possibly in the next 12-18 months. This could offer a new pathway to more efficient and private AI features.
