Why You Care
Ever worn a new gadget and felt a bit… exposed? Or worried about how others perceive your tech? What if the key to successful AI hardware isn’t just features, but whether it makes people want to punch you in the face? This isn’t just a funny thought; it’s a serious investment criterion from a seasoned tech investor. Your next AI device might be judged by this very human metric.
What Actually Happened
Kevin Rose, a general partner at True Ventures, has a blunt rule for evaluating AI hardware investments. He puts it plainly: “If you feel like you should punch someone in the face for wearing it, you probably shouldn’t invest in it.” This candid assessment comes from his experience watching AI hardware startups repeat past mistakes. Rose, an early investor in companies like Peloton and Fitbit, has largely avoided the current AI hardware gold rush. While other venture capitalists fund smart glasses and AI pendants, Rose takes a different approach: he focuses on emotional resonance and social acceptability, not just technical capability.
Why This Matters to You
This perspective is vital for anyone considering new AI wearables. It’s not just about what the device does for you, but how it feels to wear and how others react to it. Imagine using an AI device that constantly records conversations. How would your friends or family feel? This concern about privacy and social norms is central to Rose’s thinking. He believes many current AI wearables break these unwritten social rules.
Key Factors for AI Wearable Success:
- Emotional Resonance: How does the device make you feel?
- Social Acceptability: How do others feel about you wearing it?
- Privacy Respect: Does it avoid always-on listening?
- Real-world Utility: Is its use case more than just a novelty?
For example, Rose described his experience with a failed AI pendant. He tried to use it to win an argument with his wife by pulling up its conversation logs, and quickly realized that was a bad idea. “You do not want to win a battle by going back and looking at the logs of your AI pin. That doesn’t fly,” he recalled. The anecdote shows how personal relationships can clash with intrusive tech. How might your own daily interactions change if you wore an always-on AI device?
The Surprising Finding
The surprising twist here is that technical prowess alone isn’t enough for AI hardware to succeed. Rose, who sat on the board of Oura, a company commanding 80% of the smart ring market, has seen this firsthand. In his view, the difference between successful and failed wearables isn’t just features. It’s how the device makes you feel, and crucially, how it makes others feel around you. He explains, “As an investor, you kind of have to not only say, okay, cool tech, sure, but emotionally, how does it make me feel? And how does it make others feel around me?” This challenges the common assumption that more features or more AI automatically leads to better products. Instead, human-centric design and social integration are paramount.
What Happens Next
Rose worries that we are currently in an “early days of social media” moment with AI. Decisions made now, which seem harmless, could have negative long-term consequences, and he expects that in a decade or two we might look back and question the current approach of “slapping AI on everything.” Photo apps that remove people or objects from backgrounds are a case in point: convenient now, but Rose cites a friend who erased a gate from a picture of his yard, creating a false record that could confuse his children later. For the industry, this points toward more thoughtful, less intrusive AI hardware design, with developers prioritizing user comfort and social integration over raw AI capability. Your next AI wearable purchase should weigh these ethical and social factors, not just the spec sheet.
