NVIDIA Unveils Desktop AI Agents with DGX Spark & Reachy Mini

New open models and hardware empower users to build personalized, real-world AI companions.

NVIDIA announced new open models and the Reachy Mini robot at CES 2026, enabling users to create personal AI agents. This development allows for private data processing and direct collaboration with a desktop AI 'buddy.'

By Katie Rowan

January 6, 2026

4 min read


Key Facts

  • NVIDIA unveiled new open models for AI agents at CES 2026.
  • The models include NVIDIA Nemotron reasoning LLMs and NVIDIA Isaac GR00T N1.6 VLA.
  • NVIDIA Cosmos world foundation models are also part of the new offerings.
  • Users can create physical AI agents using NVIDIA Reachy Mini and DGX Spark.
  • The goal is to enable private data processing and collaboration with a desktop AI 'buddy'.

Why You Care

Ever wished you had a personal AI assistant that wasn’t just a voice in your phone but a physical presence? One that could process your data privately and work alongside you? NVIDIA just made a significant move toward making that a reality. At CES 2026, the company showcased how you can bring your own AI agent to life. This isn’t just about AI; it’s about creating a tangible, interactive AI companion right on your desk. Your digital future is about to get a lot more physical.

What Actually Happened

At CES 2026, NVIDIA unveiled a collection of new open models designed to power the next generation of AI agents, which can operate both online and in the real world, according to the announcement. The building blocks for AI builders include the recently released NVIDIA Nemotron reasoning LLMs (large language models), the new NVIDIA Isaac GR00T N1.6 open reasoning VLA (vision-language-action) model, and the NVIDIA Cosmos world foundation models, as detailed in the blog post. Together, these tools give developers the components they need to create their own AI agents. Jensen Huang, NVIDIA’s CEO, demonstrated how users can build a personal “office R2D2” with NVIDIA Reachy Mini hardware, allowing direct interaction and collaboration with a physical AI. The process runs on the processing power of DGX Spark, the company reports.

Why This Matters to You

Imagine having an AI buddy that sits on your desk, ready to assist with tasks and process your information. This is no longer science fiction. Being able to build such an agent means a new level of personal AI interaction: you can create an AI that understands your specific needs and handles your data with a focus on privacy. This is a significant step beyond cloud-based AI, offering a tangible, local experience. For example, think of a small desktop robot that could help organize physical documents based on your voice commands, or fetch specific items from your desk. This system aims to make AI a more personal, interactive part of your daily physical workspace. What specific tasks would you entrust to your own desktop AI companion?

Key Components for Desktop AI Agents:

  • NVIDIA Nemotron reasoning LLMs
  • NVIDIA Isaac GR00T N1.6 open reasoning VLA
  • NVIDIA Cosmos world foundation models
  • NVIDIA Reachy Mini hardware
  • DGX Spark processing power

This setup allows for a unique blend of AI and physical interaction. “You can talk to and collaborate with” your AI, as mentioned in the release. This moves AI from abstract software to a concrete, helpful entity. Your data can be processed locally, enhancing privacy and control. This could redefine how you interact with artificial intelligence.
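To make the local-processing idea concrete, here is a minimal, purely illustrative sketch of what a desktop agent loop could look like: a user request goes to an on-device model, which picks an action for the robot body, and no data leaves the machine. All names here (`ReachyMiniStub`, `local_llm`, `agent_step`) are hypothetical stand-ins, not NVIDIA’s or the Reachy Mini’s actual APIs.

```python
# Illustrative sketch only: the classes and functions below are
# hypothetical stand-ins, not NVIDIA's or Pollen Robotics' real SDKs.

from dataclasses import dataclass, field


@dataclass
class ReachyMiniStub:
    """Stand-in for a desktop robot body; records actions instead of moving."""
    actions: list = field(default_factory=list)

    def perform(self, action: str) -> None:
        self.actions.append(action)


def local_llm(prompt: str) -> str:
    """Stand-in for an on-device reasoning model (e.g. a Nemotron LLM).

    In a real setup, inference would run locally on DGX Spark,
    so the prompt text never leaves the machine.
    """
    if "organize" in prompt.lower():
        return "sort_documents"
    return "idle"


def agent_step(robot: ReachyMiniStub, user_request: str) -> str:
    # Privacy comes from the architecture: the "model" call is local,
    # and the chosen action is executed on hardware sitting on your desk.
    action = local_llm(user_request)
    robot.perform(action)
    return action
```

For example, `agent_step(robot, "Please organize my receipts")` would return `"sort_documents"` and record that action on the stub robot. The real value of this pattern is that the reasoning step and the physical action both happen on local hardware.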

The Surprising Finding

What truly stands out is the emphasis on bringing AI agents into the physical world, right onto your desk. Many AI advancements focus on software or cloud services. However, NVIDIA’s approach with Reachy Mini is different. It provides a tangible, interactive robot. This allows for a personal “office R2D2” that you can talk to and collaborate with, according to the announcement. This challenges the common assumption that personal AI will remain purely digital. Instead, it suggests a future where AI has a physical presence in our daily lives. The integration of DGX Spark processing power with a compact robot like Reachy Mini makes this possible. It creates a local, private AI experience. This revelation means AI isn’t just about screens anymore. It’s also about physical interaction.

What Happens Next

NVIDIA’s announcement at CES 2026 suggests a near-term future for personal AI agents. Developers and enthusiasts can expect to replicate this experience at home soon: the blog post provides a step-by-step guide, which indicates the necessary tools and instructions are already available. Within the next few quarters, we could see wider adoption of these desktop AI agents. Imagine a content creator using a Reachy Mini to physically sort through props for a video shoot, guided by an AI, or a podcaster having their AI companion manage physical sound equipment. The industry implications are broad: this could spur innovation in robotics and personal computing, opening a new frontier for human-AI collaboration. The system aims to empower “AI Builders to build their own agents,” the team revealed.
