AI Glasses Will Be Able To Create New Apps And Show Them In Your Vision In 5 Years: Mark Zuckerberg

Meta has just released a new set of AI glasses for $799 with a built-in visual display, and Zuckerberg believes that display could become dramatically more capable in the coming years.

Meta CEO Mark Zuckerberg has painted a remarkably ambitious vision for the future of AI-powered smart glasses, predicting that within just a few years, these devices will be capable of generating entire app interfaces on demand directly in users’ vision. His comments, made during the recent unveiling of the new Meta Ray-Ban Display glasses, reveal a timeline more aggressive than many industry observers might expect: Zuckerberg suggested transformative capabilities could arrive in as little as two to three years.

The Meta founder’s vision centers on what he describes as an “always on experience” that represents the culmination of AI integration with wearable technology. “I think that sort of culminates in the glasses vision, where I think what you’re going to get is eventually this always on experience that—I mean, you can control when it’s on and off—but it can be always on if you want, where you can just let it see what you see, hear what you hear, can go off and think about the context of your conversations and come back with more context or knowledge that it thinks you should have,” Zuckerberg explained.

Perhaps most remarkably, Zuckerberg envisions a future where traditional app interfaces become obsolete in favor of AI-generated user experiences. “When you need an app, it can just generate the UI from scratch for you in your vision,” he stated, describing a paradigm shift that would fundamentally change how we interact with digital services and applications.

When pressed on the timeline for these capabilities, Zuckerberg offered an unexpectedly optimistic projection. “I’m not sure how long it’s going to take to get to that. I don’t think this is five years. I think it’s going to be quicker. So two, three. It’s hard to exactly know, but I don’t know. I would guess every time I think of what a milestone would be in AI, they all seem to get achieved sooner than we think. So I think my optimism about AI has generally only increased as time has gone on, in terms of both the timeline for achieving it and how awesome it’s going to be.”

The implications of Zuckerberg’s vision extend far beyond incremental improvements to current smart glasses technology. If realized, this would represent a fundamental shift in human-computer interaction, moving from discrete app-based experiences to a seamless, context-aware AI companion that can understand and anticipate user needs in real time.

The current Meta Ray-Ban Display already demonstrates early steps toward this vision, featuring a screen in the right lens that can show text messages, video calls, turn-by-turn directions, and visual results from Meta’s AI service. However, Zuckerberg’s timeline suggests we’re on the cusp of capabilities that would make today’s technology seem primitive by comparison. This aligns with broader industry trends: competitors like Apple, Google, and Microsoft are all investing heavily in AR and AI integration, suggesting that the race to achieve true ambient computing through smart glasses has become a defining battleground for the next generation of consumer technology.