AI Glasses Which Allow Users To Type With Hands In Pockets To Be Developed By Next Year: Meta’s Yann LeCun

The AI revolution thus far has remained confined to computers and smartphones, but it might soon be coming to other computing devices.

Meta’s Chief AI Scientist Yann LeCun has said that AI-powered glasses with displays will be available next year. He said these glasses will have electromyography (EMG) interfaces that will let users control them in ways that aren’t yet mainstream.

“When are we going to get human-level AI? Do we need human-level AI?” he said at an event. “Yes, because we’re going to walk around with those smart glasses and we’re going to need to have sort of human-level AI systems so we can talk to them. Those glasses don’t have a display, but within the next year or so, we’re going to have glasses with displays and with EMG interfaces so we can like point and click with our fingers, type with our hands in our pocket, things like that,” he added.

“And pretty soon that’s going to be the new computing platform, and it’s enabled by AI. And then in a few years we’ll have like full augmented reality glasses, but right now that’s too expensive. So, in the future, all of us will be walking around with basically a virtual staff of super smart virtual people helping us in our daily lives. It’s like every one of us would be kind of a boss of a staff of people. All of us would be sort of a manager, if you want,” he continued.

LeCun has predicted a pretty aggressive timeline for AI-powered glasses with displays and EMG interfaces. Meta has been developing its smart glasses in partnership with Ray-Ban maker EssilorLuxottica. These glasses include features like live streaming, Meta AI integration, and improved camera and audio capabilities. In the U.S. and Canada, Meta AI helps users with tasks, creation, and connection, with multimodal AI enabling the glasses to understand what the wearer is seeing. The glasses have created some buzz, leading to new styles, colors, and features. The second generation of Meta’s Ray-Ban Smart Glasses includes AI assistants powered by the company’s Llama model that can respond to user voice queries.

But Meta now says that a year from now, these glasses will not only have displays but also features that will let users type while their hands are in their pockets, and point and click as well. As such, these glasses could become a third computing device after the computer and the smartphone, handling both input and output tasks; indeed, Meta CEO Mark Zuckerberg has said that glasses will be the main computing platform by the 2030s. And if these glasses do reach the public by next year, they could change the world of computing, much like mobile did in the 2010s.
