Leaks and announcements this week show why AI glasses will be the star of next year’s holiday gift-giving.
Somebody screwed up and released a promotional video of Meta’s upcoming AI smart glasses on YouTube. (It was supposed to wait for tomorrow’s Connect conference.) Meta hastily deleted it, but not before people downloaded it.
The glasses look good. Meta will sell four AI glasses lines:
An $800 pair of Ray-Ban smart glasses with a heads-up display (not spatial) and a new wrist controller. The controller uses differential electromyography to detect muscle movement and translate it into digital commands for the glasses.
Oakley Meta Sphaera glasses designed for athletes, featuring a centrally placed cyclops camera and no display.
An updated version of Ray-Ban’s AI glasses.
The existing Oakley Meta HSTN glasses.
So: One line with a heads-up display and three lines with no display. All have cameras.
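The wrist controller's trick is reading electrical activity in the forearm muscles rather than tracking hand position. As a rough illustration of the idea, here is a minimal sketch of turning multi-channel surface-EMG readings into discrete commands; the channel layout, thresholds, and gesture names are invented for demonstration, and a real device like Meta's uses trained machine-learning models rather than fixed thresholds.

```python
import math

def rms(window):
    """Root-mean-square amplitude of one window of EMG samples."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def classify_gesture(channel_windows, threshold=0.5):
    """Map per-channel muscle activity to a hypothetical command.

    channel_windows: one list of samples per electrode channel.
    A 'differential' reading compares adjacent channels, which
    cancels noise common to both electrodes.
    """
    amplitudes = [rms(w) for w in channel_windows]
    # Differential pairs: amplitude differences between adjacent channels.
    diffs = [abs(a - b) for a, b in zip(amplitudes, amplitudes[1:])]
    if diffs and max(diffs) > threshold:
        return "pinch"  # strongly localized activation -> select/click
    if max(amplitudes) > threshold:
        return "fist"   # broad activation across all channels
    return "rest"
```

The design point this sketch captures: because the signal comes from muscles, commands can be detected from tiny, nearly invisible finger movements, which is what makes a wristband socially acceptable in a way waving at glasses is not.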
Oakley Meta Sphaera glasses represent a category of product that spells the beginning of the end of GoPro-like action cameras. (Speaking of spelling, the sub-brand is awkward.) Even before their camera quality matches a GoPro's, the glasses' ease of use and appeal will win over skiers, skydivers, mountain bikers, and skaters.
Snap is also innovating. The company yesterday announced Snap OS 2.0, the operating system for its upcoming sixth-generation consumer Spectacles AR glasses, expected next year.
The new OS has a better web browser, according to the company. It lets users take and see pictures, enjoy AR music experiences, translate text, and see contextual information about what the camera sees.
In general, Snap Spectacles will be more advanced but less socially acceptable than Meta’s glasses. They’re closer to Apple Vision Pro than Ray-Ban Meta glasses, but can be worn outside the house.
Amazon’s AI glasses should ship in the middle of next year.
AI glasses from Apple, Google, Samsung, and Xiaomi are expected to go on sale by late next year.
We’ll see offerings from big AI companies and small startups.
What Meta does matters. The company owns 70% of the global smart glasses market, thanks to Ray-Ban Meta glasses.
And what Mark Zuckerberg says about AI glasses matters, too, because, unlike his takes on the "social graph," the metaverse, and a future of social networking where people interact with bots instead of other people, he's been completely right about every AI glasses prediction he's pontificated upon.
Here’s Zuck’s most recent proclamation: 1) AI glasses are the ideal form factor for interacting with AI because they place the cameras, speakers, microphones, and displays in the right place; 2) AI glasses will become the main way people interact with AI; 3) AI glasses will become the world’s primary computing platform; and 4) people not wearing AI glasses in the future will be at a “cognitive disadvantage” compared to people enhanced with these devices.
Personally, and it pains me to say this, I believe he's right about all of this. And the best part is that when his vision for AI glasses comes true, we'll no longer have to patronize Meta in order to participate.
The future of smart glasses is coming into focus. By the end of next year, AI glasses will go totally mainstream. And in the year 2027, the replacement of smartphones as the most important information gadget in our lives will begin.