The 4 amazing new things we just learned about Apple's 'iGlasses'
I'm probably the most bullish, optimistic fan of Apple's mixed reality glasses projects, and I'm going to tell you why in one word: Holograms.
I expect that Apple's efforts will result in the mainstreaming of 3D holographic images as a feature of ordinary, everyday life.
Instead of pouring our minds into the narrow-focus funnel of a tiny smartphone screen to find out what's happening in the wider world, the wider world will be represented by full-size objects in our immediate environment, as will information and content that's specific or exclusive to the places we inhabit.
It's going to be super fun.
Of course, research into smart glasses of all types has been going on for decades. Startups have been churning out and selling mixed reality glasses for years.
But Apple's best role is to take technologies from the fringe and drop them into the mainstream, simultaneously creating demand, a market and an example of how to do it in a consumer-friendly way.
Here's what we already know or can expect with reasonable certainty about Apple's mixed reality or AR glasses:
Apple intends to ship two very different over-the-eyes hardware platforms: 1) VR-like goggles where you're looking at screens; and 2) AR glasses where you're looking through clear glass, but can also see virtual objects that appear to be floating in space.
The bigger, heavier glasses come first, possibly next year; the lighter, everyday glasses come years later.
The bigger mixed-reality glasses may have two 8K screens (which is incredible), up to 14 cameras and lidar, and may cost $3,000.
Apple has been inventing, designing and patenting technologies and concepts for years, and has hundreds of patents in this space.
Apple has been making acquisitions and acquihires for years, hoarding intellectual property and expertise in this space.
Apple has been introducing features and components into iPhones that can be considered precursors to mixed- and augmented-reality glasses, such as lidar and AR support for developers (see the sketch after this list).
Apple CEO Tim Cook has made it clear that Apple is obsessed with augmented reality, and doesn't care that much about virtual reality.
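To make the "precursors" point concrete, here's a minimal sketch of the AR support iPhone developers already have today. It uses Apple's real ARKit API (world tracking, plus lidar-driven scene reconstruction on devices that have the sensor); the bare `ARSCNView` is a simplification, since in a real app the view would live in a view controller.

```swift
import ARKit

// Minimal ARKit setup: world tracking plus lidar-based scene
// reconstruction on devices that have the sensor. In a real app,
// sceneView would be added to the view hierarchy by a view controller.
let sceneView = ARSCNView(frame: .zero)

let configuration = ARWorldTrackingConfiguration()

// Lidar-equipped iPhones and iPads can build a live 3D mesh of the room,
// the same capability a headset would need to anchor virtual objects.
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    configuration.sceneReconstruction = .mesh
}

sceneView.session.run(configuration)
```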
Here's what else we already know.
But this is a fast-moving story. Here are the 4 new things we've learned in the last 24 hours:
Apple may turn its augmented reality glasses into health-related, quantified-self devices that replace or augment the role of the Apple Watch. A patent continuation covered by Patently Apple describes sensors built into glasses that look like ordinary prescription eyewear but can perform extreme feats of biomedical detection. The patent describes the monitoring of head movements, jaw muscle movement, mouth opening, respiratory rate, blood pressure, heart rate, heart rate variability, oxygen saturation, skin moisture, body temperature, body posture, blood glucose levels, stress and more. The glasses would also be able to identify the user biometrically, and detect the user's emotions, thoughts and brain function.
The mixed reality headset may look really nice. What we'd seen previously was a cartoonish, low-res mockup of a high-res render seen by The Information. Now a designer named Antonio De Rosa has reverse-engineered that low-res mockup to create a high-res version. (All the images on this page were created by De Rosa.)
Apple is working with Taiwan Semiconductor Manufacturing Co. (TSMC) to develop what has been described as "ultra-advanced" micro OLED displays "less than 1 inch in size" at a secret facility in Taiwan, according to a report in a Japanese business publication. Micro OLED displays are built directly onto chip wafers rather than glass substrates, which should make them thinner, lighter and more power-efficient than glass-based displays. The technology is still so far from production that if Apple's headset ships next year, it won't include these displays.
Apple hasn't figured out how its glasses will be controlled. The company is still prototyping and testing a wide range of input devices, including rings, gloves (here’s the new patent, which hit yesterday) and in-the-air hand gestures.
My prediction is that the "killer app" for all of Apple's glasses and headsets will be holographic virtual meetings, conversations and chats. Here's how it would work: You've got a meeting, so you put on your Apple headset; or, five years from now, a notification simply appears floating in space while you're already wearing your prescription Apple iGlass product. The glasses biometrically identify you, and the other meeting participants pop up one by one as holographic avatars in your physical space. The avatars will be Memoji-style cartoons that reflect, in real time, the actual body language, gaze, facial expressions and mouth movements of the real people you're talking with. You'll be able to make eye contact with these avatars; when someone is talking, all the avatars, and you, will look toward the speaker, and you'll be able to glance around at the holograms to see where they are looking as well. The sound will appear to come from the direction of the avatar that's speaking.
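The directional-sound part of that prediction, at least, requires no new technology: Apple's existing AVFoundation API can already place a mono voice stream in 3D space around a listener. Here's a minimal sketch; the avatar coordinates are placeholder values, and in a real call the voice buffers would arrive over the network.

```swift
import AVFoundation

// Minimal spatial-audio sketch: position a mono voice stream in 3D space
// so it sounds as if it comes from where a speaking avatar is standing.
let engine = AVAudioEngine()
let environment = AVAudioEnvironmentNode()
let voice = AVAudioPlayerNode()

engine.attach(environment)
engine.attach(voice)

// Spatialization needs a mono source; the environment node mixes it
// into the listener's 3D space.
let mono = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 1)
engine.connect(voice, to: environment, format: mono)
engine.connect(environment, to: engine.mainMixerNode, format: nil)

// HRTF rendering creates the "sound comes from over there" effect in headphones.
voice.renderingAlgorithm = .HRTF

// Place the speaking avatar a meter to the listener's left, slightly ahead.
voice.position = AVAudio3DPoint(x: -1.0, y: 0.0, z: -0.5)
environment.listenerPosition = AVAudio3DPoint(x: 0, y: 0, z: 0)

do {
    try engine.start()
    // ...schedule decoded voice buffers on `voice` as they arrive.
} catch {
    print("Audio engine failed to start: \(error)")
}
```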
The Wall Street Journal this morning embraced this idea in a piece that looked at some of the other companies working to realize it. These include Canada's ARHT Media, which makes a display system called the HoloPod; Spatial, which enables holographic meetings inside the Oculus Quest; and others.
A radical, transformative new user interface only comes along once in a generation. The Apple version of this new world will arrive roughly 20 years after Apple mainstreamed the "multi-touch" user interface on smartphones, which, needless to say, absolutely changed everything.
I believe that within 10 years or so, augmented reality delivered all day through ordinary-looking glasses will replace the smartphone as the main "computer" and user interface to apps and the internet.
That's why every nugget of news around Apple's productization of this platform is monumental. It's great to see Apple's "iGlasses" coming into focus.
Hey Mike, cool ideas, but (yes, I have a but).
I am not so sure about your killer-app prediction. I know a thing or two about AR, and I've been stuck in the Zoomiverse (actually Teams) this past year. The truth is that in a video call, lots of people are not really paying full attention.
Whether the topic isn't really relevant to them or they simply have other things to do, most are just listening "with one ear," ready to jump in once they hear their name.
So the actual killer app will be the one that makes your avatar look as if it's really paying attention :-)
Anyway, I honestly see much more of a future in guidance technology. One of the Apple Watch features my friends use most is navigation as they walk around a new place. AR is ideal for presenting information about your surroundings, whether for delivery, tourism or looking for coffee (or a toilet). Plus, imagine all the augmented billboards :-)
Thanks for the insights. Good job as always.