If you’re excited about the future of augmented reality, or even just a little curious, Meta’s Orion prototype is the most compelling case yet for the technology.
For Meta, Orion is about more than just making AR glasses a reality. It’s also the company’s best attempt to become less dependent on Apple and Google’s app stores and the rules that come with them. If Orion succeeds, we might not need smartphones at all. Mark Zuckerberg has speculated that the glasses could eventually become “the primary way to do computing.”
For now, it’s too early to know whether Zuckerberg’s bet will come to fruition. Orion is still a prototype. Meta hasn’t said when it might become widely available or how much it might cost. Part of that is because the company, which has already spent billions of dollars on AR and VR research, still needs to figure out how to make Orion significantly more affordable than the $10,000 it reportedly costs to make the current version.
In addition, Orion’s hardware and software need to be improved. And, perhaps most importantly, the company will eventually need to convince its massive user base that AI-infused, eye-tracking glasses offer a better way to navigate the world.
Still, Meta has been eager to show off Orion since its unveiling at Connect. And after recently getting a chance to try the glasses myself, I can easily understand why: Orion is the most impressive AR hardware I’ve seen.
Meta’s first AR glasses
Meta has clearly put a lot of effort into making its AR glasses look normal. While Snap has been mocked for its oversized glasses, the size and shape of the Orion is closer to a pair of traditional frames.
Still, they’re noticeably wide and thick. The thick black frames, which house multiple cameras, sensors, and custom silicon, might work on some face shapes, but I don’t find them particularly attractive. And while they look less cartoony than Snap’s AR glasses, I’m pretty sure I’d still get some weird looks if I walked around in public with them. At 98 grams, the glasses were noticeably heavier than my usual prescription frames, but they never felt uncomfortably heavy.
Meta’s Orion glasses are still quite bulky.
In addition to the actual glasses, Orion relies on two other pieces of kit: a 182-gram “wireless compute puck,” which needs to stay near the glasses, and an electromyography (EMG) wristband that allows you to control the AR interface with a series of hand gestures.
The puck I saw came equipped with its own cameras and sensors, but Meta told me it has since simplified the remote control-sized device so that it handles primarily connectivity and processing.
When I first saw the three-piece Orion setup at Connect, my initial thought was that it was an interesting compromise to keep the glasses small. But after trying everything out together, it doesn’t feel like a compromise at all.
You control Orion’s interface through a combination of eye tracking and gestures.
After a quick calibration when you first put the glasses on, you can navigate AR apps and menus by glancing around the interface and tapping your thumb and index finger together. Meta has been experimenting with wrist-based neural interfaces for years, and Orion’s EMG wristband is the result of that work.
The band, which looks like little more than a cloth watch band, uses sensors to detect the electrical signals that accompany subtle movements of your wrist and fingers. Meta then uses machine learning to decode those signals and send them to the glasses.
It might sound complicated, but I was surprised by how intuitive navigation felt. The combination of quick gestures and eye tracking felt much more accurate than hand tracking controls I’ve used in VR. And while Orion also has hand-tracking capabilities, it feels more natural to quickly tap your fingers together than to spread your hands out in front of your face.
What it’s like to use Orion
Meta walked me through several demos of Orion’s capabilities. I asked Meta AI to generate an image and to create a recipe based on a handful of ingredients on a shelf in front of me. The latter is a trick I’ve also tried with the Ray-Ban Meta smart glasses, except with Orion, Meta AI was also able to project the recipe steps onto the wall in front of me.
I also answered a few video calls, including one from a surprisingly lifelike Codec Avatar. I watched a YouTube video, scrolled through Instagram Reels, and wrote a response to an incoming message. If you’ve used a mixed reality headset, a lot of this will feel familiar.
