Meta boss Mark Zuckerberg unveiled Orion, a new augmented reality (AR) smart glasses prototype, at the annual Meta Connect developer conference.
Ten years in the making, the glasses promise a new way to merge the real and digital worlds, but are not expected to reach the high street until 2027. They will be controlled with the fingers and eyes via a wrist-mounted neural interface.
What does this mean for how we will interact with computers in future? We asked three technology experts at Bangor University: Peter Butcher, Llŷr ap Cenydd and Panagiotis Ritsos.
Why has Orion been such an engineering challenge?
Packing such advanced technology into something so compact poses serious engineering challenges. These include new holographic display systems, hand and eye tracking, off-device processing, cameras, speakers and microphones – all while ensuring the device remains visually appealing and has good battery life.
Meta’s chief technology officer, Andrew Bosworth, recently captured the scale of the problem: “In consumer electronics, it might be the most advanced thing that we’ve ever produced as a species.”
The visual design is a huge obstacle. Mixed reality headsets such as the Meta Quest 3 and Apple Vision Pro rely on “passthrough” technology, in which external cameras capture real-time images of the user’s surroundings. These images are displayed inside the headset, with digital elements overlaid.
By contrast, Orion’s holographic projection lets users see directly through clear lenses, with images projected into their view. This has demanded substantial R&D.
Are there other notable innovations?
The field of view – the angular range the user can see through the headset – is one of the key factors in how immersive a mixed reality headset feels. The state of the art is the Magic Leap 2’s 70° field of view. These larger holographic AR glasses, designed for businesses and currently priced above US$3,000, are made by Magic Leap, a US company whose backers include AT&T and Google.
With Orion, Meta has achieved a 70° field of view in a much smaller device – a major innovation, and crucial for Zuckerberg’s vision of an unobtrusive wearable.
The neural interface wristband is also crucial. It reads the nerve signals travelling from the brain to the hand, letting users control the device with subtle finger movements such as pinching or swiping the thumb against the index finger. Similar controls exist on other mixed reality headsets, such as the Apple Vision Pro, but they rely on external cameras to interpret hand movements.
The benefit of tapping directly into nerve signals is that the wearer only needs to intend the gesture, and may not even need to perform it fully.
The technology also opens up brand new input methods, such as typing by mimicking handwriting, and is likely to mature before consumer-grade holographic displays become available.
Has Orion been more trouble than Meta expected?
Meta initially gave the Orion prototype only a 10% chance of success, so it has exceeded expectations. While there is still much work to be done, particularly in reducing costs and miniaturizing components, Orion could eventually lead to a consumer-ready device.
Do you believe Meta will have a more affordable version by 2027?
Meta expects the initial cost to be comparable to laptops or flagship phones; the new iPhone 16 starts at £799 (US$1,069). As with VR headsets a decade ago, development kits targeted at early adopters and developers are likely to be released first.
In the meantime, applications that could eventually run on AR glasses can be developed using mixed reality headsets such as the Meta Quest 3 and Apple Vision Pro.
Why are the Orion glasses still so expensive?
Holographic AR glasses remain expensive because most of the hardware, including LEDoS (LED-on-silicon) micro-display panels and silicon carbide waveguides, isn’t yet produced at scale.
These components are crucial for producing high-resolution holographic displays, and production constraints are reported to push the cost of each Orion unit close to US$10,000. Even so, battery life is currently only about two hours.
Could anyone beat Meta to market?
Thanks to Meta’s multibillion-dollar investment in R&D through its Reality Labs subsidiary, it has become a leader in virtual and mixed reality headsets, with a robust app ecosystem. However, Apple, Microsoft, Samsung and Google are developing similar technologies.
Microsoft’s HoloLens and Snapchat owner Snap Inc.’s Spectacles series have made advances in AR, but results have been mixed due to limitations such as a narrow field of view and poor graphics.
Orion appears to be ahead in holographic display technology. Apple, another company to watch, is developing the Vision Pro and exploring AR smart glasses.
Will AR glasses change the world?
AR-enabled devices could trigger an “iPhone moment” that forever alters how we interact with technology. Zuckerberg sees them as the next major computing platform – a more natural and user-friendly alternative to smartphones.
The success of early mass-market smart glasses, such as Meta’s Ray-Ban glasses that let users make calls, capture videos and interact with Meta AI, hints that AR glasses could see widespread adoption.
Zuckerberg initially believed holographic technology would be required before smart glasses could expand beyond the Ray-Bans’ basic features. However, the ability to incorporate an AI-powered voice assistant has made Meta realize that smart glasses can create a new consumer product category on their own.
While the four-hour battery life needs improvement, positive feedback from reviewers and users – particularly those sharing footage on Instagram and TikTok – demonstrates the potential.
What will the future hold?
Reading messages, watching a virtual screen on the wall, playing games, collaborative work – everything you can do with mixed reality headsets, but shrunk down to a pair of glasses. A friend will be able to video call into your living room, with both of you feeling present in the same space.
Virtual assistants can already use smart glasses to see what you see, hear what you hear, talk to you, answer questions and follow commands. Stranger still, AI will eventually be able to appear in your field of view, allowing you to hold natural conversations with it.
By 2030, AI will radically change the ways in which we interact with each other, our physical world and computers. Orion aims to prepare us for a world in which the physical, artificial and virtual co-exist.
Llŷr ap Cenydd is a lecturer in computer science, Bangor University; Panagiotis Ritsos is a senior lecturer in visualisation, Bangor University; and Peter Butcher is a lecturer in human computer interaction, Bangor University.
This article was republished from The Conversation under a Creative Commons license. Read the original article.