Meta chief executive Mark Zuckerberg on September 25 unveiled the first functional prototype of the company's much-anticipated augmented reality (AR) glasses, called Orion, as the race to build the next major computing platform heats up.
During the company's annual Connect conference, Zuckerberg claimed that Orion is the "most advanced glasses the world has seen".
The AR glasses, previously codenamed Project Nazare, bridge the physical and virtual worlds by combining the look and feel of a regular pair of glasses with immersive AR capabilities. They enable people to place 2D and 3D content and experiences in their surroundings, using the physical world as a canvas.
The glasses use artificial intelligence (AI) to analyse and understand the user's surroundings in order to anticipate and proactively address their needs.
Zuckerberg said the product has been in the works for nearly 10 years and represents the full vision the company is working towards. "With Orion, we are getting closer to achieving the dream of Reality Labs to create the next major computing platform that delivers a deep sense of presence like you were right there with another person," he said.
This launch comes after Snapchat parent Snap Inc last week unveiled the fifth generation of its Spectacles AR glasses along with a brand new operating system, Snap OS, to power these devices.
Similar to Spectacles 5, Orion will not be available for consumers to purchase. Zuckerberg said it will remain a prototype, available to Meta employees and a handful of external partners to learn from, iterate on and build towards the next version of the hardware, which will be the first full-holographic AR glasses for consumers.
"We do have a few things that I want us to keep pushing on before we ship this as a consumer product. We're going to keep tuning the display system to make it sharper. I want to keep working on the design to make it smaller and a bit more fashionable, and we need to keep working on the manufacturing to make it a lot more affordable," Zuckerberg said.
What features does Orion have?
Orion has a 70-degree field of view, which Meta claims is the largest field of view in the smallest AR glasses form factor to date. This unlocks several immersive use cases, from multitasking windows and big-screen entertainment to life-size holograms of people that can blend into the user's view of the physical world.
Meta Orion AR glasses prototype with a neural wristband and a wireless compute puck (Image: Meta)
Orion uses a new kind of display architecture: tiny uLED projectors in the arms of the glasses shoot light into waveguide lenses that have nanoscale 3D structures etched into them, which diffract the light to place holograms at different depths and sizes in the world in front of the user, Zuckerberg said.
The lenses are made of lightweight silicon carbide, while the frames are made of magnesium, the same material used in F1 race cars and spacecraft. The glasses will be powered by a battery that fits in the arm of the glasses, Zuckerberg said.
People will be able to interact with the glasses through AI voice control, hand and eye tracking, and a wrist-based neural interface through an EMG (electromyography) wristband.
For example, if someone messages a user, a small hologram window pops up, and the user can reply with a few subtle gestures. Users can also video call friends and family, who appear as life-size holographic avatars in the room, or play holographic games such as ping pong or chess with them.
Meta has also built a wireless compute puck for Orion that offloads some processing tasks from the glasses, thereby enabling a longer battery life and a better form factor with low latency.
New features for Ray-Ban Meta glasses
Zuckerberg also announced new AI features for the company's Ray-Ban Meta glasses that help users remember things like where they parked, translate speech in real time, and answer questions about what they are seeing.
The company also announced expanded integrations with Spotify and Amazon Music, and new app integrations with Audible and iHeart, to let people easily access more content from the glasses.
Zuckerberg said the company is seeing the emergence of a new AI-centric device category, while highlighting its struggle to keep up with demand for the Ray-Ban Meta glasses.
During his keynote speech, he also announced Quest 3S, an entry-level version of its Quest line of mixed-reality headsets. Quest 3S will feature the same mixed reality capabilities and fast performance as Quest 3, but at a lower price point of $300.
New AI enhancements
Apart from hardware announcements, the Meta chief introduced new features and product enhancements to the social networking giant's AI products. These include Meta AI, the company's AI chatbot, which now allows users to ask questions through voice and receive answers aloud, including in the voices of celebrities such as John Cena, Kristen Bell, and Dame Judi Dench.
The firm is piloting a Meta AI translation tool that will automatically translate the audio of Reels. It is also testing personalised images in people's Facebook and Instagram feeds, generated automatically by Meta AI based on their interests or current trends.
Zuckerberg also released Llama 3.2, the latest iteration of the company's open-source Llama large language model.
Llama 3.2 comes with multimodal support, which means it can understand different formats, such as text and images, at the same time. It will be available in four variants: small and medium-sized vision LLMs (11B and 90B), and lightweight, text-only models (1B and 3B).