The Future of Smart Glasses

In technology, some devices achieve widespread recognition while many others fail to gain traction. This year marks a decade since Google announced it would stop producing Google Glass, and for a long while it seemed that mixed-reality products (devices that, unlike virtual reality headsets, do not entirely obscure the user’s field of vision) would remain the domain of enthusiasts rather than mainstream consumers.

Fast forward ten years, and smart glasses are poised to become, dare we say, fashionable. Meta’s smart glasses, developed in collaboration with Ray-Ban, closely resemble the iconic Wayfarers famously worn by Tom Cruise in Risky Business. Additionally, Meta has recently showcased its stylish Orion augmented reality glasses prototype, while Snap has introduced its fifth-generation Spectacles, both of which would seamlessly fit into the trendiest urban environments. In December, Google unveiled its new unnamed Android XR prototype glasses, and speculation regarding Apple’s ongoing development of a long-anticipated glasses project continues to circulate. Furthermore, Chinese technology giants such as Huawei, Alibaba, Xiaomi, and Baidu are also competing for a share of this emerging market.

The sleeker designs of this new generation of glasses undoubtedly enhance their appeal. More significantly, smart glasses are on the cusp of becoming genuinely functional, and it is evident that major technology companies are betting on augmented reality eyewear as the next significant consumer device category. Below, we outline expectations for smart glasses in 2025 and beyond.

The Role of AI Agents in Making Smart Glasses Useful

Although mixed-reality devices have existed for decades, they have mostly been used in specialized fields such as medicine, construction, and technical remote assistance, where they are likely to remain, perhaps in increasingly niche roles. Microsoft built the best known of these devices, which overlay virtual content on the user’s real-world surroundings, and marketed its HoloLens 2 smart goggles primarily to corporate clients. The company recently confirmed it is discontinuing the device and will instead focus on developing headsets for the U.S. military in collaboration with Palmer Luckey’s latest venture, Anduril.

The general public may soon gain access to devices that are genuinely usable. The artificial intelligence sector is currently abuzz with agents, systems that extend large language models (LLMs) so they can perform tasks autonomously. Over the past year, multimodal LLMs have become markedly better at processing video, images, and audio alongside text, opening up applications for smart glasses that were previously out of reach, says Louis Rosenberg, an augmented reality researcher who worked on the first functional augmented reality system at Stanford University in the 1990s.

Meta has expressed a clear interest in AI agents. Although the company announced in September that it has no immediate plans to market its Orion prototype glasses due to their high cost, Mark Zuckerberg has raised expectations for future iterations of Meta’s smart glasses by declaring Orion the “most advanced pair of AR glasses ever made.” He has also emphasized Meta’s commitment to delivering a “highly intelligent and personalized AI assistant” to a broad user base, asserting that Meta’s glasses represent the “ideal form factor for AI.”

Meta is already enhancing the conversational abilities of its Ray-Ban smart glasses’ AI, including a new live AI feature that responds to prompts about what the wearer can see and hear. Future agents are expected to go further, giving these systems contextual awareness of the wearer’s surroundings on top of that sensory input, according to Rosenberg. Agents running on smart glasses could, for instance, strike up spontaneous, interactive conversations based on the environment, reminding wearers to purchase orange juice as they pass a grocery store or identifying a colleague who walks by. Google has shown a keen interest in this agent-centric approach as well: the unnamed smart glasses it first presented at Google I/O in May 2024 are powered by its Astra AI agent system.
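To make that idea more concrete, here is a minimal sketch of the kind of proactive agent loop described above: sense the scene, match it against the wearer’s goals, and only then speak up. It is illustrative only; the describe_scene helper stands in for a call to a multimodal model, and none of the names correspond to Meta’s or Google’s actual software.

```python
# Illustrative sketch of a proactive smart-glasses agent loop.
# All names here (Reminder, describe_scene, proactive_prompt) are hypothetical.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Reminder:
    task: str     # e.g. "buy orange juice"
    trigger: str  # scene keyword that should surface the reminder


def describe_scene(camera_frame: bytes) -> str:
    """Stand-in for a multimodal LLM call that turns a camera frame (plus audio
    and location, in a real system) into a short text description."""
    return "pedestrian street, grocery store entrance on the left"


def proactive_prompt(scene: str, reminders: List[Reminder]) -> Optional[str]:
    """Decide whether the agent should speak up, given the described scene."""
    for reminder in reminders:
        if reminder.trigger in scene:
            return f"You're passing a {reminder.trigger}: you wanted to {reminder.task}."
    return None  # otherwise stay silent; unsolicited chatter would wear thin


if __name__ == "__main__":
    reminders = [Reminder(task="buy orange juice", trigger="grocery store")]
    frame = b""  # placeholder for JPEG bytes from the glasses' camera
    suggestion = proactive_prompt(describe_scene(frame), reminders)
    if suggestion:
        print(suggestion)  # real glasses would speak this or render it in-lens
```

The structure is the point: the model only perceives, while a separate, inspectable rule decides when it is worth interrupting the wearer.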

“Having worked on mixed reality for over 30 years, this is the first time I can envision an application that will genuinely drive mass adoption,” Rosenberg states.

Competition Between Meta and Google for Market Leadership

The timeline for achieving mass adoption of smart glasses remains uncertain. During a recent earnings call, Zuckerberg indicated that 2025 would be a “defining year” for assessing the future of AI glasses and determining whether they will experience a surge in popularity or represent “a longer grind.”

Zuckerberg has reason for optimism; Meta currently leads its competitors, having sold over one million units of the Ray-Ban Meta smart glasses last year. The company is also preparing to introduce new styles through a partnership with Oakley, which, like Ray-Ban, is part of the EssilorLuxottica brand portfolio. While the current second-generation glasses do not display digital data or notifications, a third version featuring a small display is anticipated for release this year, according to the Financial Times. Additionally, reports suggest that Meta is developing a lighter, more advanced version of its Orion AR glasses, referred to as Artemis, which could be available as early as 2027, as reported by Bloomberg.

The addition of a display will put the Ray-Ban Meta glasses on a par with Google’s unnamed Android XR glasses project, which features an in-lens display (the company has yet to announce a release date). The prototype demonstrated to journalists in December ran a version of Google’s AI chatbot, Gemini, and, much as Google built its Android operating system to run on third-party smartphones, its Android XR software is expected to run on smart glasses produced by other manufacturers as well as its own.

These two major players are engaged in a competitive race to deliver face-mounted AI to the masses, a contest that is likely to intensify, as noted by Rosenberg—especially given that both Zuckerberg and Google co-founder Sergey Brin have described smart glasses as the “perfect” hardware for AI. “Google and Meta are the leading technology companies that are most advanced in the AI space independently. They are exceptionally well positioned,” he asserts. “This is not merely about augmenting your environment; it is about augmenting your cognitive capabilities.”

Challenges in Developing Smart Glasses

While advancements in technology have made smart glasses easier to develop, turning them into a successful product remains a complex challenge. At CES, the sprawling consumer electronics exhibition held annually in Las Vegas, Michael Miller of the AR gaming company Niantic observed a significant number of smaller companies working on their own glasses and supporting systems, including Chinese brands such as DreamSmart, Thunderbird, and Rokid. Building a prototype still requires several million dollars of investment, but this influx of smaller players suggests that the sector’s future will not rest solely with major technology firms.

“On both hardware and software fronts, the barriers to entry have significantly decreased,” states Miller, who leads augmented reality hardware at Niantic, a company that has collaborated with Meta, Snap, and Magic Leap, among others. “However, transforming these technologies into viable consumer products remains challenging. Meta has successfully leveraged the Ray-Ban brand, which provides them with a competitive advantage. It is difficult for lesser-known brands to market their glasses effectively.”

Consequently, it is likely that ambitious smart glasses manufacturers in countries such as Japan and China will increasingly seek partnerships with local eyewear companies renowned for producing desirable frames, thereby generating momentum in their domestic markets before expanding internationally.

The Role of Developers in Shaping Smart Glasses Experiences

These smaller entities will play a crucial role in creating new experiences for smart glasses users. The utility of smart glasses largely depends on how well they can exchange information with a user’s smartphone, and on the interest of third-party developers in building applications for these devices. The more a pair of glasses can do, the more likely consumers are to buy one.

Developers are currently awaiting Meta’s release of a software development kit (SDK) that would enable them to create new experiences for the Ray-Ban Meta glasses. While larger brands are understandably cautious about granting third parties access to the discreet cameras on smart glasses, this limitation restricts opportunities for researchers and creatives to innovate, according to Paul Tennent, an associate professor at the Mixed Reality Laboratory at the University of Nottingham in the UK. “Historically, Google has been somewhat more open to this,” he adds.

In contrast, Snap and smaller brands have readily made their SDKs available, much to the delight of developers, according to Patrick Chwalek, a student at the MIT Media Lab who worked on the smart glasses platform Project Captivate as part of his PhD research. They include Brilliant Labs, whose Frame glasses run multimodal AI models including Perplexity, ChatGPT, and Whisper, and Vuzix, which recently launched its AugmentOS universal operating system for smart glasses. “Vuzix is gaining popularity at various universities and companies because it allows users to build experiences on top of their platform,” Chwalek notes. “Most of these applications are related to navigation and real-time translation, and I anticipate we will see numerous iterations of these functionalities in the coming years.”
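As a rough illustration of what those translation-style apps involve, the sketch below uses the open-source Whisper package (installed with pip install openai-whisper) to turn a short foreign-language audio clip into English text. The clip path and the glasses integration are placeholders; in a shipping product the audio capture and in-lens display would go through the vendor’s own SDK.

```python
# Rough sketch of a translation-style feature using the open-source Whisper package.
# Assumes: pip install openai-whisper; "clip.wav" is a short recorded audio clip.
import whisper


def translate_clip(path: str) -> str:
    """Transcribe a short audio clip and translate it into English text."""
    model = whisper.load_model("base")  # small model; larger ones are more accurate
    result = model.transcribe(path, task="translate")  # task="translate" targets English
    return result["text"]


if __name__ == "__main__":
    # In a glasses app, the clip would come from the device microphone via the
    # vendor's SDK, and the returned text would be shown in-lens or spoken aloud.
    print(translate_clip("clip.wav"))
```

Genuinely real-time use would chunk the microphone stream and run a small model on-device or stream audio to a server, but the pipeline is the same: audio in, translated text out.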
