The race to put augmented reality smart glasses on your face is heating up. Snap Spectacles are transforming into “Specs” and will launch as lighter and more powerful AR wearables in 2026.
CEO Evan Spiegel announced the all-new Specs on stage at the XR event AWE, promising smart glasses that are smaller, considerably lighter, and “with a ton more capability.”
The company didn’t spell out a specific launch date or price, but the 2026 timeframe puts Meta, which is busy prepping its Orion AR glasses for 2027, on notice. It also appears Snap Specs will face off with the Samsung/Google Android XR-based glasses, which are expected sometime in 2026.
As for what consumers can expect from Specs, Snap is building them on the same Snap OS used in its fifth-generation Spectacles (and likely still using a pair of Qualcomm Snapdragon XR chips). That means all the interface and interaction metaphors, like gesture-based controls, will remain. But there are a significant number of new features and integrations that will start showing up this year, long before Specs arrive, including AI.
Upgrading the platform
Spiegel explained the updates by first revealing that Snap started working on glasses “before Snapchat” was even a thing and that the company’s overarching goal is “making computers more human.” He added that “with advances in AI, computers are thinking and acting like humans more than ever before.”
Snap’s plan with these Snap OS updates is to bring AI platforms into the real world. The company is bringing Gemini and OpenAI models into Snap OS, which means some multimodal AI capabilities will soon be part of the fifth-generation Spectacles and, eventually, Specs. These tools might be used for on-the-fly text translation and currency conversion.
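Snap hasn’t detailed how Lenses will tap those models, but the underlying request pattern is familiar. The sketch below, written against the public OpenAI Node SDK, shows the kind of multimodal call a live-translation feature implies; the model name, prompt, and function are illustrative assumptions, not Snap’s actual integration.

```typescript
// Illustrative only: a multimodal translation request using the public
// OpenAI Node SDK. How Snap OS actually exposes Gemini/OpenAI models to
// Lenses has not been detailed; this just shows the kind of call involved.
import OpenAI from "openai";

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

// Translate any text spotted in a camera frame (passed here as a data URL).
async function translateSign(frameDataUrl: string): Promise<string> {
  const response = await client.chat.completions.create({
    model: "gpt-4o-mini", // assumed vision-capable model for the example
    messages: [
      {
        role: "user",
        content: [
          { type: "text", text: "Translate any visible text in this image to English." },
          { type: "image_url", image_url: { url: frameDataUrl } },
        ],
      },
    ],
  });
  return response.choices[0].message.content ?? "";
}
```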
The updated platform also adds tools for Snap Lens builders that integrate with the Spectacles’ and Specs’ AR waveguide-based displays.
A new Snap3D API, for instance, will let developers use GenAI to create 3D objects in lenses.
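Snap hasn’t published what the Snap3D API will look like, so the following TypeScript sketch is purely hypothetical: the Snap3D module name, the generate() signature, and the result shape are all assumptions, meant only to show the shape of a prompt-to-3D-asset workflow.

```typescript
// Hypothetical sketch only: Snap has not published the Snap3D API surface.
// The module name, generate() signature, and result shape are assumptions.
declare const Snap3D: {
  generate(options: { prompt: string }): Promise<{ mesh: unknown }>;
};

async function spawnGeneratedObject(prompt: string) {
  // Ask the (assumed) Snap3D service to turn a text prompt into a 3D asset...
  const result = await Snap3D.generate({ prompt });
  // ...then a Lens would attach the returned mesh to a scene object here.
  return result.mesh;
}
```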
The updates will include a Depth Module AI, which can read 2D information to create 3D maps that will help anchor virtual objects in a 3D world.
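Snap hasn’t described how the Depth Module works internally, but the general idea behind depth-based anchoring is standard: once a pixel has an estimated depth, it can be unprojected into a 3D point that virtual content attaches to. The TypeScript below is a generic pinhole-camera illustration of that step, not Snap’s code.

```typescript
// Generic illustration of how per-pixel depth turns 2D image coordinates
// into 3D points that virtual content can be anchored to. This is standard
// pinhole-camera unprojection, not Snap's actual Depth Module.
interface Intrinsics {
  fx: number; // focal length in pixels (x)
  fy: number; // focal length in pixels (y)
  cx: number; // principal point (x)
  cy: number; // principal point (y)
}

// Unproject pixel (u, v) with metric depth into camera-space coordinates.
function unproject(u: number, v: number, depth: number, k: Intrinsics) {
  return {
    x: ((u - k.cx) * depth) / k.fx,
    y: ((v - k.cy) * depth) / k.fy,
    z: depth,
  };
}

// Example: a tapped pixel at (640, 360) reported 2.5 m away becomes a
// 3D anchor point for a virtual object.
const anchor = unproject(640, 360, 2.5, { fx: 1000, fy: 1000, cx: 640, cy: 360 });
```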
Businesses deploying Spectacles (and eventually Specs) may appreciate the new Fleet Management app, which will let developers manage and remotely monitor multiple pairs at once and deploy them for guided navigation at, say, a museum.
Later, Snap OS will add WebXR support to build AR and VR experiences within Web browsers.
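Unlike the Snap-specific tools above, WebXR is an existing browser standard, so the entry point for an AR session is already well defined. A minimal TypeScript sketch (assuming WebXR type definitions are available in the project):

```typescript
// WebXR is a standard browser API, so a browser-side AR session request
// looks like this regardless of how Snap OS ends up supporting it.
async function startArSession(): Promise<XRSession | null> {
  if (!navigator.xr || !(await navigator.xr.isSessionSupported("immersive-ar"))) {
    return null; // no AR-capable WebXR runtime available
  }
  // Request an AR session with hit testing so content can be placed on surfaces.
  return navigator.xr.requestSession("immersive-ar", {
    requiredFeatures: ["hit-test"],
  });
}
```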
Let’s make it interesting
Spiegel claimed that, through lenses in Snapchat, Snap has the largest AR platform in the world. “People use our AR lenses in our camera 8 billion times a day.”
That is a lot, but it’s virtually all through smartphones. At the moment, only developers are using the bulky Spectacles and their Lenses capabilities.
The consumer release of Specs could change that. When I tried Spectacles last year, I was impressed with the experience and found them full of potential, even if not quite as good as Meta’s Orion glasses (the lack of gaze tracking stood out for me).
A lighter form factor that approaches or surpasses what I found with Orion, and what I’ve seen in some Samsung Android XR glasses, could vault Snap Specs into the AR glasses lead. That is, provided they don’t cost $2,000.