smart glasses – Live Laugh Love Do

Meta Ray-Ban Display Glasses Review: Halfway to the Future
Thu, 16 Oct 2025

If you see someone nearby wearing thick Ray-Ban glasses, maybe staring off into space a bit and making small gestures with their fingers, you could be witness to the next big piece of wearable tech. Gesture-enabled smart glasses are here in the form of Meta Ray-Ban Displays, and I’ve been wearing them for about two weeks now, off and on. Yes, I’m one of those people.

Will you eventually be one of those people, too? Well, start by asking yourself whether you even want a display hovering around near your eyes, able to be called into existence with a double tap of a middle finger and thumb. Do I? Yes and no.

Watch this: My Life With Meta Ray-Ban Displays: A Weird Wild Future

Meta’s latest $800 glasses feel to me like a transformational gadget for life. At their best, they reveal magic glimpses of a subtle interface, another layer of information on the world, with a display that conjures itself on demand. At their worst, they highlight the numerous missing pieces still needed to make smart glasses truly essential. Including, by the way, prescription support for my eyes. Right now, I’m testing them with contact lenses on.

Also, I have fundamental concerns about distraction and safety while wearing them.

Meta Ray-Ban Display glasses in black and a Neural Band

Meta Ray-Ban Display

Rating: 7.0

Like


  • Nearly invisible in-lens heads-up display

  • Impressive gesture controls via included wristband

  • Assistive captioning and maps apps are truly useful

  • Viewfinder and zoom functions for taking photos and videos

Don’t like


  • Shorter battery life than standard, display-free Meta Ray-Bans

  • Neural band can feel annoying to wear

  • Few apps and phone-connected functions

  • Can’t mirror phone on the display, just certain apps

As they currently exist, the Meta display glasses I review here are not as useful as a smartwatch or as good a value as display-free smart glasses, such as the standard Meta Ray-Bans. Even if you want to be one of those people, you should probably wait until this impressive technology matures a bit.

But the technologies inside these glasses — near-invisible display tech provided by reflective waveguides, and wild neural band technology driven by electromyography — are going to show up in more glasses and wearables eventually. It’s early days for significant advancements in tech for our wrists and faces, and while these glasses are technical achievements, they currently feel like a beta test of things to come.

CNET's Scott Stein wearing the new Meta Ray-Ban Display smart glasses and neural wristband.

The tiny screen embedded in Meta’s Ray-Ban Display glasses is only visible to the wearer. It’s controlled by gestures that are sensed by the included neural wristband.

Scott Stein/CNET

Now I feel like an everyday cyborg

On the surface, the new display glasses look a lot like Meta’s existing audio and camera-enabled Ray-Ban and Oakley smart glasses. The difference is that they have a heads-up display in one eye to show apps and information, along with a gesture-enabled wristband you wear to control the display. 

The display in Meta’s Display Glasses isn’t as capable as a VR headset or Tony Stark specs, however. You won’t see 3D things in these glasses. All they do is project a single display in a single eye, flat and 2D. Essentially, they’re like an evolved pair of Google Glass (circa 2013) for the modern age.

The really advanced ideas come on the included Neural Band, which can register subtle hand gestures. Little taps and swipes allow me to control what’s on the screen, like a clickable mouse made of my fingers. Subtle vibrations give me feedback as I tap.

The wrist and display upgrades on these glasses make the whole experience feel equal parts futuristic and odd. I’m sort of an everyday cyborg who can summon screen readouts into my vision. Sometimes it feels like my life has become a first-person video game. Other times it feels like I’ve glued a smartwatch to my face or given my eye Apple CarPlay.

CNET's Scott Stein wearing Meta Ray-Ban Display glasses with sunglass mode activating

Meta Ray-Ban Display glasses on my face, transition lenses activating. Do you notice I’m recording?

Numi Prasarn/CNET

Looks: Subtle, yet not that subtle

Compared with some other augmented reality specs and smart glasses I’ve tried, Meta’s look surprisingly stylish for big, chunky glasses. The frames come in either black or semi-transparent sand-colored (brown) and make the standard Ray-Bans look slim and low-key by comparison. I love the way these glasses look on my face, but for the record, my family doesn’t.

They’re heavier than non-display Ray-Bans, but not by much (69 grams, compared to 49 grams). They still feel premium, solid and comfortable to wear. The thick arms have hinge springs that bend back to reduce tension on the sides of my head, and they don’t exert pressure on my temples.

It’s pretty amazing how the display inside the right lens isn’t visible to other people looking at me, even when it’s on. 

When it’s off, you have to look closely to make out the reflective waveguide tech that creates the image. It looks like a series of small lines on the side of the lens. There’s also a narrow vertical strip down the side of the lens that’s visible at certain angles. The waveguide tech here is a lot better than anything else I’ve seen, and a sign of how invisible in-glass displays could be.

That tech has a major downside, however: Meta can only make these Ray-Bans with prescriptions that range from minus 4 to plus 4. My eyes are over minus 8. I’ve had to wear contact lenses to test them for this review, defeating the whole idea of these actually being my everyday glasses. I hope Meta can figure out how to work with more prescriptions — not just for myself, but for anyone else hoping to buy them. Signs are strong that this will happen for this type of lens tech, but exactly when remains a mystery.

A photo of two hands, one wearing Apple Watch, one wearing Meta Neural Band

My two hands now: watch on left, gesture-controlling neural band on right. (Shot on Meta Ray-Ban Display glasses and cropped.)

Scott Stein/CNET

Neural band: Amazing and also awkward

To control these glasses, Meta invented a whole new wearable that looks like a screenless fitness tracker. Called the Meta Neural Band, this fabric-covered device has an array of flat electrodes on the inside that push gently against my wrist, measuring electrical impulses via EMG (electromyography) technology. 

The band interprets these signals as one-handed gestures that control the glasses’ screen. You’re supposed to wear it tightly on your dominant wrist, above the wrist bone, higher up than a normal smartwatch. The magnetic clasp tightens easily, though, and the band feels like a fitness tracker. 

The band has no function other than controlling the glasses. It charges with its own magnetic pin cable, has water resistance for splashes but not for swimming, and lasts about a day on a charge. 

I hoped it would feel like I had magical powers to control what I saw. In practice, the magic works in bits and pieces. The wristband only recognizes a narrow set of gestures, which control all the navigation on the display as I move between windows and apps. 

Learning the gestures takes effort. Double-tapping my middle finger and thumb summons the display, and other gestures go back or confirm a selection. To choose an app, I have to swipe my thumb and then tap my forefinger and thumb. After a while, all that movement can sometimes make my hand cramp.

Small actions can be subtle and fascinating. Double-tapping my thumb on my closed fist activates Meta AI voice prompts, allowing me to quietly ask about things, take a photo, play music or read a message. For music playback, there’s a quick tap, and a finger pinch-and-twist for volume. I can do all of this with my hands at my sides while walking. 

Sometimes the gestures don’t activate — when holding a grocery bag with that hand, holding a steering wheel or reaching in my pocket. Sometimes I activate them accidentally, like I did during a ZDNet podcast when, gesturing with my hands, I kept activating Meta AI.

You could use the right arm of the glasses instead of the band. It has a trackpad that scrolls in multiple directions and has single- and dual-finger touch gestures. It’s awkward, however. The band, which comes with the glasses, still feels essential if you’re really going to try living with them.

Scott Stein of CNET talking to glasses and showing his words as captions in-lens

We got our camera behind the glasses to show what live captioning really looks like. It’s pretty wild.

Numi Prasarn/CNET

AI powers: I can caption real life 

Meta AI is designed to answer questions, open apps, send messages or even use the cameras to analyze something in front of me and attempt to translate or describe it to me. These AI functions are pretty much exactly the same as what’s on Meta’s other Ray-Ban and Oakley glasses, with the same hit-and-miss accuracy. But now, I can also see responses on-screen in text and sometimes graphic form. 

One feature unique to these glasses is assistive captioning, and it might be the most magical feature of all. It uses the microphones to focus only on the speaker in front of me and transcribes what they say into text that appears in the display moments later. I can see people wanting these just for the captioning alone.

But for continuous AI analysis of the world using the cameras, you have to activate Live AI mode, which drains the battery very quickly. Expect an hour or less of use that way, versus up to 6 hours otherwise in more casual modes.

Heads-up maps in the lens of Meta Ray-Ban Display glasses

Maps can pop up in-display and show navigation in some cities, too. Truly useful, although potentially distracting.

Numi Prasarn/CNET

Display apps: Few and far between, but signs of magic

For all of Meta’s promises about a transformative future of world-aware AI glasses, there’s not a ton of hyper-intelligent new stuff going on when I’m wearing them. They mostly bring up a dashboard of certain go-to apps on demand, overlaid for me to scan quickly. Unlike Meta’s future promises of contextual AI that can truly know what you need at any moment, a lot of my use is more deliberate, like a smartwatch.

The color display in these glasses is high-res, crisp and detailed. It’s also ghostly looking, both because it’s semi-transparent and it’s only in one eye. Reading it with one eye was fine, but it made me wish for a wider field of view. 

Apple Music album readout for Starship Troopers playing on Meta Ray-Ban Display glasses.

Playing a little Starship Troopers, as I do on deadline.

Numi Prasarn/CNET

It’s also visible even in bright daylight thanks to transition lenses in the glasses. The sunglass mode activates quickly, and I’ve been able to see messages even in the brightest head-on sun. 

I mainly used the display for things like quick readouts, thumbnails of photos or a map to glance at. It’s not a full dashboard for my phone, and I can’t use it to play back videos. When I listen to a Jets game played via Bluetooth audio from the NFL app on my phone, I can’t see the game itself. In that sense, these aren’t display glasses like Xreal or Viture glasses that actually mirror your whole phone via USB-C.

The camera viewfinder app inside Meta Ray-Ban Display glasses' lens

A little idea of what the heads-up camera viewfinder feels like when wearing the glasses. You can zoom in with your fingers.

Numi Prasarn/CNET

The 10 apps Meta has on these glasses are all Meta-made, and it shows. Facebook Messenger, Instagram and WhatsApp are the primary ways to chat or have live video calls, where someone could also see your camera feed. There’s also a basic music player that works with Apple Music, Amazon or Spotify. 

You can look through photos and videos taken on the glasses, or use the camera app to get a live viewfinder, and even zoom in on your shot digitally by pinch-and-twisting your fingers. It’s a wild idea, but the digital zoom can feel buggy and a bit hard to control.

An onboard maps app is fascinating, and can bring turn-by-turn directions to my eyes as I walk or even while driving. The pop-up turn indicators seem useful as I walk through my town, but not necessarily any more than directions from earbuds or my smartwatch.

Overall, the collection of apps is no substitute for my phone, and Meta hasn’t even built deeper hooks into its own apps like Facebook or Messenger. Google’s expected wave of AI-enabled glasses coming next year could be better at accessing Android phones, at least. Meta needs to figure out how to extend its glasses feature set and apps while navigating Apple and Google’s garden walls and app stores.

Ray-Ban Display glasses and neural band together on a white pedestal

The glasses and the neural band both need charging. The band lasts a whole day, but the glasses only last several hours.

Numi Prasarn/CNET

Battery life: Now there are two more things to charge

If you’re like me, you already have a lot of things to charge every day: a phone, a smartwatch, maybe a pair of earbuds. Meta adding two more – the glasses and the neural band – feels like a lot.

To charge, you snap the glasses into a collapsible carrying case with a battery pack and USB-C port. The glasses charge quickly, but battery life lasts only 2 to 6 hours in my everyday use so far. That’s less than Meta’s screen-free second-gen Ray-Bans. Sooner or later, I need to recharge during the day, which means carrying a second pair of glasses.

Luckily, since I’m wearing contacts to test these, that’s no big deal. But if these were my everyday glasses, it wouldn’t be great.

The neural band, meanwhile, has its own special charge cable and lasts up to a full day on a charge. While that eases my charging stress, I still need to manage the glasses. I find myself checking the battery status for the band and glasses throughout the day now, just like I do with my phone. 

There should be an easier way to charge these glasses on the fly, using swappable batteries or a tethered cable. Until they can achieve a full day of battery life, that shortfall will hold them back as a true life assistant.

Privacy is a total unknown, safety is a concern

Meta’s one of the worst of the big tech companies when it comes to handling data and privacy. Meta tends to suck up data for unclear purposes or for serving ads (which don’t appear on these glasses at all, yet). 

Many people I know are hesitant to use Meta glasses at all for these reasons, and I get it. I also don’t know how Meta will handle the evolution of more advanced world-aware AI on these glasses down the road.

Could I sneak a photo of you using these glasses? Yes, and more easily than before, since now I can trigger the camera subtly with my fingers by my side. There’s still an LED light that turns on when the camera’s in use, but it’s easy to miss, especially in bright daylight.

I’m also concerned about safety. Having a display on my face while walking or especially while driving is a potentially serious distraction. The glasses do have a driving awareness mode and an audio-only mode, but neither is activated by default. They also made no automatic recommendation to deactivate the display when driving, something I think Meta should add immediately.

Meta Ray-Ban Display glasses on a white table

How will Meta make these glasses work better with our phones, and our lives?

Numi Prasarn/CNET

The future is more AI companies aiming for your face

The template for what Meta is showing off for these glasses isn’t some out-on-a-limb concept. Google, Amazon and Apple are all expected to have glasses of their own in the next couple of years, mixing in heads-up displays and more AI-assisted features, possibly adding wrist-based controls or hand gestures, too.

Meta has plans to turn these into fully augmented reality devices capable of layering 3D into the world, like the prototype Orion glasses I tried last year. Ray-Ban Displays aren’t like that yet, but they’re also the first of their kind.

My time with the Ray-Ban Displays reminds me of the early days of smartwatches, when they felt nearly ready to be on our wrists all the time. Nearly, but not quite. These Display glasses are like prototypes, but the landscape is changing fast, and Meta will need to refine the next generation further. That could happen as soon as next year, and by then, many other pairs of glasses will be ready for your eyes, too. While they’re the most advanced smart glasses out there right now, they’re not the most practical ones to wear.

In the meantime, I recommend the display-free Ray-Bans for most people. If you’re ready to be an early adopter of neural wrist tech for $800, dive right in. Personally, I think my eyes need a bit of a break.



Prediction: 1 Artificial Intelligence (AI) Stock Will Be Worth More Than Nvidia and Palantir Technologies Combined by 2030
Sun, 31 Aug 2025

Meta Platforms is using artificial intelligence to strengthen its advertising business, and its Orion augmented reality glasses could be the next big consumer electronics product.

Interest in artificial intelligence went parabolic following the release of ChatGPT in late 2022. Since then, Nvidia stock has advanced 1,090% to a market value of $4.2 trillion. And Palantir Technologies stock has climbed 2,340% to a market value of $370 billion. That means the companies are collectively worth $4.6 trillion.

I predict Meta Platforms (META) will surpass that figure in no more than five years (i.e., before the end of 2030). The company is currently worth $1.9 trillion, which means its share price must increase by about 147% for its market value to reach $4.7 trillion. Here’s why I think that could happen.
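As a quick sanity check, the required increase falls directly out of the two market values quoted here. This is a minimal sketch using only the article's stated figures, not the author's own math:

```python
# Check the share-price increase needed for Meta's market cap to reach
# the combined Nvidia + Palantir figure, holding share count fixed.
current_value = 1.9e12  # Meta's market cap today, per the article
target_value = 4.7e12   # combined Nvidia + Palantir value to beat

required_increase = target_value / current_value - 1
print(f"Required increase: {required_increase:.0%}")
```

The ratio 4.7 / 1.9 is about 2.47x, i.e., an increase of roughly 147%.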

A bull figurine stands in front of stock price charts.

Image source: Getty Images.

Meta Platforms is a digital advertising giant with deep AI expertise

Meta Platforms owns three of the four most popular social media platforms as measured by monthly active users. That competitive advantage lets it collect consumer data on a tremendous scale, and that data helps brands target ad campaigns. As a result, Meta is the second-largest adtech company worldwide and is likely to gain market share, according to Morningstar.

Meta has already made strides in boosting engagement with artificial intelligence (AI). CEO Mark Zuckerberg told analysts on the second-quarter earnings call, “Advancements in our recommendation systems have improved quality so much that it has led to a 5% increase in time spent on Facebook and 6% on Instagram.” He also said that advertising conversion rates increased across both social media platforms, meaning more clicks and purchases.

Importantly, Meta is investing aggressively in AI infrastructure and aspires to automate the entire ad creation process by next year. The Wall Street Journal writes, “Using the ad tools Meta is developing, a brand could present an image of the product it wants to promote along with a budgetary goal, and AI would create the entire ad, including imagery, video, and text.”

Meta’s Orion smart glasses could be the next big consumer electronics product

Meta Platforms is the market leader in smart glasses, a nascent market where shipments more than tripled last year and are forecast to grow more than 60% annually through 2029. And Meta is actually gaining market share. Its Ray-Ban smart glasses accounted for nearly three-quarters of shipments in the first half of 2025, up from 60% in 2024.

Counterpoint Research writes, “Ray-Ban Meta smart glasses redefine the smart glasses experience by integrating wearable AI while combining a stylish design with enhanced smart functionalities.” The company sees a large opportunity on the horizon. Zuckerberg believes smart glasses could replace smartphones as the personal computing form factor of choice within the next 15 years.

To capitalize, Meta announced Orion last year, smart glasses that incorporate augmented reality (AR) that overlays the physical world with holographic displays. The company will not commercialize the product for several years while it works to make the technology less expensive. However, smart glasses that blend AR and AI could be revolutionary, as they would enable wearers to search the internet, talk with friends, and watch media content without phones.

Apple rose to great heights following its introduction of the iPhone in 2007. If Zuckerberg is correct about smart glasses being the next big breakthrough in consumer electronics, Meta could become the Apple of the next decade, which means its market value could increase substantially in the years ahead.

Meta Platforms could be a $4.7 trillion company by mid-2030

To summarize, Meta has a strong presence in digital advertising and a leadership position in smart glasses. Adtech spending is forecast to grow 14% annually through 2032, while smart glasses sales are projected to increase by more than 60% annually through 2029. In total, that gives Meta a reasonable shot at annual earnings growth of 20% or more over the next five years.

That outlook makes the current valuation of 26.7 times earnings seem quite reasonable. And if Meta does grow earnings at 20% annually over the next five years, its share price could increase by 149% without any change in the price-to-earnings (P/E) ratio. That would bring its market value to $4.7 trillion by mid-2030, surpassing the current combined market value of Nvidia and Palantir.
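The compounding in that last step is easy to verify. A short sketch using the article's stated assumptions (20% annual earnings growth, constant P/E, $1.9 trillion starting market cap):

```python
# If earnings compound at 20% a year for five years and the P/E ratio
# holds steady, the share price (and market cap) scales with earnings.
growth_rate = 0.20
years = 5
current_cap_trillions = 1.9  # Meta's market cap today, per the article

multiple = (1 + growth_rate) ** years  # cumulative earnings growth
price_increase = multiple - 1          # implied share-price gain
projected_cap = current_cap_trillions * multiple

print(f"Earnings multiple: {multiple:.2f}x")                   # 2.49x
print(f"Share-price increase: {price_increase:.0%}")           # 149%
print(f"Projected market cap: ${projected_cap:.1f} trillion")  # $4.7 trillion
```

That 2.49x multiple is where the 149% figure comes from, and 1.9 × 2.49 lands at roughly $4.7 trillion.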

Trevor Jennewine has positions in Nvidia and Palantir Technologies. The Motley Fool has positions in and recommends Apple, Meta Platforms, Nvidia, and Palantir Technologies. The Motley Fool has a disclosure policy.

Amazon is selling stylish $99 AI translation smart glasses for $59, and buyers love all the unexpected perks
Sat, 21 Jun 2025

TheStreet aims to feature only the best products and services. If you buy something via one of our links, we may earn a commission.

Everyone wants to be seen as smart. One of the best ways to achieve this goal is by making smart purchases of smart items. You can do both, with the help of Amazon, when you buy one of its best pairs of smart glasses which is currently at its lowest price in 30 days. You can see this deal right now clear as day, so don’t miss the opportunity to take advantage.

The WGP AI Translation Smart Glasses are on sale for only $59. That’s an amazing 40% off the regular price of $99. If you think this may be an item worth buying, then you’re speaking our language.

WGP AI Translation Smart Glasses, $59 (was $99) at Amazon

Courtesy of Amazon

Get it.

These glasses have so many features, we don’t know if we even have time to share them all, but we’ll give it a try. They include dual open-ear Bluetooth speakers, an IP54 waterproof rating and the aforementioned AI translation function. They also have a noise-reducing microphone for phone calls, which eliminates 90% of ambient sound for crystal-clear calls.

The magnetic charging port allows you to easily charge the glasses in a hurry, and is far easier to use than an unwieldy standard plug. You can get as much as six hours of functionality on a single charge, so you certainly won’t be tethered to a power source all day. The glasses even include their own app for both Apple and Android users, making it the perfect pick for just about anyone.


Amazon shoppers were thrilled with these glasses. One called it “my portable office and entertainment hub,” also saying they “bought these smart glasses for my multilingual business meetings, but discovered unexpected perks! The Bluetooth music function is genius – Paired with my AirPods Pro for wireless tunes during commute…Last weekend I even used the navigation mode for hands-free walking directions…An all-in-one tech marvel.”

Another customer described the glasses as “stylish,” and added “took me less than 2 minutes to set up, super easy to use…All my friends were very intrigued…I’m happy to use this pair of smart glasses regularly in my office, to block blue light from the monitor while listening to music.”

The WGP AI Translation Smart Glasses are for so much more than simply getting past a language barrier. For just $59, you can have access to more digital features than you ever imagined for both work and play. This is one deal whose value definitely won’t be lost in translation.

Move over Ray-Ban, Oakley Meta glasses are arriving this Friday
Tue, 17 Jun 2025

Oakley Sphaera

TL;DR

  • The Oakley Meta glasses are set to arrive on June 20.
  • The smart glasses will be similar in functionality to the Ray-Ban Meta AI Glasses and are expected to be geared toward athletes.
  • While Meta is also working on more advanced smart glasses with built-in displays, this Oakley-branded version is expected to skip the display.

Oakley and Meta have announced that “The next evolution is coming on June 20.” This strongly suggests that the two brands are finally ready to unveil the long-rumored successor to the Ray-Ban Meta AI Glasses. Both Oakley and Meta have begun teasing the upcoming release on their respective social media channels. Meta has also launched a sign-up page for those who want to be notified as soon as the glasses are officially released.

Although the companies haven’t revealed details about the design, specs, or features of the new glasses, their collaboration was previously leaked via Bloomberg. Reportedly codenamed Supernova 2, the new smart glasses will likely offer similar functionality to the Ray-Ban Meta Glasses. While Meta is also working on more advanced smart glasses with built-in displays, this Oakley-branded version is expected to skip the screens.

Instead, the Oakley Meta glasses are expected to be geared toward athletes and are said to be based on the Oakley Sphaera model (pictured above). Notably, the design of the new glasses could include a centrally positioned camera instead of the side-mounted style we saw on the Ray-Bans. This change is meant to enhance usability for cyclists and other athletes.

Since major hardware upgrades aren’t expected, the Oakley Meta glasses will likely be priced similarly to the current Ray-Ban Meta Glasses, starting at approximately $299. Fortunately, we don’t have to wait much longer to see exactly what Oakley and Meta have been working on.



Snap CEO Evan Spiegel promises new lightweight ‘Specs’ smart glasses next year in race to beat Meta and Google to market
Wed, 11 Jun 2025

Snapchat, long known as a featherweight in the league of Big Tech giants, is hoping to best opponents Meta, Google and Apple by releasing its new augmented reality, AI-enabled smart glasses months, maybe even years, before the big guys. 

Speaking at a conference on Tuesday, Snap CEO Evan Spiegel said the company would release a new version of its camera-equipped glasses next year that will incorporate an interactive, AI-enhanced digital screen within the lens. The 2026 release date would put Snap ahead of Meta, which plans to release its AR “Orion” glasses in 2027, while Google has not attached a date to its Android XR glasses.

“The tiny smartphone limited our imagination,” Spiegel said in his keynote at the Augmented World Expo conference in Long Beach, Calif. “It’s clear that today’s devices and user interfaces are woefully inadequate to realize the full potential of AI.” 

The new “Snapchat Specs” will be lightweight and AI-enhanced, Snap said. They will allow users to look at objects in the real world and leverage AI to access information, such as translating ingredients on a label from foreign languages. The glasses will also allow users to interact with the objects on the lens, Snap said, citing examples like playing video games with their eyeballs.

The company did not share photos of the Specs frames or provide information on pricing. As part of the Specs announcement, Snapchat shared that operating system partnerships with OpenAI and Google Gemini will extend into experiences for the glasses. 

If Snap follows through on the promise of a 2026 launch, it would be the first Big Tech company to market with augmented reality glasses for mainstream consumers, claiming an early lead in the race to create the successor to the smartphone—a competition involving everyone from Meta, Google and Apple to ChatGPT maker OpenAI, which recently announced a partnership with former Apple design boss Jony Ive.

A pioneer in the glasses form factor, Snap made waves with the release of its “Spectacles” in 2016. The funky-looking glasses were equipped with a camera that allowed users to post photos and short video clips directly to their Snapchat feed. But in recent years, Snap’s Spectacles have been eclipsed by Meta, which partnered with EssilorLuxottica to release Ray-Ban smart glasses. Though Meta hasn’t shared financials around its Ray-Ban glasses, EssilorLuxottica noted that the companies have sold over 2 million pairs since their 2023 debut. EssilorLuxottica plans to increase production of the co-branded glasses to 10 million units by 2026, suggesting that the companies are pleased with the results and potential of the glasses. 

That said, Meta’s glasses do not have AR capabilities; rather, the glasses have audio-based AI features as well as photo and video capability. Meta has said it will release its Orion AR glasses in 2027, with technology that will allow users to scan their Threads feeds with eye tracking hardware.  

Other tech giants have glasses in their sights, too. At its I/O developer conference in May, Google announced that it would join the smart glasses market by partnering with Warby Parker. And Apple, whose $3,500 Vision Pro headset has failed to catch on with consumers, is reportedly planning to release smart glasses next year that mimic the current version of Meta’s Ray-Bans, while working on more advanced AR glasses that are still years away, according to Bloomberg.

The Specs announcement follows a turbulent financial period for Snapchat. After years of worrisome financials, the company seems to have stabilized, and it increased free cash flow in the most recent quarter. The glasses are partly a revenue diversification effort, since the company is currently propped up by ads served on its social network.

Still, Snapchat did not share what the glasses will cost consumers. Meta’s Ray-Ban glasses, which do not have AR capabilities, cost between $239 and $303, so it’s reasonable to assume the Specs’ prices will be steeper due to the hardware requirements. 

The style and comfort of the glasses are also likely to be critical, with consumers having repeatedly demonstrated an aversion to bulky- or geeky-looking smart glasses and headsets. With its 2026 launch date, Snap has thrust itself back into the conversation, but success will rest on whether it can produce a product consumers actually want to wear.

This story was originally featured on Fortune.com
