Meta Ray-Ban Display Glasses Review: Halfway to the Future
http://livelaughlovedo.com/technology-and-gadgets/meta-ray-ban-display-glasses-review-halfway-to-the-future/
Thu, 16 Oct 2025 13:18:55 +0000

If you see someone nearby wearing thick Ray-Ban glasses, maybe staring off into space a bit and making small gestures with their fingers, you could be witness to the next big piece of wearable tech. Gesture-enabled smart glasses are here in the form of Meta Ray-Ban Displays, and I’ve been wearing them for about two weeks now, off and on. Yes, I’m one of those people.

Will you eventually be one of those people, too? Well, start by asking yourself whether you even want a display hovering around near your eyes, able to be called into existence with a double tap of a middle finger and thumb. Do I? Yes and no.


Meta’s latest $800 glasses feel to me like a transformational gadget for life. At their best, they reveal magic glimpses of a subtle interface, another layer of information on the world, with a display that conjures itself on demand. At their worst, they highlight the numerous missing pieces still needed to make smart glasses truly essential. Including, by the way, prescription support for my eyes. Right now, I’m testing them with contact lenses on.

Also, I have fundamental concerns about distraction and safety while wearing them.

Meta Ray-Ban Display glasses in black and a Neural Band

7.0

Meta Ray-Ban Display

Like


  • Nearly invisible in-lens heads-up display

  • Impressive gesture controls via included wristband

  • Assistive captioning and maps apps are truly useful

  • Viewfinder and zoom functions for taking photos and videos

Don’t like


  • Shorter battery life than standard, display-free Meta Ray-Bans

  • Neural band can feel annoying to wear

  • Few apps and phone-connected functions

  • Can’t mirror phone on the display, just certain apps

As they currently exist, the Meta display glasses I review here are not as useful as a smartwatch or as good a value as display-free smart glasses, such as the standard Meta Ray-Bans. Even if you want to be one of those people, you should probably wait until this impressive technology matures a bit.

But the technologies inside these glasses — near-invisible display tech provided by reflective waveguides, and wild neural band technology driven by electromyography — are going to show up in more glasses and wearables eventually. It’s early days for significant advancements in tech for our wrists and faces, and while these glasses are technical achievements, they currently feel like a beta test of things to come.

CNET's Scott Stein wearing the new Meta Ray-Ban Display smart glasses and neural wristband.

The tiny screen embedded in Meta’s Ray-Ban Display glasses is only visible to the wearer. It’s controlled by gestures that are sensed by the included neural wristband.

Scott Stein/CNET

Now I feel like an everyday cyborg

On the surface, the new display glasses look a lot like Meta’s existing audio and camera-enabled Ray-Ban and Oakley smart glasses. The difference is that they have a heads-up display in one eye to show apps and information, along with a gesture-enabled wristband you wear to control the display. 

The display in Meta’s Display Glasses isn’t as capable as a VR headset or Tony Stark specs, however. You won’t see 3D things in these glasses. All they do is project a single display in a single eye, flat and 2D. Essentially, they’re like an evolved pair of Google Glass (circa 2013) for the modern age.

The really advanced ideas come on the included Neural Band, which can register subtle hand gestures. Little taps and swipes allow me to control what’s on the screen, like a clickable mouse made of my fingers. Subtle vibrations give me feedback as I tap.

The wrist and display upgrades on these glasses make the whole experience feel equal parts futuristic and odd. I’m sort of an everyday cyborg who can summon screen readouts into my vision. Sometimes it feels like my life has become a first-person video game. Other times it feels like I’ve glued a smartwatch to my face or given my eye Apple CarPlay.

CNET's Scott Stein wearing Meta Ray-Ban Display glasses with sunglass mode activating

Meta Ray-Ban Display glasses on my face, transition lenses activating. Do you notice I’m recording?

Numi Prasarn/CNET

Looks: Subtle, yet not that subtle

Compared with some other augmented reality specs and smart glasses I’ve tried, Meta’s look surprisingly stylish for big, chunky glasses. The frames come in either black or semi-transparent sand-colored (brown) and make the standard Ray-Bans look slim and low-key by comparison. I love the way these glasses look on my face, but for the record, my family doesn’t.

They’re heavier than non-display Ray-Bans, but not by much (69 grams, compared to 49 grams). They still feel premium, solid and comfortable to wear. The thick arms have hinge springs that bend back to reduce tension on the sides of my head, and they don’t exert pressure on my temples.

It’s pretty amazing how the display inside the right lens isn’t visible to people looking at me, even when it’s on.

When it’s off, you have to look closely to make out the reflective waveguide tech that creates the image. It looks like a series of small lines on the side of the lens. There’s also a narrow vertical strip down the side of the lens that’s visible at certain angles. The waveguide tech here is a lot better than anything else I’ve seen, and a sign of how invisible in-glass displays could be.

That tech has a major downside, however: Meta can only make these Ray-Bans with prescriptions that range from minus 4 to plus 4. My eyes are over minus 8. I’ve had to wear contact lenses to test them for this review, defeating the whole idea of these actually being my everyday glasses. I hope Meta can figure out how to work with more prescriptions — not just for myself, but for anyone else hoping to buy them. Signs are strong that this will happen for this type of lens tech, but exactly when remains a mystery.

A photo of two hands, one wearing Apple Watch, one wearing Meta Neural Band

My two hands now: watch on left, gesture-controlling neural band on right. (Shot on Meta Ray-Ban Display glasses and cropped.)

Scott Stein/CNET

Neural band: Amazing and also awkward

To control these glasses, Meta invented a whole new wearable that looks like a screenless fitness tracker. Called the Meta Neural Band, this fabric-covered device has an array of flat electrodes on the inside that push gently against my wrist, measuring electrical impulses via EMG (electromyography) technology. 

The band interprets these signals as one-handed gestures that control the glasses’ screen. You’re supposed to wear the band tightly on your dominant wrist, above your wristbone, higher up than a normal smartwatch. The magnetic clasp tightens easily, though, and the band feels like a fitness tracker.
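Meta hasn’t published how the Neural Band’s decoding actually works, but the general shape of EMG gesture detection can be sketched: look for bursts of muscle activity in the signal envelope, then classify their timing. Here’s a hypothetical, heavily simplified illustration (the threshold values and gesture logic are invented for the example):

```python
# Illustrative only: not Meta's algorithm. A real EMG pipeline filters raw
# multi-electrode signals and uses learned classifiers; this sketch just
# shows the core idea of burst detection and timing-based classification.

def detect_taps(envelope, threshold=0.5):
    """Return sample indices where the EMG envelope crosses the threshold upward."""
    taps = []
    above = False
    for i, value in enumerate(envelope):
        if value >= threshold and not above:
            taps.append(i)   # rising edge = start of a muscle-activity burst
            above = True
        elif value < threshold:
            above = False
    return taps

def is_double_tap(envelope, threshold=0.5, max_gap=50):
    """Two bursts within max_gap samples count as a double tap."""
    taps = detect_taps(envelope, threshold)
    return len(taps) == 2 and (taps[1] - taps[0]) <= max_gap

# Two short bursts of activity, 25 samples apart -> recognized as a double tap.
signal = [0.1] * 10 + [0.9] * 5 + [0.1] * 20 + [0.8] * 5 + [0.1] * 10
```

The hard part, and presumably where Meta’s engineering effort went, is making this robust when your hand is full, in a pocket or gesturing in conversation.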

The band has no function other than controlling the glasses. It charges with its own magnetic pin cable, has water resistance for splashes but not for swimming, and lasts about a day on a charge. 

I hoped it would feel like I had magical powers to control what I saw. In practice, the magic works in bits and pieces. The wristband only recognizes a narrow set of gestures, which control all the navigation on the display as I move between windows and apps. 

Learning the gestures takes effort. Double-tapping my middle finger and thumb summons the display, and other gestures go back or confirm a selection. To choose an app, I have to swipe my thumb and then tap my forefinger and thumb. After a while, all that movement can sometimes make my hand cramp.

Small actions can be subtle and fascinating. Double-tapping my thumb on my closed fist activates Meta AI voice prompts, letting me quietly ask about things, take a photo, play music or read a message. When music is playing, a quick tap pauses it, and a finger pinch-and-twist adjusts the volume. I can do all of this even while walking with my hands at my sides.

Sometimes the gestures don’t activate — when holding a grocery bag with that hand, holding a steering wheel or reaching in my pocket. Sometimes I activate them accidentally, like I did during a ZDNet podcast when, gesturing with my hands, I kept activating Meta AI.

You could use the right arm of the glasses instead of the band. It has a trackpad that scrolls in multiple directions and has single- and dual-finger touch gestures. It’s awkward, however. The band, which comes with the glasses, still feels essential if you’re really going to try living with them.

Scott Stein of CNET talking to glasses and showing his words as captions in-lens

We got our camera behind the glasses to show what live captioning really looks like. It’s pretty wild.

Numi Prasarn/CNET

AI powers: I can caption real life 

Meta AI is designed to answer questions, open apps, send messages or even use the cameras to analyze something in front of me and attempt to translate or describe it to me. These AI functions are pretty much exactly the same as what’s on Meta’s other Ray-Ban and Oakley glasses, with the same hit-and-miss accuracy. But now, I can also see responses on-screen in text and sometimes graphic form. 

One feature unique to these glasses is assistive captioning, and it might be the most magical feature of all. It uses the microphones to focus only on the speaker in front of me and transcribes what they say into text that appears on the display moments later. I can see people wanting these for the captioning alone.

But for continuous AI analysis of the world using the cameras, you have to activate Live AI mode, which drains the battery very quickly. Expect an hour or less of use that way, versus up to 6 hours otherwise in more casual modes.

Heads-up maps in the lens of Meta Ray-Ban Display glasses

Maps can pop up in-display and show navigation in some cities, too. Truly useful, although potentially distracting.

Numi Prasarn/CNET

Display apps: Few and far between, but signs of magic

For all of Meta’s promises of a transformative future of world-aware AI glasses, there’s not a ton of hyper-intelligent new stuff going on when I’m wearing them. Mostly, they bring up a dashboard of certain go-to apps on demand, overlaid for me to scan quickly. Unlike Meta’s promised contextual AI that truly knows what you need at any moment, a lot of my use is more deliberate, like a smartwatch.

The color display in these glasses is high-res, crisp and detailed. It’s also ghostly looking, both because it’s semi-transparent and it’s only in one eye. Reading it with one eye was fine, but it made me wish for a wider field of view. 

Apple Music album readout for Starship Troopers playing on Meta Ray-Ban Display glasses.

Playing a little Starship Troopers, as I do on deadline.

Numi Prasarn/CNET

The display is also visible even in bright daylight, thanks to the transition lenses in the glasses. The sunglass mode activates quickly, and I’ve been able to read messages even in the brightest head-on sun.

I mainly used the display for things like quick readouts, thumbnails of photos or a map to glance at. It’s not a full dashboard for my phone, and I can’t use it to play back videos. When I listen to a Jets game via Bluetooth audio from the NFL app on my phone, I can’t see the game itself. In that sense, these aren’t display glasses like Xreal or Viture glasses that actually mirror your whole phone via USB-C.

The camera viewfinder app inside Meta Ray-Ban Display glasses' lens

A little idea of what the heads-up camera viewfinder feels like when wearing the glasses. You can zoom in with your fingers.

Numi Prasarn/CNET

The 10 apps Meta has on these glasses are all Meta-made, and it shows. Facebook Messenger, Instagram and WhatsApp are the primary ways to chat or have live video calls, where someone could also see your camera feed. There’s also a basic music player that works with Apple Music, Amazon or Spotify. 

You can look through photos and videos taken on the glasses, or use the camera app to get a live viewfinder, and even zoom in on your shot digitally by pinch-and-twisting your fingers. It’s a wild idea, but the digital zoom can feel buggy and a bit hard to control.

An onboard maps app is fascinating, and can bring turn-by-turn directions to my eyes as I walk or even while driving. The pop-up turn indicators seem useful as I walk through my town, but not necessarily any more than directions from earbuds or my smartwatch.

Overall, the collection of apps is no substitute for my phone, and Meta hasn’t even built deeper hooks into its own apps like Facebook or Messenger. Google’s expected wave of AI-enabled glasses coming next year could be better at accessing Android phones, at least. Meta needs to figure out how to extend its glasses’ feature set and apps while navigating Apple’s and Google’s walled gardens and app stores.

Ray-Ban Display glasses and neural band together on a white pedestal

The glasses and the neural band both need charging. The band lasts a whole day, but the glasses only last several hours.

Numi Prasarn/CNET

Battery life: Now there are two more things to charge

If you’re like me, you already have a lot of things to charge every day: a phone, a smartwatch, maybe a pair of earbuds. Meta adding two more, the glasses and the neural band, feels like a lot.

To charge, you snap the glasses into a collapsible carrying case with a battery pack and USB-C port. The glasses charge quickly, but battery life lasts only 2 to 6 hours in my everyday use so far. That’s less than Meta’s screen-free second-gen Ray-Bans. Sooner or later, I need to recharge during the day, which means carrying a second pair of glasses.

Luckily, since I’m wearing contacts to test these, that’s no big deal. But if these were my everyday glasses, it wouldn’t be great.

The neural band, meanwhile, has its own special charge cable and lasts up to a full day on a charge. While that eases my charging stress, I still need to manage the glasses. I now find myself checking the battery status of both the band and the glasses throughout the day, just like I do with my phone.

There should be an easier way to charge these glasses on the fly, whether via swappable batteries or a tethered cable. Until they can last a full day on a charge, the charging routine will hold them back from being a true life assistant.

Privacy is a total unknown, safety is a concern

Meta’s one of the worst of the big tech companies when it comes to handling data and privacy. Meta tends to suck up data for unclear purposes or for serving ads (which don’t appear on these glasses at all, yet). 

Many people I know are hesitant to use Meta glasses at all for these reasons, and I get it. I also don’t know how Meta will handle the evolution of more advanced world-aware AI on these glasses down the road.

Could I sneak a photo of you using these glasses? Yes, and more easily than before, since I can now trigger the camera subtly with my fingers at my side. There’s still an LED that lights up when the camera’s in use, but it’s easy to miss, especially in bright daylight.

I’m also concerned about safety. Having a display on my face while walking, and especially while driving, is a potentially serious distraction. The glasses have a driving-awareness mode and an audio-only mode, but neither is activated by default. They never automatically suggested deactivating the display while I was driving, something I think Meta should add immediately.

Meta Ray-Ban Display glasses on a white table

How will Meta make these glasses work better with our phones, and our lives?

Numi Prasarn/CNET

The future is more AI companies aiming for your face

The template for what Meta is showing off for these glasses isn’t some out-on-a-limb concept. Google, Amazon and Apple are all expected to have glasses of their own in the next couple of years, mixing in heads-up displays and more AI-assisted features, possibly adding wrist-based controls or hand gestures, too.

Meta has plans to turn these into fully augmented reality devices capable of layering 3D into the world, like the prototype Orion glasses I tried last year. Ray-Ban Displays aren’t like that yet, but they’re also the first of their kind.

My time with the Ray-Ban Displays reminds me of the early days of smartwatches, which felt nearly ready to be on our wrists all the time. Nearly, but not quite. These Display glasses feel like prototypes, but the landscape is changing fast, and Meta will need to refine the next generation further. It could happen as soon as next year, and by then, many other pairs of glasses will be ready for your eyes, too. While these are the most advanced smart glasses out there right now, they’re not the most practical ones to wear.

In the meantime, I recommend the display-free Ray-Bans for most people. If you’re ready to be an early adopter of neural wrist tech for $800, dive right in. Personally, I think my eyes need a bit of a break.



The Quiet Revolution: How Virtual Styling and AI Are Redefining Personal Shopping
http://livelaughlovedo.com/fashion-style/the-quiet-revolution-how-virtual-styling-and-ai-are-redefining-personal-shopping/
Thu, 02 Oct 2025 01:56:33 +0000


Image from: Pexels

Personal shopping, once a luxury reserved for the wealthy, is going through a subtle but significant change. Technology is making the profession more accessible to everyone, so it’s no longer limited to in-store appointments and one-on-one sessions with a stylist.

AI and virtual styling platforms are breaking down old barriers, giving people across the world tailored, easy-to-use and effective shopping experiences. This isn’t simply an improvement to e-commerce; it’s a wholesale shift in how people find, try on and buy clothes, making personalized, high-touch service available to everyone.

The Growth of Virtual Try-Ons

The virtual try-on is one of the most important innovations to come out of this technological shift. With augmented reality (AR), people can preview how a piece of apparel or a pair of shoes will look on their body without ever leaving the house. This solves a big problem with online shopping: not knowing how something will fit and look.

Tools such as Amazon’s Virtual Try-On for Shoes and Nike’s Fit feature use a shopper’s phone camera to create a digital overlay, helping people see garments quite clearly. This not only reduces returns but also puts customers at ease, turning a transaction that used to be impersonal into one that is entertaining and interactive. Virtual try-ons help both businesses and customers bridge the digital and physical worlds, making online purchases feel more authentic and safe. These features are effective enough that some brands have reported a 20-30% reduction in returns for items that use the technology.

Recommendations and Curation Powered by AI

AI is changing product recommendations in ways that go beyond virtual try-ons. Traditional e-commerce sites widely use basic algorithms that recommend items based on prior purchases, but this approach can be limiting. The next generation of AI-powered stylists takes personalization further.

These systems look at many different data points, such as a customer’s browsing history, social media likes, expressed preferences, body type and even location. The AI can then build a style profile that is genuinely unique to the shopper, proposing not only comparable items but whole ensembles and new trends the customer might love but hasn’t considered yet. This smart curation turns shopping from a straightforward transaction into a journey of discovery, helping people refine their style with each click.
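As a rough illustration of that kind of content-based curation (a generic sketch, not any retailer’s actual system; the item tags and weights are invented), a style profile can be an averaged feature vector of items a shopper has responded to, with catalog items ranked by cosine similarity:

```python
# Hypothetical content-based recommender sketch. Real systems add
# collaborative signals and learned embeddings, but the ranking idea is similar.
from math import sqrt

def profile_from_signals(liked_items):
    """Average the feature vectors of items a shopper has liked or bought."""
    profile = {}
    for item in liked_items:
        for tag, weight in item.items():
            profile[tag] = profile.get(tag, 0.0) + weight
    return {tag: w / len(liked_items) for tag, w in profile.items()}

def cosine(a, b):
    dot = sum(a.get(t, 0.0) * b.get(t, 0.0) for t in set(a) | set(b))
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def recommend(profile, catalog, top_n=3):
    """Rank catalog items by similarity to the shopper's style profile."""
    scored = sorted(catalog.items(), key=lambda kv: cosine(profile, kv[1]), reverse=True)
    return [name for name, _ in scored[:top_n]]

liked = [{"minimalist": 1.0, "neutral": 1.0}, {"minimalist": 1.0, "denim": 1.0}]
catalog = {
    "linen blazer": {"minimalist": 1.0, "neutral": 1.0},
    "neon windbreaker": {"streetwear": 1.0, "bold": 1.0},
}
# recommend(profile_from_signals(liked), catalog) ranks the blazer first
```

The point is that the “style profile” is just structured data distilled from behavior, which is why these recommendations can scale to every shopper at once.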

Redefining Digital Lifestyle Experiences

Bringing AI to fashion is part of a broader societal shift toward tech-driven, highly individualized services and entertainment. People increasingly expect digital experiences that are rich, curated and tailored to their needs in every part of their lives.

This is evident in the rise of streaming services that use algorithms to predict what people want to watch, the creation of advanced virtual concerts, and the growth of interactive platforms like live casino games, which offer a fun, social way to spend time online. Fashion businesses must understand this digital-engagement environment: using these channels isn’t just about selling garments. It’s about becoming part of a consumer’s life and building a brand world that extends beyond the product itself.

Impact on Accessibility and Retail

The technological revolution is also reshaping the retail business itself, making personalized shopping more scalable and accessible than ever. Small and medium-sized businesses that can’t afford a staff of in-store stylists can now offer their customers a highly personalized experience. This levels the playing field, giving small firms and emerging designers a chance to compete with established ones.

Virtual styling also serves people who have historically had limited access to mainstream fashion, whether because there are few stores in their region or because they don’t feel welcome in high-fashion spaces. It lets customers experiment with different styles and discover brands they like, reinforcing the idea that style guidance and fashion curation should be a right, not a privilege.

Snap CEO Evan Spiegel Promises New Lightweight ‘Specs’ Smart Glasses Next Year in Race to Beat Meta and Google to Market
http://livelaughlovedo.com/finance/snap-ceo-evan-spiegel-promises-new-lightweight-specs-smart-glasses-next-year-in-race-to-beat-meta-and-google-to-market/
Wed, 11 Jun 2025 04:14:23 +0000

Snapchat, long known as a featherweight in the league of Big Tech giants, is hoping to best rivals Meta, Google and Apple by releasing its new augmented-reality, AI-enabled smart glasses months, maybe even years, before the big guys.

Speaking at a conference on Tuesday, Snap CEO Evan Spiegel said the company would release a new version of its camera-equipped glasses next year that will incorporate an interactive, AI-enhanced digital screen within the lens. The 2026 release date would be ahead of Meta, which plans to release its AR “Orion” glasses in 2027, while Google has not attached a date to its Android XR glasses.

“The tiny smartphone limited our imagination,” Spiegel said in his keynote at the Augmented World Expo conference in Long Beach, Calif. “It’s clear that today’s devices and user interfaces are woefully inadequate to realize the full potential of AI.” 

The new “Snapchat Specs” will be lightweight and AI-enhanced, Snap said. They will allow users to look at objects in the real world and leverage AI to access information, such as translating ingredients on a label from foreign languages. The glasses will also allow users to interact with the objects on the lens, Snap said, citing examples like playing video games with their eyeballs.

The company did not share photos of the Specs frames or provide information on pricing. As part of the Specs announcement, Snapchat shared that operating system partnerships with OpenAI and Google Gemini will extend into experiences for the glasses. 

If Snap follows through on the promise of a 2026 launch, it would be the first Big Tech company to market with augmented-reality glasses for mainstream consumers, claiming an early lead in the race to create the successor to the smartphone, a competition involving everyone from Meta, Google and Apple to ChatGPT maker OpenAI, which recently announced a partnership with former Apple design chief Jony Ive.

A pioneer in the glasses form factor, Snap made waves with the release of its “Spectacles” in 2016. The funky-looking glasses were equipped with a camera that let users post photos and short video clips directly to their Snapchat feed. But in recent years, Snap’s Spectacles have been eclipsed by Meta, which partnered with EssilorLuxottica to release Ray-Ban smart glasses. Though Meta hasn’t shared financials for its Ray-Ban glasses, EssilorLuxottica has noted that the companies have sold over 2 million pairs since their 2023 debut. EssilorLuxottica plans to increase production of the co-branded glasses to 10 million units by 2026, suggesting the companies are pleased with the results and potential of the glasses.

That said, Meta’s glasses do not have AR capabilities; rather, they offer audio-based AI features as well as photo and video capture. Meta has said it will release its Orion AR glasses in 2027, with technology that will let users scan their Threads feeds using eye-tracking hardware.

Other tech giants have glasses in their sights, too. At its I/O developer conference in May, Google announced that it would join the smart-glasses market by partnering with Warby Parker. And Apple, whose $3,500 Vision Pro headset has failed to catch on with consumers, reportedly plans to release smart glasses next year that mimic the current version of Meta’s Ray-Bans, while working on more advanced AR glasses that are still years away, according to Bloomberg.

The Specs announcement follows a turbulent financial period for Snapchat. After years of worrisome financials, the company seems to have stabilized, and it increased free cash flow in the most recent quarter. The glasses are partly a revenue-diversification effort for a business currently supported almost entirely by advertising on its social network.

Still, Snapchat did not share what the glasses will cost consumers. Meta’s Ray-Ban glasses, which do not have AR capabilities, cost between $239 and $303, so it’s reasonable to assume the Specs will be pricier given the hardware requirements.

The style and comfort of the glasses are also likely to be critical, with consumers having repeatedly demonstrated an aversion to bulky- or geeky-looking smart glasses and headsets. With its 2026 launch date, Snap has thrust itself back into the conversation, but success will rest on whether it can produce a product consumers actually want to wear.

This story was originally featured on Fortune.com
