A big reveal that flatlined
Mark Zuckerberg took the stage at Meta Connect 2025 to much anticipation. The highlight was his company's newest device: the Meta Ray-Ban Display smart glasses, which Meta calls "the next generation of AI-powered eyewear."
For $799, the frames come loaded with upgrades: double the battery life, crisp 3K video recording, and higher resolution than before. On paper, they are a dream gadget for anyone who adores wearable tech.
But on stage in front of the world, those dreams quickly hit a brick wall. What was meant to be a seamless demo became an embarrassing string of glitches.
The cooking demo gone wrong
The first major flop came when Meta tried to show off the glasses' cooking-assistant features. Internet chef and Buffalo Bills "grillmaster" Jack Mancuso wore the glasses to prepare a Korean-style steak sauce.
It started out well. The AI recognized the ingredients on the counter and said, "I like the setup you have going there with soy sauce and such. How can I help?"
But when Mancuso simply asked, "What do I do first?" the glasses got confused. Instead of walking him through step one, the AI wrongly assumed the chef had already mixed everything.
“You already mixed the base ingredients, so now, grate a pear to add to the sauce,” it responded.
The chef tried again. Same response. With a plateful of wasted ingredients in front of him, the AI simply did not understand what was happening. Mancuso eventually blamed the failure on the Wi-Fi, shrugged, and handed the demo back to Zuckerberg.
Zuckerberg brushes it off
Zuckerberg tried to laugh off the failed demo, saying, “It’s all good. The irony of the whole thing is that you spend years building technology, and then the Wi-Fi on the day sort of gets you.”
It was a gracious spin, but the moment had already highlighted a maddening reality: the glasses just weren’t ready to impress.
Trouble with a simple call
Next, Zuckerberg tried again, using the glasses and their companion wristband to send a text to Meta's Chief Technology Officer, Andrew "Boz" Bosworth. That worked.
But when he set out to make a live video call, nothing happened. No matter how many times he tried, the call didn't go through. "That's too bad. I don't know what happened," he admitted.
Bosworth then joined him on stage, where the two were at least able to show off one feature that did work: live captions during a conversation. The feature could be a breakthrough for people with hearing impairments. "If you have trouble hearing, I think that this is going to be a game changer," Zuckerberg said.
What the glasses promise
Despite the hiccups, Meta is selling a bold vision for the glasses. The company says they will work like a personal assistant that is always listening and ready to help, whether you're baking, texting, or driving.
With a smartphone, the idea goes, you have to dig into your pocket; with the glasses, you just ask, and the AI answers right away through the lenses.
It’s all part of Zuckerberg’s grand initiative to bring AI into our everyday lives, not just on screens but in the world outside too.
A rough beginning, but not the end
Technology demos crash all the time, even for tech giants. But when all eyes are on them, those glitches are embarrassing. Consumers forking over $799 will expect a lot better than stuttering delays and "sorry, Wi-Fi problems."
The fate of Meta's AI glasses will depend on whether the company can iron out these early problems and convince people they're worth more than a high-end toy.
In the meantime, the launch of Meta's smart Ray-Bans is a reminder that innovation doesn't always go according to plan, even in Silicon Valley. Sometimes the future trips over its own shoelaces on the way to the stage.