Smart Glasses: Data, Privacy and Pineapple on Pizza

This week Shirley Marschall is looking at the rise of smart glasses, and what the ad industry makes of this new wearable tech...

Things that don’t mix well: water and oil. Pizza and pineapple. Mentos and cola. Red wine and a white carpet.

In tech, the list is growing quickly. Terms & Conditions and human attention spans. Grok and brand safety. And now, the combination of AI, cameras, and the concept of "maybe we shouldn’t record strangers without their knowledge". Welcome to the era of smart glasses.

Yes, barely a decade after Google Glass failed spectacularly, the tech industry is trying it again. Only this time, the glasses come with AI, which, apparently, makes everything better. Judging by millions of pairs already sold, quite a few people do seem convinced this mix is a good idea. 

How fast we forget… back in 2014, Google Glass users quickly became social pariahs. Bars banned the device and people were asked to leave restaurants. The wearers even earned the affectionate nickname: glassholes.

The backlash wasn’t really about the hardware. People simply felt extremely uncomfortable with strangers walking around pointing cameras at their faces, wondering whether the 'glasshole' was reading emails, watching cat videos, or recording everyone. Society, it turned out, hadn’t quite warmed to the concept of being recorded by random people wearing futuristic eyewear. So Google quietly shelved the product.

And then came AI. Instead of disrupting hardware, AI resurrected it, hoping to turn it into gold. Big tech and scrappy start-ups alike are throwing "AI-infused wearables" at the wall: voice pins, AI rings, mysterious screenless devices, and of course smart glasses. It’s chaotic, ambitious, and very on-brand for an industry that moves fast and breaks things.

Anyway, smart glasses are suddenly everywhere again… only this time the glasses are connected to AI. The pitch sounds almost wholesome: an "AI assistant for the real world". It sees what you see. Answers questions about your surroundings. Helps you navigate the world. 

There’s just one small detail hiding behind that vision. AI systems need data. Enormous amounts of it. And the easiest way to collect it? Let millions of people walk around with cameras and microphones, constantly capturing the world around them. Which basically turns people into AI’s first humanoid robot prototypes. Not quite the augmented human fantasy some imagined. More like the useful idiot version of it.

Two magical letters: T&C

Of course, none of this happens without consent. Technically. The terms of service explain everything… somewhere. Buried in documents so long that most people accept them the same way they accept cookie banners or software updates: by clicking "agree" and moving on with their lives. (Is it fair to say we spend more time choosing a restaurant for dinner than understanding what we are giving to these companies? Probably.)

But even if users did read the terms carefully, there’s another problem. The person wearing the glasses might agree to the terms. The hundreds of people they encounter throughout the day never did. And yet their faces, voices, and private moments may still end up in the dataset.

And that’s not a hypothetical concern. Meta alone reportedly sold over seven million smart glasses in 2025. But wherever there’s Meta, a highly unsurprising scandal isn’t far away. Recent investigative reporting revealed that footage captured by these glasses (including people undressing, intimate moments, and other highly private situations) was reviewed by contractors working to train the AI systems behind the product. 

Meta’s response? It Wasn't Me. It’s in the T&C.

To be clear, Meta promotes these glasses as privacy-conscious devices while, quite literally, looking straight into people’s private lives. In what world is this OK, even if it technically sits somewhere inside the T&Cs?

But as Lex Zard, policy director at Check My Ads, put it: "Meta keeps pushing unsafe products. Now they want to dominate the device layer the same way they dominate advertising. These glasses should never have been released as consumer products. They do have some privacy measures, but Meta knows the product is fundamentally unsafe."

For years, Meta has lacked one crucial piece of the puzzle: its own device. That’s why we see the company push in this direction again and again: VR headsets, the metaverse, and now smart glasses. Meta is well aware that whoever owns the device sitting on your face owns the data, the interface, and ultimately the advertising opportunities that come with it. For companies whose business models depend on advertising and behavioural data, that’s an irresistible proposition.

And for advertising, well, if smartphones monetised screen time, smart glasses could monetise everything you look at. A potential new attention surface. Advertising has always been good at finding those.

Ricky Sutton, author of Future Media, said: "Meta is not a hardware company, or a social media company. It’s an ad company. Of the US$200.97bn (£72.7bn) it earned last year, 97.6% came from advertising. Any new device, product or innovation it creates will be to grow that ad slice further. That means more data, more usage, more engagement to deliver more ads. That’s the vision Zuck’s unleashing with his glasses." 

There’s a strange cultural irony playing out at the same time. Films like The Mitchells vs. The Machines warn about the unintended consequences of technology that feeds on human attention and personal data.

Meanwhile, millions of people are buying devices that do exactly that: cameras connected to AI systems owned by companies whose business models depend heavily on advertising and data.

Perhaps the simplest test is this.

If someone genuinely wants notifications popping directly into their eyeballs, that’s their choice. But if the goal is simply an AI assistant whispering directions or reminders, it’s worth asking why the device also needs a camera constantly recording the surrounding world.

Unless, of course, the assistant isn’t the point. The data is.