I wore Meta’s smartglasses for a month – and it left me feeling like a creep


Lately, I’ve been hearing Judi Dench’s voice in my head. She tells me tomorrow’s forecast, when to turn right, that there’s been another message in my group chat. Day or night, Dame Judi is eager to assist. When I ask the eight-time Academy Award nominee what I’m looking at, she answers: a residential area, a person in a pub, daffodils. “They are a bright yellow colour and are often associated with spring.”

This isn’t a delusion. This is, apparently, progress. I am test-driving Meta’s smartglasses and Dench voices its integrated AI assistant: “Here to chat, answer questions, create images and provide advice and inspiration,” said “Judi” when I selected her over the actors John Cena and Kristen Bell. “Shall we begin?”

Over the next decade, predicts the Meta founder Mark Zuckerberg, smartglasses will gradually become “the main way we do computing”, fulfilling many of the same functions as smartphones – taking photos, playing music, making calls, giving directions. For people who wear glasses, Zuckerberg has suggested, the upgrade is a no-brainer, bundling more features into an essential accessory. And for those of us who don’t, it is only a matter of time. In 2025, Meta sold more than 7m pairs globally.

Are they really the future, bringing us the benefits of tech without the tyranny of screens, or will they trap us deeper in the digital world? To see for myself, I wore a pair for a month.

The most common response to my new frames is: “Why?” I don’t usually wear glasses, and these clear-lens Wayfarers (part of Meta’s collaboration with Ray-Ban) are on the heavy side. I look like the nerdy girl in a 90s romcom, or the old guy from Up, but the sunglasses would have made me even more conspicuous if worn indoors. The other question I get asked is: “Are you filming me?” In general, I find people do not like being around someone wearing Meta glasses, not least because sometimes the answer is: “Yes.”

Elle’s friend Stevie photographed while Elle was wearing the smartglasses. Photograph: Elle Hunt

When I arrive home, my boyfriend, Marco, clocks me wearing them, then instantly freezes, as though he has unexpectedly come face to face with a predator. (Marco is not his real name; I have violated his privacy enough.) Although they are still fringe tech, concerns about privacy are mounting, amid reports of users covertly recording in public. Last month, journalists in Sweden found that moderators employed by Meta review intimate footage from the glasses, including of people using the toilet and having sex. (Meta responded that it took the protection of people’s data very seriously and was constantly refining its efforts and tools in that area. It said that unless users choose to share media they’ve captured with Meta or others, that media stays on the user’s device.) That’s not to mention the company’s reported plans to implement facial recognition.

Meta’s newer Display glasses have tiny screens embedded in the lenses for displaying text messages, maps and the like, but they cost $799 (£600) and are not yet available outside the US. My entry-level Gen 1 Wayfarers, which retail for £299, aren’t as functional as a smartphone.

It’s possible that smartglasses could bring about a paradigm shift, as an accessible first step towards wearable AI and augmented reality. Meta is certainly betting big on them, expanding its partnership with the eyewear giant EssilorLuxottica (the parent company of Ray-Ban, Oakley and more than 150 other brands) and heavily investing in its Meta AI. The pitch is that smartglasses remove friction between our physical and digital worlds, enabling the wearer to be more present “in the moment, with your head up and your hands free”, as Alex Himel, Meta’s vice-president of wearables, recently told Radio 4.

A Meta spokesperson said in an emailed statement that the company foresaw smartglasses being increasingly used alongside smartphones, not replacing them. Eventually, they will have more specific, unique functions, the spokesperson said, but for now they aim to be less distracting and more convenient than taking your phone out of your pocket. Wearing a camera does encourage me to take more photos while out with friends, but, when I later download them on to my phone, most of them are unfocused and awkwardly framed. It’s not easy, composing an image with your eyes.

I get more use out of the glasses as headphones. They pipe audio directly into your ear without blocking ambient sound; only someone standing very close to you could tell that music was playing. It’s nifty tech, but the trade-off is continual interruptions. Headphones signal to others that you are not listening; glasses don’t. I am self-conscious while taking calls in public, apparently talking to myself.

So far, so nonessential. Smartglasses’ real promise lies in the integrated AI: what Meta is hailing as “the most natural and seamless way to access an AI assistant in your daily life”. Judi can operate my phone via voice commands (“call Ian”, “take a picture”), answer questions and engage with what I can see – for example, by reading aloud printed text or identifying objects. Blind people and those with low vision have reported this as transformative. Using the Be My Eyes feature, they can even connect with a sighted volunteer, who then “looks through” their glasses’ camera and gives live feedback. It is easy to see smartglasses’ potential as assistive technology; they are already being used to help people with dementia, dyslexia and other needs. But, for now, they are not reliable or functional enough to offer consistent support or replace other aids.

Even Zuckerberg, presenting the new Meta Display on stage last year, repeatedly tried and failed to make a simple video call. My experience is similarly haphazard. Judi frequently mishears me, cuts out midway through answering my questions or reads aloud only some new texts in a chain, meaning I don’t feel confident giving her tasks without then checking them on my phone – defeating the point of “hands-free” tech.

The AI assistance is also limited. Judi can distinguish daffodils from daisies and confirm when I am in a pub (as opposed to outdoors), but she struggles with even slightly more involved requests. When I ask her to ID a friend’s hat, for example, Judi says she can’t yet help “with requests about product pricing or availability”. At Tate Modern, when I ask her to tell me about the artwork I’m looking at, Judi describes “a bed with white bedding and a blue mat underneath it [with] various items scattered”. It’s not wrong, exactly, but it’s not what I want to know about My Bed by Tracey Emin.

The AI assistant’s description of Tracey Emin’s My Bed installation at Tate Modern was very sketchy. Photograph: Tracey Emin

Of all the glasses’ selling points, I’d been most enthusiastic about the real-time translation, imagining it smoothing my interactions abroad and allowing me to eavesdrop on Marco, a native Italian speaker. If Marco was initially wary of the glasses, after two weeks he is sick of the sight of them, but indulges my request to role-play tourist and local.

I ask him for directions to “the nearest duomo”, looking directly at him, as instructed, with the Meta AI app open on my phone. Marco rattles off a response with authentic speed and dismissiveness. An English translation appears on my phone’s screen, after a slight lag, but it is incomprehensible. The interaction is significantly less fluid than typing into Google Translate and passing the phone back and forth. This is frequently the case in my month of Meta glasses: instead of a more seamless experience, they add another layer of tech, one that’s more fiendish to navigate.

There’s a reason smartglasses are not all that popular, suggests Iain Rice, a professor of industrial AI at Birmingham City University. As the technology evolves and improves, they may become standard for certain activities, but at present they conspicuously lack clear, valuable “use cases”. His take is that Meta wants to be seen as innovating alongside Google and Apple, but lacks the vision and business acumen to come up with genuinely essential tech. Rice points to the metaverse, Zuckerberg’s long-trailed digital world, from which Meta began shifting resources away last month, after $80bn in investment. Meta’s glasses could prove similarly misjudged, he thinks.

Most people don’t want to wear glasses if they don’t need to, Rice points out – which is partly why Google Glass failed 10 years ago. “Meta has spent time trying to make them well designed and trendy … but they didn’t seem to have the market research at the forefront: do people actually want this? Will they buy into it? Or is the tech just not there?”

It’s no surprise that take-up seems to have been highest among content creators, perpetually in need of more footage and fresh angles on their day-to-day lives. More specifically, Meta glasses have become associated with pick-up artists, social-media pranksters and other public irritants. It has become increasingly common online to see footage shot with glasses, showing the wearer’s point of view, although the person featured may not have consented to being recorded or even been aware.

Elle Hunt testing the smartglasses in a cafe in Norwich. Photograph: Ali Smith/The Guardian

A blinking LED on the frame alerts others that recording is in progress (and was made more visible for the Gen 2 model), but workarounds are widely shared online. Even without one, the light is easy to miss, especially in bright conditions. To test this, on the tube home one evening, I guiltily snap a picture of the people across from me. No one notices, absorbed in their own devices. They might have twigged had I angled my phone in that now familiar cagey way, but people don’t yet think to check glasses for a light or a camera.

The number of content creators, and worse, capitalising on this, plus the absence of other obvious uses, has earned Meta glasses the nickname “pervert glasses”. Of course, not all users are ill-intentioned, but I can’t deny: not only do I feel creepy wearing them, but they also lead me to think like a creep. When I see a stranger walking her lookalike dog, and when I run into an ex for the first time in years, both times I have the fleeting thought: I wish I’d been recording.

I’m perturbed by how quickly my initial discomfort passes and the glasses become second nature. Just having a covert camera makes me want to use it, the possibilities of the tech overriding my better judgment and even basic decency.

The Meta spokesperson told me that, as outlined in the terms of service, the user is responsible for using them lawfully and “in a safe, respectful manner”: “As with any recording device, people shouldn’t use them for engaging in harmful activities like harassment, infringing on privacy rights or capturing sensitive information.”

The trouble is, as emerging tech, smartglasses complicate what we consider harassment, privacy and sensitive information. Some users, for example, argue that being able to film makes them feel safer, while others feel threatened by their presence. With no laws against recording in public places in the UK, whose rights take precedence: those who want to wear smartglasses or those who don’t want to be perceived by them?

Elle taking the glasses for a Sunday-morning jaunt to Ikea. Photograph: Elle Hunt

This comes to a head for me on a solo Sunday-morning jaunt to Ikea. Realising I don’t have to suffer alone, I use the glasses to place my first video call, to Marco. At first, it’s all fun and games, showing him the Swedish snacks and soft toys. Then he tries to order me to fling some around and have a weird interaction with an employee. “Why?” I ask, pointing out that neither of us are typically disruptive in public. Marco apologises, blaming force of habit: the first-person point of view reminds him of playing a video game.

No other shoppers seem to notice me talking to myself, but the knowledge that I am broadcasting without their knowledge is genuinely queasy-making. When a small child crosses my path, I instinctively, immediately whip my head towards the shelves with a sinking feeling. I cannot fathom what sort of person would be untroubled by doing this or, worse, feel entitled to do it. After that, I never make another video call and stop dismissing Meta glasses as a gimmick.

Even in their present shonky state, Rice agrees, they are shaping up as a flashpoint in a bigger, existential discussion about just how much integration we want with tech. “They’re pervading slowly into society … If you see a person wearing them and don’t want to be recorded, unfortunately, the only way to make sure is by moving out of the way.”

Nonconsensual and covert filming are only the tip of the iceberg, he says. For Meta, smartglasses are a way to gather masses of data on individual users and whatever – or whomever – they’re seeing. Some media and interactions captured on Meta glasses could be used to train its AI, the company admits. “If you knew what was being done with all the imagery after you’d recorded it, I don’t think you’d be doing it the way you are,” Rice says.

Meta could make changes to protect bystanders’ privacy, he suggests, such as blurring and removing unapproved faces at the preprocessing stage. His concern is that, as the tech gets cheaper and better, all glasses will be made smartglasses, just as Bluetooth became the norm. “I think they’ve released a technology that the world wasn’t ready for, and definitely hasn’t regulated for, so there has to be some ownership.”

In the meantime, he says, we should feel confident pushing back: by asking friends wearing Meta glasses to take them off around us; and strangers we suspect of filming us to stop. What would Rice say to someone who was thinking about getting a pair? “I’d say: I don’t think you should.”

After a month, I’m glad to return the glasses to the office. I’m sick of seeing friends’ faces fall at the sight of me, of feeling trapped inside the computer, of hearing Judi say I’ve got a new text. Before I wipe the device’s memory, I alert Judi to my intentions and ask if she has any last words. “Thanks for letting me know,” she says.

