The headline for Scoble was that he’d taken it using his Google Glass smart spectacles, the tech company’s groundbreaking move into hardware, which many were convinced would usher in a new era of tech interfaces. The headline for everyone seeing the photo in question was, by contrast, “we must destroy this technology, for we have seen the evil it can wreak”.
Google Glass never really recovered from the stench of uncool Scoble bestowed on it. The term “glassholes” was coined for wearers, and production of the prototype ceased in January 2015, just three years after the product’s launch.
Now, though, the big beasts of the Valley are betting big on ocular devices once more. While Mark Zuckerberg’s recent tech demo of Meta’s new smart glasses didn’t exactly go smoothly, hands-on reports from media who’ve had a play with them suggest that the $800 Meta Ray-Ban Display specs are… actually pretty good. Per The Verge’s hands-on reporting, “you can use it to see text messages, Instagram Reels, maps, or previews of your photos, letting you do all kinds of things without having to pull out your phone… it sort of functions like a pop-up extension of it.”
Coming next will be even, er, ‘smarter’ glasses that will let you stream live video of whatever you’re looking at, use augmented reality to lay a digital overlay over your view of the world, and take live video calls with your interlocutors hovering in the corner of your field of vision… all the sci-fi stuff, basically. Zuckerberg says he expects usage of smart glasses to have overtaken smartphones by 2030, that they will become “the main way we do computing”, and that those without will be at a “significant cognitive disadvantage” (he did also say a lot of stuff about “the metaverse” in 2021, though, so perhaps take his pronouncements with a pinch of salt).
So, are we coming to the end of the lifecycle of the smartphone? Will we all soon swap the head-down swipe-tap for a heads-up view of a digitally augmented world? A quick glance at social media following Meta’s product demo suggested there might be some resistance – although that was admittedly on Bluesky, where the general vibe tends to be “Zuckerberg is Satan and we must destroy his works”.
Loath as I am to contradict the refuseniks, it’s worth pointing out that a range of smart Ray-Bans produced by Meta already exists, and the collaboration is outselling “standard” sunglasses in many of the brand’s stores. The current iteration of the tech is, in a strange way, not a million miles away from Google Glass a decade ago – it just looks a damn sight better thanks to the tie-up with Luxottica (in which Meta recently bought a stake). But it’s what’s coming next that rather indicates we might be doing away with the omnipresent black slabs in our pockets over the next few years.
The thing is, phones are quite rubbish in many ways. They’re an intermediating layer between you and the web, or between you and the thing you’re trying to do or experience.
They require you to do all these tedious things like holding them, and typing, and making sure you don’t drop them, or scratch them, or stand holding them in a public place for longer than three seconds in case an enterprising young person on a bike decides they quite fancy taking one out of your hand. They are fiddly, and annoying, and, if we’re honest, they’ve not gotten meaningfully better in the past few rounds of “upgrades”. We needed smartphones because we wanted the internet on the go, and the only way we could conceive of accessing it was via a textual interface, which required a device we could type on. That, though, is no longer the case.
The reason Zuckerberg, and others, are bullish on glasses as the coming tech layer is – as with so much right now – the generative AI boom. With LLMs, including Meta’s own, now perfectly capable of understanding audio and responding in kind, it’s possible to interact with the digital world using only your voice, so there’s no need to type any more. The AI can “see” what you see, analyse it, and respond in real time – so there’s no need to take photos for upload and analysis.
Thanks to augmented reality technology, you can now see a screen superimposed onto your field of vision, so there’s no need to pull the (slippery, breakable, stealable) device out to stare at it to get your information. Why would you need your phone to work out how to get to that party, and who else is going, and what to wear, when you can get the address, get your crew together and choose your fit without once taking off your specs or tapping a device?
You wouldn’t, is the short answer. Which is why I’m with Zuckerberg on this one – your phone is dead, it just hasn’t quite realised it yet. Of course, as with all things Zuckerbergian, there’s a soupçon of surveillance terror to go along with your bracing shot of futurism – if you think we spend too much time online now, just wait till we all have the internet strapped to our faces for nine hours a day. If you don’t think this is going to usher in a whole new era of nefarious data extraction, and the subsequent sale of that data to advertisers, then, well, I have a bridge to sell you.
Oh, and if you’re a privacy-focused sort, best not to speculate too hard about all the things someone might potentially do if they saw you in the street and thought you looked interesting. Someone’s already spun up a prototype of smart glasses software that lets a user find personal information, including social profiles and contact details, for anyone they see through their lenses. Meanwhile, on TikTok, adverts promise stickers that will mask or block the indicator light on existing Meta smart glasses to prevent others from knowing that a user is recording them; elsewhere, upstart competitor specs are coming to market with no indicators whatsoever, meaning it’s impossible to know whether someone is filming or recording you. Expect to see the smart glasses/balaclava combination everywhere come 2030.
