September 25, 2024, Menlo Park, USA: Mark Zuckerberg at the Meta Connect developer conference.
In our book Money in the Metaverse, Victoria Richardson and I wrote that while your Apple Vision Pro will likely remain at your desk and replace your computer, smart glasses will likely replace your cell phone. These comments may have seemed hyperbolic to some, but with Mark Zuckerberg now saying the same thing, I think fintech strategists need to start thinking about the implications for the financial services industry.
Smart glasses
We also quoted Louis Rosenberg (CEO of Unanimous AI), who predicted that in ten years people will laugh at images of us walking down the street staring at a small screen, as virtual and augmented reality (with layers of rich virtual content over the real world) become the main digital gateway, with the transition starting mid-decade. This is what Mark Zuckerberg just said about it at Meta’s annual Connect developer event:
I think it’s pretty easy to wrap your head around [the idea that] there are already 1 to 2 billion people who wear glasses every day. Just as everyone upgraded to smartphones, I think everyone who has glasses will upgrade to smart glasses fairly quickly over the next decade. And then I think it’s going to become very valuable, and many other people who don’t wear glasses today will eventually start wearing them too.
Meta is embarking on a multi-year, multi-billion dollar effort to position itself at the vanguard of connected hardware, and alongside the new and cheaper Quest 3S headset it also showed off its improved, AI-powered Ray-Ban Meta smart glasses. I plan to get my hands on a pair as soon as possible, because (as more than one commentator has already pointed out) it is the AI that makes their augmented reality proposition so powerful. As one reviewer wrote, the device shows how we could soon get access to always-on computing without the distracting wrist turn to glance at a smartwatch or the palm turn to look at a phone screen. Potentially disturbingly, the glasses can also distance us even further from our real-life presence.
I can use the same simple example that everyone else uses to make the point. I’m good with faces and generally recognize the people I’ve spoken to, although not always. I’m bad with names, though, and it’s a bit embarrassing to walk around an event, an office or (as happened recently) a corporate hospitality function, say hello to people you know and like, and then spend the first ten minutes of the conversation desperately trying to remember their names. I want lightly tinted Ray-Bans that don’t hide my eyes from people but that, within five seconds of my stopping to talk to someone, tell me “this is Joey Donuts from The Donut Factory; you last met him at Money20/20 in Amsterdam, but you remember him from his time at Visa, where he led that mobile payments project you advised on.”
Take my money, please.
If I were a salesperson, I would want the obvious upgrade that also tells me his spouse’s name, how many children he has, where he was born, which football team he supports, and so on. It won’t be long before we’re all walking into Money20/20 wearing smart glasses that can tell you who everyone is and scan their LinkedIn profiles (so no one needs name badges any more). I expect you could set them up to put green ticks next to people who influence budgets at banks, red crosses next to empowered but budgetless deputies (for example, me) and yellow circles next to anyone who reports to a green tick.
(Come on, you’d all do it. It’s embarrassing to meet people and misremember their names, but it’ll be more embarrassing to be the only person in the room who doesn’t know everything about everyone.)
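Purely for illustration, here is a toy sketch of the kind of tagging rule I have in mind; the profile fields and the tagging logic below are entirely hypothetical, not any real Meta or LinkedIn API.

```python
# Hypothetical sketch only: tag conference contacts the way the glasses might.
# The profile fields and the rules are invented for illustration.

def tag_contact(profile: dict, green_tick_names: set) -> str:
    """Return a marker for a contact based on made-up profile fields."""
    if profile.get("influences_budget") and profile.get("sector") == "bank":
        return "green tick"
    if profile.get("reports_to") in green_tick_names:
        return "yellow circle"
    return "red cross"  # empowered but budgetless, like me

green_tick_names = {"Jane Banker"}
print(tag_contact({"influences_budget": True, "sector": "bank"}, green_tick_names))  # green tick
print(tag_contact({"reports_to": "Jane Banker"}, green_tick_names))                  # yellow circle
print(tag_contact({"name": "Dave"}, green_tick_names))                               # red cross
```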
But… and you knew there was going to be a “but”… what about privacy? What if there’s a pervert walking down the street with his glasses set to spot women who live alone and have jobs that keep them working late and… well, you see my point. Given any face, it won’t take an AI long to find the social media footprint and get to work, a prospect that has been evident since the earliest days of the technology, when Google Glass was the new new thing; more recently, Meta’s CTO, Andrew Bosworth, wanted to equip the company’s glasses with facial recognition capabilities. In a recording of an internal meeting, he said that omitting facial recognition from augmented reality glasses was a “missed opportunity” to improve human memory.
(What we need is some kind of equivalent of the “noindex” meta tag so that our faces can opt out of this dystopia, but I don’t see how to make that happen.)
Pay your way
I’m very interested in what this transition to privacy-free augmented reality will mean for money. I think it will make payments cheaper and more secure. Why? Well, as The Economist noted in a special supplement on digital payments, “with all payment systems there are trade-offs”. I quite liked their framing of the problem, which is that if you know who everyone is, all it takes is a spreadsheet; but (and this is a big but, as I’m sure you know) to run payments on a spreadsheet and still prevent fraud, manage disputes, ensure privacy and offer credit, well… the costs can add up. That framing succinctly captures a fundamental barrier to progress in financial services.
When you know who everyone is.
Well, with your smart glasses on you will know who everyone is, whether you are dealing with a real bank and whether you are talking to a real police officer, doctor or lawyer. Just as financial services became more convenient and more secure as they moved from the internet to mobile phones, they will take another step forward as they move from mobile phones to smart glasses. I am very optimistic about the new opportunities for secure transacting that these devices will bring.
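To make that “spreadsheet” point concrete, here is a minimal, purely illustrative sketch of my own (not anything from The Economist supplement): once the identities of the parties are taken as already verified, a payment reduces to two cell updates in a table, and all the expensive machinery of modern payments exists to handle the cases where identity is uncertain.

```python
# Illustrative only: the "payments spreadsheet" for a world where identity
# is already established (say, verified out of band by your smart glasses).
# Names and amounts are made up for the example.

balances = {"Alice": 100.0, "Bob": 50.0}  # the entire "spreadsheet"

def pay(payer: str, payee: str, amount: float) -> None:
    """Move money between two known, verified parties."""
    if amount <= 0:
        raise ValueError("amount must be positive")
    if balances[payer] < amount:
        raise ValueError("insufficient funds")
    balances[payer] -= amount
    balances[payee] += amount

pay("Alice", "Bob", 25.0)
print(balances)  # {'Alice': 75.0, 'Bob': 75.0}

# Fraud checks, dispute management, privacy protection and credit risk,
# the things that make real payment systems expensive, all exist because,
# in practice, we do not reliably know who everyone is.
```

The ledger arithmetic is trivial; the cost sits entirely in establishing who “Alice” and “Bob” really are, which is exactly the problem the glasses promise to solve.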