Walk down any street and you'll see a familiar scene: people craning their necks at their phones. But in the not-too-distant future, we'll probably be content to simply watch as digital information hovers over the world before us, blending the digital and real worlds, all thanks to augmented reality.
Contact lenses could soon replace our phone screens
When I visited Mojo Vision's office in July, I held its augmented reality smart contact lens about an inch in front of my eye to try out its features, moving a cursor around the space before me by moving the lens. Since I couldn't wear the contact lens itself, I used a virtual reality headset to test its eye-tracking technology and demo apps, directing a small cursor simply by moving my eye. I could read from a digital teleprompter that displayed a series of words as my eye moved, and I could look around the room to see arrows pointing north and west, designed to help future users navigate outdoors.
To "click" on one of the apps dotted around a circle hovering in front of me, I simply stared at a small tab next to the app for an extra second. Numbers and text then appeared in my upper field of vision, indicating my pedaling speed, for example, or displaying the weather or information about an upcoming flight. To close an app, I looked away from that information for a full second.
Technologists have been talking for years about what the next computing platform will be, a decade after mobile devices replaced desktop computing as our primary gateway to the internet. Meta CEO Mark Zuckerberg is banking on the Metaverse, a fully immersive virtual world accessed through a headset.
But I think the biggest shift will be toward augmented reality, where glasses or contact lenses display information about the world around us so we can see both the online world and the real one. If there's one thing humans love to do (however badly in many cases), it's multitasking. Phones will become more like mini-servers that coordinate all the different devices we will increasingly wear on our bodies: headphones, watches and soon glasses, the final piece of the invisible computing puzzle.
The Mojo Vision Lens is an engineering marvel and perhaps one of the most ambitious hardware projects in Silicon Valley today. The company had to develop its own chemical and plastic compounds to allow the eyeball to breathe through a lens covered in electronics. When I held the lens in my hand, it was noticeably thick and large enough to extend beyond the iris and cover part of the white of the eye.
"It's not uncomfortable," said David Hobbs, senior director of product management at the startup, who has worn several prototypes.
The lens incorporates nine titanium batteries of the type normally found in pacemakers, and a flexible circuit thinner than a human hair delivers all the power and data. A slightly convex mirror bounces light off a tiny reflector, mimicking the mechanics of a telescope to magnify pixels that are packed just two microns, or about 0.002 millimeters, apart. From a few meters away, this small display looks like a point of light. But when I looked closer through the lens, I could watch a video of Baby Yoda, an image as crisp and engaging as any video I'd seen on a screen.
I could imagine people one day watching TikTok videos on it, but Mojo Vision wants the lens to have practical uses. The information it displays on your eye should be "very tight, snappy, quick snippets," said Steve Sinclair, senior vice president of product and marketing. Still, the company is figuring out "how much information is too much information," according to Sinclair, who previously worked on the Apple Inc. product team that developed the iPhone.
For now, Mojo Vision is working on a lens for the visually impaired that overlays glowing digital edges on objects to make them easier to see. It is also testing different interfaces with companies that make running, skiing and golf apps for phones, aiming at a new kind of hands-free activity display. Sinclair says that barring regulatory delays, consumers could buy a Mojo lens with a custom prescription in less than five years. That may be an ambitious schedule, given that other augmented reality projects have been delayed or, like Google Glass, have failed to live up to the hype.
Google's parent company, Alphabet Inc., has also tried and failed to deliver a smart contact lens for medical use (1), but overall, big tech companies have been driving much of the development around virtual and augmented reality. Apple is working on lightweight augmented reality glasses that it plans to release later this decade, Bloomberg News reported. Next year it is also expected to launch a mixed reality headset, which it showed off to its board in May. Facebook currently dominates virtual reality device sales with its Quest 2 headset, but it's also racing to launch its first augmented reality glasses in 2024, according to an April report in The Verge.
Why is augmented reality taking longer? Because merging digital elements with physical objects in a constantly moving view is a complex task that requires a lot of processing power. Even so, our desire to keep at least one foot in the real world means we'll likely spend more time in augmented reality in the long run.
The big question is how to balance being present in real life with constantly seeing digital information. Today, it takes seconds to pull out a phone, launch an app and perform a task on its screen. In the future, we'll be able to open an app just by looking at it for an extra second. That will raise all kinds of thorny questions about addiction and how we interact with the world around us.
Sinclair says the same question came to him years ago when he was working on the iPhone. “I can’t say how we at Mojo are going to completely mitigate that,” he said. “But the trend is going in this direction, that people are going to have instant access to information.”
Whether through contact lenses or glasses, the human eye will soon gaze upon a world swimming in more digital information than ever before. Our brains will have a lot to get used to.
(1) Google announced a partnership with Novartis in 2014 to develop a glucose sensor smart contact lens. Four years later, Alphabet’s life sciences division, Verily, said it had canceled the project.
This column does not necessarily reflect the opinion of the Editorial Board or of Bloomberg LP and its owners.
Parmy Olson is a Bloomberg Opinion columnist covering technology. A former reporter for the Wall Street Journal and Forbes, she is the author of "We Are Anonymous."