In 2013, I was given a watch for Christmas. It was like one of those vintage Casio watches with the murky green LCD screens. Except when you pressed one of the knobbly buttons on the side, it would announce the time.
As we all do when we receive a gift that we aren’t too fond of, I grinned, feigned delight, and subsequently felt bad for not adoring the watch I was then strapping to my wrist. In retrospect, it was an exceptionally thoughtful present from extended family I saw, at most, once every four years. I was, after all, 6,572 miles away from my home in London, spending Christmas for the first time at my aunt’s house in the sweltering heat of Malaysia.
But that evening, as I sat indoors wallowing in the cool breeze of the air conditioner, with my eight- and five-year-old second cousins clambering on me, eagerly pressing the button on my watch to make it screech “9:18 p.m.” in an obnoxiously loud female American voice, I couldn’t help but feel completely infantilized. It was like I’d just been handed a kid’s novelty Mickey Mouse watch at the age of nineteen. Worse still, I felt guilty for feeling this way about a device that would undoubtedly improve my life.
Earlier that summer, I had lost my central vision to a rare genetic disease called Leber’s Hereditary Optic Neuropathy (LHON). Ever since, I’ve felt tangled in the tendrils of technology.
A part of me says that I’m lucky — or as lucky as someone can be — to have lost my sight at a time of vast and relentless technological change, especially in the sphere of disability. But at the same time, I feel — still, six years on — suffocated by assistive technology. I was eighteen when I began losing my sight: the age when young adults are getting ready to leave the nest and take on the world. Instead, I was held back from becoming independent by my loss of vision, and forced to rely on nannying technology to do the most menial of tasks.
The thought that I’m going to be using assistive technology for the rest of my life still throws my stomach into freefall. I resent depending on technology to do all of the things we learn to do as children. I resent looking weak and vulnerable. I resent wearing hideous, eye-catching devices that tell the world I need help. I resent the loss of both my autonomy and my privacy.
Rear Window
It wasn’t long ago that vision-impaired people had to rely on in-person support workers and independent-living care workers to help run basic errands. Today, technology is making it possible to get help remotely. Around the world, tens of thousands of vision-impaired people like me are welcoming strangers virtually into our homes — or bringing them along on journeys outside our homes — to help complete tasks that require sight.
These disembodied guests enter our lives through the rear camera on our smartphones, via an app called Be My Eyes. If you need help doing something that requires the vision of a sighted person, all you have to do is open up the app, and within seconds you’re connected to a volunteer who can see whatever you point your rear camera at.
Last week, I connected to a volunteer and asked if they could tell me whether the frozen pizza I’d shoved into the oven and had already begun eating had expired or not. As I twisted the pizza box this way and that in the direction of my iPhone camera, I was acutely aware of how many things were in the shot. “Enjoy your pizza,” said a woman’s voice after she confirmed that I wasn’t about to die from food poisoning. “I love your pajamas, by the way!”
I melted into the ground from embarrassment as I pressed the “end” button on my iPhone, looking down at my reindeer-themed pajama bottoms. She had probably thought she was saying something completely innocuous. But it served as a painful reminder that my entire life was now on display. My dependency on strangers’ eyes means I’ll never escape these intrusions, no matter how vulnerable they make me feel.
There are startups going even further than Be My Eyes. Aira is a company that built a headset for people with low vision. For $124 a month, you can strap the device to your head and get 120 minutes of live assistance from a trained professional agent. The headset, which looks like a cross between Google Glass and steampunk-inspired specs, comes with a camera and an earpiece. The agent can see your surroundings through the camera and access your GPS location. With that, they can talk a vision-impaired person through everyday sighted hurdles. It’s like a paid, hands-free version of Be My Eyes.
I would never wear Aira in public. Like my talking watch, I worry that it would draw attention, that it would single me out as someone who needs help. This is one of the reasons I prefer Microsoft’s Seeing AI app: it’s that rare piece of assistive technology that spares you from having to ask a stranger for help. As the name suggests, Seeing AI uses artificial intelligence and the smartphone’s rear camera to help people with low vision recognize text, scenes, and even faces. You can store pictures of your friends and family in the app so that it can identify them by name when the phone is pointed in their direction.
Like a lot of artificial intelligence, however, Seeing AI isn’t perfect. The other night, I was walking down London’s South Bank, desperately trying to find a pub using Seeing AI. At every venue, I stopped and pointed my phone up at the sign. Six pubs later, I realized that the app isn’t yet at the level where it can recognize ornate, curly lettering; anything fancier than Times New Roman defeats it.
I considered asking a passerby for help. I even considered using BlindSquare, an app that could tell me how many meters away the pub was and in which direction. But I decided against it. Instead, I called up my friend sitting in the pub and asked them to come and get me.
From Bats to Apps
The assistive technology boom for the blind arguably began in 1976, when New Zealand-based engineer Leslie Kay and his then graduate student Russell Smith invented the Sonic Guide. Inspired by bats’ use of sonar for navigation, the Sonic Guide was a $2,000 piece of eyewear that transmitted auditory information to the wearer about their environment. The closer the wearer came to an object in their path, the louder and more frenetic the noise became.
The Sonic Guide was revolutionary — except no one bought it. Kay and Smith went on to invent many other assistive devices, including the Viewscan, the first portable video magnifier; the first portable talking word processor; and the BrailleNote reader, all of which live on in different forms today. The company that Smith founded, today known as HumanWare, became the Apple of blind technology.
The field has been moving rapidly since the release of the Sonic Guide. Today we have digital magnification glasses like eSight; powerful video magnifiers from HumanWare; artificially intelligent headsets like OrCam, similar to Seeing AI; and Kay- and Smith-inspired sonar-detecting bracelets like the Sunu Band. They’re all quite expensive, however, which is why the App Store’s endless stream of affordable and often free assistive apps is so important. With text-recognition apps like KNFB Reader, money-identification apps like LookTel, and GPS and location-orienting apps like BlindSquare or Microsoft’s Soundscape, costly physical devices are losing their monopoly on assistive technology.
Yet these apps aren’t always particularly sophisticated. Like Be My Eyes, they often involve a video link to a remote worker. They frequently rely on human rather than artificial intelligence. As such, they belong to a broader trend of on-demand labor apps like Instacart, DoorDash, and Uber. The support workers and personal assistants that vision-impaired people have always depended on are still there. But now they live on the other side of our phones, as gig workers performing platform-mediated labor.
Tethered, Together
I’m with my friends in Old Spitalfields Market in East London. It’s trendy, sprawling, and filled with the scents of street food from across the continents. The market is bustling and we’re ravenous, but we can’t find the place we’re looking for. Finally I say, “Should I just use my phone?”
“Let’s just use our eyes,” my friend says to me, before biting her tongue.
And then I get it: I’m not the only one who feels uncomfortable about my reliance on technology. I’m not the only one who hates knowing that I can’t live without my smartphone.
So I put my phone back in my pocket and let my friends keep looking for that Mexican place we can’t find. At least I know I’m not alone.