I'm a disabled person, and Siri changed my life. Then I lost the ability to speak.

By Jamison Hill

Two years ago, I abruptly lost my ability to speak.

I was battling myalgic encephalomyelitis, commonly known as chronic fatigue syndrome, a debilitating multi-system disease that, among other things, impairs the body's metabolic functions. One day I was walking around talking and eating just like a healthy person; the next I was bedridden, unable to speak or eat solid food.

Before I became so severely ill, I was able to function relatively well with the disease. I lived on my own and worked from home, and I used Apple's virtual assistant, Siri, every day. I was able to speak loudly enough for it to register my words, and I often enlisted its help: I had it play soothing music when I was laid up in bed fighting a particularly intense surge of symptoms; I used it to call a family member when I needed help cooking and doing laundry; I even used it to research my disease when I was first diagnosed.

But as I became sicker, I soon realized my voice was losing its sonority. Siri was having trouble recognizing my words — an unusually indicative barometer of my deteriorating health.

Spouting off commands into a smartphone has become a part of many people's daily lives, able-bodied or not. For many of the 77% of adults in the United States who own smartphones, saying "Call home" or "Check the weather" is as routine as telling a loved one "Good morning."

That percentage includes people living with chronic illnesses and disabilities. Many disabled people rely on virtual assistants for the most essential tasks, like buying groceries online or tracking crucial health metrics such as blood pressure and glucose levels.

Disabled people with aphonia — those, like me, who can't speak — are at a major disadvantage when using virtual assistants.

There's a big problem with "Hey, Siri"

Unfortunately, most virtual-assistant apps recognize a command or question only through a prompt, usually by pressing a specific button or speaking directly into the phone. To prompt Apple's virtual assistant, for example, all you have to do is say, "Hey, Siri."

The problem: I can't say "Hey, Siri." I can't, for that matter, say anything loudly enough for Siri, or any other virtual assistant, to register my voice. Every time I try to say something, Siri replies with "Sorry, I'm not sure what you said." And as sincere as I believe that sentiment to be, it doesn't solve my problem.


I am, however, able to use Siri to read. With the "Speak selection" and "Speak screen" settings turned on, I am able to select text — a text message, email or article — for Siri to read to me. Despite the app's shortcomings in recognizing my voice, this feature has become quite valuable to me; I even used it to proofread this article.

But this is the only thing I am able to accomplish with Siri's help, leaving a rather large void in my smartphone usage. Virtual assistants that recognize only voices, not text commands, are incomplete for me and for people with similar disabilities.

Luckily, Google has filled the void — and so will the Siri update in iOS 11

In May, Google introduced a virtual assistant for the iPhone — the first such app I've used with dual dictation and typing features. Apple also announced a texting feature coming to Siri in iOS 11 this fall.

This is fantastic news. I started using Google Assistant when it first came out for the iPhone in May, and it has already made my life easier. I have used the typing feature to set reminders to take my medications, to look up the email addresses of my many doctors and nurses and even to buy critical supplies for my IV treatments, like sterile alcohol pads, bandages and medical tape. I could do these things manually without Google Assistant, sure, but the virtual assistant lets me do a number of tasks with one app as opposed to several.

The option to use either voice or text commands in Google Assistant opens many new avenues, however small, for disabled people: Some of us rely on the voice feature, while others, like me, benefit from typing instructions.

And while critics say Google Assistant can't match the convenience of summoning Siri directly with the iPhone's home button, for me that convenience is actually a nuisance. I am constantly activating Siri by accident, and its jarring voice and other incidental noises disturb my sleep.


The perfect assistant doesn't exist

Recently, my health has subtly started to improve, though my limited speech still prevents me from using Siri’s voice-recognition feature. But I haven't let that deter me from conducting my life through my smartphone. I now use both Google Assistant and Siri — the former for most logistical tasks and the latter to read my messages and proofread my writing.

On most days, I wake up and immediately grab my phone. I open Google Assistant, then send a text message to my caregiver in the other room. This cues her to bring me food and my medicine, as she does every morning. The rest of my day plays out in a similar fashion as I use Google Assistant to communicate with my caregiver, friends and family members, as well as compose emails to my doctors. Then, if I am writing an article or doing research, I switch over to Siri so it can read the text to me, which saves me considerable cognitive energy throughout the day.

There is an obvious trade-off between these two virtual assistants. The Google Assistant iPhone app does not, as far as I know, read text aloud. Siri does. But with Google Assistant's typing commands, I am able to complete tasks I was only able to do with Siri when I could speak.

Together they offer what one app should: a virtual assistant that allows me to type commands and prompt it to read text aloud. The main purpose of a virtual assistant, as I see it, is to make the user's life easier. For me, that would mean an assistant that knows I can't speak and does its best to accommodate my disability so I can still perform the tasks I need to complete each day. The perfect virtual assistant would know my patterns, the tasks I do every day and when I do them, and it would know that I can't vocalize my instructions. Then the app could automatically tell my caregiver that I'm awake and need my medicine, without my needing to do anything but confirm the action on my screen.

It would also be nice to see app designers consider people with different disabilities when creating virtual assistants. There are many different types of disabilities; I am merely one example. The more virtual assistants cater to the disabled, the easier our lives will be, and in turn, the more we will use the apps. As for the virtual assistants and other technologies that remain useful only to healthy or vocal people, they are a reminder that so many things that could benefit the disabled have yet to be devised.