While I’m learning about how some users perceive and operate pages, I think it’s a good time to start diving into some of the assistive technology (AT) that people use. A week ago I had a more cut-and-dried approach to the WAS Body of Knowledge, as seen on my Google calendar, but now I’m discovering a more natural overlap of ideas as my own curiosity grows and I attempt to apply these ideas in a useful and concrete way.
To start, I spent time with Apple’s VoiceOver (VO) on my iPhone. Why this particular AT first? Honestly, I needed it during work today to test and share with others. Second, it’s the one I have the least familiarity with. It’s hard to go against how I normally use my iPhone! Yet, enough motivation and time has slowly brought me around and made me think more about how this AT can benefit everyone in the screen-less world we are presumed to be heading into.
Things I accomplished
- Successfully navigated my iPhone using VoiceOver without looking at the screen or giving up too easily out of frustration.
- As part of my bi-weekly duties at work, wrote a short accessibility blurb meant to introduce librarians, archivists, and museum staff to VO on iPhone (not yet published at the time I’m writing this).
What I learned today
- VO has a screen curtain mode that turns the screen off while the functionality continues. This offers added privacy and could save battery power. Careful! I scared myself when I couldn’t toggle my screen back on. Fortunately, I’d practiced navigating beforehand and listened my way through Settings to turn VO off, which resolved the issue. Phew!
- Practiced at least 5 ways to interact with my iPhone using VO:
| Gesture | Desired behavior |
| --- | --- |
| Swipe down with 2 fingers | Start reading continuously (from this point on) |
| Tap with 2 fingers | Stop reading |
| Rotate both thumbs in sync (like a dial) | Scroll through rotor for list settings and navigation options |
| Tap with 1 finger | Select item |
| Double-tap with 1 finger | Choose item or activate button |
- I can comfortably listen to VO at a speed of 60%.
Food for thought
- It seems to me the visually impaired have to sacrifice a bit more of their privacy just to accomplish what people who don’t use screen readers accomplish on a daily basis (everything is read aloud).
- Imagine if all of us practiced using VO. There could be benefits to not looking at your screen at every task, giving verbal commands for tasks, and gaining empathy and insight into how others interact differently with the same world.
- I actually appreciated listening to, as opposed to looking at, some things. For instance:
- small icons are often hard for me to see, but VO suddenly brought clear definition to emojis and indicators that I may have only guessed at before,
- I could listen to my email as I walked on my break, and still keep alert for what was around me, and
- I could perform some tasks on my phone, without the screen lighting up the whole room, while my son drifted off to sleep tonight.