Day 40: Practice with NVDA

Today I found myself doing a lot of testing with the NVDA screen reader on PDFs at work. So I diverged from my plan to code a custom modal tonight and instead documented what I was learning throughout the day.

I should note that this was not my first encounter with NVDA. It has been my go-to screen reader for testing webpages during the past year. On that note, this post does not go in depth on everything NVDA can do, but rather points out things I hadn’t spent time learning during the quick experiments I’ve done in the past.

Things I accomplished

Review of shortcuts I knew

  • Start reading continuously (from this point on): Insert + Down Arrow
  • Stop reading: Ctrl
  • List all headings, links, and landmarks: Insert + F7
  • Next line: Down Arrow
  • Next character –or– next input option (radio buttons, selection list): Right Arrow
  • Quit NVDA: Insert + Q

What I learned today

  • I’ve only used NVDA for listening, but it also supports Braille input and output.
  • NVDA stands for NonVisual Desktop Access.
  • Control + Alt + N only works to open NVDA when a desktop shortcut has been created.
  • If you are running NVDA on a device with a touchscreen (Windows 8 or higher), you can use NVDA with touch commands.
  • Pause speech with the Shift key, as opposed to stopping speech with the Control key.
  • NVDA can navigate you to the next blockquote with the Q key.
  • NVDA has commands to read an entire open dialog (Insert + B) or just the title of the dialog (Insert + T).

On top of all these new shortcuts and tidbits, I was reminded that I am not a screen reader user. When I was trying to solve a “problem” within a remediated PDF document, I finally concluded that it wasn’t the document’s problem, but rather the way I, a novice screen reader user, was using NVDA. After listening to the PDF with JAWS and getting the results I expected, I decided to abandon the issue I thought the document had. In doing so, I’m acknowledging that I am not that user, and trusting that the appropriate tags given to the document will allow real screen reader users to make their own decisions while still being able to access all the content within the document.

Day 39: Concepts Concerning Custom Modals

Today I started combing through code and underlying principles to create accessible dialogs (modals). The development of this pattern has eluded me in the past, so I wanted to tackle it first. I didn’t get as far as I’d have liked (no coding on my part), but I did squeeze in the time to at least review what others have done and why they did it.

Things I accomplished

What I learned today

  • I knew there was a dialog HTML element, but didn’t realize just how unevenly supported it is across browsers. I see now why so many devs just use a div with ARIA role="dialog" instead.
  • Once a dialog is opened, focus should immediately move inside the dialog, and the dialog role and an accessible name (from aria-label or aria-labelledby; aria-describedby supplies a description rather than a name) should be announced.
  • When a dialog is open, the Tab key should not allow the user to get outside of the dialog box.
  • A user should be able to close the dialog with Esc, clicking or tapping outside the box, pressing a Close button, or even F6 (which moves focus out to the address bar).
  • When a dialog is closed, focus should return back to the element that initiated it.
  • When closed, dialogs should be hidden with visibility: hidden so they are removed from the accessibility tree as well as from view.
  • Expected keyboard interactions within the dialog (see the sketch after this list) should be:
    • Tab: Moves focus forward inside the dialog
    • Shift + Tab: Moves focus backward inside the dialog
    • Escape: Closes the dialog
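
To make these notes concrete, here is a minimal sketch of the pattern as I understand it so far, in plain HTML and JavaScript. The IDs, the visible text, and the closeDialog helper are placeholders I made up for illustration, and a production version would also need to trap Tab and Shift + Tab so focus cannot leave the open dialog.

    <button type="button" id="open-dialog">Open dialog</button>

    <!-- Hidden until opened; visibility: hidden keeps it out of the accessibility tree -->
    <div id="my-dialog" role="dialog" aria-labelledby="dialog-title" style="visibility: hidden;">
      <h2 id="dialog-title">Confirm your choice</h2>
      <p>Dialog content goes here.</p>
      <button type="button" id="close-dialog">Close</button>
    </div>

    <script>
      var opener = document.getElementById('open-dialog');
      var dialog = document.getElementById('my-dialog');
      var closer = document.getElementById('close-dialog');

      opener.addEventListener('click', function () {
        dialog.style.visibility = 'visible';
        closer.focus(); // move focus inside the dialog immediately
      });

      function closeDialog() {
        dialog.style.visibility = 'hidden';
        opener.focus(); // return focus to the element that opened the dialog
      }

      closer.addEventListener('click', closeDialog);

      dialog.addEventListener('keydown', function (event) {
        if (event.key === 'Escape') { // Esc closes the dialog
          closeDialog();
        }
      });
    </script>

Even this bare-bones version hints at how much focus-management code a custom dialog drags along with it.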

With all the effort that goes into a custom dialog, and its many shortcomings with only partially reliable workarounds, it makes me wonder whether dialogs are really the answer to some “problems”. Looking at the usability aspect for all users, I wonder whether dialogs really meet a user’s need, or rather a designer’s and a business’s need.

Other articles I read today

Day 38: Accessible Custom Widgets Overview

Struggling a bit over what exactly I need to study when it comes to accessible techniques (beyond browser and assistive technology compatibility), I’m moving on to the next WAS Body of Knowledge topic: “create interactive controls/widgets based on accessibility best practices.” I feel mostly familiar with the topic I researched yesterday, and I think that getting back into code and doing some testing myself will help solidify more knowledge. In reality, the interactive controls/widgets concept will circle back around to the “accessible JavaScript, AJAX, and interactive content” and ARIA sections, but it will give me more context for, and ways to apply, this head knowledge.

And, to be honest, after all this reading and Googling, I’m ready to jump back into coding with CodePen, as well as my local environment.

Things I accomplished

In review

Through researching accessible interactive content, I learned that I need to:

  • manage focus
  • use semantic HTML
  • keep content perceivable at all times
  • create device-independent events (sketched in the code after this list)
  • consider DOM order when adding content dynamically
  • simplify events
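
To remind myself what semantic HTML and device-independent events buy me, here is a tiny sketch; the save() handler is just a placeholder I made up for illustration.

    <!-- Not device-independent: a div only responds to mouse clicks,
         isn't keyboard-focusable, and announces nothing useful to a screen reader -->
    <div onclick="save()">Save</div>

    <!-- Semantic HTML does the work for free: a real button is focusable with Tab
         and fires its click handler for mouse, Enter, and Space alike -->
    <button type="button" onclick="save()">Save</button>

    <script>
      function save() {
        console.log('Saving changes'); // placeholder action for this sketch
      }
    </script>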

Basic keyboard interactions with a webpage:

  • Tab navigates to next focusable element
  • Shift+Tab navigates to previous focusable element
  • Arrows navigate between related radio buttons, menu items, or widget items
  • Enter activates a link or button, or submits a form
  • Space activates a button or toggle
  • Esc closes menus, modals, and other popover variations

Remember… W3C has a thorough document on ARIA authoring practices, including an extensive section on Design Patterns and Widgets.

What I learned today

  • The Tab key should focus on the widget; the arrow keys should navigate within the widget (see the sketch after this list).
  • When a custom role is assigned to an element, the custom role completely overrides the native role.
  • When creating ARIA widgets, pay attention to the semantic structure of the roles. Some roles have required parent or child roles, or required attributes.
  • role="application" should be used sparingly. It overrides many AT keystrokes, such as ones that allow screen reader users to navigate by headings, landmarks, and tables.
  • A screen reader has two modes it uses to access a website:
    • focus mode
    • browse mode
  • Some of the MDN docs about ARIA share vital information about keyboard interaction, available states and properties, and effects on assistive technologies.
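
To make the Tab-versus-arrow-keys point concrete, here is a small roving tabindex sketch for a toolbar-style widget. The toolbar role and the formatting buttons are simply an example I picked; the W3C authoring practices mentioned above spell out the expected keys for each widget.

    <!-- Tab reaches the widget once; arrow keys move between its items -->
    <div role="toolbar" aria-label="Text formatting">
      <button type="button" tabindex="0">Bold</button>
      <button type="button" tabindex="-1">Italic</button>
      <button type="button" tabindex="-1">Underline</button>
    </div>

    <script>
      var toolbar = document.querySelector('[role="toolbar"]');
      var buttons = toolbar.querySelectorAll('button');
      var current = 0;

      toolbar.addEventListener('keydown', function (event) {
        if (event.key !== 'ArrowRight' && event.key !== 'ArrowLeft') {
          return;
        }
        buttons[current].tabIndex = -1; // take the old item out of the Tab order
        current = event.key === 'ArrowRight'
          ? (current + 1) % buttons.length
          : (current - 1 + buttons.length) % buttons.length;
        buttons[current].tabIndex = 0;  // the new item becomes the single Tab stop
        buttons[current].focus();
      });
    </script>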

Related resource of the day

In Deborah Edwards-Onoro’s The State of the Web: Making the Web More Accessible, she offers her takeaways from a 30-minute video that Google produced about the importance of web accessibility. It’s a quick read when you’re crunched for time, and a good reminder of just how important accessible websites are. My favorite was takeaway #1: “The more accessible your website is, the more usable it is to everyone.”

Day 37: How Well Do Browsers Play with Assistive Technologies?

This week I’m moving into the WAS Body of Knowledge section “choose accessibility techniques that are well-supported”. Most of these topics I’ve had some experience with and even preached about myself. For instance, adhering to coding standards and building with progressive enhancement in mind are two concepts I firmly believe can eliminate a lot of problems. I also understand that testing across platforms, browsers, and assistive technologies is important in order to discover what unanticipated barriers might occur, despite coding to standard.

That being said, today I focused on learning about what combinations of browsers and assistive technologies have been tested to work the best together. I know a bit about screen reader and browser combinations, but I’m certain there is more to learn than the base knowledge I have.

Things I accomplished

What I learned today

Here’s what I learned:

  • VoiceOver (VO) on macOS works mostly well with Firefox, but VO used with Chrome has limited support. Naturally, VO works best with Safari.
  • On that note, TalkBack is the only screen reader that works best with Chrome. Other screen readers have limited support or some support with exceptions. Oddly enough, even ChromeVox has some exceptions.
  • Edge does not support Dragon or ZoomText, and yet Internet Explorer (IE) does. As a matter of fact, IE is recommended for use with these two technologies.
  • Edge has the most support (with exceptions) for Narrator.
  • JAWS has been recommended for a while to be used with IE, but Firefox has recently become a close second.
  • NVDA still plays best with Firefox.
  • Firefox and IE differ in visual focus, so both should be tested for this.
  • Likewise, video and audio elements differ across browsers, so those should be tested across browsers, too.
  • IE and Firefox are the only browsers that support Flash and Java accessibility.
  • ChromeVox uses the DOM to access content for the listener, rather than an accessibility API or a combination of API and DOM, as other screen readers do.
  • Level Access has a wiki on iOS Accessibility Issues.
  • SAToGo is another screen reader that works on Windows.

One of my favorite resources when checking for support across browsers is caniuse.com. Choice of elements can really matter in cases where IE doesn’t support all HTML5 elements, including dialog. This resource alone has taught me so much about browser support for standards as I’ve worked through projects. In this vein, HTML5 Accessibility is another useful site.

One thing to remember is that following standards (like WCAG) is your best bet. Aiming for a specific AT or browser combination is not a good approach, since updates happen and support between the two can change.

Note: the Level Access wiki about AT support by browser covers only the most popular browsers. Other browsers, like Opera, were not mentioned.

Day 36: Questions to Ask throughout the Product Life Cycle

Today wrapped up my research for the week about integrating accessibility into a product’s life cycle. I ended with reviewing what a product’s life cycle looks like, and how everyone can play a role during each phase to ensure accessibility is considered throughout development, rather than an afterthought.

Things I accomplished

What I learned today

Stages of a product’s lifecycle with accessibility questions to ask at each stage:

  • concept: does it solve a problem for people with disabilities? what are the different user needs?
  • requirements: what accessibility standards/laws does it need to follow?
  • design: do the mockups create any barriers?
  • prototyping: does the prototype create any perceivable, operable, or understandable errors?
  • development: is the code following standards? are appropriate patterns being used?
  • quality assurance (QA): are automated and manual accessibility checks being run?
  • user acceptance testing (UAT): does this product work for real users with disabilities?
  • regression testing: when updates are made, are checks still passing?

Day 35: A11y Verification Testing as Part of User Testing

Today’s mission was to reflect on user (usability) testing, and search for accessibility verification testing (AVT). AVT was new to me, and I had a harder time finding that exact word combination when searching the web. So, I ended up reading about accessibility user testing and manual testing a developer can perform on her own to emulate user testing.

Things I accomplished

What I learned today

  • Accessibility testing is a subset of usability testing.
  • User testing for accessibility requires recruiting real users with disabilities. The rest is much like general user testing:
    • observing them in an environment familiar to them,
    • assigning tasks to accomplish,
    • observing unspoken actions,
    • scrutinizing results, and
    • concluding what changes need to happen.

Cool resource

Designing for Guidance (Microsoft) [PDF] offers tips about the varied learning styles that people have. With the approach of inclusive design in mind, this tiny booklet will make you think more critically when you develop a product or learning course for your large audience.

Day 34: TalkBack on Android

A diversion from my quality assurance research this week, out of the necessity of testing with an Android screen reader at work. Time spent today: 2 hours.

Things I accomplished

What I learned today

  • TalkBack wasn’t too different from my experience with VoiceOver; there were only some minor gesture differences.
  • The equivalent functionality to VoiceOver’s rotor is the Local Context Menu.
  • TalkBack’s quick-access shortcut can be set so that a triple-click of the Home button on my S5 turns it on.
  • TalkBack keyboard events are not the same as touch events. It can be hard to develop for all TalkBack users (some keyboard users, some touch users).
  • 29.5% of respondents to WebAIM’s screen reader survey said they use TalkBack.
  • A two-finger or three-finger swipe navigates me through my multiple screens.
  • The “explore by touch” feature reads focusable items as I drag my finger around the screen.
  • My PIN was easier to enter with TalkBack than it was with VoiceOver.

Day 33: A11y, UX, or Both?

Things I accomplished

What I learned today

Comparing accessibility and user experience, both have benefits for all, yet differ mostly by audience:

  • accessibility
    • audience: people with disabilities
    • intent: the targeted audience can perceive, understand, navigate, and interact with websites and tools (an equivalent user experience)
  • user experience
    • audience: anyone
    • intent: a product should be effective, efficient, and satisfying

Accessibility has a more technical aspect (it must account for assistive technologies, for instance); UX takes a more principle-driven approach.

Usable accessibility = a11y + UX.

Accessibility is just one aspect of the “universal web”.

Looking at accessibility a little closer, what makes a person disabled? We may think of someone with a disability as having a certified report from a doctor or an obvious physical or mental difference from our own. Yet disabilities are better defined as a conflict between a person’s abilities and their environment. That puts us all on a spectrum, doesn’t it?

An accurate statement

“For people without disabilities, technology makes things convenient. For people with disabilities, it makes things possible.” – Judith Heumann, U.S. Department of Education’s Assistant Secretary of the Office of Special Education and Rehabilitative Services

Factoid resource

  • Section 255 of the Telecommunications Act of 1996: Fueling the Creation of New Electronic Curbcuts
    A timeline of IT innovations built for people with disabilities that made their way into mainstream tech use. To my surprise: the typewriter!

Day 32: Benefits of Designing for A11y

Things I accomplished

What I learned today

There are several benefits to starting out a design process with accessibility in mind, rather than catching it after production and resorting to remediation. Some of these benefits include:

  • a solid customer base due to an off-the-shelf accessible product,
  • saved money by building it right rather than redesigning over and over again,
  • minimized legal risk,
  • innovation and the challenge to solve real-world problems,
  • improved productivity,
  • improved diversity, and
  • improved corporate image and brand when accessible technologies and strategies are incorporated within the organization.

Day 31: A11y throughout a Product’s Lifecycle, Waterfall vs. Agile

Moving on to the next WAS Body of Knowledge study topic: integrating accessibility into the quality assurance process. Approximate study time: 1 hour.

Things I accomplished

What I learned today

Considering accessibility shouldn’t just happen at the mockup or code level. It can happen throughout the product’s entire life:

  • concept,
  • requirements,
  • design,
  • prototyping,
  • development,
  • quality assurance,
  • user testing, and
  • regression testing.

Additionally, I read about the agile and waterfall processes and when to apply accessibility when working through the cycle:

  • waterfall approach: accessibility is present throughout each step and is well documented
  • agile approach: accessibility is discussed at scrum meetings, established as a requirement, built into the design and architecture, and verified with standardized testing, TDD, and automation

I love this statement by Karl Groves, which nails it when it comes to encouraging the development of an accessible product rather than blockading it:

“Become a member of the team, not a gatekeeper, and you will be seen as a resource instead of a hurdle.”