Accessibility: which Paralympics sites passed the test?

More than one billion people (nearly one-seventh of the population) worldwide live with some form of disability, according to the WHO/World Bank World Report on Disability.

Whether you look at it from an ethical, legal or commercial standpoint, that’s a lot of people who might struggle to use your website/mobile site/app due to problems with sight, hearing, cognition, motor skills or physical disability.

The W3C’s Web Content Accessibility Guidelines (WCAG) 2.0 – which form the basis for most regulations, guidelines and tests – are eight years old this December. The scope of these guidelines is currently under review; of particular interest is the task force considering revisions for mobile accessibility. As we will see below, mobile is different.

But despite the longevity of the guidelines, we’ve yet to reach the point where accessibility is ingrained in the de facto web/mobile design or content management strategy and execution.

There’s been no eureka moment for web accessibility akin to Mobilegeddon, when everyone woke up (in fear) to the importance of the mobile web.

What will it take?

  • Does Google need to tweak its search algorithm to demote inaccessible sites and launch a test-the-accessibility-of-my-site tool?
  • Do governments need to make regulations more stringent? Should web rules for public services, e.g. in the UK, EU and US, be extended to private business, as disability advocates recommend?
  • Do we need a big test case where a huge company is prosecuted or sued under existing discrimination regulation? There’s been some litigation, but no massive class-action lawsuit… yet.
  • Or could the buzz around the Paralympic Games encourage companies/organizations to consider how they could better serve, engage with and sell to disabled people over the web? Now wouldn’t that be a great digital legacy of the Games?

Leading by example

There are plenty of (mobile) websites that should be leading accessibility by example:

  • Government agencies (which are under strict rules in many countries e.g. US, UK, to be accessible).
  • Charities which focus on disabled or elderly people.
  • Businesses which sell to/serve disabled or elderly people.
  • Companies which offer advice on accessibility e.g. lawyers.
  • Any company which takes corporate social responsibility seriously.
  • Associations that support the Paralympic Games or Paralympians – including the sponsors.
  • National broadcasters with exclusive rights to cover the Paralympic Games.

In this column we will focus on the last two of these groups, and run some of them through automated tests on their Paralympic homepages (leaving out the sponsors).

If the results of these automated tests are to be believed (automated tools can have issues, including false positives), then it seems that some organizations take web accessibility more seriously than others.

We’re not pointing the finger, we’re simply highlighting the importance of testing the accessibility of your website. And yes, we have run the same tests on ClickZ and it is clear there are improvements to be made here as well.

[Image: dna32_paralympics_1]

 

It is clear when you study the BBC’s Mobile Accessibility Guidelines – arguably the most useful resource on mobile accessibility – that the BBC has made accessibility a priority.

So what is web accessibility and WCAG2?

WCAG2 is the global web standard set by the W3C for web accessibility (sometimes abbreviated to a11y). It sets out four principles (sometimes abbreviated to POUR) – web content must be:

  1. Perceivable (users know the content/user interface is there, even if they can’t see it, hear it, etc.).
  2. Operable (user interfaces can be interacted with in different ways e.g. touch, mouse, keyboard, screen reader, eye tracking).
  3. Understandable (information and operation is easy to comprehend).
  4. Robust (the content must work with various user agents including assistive technologies).

There’s a myth that WCAG2 only applies to people with visual impairment and/or people using screen readers (browsers that read website text and commands aloud).

That’s wrong. The WCAG2 principles and guidelines also apply to hearing, cognitive, motor-skill and physical disabilities. But there are gaps in its scope, which the W3C plans to address through dedicated task forces – including those looking at mobile accessibility, cognitive and learning disabilities, and low vision.

WCAG2 sets three levels of accessibility, as Marco Zehe, senior platform engineer and evangelist, in Mozilla’s accessibility team, explains:

“WCAG has three levels of success:

Level A, which is the basic level everyone should aim for. This includes most screen readers, contrast, keyboard, and other accessibility criteria everyone should meet in an ideal world. This goes for desktop and mobile.

Level AA, which is all of A, but includes higher level things such as subtitling video content etc.

Level AAA, which is the absolute Olympic Gold of accessibility. This, for example, includes alternatives in simple language, videos translated into the sign language of the respective spoken language etc.

A is the bare minimum, AA should be strived for in many cases, too, and AAA is an absolute 5 star rating.”

 

Test, test and test some more

The place to start with accessibility is testing. There are three stages of accessibility testing:

  • Automated testing tools.
  • Manual accessibility testing by an expert.
  • User testing by people with disabilities.

1. Automated testing tools

There are lots of web-based accessibility tools, free and paid. These tools scan webpages for errors that contravene the W3C’s WCAG 2.0 and other standards, e.g. the US government’s Section 508 regulations.
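These scanners essentially parse the page markup for machine-detectable failures. As a toy illustration (the class and function names are our own invention, not from any of the tools below), this Python sketch flags `<img>` elements with no `alt` attribute, one of the most basic WCAG 2.0 checks:

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Collect <img> tags that lack an alt attribute entirely.

    Note: alt="" is legitimate for decorative images, so only a
    missing attribute is flagged (WCAG 2.0 success criterion 1.1.1).
    """
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            if "alt" not in attr_map:
                self.missing_alt.append(attr_map.get("src", "(no src)"))

def find_images_missing_alt(html: str) -> list:
    """Return the src of every <img> in the markup with no alt attribute."""
    checker = AltTextChecker()
    checker.feed(html)
    return checker.missing_alt
```

A real tool runs hundreds of such checks; this one exists only to show why automated scanning is cheap to run but limited in what it can judge (it cannot tell whether the alt text is any good).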

Some tools are better than others: more up to date (check the date of release when choosing one), with fewer false positives, and easier to understand.

They don’t tend to offer benchmarks or overall performance scores. This means you need to do a comparative study of rival sites to ascertain your performance against your peers (see our Paralympics test below). And some provide little in the way of recommendations for addressing the errors.

No test claims to be definitive or a substitute for manual testing. But they’re a good start and a good tool to assist with manual testing.

It’s a good idea to use a number of tools. These four were recommended by accessibility practitioners and used in our Paralympics test:

  • CodeSniffer – sits on your browser toolbar ready for action, brief descriptions of errors and the WCAG2 principle contravened. Can be set to measure WCAG2A, AA or AAA errors or Section 508 errors (see image above).
  • aXe – available as a browser extension for Chrome or Firefox. Prioritizes fixes as critical or serious, with brief descriptions (see image below).
  • Wave – available as a browser extension for Chrome. Highlights positives as well as negatives.
  • Tenon – paid tool, but 30 day trial available. Prioritizes fixes, offers remedial action and impacted community.

[Image: dna32_paralympics_2]

 

The Paralympics test

The following table shows the results of our test on the international Paralympics sites and those of the national broadcasters, using the selected automated tools.

For each test the three best performers are shaded in green, the worst in red. N.B. Results have not been adjusted for false positives.

Paralympics sites of the top performing countries tested with accessibility tools

| Country | Site | Wave errors | Wave features | Wave contrast errors | CodeSniffer errors | aXe violations | Tenon issues |
|---|---|---|---|---|---|---|---|
| General | rio2016.com/en/paralympics/sports | 13 | 16 | 37 | 32 | 135 | 133 |
| General | paralympic.org | 1 | 28 | 33 | 222 | 3 | 44 |
| General | m.paralympic.org | 24 | 11 | 9 | 29 | 32 | 35 |
| China | english.cctv.com/special/2016_rio_paralympics | 46 | 0 | 36 | 71 | 65 | 76 |
| GB | paralympics.channel4.com | 3 | 50 | 48 | 51 | 20 | 14 |
| GB | bbc.co.uk/sport/disability-sport | 3 | 52 | 3 | 2 | 34 | N/A |
| GB | paralympics.org.uk | 1 | 19 | 4 | 3 | 50 | 6 |
| Ukraine | 1tv.com.ua | 16 | 96 | 1 | 21 | 18 | 17 |
| US | olympics.nbcsports.com/category/paralympics | 101 | 46 | 56 | 222 | 50 | 60 |
| US | teamusa.org/us-paralympics | 38 | 4 | 5 | 293 | 34 | 61 |
| Australia | paralympic.org.au | 13 | 29 | 473 | 22 | 19 | 7 |
| Germany | rio.sportschau.de/rio2016/paralympics | 6 | 58 | 1 | 3 | 5 | 67 |
| Germany | rio.zdf.de/paralympics | 2 | 0 | 45 | 14 | 18 | 3 |
| Netherlands | nos.nl/ps2016/ | 6 | 34 | 57 | 81 | 53 | 20 |
| Brazil | redeglobo.globo.com | 14 | 27 | 13 | 40 | 121 | 136 |

Key: the three best performing webpages are shaded in green; the worst in red. @Andy_Favell

There are two major problems with automated testing tools:

  • There is no tool – that we know of – specifically designed to test the accessibility of mobile websites or apps. The tools above will test responsive or mobile sites, but they will test using the same criteria as for PC sites; they do not take account of touch, small screens and other mobile specifics. Hopefully this will change following the review and recommendations of the Mobile Accessibility Task Force.
  • Automated testing can misdiagnose errors, creating false positives, and tends to miss a lot of issues.

This is why manual accessibility testing is important.

Mobile is different

Alan Smith, a global accessibility consultant with Humana Healthcare and a participant in the W3C Mobile Accessibility Task Force:

“Web and mobile web should be [approached/tested] separately. Web on a mobile screen and mobile apps are two different mobile experiences as well.

None of the automated tools check mobile apps or even check responsive design pages on multiple devices.”

What are the most common accessibility issues with mobile web/apps?

When testing mobile apps using the device screen reader (VoiceOver on iOS and Talkback on Android) the biggest issues tend to be:

  • They do not define a proper label and/or other instructions for links and buttons.
  • If there is an expand/collapse section it should say so and announce upon touch that it is expanded or collapsed (its function and state).
  • If there is a mixed list of links and information, the items in the list that are links should both demonstrate this visually, with a right-facing arrow, and announce that they are links.
  • Too often touch targets are too small to be accurately touched.
  • Color contrast should be 7:1 for mobile, but this is only a best practice and not a guideline yet.
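That 7:1 figure is computable rather than a matter of judgment: WCAG defines the contrast ratio as (L1 + 0.05) / (L2 + 0.05) over the relative luminance of the lighter and darker colors. A minimal Python sketch of the calculation (function names are our own):

```python
def relative_luminance(rgb):
    """Relative luminance of an sRGB color, per the WCAG 2.0 definition.
    rgb is a triple of 0-255 channel values."""
    def linearize(c):
        c = c / 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(color_a, color_b):
    """WCAG contrast ratio between two colors: (L1 + 0.05) / (L2 + 0.05),
    where L1 is the lighter color's luminance. Ranges from 1:1 to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(color_a), relative_luminance(color_b)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)
```

Black on white scores the maximum 21:1, comfortably above the 7:1 mobile best practice; a mid-gray such as (119, 119, 119) on white falls short of it.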

What accessibility issues do automated tools tend to miss?

Alan Smith rattled off a lengthy list of issues that automated tools are unlikely to spot. These have been paraphrased.

Automated testing only finds 20% of the issues/violations and the rest are covered under manual testing with keyboards and screen readers.

Here are some things that automated testing will not find or will be unable to determine a violation without human assistance:

  1. A series of tabs that are coded as links, but do not properly announce their role, state and value using e.g. role=tab
  2. Radio buttons and checkboxes have a leading question/statement that is not announced by screen readers. So the screen reader just reads a list of answers.
  3. A carousel does not have a stop or pause control
  4. A video clip is coded for auto-play. Videos should never auto-play.
  5. Sensory instruction description violations e.g. “Select the green button”.
  6. The skip navigation link is not going to the correct place on the page.
  7. Inconsistent navigation throughout your site.
  8. Issues with text sizing to 200%.
  9. Timeout controls that are too brief or do not allow users to extend the timeout period when filling out forms.
  10. Megamenus with submenus that are not coded correctly for keyboard and screen reader users.
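A handful of these can at least be approximated with bespoke scripts, even if general-purpose tools skip them. As a hedged sketch (the names are our own), this Python snippet catches the simplest case of the auto-play issue above, a video element coded to auto-play in the markup:

```python
from html.parser import HTMLParser

class AutoplayChecker(HTMLParser):
    """Flag <video> elements carrying the autoplay attribute."""
    def __init__(self):
        super().__init__()
        self.autoplaying = []

    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        if tag == "video" and "autoplay" in attr_map:
            self.autoplaying.append(attr_map.get("src", "(no src)"))

def find_autoplay_videos(html: str) -> list:
    """Return the src of every auto-playing <video> in the markup."""
    checker = AutoplayChecker()
    checker.feed(html)
    return checker.autoplaying
```

Even here the script only covers videos auto-played via markup, not via JavaScript, which is exactly why the list above still calls for manual testing with keyboards and screen readers.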

When and how to pick an expert

Ideally the web team has accessibility expertise in-house, as it is much easier to introduce measures during the development stage than to retrofit them pre-launch.

Several of the accessibility consultancies/charities offer training, such as the courses from the UK charity AbilityNet or those from WebAIM.

There are also plenty of specialists and agencies who can offer assistance, but be sure to vet them well.

As Marco Zehe explains:

“Companies that need to fulfil something like Section 508 or the UK equivalent, should consider bringing in an expert. Otherwise, available testing tools will get them quite some way towards a goal. The most important thing is to not consider accessibility as an after-thought, but include it in every phase of the project upfront.

The expert should have a good reputation for completing web accessibility work. This should be implicit in their references – as you will find with consulting companies such as The Paciello Group, Deque Systems, SSB Bart Group and WebAIM.”

User testing by people with disabilities

Testing by automated tools and expert accessibility practitioners should expose many of the issues that contravene WCAG and other best practice guidelines and regulations, but this does not guarantee your site or app will deliver a good user experience to people with disabilities.

There are four reasons for this:

  • There are gaps in the WCAG guidelines – as mentioned – with respect to mobile accessibility, cognitive and learning disabilities and low vision.
  • Best practice rules are set broadly to cover a wide variety of sites, and users.
  • Real users do not follow best-practice procedures for using sites and apps.
  • User testing will highlight opportunities to make your site/app more relevant to disabled people based on user behavior.

User tests can occur in a variety of ways: on site, remotely, moderated, unmoderated etc.

For those companies interested in outsourcing disability user testing, the Access Works service from Knowbility – a non-profit accessibility consulting company based in Austin (TX) USA – looks worthy of consideration.

Access Works has a database of persons with disabilities who are available to do online, unmoderated testing of Web sites. These testers work from their own homes, using their own adaptive technologies.

The testing organization pays $75 (US) per tester, $50 of which Knowbility pays in turn to the tester. Tests are created and administered using UserZoom or Loop11, which are charged on a case-by-case basis.

Tom Jewett, Knowbility:

“Testers access these through a personal Access-Works dashboard; they are identified to the testing organization only by a unique ID number, to preserve the privacy of their personal information.

The Access-Works database includes over 300 active potential testers. The majority of these are blind, which is also the most frequently-requested category. Others include low vision, motor skill impairments, hearing disabilities, and cognitive or neurological impairments; some testers may have multiple disabilities.

Two-thirds of the registered testers live in the United States; the rest are mostly in Australia, Canada, India, and the UK. Knowbility can also provide custom tester recruitment based either on location or on disability profile, through accessibility organizations worldwide.”

Conclusion 

From a Paralympics perspective, it is surprising that the organizers did not make the grant of broadcasting rights conditional on the provision of a website that is accessible to people with a range of disabilities.

Ideally this should be a de facto provision in all contracts, but for a sporting event that promotes the achievements and rights of disabled people it is a glaring omission.

Resources:

In addition to those experts referenced in this column, the author would particularly like to thank Robert Gaines, web and app developer, of RGaines Web Development for his help researching this column.

 


This is Part 32 of the ClickZ ‘DNA of mobile-friendly web’ series.

