
Monday, October 22, 2018

The limits of automated tools

A very interesting read, the article The Power (and Limits) of Automated Accessibility Testing:
For WCAG 2.0 AA conformance testing (the generally-accepted goal for accessibility compliance) automated tests represent only 10 percent of our test plan, 6 percent when you factor in tests across multiple browsers or screen readers.
[...]
Automation is a good way to tell if "something" is wrong, but it won't find all the compliance issues on a site, especially a complex one. There are only two WCAG 2.0 guidelines that we rely on automation alone to test: Page Titled and Parsing. All other guidelines may require manual review depending on the setup of your site.
[...]
What does automation miss? It's terrible at simulating keyboard and screen reader navigation, which may mean that a site passes automation but can't be used at all by someone with motor or visual disabilities. (We see that a lot.) Automation is terrible at telling what you can see when the site is zoomed. It can't tell where you need headers, or if a link has a clear enough description, or if you're moving around a navigation element that needs to be in a predictable place. Automation can't tell if you're relying on color to indicate something a color blind user won't be able to see. It's terrible with error messages, because it isn't entering anything into your forms. It won't tell you if your captcha makes your form impossible for a screen reader user to submit. Automation probably won't bother testing your site in a mobile view.
Automation should be your first attack. It's ours. But it's not enough. Don't trust anyone who tries to convince you it is. Double check the site with a free screen reader like VoiceOver or NVDA, tab through it with your keyboard, zoom in and see what happens. It’s the users of your site, not the robots, that need accessibility.
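The quoted article doesn't prescribe a specific tool for that automated "first attack," but as an illustration of what it can look like, here is a minimal sketch using axe-core, a widely used automated accessibility engine, through its Playwright integration. The URL and the WCAG tag filter are placeholders, not anything taken from the post; and, as the article stresses, a scan like this only surfaces the subset of issues a machine can detect.

```typescript
// Minimal sketch: an automated WCAG 2.0 A/AA scan with axe-core via Playwright.
// Assumes `playwright` and `@axe-core/playwright` are installed; the URL is a placeholder.
import { chromium } from 'playwright';
import { AxeBuilder } from '@axe-core/playwright';

(async () => {
  const browser = await chromium.launch();
  const page = await browser.newPage();
  await page.goto('https://example.com'); // placeholder URL

  // Limit the scan to WCAG 2.0 A/AA rules; this only finds machine-detectable issues.
  const results = await new AxeBuilder({ page })
    .withTags(['wcag2a', 'wcag2aa'])
    .analyze();

  for (const violation of results.violations) {
    console.log(`${violation.id} (${violation.impact}): ${violation.description}`);
    console.log(`  affected nodes: ${violation.nodes.length}`);
  }

  await browser.close();
})();
```

Even with zero violations reported, the manual checks the article lists (keyboard tabbing, a screen reader pass with VoiceOver or NVDA, zooming) are still needed.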
