AI + Thought Leader Accessibility

Published by David A. Kennedy

Jakob Nielsen published a controversial piece on accessibility, arguing that accessibility has failed. His proposed solution? Experiences tailored to each person, generated by artificial intelligence.

Nielsen misses the mark in many ways in this post, and a number of people have called that out.

Brian DeConinck in Jakob Nielsen’s Bad Ideas about Accessibility:

His argument boils down to: “Accessibility is too hard for designers. Let’s just give it to AI and wash our hands of the whole thing.” It shows a complete lack of faith in the whole idea of design as a way to solve problems, and a lack of faith in UX designers to understand disability and make informed design choices.

Per Axbom in On Nielsen’s ideas about generative UI for resolving accessibility:

The door is wide open to claiming that users will in many cases use voice, text, keyboards and eye-trackers in the future to ask AI assistants to navigate and fetch content for them under their own supervision. Something that could mean volumes for accessibility without requiring any new interfaces at all for existing websites. But a unique, individualised UI for each user, generated without supervision by any designer, is an extreme take with very little foundation in feasibility or desirability.

Matt May in We need to talk about Jakob:

It’s… thoughtless. Hopeless. Soulless. Nielsen built his reputation on sharing his hot takes with a nascent blogosphere in the 1990s and early aughts, and if he hasn’t spent it all by now, I hope this finishes the job. What concerns me, though, is that he’s selling a class of executives hostile to disabled access a convenient fiction that will end up putting accessibility work on the back burner for a future which may never arrive.

There isn’t much I can add that hasn’t already been highlighted in those posts. As I thought more about Nielsen’s post, I kept coming back to three areas:

  1. Privacy risk: Nielsen doesn’t account for the fact that his idea would require people to identify as disabled to get that magical, personalized experience. That disclosure becomes a privacy risk for people with disabilities.
  2. Artificial intelligence fixes everything: One of Nielsen’s main points is that accessibility creates a substandard user experience. For blind users, he argues, that’s because they get “a linear (one-dimensional) auditory user interface to represent the two-dimensional graphical user interface (GUI) designed for most users.” Won’t many of the personalized experiences generated by artificial intelligence also be substandard? Yes, they will.
  3. The awareness and operations aspects of accessibility: Many common accessibility problems come from basic mistakes like missing form labels (a quick sketch of that basic follows this list). Those mistakes happen because of a lack of awareness around accessibility. You can nail the basics, but you have to know about them and do the work, and that’s harder when thought leaders tell you not to worry about it at all. Making accessibility happen also takes operational expertise, especially at larger organizations. Nielsen doesn’t cover any of that, leaving artificial intelligence to solve process challenges as well.
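
Since missing form labels come up above, here’s a minimal sketch of what nailing that basic looks like, written as a React component in TypeScript purely for illustration. The component and field names are my own hypothetical examples, not anything from Nielsen’s post or a specific codebase.

```tsx
// A hypothetical example of the "basics": associating a visible label
// with a form field so assistive technology can announce it.
import React from "react";

// Missing the basics: a screen reader reaching this input typically
// announces something generic like "edit text", because nothing ties
// the visible "Email" text to the field.
export function EmailFieldWithoutLabel() {
  return (
    <div>
      <span>Email</span>
      <input type="email" />
    </div>
  );
}

// Nailing the basics: label and input are programmatically associated
// via htmlFor/id, so the field is announced as "Email" and clicking
// the label moves focus to the input.
export function EmailFieldWithLabel() {
  return (
    <div>
      <label htmlFor="email">Email</label>
      <input id="email" type="email" />
    </div>
  );
}
```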

There’s nothing wrong with speculating about how a new technology could improve experiences for people. But coming at it with a clickbait headline, no research, a faulty premise and a deep misunderstanding of the technology in question means you’re not speculating thoughtfully, but thoughtlessly.


Tagged Accessibility