The agency tasked with developing the new tool had a clear brief: make accessibility a core aspect. They reached out to us last year, seeking our involvement from the very beginning. This was quite satisfying, and it meant the remedial work needed later should be far less than usual.
Website design is subjective, and when it comes to digital accessibility we make it clear that we're not there to comment on design choices unless they directly affect our accessibility testing. Our focus is always on ensuring it functions for as many people as possible.
I didn't hear from the developers for at least four months. Then an email arrived, letting me know they were ready for an audit by me and my team of disabled testers.
I was pleased to discover that, in many ways, the accessibility standards were very high, and everything I had suggested had been implemented. After checking the website myself, I invited my fantastic team to review it.
True accessibility is only revealed through the lived experience of disabled individuals, and once again, this was evident during their testing of the tool.
The team praised how well the tool had been constructed, but there were still some accessibility glitches, mostly related to consistency. For instance, tabbing behaviour (using the Tab key to navigate) varied between pages, and one type of search selected all MSPs by default while another selected none. The automatic selections were visually clear, but screen reader users received no indication of them.
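The fix for that last issue is usually to expose the selection state programmatically rather than only visually. As a rough sketch of the general technique (not the tool's actual code; the markup, the `visually-hidden` class, and the function names here are all hypothetical), keeping native checkbox state in sync and announcing bulk changes through a polite live region lets a screen reader report both the per-item state and the overall default:

```typescript
// Hypothetical sketch: exposing default selections to screen readers.
// The selector, class name, and function names are illustrative only.

// A polite live region announces bulk changes without interrupting
// whatever the screen reader is currently speaking.
const liveRegion = document.createElement("div");
liveRegion.setAttribute("aria-live", "polite");
liveRegion.className = "visually-hidden"; // hidden visually, exposed to assistive tech
document.body.appendChild(liveRegion);

function setMspSelection(
  checkboxes: NodeListOf<HTMLInputElement>,
  selected: boolean,
): void {
  checkboxes.forEach((box) => {
    box.checked = selected; // native checked state is announced per control
  });
  // Announce the bulk change so screen reader users know which
  // default this type of search has applied.
  liveRegion.textContent = selected
    ? `All ${checkboxes.length} MSPs selected by default`
    : "No MSPs selected";
}

// Usage: mirror whichever default the current search type applies.
const boxes = document.querySelectorAll<HTMLInputElement>('input[name="msp"]');
setMspSelection(boxes, true);
```

Using native checkboxes (rather than styled divs) does most of the work here, since their checked state is announced for free; the live region only covers the bulk default that a sighted user takes in at a glance.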