UX Design for Accessibility: Court of Appeals Lifecycle Overview
Our UX designers at Moser Consulting had the opportunity to complete the full lifecycle of a design project for the Indiana Court of Appeals. The Court came to us with their website, asking us to identify UX improvements that would make it more accessible and deliver a better user experience. In this piece I will walk through the steps of the project lifecycle: understanding the current UX issues, educating on WCAG and Section 508 accessibility guidelines, automated testing, and user interviews and testing.
User Interviews to Understand UX Issues
We began by talking with the stakeholders to understand the current user experience issues. We developed tasks and questions about the users’ current dashboard website to gain a deep understanding of the existing user experience and its problems. We created a “Usability Script” to walk through with current users to identify pain points and areas for improvement. Below are examples from our initial discovery presentation with the Court of Appeals.
This slide shows what we learned about how most users work with the COA Dashboard. Users rely on it frequently, but usually keep it running in the background until they need it for something specific. Users generally liked the software and were fond of the dashboard’s flexibility, as well as the ability to customize notifications.
WCAG/508
Our team incorporated the Web Content Accessibility Guidelines (WCAG) and Section 508 standards into the design and user experience research process. Section 508 sets mandatory accessibility requirements for government websites, while WCAG provides an even more comprehensive list of accessibility standards. By making sure we met WCAG standards, we also met Section 508 standards for the COA dashboard. Our team of UX researchers and designers performed a front-end accessibility review, identifying accessibility issues within the UI of the dashboard.
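To give a flavor of what a front-end accessibility review actually measures, WCAG 2.x defines an exact formula for the contrast ratio between text and background colors (level AA requires at least 4.5:1 for normal text, 3:1 for large text). The sketch below is our own illustrative Python implementation of that published formula, not a tool from this project:

```python
# Contrast-ratio check per the WCAG 2.x relative-luminance formula.
# Illustrative sketch only; real reviews typically use an automated tool.

def relative_luminance(rgb):
    """Relative luminance of an sRGB color given as 0-255 integers."""
    def channel(c):
        c = c / 255.0
        # Linearize the gamma-encoded channel as specified by WCAG 2.x.
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Ratio of lighter to darker luminance; ranges from 1:1 to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

def passes_aa(fg, bg, large_text=False):
    """WCAG AA: at least 4.5:1 for normal text, 3:1 for large text."""
    return contrast_ratio(fg, bg) >= (3.0 if large_text else 4.5)

# Black on white is the maximum possible contrast, 21:1.
print(round(contrast_ratio((255, 255, 255), (0, 0, 0)), 2))  # 21.0
# Mid-gray (#777777) text on white narrowly fails AA for normal text.
print(passes_aa((119, 119, 119), (255, 255, 255)))
```

A check like this can be run against every text/background pair pulled from a site’s stylesheet, which is one way a review turns the WCAG standard into a concrete pass/fail list.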
Further Accessibility Testing with a User Who Is Visually Impaired
Our team got the opportunity to do a usability interview with a member of the COA team named Jordan. Jordan is visually impaired and provided us with valuable insight into improving the accessibility of the COA dashboard. While interviewing Jordan, we learned that he “worked around” many of the accessibility issues and had memorized the system well enough to work with it. He expressed a fear of change, because he did not want to lose the capabilities he already had or have to start over learning the system as if it were new. These insights were a high priority for us, and it was valuable to be able to communicate to our clients both the accessibility issues themselves and the psychology of users who might be negatively impacted by them.
Automated Testing
We employed automated tools to quickly identify and address common accessibility issues, ensuring ongoing compliance. One step was a back-end accessibility review, in which automated tooling flagged accessibility and usability issues within the code. This was a good addition to our accessibility testing, because automated scans catch the small errors that a manual review can miss. Through the automated testing, we learned that 20 pages of the dashboard had accessibility problems, which is worse than the average website.
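To show the kind of thing such a scan looks for, here is a minimal, hypothetical sketch (not the actual tool used on this project) that flags two common WCAG failures in HTML, using only the Python standard library: images with no alt attribute, and form inputs with no associated label.

```python
# Hypothetical sketch of an automated accessibility scan; real projects
# would use a dedicated tool, but the idea is the same: walk the markup
# and record each rule violation with its location.
from html.parser import HTMLParser

class AccessibilityChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.issues = []        # (line number, description)
        self.labeled_ids = set()  # ids referenced by <label for="...">
        self.inputs = []        # (line number, id) of visible inputs

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        line, _ = self.getpos()
        if tag == "img" and "alt" not in a:
            # Missing alt entirely; note alt="" is valid for decorative images.
            self.issues.append((line, "img missing alt text"))
        elif tag == "label" and a.get("for"):
            self.labeled_ids.add(a["for"])
        elif tag == "input" and a.get("type") not in ("hidden", "submit", "button"):
            self.inputs.append((line, a.get("id")))

    def report(self):
        issues = list(self.issues)
        # An input passes only if some <label for=...> points at its id.
        for line, input_id in self.inputs:
            if input_id not in self.labeled_ids:
                issues.append((line, "input without associated label"))
        return sorted(issues)

# Small made-up fragment with one failure of each kind.
html = """
<form>
  <img src="logo.png">
  <label for="case">Case number</label>
  <input id="case" type="text">
  <input id="notes" type="text">
</form>
"""

checker = AccessibilityChecker()
checker.feed(html)
for line, problem in checker.report():
    print(f"line {line}: {problem}")
```

Run against a whole site, a report like this is what produces the page-by-page problem counts mentioned above.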
Our Recommendations
Based on our initial research, we came up with recommendations that drove our goals for the new designs and the next round of testing. Our recommendations consisted of the following:
Focus on AA Accessibility
AA is a middle-tier level of WCAG conformance, but it is highly encouraged for government websites. Accessibility was one of our main focuses for the website.
Work with Jordan before releases
We recommended working with Jordan before each release, to confirm that the lived experience of the dashboard’s accessibility matches what our testing data shows.
Clear actions of what to do
In the previous design of the dashboard, it was confusing for the user to know where to go on the screen, because very few elements were labeled. For example, we added a “My Cases” heading to the first screen; before, the user could not tell that they were on the “My Cases” screen. This lack of labeling was present throughout the entire dashboard.
Help Section
The previous design had no “Help” section for the user. Based on user interviews as well as our own research, we recommended adding a “Help” section with FAQs, which would benefit users and support their own problem-solving.
User Testing
Regular user testing and feedback sessions helped refine the design, ensuring a more inclusive and user-friendly experience. After making UI changes to the dashboard and creating a clickable prototype, we ran a second round of user testing, or User Acceptance Testing (UAT). The goal was to gather feedback and determine whether the changes were things the users would be “accepting” of. Overall, the UAT of the new prototype was well received, because the new designs directly addressed the concerns users had raised in our earlier sessions.
Conclusion
Our UX designers at Moser were able to work closely with the COA team through a full lifecycle of research, design changes, and user testing to create the best possible experience for users of the COA dashboard. By the end of this project lifecycle, we had discovered and communicated the current UX issues, educated our client on WCAG and Section 508 accessibility guidelines, run automated testing, and conducted user interviews and testing.