Ask an application or full-stack developer about the types of testing they perform on their applications, and they are likely to report on unit testing, API testing, and hopefully SAST (static application security testing) and other security testing. The QA automation engineers are likely to mention functional test automation, cross-browser and -device testing, mobile app testing, and performance testing. Devops engineers aim to enhance CI/CD (continuous integration/continuous delivery) pipelines with continuous testing, and SREs focus on the observability of applications and microservices.
Performing all this testing isn’t easy, and one recent study reports that only 22% of IT’s budget in 2020 was dedicated to QA activities, down from a high of 35% in 2015. Insufficient funding, testing skills, and development time rank high on development teams’ lists of testing challenges, forcing teams to be strategic about which types of testing to prioritize.
But where does the application’s usability fit into this picture of development and testing activities?
Some agile development teams place usability testing in the category of user acceptance testing and expect business stakeholders to test and provide usability feedback. This may be good enough for smaller-scale departmental workflow applications, but it falls short of regulatory requirements and best practices for customer-facing applications, large-scale workflows, or applications used internationally.
Here are three usability development and testing standards and some ways for agile development teams to establish them in development and testing workflows.
Comply with WCAG standards and ADA regulations
Development and testing teams should review the quick reference of the Web Content Accessibility Guidelines (WCAG) Version 2.1. These guidelines aim to make the web accessible to people with disabilities, including blindness, low vision, deafness, hearing loss, limited movement, speech disabilities, and photosensitivity, and to offer some accommodation for learning disabilities.
These guidelines provide specifications at three levels of compliance, A through AAA, including the following:
- Time-based media, including audio and video, should include captions and descriptions. Sign language interpretation is required to achieve AAA-level conformance.
- Keyboard guidelines require accessibility to all content, techniques for keyboard shortcut implementations, and specifics on using the keyboard to focus on page components.
- Navigation guidelines cover page titles, focus order, and link purpose.
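Beyond the guidelines above, many WCAG success criteria are mechanically checkable. Criterion 1.4.3 (contrast minimum), for example, requires a 4.5:1 contrast ratio for normal text at AA level, computed from the relative luminance formulas in the WCAG specification. The sketch below implements that check; the formulas come from WCAG 2.1, but the function and type names are illustrative, not from any accessibility library.

```typescript
// Sketch: WCAG 2.1 contrast-ratio check (success criterion 1.4.3).
type RGB = [number, number, number]; // 0-255 per channel

// Relative luminance of an sRGB color, per the WCAG 2.1 definition.
function relativeLuminance([r, g, b]: RGB): number {
  const [rs, gs, bs] = [r, g, b].map((c) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * rs + 0.7152 * gs + 0.0722 * bs;
}

// Contrast ratio ranges from 1:1 (identical colors) to 21:1 (black on white).
function contrastRatio(fg: RGB, bg: RGB): number {
  const [l1, l2] = [relativeLuminance(fg), relativeLuminance(bg)].sort((a, b) => b - a);
  return (l1 + 0.05) / (l2 + 0.05);
}

// AA requires 4.5:1 for normal text; AAA (criterion 1.4.6) requires 7:1.
function meetsAA(fg: RGB, bg: RGB): boolean {
  return contrastRatio(fg, bg) >= 4.5;
}

console.log(contrastRatio([0, 0, 0], [255, 255, 255]).toFixed(1)); // "21.0"
console.log(meetsAA([153, 153, 153], [255, 255, 255])); // false: #999 on white fails AA
```

Checks like this belong in automated test suites so regressions are caught on every build, rather than in a one-time audit.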
Although WCAG are guidelines, they are the standards referenced for compliance with Title III of the Americans with Disabilities Act, Section 508 of the Rehabilitation Act for technologies deployed at federal government agencies, and California AB 434 for California state agencies. Global accessibility regulations include Canada’s Accessible Canada Act, similar provincial and municipal regulations, and other international laws and policies.
Developers unfamiliar with these guidelines might want to review this guide to ADA compliance, while designers should consider these WCAG mobile and web design principles. It’s a best practice to create standards and agile user story acceptance criteria so that agile teams factor in these guidelines during application development.
During the application development, testing, and deployment phases, QA and compliance teams should consider using web accessibility solutions such as accessiBe, AudioEye, Crownpeak, Level Access, and UsableNet AQA. These tools validate and report on application compliance with WCAG and other accessibility best practices.
Jason Taylor, chief innovation strategist at UsableNet, says, “Incorporating WCAG and ADA standards early in the design and the development process helps ease of use for end-users. It is also more efficient, saving the development team the time-consuming and expensive work of remediation. Developers should observe real users of assistive technology, and this experience helps designers gain important empathy.”
Target localization and internationalization during development
When applications target end-users in multiple countries or cultures, developers and testers should implement localization (l10n) and internationalization (i18n) in the application.
What’s the difference between localization and internationalization? Developers can think of localization as configurations to target specific languages and cultural requirements, including date formats, currency, keyboard configuration, colors, symbols, data presentation defaults, and other local and regional factors.
Internationalization describes configuring the application to target multiple audiences and includes design, architecture, and application configuration considerations.
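Many of the locale-specific defaults mentioned above, such as date formats and currency, are handled by the platform rather than by hand-written formatting code. A minimal sketch using the standard JavaScript Intl API, assuming a runtime with full ICU locale data (modern browsers and current Node.js releases):

```typescript
// Sketch: locale-aware formatting with the standard Intl API.
// The same value renders differently per target locale without any
// custom per-region formatting code.
const amount = 1234.5;
const date = new Date(Date.UTC(2024, 0, 31)); // 31 January 2024

function formatPrice(locale: string, currency: string): string {
  return new Intl.NumberFormat(locale, { style: "currency", currency }).format(amount);
}

console.log(formatPrice("en-US", "USD")); // "$1,234.50"
console.log(formatPrice("de-DE", "EUR")); // "1.234,50 €"

// Dates follow the locale's conventions, e.g. day-first ordering in German.
const dateFmt = new Intl.DateTimeFormat("de-DE", { dateStyle: "medium", timeZone: "UTC" });
console.log(dateFmt.format(date)); // e.g. "31. Jan. 2024"
```

Keeping formatting decisions in locale-aware APIs like this, instead of string concatenation, is what makes adding a new locale a configuration change rather than a code change.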
Programming languages and frameworks, such as AngularJS, ASP.NET, Java, and React, and content management systems, like Drupal and WordPress, come with libraries and practices to support internationalization. Development organizations may also want to review localization and i18n platforms like Lingoport, Lokalise, Memsource, and Transifex that plug into agile and devops tools and enable localization at scale.
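Under the hood, most of these frameworks share the same core mechanic: UI strings externalized into per-locale catalogs and looked up by key, with a fallback when a translation is missing. The sketch below shows that pattern with illustrative names; it is not the API of any specific library.

```typescript
// Sketch: externalized UI strings with locale fallback, the core pattern
// behind most i18n frameworks. Catalog contents are illustrative.
type Catalog = Record<string, string>;

const catalogs: Record<string, Catalog> = {
  en: { "cart.checkout": "Check out", "cart.empty": "Your cart is empty" },
  de: { "cart.checkout": "Zur Kasse" }, // partially translated
};

// Look up a key in the requested locale, falling back to English so a
// missing translation degrades gracefully instead of breaking the UI.
function t(key: string, locale: string): string {
  return catalogs[locale]?.[key] ?? catalogs["en"][key] ?? key;
}

console.log(t("cart.checkout", "de")); // "Zur Kasse"
console.log(t("cart.empty", "de"));    // falls back to "Your cart is empty"
```

Because strings live outside the code, translators can work in parallel with developers, which is exactly the workflow the localization platforms above automate.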
Localization and internationalization need to be factored into the requirements, and it can be challenging to retrofit legacy applications with these capabilities. I spoke with Thomas J. Sweet, vice president of cloud services at a major financial institution, about this challenge. He agrees and states, “Designing for localization and globalization from the beginning takes only a little more time but, like security, is nearly impossible to add in at the end.”
Deliver delightful experiences by testing application usability
Organizations committing to digital transformations and seeking to improve customer and employee experiences should test their web and mobile application usability.
In a recent customer experience report, websites and mobile applications were the top area cited for gathering customer feedback, named by more than 50% of respondents. Respondents identified several top areas for collecting customer feedback on usability, including accessibility, design, customer journeys, mobile interfaces, and machine learning capabilities. The top three impacts for customer-facing applications are higher conversion rates, increased user engagement, and revenue generation.
Usability testing used to be an expensive process requiring focus groups, tools for tracking end-user response, and significant expertise in designing tests, facilitating interviews, capturing data, and producing insights. Today, services like BetaTesting, Lookback, Testbirds, Testlio, UserZoom, UserTesting, and UXCam streamline the approach by tapping into networks of online testers, using video to capture the experience, aggregating the information, and presenting insights in portals.
For development teams looking to deploy frequently, these tools pave the way for more regular testing and repeatable user feedback. Combining feature flags, CI/CD deployment automation, and usability testing enables teams to get rapid feedback on proofs of concept, A/B tests, and controlled feature rollouts.
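The feature-flag side of this workflow is simple to reason about: each user is deterministically assigned to a rollout percentage. A minimal sketch, assuming a stable user ID; the function names are illustrative, and commercial flag services provide this (plus targeting rules and kill switches) out of the box.

```typescript
// Sketch: deterministic percentage rollout for a feature flag.
// Hashing the user ID (rather than random sampling) keeps each user in
// the same cohort across sessions, which A/B tests depend on.
function bucket(userId: string, flagName: string): number {
  // Simple FNV-1a hash: good enough for even bucketing, not for security.
  let hash = 2166136261;
  for (const ch of userId + ":" + flagName) {
    hash ^= ch.charCodeAt(0);
    hash = Math.imul(hash, 16777619);
  }
  return (hash >>> 0) % 100; // bucket in 0-99
}

function isEnabled(userId: string, flagName: string, rolloutPercent: number): boolean {
  return bucket(userId, flagName) < rolloutPercent;
}

// Roll a hypothetical new checkout flow out gradually: the same user
// always gets the same answer for a given percentage.
console.log(isEnabled("user-42", "new-checkout", 100)); // true: everyone is in
console.log(isEnabled("user-42", "new-checkout", 0));   // false: no one is in
```

Raising `rolloutPercent` over successive deployments, while usability tooling captures feedback from the enabled cohort, is what makes controlled rollouts a feedback loop rather than a one-way launch.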
Testing functionality, performance, reliability, and security remains an essential part of application development. Meeting accessibility standards, internationalizing applications, and testing usability are equally vital for meeting regulations and improving end-user experiences.