What does the world look like to you? How do you perceive it? Everyone takes in the world through a different lens, shaped by different experiences and vantage points. Among those with a distinct vision is Neil Harbisson, widely described as the world’s first cyborg, who was born completely colour blind. With an antenna implanted in his skull, he senses and feels the colours around him, which prompts one to ruminate on the lives of the differently abled and on how technology has seeped into their beings to make their lives better and more efficient.
Since technology plays such a pre-eminent role in the world, testing becomes equally essential to keep it updated and flawlessly functional. A concentrated effort to adhere to the established accessibility guidelines (Section 508, the ADA and WCAG 2.0) sets the path for delivering accessible software products. What comes into the picture thereafter is a thorough check of the accessibility requirements and, beyond that, a combined, paired testing approach to produce realistic results. This is the basic focus of accessibility testing.

What stands out, though, is how intrinsically connected the underlying technologies are. Harbisson’s case brings into focus AS, i.e., Artificial Sense, which offers a different worldview from AI: with AS, a person creates the intelligence themselves. It is an interesting shift from AI, yet the role of AI remains as imperative as ever in ensuring that, if such cases come to rely on AI rather than AS in the future, the results will be seamless. Although the two differ as concepts, and AS is itself a transforming realm, the “intelligence” AI provides will be concrete once implemented.

The expanse of AI is so wide that covering it becomes challenging. From a deep-rooted knowledge of machine learning to integrating it effectively with automation, testers face an ever-growing challenge in covering and managing the whole ecosystem. Integrating AI with the domain of accessibility has already worked wonders and will continue to do so: providing alt text dynamically via image recognition, generating audio and video transcripts in real time, and more have become a reality today with AI in the picture. Across verticals, be it banking or healthcare, the amalgamation of accessibility and AI, in the form of facial recognition or prompt predictions, has made the task of testers more stimulating than ever before.
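Before AI even enters the picture, much of accessibility testing starts with simple automated checks against guidelines like WCAG. As a minimal illustrative sketch (the class name and sample page below are hypothetical, not from any particular tool), a tester might scan rendered HTML for images that lack an alt attribute, the gap that AI-driven image recognition is increasingly used to fill:

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collects <img> tags that have no alt attribute at all.

    This loosely corresponds to WCAG 2.0 success criterion 1.1.1
    (non-text content). Note that an empty alt="" is deliberately
    allowed here, since it is the accepted way to mark an image
    as purely decorative.
    """

    def __init__(self):
        super().__init__()
        self.missing = []  # src values of images with no alt attribute

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            if "alt" not in attr_map:
                self.missing.append(attr_map.get("src", "<no src>"))

# Illustrative page: one compliant image, one missing its alt text.
page = """
<html><body>
  <img src="logo.png" alt="Company logo">
  <img src="chart.png">
</body></html>
"""

checker = MissingAltChecker()
checker.feed(page)
print(checker.missing)  # only chart.png is flagged
```

A check like this only reports that alt text is absent; supplying a meaningful description dynamically, as the article notes, is where image-recognition models come in.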
With 15% of the world’s population living with at least one kind of disability, ensuring digital equality becomes as important as attaining a non-discriminatory environment elsewhere. Here, the role of testers shouldn’t begin only at the end of the SDLC; testing should run alongside the development process so that errors are rectified from the very beginning. Conjoining AI to this process seamlessly is a task in itself, one that requires a concentrated effort from the QA team.
People with impairments had no say in what they suffered, but they definitely have a say in what they want to be and in the future they envision. This dreamt-of future, seen through the looking glass, can be truly realized once technology is seamlessly congruous with the lives of the differently abled.