Designing AI Applications to Treat People with Disabilities Fairly

By Shari Trewin and Yves Veulliet, IBM.

AI solutions must account for everyone. As artificial intelligence becomes pervasive, high-profile cases of racial and gender bias have emerged. Discrimination against people with disabilities is a longstanding problem in society, and technology could either reduce or exacerbate it. IBM believes in working to ensure that its technologies reflect the organisation's values and shape lives and society for the better. Challenges in fairness for people with disabilities often stem from a human failure to fully consider diversity when designing, testing and deploying systems. If this diversity is not taken into account, such systems risk systematically excluding people with disabilities. This article outlines IBM's 'Six Steps to Fairness' for addressing this risk in the presence of rapidly advancing AI-based technologies.