AMAZON TERMINATED ITS recruiting tool in 2018 after it was found to be biased against women. So, why did the artificial intelligence (AI) tool reject women candidates? When engineers dug into it, they found that the AI had been trained on data from a period when the tech industry was dominated by men. From that data, the AI had ‘learnt’ that male candidates were preferable. Worse, its machine learning (ML) model had learnt to penalise resumes containing words like “women’s”, as in “women’s chess club”. That is why it recommended only male candidates.
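To see how such a bias can creep in, consider a minimal, hypothetical sketch: a toy resume classifier trained on invented, historically skewed hire/no-hire labels. This is not Amazon's actual system; the resumes, labels, and token choices below are illustrative assumptions.

```python
# A minimal sketch of biased training data producing a biased model.
# All data is invented for illustration; not Amazon's actual system.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Toy training set: past resumes with hire (1) / no-hire (0) labels.
# Because most past hires were men, the token "women's" co-occurs
# almost entirely with the negative class.
resumes = [
    "captain men's chess club, python developer",    # hired
    "men's soccer team, java engineer",              # hired
    "software engineer, c++ and go",                 # hired
    "captain women's chess club, python developer",  # not hired
    "women's coding society, java engineer",         # not hired
    "data analyst, sql and r",                       # not hired
]
hired = [1, 1, 1, 0, 0, 0]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# Inspect the learned weight for the token "women" (the default
# tokenizer strips the apostrophe-s): it comes out negative, i.e.
# the model penalises resumes containing it.
idx = vectorizer.vocabulary_["women"]
print("weight for 'women':", model.coef_[0][idx])
```

The model never sees gender as an input; it simply learns that a token correlated with past rejections predicts rejection, which is exactly the spurious pattern the engineers found.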
While Amazon stopped using the tool once the issue came to light, the episode has become a prime example of how not to deploy AI systems. Even five years later, its relevance has only grown. Sample this: per Accenture’s 2022 Tech Vision research report, only 35 per cent of users globally trust how companies implement AI. And about 77 per cent believe that companies must be held responsible for any misuse of AI.