Amazon has scrapped a new artificial intelligence tool after learning the recruiting engine had a bias against women.
Reuters reports that the e-commerce giant had been building computer programs for the last few years to review resumes and search for top talent. The experimental hiring tool used artificial intelligence to give job candidates scores ranging from one to five stars, but the company soon found that the system was not rating candidates for technical positions in a gender-neutral way. The e-retailer effectively shut down the program before rolling it out to a larger group.
In a statement, however, an Amazon spokesperson said the system was never used by Amazon recruiters to evaluate candidates.
The computer models were trained to vet applicants by observing patterns in resumes submitted to the company over a 10-year period, Reuters reports. Because the tech industry is dominated by men, most of those applicants were male. The system taught itself that male candidates were preferable, penalizing resumes that included the word "women." Meanwhile, it favored words commonly found on male resumes, such as "executed" and "captured." The company reportedly edited the programs to make them neutral to gender-identifying words, but there was no guarantee the machines would not find other discriminatory ways to sort candidates.
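The dynamic described above can be sketched in a few lines of code. The toy data and scoring rule below are entirely hypothetical, not Amazon's system: a crude word-association scorer trained on skewed historical outcomes ends up penalizing the word "women" simply because it co-occurs with past rejections.

```python
# Hypothetical illustration of proxy bias: a resume scorer trained on
# historically male-dominated hiring outcomes learns gendered word associations.
from collections import Counter

# Toy training set of (resume words, hired?) pairs, mirroring the skew
# in a decade of mostly male applicants. All data here is invented.
training = [
    ("executed project captured market", True),
    ("executed roadmap captured revenue", True),
    ("led team executed launch", True),
    ("women chess club captain led team", False),
    ("women in engineering society member", False),
]

hired_words, rejected_words = Counter(), Counter()
for resume, hired in training:
    (hired_words if hired else rejected_words).update(resume.split())

def score(resume: str) -> int:
    """Crude score: +1 per word seen on a hired resume, -1 per word seen on a rejected one."""
    return sum(hired_words[w] - rejected_words[w] for w in resume.split())

# Adding the word "women" to an otherwise identical resume lowers the score:
print(score("executed captured"))        # 5
print(score("women executed captured"))  # 3
```

Editing out explicit gender words would not fix a model like this: any other term correlated with the skewed outcomes (a club name, a college, a sport) can serve as the same kind of proxy.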
Gender bias wasn't the technology's only issue, Reuters reports. The system also had skewed judgment and often recommended unqualified candidates for jobs.
Reuters reports that Amazon disbanded the team early last year as executives lost hope for the project. Still, the experiment offers a look at the limitations of machine learning at a time when large companies expect to adopt the technology in the coming years.
Amazon now uses a "much-watered down version" of the recruiting engine to help with some rudimentary chores, Reuters reports. Meanwhile, a new team has been formed to give automated employment screening another try, this time with a focus on diversity.