Amazon is known as the largest online retailer in the world. In the UK alone it employs over 27,000 people in permanent positions, a number that rises further during seasonal peaks with the help of temporary staff.
Until this year (2018), Amazon maintained these employee numbers by partnering with a multitude of employment agencies, including the largest in the world, Adecco. However, this was all set to change with the development of Amazon's AI, built to search through CVs and find the best matches for the roles available.
“CVs obtained over a 10-year period”
Amazon had reportedly been working on such an AI from as early as 2014, using CVs obtained from applicants over a 10-year period. This new AI would have streamlined the recruitment process for one of the largest employers in the UK, were it not for a major issue discovered during testing in 2015.
In 2015, Amazon was testing the new AI on applicants for its software developer roles. The team discovered an unusual pattern in the way the AI chose candidates: it was not gender neutral, favouring one sex over the other when deciding who was best suited for a position.
This occurred because of the way the AI was ‘trained’ to decide which candidates were best suited for employment. It learned from models that observed patterns in CVs submitted over a 10-year period. This initially seemed like the ideal way to teach the AI to find the best candidates, and it would have been, had the training pool contained a vast and varied set of CVs from people of all backgrounds, sexes and levels of experience.
“The majority of CVs were from male candidates”
However, that was not the case: the majority of the CVs in the training pool came from men, creating an inherent bias in the AI's decision making from the start. This is not so much a scolding of Amazon and its recruitment practices as a reflection of the gender gap in technology-based roles across every industry, where men are predominantly the ones hired.
Because this largely male training pool was used to teach the AI which candidates suited particular roles, the AI taught itself that men were the preferable candidates. It penalised female applicants, whose CVs were more likely to contain words such as “women's” (as in “women's ….. team”) or to mention all-female schools. These candidates were automatically downgraded on the list of potential candidates, creating an instant bias towards male applicants.
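The mechanism can be sketched with a toy example. This is a deliberately simplified, hypothetical illustration — invented CV snippets and a naive word-frequency score, not Amazon's actual model — of how a skewed training pool alone can teach a system to downgrade gendered terms:

```python
# Hypothetical illustration: a model trained on a skewed historical
# pool learns to penalise terms correlated with the under-hired group.
from collections import Counter

# Invented training data: past CV keywords labelled by hiring outcome.
# The pool skews male, mirroring the imbalance described above.
hired = [
    "software developer java team lead",
    "developer python systems",
    "java developer mens football team",
    "systems engineer python",
]
rejected = [
    "developer womens chess team captain",
    "python developer womens coding society",
]

def word_scores(hired, rejected):
    """Naive per-word score: rate of appearance in hired CVs
    minus rate of appearance in rejected CVs."""
    h = Counter(w for cv in hired for w in cv.split())
    r = Counter(w for cv in rejected for w in cv.split())
    return {w: h[w] / len(hired) - r[w] / len(rejected)
            for w in set(h) | set(r)}

scores = word_scores(hired, rejected)
# "womens" appears only in rejected CVs, so it scores negatively:
# any CV containing it would be downgraded, regardless of merit.
print(scores["womens"])  # negative
print(scores["java"])    # positive
```

The word itself carries no information about ability; the negative score comes entirely from the historical imbalance in the training data, which is exactly the failure mode the testing team uncovered.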
Although Amazon edited the AI to eliminate as much of this bias as possible, there was still no guarantee that it would not find other ways to discriminate against potential candidates. Because of this, and because executives had lost hope in the project, the team behind the AI was disbanded in early 2017 and the project was shelved.
“Amazon did not deny that recruiters used the recommendations”
Amazon's only statement on the situation was that the tool “was never used by recruiters to evaluate applicants”. The company did not, however, deny that recruiters looked at the recommendations the AI produced during recruitment — at most, those recommendations were never the sole basis for a decision.
Although this experiment was not Amazon's most successful, it serves as a case study in the limitations of machine learning, and as a lesson for companies looking to move towards automated recruitment.