There is a fantastic piece of research about venture capital funding and the different questions posed to female-led companies versus male-led companies. Investors asked men about the opportunities for gain and women about the potential for loss – a promotion-versus-prevention framing. The problem with these questions is that respondents answer them in kind: a promotion question gets a promotion answer, while a prevention question leads women to dwell on potential losses rather than the potential of the business. There is an obvious bias in the questions themselves.
The research analyzed job interviews at two leading US universities – the University of California and the University of Southern California – over a two-year period, and found that women were questioned more by hiring panels, interrupted more often, and faced more follow-up questions. This meant they spent more of their time reacting to queries than presenting why they were right for the job.
Add to this the broader phenomenon of manterrupting: a report from the Journal of Social Sciences found that men are twice as likely to interject while speaking to a woman. Even powerful and highly respected women like Supreme Court Justice Ruth Bader Ginsburg can’t escape this. According to a 2017 study, male Supreme Court justices have reacted to the increase in women serving on the bench by increasing their interruptions of them.
All of which makes the interview a difficult, professionally challenging – even dangerous – place for a woman.
While it is easy to suggest that we use artificial intelligence (AI) to get rid of all this bias, I caution against such a move.
In October 2018, Amazon scrapped an AI recruiting tool it had been using to analyze resumes and select the top five percent of candidates. Why? The company realized that the system was not rating candidates for developer jobs in a gender-neutral way. The model was built on patterns in resumes submitted over the previous ten years, most of which came from men, and the AI had learned a preference for male resumes.
The people behind the majority of today’s technological advances – the workers creating the algorithms – are predominantly white and male. The concern is that when these coders assemble the data for chatbots and other systems, the machines they design are likely to perpetuate inequities found in the real world. They are prone to hard-code their own subconscious biases about race, gender, and class into algorithms designed to mirror human decision making. This can amplify existing stereotypes and strengthen the association between gender and particular images, behaviors, and careers.
Machines learn from masses of data, and if that data incorporates gender biases, those biases become part of the algorithm. For example, researchers at Boston University and Microsoft asked machine-learning software trained on news text to complete the statement, “Man is to computer programmer as woman is to …” It replied, “homemaker.”
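The analogy result above can be reproduced in miniature. The sketch below uses tiny hand-built word vectors (the numbers are purely hypothetical, not a real trained embedding) to show how the standard vector-arithmetic approach to analogies simply surfaces whatever associations the vectors already encode:

```python
import numpy as np

# Toy word vectors (hypothetical, hand-built for illustration).
# In a real embedding trained on biased text, "programmer" ends up
# geometrically closer to "man" and "homemaker" closer to "woman",
# because the model mirrors co-occurrence patterns in the corpus.
vecs = {
    "man":        np.array([1.0, 0.0, 0.2]),
    "woman":      np.array([-1.0, 0.0, 0.2]),
    "programmer": np.array([0.9, 1.0, 0.1]),
    "homemaker":  np.array([-0.9, 1.0, 0.1]),
    "doctor":     np.array([0.1, 0.8, 0.9]),
}

def analogy(a, b, c, vocab):
    """Solve 'a is to b as c is to ?' by vector arithmetic:
    find the word whose vector is closest to (b - a + c)."""
    target = vecs[b] - vecs[a] + vecs[c]

    def cos(u, v):
        return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

    candidates = [w for w in vocab if w not in (a, b, c)]
    return max(candidates, key=lambda w: cos(vecs[w], target))

print(analogy("man", "programmer", "woman", vecs.keys()))  # -> "homemaker"
```

The bias is not in the arithmetic; it is in the geometry the training data produced. Swap in vectors from an unbiased corpus and the same code gives an unbiased answer.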
In the excellent book The Man Who Lied to His Laptop, Clifford Nass reports how BMW was forced to recall one of its cars because male drivers in Germany didn’t trust the female voice offering directions from the car’s navigation system. In Japan, a call center operated by Fidelity would rely on an automated female voice to give stock quotes but would transfer customers to an automated male voice for transactions.
In 2016, Harvard published a report about inherent bias in interviews and hiring managers’ tendency to use their own context when evaluating individual applicants for a job. Another study found that both women and men are more likely to hire a man: when presented with two candidates of equal qualifications and performance, the man was 1.5 times more likely to be hired.
But What About Ageism?
Another major -ism still with us is ageism – in a world where youth is seen as a competitive advantage, being older can be viewed in a negative light. There has been extensive research on how much harder it is for older women to get interviews than older men. Research from the National Bureau of Economic Research testing the prevalence of age discrimination in hiring shows that the résumés of older women get far fewer callbacks than those of older men and younger applicants of either sex.
This is bias – and, for women who are older, unfortunately a double bias.
How Can Recruitment Help?
- Same Questions – Ask the same questions of all candidates, irrespective of age or gender.
- Reporting – There is an inherent lack of transparency in the recruitment process, and we need to change that. Start recording, by gender, the ratios at each stage of the funnel: resumes received, candidates interviewed, and hires made.
- Group Interviews – The same research showed that managers were more likely to drop gender-biased questions or comments when they interviewed alongside other people.
- Bias Training – People don’t realize they are biased unless they are shown how their unconscious (and, let’s be honest, conscious) biases fail to align with the corporate culture and with expectations about their role in the hiring process.
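The reporting point above is straightforward to operationalize. Here is a minimal sketch, assuming hypothetical stage records of the form (stage, gender), that tallies candidates per funnel stage and computes stage-to-stage pass-through rates by gender – gaps between the rates are the signal to investigate:

```python
from collections import Counter

# Hypothetical funnel records: one (stage, gender) entry per candidate
# reaching that stage. Stages mirror the ratios suggested above:
# resume -> interview -> hire.
records = [
    ("resume", "F"), ("resume", "F"),
    ("resume", "M"), ("resume", "M"), ("resume", "M"), ("resume", "M"),
    ("interview", "F"),
    ("interview", "M"), ("interview", "M"),
    ("hire", "M"),
]

def funnel_report(records):
    """Count candidates per (stage, gender), then compute the
    pass-through rate between consecutive stages for each gender."""
    counts = Counter(records)
    stages = ["resume", "interview", "hire"]
    report = {}
    for g in ("F", "M"):
        rates = []
        for prev, nxt in zip(stages, stages[1:]):
            denom = counts[(prev, g)]
            rates.append(counts[(nxt, g)] / denom if denom else 0.0)
        report[g] = rates
    return report

print(funnel_report(records))
# With this toy data: F pass-through [0.5, 0.0], M pass-through [0.5, 0.5]
# -> equal resume-to-interview rates, but only men convert to hires.
```

The structure and field names here are assumptions for illustration; the point is simply that once the stages are recorded per candidate, the transparency the bullet asks for is a ten-line report.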