Undoubtedly, technology has had an unprecedented influence on our lives. It shapes our consumer habits, hobbies, relationships — and even our chances of scoring a good job.
If you apply for a job online, chances are there’ll be one abstract thing standing between you and a prospective job interview: a set of complex hiring algorithms.
It may sound like a scene from a futuristic movie, but it's already everyday reality.
This article examines the air of authority and respect that computers enjoy in today's hiring industry, and the concerns about objectivity and fairness this shift has brought about.
The strange story of Kyle Behm
A couple of years ago, a young man named Kyle Behm applied for a minimum-wage part-time job at a Kroger supermarket. For a high-achieving student like Kyle, it seemed like nothing but a formality.
What happened, though, is that Kyle didn’t get invited to an interview. He’d been red-lighted by the personality test he’d taken during the application process.
Of course, Kyle tried to apply elsewhere, but he kept getting turned down. All the companies were using the same exam, one resembling the widely used "five-factor model," which grades people on conscientiousness, extraversion, agreeableness, neuroticism, and openness to experience.
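The grading behind such a test can be sketched roughly as follows. This is a minimal illustration, not the actual exam: the question wording, the question-to-trait mapping, and the reverse-keying of items are all hypothetical assumptions, though averaging 1–5 Likert answers per trait (with reverse-keyed items flipped) is the standard textbook approach.

```python
# Hypothetical sketch of five-factor trait scoring.
# Assumes each question maps to one trait and is answered on a 1-5 Likert scale.

TRAITS = ["conscientiousness", "extraversion", "agreeableness",
          "neuroticism", "openness"]

def score_responses(responses):
    """Average the 1-5 answers per trait.

    responses: list of (trait, answer, reverse_keyed) tuples.
    Reverse-keyed items are flipped (1 becomes 5, 2 becomes 4, ...).
    """
    per_trait = {t: [] for t in TRAITS}
    for trait, answer, reverse in responses:
        value = 6 - answer if reverse else answer  # flip reverse-keyed items
        per_trait[trait].append(value)
    # Only report traits that were actually measured.
    return {t: sum(v) / len(v) for t, v in per_trait.items() if v}

# Illustrative answers modeled on the statements quoted above.
answers = [
    ("neuroticism", 4, False),       # "I can experience many mood changes."
    ("agreeableness", 5, False),     # "I believe others have good intentions."
    ("conscientiousness", 2, True),  # a reverse-keyed item
]
scores = score_responses(answers)
```

The opacity the article describes starts here: the applicant never learns which traits were scored, how items were keyed, or where the pass/fail line sits.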
The test these employers were using was developed by a workforce management company as part of an employee selection program. But apparently, they were walking on thin ice here.
Using medical exams in hiring is illegal under the Americans with Disabilities Act of 1990. If these so-called personality tests turn out to qualify as medical exams, companies will have to stop using them.
Now imagine that the application you had to fill in included statements like "Over the course of the day, I can experience many mood changes" (RadioShack), "I believe that others have good intentions" (Lowe's), and "If something very bad happens, it takes some time before I feel happy again" (McDonald's). Weird, right?
Nonetheless, automatic systems based on sophisticated mathematical formulas — such as the one used to sift through Kyle’s job application — are becoming more and more widespread across the developed world.
And given their scale, importance, and elusiveness, these hiring algorithms are on track to create a subclass of people who will find it increasingly hard to land a decent job.
Algorithms and big data economy
The big data economy came with a promise of saving time and having better results.
Who’d object to having a computer that could skim through thousands of resumes or loan applications in a second or two and sort them into neat lists with the most suitable candidates on top?
Still, it's hard to ignore the feeling that computer algorithms shape our world in extremely elusive ways.
Like pagan gods, these sophisticated mathematical models are often opaque. Their divine workings are accessible by no one but the highest priests in their domain.
To predict future outcomes, computer algorithms feed on historical data. That's why many companies that use them (in the form of online personality tests, for example) tout their objectivity, claiming they remove human error and bias from complex decision-making.
Machines surfing the wave of objectivity
The introduction of technology into hiring not only saved time but was also presented as fair and objective. After all, who’d blame a machine for being prejudiced when it’s just doing its work and processing cold numbers?
With increasing complexity, mathematical models have acquired great authority. Today, there are computer algorithms out there that can:
- Predict if we’ll be valuable customers.
- Determine what we see on social media.
- Sort through resumes and applications.
- Evaluate productivity of employees.
- Predict whether we’re likely to repay a loan.
- Inform prison sentences.
- Monitor our health.
- Determine how much we pay for insurance.
- Filter what kind of political messaging we’ll receive.
The rise of personality tests in the past decade was fueled by the desire to streamline the hiring process. Their popularity may have gone through the roof, but the doubts about their effectiveness and fairness still remain.
Don’t get me wrong. These algorithms were created — similarly to Nobel’s dynamite — with good intentions.
The goal was to replace our subjective judgments with objective measurements. But as the reality shows, it doesn’t always work out like that.
Their verdicts, even when clearly wrong or ambiguous, seem to be beyond dispute or appeal.
Contrary to the original intentions, they tend to punish the poor and the oppressed, throwing our society even further off balance.
Let’s make the process fairer, said Kronos
Driven by the desire to bring fairness into the hiring process, a company called Kronos developed a broad range of software tools for workforce management.
One of their products, Workforce Ready HR, promises to eliminate “the guesswork” in hiring and bring fairness back into the game.
What does it do, then?
According to the Kronos website, their tools can help companies “screen, hire, and onboard candidates most likely to be productive — the best-fit employees who will perform better and stay on the job longer”.
These claims sound extremely persuasive. But research has shown that personality tests are poor predictors of job performance: they were found to be only one-third as predictive as cognitive exams, and far less predictive than reference checks.
If we take these results seriously, personality tests can hardly succeed in choosing the best employee. They may simply be the cheapest and fastest way of excluding as many people as possible.
Orchestras in the pursuit of pure objectivity
Still, the majority of job applicants face the challenge of getting their application past the test and landing an interview.
Automatic systems have turned into tools that routinely discriminate against racial and ethnic minorities, as well as women.
That sounds frightening, but what should we really do to eliminate discrimination and prejudice from the whole process?
One of the ways to avoid such prejudices is to consider applicants blindly.
Classical music has long been dominated by men. And we’re not only talking about composers like Bach, Chopin or Dvorak — male musicians have been playing first fiddle in orchestras for centuries.
In the 1970s, orchestras decided to address this problem and started to hold auditions with the musician hidden behind a screen. Sex, age, race, and even connections, reputation, and education no longer mattered. The music coming from behind the screen was the only thing that could make a musician stand out.
Guess what happened. Compared to the 1970s, there are five times as many women playing in orchestras today. That sounds like objective judgment at its finest, but the reality is much more complex.
Only a few professions can arrange such an even-handed tryout for their applicants.
The problem is that while musicians actually perform the job during an audition, resumes and interviews in conventional professions often turn the evaluation of a job seeker's qualities and future success into pure guesswork.
The mechanics of resume screening process
Many HR departments rely on automatic systems to help them deal with the piles of resumes on their desks. According to one statistic, some 72% of resumes in the US are never seen by a pair of human eyes.
Computer programs take the resume and start fishing for keywords. They pull out relevant skills and experience that the employer is looking for, scoring each resume and predicting the job seeker’s match for the job opening.
Recruiters then need to decide where to place the cut-off point. The crude math is simple: the more candidates they eliminate, the less work they'll have processing the top matches.
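The keyword-fishing and cut-off mechanics described above can be sketched in a few lines. This is an illustrative toy, not any vendor's actual screener: the keyword list, the weights, and the threshold are all made-up assumptions, but the shape (score each resume by weighted keyword hits, rank, then cut) matches the process the paragraphs describe.

```python
# Toy sketch of keyword-based resume screening with a recruiter-chosen cut-off.
# Keywords, weights, threshold, and sample resumes are illustrative assumptions.
import re

KEYWORD_WEIGHTS = {"python": 3, "sql": 2, "teamwork": 1, "logistics": 2}

def score_resume(text):
    """Sum the weights of every keyword hit in the resume's text."""
    words = re.findall(r"[a-z]+", text.lower())
    return sum(KEYWORD_WEIGHTS.get(word, 0) for word in words)

def screen(resumes, cutoff):
    """Rank resumes by score, highest first, and keep those at or above cutoff."""
    ranked = sorted(resumes, key=score_resume, reverse=True)
    return [name for name in ranked if score_resume(resumes[name]) >= cutoff]

resumes = {
    "alice": "Python and SQL experience, strong teamwork",
    "bob": "Warehouse logistics background",
}
shortlist = screen(resumes, cutoff=3)  # everyone below 3 vanishes unseen
```

Note that nothing in this loop measures whether anyone can actually do the job; it only measures how well a resume's wording matches the recruiter's keyword list, which is exactly why keyword-aware resumes fare better.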
So what can job seekers do to evade the possibility of sending their resumes into a black hole?
Success awaits those who remember to craft their resumes with that automatic reader in mind. A resume stuffed with industry-relevant keywords and skills will speak directly to recruiters — and computers, too.
The new era of caution
In the future, we can expect companies to systematically harvest data from social networks to inform their hiring process. They will have more insight into who we are, what we like and how we behave.
To redress rather than reinforce inequality, companies now face a challenge: if they want to stop relying on pseudoscientific nonsense and a bundle of untested assumptions, they need to examine the hiring models they're using.
If companies want to succeed in today’s economy, they must refrain from using personality tests that exclude people from opportunities and focus on finding employees who think creatively and work well in teams.
And the task of the scientists will be to develop predictive models that have more to do with rewarding people than punishing them.
It will take some time until companies learn to read and interpret big data well. But sacrificing a bit of efficiency for fairness will definitely pay off in the long run.