LinkedIn’s job-matching AI was biased. The company’s solution? More AI.

More and more companies are using artificial intelligence to recruit and hire new employees, and AI can play a role in almost any stage of the recruitment process. Covid-19 has fueled new demand for these technologies. Both Curious Thing and HireVue, companies specializing in AI-assisted interviews, reported an increase in business during the pandemic.

However, most job searches start with a simple search. Job seekers turn to platforms like LinkedIn, Monster, or ZipRecruiter, where they can upload their résumés, browse vacancies, and apply for jobs.

The goal of these sites is to match qualified candidates to available jobs. To organize all these openings and candidates, many platforms use AI-powered recommendation algorithms. These algorithms, sometimes referred to as matching engines, process information from both the job seeker and the employer to produce a list of recommendations for each.

“You usually hear the anecdote that a recruiter spends six seconds looking at your resume, right?” says Derek Kan, vice president of product management at Monster. “When we look at the recommendations engine that we created, you can reduce that time to milliseconds.”

Most matching engines are optimized to generate applications, says John Jersin, former vice president of product management at LinkedIn. These systems base their recommendations on three categories of data: information the user provides directly to the platform; data assigned to the user based on others with similar skill sets, experiences, and interests; and behavioral data, such as how often a user responds to messages or interacts with job postings.

In the case of LinkedIn, these algorithms exclude a person’s name, age, gender, and ethnicity, because including these characteristics can contribute to bias in automated processes. But Jersin’s team found that even then, the service’s algorithms can still detect behavioral patterns shown by groups with particular gender identities.

For example, while men are more likely to apply for jobs that require work experience beyond their qualifications, women tend to apply only to jobs where their qualifications match the job requirements. The algorithm picks up on this variation in behavior and adjusts its recommendations in a way that inadvertently disadvantages women.

“You might, for example, recommend more senior jobs to one group of people than another, even if they’re equally qualified,” Jersin says. “Those people may not be exposed to the same opportunities. And that’s really the impact we’re talking about here.”

Men also list more skills, at a lower degree of proficiency, on their résumés than women do, and they often engage more aggressively with recruiters on the platform.

To address such issues, Jersin and his team at LinkedIn built a new AI designed to produce more representative results, deploying it in 2018. It is essentially a separate algorithm designed to counteract recommendations skewed toward a particular group. The new AI ensures that the recommendation system includes a representative distribution of users across genders before surfacing the matches curated by the original engine.
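The article doesn’t publish LinkedIn’s actual code, but the approach it describes is a re-ranking layer on top of the original engine. A minimal sketch of that idea, with made-up candidate IDs, group labels, and target shares purely for illustration: at each slot in the result list, greedily pick the top remaining candidate from the group that is furthest below its target share, so every prefix of the list stays roughly representative.

```python
from collections import defaultdict

def rerank_representative(ranked, group_of, targets):
    """Greedy re-rank: at each slot, take the highest-ranked remaining
    candidate from the group furthest below its target share, so every
    prefix of the output roughly matches the target distribution."""
    # One queue per group, preserving the engine's original score order.
    queues = defaultdict(list)
    for cand in ranked:
        queues[group_of[cand]].append(cand)

    counts = {g: 0 for g in targets}  # how many shown per group so far
    out = []
    for pos in range(1, len(ranked) + 1):
        # Only consider groups that still have candidates left.
        available = [g for g in targets if queues[g]]
        # Deficit = target share minus the share the group would have
        # after this slot; pick the group with the largest deficit.
        g = max(available, key=lambda g: targets[g] - counts[g] / pos)
        out.append(queues[g].pop(0))
        counts[g] += 1
    return out

# The base engine ranked all of group A's candidates ahead of group B's.
ranked = ["a1", "a2", "a3", "a4", "b1", "b2"]
group_of = {"a1": "A", "a2": "A", "a3": "A", "a4": "A",
            "b1": "B", "b2": "B"}
result = rerank_representative(ranked, group_of, {"A": 0.5, "B": 0.5})
# The two groups now alternate until group B's candidates run out:
# ["a1", "b1", "a2", "b2", "a3", "a4"]
```

Note the design choice this illustrates: the original relevance ranking is preserved *within* each group, so the fairness layer changes only the interleaving, not which candidates the base engine considered good matches.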

Kan says Monster, which lists 5 to 6 million jobs at any one time, also incorporates behavioral data into its recommendations but doesn’t correct for bias the way LinkedIn does. Instead, the marketing team focuses on getting users from diverse backgrounds to sign up for the service, and the company then relies on employers to report back and tell Monster whether or not it passed along a representative set of candidates.

Irina Novoselsky, CEO of CareerBuilder, says she’s focused on using the data the service collects to teach employers how to eliminate bias from their job postings. For example, “When a candidate reads a job description with the word ‘rock star,’ there is materially a lower percentage of women that apply,” she says.
