
As early as 2014, Amazon in the USA developed software that was supposed to use artificial intelligence to rank the applications it received. It has now emerged that the application robot discriminated against women, though not intentionally. Sven Laumer of the University of Erlangen-Nuremberg researches how digital technologies are used in HR processes and explains how this could happen.

ZEIT ONLINE: Mr. Laumer, how does an application robot work?

Sven Laumer: Behind an application robot is an artificial intelligence that examines applications for certain characteristics and keywords related to a job. It can then propose a pre-selection of suitable candidates to the recruiter. The system only works if it is fed with data and trained. For example, a company can train the artificial intelligence to compare all incoming applications with the successful hires of the last ten years and establish connections.
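What Laumer describes can be sketched in a few lines of code. The following Python example is a minimal, hypothetical illustration of such a ranking step, not Amazon's actual system; the keywords, hiring labels and the scikit-learn model choice are all assumptions made for the example.

```python
# Minimal sketch (not Amazon's actual system): a classifier trained on
# historical hiring outcomes, then used to rank new applications.
# All texts and labels below are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Past applications with the outcome "was this person hired?"
past_applications = [
    "python machine learning cloud infrastructure",
    "warehouse logistics forklift certification",
    "java backend microservices kubernetes",
    "customer service retail communication",
]
was_hired = [1, 0, 1, 0]  # historical hiring decisions serve as training labels

# Learn which keywords co-occurred with past hires
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(past_applications, was_hired)

# Rank incoming applications by their predicted "hire" probability
incoming = ["python data engineering", "retail sales experience"]
scores = model.predict_proba(incoming)[:, 1]
for text, score in sorted(zip(incoming, scores), key=lambda x: -x[1]):
    print(f"{score:.2f}  {text}")
```

Whatever patterns sit in those ten years of hiring decisions are exactly what such a model will reproduce, which is the crux of the cases discussed below.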

Sven Laumer, born in 1982, holds the Schöller Endowed Chair for Information Systems at the University of Erlangen-Nuremberg. For more than ten years he has researched how digital technologies are used in HR processes, and he is co-author of the two study series Recruiting Trends and Bewerbungspraxis.

ZEIT ONLINE: And how could it happen that Amazon's system disadvantaged women?

Laumer: In the case of Amazon, the system had identified men, in particular, as the type of candidate who applies to the company most frequently: namely, the tech-savvy. The software assumed that these people have a particularly strong interest in the employer. That is actually quite plausible. The problem is: because there are more men than women in the tech industry, the robot concluded that it is above all men who can be won over for the company, and filtered women out.

“Robots discriminate more implicitly than people do.”

Sven Laumer

ZEIT ONLINE: Does this sort of thing happen more often?

Laumer: In the USA there was a similar case. There, the artificial intelligence had found that employees who live further away from the workplace quit sooner. Since that is not in the company's interest, it sounds obvious at first to weed out these applicants. However, the outskirts are often home to people belonging to ethnic minorities. And so a whole group of applicants was implicitly discriminated against, even though the decision was never meant to penalize them.

ZEIT ONLINE: So an artificial intelligence does not decide any more free of bias than a person does?

Laumer: I wouldn’t say that. No person can completely free themselves from their first impression, not even HR experts. That can lead to a hasty judgment about a candidate simply because you don’t like his nose. That wouldn’t happen to a robot that doesn’t know the characteristic “nose” at all. Robots discriminate more implicitly than people do. But the way a piece of software makes a decision naturally depends on the data set it is based on.

“The big challenge is to train the artificial intelligence in a discrimination-free way.”

ZEIT ONLINE: If a company employs mostly white men, then the artificial intelligence fixates on white men.

Laumer: Exactly. The big challenge is to train the artificial intelligence in a discrimination-free way. You need a data set that is neither explicitly nor implicitly discriminatory.
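One way to make "neither explicitly nor implicitly discriminatory" tangible is a simple audit of selection rates per group. The sketch below is a hypothetical Python illustration; the groups and numbers are invented, and the 80 percent threshold is borrowed from the US "four-fifths" rule of thumb, not something Laumer prescribes.

```python
# Minimal sketch of one way to check historical decisions (or a model's
# output) for implicit discrimination: compare selection rates across
# groups. All data below is invented.
def selection_rates(decisions):
    """decisions: iterable of (group, selected) pairs."""
    totals, selected = {}, {}
    for group, sel in decisions:
        totals[group] = totals.get(group, 0) + 1
        selected[group] = selected.get(group, 0) + int(sel)
    return {g: selected[g] / totals[g] for g in totals}

decisions = ([("A", True)] * 40 + [("A", False)] * 60 +
             [("B", True)] * 15 + [("B", False)] * 85)

rates = selection_rates(decisions)
best = max(rates.values())
for group, rate in rates.items():
    # flag groups selected at less than 80% of the best-off group's rate
    flag = "  <-- possible adverse impact" if rate / best < 0.8 else ""
    print(f"group {group}: selection rate {rate:.0%}{flag}")
```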

ZEIT ONLINE: How can that succeed?


Laumer: It is important never to trust the data blindly, but always to question it. There are the craziest relationships: for example, the number of suicides by drowning in a swimming pool correlates significantly with the number of films in which Nicolas Cage appears. To us humans it is clear that we cannot prevent a death by producing fewer films with Nicolas Cage. But the artificial intelligence does not come up with that idea. It sees the correlation without questioning it critically. The same applies to recruiting. The system detects the relationship between distance to the workplace and the termination rate without taking other possible influencing factors into account.
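The Nicolas Cage point can be reproduced in miniature: two series that have nothing to do with each other can still correlate strongly. The figures below are invented purely for illustration, not the real film or drowning statistics.

```python
# Minimal sketch of a spurious correlation: the series are made up and
# deliberately constructed to move together, yet there is no causal link.
from statistics import correlation  # standard library, Python 3.10+

films_per_year = [2, 2, 2, 3, 1, 1, 2, 3, 4, 1, 4, 4]
drownings      = [100, 103, 99, 110, 92, 94, 102, 108, 118, 91, 115, 119]

r = correlation(films_per_year, drownings)
print(f"Pearson r = {r:.2f}")  # strongly positive, yet no causation
```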

ZEIT ONLINE: So people have to teach the artificial intelligence that the two do not necessarily have anything to do with each other.

Laumer: Exactly. As soon as you recognize that the criterion of distance to the workplace implicitly discriminates against certain people, you can teach the system not to use it.
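In practice, "teaching the system not to use the criterion" can be as simple as removing that feature from the training data and retraining. The sketch below is a hypothetical Python illustration; the feature names and figures are invented.

```python
# Minimal sketch: drop an implicitly discriminatory criterion (here,
# distance to the workplace) before training, so the model cannot use it.
import pandas as pd
from sklearn.linear_model import LogisticRegression

applicants = pd.DataFrame({
    "years_experience":      [1, 5, 3, 8, 2, 6],
    "skills_match_score":    [0.4, 0.9, 0.6, 0.8, 0.5, 0.7],
    "distance_to_workplace": [25, 3, 18, 5, 30, 7],  # the problematic proxy
})
hired = [0, 1, 0, 1, 0, 1]  # invented historical outcomes

# Exclude the criterion identified as a proxy, then retrain
features = applicants.drop(columns=["distance_to_workplace"])
model = LogisticRegression().fit(features, hired)
print(model.feature_names_in_)  # distance is no longer among the inputs
```

Dropping the obvious proxy is only a first step, of course; the audit shown earlier would still be needed to check whether other features carry the same information.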