Finally, you are invited to a job interview. You expect a staff member from human resources, but instead you sit opposite a robot. During the 30-minute interview, robots like “Matilda” or “Sophia” ask you dozens of questions about your motivation, career goals, or strengths and weaknesses. Cameras record not only the spoken word but also your facial expressions and gestures. The robot recognizes emotions in your face and can react to them spontaneously. The combination of the generated responses and emotions yields a personality profile that is compared with existing data on successful employees and assigned to one of two categories: “suitable for the job” or “unsuitable for the job.”
AI in Recruiting
This scenario is not fiction but present-day reality. The U.S. company “HireVue” and others advertise and sell their sophisticated video interview software to large companies. While the applicant is interviewed comfortably within his or her own four walls, up to 20,000 data points can be collected from the interview and analyzed in record time, using algorithms to find the right employee.
Larger companies already use artificial intelligence in their global HR recruitment processes. In Switzerland, companies such as Credit Suisse use software to review applications and categorize applicants according to their suitability or unsuitability for the advertised position.
The legal issues raised by sophisticated video interview software range from labor law to data protection law, and the use of such technologies in civil and criminal proceedings will soon be a topic of discussion. The much-maligned lie detector might get a new lease on life, as its shortcomings, rightly criticized in the 1960s and 1970s, are most probably eliminated by today’s video interview software.
Supporters of the new software argue that algorithms can comply far better with today’s demanding employment law requirements because they make decisions objectively, based on facts, and without biases or prejudices. They also claim that the much broader and deeper information improves inclusion and diversity efforts. The larger the database, the better the accuracy, or so human resource managers claim. And with the help of such video interviews, valuable information can be generated in advance that goes far beyond a résumé or curriculum vitae and exposes subconscious human characteristics that even sophisticated interviewers cannot spot.
Programming the Software Correctly
It is crucial that the algorithms be programmed so that, during job interviews, no questions are asked that are prohibited or frowned upon under labor law or that would interfere with employees’ protected privacy interests. Questions about health or financial circumstances, for example, in most cases violate the right to privacy and the right to informational self-determination, depending on their relevance to the advertised position. As a general rule, personal data may be processed only if it relates to a candidate’s suitability for the job or is necessary for the performance of the employment contract.
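How such a constraint would actually be enforced is up to the vendor and is not publicly documented; purely as an illustration, and assuming hypothetical question categories and a hypothetical blocklist, a filter of this kind could be sketched as follows:

```python
# Hypothetical sketch of a question filter: the topic categories and the
# blocklist below are illustrative assumptions, not an actual vendor's rules.

PROHIBITED_TOPICS = {
    "health",            # medical history, disabilities
    "finances",          # debts, credit standing
    "pregnancy",
    "religion",
    "union_membership",
}

def is_permissible(question_topic: str, job_relevant_topics: set) -> bool:
    """Allow a question only if its topic is not on the blocklist,
    or if it is demonstrably relevant to the advertised position."""
    if question_topic in PROHIBITED_TOPICS and question_topic not in job_relevant_topics:
        return False
    return True

# Example: for a safety-critical driving job, certain health questions may be job-relevant.
print(is_permissible("finances", job_relevant_topics={"health"}))  # False
print(is_permissible("health", job_relevant_topics={"health"}))    # True
```

Even in this toy version, someone must decide which topics go on the blocklist and when relevance overrides it, which is precisely where legal review of the software becomes necessary.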
The linchpin question is how much of his or her inner personality an applicant wishes to disclose and whether he or she explicitly consents to such disclosure. This can be difficult to ascertain when a candidate wants the job under any circumstances, freely answers a range of questions, and submits to the video interview; in such situations, consent will probably be deemed to have been given voluntarily. In other words, candidates will have to consider whether they really want to work in an organizational culture that will probably continue to strive for complete transparency once they are on board.
Non-Verbal Cues and Discrimination
The consent issue also raises the question of whether a candidate participating in a video interview is aware that the software tracks non-verbal signals such as heartbeat, eye movement, and facial expressions. An applicant might thus unknowingly respond non-verbally to questions that would be inadmissible under labor laws designed to protect the future employee’s privacy rights.
Particular care must be taken that the application software is not programmed in a way that intentionally or unintentionally discriminates against job candidates. Critics rightly point out that algorithms are written by people and can therefore reflect their authors’ potential prejudices. In other words, algorithms are opinions embedded in code and therefore inherently subjective.
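A deliberately simplified sketch, with entirely hypothetical features and weights, illustrates the point: the numbers a developer chooses are the “opinion” the algorithm expresses, and a negative weight on an employment gap, for instance, can quietly disadvantage candidates who took parental or medical leave.

```python
# Hypothetical candidate-scoring sketch: every weight below is a design choice
# made by a developer, not an objective fact. A negative weight on employment
# gaps, for example, can indirectly penalize parental or medical leave.

WEIGHTS = {
    "years_experience": 1.0,
    "speech_confidence": 2.0,      # assumed feature derived from video analysis
    "employment_gap_years": -1.5,  # the programmer's "opinion" encoded as a number
}

def score(candidate: dict) -> float:
    """Weighted sum of candidate features; the weights encode the developer's
    (subjective) view of what a 'good' candidate looks like."""
    return sum(weight * candidate.get(feature, 0.0) for feature, weight in WEIGHTS.items())

# Two equally qualified candidates, one of whom took two years of parental leave:
a = {"years_experience": 6, "speech_confidence": 0.7, "employment_gap_years": 0}
b = {"years_experience": 6, "speech_confidence": 0.7, "employment_gap_years": 2}
print(score(a), score(b))  # 7.4 vs. 4.4 -- same qualifications, different scores
```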
GDPR Compliance
Under the EU GDPR, every job applicant has the right not to be subject to a decision based solely on automated data processing that produces legal effects concerning him or her. If these protections are not adhered to, an applicant or employee may challenge the decision. In Switzerland, the GDPR’s provisions apply only to Swiss companies operating within the EU. As part of its ongoing revision of data protection law, Switzerland will largely adopt the current EU data protection standards in order to comply with and enforce cross-border data exchange requirements.
Dr. Thomas Rihm
Hasret Mutlu
Rihm Rechtsanwälte
Zurich, Switzerland