The software was widely used during the coronavirus pandemic to prevent fraud, allowing invigilators to observe students taking exams at home. It incorporates facial detection, which apparently failed at times to find Pocornie’s face. As a result, she was occasionally ejected from the exam with error messages such as ‘face not found’ and ‘room too dark’.
She filed a complaint with her university, which denied that there was any discrimination. The university also raised the issue with the supplier, but the supplier said that there was nothing wrong with the software. According to Proctorio, that was the conclusion of an external investigation.
VU Amsterdam says an unstable internet connection is a more likely explanation for the problems than the colour of this student’s skin. After all, no other students had filed similar complaints relating to their skin colour, and more students than just Pocornie had problems logging in.
An interim decision shows that the Netherlands Institute for Human Rights does not go along with that. It is, after all, beyond dispute that the student had problems logging in. Moreover, scientific research shows that facial recognition software generally works less well with dark-skinned people.
The university’s counterarguments do not weigh up against that, in the opinion of the institute. And the external investigation carried out on Proctorio’s behalf? It has not yet been made public. In addition, the university has not provided any information about the average login time of other students.
VU Amsterdam now has ten weeks to refute the suspicion of discrimination, otherwise the ruling will go against the university. The institute’s decisions are not binding, however, and it cannot impose a fine or award damages.
“I’m very happy that the law is on my side”, Pocornie says in a statement to the press. She is curious to see the reaction of other higher education institutions that have also used this software.
The human rights institute is satisfied too. “Although this is only an interim decision, it is an important moment in the history of our decision-making”, says institute chair Jacobine Geel in a news item on the website. “Someone has succeeded in saying, ‘Hey, this algorithm is doing something odd. I suspect the algorithm is discriminating.’”