Anti-cheating software could be racist, says human rights institute
Robin Pocornie, a student at VU Amsterdam, was sometimes unable to access the online environment of the anti-cheating program Proctorio because her face was not recognised. She suspects that is because of the colour of her skin.

Image by: Esther Dijkstra
The software was frequently used during the coronavirus pandemic to prevent fraud. Invigilators could then observe students taking an exam at home. The software incorporates facial detection and apparently Pocornie’s face could sometimes not be found. This meant that she was occasionally ejected from the exam with error messages like ‘face not found’ and ‘room too dark’.
Enquiry

Centre: Robin Pocornie. Left and right: Hans de Zwart and Naomi Appelman of the Racism and Technology Center, which is advising her.
Image by: racismandtechnology.center
She filed a complaint with her university, which denied that there was any discrimination. The university also asked the supplier about it, but the supplier said that there was nothing wrong with the software. According to Proctorio, that emerged from an external investigation.
VU Amsterdam says an unstable internet connection is a better explanation of the problems than the colour of this student’s skin. After all, others had no similar complaints relating to their skin colour and there were in fact more students who had problems logging in.
An interim decision shows that the Netherlands Institute for Human Rights does not go along with that. It is, after all, beyond dispute that the student had problems logging in. Moreover, scientific research shows that facial recognition software generally works less well with dark-skinned people.
Not public
The university’s counterarguments do not weigh up against that, in the opinion of the institute. And the external investigation carried out on Proctorio’s behalf? It has not yet been made public. In addition, the university has not given out any information about the average login time of other students.
VU Amsterdam now has ten weeks to refute the suspicion of discrimination, otherwise the ruling will go against the university. The decisions are not binding, however. And the institute cannot impose a fine or award damages.
“I’m very happy that the law is on my side”, Pocornie says in a statement to the press. She is curious to see the reaction of other higher education institutions that have also used this software.
Important
The human rights institute is satisfied too. “Although this is only an interim decision, it is an important moment in the history of our decision-making”, says institute chair Jacobine Geel in a news item on the website. “Someone has succeeded in saying, ‘Hey, this algorithm is doing something odd. I suspect the algorithm is discriminating.’”
By the editorial team