Human Rights Institute NL warns about risks of algorithms in education

The Netherlands Institute for Human Rights is calling for greater awareness in education of the risks of algorithms. Schools are increasingly using algorithms, but often lack the knowledge to use them responsibly.

The use of algorithms in education has increased significantly in recent years. Schools use, for example, teaching materials that adapt to a student's level, or software that can recognize fraud. Although algorithms can reduce teachers' workload and provide more insight into students' learning progress, the Netherlands Institute for Human Rights notes that they also carry risks. Moreover, schools often lack the knowledge needed to use algorithms responsibly. As a result, students run an unnecessary risk of being discriminated against by these algorithms, the Board warns in response to research by KBA Nijmegen and ResearchNed.

About half of primary schools use adaptive learning systems, in which the teaching material is adjusted to a student's current ‘level’. Students who regularly make mistakes are offered easier practice material, while students who do significantly better are given more difficult questions on the same material. The Netherlands Institute for Human Rights states that such software carries the risk that a student's level is not always estimated correctly. Students with dyslexia, autism or ADHD, for example, may answer differently from the students on whose data the learning system was trained. As a result, the system may incorrectly rate these students lower.
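To illustrate the mechanism, a simplified adaptive-difficulty rule might lower the practice level after repeated mistakes. The sketch below is purely illustrative and is not the Board's analysis or any vendor's actual implementation; the function name and thresholds are assumptions. It shows how a student whose answer pattern differs from the training population (for example because of reading-related slips) can be pushed to a level below their real ability.

```python
# Illustrative toy rule only; not the adaptive learning software used in Dutch
# schools. Thresholds and names are invented for the sake of the example.

def update_level(level: int, recent_correct: list[bool]) -> int:
    """Raise or lower the practice level based on the most recent answers."""
    accuracy = sum(recent_correct) / len(recent_correct)
    if accuracy < 0.5:          # many mistakes -> offer easier material
        return max(1, level - 1)
    if accuracy > 0.8:          # consistently correct -> offer harder material
        return level + 1
    return level

# A student with dyslexia may make reading-related mistakes that say little
# about their grasp of the material; a rule like this still rates them lower.
level = 3
for answers in [[True, False, False, True], [False, True, False, False]]:
    level = update_level(level, answers)
print(level)  # ends up lower than the student's actual ability would warrant
```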

Algorithms are also used in higher education. Anti-fraud software, for example, is widely used at colleges and universities. This carries the risk that certain groups of students are disadvantaged, such as students for whom Dutch is not their native language. Algorithms are more likely to ‘think’ that these students have used an AI chatbot such as OpenAI’s ChatGPT or Google’s Gemini, meaning they are more likely to be flagged as fraudsters by the software. Research also shows that the facial detection algorithms in anti-cheating software work less well for people with darker skin tones, which can lead to discrimination.
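The disparate-impact concern can be made concrete by comparing how often honest students in different groups are wrongly flagged. The sketch below uses invented numbers, not the findings of the research cited above, and the group split and data are assumptions for illustration only.

```python
# Hypothetical fairness check: compare the false-positive rate of a fraud
# detector per student group. All data here is made up for illustration.

def false_positive_rate(flags: list[bool], cheated: list[bool]) -> float:
    """Share of honest students that the detector wrongly flags as fraudsters."""
    honest_flagged = sum(f and not c for f, c in zip(flags, cheated))
    honest_total = sum(not c for c in cheated)
    return honest_flagged / honest_total

# Assumed example data: native vs. non-native speakers of Dutch, none of whom cheated.
native = {"flags": [False, False, True, False], "cheated": [False] * 4}
non_native = {"flags": [True, False, True, True], "cheated": [False] * 4}

print(false_positive_rate(**native))      # 0.25
print(false_positive_rate(**non_native))  # 0.75 -> this group bears a heavier burden
```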

The Netherlands Institute for Human Rights warns that discrimination and inequality of opportunity are lurking. The Board therefore believes that educational institutions should consider whether the technology actually contributes to good education before using algorithms. The Board also notes that it is difficult for teachers and school leaders to always take a critical look at the tools they use: “People just tend to believe what a computer says.” Moreover, there is often insufficient information available about how algorithms work, which makes a sound assessment difficult. The Board therefore believes that the Dutch Ministry of Education, Culture and Science should help schools with this, for example by conducting research into the effects of digital systems, providing information about the risks of algorithms, and drawing up guidelines that a system must comply with in order to prevent discrimination.