Rotterdam ‘fraud prediction’ algorithms automating injustice: Dutch Government violating fundamental rights and the rule of law
7.3.2023
Question for written answer E-000780/2023
to the Commission
Rule 138
Kim Van Sparrentak (Verts/ALE)
For three years, the municipality of Rotterdam used a discriminatory algorithm to profile people and ‘predict’ social welfare fraud[1]. On the basis of this algorithm, Rotterdam residents, mainly young single mothers with limited knowledge of Dutch, were summoned for fraud investigations, with no explanation[2]. The system was trained on and fed biased data and produced discriminatory, inaccurate and unfair results[3]. Subjective criteria, such as ‘a well-groomed appearance’, were taken into account[4]. The fraud investigations led to people having their benefits wrongfully reduced[5].
The Dutch Government used algorithms in ways that violate fundamental rights in the child benefits scandal[6] and with the SyRI fraud model[7], while the Ministry of Social Affairs coordinated the use of data-driven profiling systems in low-income neighbourhoods in cities including Utrecht and Zaandam[8].
- 1. Is the use of algorithms to ‘predict’ fraud compatible with fundamental rights, such as the rights to non-discrimination, a fair trial, equal treatment, and privacy?
- 2. Is the use of risk profiling to ‘predict’ fraud with an algorithm compatible with the rule of law and basic administrative principles such as the presumption of innocence, the right to an explanation, and transparency?
- 3. If the Commission finds that the Dutch Government consistently uses algorithms in ways that violate fundamental rights or the rule of law, will it start an infringement procedure against the Netherlands?
Submitted: 7.3.2023
- [1] https://www.versbeton.nl/2023/03/computer-zegt-vrouw-hoe-een-rotterdams-algoritme-jonge-alleenstaande-moeders-discrimineerde/, https://www.lighthousereports.com/investigation/suspicion-machines/.
- [2] Ibid.
- [3] https://www.lighthousereports.com/investigation/suspicion-machines/.
- [4] https://www.versbeton.nl/2023/03/computer-zegt-vrouw-hoe-een-rotterdams-algoritme-jonge-alleenstaande-moeders-discrimineerde/.
- [5] https://www.wired.co.uk/article/welfare-algorithms-discrimination.
- [6] https://www.politico.eu/newsletter/ai-decoded/a-dutch-algorithm-scandal-serves-a-warning-to-europe-the-ai-act-wont-save-us-2/.
- [7] https://algorithmwatch.org/en/syri-netherlands-algorithm/.
- [8] https://www.lighthousereports.com/investigation/the-algorithm-addiction/.