Parliamentary question - E-000780/2023

Rotterdam ‘fraud prediction’ algorithms automating injustice: Dutch Government violating fundamental rights and the rule of law

Question for written answer E-000780/2023
to the Commission
Rule 138
Kim Van Sparrentak (Verts/ALE)

For three years, the municipality of Rotterdam used a discriminatory algorithm to profile people and ‘predict’ social welfare fraud[1]. On the basis of this algorithm, Rotterdam residents, mainly young single mothers with limited knowledge of Dutch, were summoned for fraud investigations without any explanation[2]. The system was trained on and fed biased data, and produced discriminatory, inaccurate and unfair results[3]. Subjective criteria, such as ‘a well-groomed appearance’, were taken into account[4]. The fraud investigations led to people having their benefits wrongfully reduced[5].

The Dutch Government used algorithms in ways that violate fundamental rights in the child benefits scandal[6] and with the SyRI fraud model[7], while the Ministry of Social Affairs coordinated the use of data-driven profiling systems in low-income neighbourhoods in cities including Utrecht and Zaandam[8].

Submitted: 7.3.2023

Last updated: 20 March 2023