Post by account_disabled on Mar 9, 2024 4:08:24 GMT -5
The technology that surrounds us, the technology that accompanies our morning coffee and that we use countless times throughout the day, is supposed to make our lives easier. Yet, used without effective regulation, it can also deepen social, racial, economic and, for a few decades now, digital inequality. Under the pretext of preventing fraud or streamlining the provision of social services, some states resort to mathematical formulas, algorithms, to make these decisions. But contrary to how it might seem at first, many of these algorithms are neither objective nor fair, and many reproduce the stereotypes and discrimination that already exist in the non-digital world.

And in the workplace? We are not safe from digital inequality in this area either. The introduction of surveillance cameras, wearable trackers, sensors or any other tool that monitors our performance, or that implements "ranking systems" used to justify lower salaries, can become a weapon that directly attacks rights as important as privacy, data protection and non-discrimination.
How algorithms can perpetuate inequality

[Image: Subsidies based on racial profiling. © Pexels, Cottonbro Studio]

1. Xenophobia in the Netherlands: how an algorithm denied childcare subsidies based on racial profiling

In 2013, the Dutch authorities introduced an algorithmic system to detect whether childcare subsidy applications were potentially fraudulent. The system's design included nationality as a parameter, which led to classification based on racial profiles: people who did not hold Dutch nationality were automatically flagged as potential fraudsters. The result was a discriminatory loop that suspended benefits for thousands of people and subjected them to hostile investigations and aggressive subsidy recovery policies, with the consequent economic, family and mental-health damage.
The scandal was such that the entire Dutch government was forced to resign in January 2021. In its report "Xenophobia Machines", Amnesty International explains in detail how this discriminatory system operated and shows how latent discrimination was reinforced solely on the basis of people's origin and nationality.

Algorithms and inequality: are certain social groups being left behind?

[Image: In Serbia, a system automates the decision on who can access social assistance. © Pexels, Godfrey Atima]

2. Poverty and discrimination in Serbia: how automating decisions can discriminate against vulnerable groups

To receive social assistance in Serbia, your income must fall below an extremely low threshold, one that sits even below the absolute poverty line (about €106 per month).