Space to be and to become: Privacy as the foundation for growth
- 25 October 2023
As stipulated in Article 20 of the Charter of Fundamental Rights of the European Union (EU) (‘the Charter’), all individuals are equal and stand equal before the law. Yet, equality does not entail homogeneity. We perceive, communicate, and love in countless ways. We carry within us unique family histories, navigate varying power structures, and bear the historical and social significance of our race and gender from the day we are born. And we constantly brace ourselves for political interpretations of how we express ourselves, what we stand for, and what we believe in.
We are all equal, but not identical
Therefore, we require space. Space to explore, to falter, to make up our minds and change them again, to persist and to persevere. Space to understand who we were, who we are, and who we aspire to be. Space to get to know our inner selves, without the interference, patronisation or belittlement of others. And space to live our truth, even if that truth is not popular, loved or even understood. This space is the lifeblood of our personal development and of a free and open society. It is granted, in part, by the right to privacy, as stipulated in the Charter: ‘Everyone has the right to respect for his or her private and family life, home and communications’. The right to privacy is inextricably linked to other human rights. For instance, the processing of data pertaining to gender, religion or ethnicity could infringe upon an individual’s rights to equal treatment or freedom of religious expression. Consequently, the right to privacy acts as a gatekeeper for other human rights. This essay delves into several recent cases in the Netherlands where infringement of the right to privacy has precipitated violations of other human rights.
We require space. Space to explore, to falter, to make up our minds and change them again, to persist and to persevere. Space to understand who we were, who we are, and who we aspire to be.
Identities reduced to stereotypes: Data processing and the dangers of stereotypical reductionism
With the advent of seemingly limitless technological capabilities in data processing, the right to privacy is under considerable strain. Data containing personal information is being processed, analysed and interpreted on an unprecedented scale. Insights and conclusions drawn from such data are disseminated and exchanged. The multifaceted aspects that construct identities are simplified, categorised and packaged into predetermined profiles, a process that reduces complex identities to mere stereotypes. Indeed, profiling might simplify the prediction of an individual’s receptiveness to advertising strategies or the success of medical treatments. But how appropriate is it to estimate, based on profiling, the political persuasions of a swing voter, the likelihood of someone’s involvement in ‘suspicious transactions’, or who might commit benefit fraud? Automated decision-making based on profiling was heralded with the promise of simplicity, efficiency and efficacy. Unfortunately, the opposite appears to be true, and the consequences are becoming increasingly apparent, as we observe various fundamental rights being undermined due to breaches of the right to privacy.
The multifaceted aspects that construct identities are simplified, categorised and packaged into predetermined profiles, a process that reduces complex identities to mere stereotypes.
Compounded discrimination: The cumulative impact of bias
A striking example of the infringement of privacy leading to other human rights violations can be found in the realm of digital welfare. The Dutch digital welfare system gave rise to the now notorious childcare benefits scandal. In 2018, it came to light that many parents who received childcare benefits from the Dutch Tax Administration had been wrongly identified as committing fraud. This incorrect classification forced parents to repay thousands of euros to the state, plunging many into severe financial distress, homelessness, and acute stress and poverty. In an alarming revelation, as many as 2090 children of affected parents were taken into care between 2015 and 2022. Of the affected parents, a disproportionate number were found to be from immigrant backgrounds. This was corroborated by subsequent investigations, which revealed that the Tax Administration was processing data related to parents’ nationalities. The Dutch Data Protection Authority confirmed this claim and identified three illicit processing operations. First, dual nationalities were being processed. Second, nationality data was employed as an indicator for the risk classification model. Third, this model was being utilised to detect organised fraud. It was concluded that there had been discrimination based on nationality. Amnesty International’s further investigation discovered that discrimination was perpetrated not only on the grounds of nationality but also ethnicity. Indeed, the use of nationality data facilitated the discriminatory targeting of ethnic minorities by the risk classification model. As Amnesty highlighted in its report, ‘ethnic profiling violates the prohibition of discrimination. It leads to the criminalisation of certain groups of people and it reinforces historical stereotypical associations between fraud and ethnicity’.
Additionally, the investigation revealed that people who received higher benefits were more likely to be labelled as committing fraud. This led to individuals from low-income households being disproportionately affected owing to their greater dependence on benefits and the larger sums they received. Moreover, lower-income parents found it more difficult to repay the vast sums demanded by the Dutch government. Amnesty’s assessment classified the situation as intersectional discrimination, as those most affected were typically ethnic minority groups who are generally more likely to have low incomes. Compounding the issue, it later emerged that religious profiling was also occurring, with individuals who had made donations to mosques being deemed higher risk.
The childcare benefits scandal starkly illustrates the intersectional nature of discrimination, where multiple axes of bias reinforce one another. This is not a new phenomenon. As early as 1989, Crenshaw highlighted how intersecting political and social layers within our identities can render us more vulnerable to discrimination or privilege. Factors such as gender, ethnicity, class, sexual orientation, religion, weight and disability can all affect one’s position within the spectrum of power in society. The intersection of multiple factors can either consolidate a position of privilege or expose an individual to compounded discrimination. This phenomenon is exemplified in the childcare benefits scandal. For example, individuals who were Muslim, had a lower income and possessed at least one non-Dutch ethnicity were penalised three times over based solely on these classifications, with no consideration of their individual circumstances.
Factors such as gender, ethnicity, class, sexual orientation, religion, weight and disability can all affect one’s position within the spectrum of power in society. The intersection of multiple factors can either consolidate a position of privilege or expose an individual to compounded discrimination. This phenomenon is exemplified in the childcare benefits scandal.
Criminalisation prior to transgression: The cost of presumptions
These distinct forms of discrimination arise when data aggregators, such as the Tax Administration of the Dutch government in our earlier example, construct profiles based on collected data. While individuals may suspect that their data could be used to make assumptions about them, in this instance, these assumptions were used to predict how likely an individual was to commit social security fraud. Moreover, the propensity to suspect individuals even before (or despite the absence of) any rule-breaking suggests that the assumed behaviour of a person’s assigned group is paramount in establishing their risk profile. It is critical to note that the Dutch government justified its implementation of algorithmic profiling on the grounds that it would contribute to effective fraud detection and prevention, thereby serving the public interest. This raises the question of whether an argument of public interest should supersede the rights and interests of individual citizens.
In 2020, The Hague District Court contested this notion, ruling that the pursuit of preventing and combating fraud for the sake of economic welfare must be balanced against intrusions on individuals’ private lives. The Court evaluated whether the Systeem Risico Indicatie (SyRI) legislation, which permitted the confluence of diverse data to combat fraud, was in breach of Article 8 of the European Convention on Human Rights. The Court determined that the SyRI legislation failed to meet the ‘fair balance’ standard necessary to justify the violation of the right to privacy for the defence of broader interests. Additionally, the Court noted that the use of SyRI offered little safeguard owing to its lack of transparency and verifiability. As the legislation violated European law, it was deemed unlawful and non-binding. Litigation brought forward by civil society organisations, including the Dutch Jurists Committee for Human Rights and the Platform for Civil Rights, sent a clear message: Intrusions into individuals’ private lives, particularly through the collection, combination and sharing of personal data, must be judiciously scrutinised by courts and governments.
The right to privacy: The sentinel of human rights
The implications of large-scale data aggregation and processing are substantial. After all, individuals who are classified as high-risk must face the repercussions, whether they know about the classification or not. The childcare benefits scandal exemplifies the human cost of such violations. Moreover, while benefits were abruptly discontinued and demands for repayment piled up, parents were left without answers as to why they had been labelled as committing fraud, and requests for information were met with heavily redacted files. Even legal protection proved insufficient.
According to the Judicial Council, families were forced into an unequal struggle against a far more powerful government. According to the EU Charter of Fundamental Rights, human dignity is inviolable and must be respected and protected (Article 1), and discrimination is prohibited (Article 21). Individuals are entitled to freedom of thought and belief (Article 10), freedom of expression (Article 11), equality before the law (Article 20), the right to protection and care for children (Article 24), the right to social security if self-provision is unfeasible (Article 34) and the right to legal protection (Article 47). People have the right to be presumed innocent until proven guilty. All these rights safeguarded by the Charter were threatened in the childcare benefits scandal, which began with the unlawful processing of personal data. It is not without reason that one of the central objectives of the General Data Protection Regulation is to protect all fundamental rights and freedoms, particularly (though not exclusively) the right to the protection of personal data. The violation of privacy through the processing of personal data can infringe on other fundamental rights. Thus, we refer to the right to privacy as a ‘gatekeeper’ for other human rights. This emphasises why we must continue to advocate for the right to privacy as a fundamental right – it is crucial to safeguarding an open and free society in which everyone is equal, and in which differences are maintained, respected and valued.