What is the harm in linking digital wallets to India’s unique biometric ID Aadhaar?

None, many people would argue.

But Poornima*, a transgender activist living in a big city in North Eastern India, knows differently.

Among the trans people she works with, many engage in sex work and rely on the popular Indian digital wallet Paytm for payments. Wallets that, until 2017, gave no insight into their working lives.

Enter Aadhaar.

Following a strong policy push by the government, this randomly generated 12-digit digital ID was made mandatory over a few short years for residents to access state benefits and welfare subsidies, and to file taxes. In 2017, linking digital wallets like Paytm with Aadhaar also became compulsory for a while. And sex working communities like Poornima’s were concerned.

Gender is among the data collected when registering for Aadhaar, and the registration form provides three gender options. In some ways, the recognition of more than two genders on official government forms is a wonderful thing. But the fact that this data is collected can also pose a risk. For many trans people, job opportunities are severely restricted, and sex work is often one of the few options available. The authorities know this too.

So as the data on payments is linked to data on gender, Poornima and her colleagues fear that this might end up inadvertently making their identities as sex workers visible.

This is a problem. Under Indian law, both soliciting and living off the income of sex work are crimes. And while many in the trans community are trying to earn a living through sex work, the government believes that what they need is “rehabilitation”, in order to live more “respectable” lives. In this struggle over what is right, data becomes a crucial currency.

For communities that already live at the margins, the consequences of losing control over this data are devastating. In Poornima’s words: “It’s just another way for the state to control people – to control the activities they do and the kinds of expression they engage in.”

What is boundary management?

At the heart of the vulnerability that Poornima draws attention to lies an aspect of the right to privacy that receives little attention but is becoming increasingly important in the digital age: privacy as boundary management.

All of us engage in boundary management in our everyday lives: it happens every time we make a decision about what to reveal – or what not to reveal – about ourselves. Whether we decide to hand over our phone number to department stores. Whether we decide to tell a cute stranger in a bar our name. Whether we decide to share our sexual orientations with our families. Some of these decisions may seem trivial, but collectively, our ability to control them affects our capacity for self-determination.

In the 20th century, well before the internet came into being, BR Ambedkar – one of India’s foremost intellectuals and a contemporary of Gandhi – was already sharply aware of the importance of being able to dynamically control what information about ourselves we share, with whom, and when.

While Gandhi famously called upon his followers to relocate to India’s villages, Ambedkar recommended that the country’s Dalits – those who, in India’s pervasive caste system, have been assigned to the lowest rung – move to cities instead. The anonymity of the city, Ambedkar argued, would give them a much better chance to escape the rigid requirements of caste; to build better lives for themselves than the close-knit social networks of the village permitted. In villages, your caste will always already be known. In the cities, you can still retain some measure of control over whether or not to reveal this information.

‘Edinburgh Royalty of the City expansion (1685–1885)’. Credit: Data Foundry/National Library of Scotland; 2020 (CC-BY 4.0)

Ambedkar recognised that boundary management is not only important because it allows us to control what we share with others; he also understood that this control is crucial to living a life of dignity.

As human beings, we are deeply shaped by our social worlds. But if we are to live with dignity, the social cannot be permitted to overwhelm our sense of identity – no matter how much we value our social bonds. Especially when it comes to aspects of our identities that are marginalised, it is crucial to have the space to take a step back from social environments and develop a critical perspective about the world around us. To enjoy breathing room to validate our own experiences, beliefs, feelings and desires – even, or perhaps especially, when they do not align with what society tells us. Boundary management is what allows us to create that room.

Traditional understandings of privacy often fail to accommodate such concerns. When we define privacy in terms of physical spaces, such as our homes, we see privacy as a static, binary state: something we require, now and forever. But boundary management is always dynamic, because it is always contextual. When we make decisions about what to share with whom, these are not fixed conditions with sharply delineated boundaries. Instead, they constantly adapt to new situations and changing relationships.

You might not give that cutie at the bar your phone number in the first few minutes of your interaction, but perhaps as the conversation progresses, you will. You may not tell your parents about your sexual orientation the moment you realise you might be lesbian, gay or bi, but perhaps as you feel increasingly comfortable with that aspect of your identity, you long to share it with them. In both cases, just because you have shared this information with particular people doesn’t mean that you want to immediately share it with the world.

And so here is the big challenge for privacy in the digital age: in their quest to make all of us increasingly legible, transparent and predictable, governments and private actors are fundamentally undermining our capacity to engage in the autonomous management of our bodies, selves and lives as we see fit.

Bodies as data

Why are the fears of Poornima and the people she works with not heard more loudly in public discourse? An important part of the answer lies in how we have been told to look at data.

As technology ensures that data is created about more aspects of our lives, governments, big tech companies and start-ups alike have gone into overdrive to collect that data – simply because it exists. If it doesn’t yet have value today, they argue, perhaps tomorrow it will. Because data, the argument goes, is the ultimate solution to, well, everything. So much so that business models, governance plans and visions for development are all increasingly formulated around it.

Implicit or explicit in many of these developments is the assumption that information yielded by the datafication of our lives is somehow more objective and accurate than anything that has come before. This becomes possible because data is effectively portrayed as a separate layer that somehow penetrates everything and yet exists independently of the medium in which it is contained.

When data is put forward as a resource to be mined, it is not just any resource: when data is excavated, it is believed to somehow yield ultimate truth. In other words, data has effectively emerged as the 21st century’s oracle.

This pertains to our bodies too. Take Aadhaar. When registering for Aadhaar, people are required to share all 10 fingerprints and scans of both their irises, in addition to their name, date of birth, gender, address and a facial photograph.

Because Aadhaar is supposed to be merely a number, not a physical ID, it’s often been heralded for its lack of materiality. But ironically, in its strong reliance on biometric information, it is actually grounded in “intensely material acts”. After all, it’s precisely by linking biometric data to demographic data, such as your gender and date of birth, that it supposedly becomes possible to establish who you really are.

‘Distorted Scans 2.0’; 2013. Credit: Christopher Frank Beitz/Tumblr

As several researchers have pointed out, the biometric data yielded by the excavation of bodies is the linchpin of Aadhaar: it’s treated as indisputable and foundational, as somehow preceding all social, political, and economic forms of identity. Your body’s data, and only that data, will tell us who this body really is.

Reports continue to pour in of people in India being denied rations because their fingerprint authentication fails – in some cases with starvation as a consequence. This is why: our systems are now set up in such a way that the data of our bodies is privileged over our living, breathing bodies in telling others who we are. Our agency in representing ourselves has become irrelevant.

But the truth is that neither bodies nor data exist outside of the social world – and so neither do bodies-as-data.

Projecting biases

In 2011, the Karnataka Legislature amended the state’s Police Act to include a section that specifically targeted trans people. Until this reference to the trans community was deleted from the provision in 2017, Section 36A allowed the police to maintain a register of the names and addresses of trans people who could be “reasonably suspected” of “undesirable activities”. With one brushstroke, the law singled out trans people as potential criminals, simply on the basis of their gender identity.

The roots of such practices go back to colonial times, when British rulers in India created databases to surveil citizens. These entries were based simply on suspicions regarding people’s “characters”. Thus, many nomadic people whose occupations, customs and ancestry did not fit British conceptions of “civilised people” ended up in databases labelled “criminal tribes”. Others who were considered “expendable, undesirable and not worth protecting” found themselves on lists of “goondas” and “bad characters”.

Even though the colonial law that created these “criminal tribes” was repealed following India’s independence, the surveillance of particular groups based solely on vague suspicions continues today.

Already before the 2011 amendment, the Karnataka Police Manual allowed for a register of “goondas”, defined to include “a hooligan, rough, vagabond or any person who is dangerous to the public peace or tranquility”. That latter category is, of course, broad enough to cover more or less anyone who doesn’t follow dominant social norms.

If the people Poornima works with are concerned about their gender being revealed, they have good reason. Since bodies don’t exist outside their social contexts, we generally don’t look at or treat all bodies in the same way. Even if it shouldn’t be so, in practice it all too often matters whether others read our bodies as female, male, trans, or gender non-conforming; poor or rich; dark-skinned or light-skinned. Our bodies are canvases onto which societies’ biases are projected.

Biometric data yielded by the excavation of bodies is the linchpin of Aadhaar. Credit: Saumya Khandelwal/Reuters

But if the production and interpretation of bodies is to an important extent a social issue, that is true for data as well – with important consequences.

Take, for example, the Indian police’s reported move towards predictive policing. The assumption is that this will make policing more effective and lead to a better allocation of resources – make it more “rational”, if you will. But how is “rational” defined in this context? Much depends on the design of the algorithms used to guide these efforts, and on the data that is fed into them.

How, for example, will trans people in Karnataka fare under this new regime if they are already targeted as potential criminals under constant surveillance? Given the existence of provisions such as Section 36A, they are likely to be over-represented in what is supposedly “crime-related” data – the “crime”, though, is simply their gender.

Conversely, can you imagine how different police data sets would look if the Karnataka law singled out potential perpetrators of white collar crimes in the same way that it did trans people?

If one of these data sets exists and the other does not, that itself is a reflection of social norms and beliefs that are embedded in the law. Rather than any “rationality”, it’s the law’s bias that drives the generation of data. And if such biases are not corrected before data is fed into the algorithm, predictive policing in India will further deepen the inequalities that trans people and other targeted communities face – by directing further police attention their way.
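To make that feedback loop concrete, here is a deliberately simplified sketch with invented numbers – it models no real police system. Two groups offend at exactly the same rate, but one starts out over-policed; because offences only enter the data where officers are already looking, the records keep “confirming” the initial allocation, and an algorithm trained on them has no way of discovering that the bias was there from the start.

```python
# Deliberately simplified, with invented numbers: this models no real system.
# Two groups have the SAME underlying offence rate, but one starts out
# over-policed. Offences are only recorded where police are looking, and the
# "predictive" step allocates next year's patrols in proportion to past records.

POPULATION = 10_000
TRUE_OFFENCE_RATE = {"group_a": 0.05, "group_b": 0.05}  # identical by design

# group_b starts out over-policed, e.g. because a provision like Section 36A
# already singles its members out for surveillance.
patrol_share = {"group_a": 0.3, "group_b": 0.7}
recorded = {"group_a": 0.0, "group_b": 0.0}

for year in range(10):
    for group, share in patrol_share.items():
        # Only offences that happen under police observation enter the data.
        recorded[group] += TRUE_OFFENCE_RATE[group] * share * POPULATION

    # The "predictive" step: patrols follow the recorded data.
    total = sum(recorded.values())
    patrol_share = {g: recorded[g] / total for g in recorded}

print(recorded)      # group_b accumulates far more recorded "crime"...
print(patrol_share)  # ...so it keeps receiving ~70% of patrols, even though
                     # the two groups were identical to begin with.
```

The skew never corrects itself: the only evidence the system ever sees is evidence generated by looking where it was already looking.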

Data is social because decisions about what to include and what to ignore at the design level, what to pay attention to and what to disregard during collection and analysis, always involve processes of interpretation – and those are frequently infused with biases.

And so when our bodies and their actions become datafied, this doesn’t expose all of us equally, because not all of us are equally vulnerable. For those who in some way don’t fit the norm, the subsequent loss of control over what they share with others is potentially far more harmful than for those who do more or less conform.

Fear and stigma

In December 2016, 30-year-old Priya, an HIV-positive sex worker from Secunderabad, was asked to submit her Aadhaar number at the centre where she goes every month to collect her life-saving anti-retroviral therapy (ART) drugs. Worried that the linkage with Aadhaar would lead to this sensitive health information being leaked, Priya stopped taking her medication that month. Priya’s husband and children do not know that she is HIV-positive or a sex worker.

Since Aadhaar is a required data point in a wide range of governmental and commercial databases, linking those databases, with Aadhaar as the connector, has become much easier. And this consolidation of data that was previously scattered is a cause for concern.
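A minimal sketch may make clearer why this consolidation matters. The records and field names below are entirely invented – this is not any actual government schema – but the structural point stands: once a single unique number appears in every database, pulling together what were previously separate slices of a person’s life becomes trivial.

```python
# Illustrative only: invented records and field names, not any real database.
# The structural point: once every data set carries the same unique ID,
# consolidating a person's scattered records is a trivial operation.

payments = {"1234-5678-9012": {"wallet": "Paytm", "monthly_inflow": 42_000}}
id_registry = {"1234-5678-9012": {"name": "X", "gender": "transgender"}}
health = {"1234-5678-9012": {"art_centre": "Clinic A"}}


def consolidated_profile(aadhaar_no: str) -> dict:
    """Merge whatever each database knows about one Aadhaar number."""
    profile: dict = {}
    for db in (payments, id_registry, health):
        profile.update(db.get(aadhaar_no, {}))
    return profile


print(consolidated_profile("1234-5678-9012"))
# {'wallet': 'Paytm', 'monthly_inflow': 42000, 'name': 'X',
#  'gender': 'transgender', 'art_centre': 'Clinic A'}
```

None of the individual databases needs to hold a “sex worker” label for the combined profile – a wallet’s payment pattern next to a gender marker next to a clinic record – to become revealing.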

In 2017, a year after India’s National AIDS Control Organisation urged state AIDS Control Societies to link the provision of ART with Aadhaar, fewer than half of all HIV-positive people accessing ART services in Karnataka had complied. Many feared that doing so would increase their exposure to potential leaks, and thus to stigma.

Similarly, in April 2019, reports emerged that Karnataka’s Department of Health and Family Welfare would soon begin tracking every pregnancy in the state using an Aadhaar-linked unique ID number. A year earlier, two different proposals had already suggested that women’s pregnancies be tracked using Aadhaar so as to monitor and curb the practice of gender-biased sex selection, which is outlawed in India.

But as Ramya Chandrashekhar points out, where such a “digital trail of choices exercised about one’s body is created under State vision, [it] hampers women’s autonomy to make decisions related to their bodies and life”.

In the absence of strong data protection policies, how will such tracking and the possibility of data being non-consensually shared impact women? Especially those who do not want their families to know they are having an abortion – for fear of being stigmatised or bringing disrepute, or simply because their families might disagree with their decision? It is quite likely that many of these women will look for options outside of the formal health system in order to avoid their actions being tracked, thus putting their bodies at risk.

Their fears are not unfounded. In the past two years alone, leaks of reproductive health related data belonging to millions of Indian women have been reported from both North and South India. In the latter case, the data was linked to Aadhaar.

While normalising increased surveillance of the bodies, decisions and data of marginalised people, such measures also reduce our control over who gets to collect data about sensitive aspects of our lives, and who gets access to it.

State-sanctioned voyeurism

In Bangalore’s garment factories, more than three quarters of the labour force consists of women – and sexual harassment is pervasive. Given this, women’s safety has become a major justification for the use of CCTV cameras in these factories. But when researcher Nayantara R looked into the actual impact of such cameras, she couldn’t find a single instance in which CCTV footage had been successfully used to seek redressal against harassment.

Instead, Usha, a worker in one of the factories, highlighted how the cameras had had the opposite effect: because the cameras do not record sound, a complaint about repeated sexual harassment by a supervisor who would say lewd things to a worker on the factory floor was not taken seriously – and was eventually dismissed.

And because CCTV cameras penetrate even the most intimate actions, they de facto create new forms of harassment and avenues for voyeurism, with gendered effects. Workers that Nayantara spoke with also noted that their bathroom visits were monitored, and that even everyday actions such as laughing or scratching inside a sari blouse had become cause for discomfort or fear. As one worker said: “Nowadays I try to ignore any itch”.

Even if women’s safety is the stated motivation, CCTV cameras turn bodies and their movements into data all over the world. This proliferation has enabled new forms of monitoring, often by people whose own privileges produce biases that reproduce, rather than challenge, existing inequalities. At the same time, the pervasiveness of such cameras has made it ever harder for women and gender minorities to control what information about their bodies and movements they share, and with whom – that is, to engage in boundary management.

In 2013, more than 250 video clips, believed to be CCTV footage, of couples sharing intimacies on the platforms of Delhi’s Metro stations or in near-empty trains were reported to have been uploaded to international porn sites. As Ambedkar foresaw, the anonymity of big cities provides some of the few spaces of physical privacy accessible to those who come from conservative backgrounds or live in crowded quarters. The consequences of such data “leaks”, therefore, tend to be particularly severe for marginalised or economically vulnerable women and those belonging to the queer community – people who risk being recognised by someone they know.

So whether you are a woman worker or a passenger on public transport, the current dominant implementations of CCTV cameras do very little to displace the power imbalances at the heart of the violence so many of us face.

Privilege of anonymity

What makes a person look gay, straight, or lesbian? Apparently, there’s an algorithm that can tell. After researchers Michal Kosinski and Yilun Wang trained an algorithm to identify people’s sexual orientations from their facial features, they found that it did a much better job than human beings.

Not surprisingly, the study was highly controversial – and rightly so. What on earth were the researchers thinking when they decided to build a mechanism that could potentially out people on a massive scale, especially when in so many countries queer sexualities continue to be criminalised?

Like human beings, the algorithm wasn’t always accurate in its categorisations. But because of the aura of objectivity and truthfulness that surrounds data, studies like these have the potential to put large numbers of people at tremendous risk.
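Some back-of-the-envelope arithmetic shows how scale turns imperfect accuracy into mass harm. The numbers below are invented for illustration – they are not the figures reported by Wang and Kosinski – but the logic applies to any classifier applied to a whole population in which the trait being “detected” is relatively rare.

```python
# Back-of-the-envelope arithmetic with invented numbers, to show why even a
# seemingly accurate classifier mislabels enormous numbers of people when it
# is applied at population scale to a relatively rare trait.

population = 10_000_000      # people scanned (assumption)
base_rate = 0.05             # share who are actually gay (assumption)
sensitivity = 0.85           # share of gay people correctly flagged (assumption)
false_positive_rate = 0.10   # share of straight people wrongly flagged (assumption)

actually_gay = population * base_rate
flagged_correctly = actually_gay * sensitivity
flagged_wrongly = (population - actually_gay) * false_positive_rate

print(f"correctly flagged: {flagged_correctly:,.0f}")  # 425,000
print(f"wrongly flagged:   {flagged_wrongly:,.0f}")    # 950,000
# Under these assumptions, more than two out of three people the system
# "outs" are in fact straight – and every flag, right or wrong, is a
# potential harm in the hands of whoever controls the list.
```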

Lesbian, gay, bisexual, transgender, queer and intersex persons continue to be discriminated against. Credit: Dibyangshu Sarkar/AFP

At least in urban spaces, our expectation is that when we show our faces in public, this will reveal little about us – except to those with whom we have chosen to share more. This anonymity, or the expectation of privacy in public, is central to the empowerment that cities provide many marginalised people. But it will be increasingly difficult to come by as facial recognition technology spreads. Not only will it be possible to record and leak footage of women kissing and fondling their partners on the Delhi Metro, it will also be possible to identify the women in question more easily than ever before. The protection that anonymity gives us to explore life in unsanctioned ways will no longer exist. And Ambedkar’s advice to India’s Dalits to move to cities will have lost some of its force.

It’s not just a question of anonymity and privacy, though. As Joshua Scannell points out, such research around facial recognition exposes the growing confidence placed in biometrics as a way to know who we really are – and illustrates the dangers that come with it. It is eerily reminiscent of similar trends in the 18th and 19th centuries, when the pseudo-sciences of physiognomy and phrenology claimed that it was possible to determine someone’s personality, and even their moral character, by examining their face or skull.

The most horrible elements of these pseudo-disciplines may have been debunked by science, with a lot of help from civil rights movements, but the effects of such claims have often been long-lived, and devastating. For example, research has shown that the Rwanda genocide of 1994 found its roots in, among other things, the application of precisely such dubious ideas by Belgian and German colonisers, who used them to divide the Rwandan population into Hutus and Tutsis.

The basic premise of their claims – that character can be deduced from physical attributes – has come alive again in works such as that of Kosinski and Wang. The only difference is that its application has now been driven deeper into our bodies and our faces.

And this influences how freely we can live our lives.

Facial recognition

In 2017, reports emerged that the Chennai police had started to equip CCTV cameras in a popular shopping area with facial recognition software. The system had been fed photographs of around 500 former convicts and “wanted criminals”. If one of these people was thought to be hanging around the area – that is, if the software found a match of just 80% between the footage and one of the images in the database – an alert would be sent to the police, who would pick the person up for questioning. Detailed verification would only happen later.
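The internals of the Chennai system are not public, but the generic pattern the report describes is easy to sketch. The code below is an illustrative approximation, not the actual software: faces captured on CCTV are reduced to feature vectors, compared against a watchlist, and any match that crosses a similarity threshold – standing in here for the reported “80% overlap” – raises an alert, with no ground truth, no context and no human judgment until after the person has been picked up.

```python
# Illustrative sketch only: not the Chennai police system, whose internals
# are not public. It shows the generic watchlist pattern described above.

from dataclasses import dataclass
from math import sqrt

ALERT_THRESHOLD = 0.80  # stands in for the reported "80% overlap"


@dataclass
class WatchlistEntry:
    person_id: str
    embedding: list[float]  # feature vector produced by some face model


def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


def check_frame(frame_embedding: list[float],
                watchlist: list[WatchlistEntry]) -> list[str]:
    """Return the IDs of watchlist entries that cross the alert threshold.

    Note what is absent: no ground truth, no context, no human judgment.
    Anyone in a poorly curated watchlist can trigger an alert simply by
    walking through the monitored area.
    """
    return [
        entry.person_id
        for entry in watchlist
        if cosine_similarity(frame_embedding, entry.embedding) >= ALERT_THRESHOLD
    ]
```

Everything that is contested in this article – who ends up on the watchlist, and what happens after an alert – sits outside a snippet like this, which is precisely the problem.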

Who ends up in such databases? The newspaper report didn’t clarify how, or from where, these images of “wanted criminals” had come, but Indian law allows the police to photograph people who have merely been arrested, not convicted, provided a magistrate approves.

With facial recognition software being increasingly integrated into preventive policing efforts across the country, the existence of laws such as Section 36A of the Karnataka Police Act and other provisions that single out specific groups of people for surveillance becomes a further cause for concern. If trans people regularly find themselves the focus of police action simply based on their gender, will the use of facial recognition software mean that everyday acts like loitering in a shopping area can get them detained?

This concern is by no means far-fetched. Lawyer Darshana Mitra explains how, in Bangalore, women whom the police have identified as sex workers are already detained simply for occupying public space in the middle of the day: standing on a walkway at a busy bus stand, eating peanuts, talking to people. “Do you see what they are doing!?” a police officer exclaimed in a conversation with Mitra about video footage he had recorded, implying that the women’s actions amounted to public soliciting. When you are a sex worker, it seems, the simple act of hanging around in a public space is already a crime.

Facial recognition software threatens to further normalise the criminalisation of sex workers, trans people and other groups who are always already under surveillance.

Moreover, news reports indicate that police in the neighbouring state of Kerala have started to use facial recognition technology that draws on social media data to identify protestors they consider “criminals, extremists and trouble makers”. As such initiatives spread, engaging in political protest to object to their treatment might come with increasing risks for those who are already vulnerable. For example, at recent protests against India’s Citizenship Amendment Act and National Register of Citizens, activists expressed fears that the Delhi police was using facial recognition technology “to profile dissenters and members of a particular community”.

The roots of such practices, like those of attempts to read sexual orientation into people’s faces, go back to colonial times. The British in India singled out specific people for surveillance simply on the basis of their physical and other features, without any evidence of wrongdoing on their part. In 21st-century democratic India, it deserves to be asked: has enough really changed?

Shrinking privacy

Putting our bodies firmly back into debates about data in the digital age brings into sharp relief what is at stake here – for everyone.

Our bodies, now datafied, are the embodiment of so many of our privileges and marginalisations; containers for our most intimate secrets. They are reference points for some of the most stringent norms that run, and sometimes ruin, our lives. We would do well to remember that if our bodies do not exist outside of the social, neither does the data they embody.

Credit: HowledAway (CC BY-NC 3.0)

Poornima’s understanding of the purpose of government efforts to connect a wide range of datasets is clear. “Something like linking the Aadhaar to, say, online banking, is telling trans sex workers not to earn their livelihood,” she says. “It’s violating [our] human rights. It’s violating [our] privacy”.

Is this statement stretching things too far? It might appear so at first; after all, it may well be that this outcome is not one that the state ever explicitly intended. Yet, Poornima’s comment is insightful.

For one thing, it reminds us that forced transparency often has a disciplining effect that heightens the vulnerability of marginalised people. It is often in the small spaces that social norms and regulation cannot yet reach that breathing room for boundary management – and the questioning of those norms that often comes with it – can develop. As governments and corporations busy themselves making us as transparent and legible as possible, those spaces are disappearing one by one.

And so with that in mind, it deserves to be asked: should pregnancies and abortions ever be mandatorily tracked, and data on them forcibly collected? Should possible links between facial features and sexual orientation be datafied and fed into an algorithm? Should the Indian police be allowed to integrate its facial recognition-based predictive policing with the Aadhaar database? At least one senior law enforcement official in India has suggested this move, which would effectively make it possible to track every Indian, including those who have never even been accused of a crime. In fact, should the use of facial recognition software, except in the most narrowly defined policing situations, be allowed at all?

At the moment, the argument that more data is always good is often taken at face value. But if we are to continue to have spaces where we can validate our own experiences of what it means to be a body in the world, our ability to engage in boundary management is crucial, online and offline.

Without a continuously developing capacity for self-determination and subjectivity, formulating a critical perspective on society will no longer be possible – and so neither will social change.

*Names changed

This piece was first published on Deep Dives as part of the series Bodies of Evidence.