Is India moving towards a mass surveillance state?
The dystopia of mass surveillance in George Orwell’s 1984 seems closer than ever, largely because of advances in artificial intelligence technologies such as facial recognition technology, or FRT.
Far from being technologies of the future, FRT systems are already in place and are being used by governments across the world for ‘security’ purposes. China, with little tolerance for democratic dissent, has installed the largest centralised FRT system in the world.
There are over 200 million closed-circuit television cameras or CCTVs in China from which data can be collected and analysed. There are also 20 million specialised FRT cameras that collect data continuously for analysis.
China is currently using these systems to surveil the ethnic Uyghur minority in the re-education camps it has set up in the Xinjiang region, where continuous surveillance is used to manipulate detainees’ behaviour. China also used FRT systems to profile protestors during the pro-democracy protests in Hong Kong.
These steps have raised worldwide concerns about cultural bias that can become inherent in these systems. It has also put into question a person’s right to freedom of expression, privacy and basic dignity.
But how does an FRT system work? It can be as simple as the face identification in a smartphone which works as a password and verifies a user’s face to unlock the phone. Other everyday examples are the myriad face modification filters that are a part of apps like FaceApp, Instagram and Snapchat.
As the purpose of an FRT system grows more complex, so do the complexity and intelligence of the analysing system. The most prevalent FRT models create a mathematical representation of a person’s face, which is then compared against the representations stored in an existing database.
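The comparison step can be sketched in a few lines of Python. This is a minimal illustration, not any government system’s actual code: the random vectors stand in for the face embeddings a real neural network would produce, and the 0.6 similarity threshold is an arbitrary assumption for the example.

```python
import numpy as np

# Hypothetical 128-dimensional face representations ("embeddings").
# Real FRT systems derive these from a neural network; random vectors
# are used here purely as stand-ins.
rng = np.random.default_rng(seed=42)
database = {
    "person_a": rng.normal(size=128),
    "person_b": rng.normal(size=128),
}

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: 1.0 means identical direction.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe, database, threshold=0.6):
    """Return the best-matching identity, or None if no match clears the threshold."""
    best_name, best_score = None, -1.0
    for name, embedding in database.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None

# A probe close to a stored representation should match that identity;
# an unrelated face should fall below the threshold and return None.
probe = database["person_a"] + rng.normal(scale=0.05, size=128)
print(identify(probe, database))
```

The threshold is where the accuracy problems discussed below enter: set it too low and unrelated faces are misidentified as matches; set it too high and genuine matches are missed.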
Where does India stand?
India is not far behind. It is building one of the largest centralised FRT surveillance systems in the world, called the National Automated Facial Recognition System (AFRS).
In 2019, the National Crime Records Bureau under the Union Ministry of Home Affairs issued a request for proposal (RFP) for companies which would be interested in creating the AFRS.
This RFP has undergone multiple changes in the last year owing to pushback from civil society organisations, mainly because there is no law governing facial recognition systems in the country and the current systems are widely inaccurate.
The deadline has also been extended many times, twice in the last month alone; it now stands at September 17. Currently, FRT systems are being used by police forces in seven Indian states, but these are basic and rudimentary, unlike what is being deployed in China.
FRT systems are also being used for identification by civil departments across the country. For instance, the Union Ministry of Civil Aviation has started trials for the Digi Yatra programme at some airports, including Hyderabad (July 2018), Delhi (September 2018) and Bengaluru (December 2018).
The programme allowed passengers to check in using facial recognition as their boarding pass, making the process paperless. But the major problem with FRT systems in India is that, being so rudimentary, they are inaccurate, which could lead to misidentification and, in turn, false accusations and arrests.
In 2018, the Delhi Police reported that their trial FRT system had an accuracy of only two per cent, a figure admitted in an affidavit filed before the Delhi High Court.
What is worse is that in 2019, the Union Ministry of Women and Child Development reported that the accuracy of the current FRT systems in India was only one per cent. The ministry reported that it could not even distinguish between boys and girls.
Despite the poor track record, India continued to use FRT in 2020. According to Union home minister Amit Shah, over 1,900 faces have been identified through facial recognition software for inciting violence in the national capital during the Delhi riots at the end of February. Shah said driving licence and voter ID card information was used for carrying out the identification.
Even if an FRT system is accurately implemented, there will be function creep.
Function creep occurs when a technology or system gradually widens its scope from its original purpose to encompass and fulfil wider functions. For instance, the Delhi Police used FRT to track down people present at the protests against the Citizenship Amendment Act and the National Register of Citizens, which occurred from December 2019 to March 2020.
In the absence of a national or even state policy governing artificial intelligence systems like FRT, function creep raises grave concerns about people’s right to freedom of expression and their right to privacy.