Decode

Smile, You're On Camera: The AI Surveillance At Ram Mandir In Ayodhya

Speaking to Decode, Atul Rai, co-founder and CEO of Staqu Technologies, the company behind the AI-powered security system in Ayodhya, said that the company’s JARVIS platform is installed on every camera set up at multiple locations in and around the Ayodhya Ram Mandir.

By Kaisar Andrabi and Hera Rizwan

21 Jan 2024 12:08 PM IST

Days before Prime Minister Narendra Modi is set to headline the 'pran pratishtha', or consecration ceremony, of the Ram Temple in Ayodhya in a mega spectacle on January 22, the echoes of "Jai Shri Ram" (Victory to Lord Rama) resonate throughout Ayodhya. Devotees from all corners of India are flocking to the town. The pathways leading to the entrance of the Ram Mandir are strewn with construction material and dust, and the sound of machines echoes through the complex.

The city has been fortified with a high-security surveillance system for the inauguration.

Prakash Yadav, 49, from the Naya Ghat area, is thrilled to witness the swift completion of the footpaths in the main market of Hanuman Garhi in Ayodhya city, which leads to one side of the entrance to the Ram Mandir. A week ago, three CCTV cameras were installed at the corner of his sweet shop, situated on the bend of the path, to monitor movements from three different angles for security reasons. “I was informed by the installation team to take responsibility in case of any damage,” he told Decode.

CCTV cameras will cover key locations across the city, including Kanak Bhawan, Hanuman Garhi, Shri Nageshwar Nath Mandir, Ram Ki Paidi, and Ram Janmabhoomi.

Meanwhile, the Uttar Pradesh Police and Special Task Force have partnered with technology firms to enhance security in the pilgrim city.

Staqu Technologies, an artificial intelligence startup headquartered in Gurgaon, has been tasked with deploying its AI-driven audio-video analytics software, JARVIS, on the existing cameras in Ayodhya to improve overall security.

The software is designed to analyse video footage in real time, detect suspicious activity and potential threats, and promptly alert authorities. The security arrangements are being put in place as the city anticipates an influx of thousands of devotees on January 22.

High-tech equipment and infrastructure, valued at Rs 90 crore, will be deployed to ensure round-the-clock and foolproof security for the Ram Mandir. The Director General of Law and Order, Prashant Kumar, told Decode that these measures are designed to withstand attacks and prevent intrusions, in order to safeguard the temple.

Other security equipment, such as crash-rated bollards designed to protect high-profile structures from coordinated vehicular attacks, an under-vehicle scanner for inspecting vehicles on the Janmabhoomi Path, and strategically positioned boom barriers, will be stationed around the Mandir area.

Apart from surveillance, the Department of Telecommunications (DoT) is leveraging AI and machine learning (ML) to ensure a seamless connectivity experience for telecom operators during the event.

The AI-Powered Surveillance

Speaking to Decode, Atul Rai, co-founder and CEO of Staqu Technologies, the company behind the AI-powered security system in Ayodhya, said that the company’s JARVIS platform is installed on every camera set up at multiple locations in and around the Ayodhya Ram Mandir, and that it will use facial recognition technology (FRT).

“The platform is also synced with the UP police department’s database of more than 8,00,000 criminals. If a suspect visits the event, the cameras will detect and match the person with the database in real-time, and alert the authorities,” he explained.

Rai explained that even if a visitor is not registered in the criminal database, the JARVIS platform can detect the suspect through reverse facial recognition. “All JARVIS needs is an uploaded photograph of the person, and the cameras will search for the suspect,” he told Decode.
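At its core, this kind of watchlist matching is an embedding comparison: a face detected in a camera frame is converted into a numeric vector and compared against vectors enrolled from the database (or, in the reverse-search case Rai describes, from an uploaded photograph). The sketch below is purely illustrative and is not Staqu's JARVIS code; the function names, the cosine-similarity metric, and the 0.6 threshold are assumptions.

```python
# Illustrative sketch of watchlist matching with face embeddings.
# Not Staqu's JARVIS implementation; names and threshold are assumed.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(probe: np.ndarray,
                            watchlist: dict[str, np.ndarray],
                            threshold: float = 0.6) -> list[tuple[str, float]]:
    """Return watchlist entries whose embeddings exceed the similarity threshold."""
    hits = [(person_id, cosine_similarity(probe, enrolled))
            for person_id, enrolled in watchlist.items()]
    return sorted([h for h in hits if h[1] >= threshold],
                  key=lambda h: h[1], reverse=True)
```

The threshold is the crucial knob: set it too low and the system produces false matches, set it too high and it misses genuine ones, which is the misidentification trade-off the experts quoted later in this piece describe.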

The cameras are also equipped with Automatic Number Plate Recognition (ANPR) capabilities, allowing them to utilise the government's vehicle registration database, including e-vahan Parivahan, to detect vehicles sporting counterfeit license plates.
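Conceptually, catching a counterfeit plate is a lookup-and-consistency check: the plate string read by ANPR is looked up in the registration data, and mismatches between what the camera observes and what the registry records are flagged. The sketch below is a hypothetical illustration under an assumed schema; it does not reflect the actual structure of the e-vahan Parivahan integration.

```python
# Hypothetical sketch of flagging an ANPR-read plate against a registry.
# The registry schema (vehicle_class, status fields) is assumed for illustration.
from typing import Optional

def flag_suspect_plate(read_plate: str,
                       observed_class: str,
                       registry: dict[str, dict]) -> Optional[str]:
    """Return a reason string if the plate looks suspicious, else None."""
    plate = read_plate.replace(" ", "").upper()
    record = registry.get(plate)
    if record is None:
        return "plate not found in registration database"
    if record.get("vehicle_class") != observed_class:
        return "registered vehicle class does not match the observed vehicle"
    if record.get("status") in {"stolen", "blacklisted"}:
        return f"flagged vehicle status: {record['status']}"
    return None
```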

Furthermore, the technology enables surveillance cameras to conduct attribute-based searches. This includes tasks like pinpointing an individual in a crowd based on specific clothing, colour, or style of accessories, as well as identifying features like an accompanying child. The functionality is not limited to humans alone; the software is also capable of searching for a particular vehicle with distinctive signage or markings.
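In practice, an attribute-based search of this sort amounts to filtering detections by tags that an upstream detector has already assigned to each sighting. The sketch below assumes a hypothetical detection record (camera ID, timestamp, kind, attribute tags) and shows only the query pattern, not the vendor's implementation.

```python
# Illustrative attribute-based search over tagged detections.
# The Detection schema and attribute tags are assumptions, not JARVIS's actual format.
from dataclasses import dataclass

@dataclass
class Detection:
    camera_id: str
    timestamp: float
    kind: str             # e.g. "person" or "vehicle"
    attributes: set[str]  # e.g. {"red_jacket", "child_accompanying"}

def search_by_attributes(detections: list[Detection],
                         required: set[str],
                         kind: str = "person") -> list[Detection]:
    """Return detections of the given kind that carry all requested attribute tags."""
    return [d for d in detections
            if d.kind == kind and required.issubset(d.attributes)]
```

A query such as search_by_attributes(detections, {"red_jacket", "child_accompanying"}) would then return only the sightings matching both tags.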

This, however, is not the firm’s first stint with the UP government. Earlier, Staqu Technologies had installed its video analytics solution 'Video Wall' in all 71 prisons of the state to monitor intrusions, sudden crowding, violence and frisking, among other things.

Apart from this, the firm also used its AI capabilities to analyse CCTV footage from counting booths during the Bihar panchayat elections of 2021 and 2022. The aim, the company says, was to ensure that every vote in the electronic voting machines (EVMs) was counted without error or manipulation.

Growing Presence of Facial Recognition Technology in Public Spaces

The Ram Mandir inauguration is not the first event to witness AI-powered surveillance. The use of FRT for policing and security has been on an upward trend in the recent past.

In the aftermath of the Delhi riots in 2020, the Delhi Police said that of the 1,800 arrests, 137 were facilitated through FRT. They further stated that the apprehension of the accused relied primarily on CCTV footage and publicly available videos, with FRT playing a pivotal role in the identification process. Meanwhile, the Chennai Police employed CCTV cameras integrated with FRT to instantly recognise individuals with a criminal record in crowded locations, particularly during festivals. Similarly, the Punjab Police utilises CCTV footage with FRT for ongoing investigations and real-time monitoring of various areas.

While hailed for its potential to enhance public safety, the increasing reliance on facial recognition has sparked a myriad of concerns from privacy advocates, civil rights activists, and concerned citizens. As the lens of surveillance sharpens, questions arise about the ethical implications, potential biases, and the overarching impact of this technology on individual freedoms. 

The Railways Department has also recently announced that it is implementing a facial recognition system (FRS) at major stations in the East Central Railway to enhance security. This move aims to link the FRS data with existing criminal activity databases to prevent crimes.

Tech lawyer Sharanya Mukherjee told Decode that this "could potentially lead to mass profiling and surveillance, posing risks of discrimination against certain communities". 

Talking about the legal recourse, she further added, "India's Digital Personal Data Protection Act 2023 contains several exceptions that allow government entities to handle individuals' personal data without adhering to necessary safeguards, like obtaining consent or informing the individuals whose data is being processed."

Additionally, Section 17(1)(c) provides exemptions to entities processing personal data for crime prevention, detection, investigation, or prosecution. Thus, according to Mukherjee, the DPDP Act enables the government to gather people's facial data for crime prevention without their consent or awareness, with inadequate clarity regarding database security measures.

Misidentification and Bias Concerns

Decode spoke to Conor Healy of Internet Protocol Video Market, a security and surveillance industry research group based in the US, who explained how misidentification and false positives have been a common problem with FRT. Citing a shortcoming, he said, "In order for facial recognition to perform well, it usually needs a direct and clear view of a face, good lighting, and no obstructions like hats, sunglasses, or masks. If they prevent a direct view of someone's face, then yes, crowds can be a problem too."

Therefore, according to Healy, FRT cannot be relied upon as the beginning and the end of any investigation. He said, "Law enforcement cannot rely on facial recognition alone in order to identify someone. When they do so, and a person is misidentified, it can lead to unnecessary police encounters, or even false arrests, which is bad for both police and the public as it degrades public trust."

Healy added that FRT usually does not work as well on individuals with darker skin. "Often this is an artifact of what materials are used to train it. The technology has improved in recent years, and in the long run this will be less of a problem, but it is still a present concern," he said.

Furthermore, like any AI system, FRT performs only as well as the data it is trained on. According to Mukherjee, if the training data consists primarily of faces from a certain ethnicity or gender, the algorithm may perform poorly when attempting to recognise faces from underrepresented groups, such as ethnic minorities in India.

She added, "This can lead to misidentification or underrepresentation of certain communities. The algorithms themselves can be biased, often reflecting the prejudices of those who design and train them." Stressing the ramifications of bias within the algorithms, Mukherjee said, "It is bound to have a chilling effect on free speech and assembly. People may become hesitant to participate in protests or public events out of fear that law enforcement will misidentify and target them."

Mishi Chowdary, a digital lawyer and online civil liberties activist, told Decode that there’s a general tendency to believe that CCTV surveillance leads to better security.

The lawyer explained that any large-scale event carries several security and other risks, some of which can be addressed by exercising greater control over the movement of personnel. The footage can also serve as potential evidence for investigations.

“None of this guarantees safety or security as human intervention and enforcement is the key,” she said, questioning how all the data collected will be stored and for how long it will be kept. “We must note that if a crime is in progress, surveillance cannot prevent or stop that crime. They may be able to accelerate emergency response if the correct personnel act in time,” said Chowdary.

She explained that in such a situation, the expectation should be of deletion of data within 90 days of the event. “I have no idea if they (authorities) have any cyber security measures in place and how they handle data. Large scale collection of data and subjecting everyone to facial recognition without data minimization and deletion after the activity raises concerns,” she told Decode in a chat.

The lawyer pointed out that the primary justification for deploying such tools is to provide safeguards. “At present we only concern ourselves with security and not rights,” she said.

Back in Ayodhya, some visitors are concerned.

“I get the protocol, and if the recordings are in safe hands, like within the police, that is trustworthy. But, if they get compromised or hacked by cybercriminals, it's a real worry. In this age of artificial intelligence, anything can be twisted through filming, inviting shame or other cybercrimes,” said Akansha Tiwari, a college student from Bihar, who visited the Ram Mandir with her parents.

Standing next to her, Gyatri Devi, her aunt and a private school teacher, added, “Misusing technology can make anything fake seem real. By the time someone proves their innocence, the damage is already done for the individual. Authorities need to consider all these factors and take necessary precautions.”
