In 2017, while delivering the landmark Right to Privacy judgement, a Supreme Court of India judge said, "Humans forget, the Internet does not, and does not let humans forget." Justice SK Kaul ruled that the right of an individual to exercise control over personal data and to be able to control their life would also encompass the right to control their existence on the Internet.
The court found the ‘right to be forgotten’ an aspect of the right to privacy. Since then, courts across India have received several petitions seeking the deletion of personal information from search engine results and the Internet.
One such petition, pending in the Delhi High Court, is that of an actor seeking the removal of videos, photos, and articles related to a drunk driving incident he was involved in. The actor claims that the incident, for which he was arrested in 2009, has left a permanent scar on his life, as a result of which he has suffered professional setbacks.
The court is yet to rule on his plea.
The new Digital Personal Data Protection (DPDP) Bill, 2022, published on November 18, 2022, extends the concept of the ‘right to be forgotten’ and renames it the ‘right to correction and erasure of personal data’.
The ‘right to be forgotten’ is the right to have one’s personal information, which has been publicly disclosed, removed from the Internet. The concept has found recognition in the European Union, Turkey, Russia, and Serbia. Courts in England and Spain have also ruled on the subject. In India, though, the idea is still at a nascent stage.
The renamed provision, the ‘right to correction and erasure of personal data’, will allow users to request the erasure or correction of personal data held by a company even if it has not been publicly disclosed. “A Data Principal—an individual to whom data relates—shall have the right to correction and erasure of her personal data, in accordance with the applicable laws and in such manner as may be prescribed,” the new Data Protection Bill reads.
However, experts are concerned that the bill gives search engines like Google and Yahoo a wide exemption from the right to erasure. They say that the unclear definition of ‘public interest’ will allow search engines to continue processing personal data, leading to fears of mass profiling.
‘Right To Be Forgotten’ Or Censored Internet?
The idea behind the provision of such a right is for individuals to have the autonomy to develop their life without being periodically stigmatised for actions committed in the past.
But critics of this right say that if it is made too liberal, all-encompassing, or absolute, it would amount to rewriting history and lead to a more censored Internet.
Globally, there are concerns about the kind of impact such a right would have on the public’s right to know and right to expression.
For instance, countries like the United States of America have very strong freedom of speech protections enshrined in their constitution. A right to be forgotten, which would essentially seek the censorship of information, would conflict with this fundamental right.
The 2022 Data Protection Bill provides that personal data must be erased “unless retention is necessary for a legal purpose”. This means that data can be retained, without being erased, if it is necessary for a legal purpose, say, an ongoing investigation.
This allows the adjudicating officer to balance a person’s right to have data forgotten or erased against the public’s right to obtain information. But experts fear that this discretion may be misused.
Ramya Dronamraju, Associate Litigation Counsel at the Internet Freedom Foundation, an NGO advocating for digital rights and liberties, believes that such a balancing act should ideally protect public information regarding public officials, promote and encourage government transparency, and not allow for claims of erasure in an attempt to withhold information that is in the public interest.
She explains that the public has a right to know if a politician or a high-profile individual was implicated in a criminal case that may not have been widely reported by the media. However, since the government has retained wide powers to decide what information stays public and what is erased, it could potentially disallow the circulation of such critical information.
“The process and the authority who will determine whether such data is necessary or must be erased is unknown. So it is extremely likely that the public interest in such personal data will not be protected,” Dronamraju told BOOM.
Will Google Erase Everything You Ask Them To?
The 2022 Data Protection Bill says that a person can ask for the erasure of their data that is no longer necessary for the purpose for which it was processed. That may sound like a good thing for privacy, but experts have pointed out a lacuna in the law.
Experts point out that the government has done nothing to restrict search engines, like Google and Yahoo, from processing personal data that is publicly available. It has instead given them a blanket exemption under ‘public interest’. This means that platforms like Google and Yahoo would be able to process your data in the public interest, even if you ask them to erase it.
Raman Chima, Policy Director at Access Now, a digital rights advocacy group, says that, unlike the 2019 Bill, the 2022 Bill has removed all safeguards against the usage of personal data by search engines.
“It's important to note that while the government has not proposed a right to be forgotten for search engines, it has gone ahead on an even more dangerous path of saying that usage of personal data for search engines will be covered by the ‘public interest’ exception. In previous drafts of the bill, at least an effort was made to say that such uses may be permitted subject to further regulations,” Chima told BOOM. “The current draft bill removes all safeguards,” he added.
Experts believe an entire sector (search engines) getting a blanket exemption to process personal data is “very dangerous”.
For example, suppose people were enrolled in a government pilot project that required the government to deposit a certain amount of money in their digital wallets. The government would have to collect a host of personally identifiable data, including age, home address, and bank account details. If this database finds its way onto an online platform, say a government website, search engines could then process that data because of the blanket public interest exemption given to them.
“Now if such persons have a concern about their personally identifiable data being available online, they would have no option but to approach the government, since search engines are given a free pass to process such data. So are we saying there is no right to remedy here?” Chima asks.
He also points out that a person could attempt to file a right-to-erasure request with a search engine, but it would most likely not succeed. “This is because of the public interest clause that allows it to process any information published on any other website”.
Some experts are also concerned about mass profiling by corporations due to this blanket exemption.
In simple terms, mass profiling refers to the practice of collecting and analysing large amounts of personal data from a group of individuals, usually without their explicit consent, in order to identify patterns, behaviours, or characteristics.
This kind of profiling can be carried out by government agencies, law enforcement, or corporations such as the search engine companies Google and Yahoo, for purposes such as security, marketing, or research.
However, mass profiling raises concerns about privacy and civil liberties, and it is often criticised for its potential for abuse and discrimination. There is ample evidence that such profiling can build a 360-degree profile of an individual, which can then be used to predict their behaviour. This is dangerous because such corporations have immense influence on individuals’ lives, and profiling can help them manipulate people’s behaviour.
An example of such abuse is the well-known 2018 Facebook-Cambridge Analytica scandal. Cambridge Analytica, a consulting firm, had purchased Facebook data on millions of Americans without their consent to build a “psychological warfare tool,” and used it to predict how individuals in the US would vote in order to help elect Donald Trump as president. Such profiling, which ultimately influences behaviour, is dangerous not just at the individual level but for society as a whole.
“Ultimately, if corporate actors are not really taking data with consent and are subject to no regulatory oversight, there is a fundamental problem there because they have a very wide exception (under public interest) and are not subject to clear restrictions. Giving an entire sector that is dealing with even potentially sensitive information a blanket carve-out is dangerous,” Chima told BOOM.
What Could The Government Have Done Differently?
A right to erasure request made to a search engine would effectively mean a right to de-index. In other words, a person would ask that their information, currently being shown as a search result on, say, Google, no longer be shown, i.e. be de-indexed.
“We generally do not recommend a right to de-index. If you're having a right to de-index, that should be clearly spelt out in the law. It should be subject to limits and there should be an appeals mechanism,” Chima believes.
In the 2019 version of the Data Protection Bill, the government had actually tried to create some sort of mechanism by involving the data regulator directly in the process. The current bill lacks this.
Chima also pointed out that the government in the 2019 bill seemed to have acknowledged that such a right to de-index is potentially needed but could also be dangerous and therefore wanted to make sure that the data regulator is directly involved. “Now, they have completely defanged the regulator and in fact left people to the tender mercy of filing complaints directly to the companies”, he noted.
However, as noted previously, such a request with search engines would fail due to the public interest exemption added in the 2022 Bill. By providing this exemption, the government has let private corporations escape regulation.
On the government’s role in the process, Chima says that it is letting two private parties—a person and a search engine company—fight out a battle for erasure directly, and then giving the company a trump card by saying that all their activities are covered by the public interest exception. “So I would say it's like a race to the bottom. It's removing the critical responsibility of the government and the regulator that should be there.”
Experts demand that the ground allowing search engines to process publicly available data “in the public interest”, which is already “overbroad and dangerous”, should come with safeguards to ensure that it is not misused by search engines and that certain kinds of data are not circulated merely because they were accidentally or non-consensually publicised.
(Saurav Das is an independent investigative journalist and transparency activist)