The right to be forgotten designates a legitimate claim by a person to remove information about their past that could interfere with and jeopardise their present life.

In fact, no law uses the term "right to be forgotten" or "right to erasure". However, in 1995 a European directive was introduced for the protection of digitised personal data (Directive 95/46/EC, the 1995 Data Protection Directive). At the time, the Internet was not in every household as it is today, and it made little sense to legislate on erasing or dereferencing. Nobody put their information or photos on social networks; many did not know what a search engine was. What had to be protected was personal data stored by public administrations and companies, to prevent data from being resold without the person's agreement or consent. This directive was later applied to personal data on the Internet, and the corresponding terms were coined.

The right to erasure is the right of a person to remove content from the network that may harm him or her: a press article, a photo, a video, or any other relevant publication. Whether on social networks, on an online media site, or in any other digital space of expression, you can ask for information that concerns you to be deleted. Dereferencing, by contrast, consists of removing certain links from a search engine's results, so that they no longer appear when particular terms, typically the person's name, are searched in the future.

The right to be forgotten has been a topical debate recently due to the Mario Costeja Gonzalez case (Google Spain SL and Google Inc. v AEPD and Mario Costeja Gonzalez, 2014, C-131/12). The case began in 2010 with the complaint of a Spanish citizen, Mario Costeja Gonzalez, to the Spanish Data Protection Agency (Agencia Espanola de Proteccion de Datos, AEPD) against the publisher of the daily newspaper La Vanguardia Ediciones SL, Google Spain and Google Inc. In support of his claim, Gonzalez argued that when a user entered his name into the Google search engine, the list of results showed links to two pages of La Vanguardia, dated January and March 1998, mentioning an auction of a property seized in payment of social security debts. Mr Gonzalez stated that these references were no longer relevant since the debts had been cleared long ago. Consequently, he requested that La Vanguardia delete or modify the pages in question so that his personal data no longer appeared, and that Google Spain or Google Inc. remove or obscure his personal data so that it disappeared from the search results.

The Data Protection Agency rejected the complaint against La Vanguardia, finding that the information had been legally published in the newspaper, as it was a legal notice of sale. On the other hand, it asked Google to take the necessary steps to remove the data from its index. Google Spain and Google Inc. consequently appealed to the Spanish National High Court (Audiencia Nacional) to reverse the decision of the AEPD.

The case went all the way to the Court of Justice of the European Union (CJEU), where the debate was limited to the responsibility of the operator of the search engine; the liability of the publisher of the site was not addressed.

On 13 May 2014, the CJEU ruled on a reference for a preliminary ruling by the Audiencia Nacional on the interpretation of the 1995 Directive. On those grounds the Court found that "the activity of a search engine consisting in finding information published or placed on the internet by third parties, indexing it automatically, storing it temporarily and, finally, making it available to internet users according to a particular order of preference must be classified as 'processing of personal data' within the meaning of Article 2(b) when that information contains personal data and, second, the operator of the search engine must be regarded as the 'controller' in respect of that processing, within the meaning of Article 2(d)."

Consequently, for the CJEU, the provisions of the Directive applied to a search engine's indexing of personal data, and the operator must ensure that this activity complies with the requirements of the Directive.

In response to another of the referred questions, on the geographical scope of the Directive, the CJEU considered that the Directive is applicable when a parent company "sets up in a Member State a branch or subsidiary which is intended to promote and sell advertising space offered by that engine and which orientates its activity towards the inhabitants of that Member State", as is the case here for Google.

The Court then decided on the extent of a search engine operator's liability for indexing personal data. In doing so, it established a right to digital forgetting, grounded in the right to privacy, whose implementation leads to the suppression, under certain conditions, of links on a search engine referring to websites containing personal data.

The Court decided that the operator of a search engine is obliged to remove, from the list of results displayed following a search made on the basis of a person's name, links to web pages containing personal data relating to that person.

This applies when the processing of the personal data in question does not comply with the 1995 Directive. Such incompatibility may result from inaccurate data, more specifically data that are inadequate, irrelevant or excessive; from data that have not been updated; or from data kept for longer than necessary, unless they are preserved for historical, statistical or scientific purposes. The personal data in question must fall into one of these categories for the person concerned to demand the deletion of the search engine's links.

This criterion for the implementation of the right to oblivion raises a number of questions and appears particularly vague and loose (what is an unnecessary length of time, or an excessive or irrelevant piece of data?). The risk of mass suppression of links on search engines is therefore real, despite the CJEU presenting the ruling as "a possibility for everyone to control their digital footprint and their life – private as well as public online". This bold decision of the CJEU has already drawn much criticism, from Google of course, but also from the founder of Wikipedia, who speaks of (Lee) "one of the most wide-sweeping internet censorship rulings that I've ever seen". This brings me to my thesis: the current right to be forgotten legislation is an incomplete and vague solution to a very real problem.

To support my claim, my research will begin by exploring allegations that the right to be forgotten is another form of censorship, what data suppression could mean for corporations and those who hold power, and how it may affect our freedom of speech. The European Court of Justice and those in favour of the ruling argue that it is about a deletion right and control over personal data.

However, those arguing against the ruling believe that it is a right to force others to delete things they would not otherwise remove or choose to forget. The problem with the ruling is its vague boundaries and far-reaching scope. What qualifies as inadequate or excessive content, for whom, and who can invoke the ruling? The CJEU does not require everyone who uses the right to be forgotten to serve some public-interest narrative; on the contrary, whenever the matter is of public interest, where someone has stepped out into the public spotlight themselves, the right to be forgotten does not apply. A good example of this is the pianist who tried to remove criticism from The Washington Post. Dejan Lazic is a European pianist who decided to take the ruling one step further: on the 30th of October he sent The Washington Post a request to remove a 2010 review that he claimed marred the first page of Google results when his name is searched. Lazic exclaimed, (Davey) "To wish for such an article to be removed from the internet has absolutely nothing to do with censorship or with closing down our access to information." He argued it has to do with control of one's personal image, control of, as he puts it, "the truth." However, Lazic misdirected his request: under the ruling it is the search engines that remove links, not the sites that remove content.

For some supporters of the right to oblivion, the core debate is privacy and trying to return it to how it was before the internet stored all our data. The fact that Google is a database where anyone in the world can access your information is a genuine worry acknowledged by both sides of the argument. By deleting or dereferencing links on search engines, they believe they will make the process of finding information slower and more painful. Comparisons can be drawn to the time before the internet: you would have to research and acquire tangible information on a person, making the task more time-consuming and, in some cases, a specialist job. The ruling would help slow down the process and perhaps deter some internet users; nonetheless, if someone wants your information, the process has merely been prolonged, not made that much more difficult (if you know where to look).

Those in favour of the ruling may argue that censorship has always existed in modern democracies through laws like libel and defamation. However, those laws are adjudicated: the process takes place in a court of justice, not inside a company, and for most rulings the outcome is made public. The cases are presented to a judge, who decides with the help of a substantial body of precedent. Google has had to remove content from the very beginning, including pirated content, data deemed illegal by a court, child sexual abuse imagery, malware, personal data like bank details, and content prohibited by local law. Since the ruling, however, the requests have changed: (Drummond) "former politicians wanting posts removed that criticise their policies in office; serious, violent criminals asking for articles about their crimes to be deleted; bad reviews for professionals like architects and teachers; comments that people have written themselves (and now regret)".

This raises the question of how the ruling may affect or empower global corporations like Google. Google is still trying to fight the ruling, despite the CJEU expecting complete cooperation. Nonetheless, the key element here is that a private company is making decisions that must be kept secret at governmental insistence; rather alarming when you consider that a company's primary motive is profit. The CJEU argues that Google is subject to the law like all other corporations and that it is in its best interests, both financially and ethically, to adhere to those rules. Its decisions can be checked if a request is not met, and it can be taken to court or to a data protection agency. One wonders, though, how those decisions are made and what process each request goes through, because Google cannot disclose information owing to the legal implications of individuals' privacy rights. Most complaints are reportedly dealt with by interns or by artificial intelligence, the lowest rungs of the corporate food chain. Google says it has become quite successful at dealing with the thousands of claims it receives, but one wonders whether it is simply passing content off for deletion because that is easier and less expensive than the case going to appeal or to a data protection authority. The fact that Google is a company changes the narrative from a legal and fair procedure to a cost-cutting, efficient way of dealing with the increasing number of requests. The eventuality could be mass suppression of data leading to a cleansed internet space, and if Google comes to accept the ruling, it could use it as a means of generally improving its search engine, deleting at will because it does not have to disclose any information. The online reality we experience would then be one set by one or two companies.
Artificial intelligence will almost certainly be used in the future, and it could be used on the public's side as well. Other companies could seize the initiative to build a business in which a client pays for a service that, every time the client's name is mentioned in a way that does not suit them, automatically sends Google a request to delete the link.
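The automated service imagined above can be sketched in a few lines. Everything here is hypothetical: the function name, the toy search results, and the request format are invented for illustration, and Google's real delisting process uses a web form rather than any public API.

```python
# Hypothetical sketch of the automated "reputation cleaning" service the
# paragraph imagines: scan search results for a client's name, flag any
# hit the client has not approved, and draft a delisting request for it.
# All names and data are invented; this is not a real Google interface.

def draft_delisting_requests(client_name, approved_urls, search_results):
    """Return a draft request for every result mentioning the client
    that is not on the client's approved list."""
    drafts = []
    for result in search_results:
        mentions_client = client_name.lower() in result["snippet"].lower()
        if mentions_client and result["url"] not in approved_urls:
            drafts.append({
                "url": result["url"],
                "reason": f"Content about {client_name} claimed to be "
                          "inadequate, irrelevant or excessive",
            })
    return drafts

# Toy data standing in for live search results.
results = [
    {"url": "https://example.com/review", "snippet": "A harsh review of Jane Doe"},
    {"url": "https://example.com/bio", "snippet": "Jane Doe's official biography"},
    {"url": "https://example.com/other", "snippet": "An unrelated article"},
]
drafts = draft_delisting_requests("Jane Doe", {"https://example.com/bio"}, results)
print(len(drafts))  # 1: only the unapproved review is flagged
```

The unsettling point the sketch makes concrete is how little judgement is involved: a string match and a whitelist decide what gets a removal request, with no weighing of public interest at all.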

The most substantial threat the right to be forgotten poses is to the right to know, or freedom of speech; this has not been much of a problem in the UK or Europe. The United States, however, is very keen on keeping its freedom of speech intact, given its prominent place in the First Amendment. You could argue that this does not matter, since the United States is outside the EU. However, Google and most other major search engines, such as Yahoo or Bing, are American. They may be deleting content flagged for deletion or dereferencing on the European versions of their sites, as in Gonzalez v. Google Spain, but they keep the link on Google.com, the US version of the site, because the law does not oblige them to remove it there. Since the ruling applies only to European search domains, you can still find the information elsewhere: simply change the domain from .co.uk to .com and you will find the results that have been removed. For someone searching casually this will throw them off the trail of any deleted content, but those who really want to find it can do so with great ease. Ideally, the European Union would like the whole digital world to follow suit, as the system can only work if content removed in the EU is deleted from all search engines worldwide. (Dawinder) Sidhu concluded his view of the ruling on US grounds in a way that shows how difficult it will be in America, especially where it contradicts constitutional rights: "In short, the right to be forgotten, while a well-intentioned attempt to close the gap between privacy and Internet technology, is difficult to square with American constitutional principles."

The right to know about someone's or something's past is something we take for granted: reviews, criticism and praise of people or institutions. What happens if everybody wants to cleanse their image, removing anything that may affect or damage them or their business? What about victims' families or loved ones who do not want coverage of a crime to be removed? They may want the stories to live on and the legacy of the victim not to be forgotten, and they may believe the perpetrator should remain associated with the crime as a matter of justice. Will critics become more lenient in their criticism and reviews for fear of being removed from a search engine at the discontent of their subjects?

This diagram is from an official Google transparency report covering 29 May 2014 to 15 July 2015. To date, Google has evaluated 2,006,506 URLs for removal and received 714,636 requests for deletion. The sites most affected are predominantly social media; ten sites, including Facebook, YouTube and Twitter, account for 8% of the domains from which URLs have been removed. The graph shows that Google does not remove more than half of what is requested; however, just under half of the requests could still represent up to 300,000 URLs, a very substantial amount of content.
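The scale of those figures can be sanity-checked with simple arithmetic. Only the two totals come from the transparency report; the "just under half" removal rate is the essay's own rough characterisation, not an official statistic.

```python
# Back-of-envelope arithmetic on the transparency-report figures quoted
# above. The two totals are from the report; the halving is the essay's
# rough "just under half" claim, used here only as an upper bound.
requests_received = 714_636   # deletion requests received
urls_evaluated = 2_006_506    # URLs evaluated for removal

# Each request names several URLs on average.
urls_per_request = urls_evaluated / requests_received
print(f"URLs per request: {urls_per_request:.1f}")  # about 2.8

# If just under half of the requests were granted, that would be at most:
granted_upper_bound = requests_received // 2
print(f"Requests granted (upper bound): {granted_upper_bound:,}")  # 357,318
```

Even at a few hundred thousand granted requests, with nearly three URLs per request on average, the volume of delisted links quickly reaches the hundreds of thousands the essay warns about.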

Just by scratching the surface of this new ruling, you find more and more inaccuracies and questions with very unclear answers. The vagueness and the lack of precedent only add to the bewilderment over who and what has the right to oblivion. The fact that a "journalistic exception" has been granted, meaning the content is not deleted at the source but merely dereferenced in the search results, "hidden away", makes one wonder whether the hassle is worthwhile. It is like saying the books will stay in the library but we are going to destroy the index cards. This brings me to the most alarming point of the whole debate: a ruling with vague guidelines and little precedent is controlled entirely by a company that wants to make a profit. Steps need to be taken to make information such as legal proceedings, divorces for instance, where children may have access, harder to obtain. And I agree that someone reading a potentially detrimental story online has no idea of the context and therefore thinks less of that person, having never met them. If the system is to work, it has to be adjudicated, not run by interns in global companies trying to prove a point by making the system efficient through removal at any request. I believe the moral behind the ruling is correct, but the format needs clear boundaries; otherwise we could be looking at a search engine that not only chooses what it wants to show us but also a very bland digital world. The internet has changed and so has our understanding of privacy, yet our online forgiveness is much harsher than real-life forgiveness; maybe that is where the problem lies.