Privacy in the Digital Economy: Requiem or Renaissance?
An essay on the future of privacy
By Gérald Santucci, September 2013

“The makers of our Constitution undertook to secure conditions favourable to the pursuit of happiness… They sought to protect Americans in their beliefs, their emotions and their sensations. They conferred, as against the Government, the right to be let alone – the most comprehensive of rights and the right most valued by civilized men. To protect that right, every unjustifiable intrusion by the Government upon the privacy of the individual, whatever the means employed, must be deemed a violation of the Fourth Amendment.” (U.S. Justice Louis D. Brandeis, 1928)

“Privacy and competitiveness should not be presented as antagonistic objectives. On the contrary, modernised data protection rules are a crucial market opener in our digital economy. We are encouraging innovation by introducing tools, such as ‘privacy by design’ or data portability, which will give companies the right incentives to gain a competitive edge in digital marketplaces where privacy does increasingly matter for consumers.” (Vice-President of the European Commission Viviane Reding, 6th June 2013)

“Germany will make clear that we want Internet firms to tell us in Europe who they are giving data to. We have a great data protection law in Germany. But if Facebook is registered in Ireland, then Irish law is valid, and therefore we need unified European rules. Germany will take a strict position.” (Chancellor Angela Merkel, 14th July 2013)

The concept of “privacy” is a symbol of the growing gap between the on-going changes in our environment and our capability to understand and integrate them. The seamless connection of citizens’ on- and off-line lives is an increasingly important issue facing society. In the future, individuals should be able to manage how information related to them is used – whether directly or through other means – in a way that meets their preferences, contexts and values, all within existing legal frameworks. A clear understanding of the impact of these changes is essential, because individuals may not be aware of most of the data that exists about them or that can reveal information about them. Two topics that are very much in the news these days – PRISM (the surveillance programme of the U.S. National Security Agency) and the rise of wearable computers that record everything – are not directly related to each other, yet they are part of a growing move for – or against – a “surveillance society”.


On one side, we have people declaring that too much surveillance, especially in the form of wearable cameras and computers, is detrimental and leaves individuals without any privacy. Recent revelations in the press about spying on personal data held by companies like Google, Facebook, Microsoft and many other ICT companies are bringing to the forefront the concerns that individuals have about the use of their personal data. Such sensitive information was under their control in the pre-digital age but is now collected and stored by organisations that are bound by law to provide information demanded by intelligence services and law enforcement agencies.

On the other side, there are people who argue that a society with surveillance – and even self-surveillance – technologies everywhere will make the world safer and wealthier. For them, a trade-off must be found between security and privacy. Since they consider personal data a tradable asset, just like water, oil, wheat or gold, that must move to create value, they brandish evidence of the economic and social value generated by trusted flows of personal data. In Europe, where many countries experience a toxic cocktail of weak economic growth, poor competitiveness and high unemployment, the value-creation argument naturally looks very convincing.

Data collection and video surveillance will continue to grow as ubiquitous computing pervades almost all areas of our culture, either harnessed to our bodies or hovering over cities to monitor people from the sky. As technology moves to the nano scale, wearable devices will be both on us (e.g. smart glasses and other head-mounted displays, devices connected to the smartphone in the fitness and sports environment) and in us (e.g. a nanorobot navigating through the human circulatory system or shielding the human body against pathogens). Therefore the question is: do we want to live in a surveillance society that might ensure justice for all, yet privacy for none? Are we ready to live in a “City of Control” or do we definitely cherish a “City of Trust”?[1] Would we accept a panopticon – a scenario in which the “few” see everything without ever being seen, while the “many” are totally seen but never see who is watching them – or would we prefer an open space/synopticon – a scenario in which individuals are fully empowered to define the borders of their own personal space?[2]

Looking at these issues, it appears clearly that current approaches to data protection, primarily based on contractual agreements, are largely inadequate to address such asymmetries. We need new thinking and new concepts to structure an interdisciplinary discussion (including science, technology, law, politics and business), and to formulate approaches that ensure the protection of personal data as well as innovation in service and policy development.

Defining the concepts

What is “personal data”? The question may appear strange, as the concept has been well defined in Directive 95/46/EC and related Community legislation. In 2007, the EU Article 29 Data Protection Working Party went further by providing guidance on the way this concept should be understood and applied in different situations.[3] Its analysis was based on the four building blocks of the legal definition of “personal data”, i.e. “any information”, “relating to”, “an identified or identifiable”, “natural person”.

[1] Rob van Kranenburg, The Internet of Things: A critique of ambient technology and the all-seeing network of RFID, Notebook, September 2008.
[2] See http://www.theinternetofthings.eu/sites/default/files/Rob%20van%20Kranenburg/Jury%20on%20Panopticon%20Contest%20February%202012_0.pdf
[3] Article 29 Data Protection Working Party, Opinion 4/2007 of 20 June 2007.


Different contexts were considered, in particular e-Government, e-Health and RFID. Yet the opinion of the Article 29 Data Protection Working Party left room for a number of interpretations: “the scope of application of the Directive excludes a number of activities, and flexibility is embedded in the text to provide an appropriate legal response to the circumstances at stake”, “the scope of the data protection rules should not be overstretched”, but “unduly restricting the interpretation of the concept of personal data should also be avoided”. It is worth noting that the proposed Data Protection Regulation suggests simplifying the definition of “personal data”: “any information relating to a data subject”.

What is “privacy”? Until recently the concept meant “the right to be let alone”[4] and was supported over the years by a string of data protection rules: the Data Protection Act in the German State of Hesse (1972), the U.S. Privacy Act (1974), the German Federal Data Protection Act (1977), the OECD Guidelines on the Protection of Privacy (1980), the Council of Europe’s Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (1981), the EU Data Protection Directive (1995), data protection as a basic right in the European Charter of Fundamental Rights (2000), and the APEC Privacy Principles (2004). In Europe, there exists no established definition of “privacy”, for the term has multiple meanings depending on context and interpretation[5]: bodily privacy (to protect the integrity of the physical person); territorial privacy (to protect personal space, objects and behaviour); communications privacy (to protect against eavesdropping); locational privacy (to protect against surveillance); and information privacy (to protect personal data).

The difficulty we have today in providing clear interpretations of the terms “personal data” and “privacy” is very likely to increase in the coming years, as emerging digital technologies raise questions about who is responsible for data protection when data is scattered “in the clouds” or embedded in smart connected objects, and when individuals disclose a great deal of personal data – their own and others’ – on social networking sites. Rapid changes in digital technology highlight the challenges of ensuring that individuals have control over their personal data. Recently, Germany’s federal data protection commissioner Peter Schaar likened Microsoft’s next-generation console, the Xbox One, to a monitoring device[6]: “The Xbox continuously records all sorts of personal data about me: reaction rates, my learning or emotional states. These are then processed on an external server, and possibly even passed on to third parties. Whether they will ever be deleted, the person concerned cannot influence (…). The fact that Microsoft is now spying on my living room is just a twisted nightmare.” Indeed, as technology becomes more ubiquitous and pervasive, the issues are increasingly difficult to deal with: can individuals understand the ways in which their personal data is handled? Do they have the ability to meaningfully consent to various uses? Do they have access to their personal data? How long should data that has been collected be kept? What is the best way to protect the data?
These questions are complex and I will not seek to offer answers in this paper, but I would like to crystallize some of my views around three novel concepts in the proposed EU Data Protection Regulation – the Data Protection Impact Assessment, the Right to be Forgotten, and Privacy by Design – and to share my sense that, at the confluence of Science, Technology and Society, the challenges are becoming so complex and intertwined that a genuinely new narrative is necessary in order to interpret the world as it is and to make it a better place for human beings to live in.

[4] Samuel Warren and Louis D. Brandeis, “The Right to Privacy”, Harvard Law Review, 15 December 1890.
[5] See for example David Wright, Kush Wadhwa, Paul De Hert, Dariusz Kloza, A Privacy Impact Assessment Framework for data protection and privacy rights, study prepared for the European Commission, Directorate-General Justice, 21 September 2011.
[6] Interview with Der Spiegel, 26 May 2013.


The proposed EU Data Protection Regulation

The European Commission recently proposed a reform of the 1995 EU Data Protection legal framework that would help strengthen consumers’ data protection rights while creating a robust EU digital economy with a unified approach to data collection, protection and transfer. The current EU Data Protection Directive would actually become a Data Protection Regulation.[7] The purpose of the reform is to protect individuals in relation to the processing of personal data and to ensure the free movement of personal data within the Union. Some groups claim that the Commission’s proposal is an attempt to revive “old-school” privacy thinking, while others argue that it heralds a well-tuned way to adapt EU legislation to the new data-driven digital environment.

At the very least, the proposed Regulation is a game changer. It offers the combined force of a mandatory “data protection impact assessment” (DPIA), data protection by default (data minimisation at the level of applications), data portability (enabling an effective right to withdraw consent without losing the value of self-generated profiles), the “right to be forgotten”, rights against measures based on profiling (a right to object to being subjected to automated decisions, and transparency rights as to the existence of such measures and their envisaged effects), and finally “data protection by design”. This is not mundane at all. Yet all this would be worthless if the proposal had not introduced efficient mechanisms to incentivise industry to actually develop data protection by design: the liability regime is inspired by competition law, while the burden of proof by default rests with the data controllers. The goal is eventually to create a level playing field on which both industry and government can develop intuitive and auditable transparency tools. In other words, the proper implementation of the future Regulation could empower individuals and communities to work out and establish a new hybrid “social contract” (à la Rousseau) that allows a plurality of options, a choice of exposure and of places to hide.

Just as writing did not suppress speech but actually changed its nature, digital devices will not suppress writing but will change the nature of the reading mind. What is at stake is the capability of the EU to integrate modern, adequate legal data protection into its socio-technical fabric, i.e. its hardware, software and the many associated protocols and standards that enable and constrain its affordances. The “success” of the Regulation will depend primarily on a cultural change: the challenge is no longer about how to technically enforce legal compliance – legal issues cannot be solved by technical solutions – but about the design of novel articulations of fundamental legal rights into network infrastructures instead of the printing press.
Written legal rules and their underlying unwritten legal principles increasingly face the challenge of developing a new type of technology neutrality: wherever a technology (e.g. the Internet of Things, Big Data) changes the substance or the effectiveness of a right, its articulation must be revisited by both lawyers and network designers, to take into account how our society is inclined to reinvent, redesign and re-engineer the right within the network of related rights and principles.

[7] European Commission, Proposal for a Regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), COM(2012) 11 final, 25 January 2012, available at http://ec.europa.eu/justice/data-protection/document/review2012/com_2012_11_en.pdf


Legislation versus Self-regulation: the EU and U.S. examples

A comparison of the U.S. regime for protecting consumer privacy with the proposed EU Data Protection Regulation highlights both convergence on principles and divergence on mechanisms. Indeed, the U.S. regime and the proposed EU Regulation reflect common ground on key issues such as promoting privacy by design, protecting children’s privacy, enhancing data security, and providing consumers with appropriate access, correction and deletion rights. But they often differ on how to achieve these common goals, e.g. when and how consent should be obtained, the timing and scope of notification of data breaches, and how to protect the privacy of personal data that flows across borders.

Historically, the EU and the U.S. have taken divergent approaches to implementing data protection rules. In the U.S., where privacy interests are balanced with the right to free expression and commerce, and in recognition of the fact that not every piece of personal information can be protected and policed, the framework provides the highest levels of protection for sensitive personal information, such as financial, health and children’s data. In addition, targeted enforcement actions against bad (or negligent) actors, principally by the U.S. Federal Trade Commission (FTC), have created a “common law” of what is expected from business and industry when it comes to the collection, use and protection of personal information. In the EU, by contrast, a Directive transposed into national laws in 28 jurisdictions seeks to regulate personal data and is predicated on the notion that privacy is a fundamental human right. Thus, under the EU approach of regulation, there are strict limits on the collection and use of data, although enforcement of those limits has historically been episodic.

Both the European Commission and the FTC are actively engaged in on-going policy development to improve their respective approaches to privacy protection in light of rapid technological change. As the privacy challenges that consumers face become more complex because of advances in Internet and mobile technologies, and in the Cloud, bottom-up, industry-driven self-regulation seems no longer sufficient to address them. The “Do Not Track” system put in place by U.S. industry, and its recognised ineffectiveness at stopping tracking, suggest that much remains to be done to withstand the huge technological and socio-economic challenges of an increasingly online market. Therefore, the FTC report of March 2012[8] provides a new framework for thinking about privacy in the U.S.
It actually calls on companies handling consumer data to implement recommendations for protecting privacy, including: “Privacy by Design” (companies should build in consumers’ privacy protections at every stage in developing their products), “Simplified Choice for Businesses and Consumers” (companies should give consumers the option to decide what information is shared about them, and with whom, including a Do-Not-Track mechanism that would provide a simple, easy way for consumers to control the tracking of their online activities), and “Greater Transparency” (companies should disclose details about their collection and use of consumers’ information, and provide consumers access to the data collected about them). Much of this is similar to what the European Commission has included in its proposed Data Protection Regulation, except for the “right to be forgotten”, which remains a weird concept in the U.S.

[8] FTC Report, Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for Businesses and Policymakers, March 2012.


Data Protection Impact Assessment

The EU Data Protection legal framework in force (i.e. Directive 95/46/EC) contains no reference to “impact assessment”; the European Commission’s choice to include the concept of a Data Protection Impact Assessment (DPIA) in its proposal for a Regulation is therefore key. The main reference to this concept in the proposed EU Regulation is Article 33: “Where processing operations present specific risks to the rights and freedoms of data subjects by virtue of their nature, their scope or their purposes, the controller or the processor acting on the controller’s behalf shall carry out an assessment of the impact of the envisaged processing operations on the protection of personal data.” This definition borrows from the well-known concept of Privacy Impact Assessments (PIA), which have recently been used in Europe in the implementation of the Commission Recommendation on RFID[9] and for Smart Grid and Smart Metering Systems[10].

The RFID and Smart Grid/Smart Metering examples reflect the necessary shift from a “framework” (i.e. identification of objectives, outline of a methodology, definition of the scope of the assessment in terms of the boundaries of the system/process under analysis) to a “template” (i.e. the framework content plus the provision of an operational instrument to manage the risks of the specific system/process and its use cases, and finally the possible controls and best available techniques to mitigate those risks and provide specific guidance); a minimal illustration of what such a template could look like is sketched at the end of this section.

It is worth noting that the European Commission does not impose impact assessment requirements only on EU Member States or other external stakeholders, but actually also on itself. In the context of its drive towards Better Regulation[11], it has been conducting in 2013 a review of its impact assessment guidelines[12] that takes into account, based on the recommendations put forward by DG CONNECT, potential guidelines concerning the effects of new EU regulation on the Internet, the modernisation of public services, IT-driven evidence gathering, ICT ethics, etc.
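To make the shift from “framework” to “template” more concrete, here is a minimal sketch in Python of how an operational DPIA template could be represented: each identified risk is scored qualitatively and matched to mitigating controls. The class names, fields and scoring scheme are illustrative assumptions of mine and are not taken from the RFID PIA Framework, from the Smart Grid/Smart Metering template or from Article 33.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative qualitative scale; real DPIA templates define their own levels.
LEVELS = {"low": 1, "medium": 2, "high": 3}

@dataclass
class Risk:
    description: str                                   # e.g. "unauthorised profiling of customers"
    likelihood: str                                     # "low" | "medium" | "high"
    impact: str                                         # "low" | "medium" | "high"
    controls: List[str] = field(default_factory=list)   # mitigating measures

    def score(self) -> int:
        # Simple likelihood x impact product, reduced by one point per control
        # and floored at 1. This scoring rule is an assumption for illustration only.
        raw = LEVELS[self.likelihood] * LEVELS[self.impact]
        return max(1, raw - len(self.controls))

@dataclass
class DPIATemplate:
    application: str        # the system or process under assessment
    purposes: List[str]     # why personal data is processed
    risks: List[Risk]

    def report(self) -> str:
        lines = [f"DPIA for: {self.application}"]
        for r in self.risks:
            lines.append(f"- {r.description}: residual score {r.score()} "
                         f"(controls: {', '.join(r.controls) or 'none'})")
        return "\n".join(lines)

if __name__ == "__main__":
    dpia = DPIATemplate(
        application="Retail RFID-based stock management",
        purposes=["inventory tracking"],
        risks=[Risk("tag data read by third parties after the point of sale",
                    likelihood="medium", impact="high",
                    controls=["deactivate tags at checkout"])],
    )
    print(dpia.report())
```

In a real template, the risk catalogue, the scales and the list of controls would come from the sector-specific document agreed with the regulator, not from ad hoc choices as in this sketch.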

[9] See Point 4 of the Commission Recommendation of 12 May 2009 on the implementation of privacy and data protection principles in applications supported by radio-frequency identification (http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32009H0387:EN:HTML). The Privacy and Data Protection Impact Assessment Framework for RFID Applications (RFID PIAF) was developed by industry (January 2011), endorsed by the Article 29 Data Protection Working Party (February 2011) and signed in Brussels on 6th April 2011, in the presence of Vice-President Neelie Kroes.
[10] See in particular Opinion 04/2013 on the Data Protection Impact Assessment Template for Smart Grid and Smart Metering Systems prepared by Expert Group 2 of the Commission’s Smart Grid Task Force, 22 April 2013.
[11] The EU’s Better Regulation policy aims at simplifying and improving existing regulation, at better designing new regulation and at reinforcing the respect and the effectiveness of the rules, all in line with the EU proportionality principle.
[12] See http://ec.europa.eu/governance/impact/commission_guidelines/docs/iag_2009_en.pdf


The Right to Be Forgotten

The “right to be forgotten” is described in Article 17 of the draft Regulation. The different aspects of this right have been intensely debated in different contexts. It is subject to interpretation, as it does not provide precise definitions of what constitutes “personal data”, who has the right to request deletion of a particular data item, and what are acceptable means of implementing the right. Therefore, it is up to the courts to interpret the law in ways appropriate to specific cases and to evolve the case law as new applications, products and scenarios emerge. Since the beginning, the inclusion of a universal “right to be forgotten” has proven one of the stumbling blocks. When, in June 2013, Advocate General Niilo Jääskinen gave a formal opinion to the European Court of Justice according to which Internet search engine service providers are not responsible, on the basis of the current Data Protection Directive, for personal data appearing on web pages they process, he actually suggested that such a right in the future Regulation would indeed be a new development.

Besides the interpretation of the law, the ability to enforce a “right to be forgotten” depends on the capabilities of the underlying information system. The key technical challenges lie in allowing a person to identify and locate personal data items stored about them, tracking all copies of an item and all copies of information derived from it, determining whether a person has the right to request removal of a data item, and finally effecting the erasure or removal of all exact or derived copies of the item. These challenges are indeed huge, and the possibility of addressing them differs depending on whether the information systems are closed (e.g. corporate networks, access-controlled public networks) or open (e.g. the public portion of the Internet). In the case of open systems, since data can be digitally copied, stored locally, and re-entered into the Internet in different locations for different purposes, the information accessible to the public cannot be kept under the control of the user who originated it. Therefore, it is unlikely that the “right to be forgotten” could be ensured using technical means alone. What remains possible is to develop techniques that aim at preventing the unwanted collection and spread of personal data (e.g. robots.txt, do-not-track, access control). Policy makers in the EU should support the principle of minimal disclosure in order to ensure that a minimum amount of personal data is collected and stored online. In addition, they should enforce compliance with rules and regulations to address tracking and profiling online by misbehaving players.
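As a small illustration of the preventive techniques just mentioned, the sketch below shows, using only Python’s standard library, how a cooperating crawler can honour a site’s robots.txt exclusion rules and how a client can signal a tracking preference with the advisory DNT header. The domain and crawler name are placeholders, and neither mechanism is enforceable against parties that simply ignore it, which is precisely the limitation the text points to.

```python
import urllib.robotparser
import urllib.request

SITE = "https://www.example.org"   # placeholder domain for illustration

# 1. Respect robots.txt: a cooperating crawler checks a site's exclusion rules
#    before fetching a page; sites use this to keep pages out of indexes.
rp = urllib.robotparser.RobotFileParser()
rp.set_url(SITE + "/robots.txt")
rp.read()                           # fetches and parses the rules (network access)
page = SITE + "/profiles/jane-doe"
if rp.can_fetch("MyCrawler", page):
    print("robots.txt allows crawling", page)
else:
    print("robots.txt asks crawlers to stay away from", page)

# 2. Signal Do Not Track: the DNT header expresses a preference not to be
#    tracked; honouring it is entirely up to the receiving server.
req = urllib.request.Request(page, headers={"DNT": "1",
                                            "User-Agent": "MyCrawler"})
# response = urllib.request.urlopen(req)   # actual request left commented out
```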
During the years 2008-2011, the European Commission’s Directorate-General CONNECT (then Directorate-General Information Society and Media) triggered a public debate and expert discussion on the technical and legal aspects of the so-called “right to silence of the chips”, a concept very similar to the “right to be forgotten”. The concept of the “right to silence of the chips” was coined in 2008 by Bernard Benhamou, Inter-Ministerial Delegate on Internet Usage at the French Ministry of Digital Economy[13], after his reading of Bruce Sterling’s book “Shaping Things”. It expresses the idea that individuals should be able to disconnect from their networked environment at any time.

With the Internet of Things we have indeed seen many examples of applications that connect objects to the Internet and amongst each other: from cars connected to traffic lights that fight congestion, to home appliances connected to smart power grids and energy metering that allow people to be aware of their electricity consumption, or connected pedestrian footpaths that guide the visually impaired. While supporting the Internet of Things and its promise to boost economic growth, competitiveness and employment, the European Commission wanted to ensure that Europeans, as citizens, entrepreneurs and consumers, lead and control the technology, rather than the technology leading and controlling people.

[13] Bernard Benhamou, Organizing Internet Architecture, October 2006: “More still than with the ‘Internet of machines,’ it will be necessary that (the) ‘Internet of things’ be under the control of citizens. They must be able to control the way in which their personal data are used, and even the way in which these chips can be deactivated. So in the future, citizens will have to intervene in the architecture of these systems in order to enjoy a new kind of freedom: the ‘silence of the chips’.”


However, the concept of the “right to silence of the chips” never took off within the Internet of Things Expert Group discussions[14], due to some confusion regarding its interpretation and the difficulty of identifying relevant technical means for implementing it. We may regret in Europe having lost the opportunity to transform a legal requirement into a powerful competitive asset for EU companies on global markets where consumers are sensitive to their privacy and the protection of their personal data.

A related development has been the collaborative work between the European Commission’s Joint Research Centre and DG CONNECT to identify best available techniques for deactivating RFID tags at the point of sale in the retail sector[15]. These techniques should be privacy-, data protection- and security-friendly, but also economically viable, so as to provide EU companies with a competitive advantage on the global RFID marketplace. Unfortunately, despite a large information-gathering exercise, it proved difficult to involve a statistically solid number of stakeholders. Both the European Commission and the Article 29 Data Protection Working Party have noted on different occasions that very few RFID application operators are nowadays implementing the RFID Recommendation, whether regarding the deactivation of tags or the use of the RFID DPIA. Without strong commitment and leadership from the European Commission, it appears that self-regulation is not sufficient to enforce the provisions of a Commission Recommendation, leaving it reasonable to favour mandatory requirements instead.

Privacy by Design

“Privacy by design” is an expression that was coined in 1997 by the Canadian privacy expert and Commissioner for Ontario, Dr Ann Cavoukian, but it took about 10 years before it was incorporated as a positive requirement into EU, U.S. and Canadian data protection legislative and regulatory frameworks. The idea is that effective privacy legislation ought to include an explicit privacy-by-design requirement, including mandating specific technological requirements for those technologies that have the most privacy-intrusive potential. In addition to regulation[16], the concept of “privacy by design” has emerged as a way to deal with privacy issues. It appears in Article 23 of the proposed Regulation under “Data protection by design and by default”.

[14] Established by Commission Decision in August 2010, the IoT Expert Group was dissolved in November 2012 and its report was published in February 2013 (http://ec.europa.eu/digital-agenda/en/news/conclusions-internet-things-public-consultation).
[15] JRC Scientific and Policy Reports, RFID Tag Privacy Threats and Countermeasures: Current Status, 2012. This work is an answer to points 11-14 of the May 2009 Commission Recommendation on the data protection, privacy and security aspects of RFID-enabled applications.
[16] Obviously Directive 95/46/EC, but also Regulation (EC) No. 45/2001 applicable to data processing by EU institutions and bodies, Directive 2002/58/EC on privacy and electronic communications, Framework Decision 2008/977/JHA on data protection in the area of police and judicial cooperation in criminal matters, and Directive 2009/136/EC (amending Directive 2002/22/EC on universal service and users’ rights relating to electronic communications networks and services, Directive 2002/58/EC concerning the processing of personal data and the protection of privacy in the electronic communications sector, and Regulation (EC) No 2006/2004 on cooperation between national authorities responsible for the enforcement of consumer protection laws).


The proposal states that, with regard to the state of the art and the cost of implementation, “the controller shall, both at the time of the determination of the means for processing and at the time of the processing itself, implement appropriate technical and organisational measures and procedures in such a way that the processing will meet the requirements of this Regulation and ensure the protection of the rights of the data subject. The Commission shall be empowered to adopt delegated acts in accordance with Article 86 for the purpose of specifying any further criteria and requirements for appropriate measures and mechanisms referred to in paragraph 1 (…), in particular for data protection by design requirements applicable across sectors, products and services.” The introduction of “data protection by design” in the proposed EU Data Protection Regulation is a welcome development, yet it is not easy to transpose this concept into specific, precise and unambiguous technical requirements if they are to allow industry to competitively design, develop and deploy privacy-compliant solutions. The current effort in DG CONNECT to integrate its Research, Policy and Regulation pillars constitutes, in this respect, a prerequisite for initiating projects and studies under Horizon 2020 on concrete approaches to data protection (and privacy) by design, including tests of their viability.
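To give a flavour of what “data protection by design and by default” could mean at the level of application code, the following minimal sketch applies two widely cited building blocks, data minimisation and pseudonymisation, to an incoming user record. The field names, the keyed-hash scheme and the 90-day retention period are illustrative assumptions of mine, not requirements drawn from Article 23 or from any delegated act.

```python
import hmac
import hashlib
from datetime import datetime, timedelta, timezone

# Assumption: the service only needs an age bracket and a coarse location,
# so everything else is dropped (data minimisation by default).
ALLOWED_FIELDS = {"age_bracket", "country"}

# Secret key held by the controller; with it, pseudonyms cannot be reversed
# by third parties who only ever see the processed records.
PSEUDONYM_KEY = b"replace-with-a-real-secret"

def pseudonymise(user_id: str) -> str:
    """Replace a direct identifier with a keyed hash (pseudonymisation)."""
    return hmac.new(PSEUDONYM_KEY, user_id.encode(), hashlib.sha256).hexdigest()

def minimise(record: dict) -> dict:
    """Keep only the fields the declared purpose actually requires."""
    out = {k: v for k, v in record.items() if k in ALLOWED_FIELDS}
    out["pseudonym"] = pseudonymise(record["user_id"])
    # Attach an explicit retention limit so later deletion can be automated.
    out["delete_after"] = (datetime.now(timezone.utc)
                           + timedelta(days=90)).isoformat()
    return out

if __name__ == "__main__":
    raw = {"user_id": "jane.doe@example.org", "age_bracket": "30-39",
           "country": "BE", "precise_gps": "50.8503,4.3517"}
    # The precise GPS position and the e-mail address never leave this function.
    print(minimise(raw))
```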
Trust: control versus orientation

Privacy protections are key to building trust in the digital economy and to protecting the online marketplace. ICT developments have long posed challenges to the privacy of individuals. Since 1995, EU data protection regulations have largely held up well, and the proposed Data Protection Regulation is a timely and well-thought-out initiative to rise to the new societal, legal and ethical challenges that are brought about by changing business models and technological advances. As objects around us acquire identities and virtual personalities, operate in smart spaces, use intelligent interfaces to connect and communicate, and develop attributes such as existence, a sense of “self”, connectivity, interactivity and environmental awareness, regulations cannot ignore the new complexity of the digital environment and need to maintain trust by finding the balance between the individual’s right to privacy and the need to support innovation.

How can we obtain trust? On one hand, regulations on data protection can be considered a powerful instrument. However, the complexity of privacy challenges in the global digital economy cannot be addressed by any central control. Put in place without any consideration of the bigger picture, regulations would stifle innovation, prevent the realisation of economic benefits, and ultimately not serve the interests of the consumer or citizen. Economic operators might actually see them as burdensome and confusing, unable to keep pace with scientific and technological progress, and hence quickly becoming outdated. Given the comparative lack of technology-driven innovation in Europe, in particular when considered in the context of an economic and financial environment in dire need of every stimulus it can get, regulations must be fine-tuned with innovation and global competition in mind. On the other hand, it is reasonable to consider that problems created by new technology should be solved by new technology. This is often the case, as illustrated by the examples of interoperability and security. But we have seen that, regarding the “right to be forgotten”, privacy by design, and the deactivation of RFID tags in the retail sector, technical measures are not easy to identify, develop and/or implement.


There is an alternative to excessive top-down regulation and uncoordinated, technology-driven solutions: “Economics 2.0”, i.e. the thrust towards a self-regulating societal and market architecture[17]. Such a new paradigm, which is in line with the “open space/synopticon” vision of a non-surveillance and non-punishment society, was already outlined by media theorist Rob van Kranenburg at the end of his notebook on the Internet of Things[18]: “I focused on moving from privacy to privacies, which acknowledged that in a hybrid environment we leave different traces and might want to build temporary personalities around these traces, not exposing our entire personality all the time. One of the concrete applications we focused on was the idea of having privacy levels on your mobile phone, as we guessed it would have an RFID reader soon. In industry terms you would have a lifestyle manager, in privacy activist terms you would have the equivalent of Melanie Rieback’s RFID Guardian – a firewall.”

Just as privacy needs to be re-thought by considering privacies, trust is an issue that should not be addressed in terms of the actions required to create or restore trust. We should focus instead on the conditions that are necessary to empower citizens to orient themselves properly in a hyper-connected environment and make their decisions themselves. The notion of orientation should prevail over the notions of transparency and control: acquiring the knowledge, means and tools to orient oneself and make proper decisions is much more conducive to societal cohesion and a sense of freedom than basing freedom on transparency and control. While the classical model would suggest that in order to make informed decisions we need to know everything (transparency) and consequently be in control, the Economics 2.0 model tells us that we do not need to become experts in all things before making decisions. Too much information kills information; what we need is to be fair to each other, or respectful of each other, i.e. to provide the relevant information for a respectful interaction rather than drowning other people under a heap of information. Therefore, today’s challenge for policymakers consists in the elaboration of new rules ensuring that we, as citizens, maintain and grow our own orientation abilities: this is what we can call “digital literacy”[19].
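The “privacy levels” idea quoted above can be made concrete with a very small sketch: a per-context disclosure policy on the device decides which attributes may be released to a requester. The levels, contexts and attribute names are hypothetical choices of mine, meant only to show the shape of such a “lifestyle manager”, not an actual design.

```python
# Hypothetical per-context disclosure policy: each privacy level whitelists
# the attributes a device may release to readers or apps in that context.
PRIVACY_LEVELS = {
    "public_space": {"share": set()},                        # disclose nothing
    "shopping":     {"share": {"loyalty_id"}},               # minimal commercial profile
    "home":         {"share": {"loyalty_id", "preferences"}},
}

PROFILE = {
    "loyalty_id": "L-4711",
    "preferences": ["vegetarian"],
    "health_data": ["allergy:penicillin"],   # never appears in any whitelist above
}

def disclose(context: str, requested: set) -> dict:
    """Return only the requested attributes that the active privacy level allows."""
    allowed = PRIVACY_LEVELS.get(context, {"share": set()})["share"]
    return {k: PROFILE[k] for k in requested & allowed if k in PROFILE}

if __name__ == "__main__":
    # A shop reader asks for everything; in "shopping" mode it only gets the loyalty id.
    print(disclose("shopping", {"loyalty_id", "preferences", "health_data"}))
    # In "public_space" mode the same request yields nothing: the chips stay silent.
    print(disclose("public_space", {"loyalty_id", "preferences", "health_data"}))
```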

[17] Dirk Helbing, Economics 2.0: The Natural Step towards A Self-Regulating, Participatory Market Society, 7 June 2013.
[18] See footnote 5.
[19] I owe much to my DG CONNECT colleague Nicole Dewandre for her insights into digital literacy and the orientation principle.

Conclusion

The increasing network connectivity and systemic interdependencies that are developing at the confluence of the Internet, the Internet of Things, Cloud Computing, Big Data, Robotics and Semantic technologies outdate the institutions and social codes that have so far been used to interpret and steer the evolution of our societies. Therefore, we get the perception that we live in an increasingly complex and de-stabilised world, not only as a result of dramatic shifts in the geostrategic environment or the natural environment, but mainly because of feedback effects that have been woven into our technological fabric. ICT tends to cascade in waves into the marketplace: large computers in the 1950s, microprocessors and memories in the 1970s, personal computers in the 1980s, the Internet and the World Wide Web in the 1990s, wearable cameras and computers, smartphones and tablets in the 2000s, and intelligent (nano-)sensors and other embedded devices in the near future.


Driven by globalisation and technological revolutions, the world is changing fast, but the intellectual framework that continues to inspire the current institutions surrounding our socio-economic system dates back to the agricultural and first industrial revolutions and to the pioneering works of Thomas Hobbes (the “Leviathan” – 1651), Adam Smith (the “invisible hand” – 1776) and David Ricardo (“value comes from labour” – 1817). It is time we realised that a new intellectual framework, a new paradigm, is needed if we are to grasp the diverse, complex issues at stake.

The idea of connected devices of all sorts chatting away to one another is certainly attractive – most people want to enjoy the new, exciting services that a data-driven future can provide, but at the same time they do not trust companies and governments as regards the collection and processing of personal data. Such a lack of trust obviously hampers the adoption of potentially useful information and communications technologies. How can we have the Internet of Things (or the “Internet of Everything”) while preserving our fundamental right to privacy? Several answers exist, but we have seen that they can actually be clustered around two: the first is technology itself – embedding privacy and security in the very design of new systems and components; the second is adequate rules and regulations. A combination of technology and regulation can also be a wise approach. However, what is essential is to realise that the word “privacy” will not have the same meaning in 2020 as the one it has today. By then, technology will have profoundly changed the way we live, and more importantly the way we apprehend our surroundings and even our existential challenges. We will have a different perception – and then conception – of “privacy”, “personal data” and “security”. Our societies will have developed new narratives of self, identity, life and relations to objects.

The huge but honourable challenge of our generation is to build an awesome future without trading away our human need for privacy. Without trust, technology cannot thrive; without technological change, human progress is impossible. Therefore, we need to forget the emotional weight of certain words and all get together to work out a new “social contract”. There is still time to choose Trust over Control, but we must know that the task will take time and will require that we think differently in order to steer a different world.

______________________

The views expressed in this text are those of the author and do not necessarily represent the views of, and should not be attributed to, the European Commission. Any mistakes are obviously the sole responsibility of the author.
