A Guide to Privacy by Design
This document provides a guide to privacy by design, outlining the concept, its foundational principles and the associated design strategies. It explains how privacy should be an integral part of system design and discusses how privacy design strategies, design patterns and privacy enhancing technologies can support compliance with data protection regulations.
V. October 2019

TABLE OF CONTENTS
I. PRIVACY BY DESIGN
   The Concept of Privacy by Design
   The Foundational Principles of Privacy by Design
      1. Proactive not Reactive; Preventative not Remedial
      2. Privacy as the Default Setting
      3. Privacy Embedded into Design
      4. Full Functionality: Positive-Sum, not Zero-Sum
      5. End-to-End Security: Full Lifecycle Protection
      6. Visibility and Transparency: Keep it Open
      7. Respect for User Privacy: Keep it User-Centric
   Parties bound by data protection by design
II. PRIVACY REQUIREMENTS OF SYSTEMS
   Privacy and security goals
III. PRIVACY ENGINEERING
IV. PRIVACY DESIGN STRATEGIES
   Minimise
   Hide
   Separate
   Abstract
   Inform
   Control
   Enforce
   Demonstrate
V. PRIVACY DESIGN PATTERNS
   Design pattern catalogues
VI. PRIVACY ENHANCING TECHNOLOGIES (PETs)
   Classification of PETs
   PET catalogue
VII. CONCLUSIONS
VIII. ANNEX 1: SELECTION OF PRIVACY DESIGN PATTERNS
IX. ANNEX 2: REGULATORY EXTRACTS
   Recital 39
   Recital 78
   Article 5 Principles relating to processing of personal data
   Article 13 Information to be provided where personal data are collected from the data subject
   Article 14 Information to be provided where personal data have not been obtained from the data subject
   Article 24 Responsibility of the controller
   Article 25 Data protection by design and by default
   Article 28 Processor
   Article 32 Security of processing
   Article 36 Prior consultation
   Article 83 General conditions for imposing administrative fines

I. PRIVACY BY DESIGN

THE CONCEPT OF PRIVACY BY DESIGN

The idea of "data protection by design" has been around for more than 20 years, and a great deal of work has been carried out in this area under the term "privacy by design" (PbD) [1]. The concept was developed by Ann Cavoukian, Ontario's Information and Privacy Commissioner, in the 1990s and presented at the 31st International Conference of Data Protection and Privacy Commissioners in 2009 under the title "Privacy by Design: The Definitive Workshop" [2]. It was internationally accepted at the 32nd International Conference of Data Protection and Privacy Commissioners, held in Jerusalem in 2010, with the adoption of the "Resolution on Privacy by Design" [3]. This resolution recognised the importance of incorporating privacy principles within the design, operating and management processes of organisational systems, in order to attain a comprehensive framework of protection for personal data.
It also encouraged the adoption of the Foundational Principles of Privacy by Design as defined by Ann Cavoukian, and invited Data Protection Authorities to actively work on and promote the inclusion of privacy by design in policies and legislation on data protection within their respective States.

The Foundational Principles of Privacy by Design
1. Proactive not Reactive; Preventative not Remedial
2. Privacy as the Default Setting
3. Privacy Embedded into Design
4. Full Functionality: Positive-Sum, not Zero-Sum
5. End-to-End Security: Full Lifecycle Protection
6. Visibility and Transparency: Keep it Open
7. Respect for User Privacy: Keep it User-Centric
Table 1 – Principles of "Privacy by Design"

The General Data Protection Regulation (EU) 2016/679 (hereinafter, GDPR) [4], in Article 25 and under the heading "Data protection by design and by default" [5], incorporates into data protection regulations the practice of considering privacy requirements from the first stages of product and service design. It therefore confers on it the status of a legal requirement, so that the guarantees protecting citizens' rights and freedoms with regard to their personal data are integrated from the early development stages of systems and products. Understood, therefore, as the need to consider privacy and the principles of data protection from the inception of any type of processing, and for the purposes of drafting this document, the terms "data protection by design" and "privacy by design" can be considered equivalent.

[1] Peter Hustinx, European Data Protection Supervisor. Privacy by Design: Delivering the Promises, Madrid 2009. https://edps.europa.eu/sites/edp/files/publication/09-11-02_madrid_privacybydesign_en.pdf
[2] Ann Cavoukian. Privacy by design: the definitive workshop. A foreword by Ann Cavoukian, Ph.D. Identity in the Information Society, Aug 2010, Volume 3, Issue 2, pp. 247-251. https://link.springer.com/content/pdf/10.1007%2Fs12394-010-0062-y.pdf
[3] Resolution on Privacy by Design. 32nd International Conference of Data Protection and Privacy Commissioners. Jerusalem (Israel), 27-29/10/2010. https://edps.europa.eu/sites/edp/files/publication/10-10-27_jerusalem_resolutionon_privacybydesign_en.pdf
[4] General Data Protection Regulation (EU) 2016/679. https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32016R0679
[5] Article 25. "Data protection by design and by default" - General Data Protection Regulation (EU) 2016/679. https://eur-lex.europa.eu/legal-content/ES/TXT/HTML/?uri=CELEX:32016R0679&from=ES#d1e3126-1-1

Privacy by Design (hereinafter, PbD) involves an approach geared towards risk management and accountability, used to establish strategies that incorporate privacy protection throughout the life cycle of an object (whether it is a system, a hardware or software product, a service or a process). By an object's life cycle we mean all the stages it goes through, from its concept development until its removal, passing through development, production, operation, maintenance and withdrawal. Furthermore, it involves taking into account not only the application of measures for privacy protection in the early stages of the project, but also all the business processes and practices that process the associated data, thus achieving a true governance of personal data management by organisations.

Figure 1 – Privacy by design and by default as the comprehensive approach to risk and accountability.

[6] European Data Protection Supervisor (EDPS). Opinion 5/2018, Preliminary Opinion on Privacy by Design, May 2018. https://edps.europa.eu/sites/edp/files/publication/18-05-31_preliminary_opinion_on_privacy_by_design_en_0.pdf "'Privacy by Design' or 'Data Protection by Design'? For the purpose of this Opinion, we use the term 'privacy by design' to designate the broad concept of technological measures for ensuring privacy as it has developed in the international debate over the last few decades. In contrast, we use the terms 'data protection by design' and 'data protection by default' to designate the specific legal obligations established by Article 25 of the GDPR."
[7] European Union Agency for Cybersecurity (ENISA). Privacy and Data Protection by Design – from policy to engineering, Dec 2014. https://www.enisa.europa.eu/publications/privacy-and-data-protection-by-design/at_download/fullReport "The term 'Privacy by Design', or its variation 'Data Protection by Design', has been coined as a development method for privacy-friendly systems and services, thereby going beyond mere technical solutions and addressing organisational procedures and business models as well."
[8] Information Commissioner's Office (ICO). Data protection by design and default. https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/accountability-and-governance/data-protection-by-design-and-default/ "This concept is not new. Previously known as 'privacy by design', it has always been part of data protection law. The key change with the GDPR is that it is now a legal requirement."

The new European regulations are a paradigm change in how the rights and freedoms of data subjects are guaranteed when it comes to the processing of their personal data. They take a risk-based focus, a dynamic and continually improving approach to better understand the risks to privacy and to determine the technical and organisational measures to be implemented, and the perspective of accountability, understood as a continuous and traceable critical self-analysis by the data controller of how they fulfil the duties assigned to them by law. The final goal is to ensure that data protection is present from the early stages of development and is not a layer added to a product or system. Privacy should be an integral part of the nature of that product or service.

THE FOUNDATIONAL PRINCIPLES OF PRIVACY BY DESIGN

PbD is based on the conception of privacy as the default modus operandi within the business models of organisations, extending to the information technology systems that support data processing, the related business processes and practices, and the physical and logical design of the communication channels used. Privacy can be ensured by putting into practice the seven Foundational Principles defined by Ann Cavoukian [10][11]:

[10] Ann Cavoukian, Ph.D., Information & Privacy Commissioner, Ontario, Canada. Privacy by Design: The 7 Foundational Principles, Jan 2011. https://www.ipc.on.ca/wp-content/uploads/Resources/7foundationalprinciples.pdf
[11] Ann Cavoukian, Ph.D., Information & Privacy Commissioner, Ontario, Canada. Operationalizing Privacy by Design. A guide to implementing strong privacy practices, Dec 2012. http://www.ontla.on.ca/library/repository/mon/26012/320221.pdf

1. Proactive not Reactive; Preventative not Remedial

PbD involves anticipating events that affect privacy before they take place. Any system, process or infrastructure that uses personal data must be conceived and designed from the beginning by identifying possible risks to the rights and freedoms of the data subjects and minimising them before they can cause actual damage. A PbD policy is characterised by the adoption of proactive measures that anticipate threats and identify weaknesses in systems in order to neutralise or minimise risks, instead of applying remedial measures to resolve security incidents once they have taken place. That is to say, PbD avoids "the policy of rectification" and anticipates the materialisation of risk.
This involves:
- A clear commitment by the organisation, which must be promoted from the highest levels of management.
- Developing a culture of commitment and continuous improvement among all workers, as a policy means nothing until and unless it is translated into concrete actions that are fuelled by results.
- Defining and assigning concrete responsibilities, so that each member of the organisation is clearly aware of their tasks with regard to privacy.
- Developing systematic methods, based on indicators, for the early detection of processes and practices that are deficient in guaranteeing privacy.

2. Privacy as the Default Setting

PbD seeks to provide the user with the highest level of privacy possible given the state of the art and, especially, to ensure that personal data are automatically protected in any system, application, product or service. The default setting must be established by design at the level that provides the maximum possible privacy: if the subject does not modify the setting, their privacy is guaranteed and must remain intact, as it is integrated into the system and constitutes the default setting (a configuration sketch follows the list below).

This principle, in practical terms, is based on data minimisation throughout the stages of processing: collection, use, retention and distribution. For this it is necessary to:
- Make data collection criteria as restricted as possible.
- Limit the use of personal data to the goals for which they were collected and ensure that there is a legitimate basis for processing.
- Restrict access to personal data to the parties involved in the processing, in accordance with the "need to know" principle and according to their function, by creating differentiated access profiles.
- Define strict time limits for retention and establish operational mechanisms that guarantee compliance.
- Create technological and procedural barriers to the unauthorised linking of independent sources of data.
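By way of illustration, the following minimal Python sketch (not taken from the guide; the setting names and values are hypothetical) shows one way to encode privacy as the default setting: every option starts at its most protective value, so a user who never touches the configuration keeps the highest level of privacy, and any relaxation must be an explicit choice.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass(frozen=True)
class PrivacySettings:
    """Per-user privacy options; every default is the most protective choice."""
    profile_publicly_visible: bool = False   # nothing is shared unless the user opts in
    allow_analytics: bool = False
    allow_marketing_email: bool = False
    location_precision: str = "city"         # coarse granularity by default
    retention_days: int = 30                 # shortest retention period offered


def settings_for_new_user(overrides: Optional[dict] = None) -> PrivacySettings:
    """New accounts always start from the protective defaults; any relaxation
    must be an explicit, user-initiated override."""
    return PrivacySettings(**(overrides or {}))


print(settings_for_new_user())                            # maximum privacy with no user action
print(settings_for_new_user({"allow_analytics": True}))   # an explicit opt-in, never the default
```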
3. Privacy Embedded into Design

Privacy must be an integral and inseparable part of the systems, applications, products and services, as well as of the business practices and processes of an organisation. It is not an additional layer or module added to a pre-existing entity; rather, it must be integrated into the set of non-functional requirements from the concept development and design stages themselves. To guarantee that privacy is accounted for in the early design stages, we must:
- Consider it as an essential requirement within the life cycle of systems and services, as well as in the design of organisational processes.
- Perform an analysis of the risks to the rights and freedoms of persons and, when applicable, data protection impact assessments, as an integral part of any new processing initiative.
- Document all decisions that are adopted within the organisation from a "privacy design thinking" perspective.

4. Full Functionality: Positive-Sum, not Zero-Sum

It has traditionally been understood that privacy is gained at the cost of other functionalities, presenting dichotomies such as privacy vs. usability, privacy vs. functionality, privacy vs. business benefit, and even privacy vs. security. This is a contrived approach [12]: the goal must be to seek an optimal, "win-win" balance, with an open mind that accepts new approaches leading to solutions that are fully functional, effective and efficient both at the business level and at the privacy level. For this, from the first stages of concept development of products and services, an organisation must:

[12] These dichotomies were already discussed when concepts such as cybersecurity or quality control were introduced in organisations.

- Assume that different and legitimate interests may coexist, those of the organisation and those of the users to whom it provides services, and that it is necessary to identify, assess and balance them accordingly.
- Establish channels of communication for collaboration and consultation with the participants, in order to understand and bring together multiple interests that, at first glance, may seem to diverge.
- If the proposed solutions threaten privacy, seek new solutions and alternatives to achieve the intended functionality and purposes, never losing sight of the fact that risks to the user's privacy must be adequately managed.

5. End-to-End Security: Full Lifecycle Protection

Privacy is born in design, before the system is set in motion, and it must be guaranteed throughout the life cycle of the data. Information security involves the confidentiality, integrity and availability of the information and the resilience of the systems that store it. Privacy also guarantees unlinkability, transparency and the data subject's capacity for intervention and control in the processing (intervenability). To integrate privacy throughout the stages of data processing, the different operations involved (collection, recording, classification, conservation, consultation, distribution, limitation, erasure, etc.) must be thoroughly analysed and, in each case, the most adequate measures for information protection must be implemented, among which are the following (a minimal code sketch follows below):
- Early pseudonymisation, or anonymisation techniques such as k-anonymity [13].
- Classification and organisation of data and processing operations based on access profiles.
- Default encryption, so that the "natural" state of the data, if stolen or leaked, is unreadable.
- The safe and guaranteed destruction of the information at the end of its life cycle.

[13] Spanish Data Protection Agency (AEPD) – Unit of Evaluation and Technological Studies. K-anonymity as a privacy measure, June 2019. https://www.aepd.es/media/notas-tecnicas/nota-tecnica-kanonimidad-en.pdf "K-anonymity is a property of anonymised data which makes it possible to quantify to what extent the anonymity of the subjects present in a dataset in which the identifiers have been removed is preserved. In other words, it is a measure of the risk that external agents can obtain information of a personal nature from anonymised data."
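As an illustration of the "default encryption" measure above, this short Python sketch (an assumption-laden example, not part of the guide) encrypts sensitive fields before they are written to storage, using the third-party cryptography package, so that stored data is unreadable without the key held by the authorised component.

```python
# Requires: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice the key lives in a key management system, not in code
fernet = Fernet(key)

record = {"name": "Jane Doe", "email": "jane@example.org", "diagnosis": "J45"}

# Encrypt every sensitive field before persisting it: the "natural" state of the
# stored data is unreadable if the storage medium is stolen or leaked.
stored = {field: fernet.encrypt(value.encode()) for field, value in record.items()}

# Only components that hold the key can recover the plaintext.
recovered = {field: fernet.decrypt(token).decode() for field, token in stored.items()}
assert recovered == record
```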
6. Visibility and Transparency: Keep it Open

One of the keys to guaranteeing privacy is being able to demonstrate it, verifying that the processing is in accordance with the information provided. Transparency in data processing is essential for demonstrating diligence and accountability before the Supervisory Authority, and as a measure of trust before data subjects. As established in Recital 39 of the GDPR [14], it should be transparent to natural persons that personal data concerning them are collected, used, consulted or otherwise processed, and to what extent the personal data are or will be processed.

[14] General Data Protection Regulation (EU) 2016/679. https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32016R0679

Promoting transparency and visibility requires the adoption of a series of measures such as:
- Making public the privacy and data protection policies that govern the functioning of the organisation.
- Developing and publishing concise, clear and comprehensible information clauses that are easily accessible and allow data subjects to understand the scope of the processing of their data, the risks they may be exposed to, and how to exercise their rights regarding data protection.
- Although it is not compulsory for all controllers, making public, or at least easily accessible for data subjects, the list of all the processing carried out in the organisation.
- Sharing the identity and contact details of the organisation's data controller.
- Establishing accessible, simple and effective mechanisms of communication, compensation and complaint for data subjects.

7. Respect for User Privacy: Keep it User-Centric

Without forgetting the legitimate interests of the organisation with regard to the data processing it performs, the ultimate goal must be to guarantee the rights and freedoms of the users whose data are processed; therefore, any adopted measure must be geared towards guaranteeing their privacy. This involves designing "user-centric" processes, applications, products and services, anticipating users' needs. The user must play an active role in managing their data and in controlling what others do with it. Their inaction must not imply reduced privacy, in line with the aforementioned principle which advocates a default privacy setting that offers the highest level of protection. Designing processes, applications, products and services that are focused on guaranteeing the privacy of data subjects involves:
- Implementing privacy settings that are "robust" by default, where users are informed of the consequences for their privacy when the established parameters are modified.
- Making available complete and suitable information that leads to an informed, free, specific and unambiguous consent, which must be explicit in all cases that require it.
- Providing data subjects with access to their data and to detailed information on the processing purposes and the communications carried out.
- Implementing efficient and effective mechanisms that allow data subjects to exercise their data protection rights.

Figure 2 – The Foundational Principles of Privacy by Design

PARTIES BOUND BY DATA PROTECTION BY DESIGN

The GDPR establishes "data protection by design" as a legal requirement to be fulfilled. Article 83 [15] considers failure to comply with this obligation a punishable offence, and its correct application is one of the criteria for measuring the gravity of an infringement [16].

As established by Article 25 of the GDPR [17], the obligation to implement data protection by design is applicable to all data controllers, regardless of their size, the type of data processed or the nature of the processing. Concretely, it requires the appropriate technical and organisational measures to be implemented "both at the time of the determination of the means for processing and at the time of the processing itself".
Although it is the data controller who is responsible for fulfilling this obligation, in light of Recital 78 [18] and Article 28 of the GDPR [19] data protection by design also involves other participants in the processing of personal data, such as service providers, product and application developers or device manufacturers. The data controller must encourage them to "take into account the right to data protection when developing and designing such products, services and applications" and, when they must engage another processor for data processing, they must use "only processors providing sufficient guarantees to implement appropriate technical and organisational measures in such a manner that processing will meet the requirements of this Regulation and ensure the protection of the rights of the data subject."

In brief, it is the data controller who, as part of their duties, must limit their selection of products and processors to those that can ensure the fulfilment of the GDPR requirements, and who is especially obliged by law to guarantee data protection by design and by default. This requirement is also applicable to joint controllers, based on their respective responsibilities jointly assumed in determining the means and purposes of the processing.

[15] Article 83. "General conditions for imposing administrative fines" - General Data Protection Regulation (EU) 2016/679. https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:32016R0679&from=EN#d1e6301-1-1
[16] On 4 July 2019 the Romanian Data Protection Authority announced that it had fined the company UNICREDIT BANK S.A. for failure to comply with Article 25.1 of the GDPR. https://www.dataprotection.ro/index.jsp?page=Comunicat_Amenda_Unicredit&lang=en
[17] Article 25. "Data protection by design and by default" - General Data Protection Regulation (EU) 2016/679. https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:32016R0679&from=EN#d1e3126-1-1
[18] General Data Protection Regulation (EU) 2016/679. https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32016R0679
[19] Article 28. "Processor" - General Data Protection Regulation (EU) 2016/679. https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:32016R0679&from=EN#d1e3210-1-1

Figure 3 – Privacy by design as the basis of the organisation's culture of privacy (the figure has been designed using images from Freepik.com)

II. PRIVACY REQUIREMENTS OF SYSTEMS

Understanding how personal data processing can affect the privacy of individuals is key to designing and developing systems that are trustworthy from the point of view of data protection. In Article 5 [20], the GDPR lists the basic principles to be adhered to when processing data. These six principles (lawfulness, fairness and transparency; purpose limitation; data minimisation; accuracy; storage limitation; integrity and confidentiality), together with accountability, become the core goals that every system, application, service or process must ensure in its design, in addition to the functional requirements of the system itself.

PRIVACY AND SECURITY GOALS

Traditionally, the design of secure and trustworthy systems has focused on analysing risks and responding to threats that affect the classic security goals: confidentiality, avoiding unauthorised access to systems; integrity, protecting information from unauthorised modification; and availability, guaranteeing that data and systems are available whenever necessary.

[20] Article 5. "Principles relating to processing of personal data" - General Data Protection Regulation (EU) 2016/679. https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:32016R0679&from=EN#d1e1873-1-1
Nevertheless, although unauthorised access to and modification of personal data can become a critical aspect that threatens the privacy of individuals, there are other risk factors that may surface during authorised data processing and that must be identified when assessing the risks to the rights and freedoms of data subjects. The loss of control over decision making, excessive data collection, re-identification, discrimination and/or stigmatisation of persons, biases in automated decisions, users' lack of comprehension of the scope and risks of the processing, unlawful processing, or profiling that is invasive or incorrect, are examples of risks to privacy with a clear effect on the rights and freedoms of persons which cannot be managed using only a traditional risk model focused exclusively on security goals.

Keeping in mind this scenario and the possible risks to privacy associated with the planned and authorised functioning of systems that collect, use and disclose personal data, it is necessary to widen the scope of analysis to include not only risks derived from unauthorised processing, but also those that may arise from planned and authorised information processing.

To cover these possible risks, it is necessary to include three new privacy-focused protection goals within the frame of analysis [21][22][23], whose guarantee safeguards the processing principles established by the GDPR:

Unlinkability: seeks to process data in such a manner that the personal data within a domain cannot be linked to the personal data in a different domain, or that establishing such a link involves a disproportionate amount of effort. This privacy goal minimises the risk of unauthorised use of personal data and of the creation of profiles by interconnecting data from different sets, establishing guarantees regarding the principles of purpose limitation, data minimisation and storage limitation (a brief sketch follows below).

Transparency: seeks to clarify data processing such that the collection, processing and use of information can be understood and reproduced by all the parties involved and at any time during the processing. This privacy goal strives to delineate the processing context and to make the information on the purposes and on the legal, technical and organisational conditions applicable to them available before, during and after data processing to all involved parties, both to the controller and to the subject whose data are processed, thus minimising the risks to the principles of lawfulness, fairness and transparency.

Intervenability: ensures that it is possible for the parties involved in personal data processing, and especially the subjects whose data are processed, to intervene in the processing whenever necessary to apply corrective measures to the information processing. This goal is closely linked to the definition and implementation of procedures for exercising data protection rights, presenting complaints or revoking consent given by the data subjects, as well as to the mechanisms that allow the data controller to evaluate the fulfilment and effectiveness of the obligations assigned to them by law, which gives greater respect to the principles of accuracy and accountability highlighted by the GDPR.

[21] Sean Brooks, Michael Garcia, Naomi Lefkovitz, Susanne Lightman, Ellen Nadeau - National Institute of Standards and Technology (NIST). NISTIR 8062, An Introduction to Privacy Engineering and Risk Management in Federal Systems, Jan 2017. https://nvlpubs.nist.gov/nistpubs/ir/2017/NIST.IR.8062.pdf
[22] Harald Zwingelberg, Marit Hansen. Privacy Protection Goals and Their Implications for eID Systems. 7th PrimeLife International Summer School (PRIMELIFE), Sep 2011, Trento, Italy, pp. 245-260. https://hal.inria.fr/hal-01517607/document
[23] Marit Hansen. Top 10 Mistakes in System Design from a Privacy Perspective and Privacy Protection Goals. 7th PrimeLife International Summer School (PRIMELIFE), Sep 2011, Trento, Italy, pp. 14-31. https://hal.inria.fr/hal-01517612/document
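A minimal sketch of the unlinkability goal (hypothetical Python code, not from the guide): the same user identifier is replaced by a different, stable pseudonym in each processing domain, derived with a per-domain secret key, so records cannot be joined across domains without access to the keys.

```python
import hashlib
import hmac
import secrets

# Assumption: one secret key per processing domain, normally held in a key
# management system rather than generated in application code.
DOMAIN_KEYS = {"billing": secrets.token_bytes(32), "analytics": secrets.token_bytes(32)}


def pseudonym(user_id: str, domain: str) -> str:
    """Derive a pseudonym that is stable within a domain but different across
    domains, so datasets from different domains cannot be trivially linked."""
    return hmac.new(DOMAIN_KEYS[domain], user_id.encode(), hashlib.sha256).hexdigest()


uid = "user-42"
print(pseudonym(uid, "billing"))     # always the same value inside billing
print(pseudonym(uid, "analytics"))   # a different value: no cross-domain join key
assert pseudonym(uid, "billing") != pseudonym(uid, "analytics")
```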
These three new protection goals, together with the existing security goals, establish a global framework of protection in personal data processing [24] and determine, by means of a risk assessment, other non-functional attributes or requirements that the system must satisfy and which become entry points to privacy design processes.

PRIVACY PROTECTION GOALS
- UNLINKABILITY: data minimisation, purpose limitation, storage limitation, integrity and confidentiality.
- TRANSPARENCY: lawfulness, fairness and transparency; accountability.
- CONTROL (intervenability): purpose limitation, accuracy, integrity and confidentiality.
Table 2 – Guarantees of the GDPR processing principles through privacy goals

Viewed from a global perspective, these six protection goals complement each other and occasionally overlap, which is why, for each data protection impact assessment (DPIA) [25][26] made on a prospective data processing, the possible priority of one goal over another must be evaluated and the necessary safeguards adopted.

III. PRIVACY ENGINEERING [27]

Privacy engineering is a systematic process with a risk-oriented focus whose goal is to translate the principles of privacy by design (PbD) into practical and operational terms within the life cycle of the information systems entrusted with personal data processing [28], by:
- specifying the privacy properties and functionalities that must be fulfilled by the system such that their design and implementation is possible (privacy requirements definition);
- designing the architecture and implementing the system elements that cover the defined privacy requirements (privacy design and development);
- confirming that the defined privacy requirements have been correctly implemented and fulfil the expectations and needs of the stakeholders (privacy verification and validation).

[24] Marit Hansen, Meiko Jensen, Martin Rost. Protection Goals for Privacy Engineering. International Workshop on Privacy Engineering, May 2015. https://www.ieee-security.org/TC/SPW2015/IWPE/2.pdf
[25] Spanish Data Protection Agency (AEPD). Guía para la Evaluación de Impacto en la Protección de Datos personales, Oct 2018. https://www.aepd.es/media/guias/guia-evaluaciones-de-impacto-rgpd.pdf
[26] Spanish Data Protection Agency (AEPD). Template for Data Protection Impact Assessment Report (DPIA) for Public Administrations. https://www.aepd.es/media/guias/Modelo_Informe_PIA_V15_EN.rtf
[27] This section deals with privacy engineering as a process within the design and development of an object. Privacy engineering may also be understood as a discipline, as described in: https://www.aepd.es/en/blog/2019-09-11-ingenieria-privacidad.html
[28] Massachusetts Institute of Technology Research & Engineering (MITRE) – Privacy Community of Practice (CoP). Privacy Engineering Framework, Jul 2014. https://www.mitre.org/sites/default/files/publications/14-2545-presentation-privacy-engineering-framework-july2014.pdf
Figure 4 – Privacy Engineering

The goal is to make privacy an integral part of system design, such that privacy requirements are defined in terms of fully implementable properties and functionalities and any privacy risk that is identified is appropriately managed by the system in a proactive manner. For this, a systematic and methodological approach is required, transferring the "what" of the concept development and analysis stages (the identified privacy requirements) to the "how" of the design and implementation stages (concrete strategies and solutions), thus working sequentially at different levels of abstraction.

At the highest level, in the initial stages of concept development of the object and the analysis of its requirements, it is necessary to work with privacy design strategies [29]: high-level, general approaches meant to identify tactics to be followed during the different stages of data processing, in order to guarantee the privacy goals and the fulfilment of the data processing principles. The strategies provide an accessible model for the engineers designing an object to define the privacy requirements identified during the analysis and requirements stages. Privacy design strategies serve as bridges between the processing principles imposed by law and the implementation of privacy in concrete solutions. As we shall see later, they are centred on responding to actions that may present a threat to privacy in processing activities, and they are not mutually exclusive; on the contrary, it is desirable that all or most of them be applied in order to make the developed object as privacy-friendly as possible.

[29] Jaap-Henk Hoepman. Institute for Computing and Information Sciences (ICIS) – Radboud University Nijmegen, The Netherlands. Privacy Design Strategies, Oct 2012. https://www.cs.ru.nl/~jhh/publications/pdp.pdf

Privacy design strategies are manifested, at a lower level, in privacy design patterns. These are reusable solutions that are employed in the design stage and can be applied to solve common privacy problems that frequently crop up in product and systems development. The goal of these patterns is to create a catalogue of reusable solutions for the privacy design of systems and to standardise the design process. The association between patterns and strategies is not one-to-one; that is, the same pattern can implement and respond to multiple privacy strategies, providing solutions to different problems that appear throughout data processing activities.

Finally, at the lowest level, the development stage, we find Privacy Enhancing Technologies (PETs), which are used to implement privacy design patterns with a concrete technology. The Commission, in its communication to the European Parliament and the Council on promoting data protection by Privacy Enhancing Technologies (PETs) [30], defines them as "a coherent system of ICT measures that protects privacy by eliminating or reducing personal data or by preventing unnecessary and/or undesired processing of personal data, all without losing the functionality of the information system."

[30] COM(2007)228. Communication from the Commission to the European Parliament and the Council on promoting data protection by Privacy Enhancing Technologies (PETs). https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52007DC0228&from=EN
Similar to privacy design patterns and strategies, a single PET can be used to implement multiple design pattern solutions.

Figure 5 – Privacy strategies, patterns and techniques (PETs) within a system's life cycle.

IV. PRIVACY DESIGN STRATEGIES

Current research identifies eight privacy design strategies [31], known as 'minimise', 'hide', 'separate', 'abstract', 'inform', 'control', 'enforce' and 'demonstrate'. In turn, these eight strategies can be divided into two categories [32]: data-oriented strategies and process-oriented strategies. The first category includes the strategies 'minimise', 'hide', 'separate' and 'abstract'; it is of a more technical nature and focuses on privacy-friendly processing of the collected data. The second category includes the strategies 'inform', 'control', 'enforce' and 'demonstrate'; these are of a more organisational nature and are geared towards defining processes for responsible personal data management.

[31] Jaap-Henk Hoepman. Privacy Design Strategies (The Little Blue Book), Mar 2019. https://www.cs.ru.nl/~jhh/publications/pds-booklet.pdf
[32] Michael Colesky, Jaap-Henk Hoepman – Digital Security, Radboud University Nijmegen, The Netherlands; Christiaan Hillen – Valori Security, Nieuwegein, The Netherlands. A Critical Analysis of Privacy Design Strategies, May 2016. https://www.researchgate.net/publication/305870977_A_Critical_Analysis_of_Privacy_Design_Strategies

PRIVACY PROTECTION GOAL / DATA ORIENTED PRIVACY PROTECTION STRATEGIES / PROCESS ORIENTED PRIVACY PROTECTION STRATEGIES
- UNLINKABILITY: minimise, abstract, separate, hide (data oriented).
- CONTROL: control, enforce, demonstrate (process oriented).
- TRANSPARENCY: inform (process oriented).
Table 3 – Link between privacy goals and privacy design strategies

Although, depending on the context, certain strategies may be more applicable than others within a system's development framework, these eight strategies, considered from the initial stages of concept development and analysis and applied jointly, permit the inclusion of safeguards and protection measures in data processing operations and procedures, making it possible for the final result to take into account privacy requirements that guarantee the rights and freedoms of data subjects.

Figure 6 – Privacy Design Strategies

Minimise

The goal of this strategy is to collect and process the least amount of data possible, thus averting the processing of unnecessary data and limiting possible impacts on privacy. This may be achieved by collecting data from fewer subjects (reducing the population size) or less data from each subject (reducing the volume of collected information), for which the following tactics may be used (a short code sketch follows below):
- Select: select only the sample of relevant individuals and the attributes required, with a conservative approach when establishing the selection criteria, processing only the data that satisfy those criteria (white list).
- Exclude: the reverse of the earlier approach; it consists of excluding beforehand the subjects and attributes that are irrelevant to the processing (black list). In this case, an open attitude must be adopted, excluding as much information as possible unless its inclusion can be justified as absolutely necessary for the intended goal.
- Strip: partially eliminate personal data as soon as they cease to be necessary, which requires establishing beforehand the storage period of each of the collected data items and setting up automatic deletion mechanisms for when that period is over. In case the data are part of a record that holds more information than is necessary, the value of the unnecessary fields can be replaced with a prefixed default value.
- Destroy: completely delete personal data as soon as they cease to be relevant, ensuring that it is impossible to recover both the data and any backup copies.

It is also important to remember that only strictly necessary data must be communicated and shared and that, in the case of processing where new personal information is extrapolated, generated data that are not necessary to achieve the intended purpose must also be deleted.
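The following Python sketch (illustrative only; the allowed fields, purpose and retention period are hypothetical) shows how the 'select' and 'strip' tactics might look in code: only whitelisted attributes are ever stored, and whole records are dropped once their retention period has elapsed.

```python
from datetime import datetime, timedelta, timezone

# Select tactic: only the attributes strictly needed for the stated purpose (white list).
ALLOWED_FIELDS = {"email", "country"}   # hypothetical purpose: sending a newsletter
RETENTION = timedelta(days=365)         # hypothetical retention period


def collect(form_data: dict) -> dict:
    """Keep only whitelisted attributes and time-stamp the record for later erasure."""
    minimal = {k: v for k, v in form_data.items() if k in ALLOWED_FIELDS}
    minimal["collected_at"] = datetime.now(timezone.utc)
    return minimal


def strip_expired(records: list) -> list:
    """Strip/Destroy tactics: drop whole records once the retention period has elapsed."""
    now = datetime.now(timezone.utc)
    return [r for r in records if now - r["collected_at"] <= RETENTION]


submitted = {"email": "a@example.org", "country": "ES", "birth_date": "1980-01-01", "phone": "+34 600 000 000"}
records = [collect(submitted)]          # birth_date and phone are never stored
print(records)
print(len(strip_expired(records)))      # 1 while still within the retention period
```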
Hide

This strategy focuses on limiting the observability of data by establishing the means necessary to guarantee confidentiality and unlinkability. The following tactics are used to implement this strategy:
- Restrict: restrict access to personal data by setting limits through an access control policy that implements the "need to know" principle, both in space (the details and type of data accessed) and in time (the processing stages).
- Obfuscate: make personal data unintelligible to those who are not authorised to consult them, using encryption and hashing techniques, both for storage and for the transmission of information.
- Dissociate: eliminate the link between datasets that should be kept independent, as well as the identification attributes of data records, to avert correlations between them, with special attention to metadata.
- Mix: group together information on various subjects using generalisation and suppression techniques [33] to avoid correlations.

[33] Spanish Data Protection Agency (AEPD) – Unit of Evaluation and Technological Studies. K-anonymity as a privacy measure, June 2019. https://www.aepd.es/media/notas-tecnicas/nota-tecnica-kanonimidad-en.pdf

Separate

The goal of this strategy is to avoid, or at least minimise, the risk that different personal data of the same individual, used in independent processes within one entity, can be combined to create a complete profile of the data subject. For this, it is necessary to maintain independent processing contexts that make it difficult to correlate datasets that should remain unlinked. The following tactics help to implement a separation strategy (a sketch follows the list):
- Isolate: collect and store personal data in different databases or applications that are independent, either logically or because they are executed on different physical systems, adopting additional measures to guarantee unlinkability, such as the programmed deletion of database indexes.
- Distribute: spread the collection and processing of the different subsets of personal data corresponding to different types of processing over management and handling units that are physically independent within the system, and use different systems and applications to implement decentralised and distributed architectures that process the information locally whenever possible, instead of centralised solutions with unified access that may depend on a single control unit.
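A toy Python sketch of the 'isolate' tactic (hypothetical stores and field names; in production these would be separate databases or systems, not in-memory dictionaries): directly identifying data and operational data live in separate stores, linked only by an opaque internal reference, so routine processing never needs to touch the identifying data.

```python
import uuid

identity_store = {}   # directly identifying data, tightly access-controlled
usage_store = {}      # operational data only, holds no names or contact details


def register(name: str, email: str) -> str:
    ref = str(uuid.uuid4())   # random reference with no meaning outside this system
    identity_store[ref] = {"name": name, "email": email}
    usage_store[ref] = {"logins": 0}
    return ref


def record_login(ref: str) -> None:
    # Processing in the usage context works only with the opaque reference.
    usage_store[ref]["logins"] += 1


ref = register("Jane Doe", "jane@example.org")
record_login(ref)
print(usage_store[ref])   # {'logins': 1} -- no identifying attributes in this context
```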
Abstract

The idea behind this strategy is to limit as much as possible the level of detail of the personal data that are processed. While the 'minimise' strategy makes a prior selection of the data to be collected, this strategy focuses on the degree of detail with which the data are processed and on their aggregation, by using three tactics (a sketch follows below):
- Summarise: generalise the values of the attributes using value ranges or intervals, instead of a concrete field value.
- Group: aggregate the information of a group of records into categories, using average or general values instead of the detailed information on each of the subjects that belong to the group [34].
- Perturb: use approximate values, or modify the real data with some type of random noise, instead of employing the exact value of the personal data.

For each data processing it is necessary to study how the level of detail of the input data affects the result, and what degree of precision is necessary for effective processing. In particular, the length of time since the data were collected may affect their relevance, which is why it is useful to periodically review the stored information and apply these strategies.

[34] We must remember that even an aggregated processing of records may pose a certain risk to privacy when a data subject can be classified as belonging to a particular group or profile (for example, persons with a specific illness or with a concrete risk profile).
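A brief Python sketch of the 'summarise' and 'perturb' tactics (illustrative; the interval width and noise scale are arbitrary choices, and a production system would rely on a vetted differential-privacy library rather than this toy noise generator):

```python
import math
import random


def summarise_age(age: int) -> str:
    """Summarise: report a ten-year interval instead of the exact age."""
    low = (age // 10) * 10
    return f"{low}-{low + 9}"


def perturb(value: float, scale: float = 2.0) -> float:
    """Perturb: add Laplace-distributed random noise so the exact reading
    is never stored or released."""
    u = random.random() - 0.5
    return value - scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))


print(summarise_age(37))   # "30-39"
print(perturb(21.4))       # e.g. 23.1 -- the exact measurement cannot be recovered
```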
Inform

This strategy implements the transparency goal and the principles established by the Regulation, and seeks to make data subjects fully aware of the processing of their data in a timely manner. Whenever processing is carried out, the subjects whose data are processed must know what is being processed, for what purpose and which third parties are given this information, in addition to everything laid down in Articles 13 and 14 of the GDPR [35][36]. Transparency with regard to this information is a basic requirement of privacy, as it allows data subjects to make informed decisions on the processing and accordingly provide free, specific, informed and unambiguous consent. Any modification of the processing with regard to previously provided information must be communicated, including possible security breaches that may significantly affect the rights and freedoms of data subjects.

This strategy is based on the existence of privacy clauses [37] that facilitate the overall communication of this information to data subjects, along with the use of the following tactics:
- Supply: provide data subjects with all the information required by the GDPR on what personal data are processed, how they are processed and why, identifying the grounds and the purpose. Details on data storage periods must be provided, as well as on the sharing of these data with third parties. Along with this information, which must be accessible and continually available in order to promote authentic transparency, it must also be indicated who can be contacted by data subjects, and how, in order to ask questions about their privacy and about their rights with regard to personal data protection.
- Explain: provide information on data processing in a concise, transparent, intelligible and easily accessible fashion, in clear and simple language. To avoid dense, complex and unwieldy information policies, it is worth adopting a layered approach which first provides basic information, at the same time and within the same medium in which the data are collected, and makes additional detailed information available at a second level.
- Notify: inform data subjects of the processing when the data are not obtained directly from them, at the time these data are obtained and within a maximum of one month or, if they are going to be used for communication with them, in the first message. Subjects must also be informed if their data are going to be transferred to third parties. Mechanisms must likewise be implemented to notify subjects of security breaches that may have happened and may pose a serious risk to their rights and freedoms, using clear and simple language to describe the nature of the breach. Considering that data collection procedures are varied, the means of notification must be adapted to the circumstances of each method used, including, additionally, the possible use of standardised icons that offer an overall view of the anticipated processing.

[35] Article 13. "Information to be provided where personal data are collected from the data subject" - General Data Protection Regulation (EU) 2016/679. https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:32016R0679&from=EN
[36] Article 14. "Information to be provided where personal data have not been obtained from the data subject" - General Data Protection Regulation (EU) 2016/679. https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:32016R0679&from=EN
[37] Spanish Data Protection Agency, Catalan Data Protection Authority, Basque Data Protection Agency. Guide for compliance with the duty to inform, Jan 2017. https://www.aepd.es/media/guias/guia-modelo-clausula-informativa.pdf

Control

This strategy is closely linked to the 'inform' strategy, and it seeks to provide subjects with control over the collection, processing, use and transfer of their personal data by implementing mechanisms that allow them to exercise their rights of access, rectification, erasure, objection, portability and restriction of processing, as well as the right to give and withdraw consent or to modify the privacy options in applications and services. To implement these mechanisms, the following tactics are used (a sketch follows below):
- Consent: obtain the consent of data subjects where there is no other basis of legitimacy. Consent must be given unambiguously, through a clear affirmative action, and must be explicit in certain situations such as the processing of sensitive data, the adoption of certain automated decisions or international transfers. In addition, the subject must be able to withdraw their consent at any time, through guaranteed mechanisms and procedures that make it as easy to withdraw consent as it is to give it.
- Alert: provide real-time notification to the user when personal data are being collected, even when general information on the legal basis of the processing has been provided or the subject has already given their consent.
- Choose: provide granular functionality in applications and services, especially when it comes to basic functionality, without making it contingent on the user's consent to the processing of personal data that are not required for its execution [38].
- Update: implement mechanisms that make it easy for users to request, or even directly make, revisions, updates and rectifications of the data provided for a specific processing, so that they are accurate and reflect reality.
- Retract: provide mechanisms for users to withdraw, or ask for the deletion of, the personal data they have provided to a controller for processing.

[38] Functions that require consent for their lawful use must be made available separately, irrespective of whether or not they are the main goal of the object.

The technological advances that make continuous data collection possible also make it possible for data subjects to manage their data easily, through privacy platforms where they can access, update, erase and modify the selected privacy settings. These functions must be accounted for when designing an application.
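A minimal Python sketch of the 'consent' and 'retract' tactics (hypothetical class and purpose labels, not an implementation prescribed by the guide): every grant and withdrawal is recorded with a timestamp, withdrawal is a single call, and processing code checks the register before using the data.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional


@dataclass
class Consent:
    purpose: str                          # e.g. "marketing-email" (hypothetical label)
    granted_at: datetime
    withdrawn_at: Optional[datetime] = None

    @property
    def active(self) -> bool:
        return self.withdrawn_at is None


class ConsentRegister:
    """Granting and withdrawing are equally easy, and the history is kept so the
    controller can later demonstrate the basis for each processing operation."""

    def __init__(self) -> None:
        self._by_user = {}

    def grant(self, user: str, purpose: str) -> None:
        self._by_user.setdefault(user, []).append(
            Consent(purpose, granted_at=datetime.now(timezone.utc)))

    def withdraw(self, user: str, purpose: str) -> None:
        for consent in self._by_user.get(user, []):
            if consent.purpose == purpose and consent.active:
                consent.withdrawn_at = datetime.now(timezone.utc)

    def allowed(self, user: str, purpose: str) -> bool:
        return any(c.purpose == purpose and c.active for c in self._by_user.get(user, []))


register = ConsentRegister()
register.grant("user-42", "marketing-email")
register.withdraw("user-42", "marketing-email")          # one call, as easy as granting
print(register.allowed("user-42", "marketing-email"))    # False
```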
Enforce

This strategy ensures that personal data processing is compatible with, and respects, the legal requirements and duties imposed by the regulations. For this, it is necessary to define a privacy framework and an administrative structure that includes a data protection policy supported by senior management, as well as the roles and responsibilities entrusted with its compliance. A privacy culture must be an essential part of the organisation, and all its members must take part in it. The following tactics may help to achieve this:
- Create: specify a data protection policy that reflects internally the privacy clauses communicated to data subjects. The necessary structures must be created, and resources assigned, to support this policy and to ensure that the organisation's processing activities respect and comply with data protection regulations. A training and awareness plan must also be developed for all members, seeking to ensure a committed and responsible attitude as part of accountability.
- Maintain: support the defined policy by establishing procedures and implementing the necessary technical and organisational measures. The existence of effective mechanisms and procedures must be reviewed in order to guarantee the exercise of rights, the handling and notification of security incidents, the adjustment of processing activities to legal requirements, and the provision of proof of compliance with the obligations imposed by the regulations.
- Uphold: ensure the compliance, effectiveness and efficiency of the privacy policy and of the procedures, measures and controls implemented, so that they account for all the processing activities carried out and for the day-to-day activities of the organisation.

The figure of the Data Protection Officer plays an essential role in implementing this strategy, by advising the controller and supervising compliance with data protection regulations within the organisation. Implementing privacy management models such as the one recently proposed by the ISO/IEC 27701:2019 standard [39] is also effective: it lists the requirements and provides directions for establishing, implementing, maintaining and continually improving a Privacy Information Management System (PIMS).

[39] Technical Committee ISO/IEC JTC 1/SC 27. ISO/IEC 27701:2019 Security techniques – Extension to ISO/IEC 27001 and ISO/IEC 27002 for privacy information management – Requirements and guidelines, August 2019. https://www.iso.org/standard/71670.html

Demonstrate

This strategy goes one step further than the previous one: its goal is that, in accordance with Article 24 of the GDPR [40], the data controller must be able to demonstrate to data subjects as well as to the supervisory authorities that the applicable data protection policy is being adhered to, in addition to the other legal requirements and obligations imposed by the Regulation. From a practical point of view, this implements the accountability demanded by the Regulation, based on a critical, continuous and traceable self-analysis of all decisions on data processing, ensuring authentic management of personal data within the organisation.

[40] Article 24. "Responsibility of the controller" - General Data Protection Regulation (EU) 2016/679. https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:32016R0679&from=EN#d1e3106-1-1
The following tactics are used to carry out this strategy, in order to ensure and demonstrate that the processing is in accordance with the Regulation (a logging sketch illustrating the 'record' and 'audit' tactics follows Table 4 below):
- Record: document each and every decision taken, even when they contradict each other, identifying who took the decision, when and why.
- Audit: carry out a systematic, independent and documented review of the degree of compliance with the data protection policy.
- Report: document audit results and any other incident regarding personal data processing operations, and make them available to the supervisory authorities whenever required.

In the case of a new data processing, if the result of the data protection impact assessment shows that the processing would involve a high risk to the rights and freedoms of data subjects unless the controller adopts measures to mitigate it, the prior consultation referred to in Article 36 of the GDPR [41] must be performed.

Performing a risk analysis and, when applicable, a data protection impact assessment, together with the documentation of the decisions taken with regard to the results, is a good starting point for establishing the privacy requirements that must be implemented in applications and systems as part of privacy by design, as well as for fully documenting how personal data are processed, following the principle of accountability. Other resources for demonstrating the controller's fulfilment of their obligations are adherence to codes of conduct and certifications, as optional instruments for implementing this strategy.

[41] Article 36. "Prior consultation" - General Data Protection Regulation (EU) 2016/679. https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:32016R0679&from=EN#d1e3210-1-1

PRIVACY DESIGN STRATEGY / DESCRIPTION AND TACTICS / DESIGN CONTROLS AND PATTERNS

Data oriented strategies:
- Minimise: limit the processing of personal data as much as possible. Tactics: select, exclude, strip and destroy. Controls and patterns: anonymisation; pseudonymisation; blocking correlation in federated identity management systems.
- Hide: avoid making personal data public or known. Tactics: restrict, obfuscate, dissociate and mix. Controls and patterns: encryption; mix networks; attribute based credentials; anonymous black lists.
- Separate: keep personal datasets separate. Tactics: isolate and distribute. Controls and patterns: homomorphic encryption; physical and logical separation.
- Abstract: limit the level of detail used in personal data processing as much as possible. Tactics: summarise, group and perturb. Controls and patterns: time-based group; k-anonymity; added noise measurement obfuscation; dynamic location granularity; differential privacy.

Process oriented strategies:
- Inform: keep data subjects informed of the nature and conditions of processing. Tactics: supply, explain and notify. Controls and patterns: security breach notifications; dynamic visualisation of the privacy policy; privacy icons; ambient notices; privacy dashboard.
- Control: provide data subjects with effective control over their personal data. Tactics: consent, alert, choose, update, retract. Controls and patterns: active broadcast of presence; credential selection; informed consent.
- Enforce: respect and promote the fulfilment of the obligations set by current regulations and by the data protection policy itself. Tactics: create, maintain, uphold. Controls and patterns: privacy impact assessment in federated identity management solutions; access control; obligation management; adherence to policies.
- Demonstrate: to be able to demonstrate that data processing is respectful of privacy. Tactics: record, audit and report. Controls and patterns: audit; logs.

Table 4 – Privacy design strategies along with tactics and privacy patterns for implementation
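Relating to the 'record' and 'audit' tactics above and to the audit and log controls listed in Table 4, the following Python sketch (a hypothetical, simplified illustration) keeps an append-only log of processing decisions in which each entry is chained to the previous one, so that later tampering can be detected during an audit and a report can be produced on request.

```python
import hashlib
import json
from datetime import datetime, timezone

audit_log = []   # in production this would be an append-only store, not an in-memory list


def record_decision(actor: str, decision: str, reason: str) -> dict:
    """Record: who decided what, when and why; each entry includes the digest of
    the previous entry so the whole chain can be verified later."""
    previous = audit_log[-1]["digest"] if audit_log else ""
    entry = {
        "actor": actor,
        "decision": decision,
        "reason": reason,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "previous": previous,
    }
    entry["digest"] = hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()
    audit_log.append(entry)
    return entry


def audit() -> bool:
    """Audit: recompute the chain to verify that no entry was altered or removed."""
    previous = ""
    for entry in audit_log:
        body = {k: v for k, v in entry.items() if k != "digest"}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["previous"] != previous or entry["digest"] != expected:
            return False
        previous = entry["digest"]
    return True


record_decision("dpo", "enable pseudonymisation for the CRM export", "DPIA action item 3")
print(audit())   # True; a report for the supervisory authority could serialise audit_log
```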
PRIVACY DESIGN STRATEGY / DESCRIPTION AND TACTICS / DESIGN CONTROLS AND PATTERNS

Data-oriented strategies

Minimise. Limit the processing of personal data as much as possible. Tactics: select, exclude, strip and destroy. Controls and patterns: anonymisation; pseudonymisation; blocking correlation in federated identity management systems.

Hide. Avoid making personal data public or known. Tactics: restrict, obfuscate, dissociate and mix. Controls and patterns: encryption; mix networks; attribute-based credentials; anonymous black lists.

Separate. Keep personal datasets separate. Tactics: isolate and distribute. Controls and patterns: homomorphic encryption; physical and logical separation; time-based grouping.

Abstract. Limit the level of detail used in personal data processing as much as possible. Tactics: summarise, group and perturb. Controls and patterns: k-anonymity; added noise measurement obfuscation; dynamic location granularity; differential privacy.

Process-oriented strategies

Inform. Keep data subjects informed of the nature and conditions of processing. Tactics: supply, explain and notify. Controls and patterns: security breach notifications; dynamic visualisation of the privacy policy; privacy icons; ambient notices; privacy dashboard.

Control. Provide data subjects with effective control over their personal data. Tactics: consent, alert, choose, update, retract. Controls and patterns: active broadcast of presence; credential selection; informed consent.

Enforce. Respect and promote the fulfilment of the obligations set by current regulations and by the data protection policy itself. Tactics: create, maintain, uphold. Controls and patterns: privacy impact assessment in federated identity management solutions; access control; obligation management; adherence to policies.

Demonstrate. To be able to demonstrate that data processing is respectful of privacy. Tactics: record, audit and report. Controls and patterns: audit; logs.

Table 4 – Privacy design strategies along with tactics and privacy patterns for implementation

V. PRIVACY DESIGN PATTERNS

Once the privacy goals and strategies to be included in the product, system, application or service are established as part of its definition, it is necessary to integrate them into its design. For this, privacy design patterns are used: reusable solutions to common privacy problems that appear repeatedly in a specific context during the development of products and systems. Typically, the description of a design pattern contains at least its name, its purpose, its context of application, objectives, structure, implementation (relation to other patterns), the consequences of its application and known uses.

Figure 7 – Example of privacy design pattern

As mentioned before, a design pattern can be used to implement more than one privacy strategy, so patterns are not closed and exclusive solutions; rather, privacy strategies must be approached from a combined, overall perspective. For example, the 'Added noise measurement obfuscation' pattern displayed in Figure 7, which adds noise to real measurements taken during the operation of a service so that additional information cannot be extrapolated, lets us implement the 'abstract', 'hide' and 'minimise' strategies simultaneously.
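By way of illustration, the following minimal sketch (in Python) shows the core idea of this pattern: zero-mean random noise, here Gaussian with a scale chosen arbitrarily for the example, is added to each reading before it leaves the device, so that individual values are masked while aggregates remain approximately correct. A real implementation would calibrate the noise to the accuracy and privacy requirements of the service.

```python
# Minimal sketch of added-noise measurement obfuscation.
import random

def obfuscate(readings, noise_scale=0.5):
    """Add zero-mean Gaussian noise to each measurement before reporting it."""
    return [value + random.gauss(0.0, noise_scale) for value in readings]

hourly_consumption_kwh = [0.30, 0.28, 1.95, 0.41]   # real metering values
reported = obfuscate(hourly_consumption_kwh)

# Individual readings are masked, but the total stays close to the real one.
print(reported, round(sum(reported), 2), round(sum(hourly_consumption_kwh), 2))
```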
DESIGN PATTERN CATALOGUES

There are different collections or catalogues of privacy design patterns where we can find comprehensive definitions, their goals and information on how to use them.

The PRIPARE project (Preparing Industry to Privacy by design by supporting its Application in Research), funded by the European Union, has developed a catalogue of 26 privacy design patterns. Another similar initiative is the result of a project carried out at the Vienna University of Economics and Business, which has created an interactive repository of solutions that classifies 40 privacy design patterns according to the 11 privacy principles defined by the ISO/IEC 29100:2011 standard. There is another collaborative initiative among various centres and universities that maintains a pattern catalogue in order to operationalise legal requirements in specific solutions, standardise the language of privacy, document and compile common solutions to concrete problems, and help system and application designers to identify privacy problems and respond to them.

Annex 1 lists these patterns in the form of a table with links to each entry: a total of 54 privacy design patterns that have been published on websites as a result of the aforementioned initiatives, with a brief summary of the purpose of each pattern and the privacy design strategy or strategies that they seek to implement.

VI. PRIVACY ENHANCING TECHNOLOGIES (PETS)

Once the privacy strategies of the prospective product, system, application or service have been defined and the privacy patterns designed, we come to their implementation at the development stage by means of a specific technological solution. Privacy Enhancing Technologies, or PETs, are an organised and coherent group of ICT solutions that reduce privacy risks by implementing the previously defined strategies and patterns. Due to the changing technological context, the effectiveness of each PET in terms of privacy protection varies over time, which makes it difficult to provide an up-to-date classification and typology. A PET may be an independent tool bought and installed by the end user on their personal computer or a complex information systems structure.

42 ATOS, Inria, Gradiant, Trilateral and UPM. Privacy and Security by Design Methodology Handbook, Dec 2015. http://pripareproject.eu/wp-content/uploads/2013/11/PRIPARE-Methodology-Handbook-Final-Feb-24-2016.pdf
43 Online – privacypatterns.eu – collecting patterns for better privacy. https://privacypatterns.eu/
44 Olha Drozd, Sabrina Kirrane, Sarah Spiekermann – Vienna University of Economics and Business. Towards an Interactive Privacy Pattern Catalog. 12th Symposium on Usable Privacy and Security (SOUPS 2016), Jun 2016, Denver, CO. https://www.researchgate.net/publication/305811615_Towards_an_Interactive_Privacy_Pattern_Catalog
45 Olha Drozd – Vienna University of Economics and Business. Privacy Pattern Catalogue: A Tool for Integrating Privacy Principles of ISO/IEC 29100 into the Software Development Process, July 2016. https://www.researchgate.net/publication/304995300_Privacy_Pattern_Catalogue_A_Tool_for_Integrating_Privacy_Principles_of_ISOIEC_29100_into_the_Software_Development_Process
46 Online – privacypatterns. http://privacypatterns.wu.ac.at:8080/catalog/
47 Technical Committee ISO/IEC JTC 1/SC 27. ISO/IEC 29100:2011 Information Technology – Security techniques – Privacy Framework, Dec 2011. https://www.iso.org/standard/45123.html
48 Online – Privacy patterns. https://privacypatterns.org/
49 COM(2007)228 COMMUNICATION FROM THE COMMISSION TO THE EUROPEAN PARLIAMENT AND THE COUNCIL on promoting data protection by Privacy Enhancing Technologies (PETs). https://eur-lex.europa.eu/legal-content/ES/TXT/PDF/?uri=CELEX:52007DC0228&from=EN

CLASSIFICATION OF PETS

There are multiple classifications of PETs, most of them based on technical characteristics. Another possible classification of these tools, and the one offered in this guide, is based on the goals that they pursue. They are therefore classified according to whether they are meant to protect privacy or to manage it, thus maintaining a focus consistent with the classification of strategies discussed earlier. The first group combines tools and technologies that actively protect privacy during the processing of personal data (for example, hiding personal data or eliminating the need for identification). The second group deals with tools and technologies that support procedures related to privacy management but do not actively operate on the data.

CATEGORY / SUBCATEGORY / DESCRIPTION

Privacy protection
Pseudonymisation tools: allow transactions without asking for personal information.
Anonymisation products and services: provide access to services without requiring the data subject's identification.
Encryption tools: protect documents and transactions from being viewed by third parties.
Filters and blockers: avoid undesired emails and web content.
Anti-trackers: eliminate the user's digital footprint.

Privacy management
Information tools: create and verify privacy policies.
Administrative tools: manage user identity and permissions.

Table 5 – One possible classification of PETs (META Group Report)
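As a simple illustration of the first of these subcategories (pseudonymisation tools), the sketch below derives a stable pseudonym from an identifier with a keyed hash (HMAC-SHA-256), so that records can still be linked for processing without exposing the original identifier. The inline key is a placeholder for the example; in practice the key would be generated securely and stored separately from the pseudonymised dataset.

```python
# Minimal sketch of keyed pseudonymisation with HMAC-SHA-256.
import hmac
import hashlib

SECRET_KEY = b"placeholder-key-kept-separate-from-the-dataset"  # simplified key handling

def pseudonymise(identifier: str) -> str:
    """Return a stable pseudonym; without the key it cannot be linked back."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

print(pseudonymise("jane.doe@example.com"))
print(pseudonymise("jane.doe@example.com"))  # same input, same pseudonym
```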
PET CATALOGUE

Similar to privacy design patterns, there is no single unified catalogue of PET tools and technologies, although there are different initiatives.

The Technology Analysis Division of the Office of the Privacy Commissioner of Canada has developed a general overview of PETs based on their functionality and given some concrete examples of solutions.

50 Lothar Fritsch, Norwegian Computing Center Report, No 1013. State of the art of Privacy-enhancing Technology (PET), Nov 2007. https://www.nr.no/publarchive?query=4589
51 Ministry of Science, Technology and Innovation, Denmark. Privacy Enhancing Technologies – META Group Report v1.1, Mar 2005. https://danskprivacynet.files.wordpress.com/2008/07/rapportvedrprivacyenhancingtechlologies.pdf
52 The Technology Analysis Division of the Office of the Privacy Commissioner. Privacy Enhancing Technologies – A Review of Tools and Techniques, 2017. https://www.priv.gc.ca/en/opc-actions-and-decisions/research/explore-privacy-research/2017/pet_201711/

The Center for Internet and Society (CIS) at Stanford University's Law Faculty (California) has published a database of PET tools and technologies as an open-source wiki, so that users can have better control over their personal data. In Europe, the European Data Protection Supervisor (EDPS) has developed IPEN (Internet Privacy Engineering Network) in order to support developers in using privacy design patterns and other reusable building blocks that aim to protect and improve privacy in a more efficient and effective manner.

In 2015, ENISA carried out the study "Online privacy tools for the general public", which analysed PET tools for online privacy protection and compiled a list of web portals that promote the use of this technology. Although the tools suggested in the portals displayed in Table 6 are generally software applications aimed at end users to improve their personal data protection, their analysis and study is also useful for data controllers, as examples of the privacy requirements that must be included in the services, products and applications to be developed. If these solutions (secure communications encryption, anonymisers, anti-trackers, etc.) are integrated into systems as standard, they will prevent end users from being left unprotected or from having to add the missing privacy layer later by installing third-party tools.

Following this initial study, ENISA has published several reports on the evolution of PET tools and the development of a methodology for comparing PET maturity. It is working on developing a platform that provides support as well as a centralised repository of the solutions best suited to the intended privacy goals.
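To illustrate the kind of encryption building block that can be integrated into a system as standard, rather than left for the end user to install, here is a minimal sketch using the Fernet symmetric scheme from the Python 'cryptography' package. The key handling is deliberately simplified; a real deployment would rely on proper key management.

```python
# Minimal sketch of document encryption with the 'cryptography' package (Fernet).
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # in practice, managed by a key store
fernet = Fernet(key)

document = b"Personal data that should not be readable by third parties"
token = fernet.encrypt(document)     # ciphertext safe to store or transmit
assert fernet.decrypt(token) == document
```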
53 The Center for Internet and Society, Stanford University. Cyberlaw PET wiki. https://cyberlaw.stanford.edu/wiki/index.php/PET
54 https://edps.europa.eu/data-protection/ipen-internet-privacy-engineering-network_en
55 European Union Agency for Cybersecurity (ENISA). Online privacy tools for the general public, Dec 2015. https://www.enisa.europa.eu/publications/privacy-tools-for-the-general-public/at_download/fullReport
56 European Union Agency for Cybersecurity (ENISA). Readiness Analysis for the Adoption and Evolution of Privacy Enhancing Technologies, Mar 2016. https://www.enisa.europa.eu/publications/pets/at_download/fullReport
57 European Union Agency for Cybersecurity (ENISA). PETs control matrix – A systematic approach for assessing online and mobile privacy tools, Dec 2016. https://www.enisa.europa.eu/publications/pets-controls-matrix/pets-controls-matrix-a-systematic-approach-for-assessing-online-and-mobile-privacy-tools/at_download/fullReport
58 European Union Agency for Cybersecurity (ENISA). Privacy Enhancing Technologies: Evolution and State of the Art, Mar 2017. https://www.enisa.europa.eu/publications/pets-evolution-and-state-of-the-art/at_download/fullReport
59 European Union Agency for Cybersecurity (ENISA). A tool on Privacy Enhancing Technologies (PETs) knowledge management and maturity assessment, Mar 2018. https://www.enisa.europa.eu/publications/pets-maturity-tool/at_download/fullReport
60 UNIPI Workshop on Privacy Enhancing Technologies. Evgenia Nikolouzou – ENISA Data Security and Standardization Unit. PETs Repository Community Building and Evaluation, Nov 2018. https://www.enisa.europa.eu/events/personal-data-security/pets-maturity

WEBSITE (ORGANISATION), URL: DESCRIPTION

Secure Messaging Scorecard (Electronic Frontier Foundation, EFF), https://www.eff.org/deeplinks/2018/03/secure-messaging-more-secure-mess: a presentation and evaluation of secure messenger applications and tools, using a list of predefined criteria.

PRISM Break (Nylira, Peng Zhong), https://prism-break.org/en/: a selection of tools to prevent tracking and mass surveillance, such as encryption tools, anonymisers, etc.

Security in-a-box (Tactical Technology Collective and Front Line Defenders), https://securityinabox.org/en/: a security website for general use which includes privacy protection tools, such as encryption tools.

EPIC Online Guide to Practical Privacy Tools (Electronic Privacy Information Center, EPIC), https://www.epic.org/privacy/tools.html: offers lists of privacy tools arranged according to different areas (browser add-ons, anonymisers, etc.).

The Ultimate Privacy Guide (BestVPN, 4Choice Ltd), https://proprivacy.com/guides/the-ultimate-privacy-guide: a security website for general users which offers lists of commercial VPNs; the privacy guide provides a list of tools classified according to area.

Free Software Directory (Free Software Foundation, FSF), https://directory.fsf.org/wiki/Main_Page: a website for general users with information on security and privacy freeware, focusing mainly on encryption.

Privacytools.io (Privacytools.io), https://www.privacytools.io: offers lists of tools to safeguard privacy, such as VPNs, browser add-ons, etc.

Me & My Shadow (Tactical Technology Collective), https://myshadow.org: a website focusing mainly on digital footprints and online tracking; offers recommendations on various relevant tools.

Gizmo's Freeware (Gizmo's Freeware), http://www.techsupportalert.com/content/free-windows-desktop-software-security-list-privacy.htm: a website on general-use freeware, which also provides a list of open-source privacy tools.

Best Privacy Tools (Best Privacy Tools), http://bestprivacytools.com/: offers a list of privacy tools, especially messenger apps, VPNs, secure browsing, etc.

Internet Privacy Tools (Internet Privacy Tools), http://privacytools.freeservers.com: offers a list of privacy tools, especially email filters, browser-based encryption, etc.

Reset The Net Privacy Pack (Fight for the Future and Center for Rights), https://pack.resetthenet.org: offers a list of free privacy tools and pertinent advice (for example, secure communications, anonymous browsing, etc.).

Table 6 – Websites that promote the use of online privacy tools to the general public, according to the ENISA study "Online privacy tools for the general public"
VII. CONCLUSIONS

Within a context where organisations and companies are developing, every day, services based on an intensive use of personal data, and whose impact on privacy is visibly strengthened by the use of disruptive technologies, it becomes necessary to adopt effective and efficient technical and organisational measures that ensure that the rights and freedoms of persons are respected with regard to the processing of their personal data.

Ensuring privacy and establishing a framework of governance that guarantees personal data protection does not represent an obstacle to innovation. On the contrary, it offers advantages and opportunities for the different participants: for organisations, it means improved efficiency, optimised processes, a cost-reduction strategy and a competitive edge; for the market, it means the development of long-term sustainable economic models; for society as a whole, it means being able to access the benefits of technological advances without having to compromise on individual freedoms and independence. Ensuring privacy is indeed innovation in itself, and it introduces a new technological discipline: privacy engineering.

The efficient and effective implementation of privacy principles requires them to be an integral part of the nature of products and services. To achieve this, they must be taken into account from the initial stages of concept development, design and development, as another part of the set of specifications, both functional and non-functional. This approach is known as Privacy by Design.

Privacy by design involves the use of a methodological approach centred on risk management and accountability that lets us determine privacy requirements by means of practices, procedures and tools. For this:

The risk analysis will establish the specific data protection goals (unlinkability, transparency and intervenability) as well as the security goals from the perspective of privacy (confidentiality, availability and integrity) that guarantee the basic principles established in Article 5 of the GDPR.

Next, the data-oriented and process-oriented privacy strategies that specify the requirements of each privacy goal are to be studied. These strategies are: 'minimise', 'hide', 'separate', 'abstract', 'inform', 'control', 'enforce' and 'demonstrate', and for each of them the protection tactics that implement these strategies effectively are to be defined.

In the design stage, the selected tactics shall be integrated by means of available solutions, that is to say, privacy design patterns that deal with common and recurring problems, by consulting the available catalogues, of which a selection is given in this document.

Finally, in the development stage, these patterns shall be implemented. This implementation shall be carried out by development teams either by programming the code with the necessary functionality or, whenever possible, by using existing ICT solutions, i.e., Privacy Enhancing Technologies.
In any case, data protection by design is the data controller's obligation, and they must work to guarantee it whatever the means used (development, acquisition or subcontracting of systems, products or services), without completely delegating to third parties (manufacturers and processors) the responsibility of implementing this principle. As part of fulfilling this duty, they must actively participate in privacy engineering tasks by defining the requirements that must be taken into account, continually monitoring their correct implementation and verifying their full operability before the system goes into production, so that the privacy of the individuals whose data are to be processed is guaranteed.

61 Article 5. "Principles relating to processing of personal data", General Data Protection Regulation (EU) 2016/679. https://eur-lex.europa.eu/legal-content/ES/TXT/HTML/?uri=CELEX:32016R0679&from=ES#d1e1873-1-1

VIII. ANNEX 1: SELECTION OF PRIVACY DESIGN PATTERNS

NAME OF DESIGN PATTERN (STRATEGY OR STRATEGIES SUPPORTED): OBJECTIVE AND PURPOSE

Added Noise Measurement Obfuscation (strategies: Abstract, Hide, Minimise). Modifies the detailed measurements of use, or of any other attribute of a service, by adding noise values that mask the real data, in order to prevent the deduction of patterns and behaviours by unauthorised third parties that may intercept the communication.

Aggregation in Time (strategies: Abstract). Consists of collecting data from different moments in time and processing the information in an aggregated manner to protect privacy.

Differential Privacy (strategies: Abstract). Modifies query results by adding new data (noise) randomly drawn from a distribution generated from the original data, so that statistically the modification has an insignificant effect on the results of the algorithm analysing the data, but it still preserves the privacy of the individuals.

Trustworthy Privacy Plug-in (strategies: Abstract). On many occasions, providing a service means taking detailed and repetitive measurements that, evaluated over time, can reveal certain behaviours and put the subject's privacy at risk. This plug-in securely aggregates the detailed values of records on the user side for the intended goal, but hides the detailed, disaggregated values.

Dynamic Location Granularity (strategies: Abstract, Minimise). Uses k-anonymity to reduce the precision of the user's location in location-based services (LBS), while maintaining a balance with regard to the information required to provide the service.
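The intuition behind this last pattern can be illustrated with a minimal sketch (it does not implement the k-anonymity machinery itself): coordinates are snapped to a coarser grid before being sent to the location-based service, with a fixed cell size chosen here purely for the example, whereas a real implementation would adapt the granularity dynamically to how many users share the reported area.

```python
# Minimal sketch of coarsening location granularity before reporting it.
def coarsen(lat: float, lon: float, cell_degrees: float = 0.01):
    """Snap coordinates to a grid cell (roughly 1 km in latitude at this size)."""
    def snap(value: float) -> float:
        return round(round(value / cell_degrees) * cell_degrees, 4)
    return snap(lat), snap(lon)

precise_position = (40.41678, -3.70379)   # exact user position
print(coarsen(*precise_position))          # coarsened position sent to the LBS
```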
Aggregation Gateway (strategies: Abstract, Hide, Separate). Implements homomorphic encryption: measurements are encrypted, aggregated in encrypted form and later decrypted. By operating on encrypted information, it is possible to process measurements taken from a user at different moments in time without extrapolating patterns of behaviour; the gateway works with aggregated data without accessing the individual information.

Active Broadcast of Presence (strategies: Control). Allows the user to decide when they want to actively share information, especially location-based information. Information broadcast settings must not be applied across the board by default and, when in doubt, the user's clarification must always be sought.

Obtaining Explicit Consent (strategies: Control). For certain processing operations, the data controller must obtain the user's informed consent. Implementing this pattern ensures the display of a clear, concise and understandable notice before data are collected and the processing begins, whereby, by using the service, the user consents to the processing of the required data and is aware of the possible consequences. Complete details must be easily accessible so that the user can decide whether or not to use the service.

Private Links (strategies: Control). In environments where the controller provides users with a content storage service, the stored content may include personal data. If the user wishes to share part of the content, but in a limited manner, implementing this pattern allows them to send certain individuals a private link which provides access to this information without making it completely public.

Sticky Policies (strategies: Control). Privacy policies that are automatically read and interpreted, and which accompany data shared with third parties in order to define their uses, limits and the user's preferences, thus improving the user's control over their personal data.

Enable/Disable Functions (strategies: Control). Data controllers often collect more data than is strictly necessary in order to provide additional functionalities related to the main processing goal. This pattern allows users to selectively choose the system functions that they want to use and to provide only those data that are required for them.

Selective Access Control (strategies: Control). Used in forums, social networks and content-based websites, it provides users with a tool to define the visibility of their posts and the content they share by