Usable Privacy PDF
Universität Paderborn
Summary
This document discusses the concept of privacy in the digital age, emphasizing the importance of usable privacy in the design of digital systems. It examines different definitions of privacy and introduces several pitfalls to avoid in privacy design.
Full Transcript
USABLE PRIVACY — USABLE SECURITY & PRIVACY

Privacy is hard to define
"Privacy is a value so complex, so entangled in competing and contradictory dimensions, so engorged with various and distinct meanings, that I sometimes despair whether it can be usefully addressed at all."
-- Robert C. Post, "Three Concepts of Privacy", 89 Geo. L.J. 2087 (2001).
What does privacy mean to you? Draw it.

Mental models
Mental models were described by psychologist Johnson-Laird as "structural analogues of the world as perceived and conceptualized, which enable people to make inferences and predictions, to understand phenomena, to decide and control actions, and to experience events by proxy".
D. Jonassen and Y. H. Cho. Externalizing mental models with mindtools. In D. Ifenthaler, P. Pirnay-Dummer, and J. M. Spector, editors, Understanding Models for Learning and Instruction, pages 145-159. Springer US, Boston, MA, 2008.

Mental models
Oates, M., Ahmadullah, Y., Marsh, A., Swoopes, C., Zhang, S., Balebako, R. and Cranor, L.F., 2018. Turtles, locks, and bathrooms: Understanding mental models of privacy through illustration. Proceedings on Privacy Enhancing Technologies, 2018(4), pp. 5-32.
https://cups.cs.cmu.edu/privacyillustrated/

Privacy Definitions
Privacy is someone's right to...
1. Be alone and control access to themselves
- Right to be let alone (Warren & Brandeis, 1890)*
- Right to control access to themselves
- Right to have a space that is inaccessible to others
* Warren, S.D. and Brandeis, L.D., 1890. The Right to Privacy. Harv. L. Rev., 4, p. 193.
2. Keep personal matters and relationships secret
- Right to control the disclosure and use of personal data
- Right to not disclose information
"the claim of individuals ... to determine for themselves when, how, and to what extent information about them is communicated to others." -- Westin, 1967

Be alone and control access to themselves — also digitally.
Keep personal matters and relationships secret — also digitally.

Solove's Privacy Taxonomy – a pluralistic view
Rejects the need for a single definition to allow nuance and flexibility; defines a taxonomy of privacy concepts* encompassing key elements of previous definitions.
* Solove, D.J., 2005. A taxonomy of privacy. U. Pa. L. Rev., 154, p. 477.

Different from security!
- Context: privacy is very contextual; with whom?, for what?, where?, why?, and when? can all impact privacy.
- Nuanced control: control over data in everyday life shifts quickly.
- Low and high fidelity: high fidelity is good for the computer because it provides more information about what the user wants, but it takes time and self-reflection to express detailed opinions.

Usable privacy?
It's not always easy to:
- understand privacy
- control disclosure and exercise rights

Usable privacy
Lessons learned in usable security also apply:
- Involve and design for the users, including developers.
- Reduce friction (privacy is not the primary task either).
- The same research methods and metrics apply: efficacy, efficiency, satisfaction, memorability, learnability...
- Usable interfaces are important, but there is more to usability.
- And there are a few new challenges to consider!
Human decision making in privacy is challenging
- Benefits and costs associated with privacy decisions are complex.
- Incomplete information, uncertainty.
- Lack of knowledge about technological or legal forms of protection.
- A privacy concern is bundled with other useful functionality.
- People trade long-term privacy for short-term benefits.
- Privacy is understood differently by different people and cultures.
Acquisti and Grossklags. Privacy and rationality in individual decision making. IEEE Security & Privacy, 3(1):26-33, 2005.

5 Pitfalls of Privacy
Understanding
1 | Obscuring potential information flow — Designs should not obscure the potential for information disclosure.
2 | Obscuring actual information flow — Users should understand what information is being disclosed and to whom. Information flow covers: the type of information, the observers, the media through which it is conveyed, the length of retention, the potential for unintended disclosure, and the collection of metadata.
Action
3 | Emphasizing configuration over action — Designs should not require excessive configuration to manage privacy; privacy should be realized naturally, in context.
4 | Lacking coarse-grained control — Designs should provide an easy-to-find top-level method for stopping and resuming disclosure (see the sketch after these examples).
5 | Inhibiting established practice — People have existing methods of managing privacy; technology should support these, not inhibit them.
Scott Lederer, Jason I. Hong, Anind K. Dey, and James A. Landay. Five pitfalls in the design for privacy. In: Security and Usability: Designing Secure Systems That People Can Use, Chapter 21, 2005.

Examples:
- Obscuring potential information flow.
- Obscuring actual information flow: the disclosure should be obvious to the user as it occurs.
- Emphasizing configuration over action: too many up-front decisions before starting to use Facebook.
- Lacking coarse-grained control: a bad example is a shopping website where it is not possible to switch off tracking while I'm searching for something specific that I don't want included in my history.
- Inhibiting established practice: verbal conversations are temporary; once something has been said, it is gone forever. Email and texts are not like that, which inhibits established practice. Snapchat and other IM applications recreate the idea of something you say being only in the moment.
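To make pitfall 4 concrete, here is a minimal sketch of a coarse-grained control: one easy-to-find, top-level switch that gates every disclosure, independent of the fine-grained per-item settings. The class and method names are illustrative assumptions, not taken from the pitfalls chapter.

    # Minimal sketch of a coarse-grained privacy control (names are hypothetical).
    from dataclasses import dataclass, field

    @dataclass
    class PrivacyControls:
        sharing_paused: bool = False  # the single top-level switch
        per_item_settings: dict = field(default_factory=dict)  # fine-grained config

        def pause_all_sharing(self):
            """One obvious action stops every disclosure at once."""
            self.sharing_paused = True

        def resume_sharing(self):
            self.sharing_paused = False

        def may_disclose(self, item: str) -> bool:
            # The coarse switch always wins; fine-grained settings apply otherwise.
            if self.sharing_paused:
                return False
            return self.per_item_settings.get(item, False)

    controls = PrivacyControls(per_item_settings={"location": True})
    controls.pause_all_sharing()
    assert controls.may_disclose("location") is False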
Exercise: think about bad and good examples for each case.

5 Pitfalls of Privacy – General Conclusions
- Make clear how information is being shared.
- Make it easy and natural for users to control privacy.
- Make the default practice match user expectations.

How to design for PRIVACY? Foundations
International Association of Privacy Professionals (IAPP)
- Non-profit organization seeking to define, promote and improve the privacy profession globally.
- Its "Body of Knowledge" covers well-established privacy risk models, frameworks and design principles, including Privacy by Design.
https://iapp.org

History
In the past, privacy mostly meant "legally compliant" ("window dressing", little impact), with a gap in translating legal principles into actionable requirements.
1970s: technology development seen not only as the cause of increasing privacy concerns but also as part of the solution.
- Research and technological advances in anonymous transactions, communications, and cryptography.
- PETs are born!
Yet security and privacy were not integrated as primary requirements in the development of the Internet.
Buttarelli, G. Preliminary Opinion on Privacy by Design. Opinion 5/2018. https://edps.europa.eu/data-protection/our-work/publications/opinions/privacy-design_en

History
Internet Engineering Task Force (IETF), 2013: "the scale of recently reported monitoring is surprising. Such scale was not envisaged during the design of many Internet protocols..."

Privacy by Design (PbD)
"Privacy must be embedded into every standard, protocol and process that touches our lives." — Ann Cavoukian

Privacy by Design (PbD)
7 principles*, by Ann Cavoukian (Information & Privacy Commissioner, Ontario). A framework for building privacy proactively into new systems.
- Proposed in 2009.
- Not only applicable to technology, also to processes.
- Internationally accepted as a standard for privacy engineering.
- Reflected in many new legal instruments (e.g. GDPR).
https://gpsbydesign.org/ann-cavoukian-privacy-by-design/
* https://iapp.org/resources/article/privacy-by-design-the-7-foundational-principles/

7 principles of Privacy by Design (PbD)
1. Proactive not Reactive; Preventative not Remedial
2. Privacy as the Default (sketched below)
3. Privacy Embedded into Design
4. Full Functionality – Positive-Sum, not Zero-Sum
5. End-to-End Security – Full Lifecycle Protection
6. Visibility and Transparency – Keep it Open
7. Respect for User Privacy – Keep it User-Centric
Adapted from Tobias Pulls' slides (source: https://kau.instructure.com/courses/5335/pages/ann-cavoukians-privacy-by-design)

Privacy by Design – Conclusions
Privacy by Design is an excellent general framework, but it is still vague and challenging for engineers or designers to operationalize.
Current situation: limited uptake of commercial products and services fully embracing PbD; PETs have made some way into the mainstream commercial offer; strong relevant research in data science, cryptography, usable security, and machine learning, as well as the human sciences.

Privacy by Design – Conclusions
Further efforts needed:
- Research to make PETs a more mature and affordable technology.
- Policies promoting PETs; public administrations should lead by example.
- Economic incentives (especially for SMEs): it is hard to compete in the data-driven business landscape, and investing in PETs can be an obstacle.
- Paradigm shift: development of new, creative business models with individuals at the centre!
Buttarelli, G. Preliminary Opinion on Privacy by Design. Opinion 5/2018. https://edps.europa.eu/data-protection/our-work/publications/opinions/privacy-design_en
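As a concrete illustration of the second principle, "Privacy as the Default", here is a minimal sketch of account settings that ship in their most protective state, so users get privacy without doing anything. This is an assumption-laden example, not code from Cavoukian's framework; all field names are hypothetical.

    # Hypothetical sketch: privacy-protective defaults (Privacy as the Default).
    from dataclasses import dataclass

    @dataclass
    class AccountSettings:
        # Every field defaults to the most privacy-protective option;
        # users must opt IN to any disclosure, never opt out of it.
        profile_public: bool = False
        share_location: bool = False
        personalized_ads: bool = False
        data_retention_days: int = 30  # minimal retention by default

    settings = AccountSettings()    # a new account starts fully private
    settings.share_location = True  # any disclosure requires an explicit choice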
QUESTIONS

Identity Wallets & Usable Security – Challenges
User-Centric Identity through Identity Wallets [Example: European Identity Wallet, EUDI]

Control. How much do people want, or how much can they handle?
- Trade-off: complexity vs. understanding.
- Paradoxically, perceived control can lead to an illusion of security and to users sharing more.
- Easier but less control vs. more complex but more control.

User Control does not prevent Overasking by Services
- Services might ask for more data than they need.
- Identity attributes are high assurance, verified! But identification with your governmental ID is not needed for every service (bank, national ID, social media, entertainment, any service).
- Users trade privacy for benefits.

Control = Better Privacy?
The oversharing potential is emphasized by:
- overasking,
- the ease of data sharing,
- decision fatigue, habituation, and biases, e.g. immediate gratification.
Without true data minimization, users could end up sharing more data with a mobile wallet than they would have without one. A minimal sketch of a wallet-side minimization check follows.
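The sketch below illustrates the data-minimization idea from the overasking slides: a wallet that compares what a service requests against what the interaction actually requires and flags the excess. The attribute names and the notion of a declared "required set" per purpose are illustrative assumptions.

    # Hypothetical sketch: flagging overasking before the user consents.
    REQUIRED_FOR_PURPOSE = {
        "age_restricted_purchase": {"over_18"},  # a proof, not the birthdate
        "parcel_delivery": {"name", "home_address"},
    }

    def flag_overasking(purpose: str, requested: set[str]) -> set[str]:
        """Return the requested attributes that exceed what the purpose needs."""
        needed = REQUIRED_FOR_PURPOSE.get(purpose, set())
        return requested - needed

    excess = flag_overasking(
        "age_restricted_purchase",
        {"over_18", "name", "home_address", "national_id_number"},
    )
    print(excess)  # {'name', 'home_address', 'national_id_number'} -> warn the user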
Control Interface. How to make it usably secure? (Under research.)
- Label sensitive information.
- Communicate risk, alerts.
- Secure defaults.
- Rate service providers' reputation.
→ Transparency is difficult when the amount of information is huge; if users must invest too much time and effort (= friction), they won't.

Don't Annoy Me With Privacy Decisions!
Easy to use, easy to understand. Will users know what happens with their data?
- User understanding of the entities in control is a challenge.
- Complex technologies, hidden from users.
- Misconceptions: users might expect data backups and not see them as their responsibility.
Mitigations: awareness notifications (pop-ups with educational information, subtle assistance), training and education.
(See: An empirical study of a decentralized identity wallet.)

Easy to use securely
Security limitations in current technologies: QR codes & phishing.
- QRLjacking
- Quishing
→ Users can be tricked into disclosing their wallet credentials.

Easy to use securely
Security limitations in current technologies: authentication, the first line of defense! Currently it is commonly a PIN or password.
→ If an attacker gains access to the wallet due to poor authentication practices, they get access to all verified identity data!

People do not choose good passwords!
Online password attacks: human (ask, guess), brute force (try all possible combinations), common word, dictionary. Example: P@$$w0rd!!
https://www.youtube.com/watch?v=opRMrEfAIiI

Easy to use securely
Growing number of identification options — the NASCAR problem: an overwhelming selection interface (Login with wallet A / wallet B / wallet C).
Dark patterns:
- privacy-friendly login choices listed last;
- privacy choices (and their implications) are highly invisible to users.
https://indieweb.org/NASCAR_problem

Easy to use for everybody. Accessibility
As societies become "digital by default" or "digital first", the risk of exclusion from essential services increases. ~100 million people in Europe have disabilities of various kinds.
- Studies on cryptowallets have already identified issues for blind users: compatibility with screen readers, labeling of buttons.
- Potential exclusion of low-income households: a smartphone with adequate security is needed.
Anderson, B. 2020—The Year of Digital Accessibility in the European Union (EU) (2020). https://codemantra.com/directive-eu-20162102-accessibility-law/
(See also: Iterative Design of an Accessible Crypto Wallet for Blind Users.)

Trust
User-centric identity involves an ecosystem of entities. The reputation of the wallet provider/operator is decisive for trust: government entity or private company?
- Trust in governments does not equal trust in them as providers (low trust, skepticism).
- Data breaches lead to less trust in government.
- There is an expectation that users trust their devices; e.g., lack of trust in password managers is hindering their adoption and efficient use.
Who is the better operator of an identity wallet, as prioritised by the user?

Trust
Trust is needed for positive security perceptions (more than technical details!).
Self-efficacy: people are concerned* about
- the risk of leaking personal information from the user interface,
- the risk of providing data to the wrong online service during enrolment,
- data protection and forgotten passwords.
Usability has a positive effect on trust, and trust is fundamental for adoption.
* An empirical study of a decentralized identity wallet.

Adoption
Usability, security, and trust are not enough. Participants in user studies:
- did not find the processes intuitive or familiar;
- found the identity wallet app easy to use, yet questioned the value of the app.
"I mean I'm able to use it. Whether I want to use it is a different thing."

TRANSPARENCY ENHANCING TECHNOLOGIES (TETs) — USABLE SECURITY & PRIVACY

Learning Goals
1. The complexity of making privacy decisions: What is the privacy paradox? What influences privacy-related decision making?
2. Tools supporting users: What are TETs and why are they relevant? How do TETs work and how can they be classified?

THE PRIVACY PARADOX
Gerber, N., Gerber, P. and Volkamer, M., 2018. Explaining the privacy paradox: A systematic review of literature investigating privacy attitude and behavior. Computers & Security, 77, pp. 226-261.

Surveys say people care about privacy
- 91% of Americans think that consumers have lost control over how personal information is collected and used by companies (Pew Research Center, 2014).
- 57% of Europeans are worried their personal data is not safe (Symantec, 2015).

...but people do little to protect themselves
- 1 in 4 European users read the terms and conditions when buying or signing up to products and services online.
- 30% would trade their e-mail address for money or the chance to win a prize/enter a raffle (Symantec, 2015).

Contrast between attitude and behavior
If you ask respondents whether they are concerned about internet privacy, they will say YES: they desire to protect themselves. But in practice, those same individuals readily share their personal data online (e.g., in social networks) and rarely make an effort to protect their data (e.g., erase cookies) — seemingly paradoxical behavior.

Privacy Paradox
~15 years of research trying to explain this phenomenon!

Discussion
- What do you think are reasons for the paradoxical behavior?
- What do you think affects your privacy-related decision making?

Theoretical Explanation Attempts*
- Privacy Calculus
- Bounded Rationality & Decision Biases
- Lack of Personal Experience and Protection Knowledge
- Social Influence
- The Risk and Trust Model
- Quantum Theory
- Illusion of Control
* Gerber et al., 2018.
Privacy Calculus
- Based on the "homo oeconomicus" concept: decisions are driven by the attempt to maximize benefits.
- Privacy: users will trade data to earn benefits, e.g. discounts, increased convenience, socialization.
- The benefits are clear, but the costs of data disclosure are less tangible, e.g. security impairments, identity theft, unintended third-party usage, or social criticism and humiliation.

Bounded Rationality & Decision Biases
- The privacy calculus assumes a rational user, but cognitive biases affect decision making!
- Information is incomplete, and even with complete information, cognitive processing limitations remain (= bounded rationality).
- As a result, behavior might not reflect the original intention or attitude.
- Relevant biases: availability bias, optimism bias, confirmation bias, affect bias, immediate gratification bias, valence effect, framing effect, rational ignorance.

Lack of Personal Experience & Protection Knowledge
- Few users have suffered online privacy invasions, so most privacy attitudes are based on heuristics or secondhand experiences. But only personal experiences can form stable attitudes that influence behavior.
- Some users might simply lack the ability to protect their data: no or limited knowledge of technical solutions such as deleting cookies, mail encryption, or anonymous communication tools like Tor.

Social Influence
- Most people are not autonomous in their decision to accept or reject the usage of an application/service; the social environment influences privacy decisions, especially in collectivistic cultures.
- As a result, actual behavior is affected by social factors, while the expressed attitude reflects the unbiased individual opinion.

Trust & Risk Model
- Trust influences behavior directly, because trust is an environmental factor that dominates in concrete decision situations.
- Perceived risk influences intention, because risk dominates in abstract decision situations, e.g. when a user is asked whether they would be willing to share their data in a hypothetical situation.

Quantum Theory
- Human decision making underlies the same effects as the measurement process in quantum experiments: the outcome of a decision process is not determined until the actual decision is made.
- If an individual is asked, their answer does not necessarily reflect the actual decision outcome.

Illusion of Control
- Experiments show that users seem to confuse control over the publication of information with control over the assessment of that information by third parties.
- They disclose more when they can initially decide over the publication (illusion of control).

Theoretical Explanation Attempts
Privacy calculus, bounded rationality & decision biases, lack of personal experience and protection knowledge, social influence, the risk and trust model, quantum theory, illusion of control — no single comprehensive explanation; it is a complex phenomenon.

We need transparency
- It improves trust in applications/services.
- It reduces information asymmetry.
- It is a prerequisite for self-determination (allows "intervenability").
- It is essential for democracy.

"A society in which individuals can no longer ascertain who knows what about them and when ... would not be compatible with the right to informational self-determination." Self-determination is "an elementary prerequisite for the functioning of a free democratic society predicated on the freedom of action and participation of its members."
German Constitutional Court, "Volkszählungsurteil", BVerfGE, vol. 65, no. 1, 1983.
Dictionary definitions of "transparency" and "transparent":
https://www.merriam-webster.com/dictionary/transparency
https://www.merriam-webster.com/dictionary/transparent

Transparency
Transparency of personal data processing is a basic privacy principle and an individual right that is well acknowledged by data protection legislation.

Transparency
The principle of transparency as stated in the GDPR requires that "natural persons should be made aware of risks, rules, safeguards and rights in relation to the processing of personal data and how to exercise their rights in relation to such processing".
https://www.privacy-regulation.eu/en/recital-39-GDPR.htm

Usable Transparency
The GDPR also requires that "any information addressed to the public or to the data subject be concise, easily accessible and easy to understand, and that clear and plain language and, additionally, where appropriate, visualization be used".
https://gdpr-info.eu/

TRANSPARENCY ENHANCING TECHNOLOGIES
Murmann, P. and Fischer-Hübner, S., 2017. Tools for achieving usable ex post transparency: a survey. IEEE Access, 5, pp. 22965-22991.

Transparency Enhancing Tools (TETs)
- Ex ante TETs inform about the intended data collection, processing and disclosure, so that consequences can be anticipated. Example: privacy policies.
- Ex post TETs inform about what data were collected, processed or disclosed, by whom and to whom, and whether this complies with the agreed policies. They should also inform about the consequences of the revealed information. Example: Google's My Activity dashboard.

What TETs do you know? https://padlet.com/pacabarcos/s8bkiodlihnn3poa

State of the art

Principles for Usable Transparency
1) What information should be made visible? (GDPR perspective)
2) What usable forms of presentation of this information can be used to achieve visibility? (HCI perspective)

Legal Principles for Transparency (GDPR)
What information should be made visible (ex post)?
- The data being processed.
- The data processing purposes, including profiling.
- The data recipients or categories of recipients.
- Information about the logic involved in any automatic processing, and the significance and envisaged consequences of such processing.

Legal Principles for Transparency (GDPR)
Additionally, the data controller shall provide:
- Right to data portability: a copy of the personal data under processing in a "commonly used" electronic format.
- Intervenability rights: withdraw consent, be forgotten, data breach notification.
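As a toy illustration of the data-portability requirement, the sketch below exports a data subject's records in a commonly used electronic format (JSON). The record fields and the function name are invented for the example; the GDPR only mandates the "commonly used", machine-readable property of the format.

    # Hypothetical sketch: a data-portability export in a commonly used format.
    import json

    def export_personal_data(user_records: dict) -> str:
        """Serialize everything held about one data subject as JSON."""
        return json.dumps(user_records, indent=2, ensure_ascii=False)

    print(export_personal_data({
        "subject": "alice@example.org",
        "purposes": ["order fulfilment", "newsletter"],
        "recipients": ["payment processor"],
        "data": {"name": "Alice", "orders": 3},
    }))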
& Fischer-Hübner are rarely universal but tailored to specific audiences 38 Visualization - Guidance Judgmental statements ◯Change visualization to signal consequences ◯Useful to raise awareness or nudge behavior ◯But: algorithm does not necessarily reflect the actual view of a particular user!!! ◯Only a few TETs handle individualization (ML on user preferences) Recommendations ◯explicitly gives advice about the necessity of a change ◯Based on: social norms, previous decision (user, user’s friends) 39 Intervenability ◯TETs may tap directly into an automated functionality at the data controller (provided it exists) ◯Easier when the service provider also provides the transparency functionality ◯ Modifying privacy settings ex post != rectification ◯ technical operation, only affects future data ◯ not a legal process of rights exertion 40 In summary… 41 Trends & Gaps ◯Scope: Location transparency most researched type of TET →IoT TETs missing → Need for future research! ◯ Individualisation: TETs should be adaptive to individual needs (e.g., expert vs novice) ◯ e.g.: multilayered, multiperspective visualization; level of automation ◯ Usability: need to evaluate TETs comprehensibility, self- descriptiveness, and provide error prevention in privacy settings configuration. Standardized icons. 42 Trends & Gaps ◯ Intervenability: few TETs provide (usable) support to exert intervenability rights, Data breach notification barely supported → Challenge: tech. standards for erase/modify requests on behalf of data subjects ◯Accountability: how to verify that data controllers process data legitimally? ◯Trust: shifting trust to a 3rd party TET provider complicates the process. →Challenge: Certification 43 QUESTIONS 44 References ◯Gerber, N., Gerber, P. and Volkamer, M., 2018. Explaining the privacy paradox: A systematic review of literature investigating privacy attitude and behavior. Computers & security, 77, pp.226-261. ◯Murmann, P. and Fischer-Hübner, S., 2017. Tools for achieving usable ex post transparency: a survey. IEEE Access, 5, pp.22965-22991. 45 TRANSPARENCY & IDENTITY Case Study: European Digital Identity User-Centric Identity through Identity Wallets [Example: European Identity Wallet, EUDI] 47 What is a Wallet? Cryptowallets 48 (EUDI) Identity Wallet Ecosystem - Verified Identity - Selective Disclosure My nationality is … I’m over 18 My home address is … My name is … My bank account number is… I’ve a valid driving license My phone number is… My subscription to service X is “Gold” [Source: What is the EUDI Digital Identity wallet] 49 Identity Wallet Operation Getting Accessing Setup Attributes Services 50 Identity Wallet Operation Getting Accessing Setup Attributes Services 51 Identity Wallet Operation Getting Accessing Setup Attributes Services 52 Is this really the best solution for digital identity? LET’S DISCUSS! First, it should be inclusive, consider all type of users Who are you?: https://docs.oregonstate.education/gendermag/ 54 First, it should be inclusive, consider all type of users Who are you?: https://docs.oregonstate.education/gendermag/ 55 Work in teams (12 minutes) to answer: #1 What challenges do you see for people to adopt and securely use Digital Identity Wallets? #2 What mitigations could be applied? Consider an “Abi” persona* Motivation: uses technology to achieve their tasks Computer Self-efficacy: Lower self confidence than their peers about doing unfamiliar computing tasks. Blames themselves for problems. 
Is this really the best solution for digital identity? LET'S DISCUSS!

First, it should be inclusive and consider all types of users.
Who are you?: https://docs.oregonstate.education/gendermag/

Work in teams (12 minutes) to answer:
#1 What challenges do you see for people to adopt and securely use digital identity wallets?
#2 What mitigations could be applied?
Consider an "Abi" persona*:
- Motivation: uses technology to achieve their tasks.
- Computer self-efficacy: lower self-confidence than their peers about doing unfamiliar computing tasks; blames themselves for problems.
- Attitude towards risk: risk-averse about using unfamiliar technologies that might require a lot of time.
- Information processing style: comprehensive.
- Learning style: process oriented.
* Image source: https://gataca.io/

Plenary Discussion Time

Identity Wallets & Usable Security – Challenges

USABLE S&P IN VIRTUAL REALITY — USABLE SECURITY & PRIVACY
Emiram Kablo, 13th May 2024

Virtual Reality
Headsets: Varjo VR-3, Meta Quest 3, Valve Index, Apple Vision Pro.

Use Cases

Sensitive data is collected by hardware and software — we need authentication!
- Only authorized individuals gain access.
- Maintain privacy by verifying the identity of the user.
- Authentication builds trust between user and system.
- Regulations and compliance requirements.
- Accountability.

Let's tackle the problem
- Exploring authentication challenges: What are authentication challenges that VR users encounter? What are users' expectations and needs? (User study.)
- Usable and secure mechanisms: adapt novel schemes to VR. (Brainwave authentication.)
- Privacy in the age of VR: What are public attitudes towards privacy in VR? (Contextual-integrity-based user studies; adaptive authentication.)

The Authentication Problem

Passwords: a burden for users
- Poor usability: users have hundreds of passwords; remembering, typing, complex policies. Users can't cope well with passwords.
- Security problems: social engineering, cracking.
Adams, Anne, and Martina Angela Sasse. "Users are not the enemy." Communications of the ACM 42.12 (1999): 40-46.
Verizon Data Breach Investigations Report 2020. https://www.verizon.com/about/news/verizon-2020-data-breach-investigations-report

Passwords in VR
The predominant authentication method in VR; input via controller and virtual keyboard. Issues: slow typing, difficult, unpleasant.
Stephenson, Sophie, et al. "SoK: Authentication in augmented and virtual reality." 2022 IEEE Symposium on Security and Privacy (SP). IEEE, 2022.

Other authentication methods in VR: PIN, pattern, code/pairing.

More methods from research
- RubikAuth by Mathis et al.; PassGlobe by Länge et al.
- RoomLock by George et al.; Passimoji, C-Lock, and Randomized PIN by Länge et al.
Mathis, Florian, et al. "RubikAuth: Fast and secure authentication in virtual reality." Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems. 2020.
Länge, Tobias, et al. "PassGlobe: Ein Shoulder-Surfing resistentes Authentifizierungsverfahren für Virtual Reality Head-Mounted Displays." (2022).
George, Ceenu, et al. "Investigating the third dimension for authentication in immersive virtual reality and in the real world." 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR). IEEE, 2019.
Länge, Tobias, et al. "Vision: Towards Fully Shoulder-Surfing Resistant and Usable Authentication for Virtual Reality." (2024).
WIP: VR Authentication in different domains
Goal: take users into consideration when designing usable security solutions in VR; explore how users perceive and encounter authentication challenges, especially users from domains other than gaming.
RQ1: What are authentication challenges that VR users from various domains encounter?
RQ2: What are VR users' expectations and needs towards usable authentication methods in VR systems?

Methodology
The study consists of 3 phases, leaning on the methodology by Stephenson et al. Recruitment via Prolific and social media.
- Phase 1 (N1 = 409), screening survey: domains of VR usage, authentication experience, demographics & background. Selection criteria for Phase 2: VR usage in different areas, authentication experience in VR.
- Phase 2 (N2 = 69), survey: authentication challenges (knowledge, usability, security), users' expectations & needs, password managers, improvements. Selection criterion for Phase 3: willing to do an interview.
- Phase 3 (N3 ≈ 15, WIP), follow-up interviews: in-depth questions about user authentication in VR.
Stephenson, Sophie, et al. "SoK: Authentication in augmented and virtual reality." 2022 IEEE Symposium on Security and Privacy (SP). IEEE, 2022.

Selection Criteria for Phase 2
Removed: users who never authenticated in VR; smartphone-VR users. Invited: participants from different sectors.
N = 69. Gender: male 58 %, female 38 %. Age: 25-34 53 %, 18-24 32 %, 35-44 9 %, 45-54 3 %, 55-64 3 %. IT background: yes/no 50 %. Education: BA/MA/PhD 68 %, high school 16 %.

VR usage by sector (n = 69; users reported using VR in multiple sectors):
- Training and Simulation (military training, medical training, flight simulation, industrial training): 14
- Design and Planning (prototyping and production planning; architecture, engineering and construction): 10
- Health and Wellness (fitness, therapy sessions, stress management): 19
- Productivity and Communication (web browsing, e-learning, attending courses, meetings/video conferences): 9
- Entertainment (gaming, watching videos and other media, virtual tours, metaverse): 17

Intermediate Results – When do users authenticate?
Use cases of authentication (multiple choice): during device setup (85 %), while setting up an app (60 %), before a purchase (31 %), every time an app is opened (20 %), before every device usage (17 %).

Used Authentication Mechanisms

Perceived Usability and Security
Likert scale (1 to 5): How easy/hard is it for you to use the method? How secure did you feel while using the method?

Perceived Usability
Easiest methods by score:
1. Fingerprint (4.9) — easy to use, fast & efficient.
2. PIN (4.7) — easy to remember, quick to enter.
3. Face recognition (4.4) — useful and easy; bad light is a challenge.
Lowest easiness by score:
1. Body movement (3.15) — body fatigue, inaccurate, intolerant.
2. Iris scan (3.5) — takes long, errors quickly.
3. Eye tracking (3.7) — darkness, inaccurate, no detection.
On PIN: "I noted it down so I always have it available." (P419)
On PIN: "I chose the minimum authentication measure. It gives some extra security, but it's not too intrusive." (P404)
On eye tracking: "Sometimes it doesn't detect if it's me or someone else, so anyone can have access." (P359)
Perceived Security
Most secure by score:
1. Fingerprint (4.9) — unique, can't get stolen.
2. Face recognition (4.8) — unique, BUT deep fakes can be dangerous.
3. Password (4.3) — a known method; depends on the chosen password.
Lowest security by score:
1. Body movement (3.18) — fear of observation and replication, robustness.
2. Iris scan (3.63) — revocability, privacy.
3. Eye tracking (3.8) — not accurate, fear of cloning.
On passwords: "It seems the standard authentication, so must be ok." (P455)
On passwords: "Passwords are theoretically more secure, but I tend to choose simple/short passwords for use in a VR headset because of the limitations of the 'point and shoot' method of entering them." (P320)
On iris scan: "Even though I was not sure of what I was doing, I thought my information will be shared." (P227)

Password Managers
Password managers can support users with managing passwords, improving both security and usability.
79 % of our participants use a password manager, but only 30 % use one for or in VR. → Awareness, availability.
Mayer, Peter, et al. "Why Users (Don't) Use Password Managers at a Large Educational Institution." 31st USENIX Security Symposium (USENIX Security 22). 2022.
Arias-Cabarcos, Patricia, et al. "Comparing password management software: toward usable and secure enterprise authentication." IT Professional 18.5 (2016).
Ion, Iulia, Rob Reeder, and Sunny Consolvo. "'... No one Can Hack My Mind': Comparing Expert and Non-Expert Security Practices." Eleventh Symposium On Usable Privacy and Security (SOUPS 2015).

User Needs and Improvements
62 % believe that the VR authentication process can be improved: users need more usable and smoother solutions, more accurate and faster authentication, and more secure options. Users see high potential in biometric authentication in VR.
"Right now, most VR platforms rely on usernames and passwords to verify users' identities, which is like the oldest trick in the book." (P209)
On password managers: "It will be easier to login and authenticate, without going to the main device." (P46)

First domain-specific analysis
- Most frequently used authentication methods: 47 % of entertainment users use eye tracking (the third most used method).
- Easiness ratings: biometrics are not favored by all groups; the most favored methods are fingerprint, pattern, and PINs.

Next steps: Interviews
Goal: deeper insights into authentication experiences and needs.
Selection criterion: willing to do an interview (n = 29).
Possible question blocks: primary VR usage purposes and domain-specific questions; non-common authentication methods; VR password manager usage and awareness.

Brainwaves
What are brainwaves?
Electrical signals produced by the brain, measured via EEG with Brain-Computer Interfaces (BCIs).
Left: OpenBCI EEG Electrode Cap (https://shop.openbci.com/products/openbci-eeg-electrocap); right: InteraXon Muse 2 (https://choosemuse.com/).

Current EEG wearables in the market:
- Neurosity The Crown (https://dev.to/neurosity/using-brainflow-with-the-neurosity-headset-2kof)
- Imec EEG headset (https://www.imec-int.com/drupal/sites/default/files/2019-01/EEG_Headset_digital.pdf)
- InteraXon Muse 2 (https://choosemuse.com/)
- Neurosky Mindwave 2 (https://www.mindtecstore.com/neurofeedback-neurosky-brainwave-starter-kit-kaufen)
- Galea Varjo (https://galea.co)
- Neurable Enten (https://neurable.com/)
- Emotiv MN8 (https://www.emotiv.com/mn8-eeg-headset-with-contour-app/)
Application areas: workplace safety, communication, wellbeing, education, entertainment.

Biometrics as password alternatives
"Something you are": biometric characteristics are physiological or behavioral and unique between individuals. Usability: no need to remember or carry a secret.

Brainwave Authentication — why?
- Brainwaves have distinctive features.
- They are not observable.
- They can be sensed implicitly.

How does brainwave authentication work?
Cognitive task: the person engages in a specific task, e.g. the oddball paradigm, in which infrequent target stimuli result in event-related potentials (ERPs). P300 ERPs are specific brain reactions that depend on the stimulus, visible as a waveform peak (the P300).

Designing Brainwave Authentication for the Real World
RQ1 (usability): How do users perceive the usability of brainwave-based authentication in the real world?
RQ2 (adoption): Under what conditions would users want to use brainwave-based authentication?
Artifacts: NeuroPack, a library that realizes brainwave-based authentication; KeyWave, a prototype implementation based on NeuroPack; Foundation Password Manager (FPM), a real-world use-case application for brain biometrics.
Röse, Markus, Emiram Kablo, and Patricia Arias-Cabarcos. "Overcoming Theory: Designing Brainwave Authentication for the Real World." Proceedings of the 2023 European Symposium on Usable Security. 2023.

KeyWave: encapsulated authentication process (prototype implementation of NeuroPack):

    # Create
    keywave = KeyWave(
        BrainFlowDevice.CreateMuse2Device(),
        PersistentImageTask(3, 5, all_images, target_image, 200),
        PreprocessingPipeline(BandpassFilter()),
        BandpowerModel(),
        TemplateDatabase(),
        bounded_cosine_similarity,
        0.725,
    )

    # Enroll
    keywave.enroll("Emiram")

    # Authenticate
    keywave.authenticate("Emiram")
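The last two KeyWave arguments are a similarity function and an acceptance threshold (0.725). As a rough illustration of what such template matching might look like, here is a minimal cosine-similarity check between an enrolled feature template and a freshly recorded feature vector. The slides do not show NeuroPack's actual implementation, so treat this as an assumption-laden sketch.

    # Hypothetical sketch of template matching with a bounded cosine similarity.
    import math

    def bounded_cosine_similarity(a: list[float], b: list[float]) -> float:
        """Cosine similarity, clamped to [0, 1] so it can be thresholded."""
        dot = sum(x * y for x, y in zip(a, b))
        norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
        return max(0.0, dot / norm) if norm else 0.0

    def authenticate(template: list[float], sample: list[float],
                     threshold: float = 0.725) -> bool:
        # Accept only if the fresh EEG feature vector is close enough
        # to the template stored at enrollment.
        return bounded_cosine_similarity(template, sample) >= threshold

    enrolled = [0.8, 0.1, 0.3]  # stored at enrollment (illustrative values)
    fresh = [0.7, 0.2, 0.3]     # extracted from a new recording
    print(authenticate(enrolled, fresh))  # True: similarity ~0.99 >= 0.725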
Foundation Password Manager
A fully functional password manager with added brainwave-based authentication.
- User interface: start view, overview view, settings view; choosing a database, saving credentials, configurations.
- Browser plugin.
- Uses KeyWave: connecting the device, enrollment, authentication.

User study
- Introduction phase: introduction to the study & hardware (Muse 2 by InteraXon).
- Experiment phase: exposure to FPM, free exploration of preconfigured websites.
- Survey phase: perceived usability (SUS), attitude toward BCIs and brainwave authentication.
N = 9.
Key findings: SUS score 85; perceived effort low to medium; n = 6 would want to use KeyWave. Reasons against it: better alternatives (n = 2) and the need for an external device (n = 2).

Brainwave authentication for VR in the future?
Devices with EEG sensors: Muse 2 + Valve Index; Meta Quest Pro; Galea by OpenBCI x Varjo.

Neuroprivacy
Brainwaves correlate with mental state, cognitive abilities, and medical conditions; they are a basis for inferring emotions, interests, health disorders, personality traits, and other private data. Data protection: calls for legal frameworks.
Shravani Sur and Vinod Kumar Sinha. 2009. Event-related potential: An overview.
Marcello Ienca and Roberto Andorno. 2017. Towards new human rights in the age of neuroscience and neurotechnology.

Investigating Public Attitudes on Neuroprivacy
RQ-1 | Neuroprivacy expectations: Under which conditions do people consider sharing neurodata acceptable?
RQ-2 | Neuroprivacy and neurotechnology awareness: How aware are people of neurotechnology privacy implications? How would they use this technology?
N = 347 participants: 62 % male, 37 % female; 80 % under 35 years old; 54 % with a Bachelor's or Master's degree; EU countries.
Kablo, Emiram, and Patricia Arias-Cabarcos. "Privacy in the Age of Neurotechnology: Investigating Public Attitudes towards Brain Data Collection and Use." CCS'23.

Methodology – Defining Neurodata Flows*
- Sender (1): the BCI device.
- Recipients (10): its manufacturer, an online service provider, academic researchers, the user's social media accounts, ...
- Subject & attribute (1): the user's brain signals.
- Transmission principles (16): if the user has given consent; if the user is directly notified before data collection; if the data is kept confidential and secure; if the data is stored online for a limited period; ...; null (no principle).
* Formulated based on Nissenbaum's "Privacy as Contextual Integrity" theory.
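A CI-based survey of this kind enumerates information flows as (sender, recipient, attribute, transmission principle) tuples and asks participants to rate each flow's acceptability. The sketch below shows how such vignettes could be generated; the wording and the shortened lists are illustrative assumptions, not the study's actual instrument.

    # Hypothetical sketch: generating contextual-integrity survey vignettes.
    from itertools import product

    SENDER = "your BCI device"
    ATTRIBUTE = "your brain signals"
    RECIPIENTS = ["its manufacturer", "an online service provider",
                  "academic researchers"]  # the study used 10 recipients
    PRINCIPLES = ["if you have given consent",
                  "if you are notified before data collection",
                  None]  # None = no condition; the study used 16 principles

    def vignette(recipient, principle):
        base = f"{SENDER} shares {ATTRIBUTE} with {recipient}"
        return f"{base} {principle}." if principle else f"{base}."

    for r, p in product(RECIPIENTS, PRINCIPLES):
        print(vignette(r, p))  # each flow is rated on an acceptability scale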
Results – Acceptability Scores of Sharing Neurodata Without Any Condition
N = 287.

What influences acceptability?

Results – Acceptability Scores of Sharing Neurodata With Conditions
- Using brain data for marketing is not acceptable.
- Best average acceptability score over all principles.
- Medical doctors and academic researchers are favored over all principles.

Results – Neuroprivacy Awareness
What do you think can be derived about a person from their brain signals collected by a commercial BCI? (N = 347)
Mental state 26 %, mental processes 19 %, higher-order inferences 15.5 %, physiological data 9 %, mental constructs 7 %.

Results – Neurotechnology Adoption
For non-users: If you had the opportunity, would you use a BCI headset? (N = 336) Yes (78 %), No (34 %).
- Yes — use cases for personal and social good: curiosity and entertainment, medical use, research, self-monitoring.
- No — reasons: privacy as the main concern; no necessity for using a BCI headset.

Main Takeaways
- Usable consent for neurodata-driven applications is needed.
- Need for more transparency.
- Privacy and utility are key factors for adoption; it is important to raise privacy awareness.

Future Work: Perspectives on Privacy in Virtual Reality
VR devices are able to collect tons of data (sensors and cameras) → inferences. Is this acceptable? What are public attitudes?
Methodology: a CI-based survey for VR users. Example:
- Sender: the VR device.
- Recipients: entertainment companies, social media accounts, institutions (school/university), medical doctors, ...
- Subject: the user.
- Attributes: the subject's environment, iris data, body movements, brain signals, ...
- Transmission principles: if the user has given consent; if the user is directly notified before data collection; if the data is kept confidential and secure; if the data is stored online for a limited period; ...

Main Takeaways
- Exploring authentication challenges: need for more usable and secure methods; users are open to alternative methods; password managers in VR.
- Usable and secure solutions: adapt novel schemes to VR.
- Privacy in the age of VR: data protection becomes more relevant.

USABLE AUTHENTICATION — USABLE SECURITY & PRIVACY
Patricia Arias Cabarcos

WHY SHOULD WE CARE ABOUT USABLE SECURITY?

Lack of Usability Costs Security
Security tools are too complex, and as a result users make errors, don't comply, or don't use them at all. In the long term:
- The likelihood of breaches increases.
- Frustration and lack of appreciation/respect for security lead to a bad security culture, intentional malicious behavior (insider attacks), and attacks that are hard to detect due to the "noise" created by habitual non-compliance.

Lack of Usability Costs Money
95 % of all cybersecurity issues can be traced to human error, with direct and indirect economic costs: loss of intellectual property, data, reputation, and market power, fines, business disruption, ...
- Data breach costs in 2020: $4.27 million per breach.
- Breach handling consumed 39 % of an organization's budget more than a year after the breach.
- Large organizations spend up to $1 million a year handling password resets.
World Economic Forum, Global Risk Report 2022. IBM, Cost of a Data Breach Report 2020. Forrester Research, 2018, Best Practices: Selecting, Deploying, And Managing Enterprise Password Managers.

In summary
Security systems are sociotechnical systems, and should be usable:
1. Psychologically acceptable: designed for easy and correct use, without error, by any human user.
2. Economically acceptable: reasonable cost.
3. Sustainable: reconfigurable, scalable, manageable.

Topics: anti-phishing, inclusive S&P, usable authentication.

GETTING USABLE SECURITY — Case Study: Usable Authentication

Security
"Protection of information and information systems from unauthorized access, use, disclosure, disruption, modification, or destruction." (Confidentiality, Integrity, Availability.)
NIST Special Publication 800-12, https://doi.org/10.6028/NIST.SP.800-12r1

Security is a secondary task — design to reduce friction
Authentication is the first line of defense (the process of verifying the identity of a user) against spear phishing, cryptojacking, sextortion, stalkerware, botnets, DoS, identity theft, financial fraud, ... but it is also a "secondary" task for users, creating friction.

The Authentication Problem
Password authentication is dominant, despite:
- Poor usability: memorize, type, follow complex policies; users can't cope well with passwords.
- Security problems: 80 % of data breaches from 2018 through 2020 were the result of compromised passwords.
Adams, Anne, and Martina Angela Sasse. "Users are not the enemy." Communications of the ACM 42.12 (1999): 40-46.
Verizon Data Breach Investigations Report 2020. https://www.verizon.com/business/resources/reports/dbir/2020

Consider user behavior
Problem: users choose bad passwords. Random? https://wpengine.com/unmasked/
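To make the "common word" and "dictionary" attack types listed next concrete, here is a minimal sketch of a dictionary attack against a leaked password hash: the attacker simply hashes candidate words until one matches. The tiny wordlist and the unsalted SHA-256 hashing are simplifying assumptions for illustration only.

    # Minimal sketch of a dictionary attack (illustrative, unsalted SHA-256).
    import hashlib

    def sha256(pw: str) -> str:
        return hashlib.sha256(pw.encode()).hexdigest()

    leaked_hash = sha256("P@$$w0rd!!")  # pretend this came from a breach dump

    # Small illustrative wordlist; real attacks use millions of entries
    # plus mangling rules (capitalization, leetspeak, suffixes).
    wordlist = ["123456", "password", "P@$$w0rd!!", "letmein"]

    for candidate in wordlist:
        if sha256(candidate) == leaked_hash:
            print(f"cracked: {candidate}")  # policy-compliant, still guessable
            break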
If we look at online password attacks: human (ask, guess), brute force (try all possible combinations), common word, dictionary.
https://www.youtube.com/watch?v=opRMrEfAIiI

Password complexity & security – current (obsolete) practice
The strength of a password is a function of length, complexity, and unpredictability.
- Common metric: entropy (randomness), H = L × log2(N), where N is the number of symbols in the alphabet and L is the password length.
- Rationale: the larger the password space, the longer it takes to guess a password.
Policy: use irregular capitalization, special characters, and at least one numeral. Alphabet [a-zA-Z0-9@#...$%^&+=]: 95 symbols.
A 10-character password under this policy has entropy log2(95^10) ≈ 65.7 bits.
Time to break at 10 billion guesses per second: ~190 years. P@$$w0rd!! fulfills the policy, but ~
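The arithmetic above can be checked in a few lines of Python; this only reproduces the slide's entropy formula and brute-force timing, with the 10-billion-guesses-per-second rate taken from the slide.

    # Worked example: password entropy and naive brute-force time.
    import math

    N = 95       # symbols in the policy alphabet
    L = 10       # password length
    rate = 10e9  # guesses per second (the slide's assumption)

    entropy_bits = L * math.log2(N)     # H = L * log2(N)
    seconds = 2 ** entropy_bits / rate  # time to exhaust the whole space
    years = seconds / (3600 * 24 * 365)

    print(f"{entropy_bits:.1f} bits")  # ~65.7 bits
    print(f"~{years:.0f} years")       # ~190 years
    # Yet a dictionary attack finds P@$$w0rd!! in milliseconds: the entropy
    # bound assumes random choice, which humans don't make.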