IT and Society Lecture 2 - Privacy

Document Details


Uploaded by HardWorkingAestheticism

Technical University of Munich

2024

Jens Grossklags

Tags

privacy digital technology information technology society

Summary

This lecture, part of the IT and Society course at the Technical University of Munich, introduces the concept of privacy in the context of rapidly evolving digital technologies. It covers topics such as the complexities of digital business models and consumer reactions to privacy challenges, and concludes with potential solution approaches and an acknowledgment that unresolved issues remain in this space.

Full Transcript


IT and Society Lecture 2: Privacy – An Introduction
Prof. Jens Grossklags, Ph.D.
Professorship of Cyber Trust, Department of Computer Science, School of Computation, Information and Technology, Technical University of Munich
April 22, 2024

TUM: Digital Leaders Ranking
– Strong focus on digital expertise and teaching of transferable digital skills across the curriculum
– Rank 5 in 2024
– https://www.emerging.fr/digital-leaders/rankings (April 17, 2024)

Recap – IT Productivity
– Job prospects in the U.S., European, and German IT fields
– Studied literature on IT productivity: the productivity growth paradox; labor market polarization
– Trends that shape the future of work, e.g., new business models providing new employment opportunities but also potential societal tensions; the probability of computerization of professions; avoiding "unemployability" and fostering relearning. How?
– Impact of generative AI: a different style of work? Reducing inequality?

Perspective of the Individual
To what degree does IT make us more productive? Example: email, which has numerous benefits in the workplace.
– Study: tracked email usage with computer logging, biosensors, and daily surveys for 40 information workers over 12 days
– Finding: the more daily time spent on email, the lower the perceived productivity and the higher the measured stress
– Takeaway: managing IT productivity at the individual level is also challenging
Mark et al. (2016), Email Duration, Batching and Self-interruption: Patterns of Email Use on Productivity and Stress, CHI 2016

Recap – IT Productivity
Can everything that is boring be automated? Or: should everything that is boring be automated?
Gallup report, "How Millennials Want to Work and Live" (2016). Summary: https://www.gallup.com/workplace/236477/millennials-work-life.aspx

Lecture 2: Privacy – An Introduction

Back in the days…
– Large degree of Internet freedom (but limited functionality)
– (Too) anonymous on the Internet
– No rules: a "Wild West" culture

The New Yorker, July 5, 1993
"On the one hand, information wants to be expensive, because it's so valuable. The right information in the right place just changes your life. On the other hand, information wants to be free, because the cost of getting it out is getting lower and lower all the time. So you have these two [powers] fighting against each other." – Stewart Brand (1984)
Which trend has won?

Evolving Business Models
– Providing many (more or less) useful services
– Often digital content generated by users or professionally
– Improved by, or even based on, user data
– Personalization/customization vs. "your data is the product"

Tremendously Successful Social Networking
Number of active users in millions (January 2024; worldwide)
https://www.statista.com/statistics/272014/global-social-networks-ranked-by-number-of-users/

… and Profitable!
Ad-dominated revenue streams (2016); other business models exist. Data from annual company reports. Visualizations: Visual Capitalist.

But: Not as Profitable for All Parties
– Only 51% of advertiser spend is received by publishers
– "Small" study: 15 advertisers, 8 agencies, 5 Demand Side Platforms (DSPs), 6 Supply Side Platforms (SSPs), and 12 publishers, representing approximately £0.1 billion of UK programmatic media spend
– High complexity: over a thousand distinct supply chains were identified
https://www.isba.org.uk/news/time-for-change-and-transparency-in-programmatic-advertising/ (2020)

Advertisement Ecosystem Europe (2016)
A system with tremendous complexity
https://www.slideshare.net/kleonfre/lens-academy-digital-advertising-ecosystem-market-map-europe-2016

Marketing Technology Ecosystem (7,040 entities)
Source: https://cdn.chiefmartec.com/wp-content/uploads/2019/03/marketing-technology-landscape-2019-slide.jpg

Data Impactful Beyond Targeted Advertisement Delivery

Tangible Benefits: Advances in Data Analytics
– E.g., healthcare: precision medicine; monitoring devices
– Other important scenarios and contexts
– Requires detailed health data of many individuals (see Corona apps)

Our Data – Our Privacy?

What is Privacy? How Much Do We Need?
(Figure: welfare as a function of information revelation)
A multifaceted concept: "Privacy is in disarray and nobody can articulate what it means" (Daniel J. Solove)
Selection of definitions (survey data):
– Ownership of and control over personal information (90%)
– Personal dignity (60%)
– Freedom to develop (50%)
– Ability to assign monetary values to each data flow (26%)
Acquisti & Grossklags, 2005

Nuanced theories have emerged over time.

Right to be Left Alone
"The intensity and complexity of life […] have rendered necessary some retreat from the world […]" – Warren & Brandeis, 1890

Right to be Left Alone (2)
– New technological innovations and business models: newspaper enterprises and mobile photography
– Invasion of privacy beyond territory
– Described effect: "mental pain and distress far greater than bodily injury"

Right to be Left Alone (3)
– A "right to enjoy life" → a "right to privacy" → a "right to be let alone"
– Important legal specification: no one should be able to invade your privacy by publishing information about you without your explicit consent
– Exceptions: when you have published that information yourself, or when you are in public (e.g., on the street)

Boundary Regulation
A temporally dynamic process of interpersonal boundary negotiation (Altman, 1975)

Boundary Regulation (2)
– Negotiation of the accessibility and inaccessibility that characterizes social relationships
– Regulation both through how physical spaces are built and through the behaviors that take place in them
– Practices are applied to achieve contextually desirable degrees of social interaction
– Also: boundary regulation as a group
Source: Airi Lampinen

Contextual Integrity
Helen Nissenbaum (2004):
– Privacy can be provided by appropriate information flows
– An information flow is appropriate when it conforms to contextual information norms
Paradox: how much does an (AI-based) system need to know about us and society to contextually manage our privacy?
https://crypto.stanford.edu/portia/papers/RevnissenbaumDTP31.pdf

Theories Did Not Foresee the Enormous Economic Pressure on Privacy

"Personal data is the new oil of the internet and the new currency of the digital world." – Meglena Kuneva (2009)

"Consumer rights must adapt to technology, not be crushed by it." – Meglena Kuneva, European Consumer Commissioner (2009)
Or, to paraphrase: we must adapt?

Moving Towards the U.S. Standard of Reasonable Privacy Expectations?
Two-part legal test:
– "Subjective": the person asserting that a search was conducted must show that they kept the evidence in a manner designed to ensure its privacy
– "Objective": would society at large deem the person's expectation of privacy reasonable?
Unfortunate realization: the more knowledge in society about privacy-invading technologies, the less "objective" the expectation of privacy!

What is reasonable to assume nowadays?
Many would argue: online privacy has been crushed, flattened, and run over several times.
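To make Nissenbaum's contextual-integrity idea from above more concrete: an information flow can be described by five parameters (sender, recipient, data subject, information type, transmission principle), and the flow is appropriate when it matches an entrenched norm of its context. The following is a minimal illustrative sketch; the `Flow` structure and the toy healthcare norms are hypothetical, not part of Nissenbaum's formal framework.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Flow:
    """One information flow, described by five contextual-integrity parameters."""
    sender: str
    recipient: str
    subject: str
    info_type: str
    transmission_principle: str

def conforms(flow: Flow, norms: set[Flow]) -> bool:
    """A flow preserves contextual integrity iff it matches an entrenched norm."""
    return flow in norms

# Entrenched norms of a (toy) healthcare context:
health_norms = {
    Flow("patient", "doctor", "patient", "diagnosis", "confidentiality"),
    Flow("doctor", "specialist", "patient", "diagnosis", "with consent"),
}

ok = Flow("patient", "doctor", "patient", "diagnosis", "confidentiality")
bad = Flow("doctor", "advertiser", "patient", "diagnosis", "sold")

print(conforms(ok, health_norms))   # True
print(conforms(bad, health_norms))  # False
```

The sketch also illustrates the paradox raised on the slide: to evaluate `conforms`, the system must already know the context, the actors, and the entrenched norms.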
Example: Online Tracking
Brookman et al., PETS 2017 [Federal Trade Commission]

Example: Cross-Device Tracking
Matching across publishers using hashed identifiers
Brookman et al., PETS 2017 [Federal Trade Commission]

Example: Addition of Data from the Offline World

Let's Learn About the User
Turow et al., Americans Reject Tailored Advertising and Three Activities that Enable It (2009)
– 78% state that they "actively" protect themselves
– 43% read privacy policies
– Browser settings (cookies): 46% – not effective against many tracking strategies
– Anti-tracking software: 18% – a cat-and-mouse game

Advertisement Economics
Rule of thumb: knowledge about the consumer matters when allocating advertisements to available advertisement space on users' "real estate" (i.e., browsers/devices).
– Switching off targeting entirely (i.e., no cookies, no fingerprinting) may hurt stakeholders. How much? How bad is basic contextual advertisement?
– What is the marginal return of increasing levels of targeting?
– How good or poor are ad-tech companies at guessing users' identities, characteristics, and tastes anyway? Big players are likely quite good; small players may have to do a lot of guessing.

Coming Changes to the Ad/Tracking Ecosystem?
Example: Federated Learning of Cohorts – FLoC (Google)
– 2020: announcement of the end of support for third-party cookies in Chrome
– End of 2020: new approach presented as part of the "Privacy Sandbox" initiative (https://www.privacysandbox.com/), based on federated learning and k-anonymity
– Critique from various sides. Follow this debate; it's interesting.
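The k-anonymity idea behind FLoC can be sketched as follows: a cohort identifier is only exposed to advertisers if at least k users share it, so no user is singled out by their cohort alone. This is an illustrative toy, not Google's actual implementation; the user names and cohort IDs are invented.

```python
from collections import Counter

def exposable_cohorts(user_cohorts: dict[str, str], k: int) -> set[str]:
    """Return cohort IDs that satisfy k-anonymity: only cohorts shared
    by at least k users may be revealed to the ad-tech ecosystem."""
    counts = Counter(user_cohorts.values())
    return {cohort for cohort, n in counts.items() if n >= k}

# Toy assignment of users to browsing-based cohorts (all values hypothetical):
assignment = {
    "alice": "c17", "bob": "c17", "carol": "c17",
    "dave": "c42",  # a cohort of one would uniquely identify its member
}

print(exposable_cohorts(assignment, k=3))  # {'c17'}
```

Note that k-anonymity protects against being singled out by the cohort ID itself, but not against combining the cohort with other signals, which is one thread of the critique below.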
– Advertisers: retargeting not possible; too much control handed to Google
– Consumer protection advocates/other tech companies: deeply integrated into the browser; not enough consent; not GDPR-compliant
See also: https://developer.chrome.com/blog/privacy-sandbox-update-2021-jan/

Apple iOS 14.5 Update: App Tracking Transparency
– April 2021: forced app publishers to include a pop-up asking for permission to track behavior for ad sales
– Share of Apple iOS users who opted in via ATT (March 2022; 4,600 apps with ATT enabled)
– Meta: earnings "warning" by Zuckerberg – an estimated $10bn in revenue lost since Apple's changes (Feb 2022)
Takeaway: changes in the ecosystem can matter A LOT!

Looking under the Hood: Apple ATT
Tracking is a highly privacy-invasive data collection practice that has been ubiquitous in mobile apps for many years due to its role in supporting advertising-based revenue models. In response, Apple introduced two significant changes with iOS 14: App Tracking Transparency (ATT), a mandatory opt-in system for enabling tracking on iOS, and Privacy Nutrition Labels, which disclose what kinds of data each app processes. So far, the impact of these changes on individual privacy and control has not been well understood. This paper addresses this gap by analysing two versions of 1,759 iOS apps from the UK App Store: one version from before iOS 14 and one that has been updated to comply with the new rules. We find that Apple's new policies, as promised, prevent the collection of the Identifier for Advertisers (IDFA), an identifier for cross-app tracking. Smaller data brokers that engage in invasive data practices will now face higher challenges in tracking users – a positive development for privacy. However, the number of tracking libraries has, on average, roughly stayed the same in the studied apps. Many apps still collect device information that can be used to track users at a group level (cohort tracking) or identify individuals probabilistically (fingerprinting).
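The fingerprinting mentioned here can be sketched in a few lines: no single device attribute identifies a user, but a hash over the combination is often stable enough to serve as a pseudo-identifier, even when no explicit tracking ID such as the IDFA is available. This is a deliberately simplified illustration; the attribute names and values are hypothetical, and real fingerprinting combines many more signals.

```python
import hashlib

def fingerprint(attrs: dict[str, str]) -> str:
    """Derive a stable pseudo-identifier by hashing a canonical
    serialization of device attributes."""
    canonical = "|".join(f"{key}={attrs[key]}" for key in sorted(attrs))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

# Two requests from the same (hypothetical) device yield the same identifier:
device = {"model": "Phone12,1", "os": "14.4", "tz": "Europe/Berlin", "lang": "de-DE"}
assert fingerprint(device) == fingerprint(dict(device))

# Changing any attribute (e.g., an OS update) changes the identifier,
# which is why real fingerprinters use fuzzier matching:
updated = dict(device, os="14.5")
assert fingerprint(updated) != fingerprint(device)
```

Computing such an identifier server-side, as the next paragraph describes, moves the practice out of reach of on-device enforcement.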
We find real-world evidence of apps computing and agreeing on a fingerprinting-derived identifier through the use of server-side code, thereby violating Apple's policies. We find that Apple itself engages in some forms of tracking and exempts invasive data practices like first-party tracking and credit scoring from its new tracking rules. We also find that the new Privacy Nutrition Labels are sometimes inaccurate and misleading, especially in less popular apps. Overall, our observations suggest that, while Apple's changes make tracking individual users more difficult, they motivate a countermovement and reinforce the existing market power of gatekeeper companies with access to large troves of first-party data. Making the privacy properties of apps transparent through large-scale analysis remains a difficult target for independent researchers, and a key obstacle to meaningful, accountable and verifiable privacy protections.
Source: Goodbye Tracking? Impact of iOS App Tracking Transparency and Privacy Labels

Individuals' Privacy Preferences and Behaviors – Experimental Evidence

A Classic Privacy Experiment
– Study of interaction behavior between an intelligent sales advisor agent ("Lucy") and 171 consumers
– Participants signed a privacy statement indicating that their data would be sold to an anonymous entity
– Subjects spent their own money on products
Spiekermann, Grossklags, Berendt, 2001

Privacy Attitudes
An extension of Westin's segmentation of consumers by privacy concern. (Figure: scatter plot of Identity Revelation vs. Profile Revelation, each rated from 1, "always", to 5, "never", showing four clusters – privacy fundamentalists, profiling-averse, identity-concerned, and marginally concerned – with cluster shares of 30%, 26%, 24%, and 20% of participants.)
– Identity-concerned = concern about the collection and use of identifying information
– Profiling-averse = concern about the collection and use of information that can help build a behavioral profile

Identity-Related Results
A form soliciting personal information was presented to the participants; no reason was given for the elicitation. Share who revealed identity information:
– Fundamentalists: 26%
– Identity-concerned: 23%
– Profiling-concerned: 35%
– Unconcerned: 64%

Profiling-Related Results
How many questions did participants answer when interacting with the digital sales advisor agent? (Privacy cost; privacy sensitivity was controlled for with a pre-study.)
– Fundamentalists: 78%
– Profiling-concerned: 78%
– Identity-concerned: 97%
– Unconcerned: 100%

Interpretation
Are privacy preferences reflected in behavior?
– Participants pick up cues
– The degree of information revelation is very high
Attitudes and behaviors are not random, but they exhibit a privacy gap (often called the privacy paradox).

What Three Factors Shape Behavior?
– Incomplete or asymmetric information: lack of understanding of the situation
– Bounded rationality: analysis of privacy consequences is too difficult
– Psychological aspects: for example, total immersion in an activity leads to a lack of metacognitive monitoring (i.e., a flow state)

Obstacles
– Decision-making over time: actions "now" have consequences "later"
– Choices are not, and should not be perceived as, independent

Solution Approaches
– Notice and consent
– Privacy by design
– Privacy tools
– Abstinence/change of behavior
– Laws and penalties
What are the challenges associated with these strategies?

Takeaways
– Data enables many new business models; data analytics may lead to important new insights in many contexts
– Collection and monetization of data is pursued aggressively
– Individuals' understanding is inadequate to "solve" privacy challenges (also due to obstruction and opaqueness)
– A discussion of the merits of different solution approaches is needed

Again, the End. For Today. See you next week.
