Study Plan

Class 9: Victims

study 1: interviews ("PTSD every time I walked through the office door")
- 83 participants, semi-structured interviews; employees at orgs that had been directly targeted with ransomware
- example questions:
  - what was the impact of the attack on employees?
  - what was the impact on you as an individual?
  - did the attack have any negative impacts on your customers/suppliers/clients etc.?
  - what were the negative impacts of the attack on your organization? (e.g. financial, reputational)
- what factors aggravate the negative experience encountered by a victim organization after a ransomware incident?
  - physical harms: lack of adequate exercise, nutrition, or sleep; minor illness (heart palpitations); overconsumption of caffeine; serious illness (heart attack/stroke); weight changes
  - economic harms: cancellation of annual leave/holiday plans; economic risk to personal assets (e.g. micro-SME owner); increased future risk of fraud; loss of or interruption to salary; productivity impact; redundancy
  - psychological harms: anger, confusion, embarrassment, frustration, guilt, isolation, loss of self-confidence, PTSD, self-doubt, shame, stress, suicidal thoughts
  - reputational harms: clients, industry peers, and the supply chain have less trust in the victim org's staff; exposure of the individual in media/social networks; loss of trust within the org
  - social/societal harms: disruption to family routine; inability to take bereavement leave; inability to undertake childcare duties
- what factors might alleviate victims' experience of ransomware harms?
  - actions before the incident: appropriate comms (internal/external) strategy; appropriate cybersecurity that inhibits the attack or reduces access levels; appropriate technical resiliency (e.g. viable, regularly updated backups); core business systems that can continue functioning without IT; cyber insurance; good leadership; high employee morale
  - actions after the incident: appropriate comms (internal/external) strategy; calming, experienced voices in the room; core response staff are/can be rotated; counseling services offered to staff; employee welfare considerations; expedited access to expert help; expert help in guiding engagement with regulatory bodies; good employee support; good leadership
- what factors might aggravate victims' experience of ransomware harms?
  - before the incident: bad leadership; bad luck; complex IT estate, incl. systems not designed to be shut down or taken offline; disgruntled employees; inadequate business continuity plan; inappropriate comms (internal/external); inappropriate technical resiliency, e.g. backups not deployable or able to be infected
  - after the incident: bad leadership; bad luck; core response staff not rotated; delay in engaging third parties for support; disagreement about whether to pay the ransom; employee resignations; employee welfare considerations not taken into account; employees blaming each other; negotiations with threat actors handled poorly (e.g. the CFO conducts the negotiation)

study 2: observations ("being hacked: understanding victims' understanding of IoT hacking")
- in planning data collection, there was concern that the social stigma attached to being a victim could affect the quality of collected data, and that interview data might be unreliable if there is a strong social desirability bias
- during initial interview recruiting, hesitancy to talk about sensitive issues with researchers → explore other means to collect data → Reddit, product support forums, user-generated product reviews
- questions victims ask:
  a. have I been hacked? (the most common question; a cry for help: OPs use these platforms to receive an outside perspective that helps them make sense of the incident)
  b. who hacked me? someone known or unknown
  c. why was I hacked?
  d. dealing with the hack: finding a temporary/permanent solution, like getting tech support, getting social/legal support, and dealing with harm and hurt
study 3: experiment (Shandler and Gomez)
- Sep 9, 2020: a ransomware attack struck Düsseldorf University Hospital; WiFi-connected medical equipment was inoperable without paying the ransom; emergency surgery for a 78-year-old patient was cancelled, and the patient eventually died
- the attackers were identified as a Russian organized crime ring; after releasing the malware, they provided the decryption key without further demands
- what constitutes exposure to an attack?
  1. awareness: is the participant aware of the incident?
  2. familiarity: does the participant have detailed knowledge of the incident?
  3. proximity: distance from the attack
  4. connection: possessing a connection to the attack site
- theory of psychological distance: geographically distant events are seen through a lens of abstraction, while proximity to a nearby event concretizes the facts, with the individual experiencing the raw, unvarnished details; spatial distance makes individuals create figurative depictions and hypothetical understandings corresponding with preconceived notions

Class 11: Susceptibility to Disinformation

drivers of false beliefs ⭐
- cognitive drivers:
  - intuitive thinking: lack of analytical thinking and/or deliberation
  - cognitive failures: neglecting source cues and/or knowledge; forgetting the source and/or counter-evidence
  - illusory truth: familiarity, fluency, cohesion
- socio-affective drivers:
  - source cues: elites, in-group, attractive sources
  - emotion: emotive information, emotional state
  - worldview: personal views, partisanship

studies
study 1: susceptibility to misinformation; exploring confusion vs partisanship as causes of sharing disinformation, and how to discourage the sharing of disinfo
- testing the confusion-based account (people share falsehoods because they mistakenly believe they are accurate)
- recruited American voters and provided them with headlines: half false, half true; half favorable to Democrats, half to Republicans
- participants were randomly assigned to either judge the veracity of each headline (accuracy condition) or indicate whether they would consider sharing it online (sharing condition)
- maybe people don't care about accuracy, and partisanship is far more important in guiding sharing decisions? but... when asked at the end of the study, people said accuracy is extremely important

studies 3 and 4: in the control condition of each experiment, participants were shown 24 news headlines and asked how likely they would be to share each on Facebook; in the treatment condition, participants rated the accuracy of a single non-partisan news headline at the outset of the study, then completed the same sharing-intention task as the control condition, but with the concept of accuracy more likely to be salient in their minds

study 7: digital field experiment on social media
- selected Twitter users who had previously shared links to two particularly right-leaning sites that professional fact-checkers rate as highly untrustworthy
- sent them private messages asking them to rate the accuracy of a single non-political headline
- then analyzed the quality of their news content sharing in the 24 hours after treatment
- relative to baseline, the accuracy message increased the average quality of the news sources shared and the total quality of shared sources summed over all posts
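A minimal sketch of the outcome measure in study 7, assuming hypothetical fact-checker trust scores for news domains; all domain names and numbers below are invented for illustration, not taken from the study.

```python
# Sketch: "quality of shared news sources" as trust-weighted averages/sums.
# TRUST_SCORES stands in for fact-checker ratings (0 = untrustworthy, 1 = reliable).

TRUST_SCORES = {
    "reliable-news.example": 0.9,
    "mixed-quality.example": 0.5,
    "untrustworthy.example": 0.1,
}

def sharing_quality(shared_domains):
    """Return (average quality, total quality) over a user's shared links."""
    scores = [TRUST_SCORES.get(d, 0.5) for d in shared_domains]  # unknown domains get a neutral score
    if not scores:
        return 0.0, 0.0
    return sum(scores) / len(scores), sum(scores)

# Comparing a user's sharing before and after an accuracy prompt:
before = ["untrustworthy.example", "untrustworthy.example", "mixed-quality.example"]
after = ["mixed-quality.example", "reliable-news.example"]
print(sharing_quality(before))  # (0.233..., 0.7)
print(sharing_quality(after))   # (0.7, 1.4)
```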
Class 12: Cognitive Security and Disinformation

what is cyberpsychology?
- the interdisciplinary study of the psychology of cyberspace and those who use the tools of cyberspace
- cyberpsychology focuses on the internal experience of the cyberspace user, in contrast with HCI, which focuses on the interaction between the user and technology and the performance of the technology to suit user needs
- goals of cyberpsychology:
  - identify overlap between online and offline life
  - explore psychologically informed conceptualizations of individual and community cyber behavior
  - inform decision makers with insight as to individual and group motivations
  - understand how to develop and implement effective policy and legislation pertaining to cyberspace
- some emphasis areas: cyber presence; digital identity & identity management; online disinhibition effect; digital deviance; dark personalities in cyber; deception & disinformation; social influence; privacy & trust; cybercrime, cyber terrorism, & forensics

cyberspace and social science
- the field of cyberpsychology has developed into a broad discipline
- availability and affordability of the internet have caused personal and social changes; high home penetration by the internet in socio-technical societies
- multimedia internet: a collective, interactive experience; meaning and purpose are constructed by visual, auditory, and other perceptual stimuli; represents the human experience of the "real" world
- the internet may serve as a "transitional space": an extension of the user's internal world; an intermediate zone between self and other that fosters psychological development; influences the psychology of the individual, interpersonal relationships, group behavior, and culture

is cyberspace different than the physical realm?
- no real geographical boundaries (unless sovereignty applies)
- almost everything is recordable, some of it permanently
- privacy is an unclear concept
- synchronous, asynchronous, or hybrid social interactions
- partial to complete anonymity
- sensory experience is different; it can be limited or enhanced

cyberspace as psychological space
- we often experience and talk about cyberspace as a literal "space"
- through a psychological lens, cyberspace can play the role of: a "transitional space" between self and other; an extension of your mind and other people's minds; a representation of the collective human mind; something connected to but distinct from "the real world"

point of failure in cyberspace
- human users create vulnerabilities in cyberspace: human error; lack of knowledge; cognitive biases; insider threat (accidental or deliberate)

cognitive security: exploitation of the cognitive dimension
- cognitive science: "the interdisciplinary scientific study of psychology, computational science, linguistics, philosophy, and neuroscience to understand the human mind" (Andrade & Yoo, 2019, p. 2)
- cognitive security (COGSEC): practices, methodologies, and efforts made to defend against social engineering attempts, i.e., intentional and unintentional manipulations of and disruptions to cognition and sensemaking

what is human cognition?
- the states and processes involved in knowing, which in their completeness include perception and judgment
- cognition includes all conscious and unconscious processes by which knowledge is accumulated, such as perceiving, recognizing, conceiving, and reasoning; put differently, cognition is a state or experience of knowing that can be distinguished from an experience of feeling or willing

disrupting decision making in the OODA loop
- observation: gather the information needed to make a decision
- orientation: deconstruct the situation into manageable parts; create a general plan of action through analysis and synthesis of information
- decision: choose which plan to implement
- action: implement the chosen plan
- disinformation targets the "observation" stage of the OODA loop
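A toy sketch of the OODA cycle in code, to make the point above concrete: if the observe stage ingests disinformation, orientation and decision inherit the corruption. The scenario and data are invented for illustration.

```python
# Sketch: the OODA loop as a decision cycle; disinformation enters at Observe.

def observe(feed):
    # Gather raw reports; disinformation enters the loop here.
    return list(feed)

def orient(reports):
    # Deconstruct the situation: tally how often each claim appears.
    picture = {}
    for claim in reports:
        picture[claim] = picture.get(claim, 0) + 1
    return picture

def decide(picture):
    # Choose a plan based on the dominant claim in the oriented picture.
    dominant = max(picture, key=picture.get)
    return f"respond to: {dominant}"

def act(plan):
    print(f"executing -> {plan}")

# An honest feed vs. one flooded with a repeated false claim: the loop's
# structure is identical, but the corrupted observations drive the decision.
for feed in (["outage in sector A", "outage in sector A", "phishing wave"],
             ["fake breach rumor"] * 5 + ["outage in sector A"]):
    act(decide(orient(observe(feed))))
```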
deception and threat: the basics
- deception fundamentals:
  - deception exploits trust; this can be trust in a technical system or in a social system
  - deception is difficult to detect whether you are dealing with a human or a technology
  - deceptive agents may use power and persuasive tactics to gain trust
  - socially, most people must rely on verbal and nonverbal cues that may signal deception
  - with technology, some of the same social cues can be adapted to detect deception, either by a user or by 'smart' software

what is deception?
- deception is "anything that misleads another for some gain" (Harrington, 2009, p. 58)
- deception is deliberate; it is not misunderstanding, mistake, or mis-remembering
- some forms of deception may be by consent, if the recipient already knows what to expect and trusts that boundaries are honored (e.g., acting)
- other forms of deception occur with the recipient unaware, which can have the effect of destroying trust (e.g., a lie, falsehood, misdirect)
- sometimes deception is used in strategic communications practices

what is a threat?
- someone or something that is trying to do harm, usually by exploiting a known/unknown vulnerability
- a vulnerability is only a threat if someone wants to exploit it; one vulnerability may be associated with multiple threats
- threats may arise in response to our security efforts

objectives of a "threat actor"
- financial theft; stealing trade secrets and private information; critical infrastructure disruption; insider-threat exploitation of information access; targeted attack on an individual or other actor; automated attack on general targets
deception in cyberspace: is it different?
- "Deception can be defined as an interaction between two parties, a deceiver and a target, in which the deceiver successfully causes the target to accept as true a specific incorrect version of reality, with the intent of causing the target to act in a way that benefits the deceiver." (Rowe & Custy)
- examples: fake systems, fake users, fake traffic
- deception in cyberspace is different from deception in "offline" life: fewer perceptual cues are available in cyberspace (verbal, nonverbal); information in cyberspace can be quickly and easily changed ("impermanence")

malign influence: manipulating the information ecosystem
- social engineering, used to influence opinions: exploits vulnerabilities in the target population; often employs manipulation of the things causing the target fear or anxiety
- in socio-technical systems: uses technology platforms to deliver "messaging" to the intended target (e.g., social media); relies on the speed and scale at which the technology platform can spread messaging
- echo chamber: repetition and amplification of beliefs within a "closed system"; this closed system could be an offline social community or an online social community

social engineering and APTs
- social engineering: "influence particular attitudes and behaviors on a large scale"; the social engineering cycle
- advanced persistent threats (APTs): threats that possess advanced, sophisticated planning and execution methods (p. 55); focus on a specific topic; coordinated team of experts with specific skills; intelligence and surveillance that may take months (i.e., casing); intrusion (using social engineering, malware, etc.); exfiltration (extracting specific information)

social cybersecurity: exploitation of the socio-technical ecosystem
- as noted by the National Academies of Sciences (NAS, 2019), social cybersecurity is an applied computational social science with two objectives:
  1. "characterize, understand, and forecast cyber-mediated changes in human behavior and in social, cultural, and political outcomes"
  2. "build a social cyber infrastructure that will allow the essential character of a society to persist in a cyber-mediated information environment that is characterized by changing conditions, actual or imminent social cyberthreats, and cyber-mediated threats"
- uses computational social science techniques to identify, counter, and measure (or assess) the impact of communication objectives
- provides evidence about who is manipulating social media and the internet for/against you or your organization, what methods are being used, and how these social manipulation methods can be countered

information manipulation: misinformation vs disinformation
- both aim to disrupt decision-making processes
- misinformation = inaccurate information that is inadvertently spread; may lead people to form inaccurate (and potentially harmful) beliefs
- disinformation = deliberately misleading information that is strategically spread and that has the function of misleading the target audience
- analysis of disinformation can help us develop techniques for detecting disinformation and policies for deterring its spread

video: "Why Do People Fall for Misinformation?", Joseph Isaac, TED-Ed (4:58)
- in 1901, David Hänig published research that led to what we know today as the taste map: an illustration that divides the tongue into four separate areas; it has since been published in textbooks and newspapers; there is just one problem: the map is wrong; so how do misconceptions like this spread, and what makes a fake fact so easy to believe? Joseph Isaac dives into the world of misinformation.
misinformation/disinformation case studies

example 1: COVID-19 mis/disinformation
- the World Health Organization declared the outbreak of coronavirus disease 2019 (COVID-19) a pandemic on March 11, 2020
- since then, a significant amount of mis/disinformation regarding COVID-19 has been circulated online and in news media; Oxford Analytica (2020) called this an "infodemic"
- government and think tanks fighting COVID-19 mis/disinformation: the U.S. Government (CISA); the RAND report "Bad Actors in News Reporting: Tracking News Manipulation by State Actors"

example 2: Russian disinformation
- Russian Internet Research Agency (IRA): a Russian disinformation troll "farm" of paid internet trolls; IRA influence on recent U.S. presidential elections; the Russia-Ukraine conflict
- the Russian "Firehose of Falsehood" propaganda model; distinctive features of the contemporary model for Russian propaganda:
  1. high-volume and multichannel
  2. rapid, continuous, and repetitive
  3. lacks commitment to objective reality
  4. lacks commitment to consistency
- Global Engagement Center (GEC) special report: Russia's Pillars of Disinformation and Propaganda (August 2020)
  - the GEC "is leading and coordinating efforts of the U.S. Federal Government to recognize, understand, expose, and counter foreign propaganda and disinformation"
  - Russian disinformation and propaganda ecosystem: "official, proxy, and unattributed communication channels and platforms that Russia uses to create and amplify false narratives"
  - five pillars: official government communications; state-funded global messaging; cultivation of proxy sources; weaponization of social media; cyber-enabled disinformation
  - online media multiplies the spread of disinformation (e.g., "disinformation storms")

U.S. government agencies combatting foreign malign influence (FMI)
- U.S. Cybersecurity & Infrastructure Security Agency (CISA): combatting foreign influence operations and disinformation; "CISA reduces risk to U.S. critical infrastructure by building resilience to foreign influence operations and disinformation. Through these efforts, CISA helps the American people understand the scope and scale of these activities targeting election infrastructure and enables them to take action to mitigate associated risks."
- U.S. Department of Justice (DoJ): "Justice Department Disrupts Covert Russian Government-Sponsored Foreign Malign Influence Operation Targeting Audiences in the United States and Elsewhere" (September 4, 2024); "The Justice Department today announced the ongoing seizure of 32 internet domains used in Russian government-directed foreign malign influence campaigns colloquially referred to as 'Doppelganger,' in violation of U.S. money laundering and criminal trademark laws."
- U.S. Department of State (DoS): "State Department Actions to Counter Russia's Election Interference and Foreign Malign Influence Operations" (press statement, Antony Blinken (SoS), September 4, 2024)

Jaitner & Kantola (2016) study: reflexive control
- Russian information operations; goal = to use information manipulation to induce an adversary to make a decision that is strategically advantageous to the initiator
- reflexive control theory = informs a methodology of information manipulation, where specifically prepared information is conveyed to an adversary to lead that adversary to make a decision desired by the initiator; deliberate manipulation of decision making
- how can elements of conflict (and what are those?) be injected into an online community to trigger a community conflict that is disruptive?
- conclusion = cyberspace (i.e., the online environment) presents a multitude of opportunities by which reflexive control theory could be implemented to shape the behavior of a targeted community
- note: remember that online life probably affects offline life in various ways
Howard et al. (2019) study: Russian trolls
- social media and political polarization analysis based on data provided to the Senate Select Committee on Intelligence (SSCI)
- the Russian IRA (Internet Research Agency), i.e., "Russian trolls," reached tens of millions of users in the US between 2013 and 2018
- according to multiple sources, including the U.S. intelligence community, divisive tactics were used through social media platforms such as Facebook, Twitter, Instagram, and YouTube to influence the U.S. 2016 presidential election and the Black Lives Matter movement
- the report documents research that used social media data to assess the use of divisive tactics to influence the US 2016 presidential election
- highlights:
  - "Russia's IRA activities were designed to polarize the US public and interfere in elections"
  - African American voters were encouraged to boycott elections or follow the wrong voting procedures in 2016, and Mexican American and Hispanic voters were encouraged to distrust US institutions; extreme right-wing voters were encouraged to be more confrontational
  - sensationalist, conspiratorial, and other forms of junk political news and misinformation were spread to voters across the political spectrum... "these campaigns did not stop once Russia's IRA was caught interfering in the 2016 election"

Class 13: Cyber Attackers

hacker taxonomy (category (aliases): description):
- novices (script kiddies, newbies): lowly skilled hackers who heavily rely on online toolkits
- cyberpunks (crashers, thugs, crackers): low- to medium-skilled hackers who wreak havoc for fun
- insiders (internals, user malcontents, corporate raiders): disgruntled current or ex-employees who abuse their access to get what they want
- old guards (white hats, sneakers, grey hats, tourists)
- professionals (black hats, elites, criminals, organized crime, information brokers): extremely skilful hackers who hack to further their criminal empire or are guns-for-hire
- hacktivists (political activists, ideologists): hackers who use their technical skills to further their political agendas
- nation states (information warriors, cyber terrorists, cyber warriors): highly trained hackers who work for governments to destabilize, disrupt, and destroy systems and networks
- students: hackers with no malicious intent, who hack to gain knowledge
- petty thieves (extortionists, scammers, fraudsters, thieves): criminals who move their activities online, using their low to medium skills
- digital pirates: hackers who possess and engage in illegal duplicating, distributing, downloading, or sale of copyrighted materials
- online sex offenders (cyber predators, pedophiles): hackers who misuse the internet to engage in sexually deviant behaviour with children
- crowdsourcers: individuals who come together to solve a problem, often using questionable methods or chasing dubious goals
- crime facilitators (supporters): provide tools and technical know-how to cybercriminals, enabling them to launch sophisticated attacks

why is it important that we categorize adversaries in structured classes?
- different hackers have different strategies: by knowing the primary type of adversary we face, we can tailor security concerns; some security responses will be ineffective, depending on the motivation of the hacker
- identifying trends: if we observe that the majority of new attacks are coming from a particular country or are using a particular technique, we can pre-emptively optimize security measures
- in short: 1) identify the category, motivation, and strategy of the attacker; 2) devise a pre-emptive and after-the-fact security response that is tailored to this attacker/motivation combo (see the sketch below)
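A minimal sketch of steps (1) and (2), using the taxonomy categories above; the playbook entries are invented assumptions for illustration, not prescribed responses.

```python
# Sketch: tailoring a security response to the identified attacker category.
# The response strings are hypothetical placeholders.

PLAYBOOK = {
    "novices": "patch known CVEs; toolkit-based attacks fail against hardened defaults",
    "insiders": "least-privilege access, activity auditing, off-boarding reviews",
    "professionals": "threat intelligence, network segmentation, incident response retainer",
    "hacktivists": "harden public-facing assets; reduce provocation surface",
    "nation states": "assume compromise: detection, containment, and resilience over prevention",
}

def respond(category: str) -> str:
    # Step 1: identify the category; step 2: devise a tailored response.
    return PLAYBOOK.get(category, "default: monitor and reassess attacker profile")

print(respond("novices"))
print(respond("crowdsourcers"))  # unmapped category falls back to the default posture
```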
young et al. (2007): hacking into the minds of hackers
- attended the DEF CON conference in Las Vegas and collected data from conference attendees
- data was collected through handout surveys distributed to participants; 127 people filled them out
- when asked if they had participated in hacking activity outside legal bounds, 42.5% said yes
- perception of hacking among hackers and students (mean ratings):

  group          | moral disengagement | informal sanction | punishment severity | punishment certainty | utility value
  hackers        | 4.31                | 1.33              | 4.84                | 2.23                 | 2.56
  other students | 2.65                | 3.08              | 4.88                | 2.74                 | 1.68
  students       | 2.13                | 3.86              | 3.79                | 2.97                 | 1.73

Reading Week 5: Victims of cyberattacks

What does it mean to be exposed to a cyberattack? Who are the victims of cyberattacks?
- exposed = a person/org/device has been targeted by, or is vulnerable to, malicious activities conducted via the internet or other digital networks; e.g., an org has had systems compromised by malicious actors who hold data hostage, demanding ransom for decryption keys
- victims fall into categories:
  - individual users: phishing, malware, hacking of personal devices → leads to identity theft, financial fraud, privacy invasion
  - organizations: businesses hit by ransomware, data breaches, DDoS (distributed denial of service) → suffer financial loss, reputational damage, legal consequences, and operational interruptions
  - public sector entities: government agencies, educational institutions, other public orgs → subjected to attacks stealing sensitive info, causing operational disruption or political espionage
  - critical infrastructure: systems/assets vital to national security, the economy, public health, and safety → attacks can cause widespread harm, like a breach in a hospital system or the disabling of water/electricity utilities
  - IoT device users: internet-of-things devices (smart home gadgets, cameras, locks) have vulnerabilities allowing hackers to gain control → physical safety concerns, privacy violations, theft
  - employees: staff members in orgs; psychological/emotional impact (increased stress, anxiety, PTSD; isolation, guilt, self-doubt, even suicidal thoughts); physical health issues (high pressure/stress can lead to heart palpitations or other stress-related illness); job security concerns (layoffs, pressure to resign, esp. for orgs under significant operational/financial stress)
  - clients/customers: entities that rely on services → may face identity theft, loss of data, service disruption
  - society at large: potentially lives at risk

What kinds of cyberattacks are people exposed to?
- phishing: deceptive emails/messages that trick people into giving sensitive info; spear phishing = attacks targeted at specific individuals
- ransomware: malicious software that encrypts a victim's files and demands a ransom payment to restore access; harms like business interruption, financial loss from ransom payments and recovery costs, and reputational damage from loss of clients'/partners' trust
- DDoS: makes a network service unavailable by overwhelming it with traffic
- data breaches: unauthorized access to private data
- identity theft: criminals steal personal info to impersonate others
- malware: viruses, worms, and trojans that infect devices to cause harm, steal data, or create backdoors for future IoT device exploitation
- social engineering: techniques to deceive individuals into divulging confidential/personal info (exploiting trust or fear)
- insider threats: individuals within an org exploit their access for malicious purposes
- credential stuffing: stolen usernames/passwords used to gain access to other services (exploiting reused credentials); see the sketch after this list
- spyware: secretly monitors user behavior and collects info
- man-in-the-middle (MitM) attacks: intercepting comms between two parties to eavesdrop or manipulate messages
- supply chain attacks: targeting less secure elements in the supply chain to gain access to the main org's systems
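One common defense against credential stuffing is screening passwords against known breach corpora. A minimal sketch using the public Have I Been Pwned range API, whose k-anonymity scheme means only the first five hex characters of the SHA-1 hash ever leave the machine; illustrative, not production code.

```python
# Sketch: reject credentials that appear in known breaches (stuffing fodder).
import hashlib
import urllib.request

def times_breached(password: str) -> int:
    digest = hashlib.sha1(password.encode()).hexdigest().upper()
    prefix, suffix = digest[:5], digest[5:]
    url = f"https://api.pwnedpasswords.com/range/{prefix}"  # returns SUFFIX:COUNT lines
    with urllib.request.urlopen(url) as resp:
        body = resp.read().decode()
    for line in body.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0

# A count > 0 means the credential is reusable by stuffing attacks and
# should be rejected at signup or password change.
print(times_breached("password123"))
```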
factors influencing harm
- preparedness: orgs with better cyber training, awareness, and business continuity planning recover more effectively
- leadership culture: supportive leaders lead to robust incident response, while poor leadership may exacerbate challenges and reduce morale
- crisis communication: effective internal/external comms strategies can mitigate confusion during an incident
- external support: assistance from cyber professionals, incident responders, and cyber insurance can reduce the overall impact of an attack

What are the societal and political consequences of exposure to cyberattacks?
- societal:
  - increased anxiety and fear: feeling vulnerable, helpless, paranoid ("cybernoia": irrational fear about tech use) → people limit engagement with beneficial tech or avoid it altogether
  - trust erosion: mistrust in IoT manufacturers → stops new tech adoption and increases reliance on traditional systems
  - normalization of surveillance/control: shifted perceptions of personal space
  - impact on relationships: suspecting close ones, strained relationships, not feeling safe in one's own home
  - impact on vulnerable populations: psychological/physiological harms in personal lives and well-being → decreased employee productivity and overall morale; a cycle of org dysfunction
  - reputational damage: lost business opportunities, strained relationships with clients and partners, reduced trust within communities
  - social cohesion challenges: stress/harm may disrupt workplace culture and community interactions
- political:
  - legislation/government response: implement stricter data protection laws and security requirements for IoT → funding for cybersecurity initiatives and awareness campaigns
  - international cybersecurity policy: hacking by foreign actors provokes military/diplomatic responses → acknowledgement of the need for collaborative international efforts to combat it
  - resource allocation: to cyber measures, threat detection, and victim support services
  - focus on vulnerable communities: tech is exploited in intimate partner abuse; design policies to protect domestic abuse survivors from tech-facilitated violence
Week 6: Susceptibility to disinformation, social engineering, and usable security

What is disinformation? What is social engineering?
- disinformation = false/misleading info deliberately created and disseminated to deceive or manipulate others, e.g., fabricating false info, distorting facts, spreading rumors to influence public perception/behavior; designed to achieve specific outcomes like swaying public opinion, disrupting social cohesion, or interfering in public processes
- social engineering = a technique used to manipulate individuals into divulging confidential/personal info used for fraudulent purposes, e.g., impersonating a trusted entity, phishing attempts, exploiting psychological tricks to bypass security measures; often associated with cybercrime, where attackers trick individuals into revealing sensitive info like passwords, financial details, and other private data
- social engineering exploits human psychology to gain confidential/sensitive info; it can involve divulging personal info, clicking on malicious links, or performing actions that compromise security, often through tactics that appeal to emotion, curiosity, or urgency; some forms:
  - phishing = deceptive emails/messages that appear legitimate, to lure victims into providing personal info or opening malicious links
  - pretexting = the attacker creates a fabricated scenario to obtain info from the target, often portraying themselves as someone in a position of authority/trust
  - baiting = offering something enticing, like free software or physical items, to encourage victims to engage, often leading to malware installation

What disinformation is being spread? Where is it being spread? By whom is it being spread? How is it being spread?
- what is being spread?
  - health misinfo, e.g. about the efficacy and safety of vaccines
  - political misinfo: false claims about candidates and electoral processes
  - conspiracy theories surrounding COVID origins or climate change denial
  - attacks relying on deception: semantic attacks use phishing emails, fake websites, and other misleading techniques to trick users into revealing sensitive info or executing harmful actions
- where is it being spread?
  - social media platforms (e.g., Facebook, Twitter, YouTube), because they reach vast audiences quickly
  - traditional news outlets: propagated through opinion pieces and sensationalist reporting in mainstream media
  - websites/blogs: many sites publish disinformation, often masquerading as legitimate news sources
  - traditional phishing attacks have evolved and now manifest as social engineering on social media, cloud applications, and instant messaging services
- spread by whom?
  - individuals: people share info (unknowingly) because of cognitive biases and emotional reactions
  - political groups/activists: to achieve specific goals like voter suppression or reinforcing partisan beliefs
  - media outlets: due to inaccuracies, sensationalism, editorial bias
  - bots/automated accounts: generating and circulating false narratives at scale
  - cybercriminals/malicious actors: organized groups behind sophisticated phishing campaigns
- how is it spread?
  - social network dynamics: info spreads virally as users share content within networks without verifying accuracy
  - emotional appeals: content designed to elicit strong emotional reactions (fear, anger, happiness) tends to be more widely shared
  - echo chambers: environments where individuals are exposed primarily to info that reinforces their beliefs = higher susceptibility to misinfo
  - cognitive biases: heuristic judgements, like the illusory truth effect (repeated exposure to a claim = increased belief in its truth), facilitate the spread of false info
  - spear-phishing: targeting specific individuals with personalized emails that appear legitimate
  - spoofed websites: creating fake websites that mimic legitimate ones to gather user credentials (see the sketch after this list)
  - scareware: using misleading prompts that create urgency, convincing users to disclose personal info
  - fake links/URLs: misleading users into clicking on harmful links
  - social media manipulation: utilizing fake accounts/messages to mislead users
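A minimal sketch of catching spoofed lookalike domains by edit distance to a short whitelist; the domains are invented, and real detectors would also handle homoglyphs, subdomain tricks, and domain registration age.

```python
# Sketch: flag domains that are a near-miss of a known legitimate domain.

LEGIT = ["paypal.com", "google.com", "mybank.example"]

def edit_distance(a: str, b: str) -> int:
    # Classic dynamic-programming Levenshtein distance.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def looks_spoofed(domain: str) -> bool:
    # A near-miss of a known brand (distance 1-2) but not an exact match.
    return any(0 < edit_distance(domain, legit) <= 2 for legit in LEGIT)

print(looks_spoofed("paypa1.com"))  # True: one character swapped
print(looks_spoofed("google.com"))  # False: exact match is legitimate
```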
Why are people susceptible to disinformation?
- cognitive biases: humans rely on heuristics (cognitive shortcuts) to process info quickly, which leads to biases like the illusory truth effect, exacerbated when individuals don't engage in critical reasoning and instead follow initial intuitions or gut feelings; confirmation bias
- social influence: those within echo chambers are most susceptible; leads to reinforced views and greater acceptance of misinfo that aligns with preexisting opinions
- emotional engagement: disinfo appeals to emotion, which can override rational thinking; emotional content is more memorable and more likely to be shared; individuals may trust emotional messages even when they lack facts
- identity/beliefs: misinfo can threaten a person's identity or worldview; people may respond defensively → motivated reasoning
- source credibility: people are more likely to accept info from attractive, authoritative, or similar sources
- memory/recall: can affect judgement and decision making, regardless of subsequent corrections
- education/info deficits: the "info deficit model" (the assumption that providing accurate info will correct misunderstandings)
- technological amplification: rapid spread through social media and other platforms; algorithms that maximize engagement can inadvertently promote sensational/misleading content over fact
- lack of media literacy: misunderstandings; users are not equipped to discern credible sources from disinformation
- cognitive overload: overwhelmed by the volume of info, people rely on heuristics, leading to inaccurate conclusions
- trust in authorities: people align with info from news outlets, celebrities, and social media influencers; followers accept it without question
- limited knowledge: a superficial understanding of complex topics = more vulnerable to believable, simplistic/misleading but compelling and straightforward narratives
- misinformation as cognitive shortcuts

What are the predictors of susceptibility to social engineering, and what educative, design, and technical strategies might mitigate the threat?

predictors of susceptibility to social engineering
- cognitive factors:
  - intuitive thinking: individuals may rely on gut feelings rather than analytic reasoning, leading to poor judgements
  - memory failures: misinfo can become intertwined with existing knowledge, making it difficult for individuals to differentiate between fact and falsehood
  - repetition/familiarity: exposure to misinfo increases perceived truth (illusory truth effect), as repeated claims seem more believable
- social factors:
  - source credibility: info from perceived authoritative/trustworthy sources is more readily accepted, regardless of accuracy
  - group identity: individuals are more likely to believe info that aligns with their social/political group ⇒ leads to partisan bias
  - social norms: peer behavior/beliefs can influence individual susceptibility to misinfo
- affective factors:
  - emotion/mood: emotional appeals can increase the persuasiveness of misinfo; negative emotions like fear/anger can further entrench false beliefs
  - identity threat: corrections that challenge one's worldview/identity can lead to defensive rejection of accurate info, reinforcing misinfo
- contextual factors:
  - echo chambers: individuals who primarily engage with like-minded peers may be less exposed to corrective info, perpetuating false beliefs
  - info overload: the vast amount of info available may overwhelm individuals, leading to reliance on heuristics rather than careful evaluation
- security training: self-study is associated with lower susceptibility; formal training is less effective because engagement/relevance matter more
- computer literacy: self-reported; associated with lower susceptibility
- familiarity with the platform: greater awareness of what normal behavior looks like
- frequency of access: aids understanding and detecting attacks
- duration of use: longer use may lead to habituation or a deeper understanding of normal vs abnormal behavior
- security awareness: not always a strong standalone predictor, but correlates with reduced susceptibility
strategies to mitigate the social engineering threat
- educative strategies:
  - media literacy programs: enhancing critical thinking and analytical skills can make individuals more discerning consumers of info; programs that teach lateral reading and the evaluation of sources can empower users to assess info credibility
  - inoculation techniques: preemptive exposure to the techniques used in misinfo can help individuals develop resistance to future misinfo (e.g. by explaining common persuasive tactics)
  - emotional training: teaching individuals to recognize and manage emotional reactions can help mitigate impulsive sharing/acceptance of misinfo
  - tailored training programs: self-guided learning, scenarios specific to user roles, regularly updated
  - gamification and interactive learning: engaging formats to teach security practices = improved retention/awareness
- design strategies:
  - user-centered design: prioritize accurate info, e.g. prompts to verify info before sharing; enhance users' ability to identify phishing attempts or suspicious activities
  - feedback mechanisms: enabling users to report misinfo, creating a feedback loop for corrections and highlighting the social unacceptability of sharing falsehoods; integrating systems that provide immediate alerts about potential threats based on user behavior
  - interface changes: altering social media algorithms to reduce the amplification of misinfo and prioritize reliable sources
- technical strategies:
  - fact-checking tools: can provide real-time corrections to misinfo
  - labeling misinfo: clearly marking false claims/misleading content can bolster skepticism and encourage users to verify info before sharing
  - algorithm adjustments: tweaking algorithms to reduce the visibility of sources known for spreading misinfo can help prevent its spread
  - adaptive security measures: employ machine learning algorithms to analyze user behavior and predict susceptibility, introducing dynamic access controls based on user profiles (see the sketch after this list)
  - monitoring/tracking: support systems that monitor user interactions with platforms and provide support when anomalous behavior is detected
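A minimal sketch of the adaptive access-control idea above, with hand-picked weights standing in for a trained model; every feature name, weight, and threshold is an invented assumption for illustration.

```python
# Sketch: a susceptibility score gating a risky action (dynamic access control).

WEIGHTS = {
    "completed_training": -1.5,    # self-study/formal security training
    "computer_literacy": -1.0,     # self-reported literacy
    "platform_familiarity": -0.5,  # knows what normal behavior looks like
    "recent_anomalies": +2.0,      # e.g. clicked a simulated phish recently
}

def susceptibility(profile: dict) -> float:
    # Higher score = more susceptible; binary features (0/1) for simplicity.
    return sum(WEIGHTS[k] * profile.get(k, 0) for k in WEIGHTS)

def allow_external_link(profile: dict) -> bool:
    # Dynamic control: riskier users get external links held for review.
    return susceptibility(profile) < 0.5

user = {"completed_training": 1, "computer_literacy": 1, "recent_anomalies": 1}
print(susceptibility(user), allow_external_link(user))  # -0.5 True
```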
Week 7: Into the mind of cyber attackers

Who are hackers? What are the psychological profiles of white-hat and black-hat hackers? What are their motivations?
- hackers = those who gain unauthorized access to computer systems
- white-hat hackers = ethical hackers using their skills to improve cybersecurity; they work with orgs to identify vulnerabilities in systems, secure networks, and educate about risks; their activities are legal, sanctioned by the orgs they assist
  - psych profile: possess a strong ethical framework and a drive to help others; motivated by a desire for problem-solving, intellectual challenge, and a commitment to using skills for good; high creativity, conscientiousness
  - motivations: ethical considerations, a sense of duty to protect others, professional advancement in cybersecurity, financial compensation from the orgs they help
  - values: high on self-transcendence and openness-to-change values; help others and embrace new ideas
- black-hat hackers = exploit systems for malicious purposes, such as stealing data, spreading malware, or causing damage; their actions are illegal and can have significant consequences for individuals and orgs
  - psych profile: harbor anti-establishment sentiments, feel a thrill from rule-breaking, possess a strong desire for control/power, lack empathy for victims, view their actions through a self-justification lens; thrill-seeking, revenge-driven, ideological, superiority-driven; risk-taking, lack of empathy, anti-social tendencies
  - motivations: financial gain (theft/extortion), notoriety within hacker communities, the challenge of demonstrating technical skills, anti-corporate/anti-authority ideological motivations
  - values: often score high on self-enhancement values (power, achievement) and lower on self-transcendence ⇒ personal gain over communal well-being

key factors influencing hacker behavior
- moral disengagement: hackers (esp. black hat) rationalize actions through moral disengagement, convincing themselves that their actions are justified or that victims deserve the consequences
- perception of risk: hackers assess the likelihood of being caught and the severity of punishments; they perceive a low risk of detection because of the complexity/anonymity of the cyber environment
- acceptance in peer groups: within certain subcultures, illegal activities may be viewed as courageous or honorable, which can diminish perceived social repercussions/norms
- intellectual challenge and curiosity: exploring systems and how they function
- financial gain
- ideology: promoting social change or political causes
- revenge/personal grievances: targeting specific orgs/people due to past wrongs
- Schwartz's theory of motivational types of values: openness-to-change and self-transcendence values are more prominent among hackers than conservation values; self-enhancement values relating to personal gain and recognition motivate black-hat hackers more

What behavioral factors influence the success of a hack?
- moral disengagement: hackers use cognitive mechanisms to justify actions, viewing illegal hacking as acceptable if it causes no damage or strengthens society; they see themselves as "modern-day Robin Hoods" helping orgs identify and fix security vulnerabilities, not criminals
- informal sanction: societal reaction to behavior; many hackers believe that peers, family, and society will not judge them harshly for hacking; a low level of concern about social disapproval ⇒ increased willingness to engage in illegal activities
- punishment severity: hackers see severe legal penalties (prison time, fines) as distant and unlikely to personally affect them, diminishing the deterrent effect of severe punishments
- punishment certainty: many perceive a low likelihood of being caught, which contributes to the decision to engage in hacking; this aligns with studies showing that individuals commit crimes if they believe the chances of being apprehended are low
- utility value: hackers weigh the perceived benefits of hacking (fame, monetary rewards, intellectual challenge) against the risks involved; if they believe the gains outweigh the potential costs, they are more inclined to proceed (see the sketch after this list)
- community dynamics: community norms and values reinforce behaviors; a sense of belonging and recognition among peers can supersede fear of legal repercussions, esp. in a subculture that valorizes hacking skills
- motivational drivers: intellectual challenge and curiosity are primary drivers
- social recognition: peer respect, esp. establishing credibility within the hacker community; the drive for social validation can lead hackers to try to impress others
- dislike for conservatism: avoiding obedience to rules and societal norms; this tendency can motivate them to circumvent security systems, viewing such actions as rebellion
- self-transcendence values: can lead to ideologically driven hacking
- risk tolerance and thrill-seeking: hackers who enjoy the adrenaline rush may be more persistent in finding vulnerabilities and engaging in hacking activities
- community/team dynamics: team play and collaborative effort enhance the success of a hacking attempt
- psych traits: high cognitive ability, problem-solving, and tech expertise make hackers better equipped to identify and exploit vulnerabilities in security systems
- cultural influence: those embedded in hacker culture may adopt innovation, risk-taking, and defiance of conventions → successful hacking endeavors
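The utility-value account reduces to a simple expected-value comparison. A sketch with invented numbers, showing why perceived punishment certainty can matter more than severity (consistent with the low certainty ratings in the young et al. data above):

```python
# Sketch: an attacker proceeds when perceived gains exceed expected costs.
# All figures are invented for illustration; the point is the comparison.

def expected_utility(benefit, p_caught, penalty):
    return benefit - p_caught * penalty

# Low perceived certainty makes even a severe penalty a weak deterrent:
print(expected_utility(benefit=50_000, p_caught=0.02, penalty=1_000_000))  # 30000.0 -> proceed
# Raising perceived certainty flips the decision with the same penalty:
print(expected_utility(benefit=50_000, p_caught=0.10, penalty=1_000_000))  # -50000.0 -> deterred
```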
How can defenders exploit the decision-making weaknesses of cyber attackers?
- target moral disengagement: promote narratives and awareness campaigns highlighting the real-world impact of hacking on individuals and orgs; by framing hacking as fundamentally harmful, defenders can seek to disrupt the moral justifications hackers use
- enhance the perception of informal sanctions: foster communities that disapprove of hacking, improving social accountability
- increase perceived punishment certainty: share stories of arrests and legal actions against hackers; robust monitoring tools
- leverage the utility-value framework: highlight consequences of hacking that outweigh the benefits; educational programs about ethical hacking; legitimate avenues to apply skills
- encourage ethical education: resources for those interested in tech, reducing the allure of illegal activities
- utilize intelligence and cyber forensics: advanced analytics and intelligence tools let security teams identify patterns in hacker behavior and motives, preempting attacks before they happen
- collaboration with law enforcement: sharing methodologies, intelligence, and techniques between the cybersecurity community and law enforcement for effectiveness
- behavioral analysis: defenders can anticipate the tactics, techniques, and procedures (TTPs) used; e.g., attackers driven by curiosity/recognition can be led to less critical areas of the network for monitoring or counterattacking
- decoys/honeypot creation: exploit curiosity-driven hackers (see the sketch after this list)
- psych manipulation: use social factors; leak false info to manipulate attackers' decisions and targets
- unpredictability/randomness: defenders change strategies once hackers establish a pattern, e.g. rotating defense mechanisms, updating systems, changing access controls
- exploiting overconfidence: defenders enhance monitoring and increase the visibility of security systems so hackers make mistakes
- leveraging community ethics: promoting ethical hacking and responsible disclosure practices can shift attackers' motivations toward constructive ends
- conducting simulations and drills: defenders understand potential system weaknesses and refine strategies
- identifying patterns and threat intelligence: patterns in hacker behavior predict when and how an attack might occur; defenders position themselves to counteract cyber threats effectively
- community engagement: building relationships with white-hat hackers can provide insights into potential attack vectors; a proactive approach to security, using that knowledge to defend against common attack patterns
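A minimal sketch of a decoy service: a listener on an otherwise unused port, where any connection is attacker telemetry. The port choice and logging format are illustrative assumptions; real honeypots emulate services in much more depth.

```python
# Sketch: log connection attempts to a decoy port no legitimate client uses.
import socket
from datetime import datetime, timezone

DECOY_PORT = 2222  # chosen to look like a misconfigured alternate SSH port

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("0.0.0.0", DECOY_PORT))
    srv.listen()
    while True:
        conn, (addr, port) = srv.accept()
        # Every hit is suspicious by construction: record and move on.
        print(f"{datetime.now(timezone.utc).isoformat()} probe from {addr}:{port}")
        conn.close()
```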
How can attackers exploit the decision-making weaknesses of targets / cyber defenders?
- cognitive biases:
  - overconfidence: defenders may overestimate their ability to detect and terminate an attack, making them vulnerable
  - normalization of deviance: defenders who see minor security breaches or behavioral anomalies over time may begin to dismiss them as typical
  - confirmation bias: defenders may focus on info confirming existing beliefs about security measures and threat levels, ignoring evidence of new or evolving threats
- social engineering techniques:
  - phishing attacks: attackers manipulate targets into divulging sensitive info by exploiting psychological factors
  - manipulating morality: attackers may frame actions within a context that minimizes perceived wrongdoing, e.g. claiming a greater good
  - creating scenarios of scarcity: stressful, high-pressure situations can lead defenders to react impulsively, making poor decisions
- resource limitations:
  - exhausting human resources: launching sustained attacks that overwhelm defenders' resources, causing fatigue and leading to errors and oversights in monitoring and incident response
  - prioritization conflicts: orgs may face competing priorities, causing defenders to neglect certain systems or threats; attackers can exploit these gaps by targeting lower-priority systems
- moral disengagement:
  - justification of hacking: Robin Hood-style framings that exploit moral frameworks
  - pressures within cyber culture: a culture that glorifies illicit activity makes it hard for defenders to invoke preventive attitudes and measures
- info asymmetry:
  - knowledge gaps: hackers have a technical advantage and exploit areas where defenders lack awareness
  - insider knowledge: tailoring strategies to evade detection and countermeasures
- utilizing advanced techniques:
  - zero-day exploits: attackers may use vulnerabilities unknown to defenders
  - advanced persistent threats (APTs): long-term targeted operations that exploit defenders' cognitive or resource weaknesses, undermining their decision-making capabilities over time
- pressure and psych manipulation:
  - creating a false sense of security: well-placed misinfo can lead defenders to feel secure, distracting them from legitimate threats
  - leveraging peer influence: attackers can exploit peer dynamics within orgs, creating dissent and doubt about security policies, leading to a weakened shared understanding
- motivations/values: recognizing that defenders are motivated by conformity/stability (conservation values) over innovation/exploration (openness-to-change values) ⇒ allows attackers to exploit defenders' aversion to change and risk
- leveraging psych factors: e.g., fear of reporting breaches means defenders can't act swiftly; social engineering appeals to defenders' sense of duty/trust to gain unauthorized access
- creating complexity: cyber defenders prefer simplicity, so they are confused by complex, sophisticated attacks
- exploiting info asymmetry: defenders become overconfident and underestimate risks because attackers have better knowledge of their own tools and techniques
- targeting intellectual challenges: eluding risk-averse defenders who focus on compliance
- double-edged nature of collaboration: defenders use online tools to problem-solve while attackers use them to strengthen operations, making it hard for defenders to anticipate and mitigate threats
- behavioral analysis: understanding the emotional/ethical considerations of security lets attackers choose the right moment to strike
- executing memory attacks: fake alerts and misleading reports to misdirect defenders' attention
- social engineering: manipulating defenders into revealing sensitive info, weakening the security posture
- scenario crafting: fabricating events that align with defenders' typical focus, misdirecting efforts away from legitimate issues, e.g. phishing disguised as legitimate requests
