Final Test Study Guide - JOUA01H3F (Dec 2024) - PDF

Summary

This study guide provides an overview of key concepts and terms from Weeks 5-12A of Introduction to Journalism and News Literacy (2024). The guide was prepared for the final test on December 17, 2024. It includes detailed descriptions of disinformation and misinformation, as well as a discussion of the use of bots and social media to spread disinformation.

Full Transcript

JOUA01H3F – Introduction to Journalism and News Literacy I
Final Test – Study Guide for Tuesday, December 17, 2024
*Updated on December 2, 2024

Covers Weeks 5-12A (Oct. 7 – Dec. 2). The test will run from 9am-11am (2 hours) in IC 130. The test will be multiple choice (10 questions) and short answer (10 questions). You do not need to memorize readings; try to understand the content on the slides listed below. You will not need citations on the test, and you do not need to remember whole quotes or specific scholars; just try to understand the main ideas of the slides. The following terms (concepts and lecture slides) could be on the final test. Any terms not listed WILL NOT be on the test.

Disinformation – Defined
o deliberately false information (an intention behind the spread)
o info that is intentionally and verifiably false
o designed to manipulate people's perceptions of real facts, events and statements

Misinformation – Defined
o accidental – one simply has their facts wrong
o missing information
o no intent to mislead the audience
o false, incomplete or inaccurate, but some truth may exist
o information that is uncertain, vague, or ambiguous
o biased, subjective, myopic, etc.
o deception is not a motive

Previous Uses of the Fake News Term:
o Satire
▪ provided factual information on current issues using humour
▪ no manipulation
▪ attempts at humour to present an argument, often in a political context
o Parody
▪ non-factual information
▪ key: the person doing parody doesn't want to deceive the audience – they assume the audience is in on it with them
▪ the audience gets the joke
▪ if the audience doesn't get that it's a joke, they may spread the info
o Sensationalism
▪ attempts to appeal to an audience by presenting material that is shocking and exciting
▪ aka yellow journalism: using eye-catching headlines to increase views
▪ ex) the Great Moon Hoax of 1835 ("discovery" of life on the moon)
▪ the information was false, but the goal was to sell newspapers

Post-truth Era
o truth is secondary
o an era in which audiences are more likely to believe information that appeals to emotions or existing personal beliefs, as opposed to seeking and readily accepting information regarded as factual or objective
o similar to "truthiness"

Why People Click (Confirmation Bias)
o suggests that users may actively seek and use information that already coincides with existing mental schema, as opposed to seeking information from a variety of potentially conflicting sources
o ex) the election (Trump vs. Kamala) – people who identify with Trump may seek out pro-Trump information online instead of information that goes against him

Who Spreads Disinformation?
o political bots are automated software programs that are written to learn from and mimic real people so as to manipulate public opinion across a diverse range of platforms and device networks
o Bots
▪ Understanding Bots: artificial intelligence allows bots to simulate internet users' behavior (e.g., posting patterns), which helps in the propagation of fake news; bots mimic how people post online
▪ Dampener Bots: bots that suppress certain messages, channels or voices. Their goal is to discourage or drown out information or people.
▪ dampener bots suppress certain messages; ex) a person in BC was critical of Stephen Harper – bots flooded their posts and were abusive, and the person took down the post due to the threats; the bots were dampening critique of Stephen Harper
▪ Amplifier Bots: "bots (that) deliberately seek to increase the number of voices or attention paid to particular voices and messages. [...] these bots increase the popularity, visibility and reach of certain accounts and/or messages online." They flood comment sections so a post ranks higher in the algorithm and appears higher in social media feeds; they may retweet posts; they try to promote a person or persons and encourage popularity.
o Trolls
▪ real people
▪ sometimes working to upset or confuse users
▪ sometimes working to use confirmation bias and other tactics to create political discord
▪ Russian Internet Research Agency: real people in Russia were trying to promote pro-Trump information on American platforms during the US elections; ex) a man ran an account that looked real – he posted pictures of a kid and pretended to be a different person, so when people looked at the account they believed the information from it to be true

Three Democratic Problems w/ Disinformation
o 1. Wrongly Informed Citizens
▪ disinformation is difficult to recognize
▪ informed citizens are vital to democracy
▪ ways to find facts? using news literacy
o 2. Echo Chambers
▪ the repetition of information, often within an enclosed space or routine, that can amplify certain information without providing access to competing ideas
▪ different problematic sources saying the same thing
▪ keeps people wrongly informed and perpetuates the threat to democracy
o 3. Affective Content
▪ content meant to create emotional reactions
▪ creates sensationalist content
▪ incites unlawful action
▪ relates to what you feel, not what happened

Russia's Social Media Influence Operation
o steps taken to plague Americans
o 1.
Reconnaissance
▪ get to know the target audience
▪ the first hurdle for effectively infiltrating and influencing an audience
▪ social media provides content and data about individuals, communities, relationships, and networks
▪ to influence them, you need to know how they act online
▪ identifies key influencers and adversaries
▪ platforms help bad actors orchestrate enticing tailored influence packages cognizant of each audience member's professional and personal details
o 2. Placement
▪ place disinformation on a social media platform
▪ drive narratives that support some sort of ideology
▪ create, then place
▪ social media platforms that are pro "free speech" – on Instagram, TikTok, darker things can be brought up and won't be taken down
▪ specific platforms are chosen for spreading info, since bad actors surveil them
o 3. Propagation
▪ the ability to spread narratives and themes to audiences far and wide
▪ social media is designed for this
▪ should a Russian theme or narrative from their state-sponsored outlets garner a response from mainstream media outlets, the organic audience engagement will naturally further Russian aims and mask Kremlin influence
▪ bots help trolls: automated repetition and considerable volume
▪ social bots can be tailored to replicate the appearance and speech of the target audiences, making unwitting observers more likely to engage with and believe the falsehoods they spread
▪ what people see first and what they see the most is what they are most likely to believe
o 4.
Saturation
▪ the work now moves from online to offline
▪ people learn of the information, then tell their friends, family, and coworkers about it in everyday conversations
▪ problematic: continues the spread of disinformation to more people
▪ people trust their peers and family more than information they find online

Information Literacy (2 STEPS – 4 COMPONENTS)
o The set of skills and knowledge that not only allows us to find, evaluate and use the information we need, but perhaps more importantly, allows us to filter out the information we don't need.
o 1. Task Definition
a) define the problem
b) identify the information need
i) different problems require different information (content specific)
ii) clarify the context – it's very important
iii) identify the information needed to address the questions raised in that context
o 2. Information Seeking Strategies
c) determine all possible sources – to help answer the question that needs to be answered
d) select the best sources
i) best sources are determined by context
ii) best sources are determined by the definition of expertise (specialists vs. the multitude)
iii) how do we find sources?

Civic Online Reasoning (3 COMPONENTS)
o Definition: the ability to effectively search for, evaluate, and verify social and political information online
o Three Components
▪ 1. Who is behind the information? Who wrote the articles – are they credible, and do they have motives? Are the sources trustworthy – who's behind the source?
▪ 2. What is the evidence? What evidence do they provide that leads you to believe they are accurate, and how do they present it to the audience? Does the evidence align with what the story is actually about? Be critical of the information – is it accurate, and where did it come from?
▪ 3. What do other sources say? Investigate multiple sources – see if they say the same thing and use the same stats and experts.

Source Triangulation
o validating information by researching, evaluating and comparing multiple sources with the aim of finding credible commonality
o Why Source Triangulation?
▪ move away from the biases of single sources/single methods
▪ identify the extent of trustworthy information and of the information problem
▪ always be critical

Lateral Reading – Definition
o When reading laterally, one leaves a website and opens new tabs along the browser's horizontal axis, drawing on the resources of the Internet to learn more about a site and its claims.
o evaluate what other sources say about the news source you're examining – not the content; if it's credible, read it

NDIA – Digital Inclusion
o Definition: Digital Inclusion refers to the activities necessary to ensure that all individuals and communities, including the most disadvantaged, have access to and use of Information and Communication Technologies (ICTs).
o everyone has equitable access to technology and the digital world
o a digital divide across nations and countries affects the access people have to information and puts some at a disadvantage
o Digital Inclusion must evolve as technology advances. Digital Inclusion requires intentional strategies and investments to reduce and eliminate historical, institutional and structural barriers to access and use technology.
o assess the systemic issue of why certain countries don't have access to technology
o how to provide people with access to AI and technology, so there is no disparity
o Policy Recommendations
1) Government agencies addressing technology and education questions should develop an expansive vision of digital inclusion that includes the use of technologies, with a focus on information literacy initiatives in general, and source triangulation outcomes in particular.
a) information literacy: government agencies should focus on providing communities with information literacy tactics, so people know which information is truthful and which is disinformation
2) Support and complement the work of NDIA by funding efforts to realize information literacy skills via schools, libraries, and other non/for-profit programs to address misinformation and disinformation challenges.
a) not just technology, but information literacy itself
b) not just government: enact it in schools and libraries, where citizens can go to access the information for free
c) not just educating people, but making education accessible by making it free, so there is equitability in information access across countries

Gatekeeping
o Definition: the process through which information is filtered for dissemination, whether for publication, broadcasting, the internet, or some other mode of communication

Internet governance by platforms
o Definition: a growing area of inquiry examines private information intermediaries, such as social media platforms, in enacting global governance via platform design choices and user policies
o two ways in which platform designs restrict content: 1) you can only post what the platform allows you to post

Content moderation
o Definition: the organized practice of screening UGC (user-generated content) posted to internet sites [...] in order to determine the appropriateness of the content for a given site, locality, or jurisdiction.
o With (so much UGC) has come the need for platforms and sites to enforce their rules and relevant or applicable laws, as the posting of inappropriate content is considered a major source of liability.
o The style of moderation can vary from site to site, and from platform to platform, as rules around what UGC is allowed are often set at a site or platform level, and reflect that platform's brand and reputation, its tolerance for risk, and the type of user engagement it wishes to attract.
o not all platforms have the same policies – what is appropriate on one website may not be appropriate on another
o done at the platform level
o not only do platforms make policies about the kind of content allowed – they also have to work with government policy; e.g., TikTok in Canada works with the Canadian government's policy, vs. in China
o Badging
▪ by seeing a checkmark, you know the account is who it says it is
o Removing users
▪ users who violate the policy of the government or the platform
o Directing users
▪ providing users supplementary information and sources to further engage with the topic
o Removing content
▪ content can be taken down if it doesn't obey the law
▪ you may not be removed as a user, but your content may be removed for policy purposes

Content moderation problems
o A problematic job for algorithms
▪ What is an algorithm?
o A problematic job for people
▪ Roberts – lack of mental health resources
▪ moderating social media
▪ burnout and volume
▪ a balanced approach

Algorithm vs.
humans as content moderators
o Roberts argues that there should be a mix of both humans and algorithms: have algorithms flag potentially harmful content and have humans check it over
o Content moderators
▪ not as quick
▪ consequences of engaging with problematic content: emotional tolls due to traumatizing content – moderators need mental health resources
▪ doing it for low wages
▪ may not be transparent about why they allow or don't allow content
▪ burnout: so much content to moderate that they become less effective and less critical of the content they are moderating
o Algorithms
▪ an algorithm is a set of instructions for solving a task, written by people; it then completes the task
▪ instructions a computer follows
▪ analyze millions of data points and give information back
▪ limits: they don't have a nuanced perspective – humans can interpret information, but algorithms lack the same perspective; e.g., if a journalist speaks about terrorism, an algorithm may flag them as pro-terrorism
▪ the goal of algorithms is often to analyze data
▪ they don't understand social norms like humans do
▪ three important components of algorithmic processes: 1) the instructions (the algorithm), 2) the data analyzed, 3) the use of the results

Benefits of social media for journalists
o create a brand, a persona
o create trust
o be unique and creative
o journalists can interact with other journalists
o seek new sources
o metric-based: with a newspaper, a person may have bought the paper with a journalist's story in it, but not for that story; on social media, journalists can know how many people accessed their story

Double-edged sword of aggregators
o News media organizations "simultaneously fear missing out on the massive audiences offered by such platforms but also worry about the long-term trade-offs of allowing technology companies to supersede them as publishers"
o aggregators are websites that collect news stories; ex) Google News just organizes news for the public but isn't doing journalism itself
o aggregators bring popularity to articles but make more money from it – fewer people go to the outlet's website to read a story; instead they read the info from the aggregator
o platforms are now paying journalists

Affecting other readers
o what's being said affects how one processes info
o user ratings suggest a news story is strong
o "most read" stories will be promoted more
o read comments – do they affect the way you think of the news story?
▪ focus on a specific aspect
▪ introduce bias
▪ the quality of comments suggests the story's suitability to one's intellect

Cultural influence on journalism
o what audiences want to read – base decisions on cultural takes
o broader systemic forces play out at the micro-level in news production by shaping the cultural context in which newsmakers make assumptions about who/what is newsworthy and who/what their readers will judge significant or insignificant

What constitutes citizen journalism?
o ordinary community members commenting on issues affecting their communities
o not formally trained
o not employed by a legacy outlet
o generate their own content
o often has a point of view, rather than impersonal detachment
o may conduct "sousveillance"
▪ grassroots holding media and those in power accountable
▪ creating counter-narratives challenging authority
o outside citizen surveillance, as distinct from the surveillance that states and platforms enact
o footage or snapshots of events generated by ordinary citizens
o a bottom-up, grassroots approach from ordinary individuals, rather than the top-down imperatives of professional news reporting

The fourth estate
o journalists act freely without being influenced by other institutions (aka people in power)
o an independent news media or press that operates outside the boundaries of the established societal power structure
o holds powerful people accountable to the public
o represents the interests of the populace rather than the dominant groups, with the independent power to directly and independently challenge those dominant groups
The fifth estate
o citizen journalism as the fifth estate
o a remedy for the fourth estate – holds it accountable
o members of broader publics attempting the work of holding those in power accountable
o members of broader publics surveilling, investigating, reporting and sharing

Citizen journalism's influence
o can threaten factual reporting if purchased by other outlets
o fabricated? shows only part of the story?
o Why do newsrooms use CJ?
▪ cheap to produce
▪ cheap to buy
▪ sometimes the only content available on a story
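
Appendix: the "balanced approach" to content moderation described in the notes above (algorithms flag potentially harmful content, humans review it) can be illustrated with a toy sketch. This is not any real platform's system: the watchlist, the flagging rule, and all function names here are invented purely for illustration. It also shows the three components of an algorithmic process from the notes: the instructions (the flagging function), the data analyzed (the posts), and the use of the results (routing to a human queue).

```python
# Toy sketch of a hybrid (algorithm + human) moderation pipeline.
# The watchlist and flagging rule are invented for illustration only;
# real platforms use far more sophisticated, proprietary systems.

FLAG_TERMS = {"attack", "bomb", "terrorism"}  # hypothetical watchlist

def algorithm_flag(post: str) -> bool:
    """The 'instructions': flag any post containing a watchlist term."""
    words = set(post.lower().split())
    return bool(words & FLAG_TERMS)

def moderate(posts: list[str]) -> dict:
    """The algorithm does the fast first pass over the data;
    flagged posts are routed to a human review queue (use of results)."""
    return {
        "needs_human_review": [p for p in posts if algorithm_flag(p)],
        "auto_approved": [p for p in posts if not algorithm_flag(p)],
    }

posts = [
    "Journalist reports on a terrorism trial",  # flagged despite being
    "Lovely weather today",                     # legitimate reporting
]
result = moderate(posts)
```

Note that the journalist's post lands in the human queue even though it is legitimate reporting: exactly the lack-of-nuance problem from the notes, and the reason Roberts argues humans must stay in the loop rather than relying on algorithms alone.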
