Fake News and Disinformation: A Historical Overview
Summary
This document traces the history of fake news and disinformation, illustrating how these phenomena have existed across various communication methods throughout history. It examines how the development of oral, written, and print media influenced the spread of misinformation.
**CHAPTER 1: Define, explain and understand "Fake News and Disinformation"**

**Vasil Zagorov, ULSIT**

- Fake news and technology: from the past to the present
- Defining fake news and disinformation
- Exploring the impact of misinformation
- Types of mis- and dis-information and infodemic

**FAKE NEWS AND TECHNOLOGY: FROM THE PAST TO THE PRESENT**

The progressive development of computer technologies, internet networks, and AI has made the issue of disinformation and fake news more pressing than ever. The urgency of the problem leads many people to think that it is a modern consequence of technical development. But is it so? According to the eminent historian Robert Darnton: "The concoction of alternative facts is hardly rare, and the equivalent of today's poisonous, bite-size texts and tweets can be found in most periods of history, going back to the ancients" (Darnton, 2017). Disinformation and fake news are an inherent part of humanity's verbal and written communication. From this point of view, it is very important to understand the main technical ways in which information spreads between individuals and groups in society. Looking at the main channels and technical means of communication, we can historically identify four main layers:

- Oral communication
- Manuscript communication
- Print communication
- E-communication

It should be noted that these stages are not mutually interchangeable but are built upon each other as layers, with communication practices and their positive and negative aspects, including disinformation and fake news, existing in different forms. Rumors, intrigue, false notes, books with unverified facts, troll posts on social media, and so on are all part of the same information ecosystem. Each of these technical media has its own characteristics for the dissemination of information.
**Oral communication**

Spoken communication, which uses verbal and gestural means, is filled with communicative intentions that often approach disinformation and fake news. Jokes, lies, slander, rumors, intrigue, unsolicited information, legends (fairy tales), public speeches, and sermons can frequently shape an individual's perspective in ways that, in the medium or long term, are not in their best interest. Oral communication requires direct contact and, in most cases, is limited in its ability to influence groups of people outside a specific community. The dispersion of oral information across larger geographical areas and time periods, owing to the instability of the message, often leads to its distortion, much like the game of 'broken telephone.' Oral communication tends to receive less scholarly attention, but it should be noted that, particularly in sparsely populated areas and subcultural peripheries, the transfer of information, including disinformation, is often carried out by word of mouth. In such cases, fake news can lead to so-called 'witch hunts.' The influence of verbal disinformation is exemplified by the Salem witch trials (1692-1693), a product of verbally delivered slander, in which 19 people were sentenced to hanging (Vaughan, 1997, p. 283).

**Manuscript communication**

Although historical overviews of fake news and disinformation give insufficient attention to handwritten communication, it should be noted that this was the first system to permanently introduce into social circulation facts and information that do not correspond to reality (Hanley & Munoriyarwa, 2021). As a technical means of disseminating text, the manuscript is an extension of oral culture, one that adds accessibility, durability, and circulation among literate segments of society according to their informational needs.
The technique of reproduction from prototype to copy, generating multiple copies each marked by subjective individuality, means that information can be 'trimmed,' supplemented, and manipulated or distorted. In Plato's dialogue "Phaedrus", Socrates, through the story of the Egyptian king Thamus, points out that writing carries the risk of forming opinions about things one has not directly observed (Plato, 274-275). Herodotus also addresses the falsification of facts in written sources, while the Roman writers Lucian and Aulus Gellius explore the topic more seriously. In his works "A True Story" and "How to Write History", Lucian is the first to distinguish at a theoretical level between facts that are part of our surrounding reality and those that are products of artistic invention, reflecting specifically on invented facts presented as part of objective reality (Lucian, 1975; 1959). Particularly important are his conclusions about the reportorial nature of early history, which individuals lacking the necessary preparation often deliberately distorted. Against flattery, exaggeration for patriotic or selfish purposes, excessive rhetorical devices, and distortion of facts through carelessness, he concludes: "...history can have only one task and purpose -- the useful, which can only arise from the truth" (Lucian, 1959). The problem of the public circulation of false, fake, or conspiratorial information is thoroughly examined by Lucian's contemporary, Aulus Gellius. During a stay in Brundisium, he found 'bundles of books exposed for sale' in the harbor market. The books were cheap and shabby but bore the names of famous writers.
Gellius shares his impressions: 'All those books were in Greek, filled with marvelous tales, things unheard of, incredible; but the writers were ancient and by no means authoritative: Aristeas of Proconnesus, Isigonus of Nicaea, Ctesias and Onesicritus, Philostephanus and Hegesias.' Rather than taking the named authors on trust, he analyzes the information in the books and subjects their claims to critical scrutiny, compiling a list of improbable assertions about creatures of strange appearance and habits living in distant lands (Gellius, Book IX, 4). The problem Gellius describes can be related to a situation of infodemic (an excessive amount of information about a problem, typically unreliable, that spreads rapidly and makes a solution harder to achieve) in Roman society. Gellius even accuses Pliny the Elder of quoting foreign words and invoking the names of scientific authorities to present incredible and unverified facts in his *Natural History* (Gellius, Book X, 12).

During the Middle Ages, manuscript communication was strictly regulated by religious canon. Books officially recognized by the church were set against heretical texts, which were widely persecuted and burned. Cases such as the trial of Vasiliy Vrach (1111 AD), the Albigensian Crusade (1209-1229 AD), and the burning of Maya books by Diego de Landa (1562 AD) illustrate the attitude of official information gatekeepers toward those who became information outsiders. Since the technical nature of the manuscript book admits errors from protograph to copy and from copy to copy, combined with a scarcity of sources against which facts could be checked, disinformation was widely spread during the Middle Ages. To mitigate this, church and secular authorities created regulations, such as the religious Rule of St. Benedict and the secular pecia system, to control the work of university scriptoria.
However, the instability of manuscript culture became evident in the 15th and 16th centuries with the advent of printing. The transfer of manuscript versions to printed formats raised questions about the divergence between the author's intent, the distortion of the text over time, and the desires of the end user. This transitional period was associated with significant social contradictions, as exemplified by the Russian Old Believers (Eastern Orthodox Christians who maintain the liturgical and ritual practices of the Russian Orthodox Church as they were before the reforms of Patriarch Nikon of Moscow between 1652 and 1666), a topic that remains relevant even today.

**Print communication**

The Gutenberg press and printing technology from the mid-15th century onwards significantly changed the information landscape, first in Europe and then globally (Eisenstein, 2005). Printing solved the problem of information scarcity, leading to a gradual increase in literacy and the democratization of intellectual processes. However, several new problems also emerged. **Information overload** was noted as early as 1494, when Sebastian Brant mocked the "Literary Fool" in his satirical poem: a person with an enormous, dusty collection of useless books (Brant, 2010). The possibility of rapidly spreading new ideas through cheap, mass-produced printed texts led to shifts in social strata and hierarchies. Consequently, censorship and propaganda emerged as mechanisms to control the spread of foreign ideas while promoting one's own viewpoints. Despite the efforts of secular and ecclesiastical authorities and of the intellectual elites in science and literature, Europe became flooded with political pamphlets, fictional works claiming credibility, mystifications, pseudoscientific papers, and unverified theories that made fake news an integral part of European life.
This situation can be described in terms of **information fabrication** and **information disorder**, concepts used by Julie Posetti and Alice Matthews in "A Short Guide to the History of 'Fake News' and Disinformation" (Posetti & Matthews, 2018). Gradually, mechanisms for controlling the situation began to take shape: in his Novum Organum, Francis Bacon urged readers to stop citing and repeating unverified facts from written sources and to shift toward direct observation of, and conclusions from, the surrounding environment (Beykŭn, 1968). The problem of infodemic and disinformation intensified with the emergence of the periodical press and became critical with the invention of rapid newspaper printing machines in the 1820s and 1830s. The editorial offices of the penny press turned into enormous, insatiable monsters with a continuous need for information. The fabrication of news, including false news, became a business: sensationalism sells, boosts circulation, and enhances the influence of media in society. A notable example is the Great Moon Hoax of 1835.

In geopolitics and daily life, the construction of nation-state concepts played a significant role. Throughout the 19th century, vast streams of periodical and non-periodical publications defended the national causes of various peoples in Europe. From the second half of the 19th century, major political ideologies such as socialism, communism, anarchism, fascism, and National Socialism were added to this mix. While they cannot be classified as fake news, titles like "The Communist Manifesto" and "Mein Kampf" can be characterized as predatory printed publications whose ideas set entire continents against each other and led to the deaths of millions. In their desire to assert historical claims and political ideas on the geopolitical map, national and political ideologies created a vast volume of unverified or deliberately distorted realities.
This leads us to the staging of reality, memorably depicted in George Orwell's novel "1984". Fake news and conspiracies in print, against which censorship mechanisms had been established in socialist countries, literally flooded the states of the former Eastern Bloc after its dissolution. Conspiracy theories, palmistry, phrenology, zodiac signs, and divination books became an inseparable part of the generation that lived through the transition from socialism to democracy. It can be said that fake news, both as a commodity and as a tool for manipulation, is an integral part of the communication cycle of books and printed products. The permeability of the system, high print runs, and low prices led to the mass distribution of propaganda, fake news, **misinformation**, and **disinformation**.

**E-communication**

The development of electricity and technology in the late 19th and throughout the 20th century significantly transformed print communication, introducing audio, visual, and multimedia products that have taken a permanent place in information dissemination and entertainment. Cinema, radio, and later television became key tools for the mass, passive, and effortless reception of news, much of which is veiled manipulation, propaganda, or disinformation. A little over 100 years after the Great Moon Hoax of the New York Sun, a scandal erupted around the panic attributed to Orson Welles' radio play "The War of the Worlds." To this day, it remains disputed whether the broadcast truly caused panic or whether the print media exaggerated reports of it, concerned about the growing influence of the radio industry, which was capturing an ever larger share of the advertising market at the time. Alongside this, radio and television largely displaced print media, which now retains its former influence only in geographic and cultural peripheries. Fake news and disinformation reached entirely different scales and channels of distribution in the context of e-communication.
The ability of anyone to generate and disseminate information online changed the scale of the negative factors associated with the spread of false content. New technical possibilities altered the very mechanisms of information dissemination, while social networks created a new type of communicator and amplified the potential for distortion and manipulation of public opinion. De-institutionalized media, anonymous opinion leaders, scammers, and trolls have taken a permanent place in public discourse, and for much of the literate population the mechanisms for verifying the authenticity of information remain unclear. A novelty of the new e-information environment is multimedia content that can be entirely generated by AI. From harmless memes to sophisticated deep fake images, such content has become a natural part of the information landscape, necessitating significant retraining for library professionals, whose primary task for centuries has been to provide a reliable channel for disseminating accurate, verified, and useful information. According to Julie Posetti and Alice Matthews, the scale and intensity of the problem are unprecedented and could threaten the very significance of journalism as a social phenomenon and a driving force in modern society (Posetti & Matthews, 2018). Conspiracy theories on social networks pose a similar threat to science as a system. Despite the quality-control systems for scientific information, in 2018 the so-called "Audacious Hoax" gained notoriety: three young scholars wrote 20 articles presenting fabricated research, several of which were accepted and published in prestigious journals (Mounk, 2018).

**IMPORTANT:** In technological reproduction, in transfer from medium to medium, in translation, and in modernization, part of the information is often lost. This is the most convenient moment for manipulation: insertions, omissions, or distortions of information. Manipulations may be spontaneous, accidental, or intentional.
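The point above about information loss in reproduction and transfer can be illustrated with a small simulation of our own (not from the source): if each copy in a chain of transfers, from scribe to scribe or medium to medium, miscopies characters at some small rate, distortions accumulate down the chain, much like the 'broken telephone' effect described earlier.

```python
import random

ALPHABET = "abcdefghijklmnopqrstuvwxyz "

def copy_with_errors(text, error_rate, rng):
    # Each character is independently miscopied with probability error_rate.
    return "".join(
        rng.choice(ALPHABET) if rng.random() < error_rate else ch
        for ch in text
    )

def copy_chain(text, copies, error_rate, seed=42):
    # Simulate successive copies: each one reproduces the previous copy,
    # not the original, so errors are inherited and new ones are added.
    rng = random.Random(seed)
    versions = [text]
    for _ in range(copies):
        versions.append(copy_with_errors(versions[-1], error_rate, rng))
    return versions

# A hypothetical message, echoing the du Bourg hoax discussed in Chapter 2.
message = "napoleon is defeated and the bourbons are restored"
chain = copy_chain(message, copies=10, error_rate=0.03)
drift = sum(a != b for a, b in zip(message, chain[-1]))  # characters changed
```

Because each copy takes the previous copy as its source, the final version can drift far from the original even at a low per-copy error rate, which is the structural weakness of manuscript transmission described above.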
**PARADOX:** From a historical perspective, the risk of encountering an environment orchestrated by fake news, disinformation, and propaganda has always been present in the development of civilization. It is logical to pose the question: is it possible that fake news fosters the intellectual flexibility and mental competitiveness that underpin the essence of modern Western societies?

**DEFINING FAKE NEWS AND DISINFORMATION**

Communicators, recipients, communication channels, and technical means of informing form the information ecosystem, or information landscape, of every society. Each communication system is subject to positive and negative influences that facilitate or hinder the dissemination and assimilation of information. Negative factors contribute to what is known as information pollution within the information ecosystem. A fundamental concept that plays a negative role in the distribution and proper use of information is denoted by the term "Fake News". According to the Cambridge Dictionary, fake news is "false stories that appear to be news, spread on the internet or using other media, usually created to influence political views or as a joke" (Cambridge, 2024). Most dictionaries and university resources adhere to similarly straightforward definitions: "False or inaccurate information, shared via traditional media or online, that is presented as a factual news story, image, or video" (Newcastle University, 2024). Broader definitions are also offered, placing the concept within the context of the media environment: "fabricated information that mimics news media content in form but not in organizational process or intent. Fake news outlets, in turn, lack the news media's editorial norms and processes for ensuring the accuracy and credibility of information" (University of Birmingham, 2024). According to the Merriam-Webster Dictionary, the term fake news appeared and saw general use at the end of the 19th century (Merriam-Webster, 2024).
Ireton and Posetti observe that, according to Google Trends data, the term fake news began to be actively searched in the Google search engine in the second half of 2016 (Ireton & Posetti, 2018). A similar analysis in Google Scholar shows a significant increase in academic interest in the subject between 2015 and 2024, and major review papers have emerged that aim to consolidate findings in this area (Buitrago López, Pastor-Galindo, & Ruipérez-Valiente, 2024). Ireton and Posetti note that over time the term fake news has lost its clear meaning and has become politicized, turning it into an enemy of serious journalism. For this reason, and because the term is not comprehensive enough, the UNESCO handbook "Journalism, fake news & disinformation: handbook for journalism education and training" foregrounds the concepts of "misinformation", "disinformation", and "information disorder", emphasizing the differences among disinformation, misinformation, and malicious information. Disinformation is inaccurate, incorrect information created with the intent to harm an individual, social group, organization, or state. Misinformation is false information that is not created with the intent to harm. The third category, malicious information (malinformation), is information based on real facts but likewise used to harm (Ireton & Posetti, 2018, p. 56). Most of the time, talk of 'fake news' conflates two concepts: disinformation and misinformation. Misinformation is false information that the person spreading it believes to be true. Disinformation, by contrast, is false information that the person disseminating it knows to be false: a deliberate lie, spread with intent, by which ill-intentioned individuals actively disinform others.
**Misinformation** is false or misleading information that is disseminated to the public without being verified and confirmed. The establishment of professional and ethical standards in media and science aims to protect the audience from the presentation of information with false or inaccurate content (Ireton & Posetti, 2018).

**Disinformation**, according to Merriam-Webster, is "false information deliberately and often covertly spread (as by the planting of rumors) to influence public opinion or obscure the truth" (Merriam-Webster, 2024). Unlike misinformation, it involves the intentional distortion of information aimed at influencing public opinion for a specific selfish purpose. Some authors, such as Taylor, associate disinformation with false public reports about the state of the economy in the USSR (Taylor, 2016). Disinformation can be observed in the reporting of public funds, in the implementation of significant social and economic reforms, and in political campaigns and marketing strategies that aim to influence people in specific ways (Ireton & Posetti, 2018).

Both concepts lead to **information disorder**, a state in which an individual needs additional preparation to search for reliable information. In communication theory, the terms **communication noise** (general difficulties that hinder the clear and accurate transmission of information) and **information pollution** (contamination of an information supply with irrelevant, redundant, unsolicited, hampering, and low-value information) are used (Orman, 1984). From these, the concept of **staged reality** can also be derived: a reality in which individuals make personal and professional decisions entirely under the influence of disinformation and propaganda. Decisions made in a staged reality are often shaped by products of artistic creativity, such as books, television shows, and films, rather than by real facts.
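The distinctions drawn above between misinformation, disinformation, and malinformation reduce to two axes: whether the content is false, and whether it is spread with intent to harm. As a rough illustration of that grid (the class and function names here are ours, for illustration only, not from the UNESCO handbook):

```python
from dataclasses import dataclass

@dataclass
class Claim:
    is_false: bool        # does the content misrepresent the facts?
    intent_to_harm: bool  # is it spread in order to cause harm?

def classify(claim: Claim) -> str:
    # Map the falsity/intent grid onto the categories discussed above.
    if claim.is_false and claim.intent_to_harm:
        return "disinformation"  # deliberate falsehood, spread knowingly
    if claim.is_false:
        return "misinformation"  # honest error, no intent to harm
    if claim.intent_to_harm:
        return "malinformation"  # real facts weaponised against a target
    return "information"         # accurate and benign
```

For example, `classify(Claim(is_false=False, intent_to_harm=True))` returns `"malinformation"`, matching the handbook's example of true but harmful disclosures.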
According to Ireton and Posetti, a third category besides misinformation and disinformation can be designated as "**malicious information**". This is information based on real facts but used to harm an individual, business, organization, or country: for example, an article that reveals someone's sexual orientation without a justifiable public interest (Ireton & Posetti, 2018).

**EXPLORING THE IMPACT OF MISINFORMATION AND DISINFORMATION**

According to Kan, Pizzonia, Drummey, and Mikkelsen, misinformation can take many forms, ranging from an innocent misrepresentation to a blatant lie. Regardless of intent, the damage that misinformation can cause is undeniable. Consequently, it is crucial to identify the factors that perpetuate fake news and the strategies that can mitigate its influence (Kan et al., 2021). Negligence, emotionality, misconceptions, inaccurate sources, inattention, and technical errors are the main reasons why individuals, social groups, or journalists and media spread incorrect information. Whatever the degree of innocence in the communicator's intent and capabilities, the consequences for the recipient are often significant and can have lasting effects. Measuring the impact of misinformation and disinformation is a challenging task, whose scale can be assessed through statistical methods, surveys, interviews, and experiments. A good example of a study on the impact of harmful information is the research conducted by Kan's team (Kan et al., 2021). The historical development of media shows that editorial offices, political elites, and advertisers have always been interested in the number of readers, listeners, and viewers. Every editorial office evaluates its market share and assesses how its audiences respond quantitatively to a certain type of news. Experience shows that serious, multifaceted, and in-depth topics are often avoided in favor of sensational materials.
It is little known that at the heart of the modern mass press lie lies, sensation, scandal, and deception. An example is the Great Moon Hoax presented by the New York newspaper *The Sun*, which featured the discovery of life on the Moon in its pages in August and September 1835. Although on September 16 of the same year the newspaper admitted the story was a hoax, its circulation had risen from 4,000 to 20,000 and never fell back to previous levels. This made *The Sun* the best-selling newspaper in the world at that time (Burrows, 1999). In contemporary online society, the number of followers is crucial. For this reason, online influencers often provide not the information that is needed and useful, but whatever will bring them a larger following.

**TYPES OF MIS- AND DIS-INFORMATION AND INFODEMIC**

According to Ireton & Posetti, the elements that constitute information disorder include:

- **Satire and Parody**: These forms of art can often be perceived as credible information in a daily life overwhelmed by information.
- **False Connection**: Misleading headlines or visual materials that contain links unrelated to the main content. The most common form of this type of disinformation is the clickbait headline.
- **Misleading Content**: The use of quotes, statistics, or photographs intended to present an issue from a specific angle.
- **False Context**: Recycled materials, taken out of their original context, circulating in the information space. There are numerous examples of reporting elements or photos from past events being used to convey content about the present.
- **Imposter Content**: Articles and media materials signed by or presented on behalf of recognized public authorities, without their actual involvement in the creation of the content.
- **Manipulated Content**: Authentic content that has been technically manipulated to change its viewpoint.
- **Fabricated Content**: Text, audio, visual, or multimedia material that is entirely fabricated with manipulative or commercial intent (Fig. 1).

Detailed examples of these terms can be found in "*A Short Guide to the History of 'Fake News' and Disinformation*" by Julie Posetti and Alice Matthews (2018). Several other concepts complement or expand the context of the terms above. Historically significant is **Propaganda**: a targeted information campaign intended to highlight the advantages of a specific religious doctrine or political ideology while diminishing the opposing viewpoint to the point that it can no longer exert social influence. Another important term is **News Hoax**, a well-planned media deception designed to increase the influence of, and benefit, the editorial team. Also relevant is **Deep Fake**, which refers to well-scripted multimedia products generated by AI that present manipulated statements of real individuals using their actual voices. Superstition, lies, rumors, intrigue, and conspiracy theories also form part of information disorder and information pollution, nuancing the understanding of the problem.

Fig. 1 Seven types of mis- and disinformation

**REFERENCES:**

Antichni romani [Ancient novels] (B. Bogdanov, Comp.). (1975). Sofiya: Narodna kultura.

Beykŭn, F. (1968). *Nov organon* (M. St. Markov, Trans.; A. Bŭnkov, Ed.). Nauka i izkustvo.

Brant, S. (2010). *Korabut na glupcite* [The ship of fools]. Zaharii Stoyanov.

Cambridge Dictionary. (n.d.). *Fake news*. In Cambridge Dictionary.
Retrieved October 18, 2024, from https://dictionary.cambridge.org/dictionary/english/fake-news

Darnton, R. (2017, February 13). The true history of fake news. *The New York Review of Books*. https://www.nybooks.com/online/2017/02/13/the-true-history-of-fake-news/

Eisenstein, E. L. (2005). *The printing revolution in early modern Europe*. Cambridge University Press. (Original work published 1979)

Geliĭ, A. (1985). *Aticheski noshti* [Attic nights] (A. Nikolova, Comp.; V. Atanasov, Trans.). Sofiya: Nauka i izkustvo.

Hanley, M., & Munoriyarwa, A. (2021). *Fake news*. ResearchGate. https://www.researchgate.net/publication/354166421_Fake_News

Buitrago López, A., Pastor-Galindo, J., & Ruipérez-Valiente, J. A. (2024). *Frameworks, modeling and simulations of misinformation and disinformation: A systematic literature review*. arXiv. https://doi.org/10.48550/arXiv.2406.09343

Burrows, E. G. (1999). *Gotham: A history of New York City to 1898* (pp. 524-525). Oxford University Press.

Kan, I. P., Pizzonia, K. L., Drummey, A. B., et al. (2021). Exploring factors that mitigate the continued influence of misinformation. *Cognitive Research: Principles and Implications*, *6*(1), 76. https://doi.org/10.1186/s41235-021-00335-9

Lucian. (1959). *How to write history* (LCL 430). Harvard University Press. https://www.loebclassics.com/view/lucian-how_write_history/1959/pb_LCL430.1.xml

Merriam-Webster. (n.d.). *Disinformation*. In Merriam-Webster.com dictionary.
Retrieved October 18, 2024, from https://www.merriam-webster.com/dictionary/disinformation

Merriam-Webster. (n.d.). *Fake news*. In Merriam-Webster.com dictionary. Retrieved October 18, 2024, from https://www.merriam-webster.com/dictionary/disinformation

Mounk, Y. (2018, October 8). What an audacious hoax reveals about academia. *The Atlantic*. https://www.theatlantic.com/ideas/archive/2018/10/new-sokal-hoax/572212/

Newcastle University. (n.d.). *Fake news*. Retrieved October 18, 2024, from https://libguides.ncl.ac.uk/fakenews

Orman, L. (1984). Fighting information pollution with decision support systems. *Journal of Management Information Systems*, *1*(2), 64-71. https://doi.org/10.1080/07421222.1984.11517704

Platon. (2007). *Fedr* [Phaedrus]. Planeta 3.

Posetti, J., & Matthews, A. (2018). *A short guide to the history of 'fake news' and disinformation*. International Center for Journalists. https://www.icfj.org/sites/default/files/2018-07/A%20Short%20Guide%20to%20History%20of%20Fake%20News%20and%20Disinformation_ICFJ%20Final.pdf

University of Birmingham. (n.d.). *Fake news*. Retrieved October 18, 2024, from https://libguides.bham.ac.uk/c.php?g=652508&p=4577876

Vaughan, A. T. (1997). *The Puritan tradition in America* (p. 283). University Press of New England. ISBN 978-0874518528.
For citation: https://www.degruyter.com/document/doi/10.1515/9783110740202-009/html

**CHAPTER 2: Defining the problem of "fake news"**

**Vasil Zagorov, ULSIT**

**How Misinformation and Disinformation Spread**

Misinformation spreads in the same way as any other type of information. To understand this, one can refer to the basic communication model (Fig. 2) (Ruben, 2001). If the sender has been misled by a previous communicative act, they will spread misinformation. Often there is information noise between the sender and the receiver, which can lead to the misinterpretation of the message, and this misinformation may then enter the public domain. It is therefore the sender's responsibility to verify information before introducing it into public circulation. This is especially true for professionals with significant public responsibilities: politicians, journalists, teachers, and librarians. A sender of disinformation, on the other hand, deliberately acts to manipulate the receiver's opinion. Here, three levels of malicious intent can be distinguished, depending on their frequency: sporadic, spontaneous, and professional.

Fig. 2 Basic model of communication (the most common components of communication models).

The communicative nature of fake news can be illustrated through examples. For this purpose, two cases of financial and journalistic fraud have been selected: the first took place in London in 1814, and the second in Sofia between 1899 and 1902. Their distance in time and geography demonstrates the communicative consistency and systematic stability of disinformation, which does not depend on language, culture, or time. The first case is known as the du Bourg hoax, a situation in which a news story of significant geopolitical importance created the conditions for financial gain on the stock market.
On 21^st^ February 1814, a man in uniform, presenting himself as Colonel du Bourg, aide-de-camp to Lord Cathcart, arrived at an inn in Dover, England, with the news that Napoleon I had been killed and that the Bourbons had been restored to power. He requested that this information be sent to the Admiralty in London via telegraph. After that, "Colonel du Bourg" continued his journey to London, stopping at various inns to spread the news. In London, three men dressed in Bourbon uniforms were also seen celebrating the restoration of the monarchy. Throughout that month, there had already been rumors of Napoleon's defeat. When the news from Dover reached the stock traders in London, government securities quickly rose in value. However, the lack of official confirmation caused prices to drop, only to rise again when reports of the French "officers" celebrating surfaced. Later that day, the government announced that the news of Napoleon's death and the restoration of the Bourbons was false, and stock prices immediately returned to their previous levels. The London Stock Exchange's committee, suspecting manipulation, launched an investigation. It was discovered that on that Monday, over £1.1 million in government securities had been sold, most of it purchased the previous week. Eventually, eight individuals were convicted of conspiracy to defraud, including Lord Cochrane, a radical member of Parliament and well-known naval hero; his uncle Andrew Cochrane-Johnstone; his financial advisor Richard Butt; and Captain Random de Berenger, who had impersonated both du Bourg and one of the "French officers." Six of them were sentenced to twelve months in prison, the most prominent were additionally sentenced to the public pillory, and fines were imposed. Lord Cochrane was stripped of his naval rank and expelled from the Order of the Bath and the House of Commons (Gurney, 1814). 
The case presented above is a classic example of disinformation, planned and executed by its organizers as a theatrical production with the goal of financial gain. The information was passed directly by word of mouth as a rumor. To cover the distance from Dover to London, indirect communication was used via a semaphore system. The recipient at the starting point of the semaphore system was unaware of the context of the message and took its credibility on trust. Lacking the ability to fact-check, they then became a distributor of misinformation. The spread of the fake news in London was confirmed by a second act of disinformation: French officers celebrating. The case also illustrates counteraction: an official government denial and a court case initiated by a professional guild. The second case is different because it involves the journalistic profession, which is called upon to inform people, not to misinform them. The renowned Bulgarian journalist Stefan Tanev tells an intriguing story about the emblematic Bulgarian newspaper publisher Stoyan Shangov, editor of Vecherna Poshta. A colorful and sensational figure, Shangov was regarded by his contemporaries as having an unrivaled commercial instinct, and he is considered the pioneer of sensational journalism in Bulgaria. Between 1899 and 1902, the Second Anglo-Boer War was taking place. Bulgarian society closely followed the events and sympathized with the Boers: firstly because British policy in the Balkans was anti-Bulgarian, and secondly because the Boers were a small, freedom-loving people fighting against a large "evil" empire, a theme relevant in the context of Bulgarian-Ottoman political relations after the restoration of the Bulgarian state in 1878. Shangov received reports from Captain Bozukov, a volunteer in the Boer army. These reports were well received by the public, and one day Shangov decided he could use them to boost the sales of Vecherna Poshta. 
He went to the market in Sofia and bought dried grass of the kind normally used for stuffing furniture. He then printed posters informing the people of Sofia that the next issue of the newspaper would come with a special gift for readers: grass from the blood-soaked heroic battlefields of the Transvaal. The next day, the newspaper was literally snatched up by readers. Rival newspapers exposed the story, but there were no sanctions. In this case, Shangov used genuine news provided by Captain Bozukov to create an advertising campaign that promoted a false story to boost the newspaper's circulation. He achieved financial success and, although exposed, faced no penalties. The negative effect was that, after the corrections in the rival newspapers, readers might lose trust in that particular media outlet. Both cases are emblematic because they cover all levels and technical means of communication and illustrate the broad spectrum of mechanisms through which disinformation is spread. **- Identifying Players and Pressure Points** Every individual develops as a person by gathering and processing information from his or her surrounding environment. Observing the media and public space (including online spaces) reveals several groups of content consumers. These groups can be loosely categorized by age, as well as by their level of education and professional training. **By age, they are:** - **Children up to 7 years of age**: Extremely vulnerable, as they absorb audiovisual information uncritically. - **Children aged 7 to 12**: They can read but still lack criteria for evaluating information. - **Teenagers and young adults aged 13 to 24**: This age group becomes particularly critical of adult actions. Misinformation affecting this group is a significant concern, as individuals form their life perspectives during these years. 
- **Adults aged 24 to 64**: These are the most active information consumers and, consequently, the most frequent targets of fake news. - **Seniors aged 64 and above**: This group is particularly vulnerable, especially to high-tech misinformation such as deepfakes, particularly on topics related to health or quality of life. In terms of education, individuals can be categorized as uneducated, with primary education, with secondary education, with higher education, and with specific professional qualifications. As evident from Berlo's model, factors such as communication skills, attitudes, and knowledge influence how a message is encoded, transmitted, decoded, and perceived (see Fig. 3). The more experienced and knowledgeable the recipient, the harder it is for them to succumb to misinformation and become a victim of disinformation. Most creators of fake news specifically target the mass consumer with limited life experience, competencies, and education. Fig. 3. Berlo's model Every individual is a content creator, and nowadays the parameters of the online environment allow them to be active participants in the creation and distribution of information. Today, many ordinary people like, share, and create their own content, from short skits to personal experiences. This type of communication is particularly fertile ground for spreading misinformation and conspiracy theories about miraculous cures, reptilians, and the flat Earth. Bloggers, vloggers, and influencers are the new heroes of online communication. These individuals continuously generate online content to gain popularity and followers and to become brand ambassadors for specific product lines. Gatekeepers are individuals (usually specialists in a specific field) or organizations that aim to filter information and ensure that only credible and relevant content reaches the group they oversee. 
Within this context, we can also introduce the concept of the "dark gatekeeper": a person who keeps society misled by filtering information. Journalists and publicists are professional categories tasked with ensuring that people are informed according to ethical and professional standards, which include equal access to information, fact-checking, and accurate presentation. Depending on the type of media (print journalism, radio journalism, television journalism, and e-journalism), these fields differ in the technical specifications of their communication channels, but not in professional and ethical standards. A particularly relevant topic in this discussion is the media troll: a figure who operates in the media space to manipulate and distort information. Trolls are individuals who traditionally defend certain political and/or corporate interests while undermining the opposing side's position. Today, entire armies of trolls work for major geopolitical players to sustain their narratives and counter the talking points of their opponents. The informational battleground, as demonstrated by the Bolshevik strategy during the collapse of the Russian Empire in 1917, is just as significant as the actual battlefield. In the last decade, artificial intelligence (AI) has also entered this game. The landscape of content distribution has evolved significantly, leading to various perspectives on how to classify different types of content distributors. There is an ongoing debate among scholars, media professionals, and content creators regarding the most effective ways to distinguish these entities. One common approach is to categorize content distributors into two primary groups: traditional and digital. Traditional content distributors include newspapers and broadcasters, which have long been the cornerstone of news dissemination and information sharing. 
These platforms typically follow established editorial processes, where trained journalists and editors curate content, ensuring adherence to journalistic standards and ethical guidelines. They operate within a framework that values accuracy, accountability, and transparency, aiming to provide reliable information to their audiences. On the other hand, digital content distributors encompass a wide range of platforms, including wikis, blogs, social media networks, search engines, and online news aggregators. Unlike traditional media, these digital entities often prioritize speed and accessibility, allowing users to create, share, and disseminate information with minimal oversight. The rise of digital platforms has democratized content creation, enabling diverse voices to contribute to public discourse. However, this shift has also raised concerns about the quality and reliability of information, as many digital distributors lack rigorous editorial processes. Another perspective emphasizes the distinction based on the editorial processes employed by content distributors. Some advocate for a classification that separates those entities with a defined editorial process, such as newspapers and certain blogs, from those that rely heavily on algorithmic selection to determine the content presented to users. Search engines and many social media platforms utilize complex algorithms to curate content, often prioritizing engagement metrics over traditional editorial standards. This approach can lead to the amplification of sensational or misleading information, raising questions about accountability and the potential impact on public opinion. Ultimately, the debate surrounding content distributors highlights the complexities of modern information dissemination. As the boundaries between traditional and digital media continue to blur, understanding these distinctions becomes crucial for consumers, policymakers, and content creators alike. 
Recognizing the strengths and weaknesses of each type of distributor is essential for fostering a more informed and responsible media landscape. This ongoing conversation invites further exploration into the implications of different content distribution models for society, information integrity, and the role of the audience in shaping media narratives. Globally, UNESCO is attempting to regulate the issues surrounding social media and networks. In 2023, it unveiled an action plan to regulate social media platforms in order to combat online disinformation (UNESCO, 2023). A notable contribution to the field of journalism comes from the non-governmental organization Reporters Without Borders, which publishes the World Press Freedom Index annually (RWB, 2024). Regulation (EU) 2024/1083, also known as the European Media Freedom Act, entered into force on 7^th^ May 2024 as the first EU regulation aimed at protecting media independence and pluralism. The Regulation introduces safeguards against political interference in editorial decisions and the surveillance of journalists. In this context, it establishes common rules to ensure the proper functioning of the internal market for media services and creates the European Board for Media Services (EU, 2024). Each nation also has its own agency or council that regulates the media at the national level and imposes sanctions for the dissemination of fake news. Often, non-governmental organizations, private companies, and "name-and-shame" watchdogs are involved in combating fake news and disinformation. Economic supporters of the media, such as advertisers, foundations, and users, play a key role in the creation and maintenance of media content. Advertisers provide significant financial resources, often determining which media outlets receive more investment depending on their audience and market value. 
Foundations and non-profit organizations also support the media, especially when it comes to independent journalism or projects of public interest. Users, in turn, influence the media environment through subscriptions, donations, or their activity on online platforms, shaping how and what content is produced and offered. However, these economic supporters can also exert pressure on the media by funding or endorsing content that serves their interests, including the dissemination of fake news or manipulated information, with the aim of influencing public opinion or market conditions. This dynamic creates a risk of compromising editorial independence and the integrity of information. **Assignment 1:** Try to simulate a communication situation involving the dissemination of fake news using Berlo's model. **Assignment 2:** Prompt an AI to write a short fake news piece with parameters of your choice. Try to identify its elements. **REFERENCES** Berlo, D. K. (1960). *The process of communication: An introduction to theory and practice*. Holt, Rinehart and Winston. European Union. (2024). *Regulation (EU) 2024/1083 of the European Parliament and of the Council of 11 April 2024 establishing a common framework for media services in the internal market and amending Directive 2010/13/EU (European Media Freedom Act) (Text with EEA relevance)*. EUR-Lex. [[https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=OJ%3AL\_202401083]](https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=OJ%3AL_202401083) Gurney, W. B. (1814, June). *The trial of Charles Random de Berenger, Sir Thomas Cochrane et al* (2007 ed.). Project Gutenberg. https://www.gutenberg.org/ebooks/600 (Original work published 1814). Reporters Without Borders. (2024). *2024 World Press Freedom Index: Journalism under political pressure*. [[https://rsf.org/en/2024-world-press-freedom-index-journalism-under-political-pressure]](https://rsf.org/en/2024-world-press-freedom-index-journalism-under-political-pressure) Ruben, B. D. 
(2001). Models of communication. In J. R. Schement (Ed.), *Encyclopedia of communication and information*. Macmillan Reference USA. Tanev, S. (1994). *Otvoreni pisma: Spomeni i izpovedi na glaven red. na v. Utro pisani v Tsentralen zatvor* (A. Benbasat, Predg.). Universitetsko izdatelstvo "Sv. Kliment Ohridski". UNESCO. (2023, November). Online disinformation: UNESCO unveils action plan to regulate social media platforms. [[https://www.unesco.org/en/articles/online-disinformation-unesco-unveils-action-plan-regulate-social-media-platforms]](https://www.unesco.org/en/articles/online-disinformation-unesco-unveils-action-plan-regulate-social-media-platforms) **CHAPTER 3: Psychological Factors Influencing Misinformation** **Gabriela Angelova, ULSIT** **Summary**: In this part of the manual, librarians will learn more about cognitive biases and how they affect belief formation and decision-making processes. The chapter explains the most common information-processing and social biases. It offers suggestions on how to address the problem professionally when a user's question is clearly influenced by bias, along with examples of tools and communication strategies for gently confronting a bias without insulting the person, and for presenting information objectively while minimizing the influence of cognitive biases in media messages as much as possible. **Cognitive biases and their role in belief formation** Cognitive biases affect the delivery and effectiveness of library services, impacting both librarians and readers. These biases shape how individuals seek, interpret, and evaluate information, in ways that can lead to misinformation, poor decisions, and limited insight. Understanding how these biases affect critical thinking is essential for librarians helping their readers engage with information. 
In order to counter such problematic behaviors, the library specialist should be able to distinguish the most common types of cognitive bias related to information consumption. As a leading library association, the International Federation of Library Associations and Institutions (IFLA) puts emphasis on biases in one of its most popular infographics. The simple yet well-defined rules IFLA suggests are useful for librarians when guiding users to select information carefully ([[IFLA]](https://repository.ifla.org/items/c5dbfa34-7d00-47ac-8057-0183a5056438/full), 2017). ![](media/image4.jpg) ***Confirmation Bias*** Confirmation bias is the tendency to search for, interpret, and recall information in a way that confirms one's existing beliefs, while paying little to no attention to alternative information. People are more inclined to accept information without question when it supports their ideas; conversely, information that contradicts their views produces cognitive dissonance (Festinger, 1957). In the context of fake news, confirmation bias plays a major role in the polarization of beliefs. Multiple studies have shown that people are more likely to believe and share misinformation if it aligns with their political, ideological, or health-related views. A study by Garrett (2009) suggests that people selectively read news that supports their political preferences, a behavior that may lead to encountering and believing partisan misinformation. Studies by Pennycook and Rand (2018; 2021) also demonstrate that individuals with rigid political beliefs are more susceptible to fake news that matches their ideology. A study by Taber and Lodge (2006) showed that people with strong political beliefs often interpret and distort new information in ways that confirm their viewpoints. 
This information behavior can create situations in which librarians must deal with readers who have a strong tendency to favor information that aligns with their existing ideas. In practice, confirmation bias shapes a favorable environment (so-called *echo chambers*) on social media platforms, where people are systematically exposed to information that reinforces their beliefs. This limited and harmful information background makes it harder for them to distinguish truthful from false information. It also diminishes the effect of debunking, as people tend to avoid contradicting themselves and questioning their own judgement. *Know-how for librarians*: When a user seeks confirmation of a specific viewpoint, the librarian should guide them toward balanced resources. Additionally, by fostering curiosity about alternative perspectives, librarians can help readers reduce the influence of confirmation bias on their information-seeking behavior. By encouraging diverse source selection, librarians promote multiple perspectives that challenge the reader's assumptions in cases of strongly biased reasoning. When conducting reference interviews, librarians can ask open-ended questions that help identify bias patterns (so-called Socratic questions) and gently lead users toward more balanced information. ***Anchoring Bias*** Anchoring bias is the cognitive bias that causes a person to rely on the first piece of information found (the "anchor") when making judgments, even if it is not relevant to the situation. This information consumption behavior is linked to risky decisions based on thin, insufficient, and superficial arguments. The *anchoring bias* is part of the broader heuristics-and-biases research program of Amos Tversky and Daniel Kahneman (Tversky & Kahneman, 1974). Their work demonstrated that people often use mental shortcuts to make decisions under uncertainty. 
Such individuals are prone to scams, impulsive actions, and advertising tricks, as their approach is naive. In the context of library research, *anchoring bias* can affect how readers evaluate information sources. If a reader finds a persuasive but misleading article at an early stage of research, they may assign it far more weight than it deserves compared to more credible sources found later. This can distort their overall judgement of the problem, lead them to absorb misinformation from less relevant resources, and result in wrong decisions. One of the clearest examples of *anchoring bias* is reliance on the first page of Google search results, regardless of whether those results are accurate, biased, advertisements, or clickbait. Amid the information *flood*, we are prone to accept the media messages that are tempting and easy to grasp. As a result, some individuals never consider developing a more patient and insightful manner of information retrieval, and this is one of the biggest challenges for librarians. *Know-how for librarians*: Offering instruction on reliable databases and advanced search strategies helps readers survey a broader range of sources. Librarians can guide users to look beyond the first page of results and to try flexible ways of reaching the information they need. Another crucial skill is offering guidance on source evaluation using carefully and individually chosen criteria. As librarians, we must not forget that every individual needs a different approach, and sometimes we must start from the fundamentals to spark curiosity and cultivate sustainable habits and healthy skepticism toward sources. *More to know*: *Confirmation bias* and *anchoring bias* are considered among the most commonly experienced information-processing biases. 
Along with them, people encounter other cognitive tricks, such as *Selective Exposure Bias* and the *Availability Heuristic.* *Selective Exposure* is sometimes confused with other biases in the information-processing category, and indeed there are similarities. The main difference lies in the timing of information consumption. *Selective Exposure* occurs consciously: an individual seeks out and reads only information that matches his or her existing beliefs. *Confirmation bias*, by contrast, is a mental habit that filters information so that only the arguments supporting one's core beliefs are gathered, remembered, and recalled after reading any content. The following situation illustrates how the two biases work differently: A library user fixated on conspiracy theories would read only conspiracist literature and avoid other sources, especially contradictory ones (*Selective Exposure*). Another conspiracy-prone person would read diverse information but retain only the arguments and evidence that support a conspiracist mentality (*Confirmation Bias*). Another commonly seen perception bias is the *Availability Heuristic*, a mental shortcut for making judgements, guesses, and statements based on how easily examples come to mind. The individual tends to make decisions and form beliefs based on recently recalled events rather than on objective data (Tversky & Kahneman, 1973; Schwarz, Bless, Strack, Klumpp, Rittenauer-Schatka & Simons, 1991). It is important to note that this bias should not be confused with experience-based learning outcomes, such as *learning by doing.* Individuals with a tendency toward the *Availability Heuristic* are prone to judge through random examples that are not connected to their own expertise. 
***The Dunning-Kruger Effect*** The *Dunning-Kruger effect* is a concept that explains why individuals with a mediocre background in a given sphere tend to overestimate their competence, while experienced specialists are more prone to underestimate their expertise (Kruger & Dunning, 1999). The explanation is that people with deeper knowledge are aware of the complexity of the subject and avoid overstating their own proficiency. This cognitive bias derives from a lack of self-reflective assessment of one's own abilities. In their original study, Justin Kruger and David Dunning (1999) tested a group of participants on various tasks related to humor, language, and logic. They found that participants in the lowest quartile of performance were more likely to rank themselves as significantly above average. In contrast, the higher-performing participants gave more accurate self-assessments and even lowered their expectations. This opposite behavior is common among experts in a field and is called *Imposter Syndrome*. *Know-how for librarians*: Librarians face a serious challenge when communicating with readers who show traits of the *Dunning-Kruger effect*. To help them gain deeper knowledge, we must create a non-judgmental environment, actively listen to their request, and use non-confrontational language in order to avoid backlash. A librarian should delicately encourage the discovery of new facts by using open-ended questions and offering more advanced sources. Librarians can subtly introduce the concepts of lifelong learning, self-assessment of knowledge gaps, and collaboration with other people without overwhelming the user. ***The Illusory Truth Effect and Misinformation*** The *illusory truth effect* is another type of cognitive bias, which may appear when a person is repeatedly exposed to false information. 
In line with the well-known saying that "a lie repeated many times becomes the truth", information specialists know that the chance people will believe a lie increases if they hear it over and over again (Hasher, Goldstein, & Toppino, 1977). An interesting study examines how the *illusory truth effect* contributes to the wider spread of misinformation, showing that people mistakenly accepted fake information as genuine even when they knew it was false at first (Vellani, Zheng, Ercelik, & Sharot, 2023). Libraries are increasingly challenged by the widespread dissemination of fake news on social media and other digital hubs. Since social media is used daily, in contrast to relatively sporadic visits to the local library, users may be hard to convince that they have been exposed to bad information. Like any other cognitive bias, the *illusory truth effect* is stubborn and hard to debunk. What we can do as librarians is introduce users to fact-checking tools and quality sources. Libraries can also build collections of credible literature that contradicts popular misinformation, and they can organize media literacy workshops devoted to common local falsehoods, guiding the group until the myth is neutralized. Cognitive biases thus play a significant role in how people process information and recognize misinformation. By understanding them, librarians are better prepared to address biases among their users through strategic communication and by helping society distinguish the nuances and ambiguities of the information landscape. **The psychology of misinformation consumption (doom scrolling)** Misinformation is a major social problem that affects public opinion and safety. In this part of the handbook, the psychological mechanisms that underlie the consumption of misinformation are discussed. Factors related to emotional responses and social influence are examined through the lenses of social psychology and media studies. 
One of the librarian's enduring roles is to understand why people consume and believe illegitimate news, and how psychological factors contribute to engagement with the flow of misleading information around us. Disinformation and misinformation have always existed but have gained new prominence with the rise of digital media. Their rapid transmission on social media platforms has had profound consequences for society, including public health crises (such as COVID-19), political disruption, and a loss of trust in authorities (Lewandowsky et al., 2017). ***Cognitive Mechanisms and Emotional Drivers Behind the Spread of Disinformation (Doom Scrolling)*** *Information Load. The Role of Fear and Anxiety* The huge amount of information in digital media often overwhelms people's cognitive capacity. Under heavy information load, people are less likely to question news, facts, statements, etc. Instead, they rely on illegitimate cues, such as the number of shares or likes a post has received (Pennycook & Rand, 2018). This overload can reduce the ability to distinguish between what is real and what is not. Misinformation often plays on negative emotions such as fear and anxiety. In moments of worldwide public crisis, people may turn to misinformation to find an easier explanation for a chaotic environment (Vosoughi et al., 2018). Misinformation that triggers anxiety is more powerful because it is eye-catching and produces an immediate emotional response. Anger and dissatisfaction are also significant motivators of misinformation consumption. Studies show that misinformation that provokes anger is much more likely to be reposted on social media, as people treat it as a way to warn others of imminent dangers (Vosoughi et al., 2018). This dependence can create an *echo chamber effect*[^2^](#fn2){#fnref2.footnote-ref}, where emotionally charged misinformation is virally shared. 
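The anger-driven sharing pattern described above can be sketched as a toy simulation. This is a minimal illustration, not any platform's actual algorithm: the post names, the `share_probability` model, and all probability values are invented assumptions chosen only to show how even a modest link between emotional charge and sharing likelihood lets charged content dominate.

```python
import random

random.seed(42)  # reproducible toy run

# Toy posts as (id, emotional_charge in 0..1); values invented for illustration.
posts = [("calm_report", 0.2), ("neutral_update", 0.4), ("angry_rumor", 0.9)]

def share_probability(charge, base=0.10):
    """Assumed model: sharing likelihood grows with emotional charge."""
    return min(1.0, base + 0.6 * charge)

shares = {pid: 0 for pid, _ in posts}
for _ in range(10_000):  # 10,000 simulated feed impressions
    pid, charge = random.choice(posts)
    if random.random() < share_probability(charge):
        shares[pid] += 1

# The highest-charge post accumulates the most shares, mirroring the
# amplification of emotionally charged misinformation described above.
print(sorted(shares.items(), key=lambda kv: -kv[1]))
```

Readers can vary the charge values or the `base` rate to see how quickly (or slowly) the gap between calm and inflammatory content opens up.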
*Doom Scrolling as a Consequence* Doom scrolling is "the activity of spending a lot of time looking at your phone or computer and reading bad or negative news stories" ([[Cambridge Dictionary Online]](https://dictionary.cambridge.org/dictionary/english/doomscrolling)). The obsessive consumption of negative news on social media is a common behavior in today's everyday life, especially in times of stress (health concerns, political elections, societal problems, etc.). This behavior is driven by several mental factors that provoke obsessive browsing for disturbing content. A main consequence of doom scrolling is its reinforcement of existing fears and anxieties, which can increase vulnerability to misleading information (Montag & Elhai, 2020). The content discovered during doom scrolling weakens individuals' judgment and makes them rely more readily on negative examples (Tversky & Kahneman, 1973). To maximize engagement, social media algorithms reinforce this by continuously delivering content that aligns with users' emotional condition, which results in selective exposure bias (Iyengar & Hahn, 2009). Research shows that long exposure to negative news can result in the formation of false pictures of reality. This is especially concerning with regard to misinformation, because repeated exposure to harmful information triggers the illusory truth effect (Hasher, Goldstein, & Toppino, 1977). The more often individuals are exposed to fake news during their doom scrolling sessions, the more likely they are to accept it as fact, especially if it confirms their existing viewpoints and fears (Pennycook & Rand, 2021). Doom scrolling also eases the spread of misinformation by increasing proneness to the cognitive biases that distort judgment. *The Role of Social Media Algorithms* Digital platforms amplify content, including misinformation, through algorithms designed to maximize engagement. These algorithms prioritize sensational and emotionally charged content (Friggeri et al., 2014). 
Because of these algorithms, people are repeatedly exposed to the same triggering content, which forms so-called *filter bubbles* (ideological frames) or the above-mentioned *echo chambers* ([[American Library Association, 2020]](https://www.ala.org/sites/default/files/tools/content/%21%20Media-Lit_Prac-Guide_FINALWEB_112020_0.pdf)). Facebook researchers have measured how much exposure people get to different types of news and information on social media. Their study showed that people's own content choices do more than Facebook's News Feed algorithm to limit exposure to cross-cutting content. Selective exposure to specific content leads users to form *echo chambers*, in which external and contradictory accounts are ignored. Furthermore, the lack of experts mediating the production and diffusion of content encourages speculation, rumor, and mistrust, especially on complex issues (Zollo & Quattrociocchi, 2018).

**Strategies for Addressing Cognitive Biases in Information Evaluation**

Addressing cognitive biases in information evaluation is critical for librarians who want to reduce misinformation and strengthen users' critical thinking skills. Cognitive biases distort how individuals process information, but several strategies can help users recognize and mitigate them.

***Promoting Media Literacy Education to Cultivate Self-Awareness***

Libraries can go beyond one-on-one interventions by organizing workshops and offering resources that address common misinformation topics. Media literacy training, which covers skills such as source evaluation, recognizing logical fallacies, and detecting fake news, clearly helps users identify and reject misinformation.

*Example*: A library may organize a workshop on "Debunking Popular Conspiracy Theories" to teach participants how to dismantle false arguments such as "the Earth is flat." 
By teaching participants to distinguish between conspiracy theories and scientific data, librarians encourage them to fact-check the information they encounter.

*Example*: The library creates a resource guide on "How to Identify Fake News." The guide includes tips on spotting harmful websites, checking for authority, and using reverse image searches to verify content. Users who encounter sensational news can apply these tools to look for visual evidence.

*Example*: A story shared on social media claims that information about a "secret cure" for some disease can be found at your library. The librarian should refer the user to scientific health databases such as PubMed, or to literature from the WHO (World Health Organization), where the claim can be checked.

Librarians also need to teach the signs that distinguish a fake news site from a genuine one. For instance, websites ending in .edu (educational), .gov (government), and .org (non-profit organizations) are generally more trustworthy than websites ending in .com or .net.

A librarian should help users spot their own biases in order to reduce the risks of misinformation. To foster self-awareness, librarians can use several approaches that help users view information with a critical eye. Communication strategies such as Socratic questioning, built on open-ended questions, guide readers toward self-reflection and deeper analysis. By design, this process makes them re-examine their assumptions without attacking what they already believe, and in doing so makes the recognition of bias feel natural rather than defensive.

*Example*: A conspiracy theorist may be convinced that the white lines airplanes leave behind are "chemtrails" released by government agencies to reduce the population in a secret biological program. Rather than deny this directly, the librarian may ask something like "What scientific evidence supports this theory?" 
or, "Have you found credible sources that offer a different explanation of what we see?" These questions prompt the person to interrogate their source material and loosen confirmation bias through exploration rather than direct confrontation.

*Example*: A user may believe that drinking water from plastic bottles can change their DNA. The librarian can ask Socratic questions such as, "Have you read any peer-reviewed research supporting this claim?" or "What would convince you that this information isn't truthful?" These interventions help the user ask more questions and seek information from scientific sources.

***Introducing Fact-Checking Tools and Advanced Search Strategies***

Promoting fact-checking tools among library users is a useful strategy for combating misinformation. A variety of online tools, including fact-checking websites and browser extensions, help users verify information and reduce the impact of cognitive biases such as the illusory truth effect.

*Example:* A person exposed to a steady flow of misinformation about the Bermuda Triangle, or to statements such as "the ancient pyramids were constructed by extraterrestrial beings," may begin to accept such illusory claims despite initial skepticism. The librarian may introduce them to fact-checking tools such as Snopes or the [[FactCheck.org]](https://www.factcheck.org/) website, which publish researched articles that debunk myths. Alternatively, the librarian could suggest [[NewsGuard]](https://www.newsguardtech.com/), a browser extension that assesses the credibility of news sources, or [[Google Fact Check]](https://toolbox.google.com/factcheck/explorer/search/list:recent;hl=en), which enables users to verify articles against reliable sources.

To help users avoid anchoring bias, the tendency to lean too heavily on the first piece of information found, librarians can show them how to dig deeper into their research. 
For instance, a participant looking into a controversial topic might rely on the top few Google results, which are not always accurate or impartial. Librarians can step in and teach them to use advanced search operators such as "AND," "OR," and "NOT" to narrow or broaden a search. They might also point participants toward academic databases like JSTOR or Google Scholar. In this way, users are encouraged to seek out higher-quality information and critically assess a broader range of sources.

***Encouraging Critical Evaluation of Sources***

In addition to providing fact-checking tools, librarians can offer guidance on the critical evaluation of sources. By teaching users to assess the credibility, bias, and accuracy of information, librarians help them acquire long-term competencies for detecting and refuting misinformation.

*Example*: A library user requests information on the purported "healing power of crystals," citing a blog as the source of this belief. Instead of immediately rejecting the claim, the librarian can direct the user to scientific journals and articles that investigate the psychosomatic effects of belief in alternative therapies, along with studies on the placebo effect. The user can then critically compare the legitimate information with the claims made in the blog and so reach a more balanced perspective.

*Example*: A user frequently searches for articles claiming benefits of unconventional dietary regimes, such as a prolonged "fruit detox" diet. The librarian can show how to assess the qualifications of the authors, check whether the studies cited underwent peer review, and look for reviews from reputable health institutions. The user can then apply these evaluation skills to future health-related content and avoid accepting dubious claims and misinformation. 
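Returning to the advanced search strategies above: the boolean operators "AND," "OR," and "NOT" can also be assembled programmatically. The sketch below uses a hypothetical helper, `build_query`, to combine terms; the `-"term"` exclusion syntax is how many search engines express NOT, though exact operator support varies by engine.

```python
# Hypothetical helper that combines search terms with boolean operators.
# Required terms are joined with AND, alternatives with OR, and excluded
# terms use the minus prefix many search engines accept as NOT.

def build_query(required, any_of=None, excluded=None):
    parts = [" AND ".join(f'"{t}"' for t in required)]
    if any_of:
        parts.append("(" + " OR ".join(f'"{t}"' for t in any_of) + ")")
    query = " AND ".join(parts)
    if excluded:
        query += " " + " ".join(f'-"{t}"' for t in excluded)
    return query

q = build_query(["climate change"],
                any_of=["mitigation", "adaptation"],
                excluded=["blog"])
# '"climate change" AND ("mitigation" OR "adaptation") -"blog"'
```

Pasting such a structured query into Google Scholar or a library database narrows results far more predictably than a plain keyword search.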
***Bibliography***

Albarracín, D. (2020). Conspiracy beliefs: Knowledge, ego defense, and social integration in the processing of fake news. In The Psychology of Fake News (pp. 196-219). Routledge.

Lewandowsky, S., Ecker, U. K., & Cook, J. (2017). Beyond Misinformation: Understanding and Coping with the "Post-Truth" Era. Journal of Applied Research in Memory and Cognition, 6(4), 353-369.

Beauvais, C. (2022). Fake news: Why do we believe it? Joint Bone Spine, 89(4), 105371.

Bowes, S. M., & Tasimi, A. (2022). Clarifying the relations between intellectual humility and pseudoscience beliefs, conspiratorial ideation, and susceptibility to fake news. Journal of Research in Personality, 98, 104220.

Bowman, N. D., & Cohen, E. (2020). Mental shortcuts, emotion, and social rewards: the challenges of detecting and resisting fake news.

Escolà-Gascón, Á., Dagnall, N., Denovan, A., Drinkwater, K., & Diez-Bosch, M. (2023). Who falls for fake news? Psychological and clinical profiling evidence of fake news consumers. Personality and Individual Differences, 200, 111893.

Fazio, L. K. (2020). Repetition Increases Perceived Truth Even for Known Falsehoods. Cognitive Research: Principles and Implications, 5(1), 44.

French, A. M., Storey, V. C., & Wallace, L. (2023). The impact of cognitive biases on the believability of fake news. European Journal of Information Systems, 1-22.

Friggeri, A., Adamic, L. A., Eckles, D., & Cheng, J. (2014). Rumor Cascades. Proceedings of the Eighth International Conference on Weblogs and Social Media (ICWSM), 101-110.

Gagliardi, L. (2023). The role of cognitive biases in conspiracy beliefs: A literature review. Journal of Economic Surveys.

Garrett, R. K. (2009). Politically motivated reinforcement seeking: Reframing the selective exposure debate. Journal of Communication, 59(4), 676-699.

Gjoneska, B. (2021). 
Conspiratorial beliefs and cognitive styles: An integrated look on analytic thinking, critical thinking, and scientific reasoning in relation to (dis)trust in conspiracy theories. Frontiers in Psychology, 12, 736838.

Guess, A., Nyhan, B., & Reifler, J. (2020). Exposure to Untrustworthy Websites in the 2016 US Election. Nature Human Behaviour, 4(5), 472-480.

Halpern, D., Valenzuela, S., Katz, J., & Miranda, J. P. (2019). From belief in conspiracy theories to trust in others: Which factors influence exposure, believing and sharing fake news. In Social Computing and Social Media. Design, Human Behavior and Analytics: 11th International Conference, SCSM 2019, Held as Part of the 21st HCI International Conference, HCII 2019, Orlando, FL, USA, July 26-31, 2019, Proceedings, Part I 21 (pp. 217-232). Springer International Publishing.

Hasher, L., Goldstein, D., & Toppino, T. (1977). Frequency and the conference of referential validity. Journal of Verbal Learning and Verbal Behavior, 16(1), 107-112.

Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one's own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121-1134.

Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175-220.

Pennycook, G., & Rand, D. G. (2018). The Implied Truth Effect: Attaching Warnings to a Subset of Fake News Increases Perceived Accuracy of Stories Without Warnings. Management Science, 66(11), 4944-4957.

Pennycook, G., & Rand, D. G. (2021). The Psychology of Fake News. Trends in Cognitive Sciences, 25(5), 388-402.

Schwarz, N., et al. (1991). Ease of retrieval as information: Another look at the availability heuristic. Journal of Personality and Social Psychology, 61(2), 195-202.

Soon, C., & Goh, S. (2018). Fake news, false information and more: Countering human biases. Institute of Policy Studies (IPS) Working Papers, 31. 
Taurino, A., Colucci, M. H., Bottalico, M., Franco, T. P., Volpe, G., Violante, M., ... & Laera, D. (2023). To believe or not to believe: Personality, cognitive, and emotional factors involving fake news perceived accuracy. Applied Cognitive Psychology, 37(6), 1444-1454.

Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5(2), 207-232.

Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124-1131.

Van der Linden, S., Panagopoulos, C., & Roozenbeek, J. (2020). You are fake news: Political bias in perceptions of fake news. Media, Culture & Society, 42(3), 460-470. [[https://doi.org/10.1177/0163443720906992]](https://doi.org/10.1177/0163443720906992)

Vellani, V., Zheng, S., Ercelik, D., & Sharot, T. (2023). The illusory truth effect leads to the spread of misinformation. Cognition, 236, 105421. https://doi.org/10.1016/j.cognition.2023.105421

Vosoughi, S., Roy, D., & Aral, S. (2018). The Spread of True and False News Online. Science, 359(6380), 1146-1151.

**CHAPTER 4: Fact-checking techniques**

**Gabriela Angelova, ULSIT**

This section outlines essential fact-checking techniques that librarians can use to teach and assist their users. By equipping users with tools to critically evaluate online information, librarians promote informed decision-making and help make the online environment safer for the communities they serve. Because facts are objective while opinions are subjective, it is crucial to check and verify the quality of the information we consume. According to the [[UNESCO Handbook]](https://unesdoc.unesco.org/ark:/48223/pf0000265552), there are two types of fact-checking: preventative fact-checking, done before publication, and post-publication fact-checking. 
Given the pace of today's information industry, harmful content can rarely be stopped before it spreads; this chapter therefore focuses on how to help users protect themselves from potential harm in the online environment.

**Internet Scams and Deceptive Tactics**

The internet is rife with scams and deceptive tactics aimed at tricking users into sharing personal information or believing falsehoods. Librarians should teach users how to identify these scams so they do not fall prey to online manipulation.

*Phishing*: Phishing involves fraudulent messages that appear to come from legitimate sources, tricking users into revealing personal information such as passwords or credit card numbers. These messages often contain subtle cues, such as unusual URLs or generic greetings ([[Federal Trade Commission Consumer Advice]](https://consumer.ftc.gov/articles/how-recognize-and-avoid-phishing-scams)).

*Deepfakes*: Deepfake technology uses AI to create highly realistic but fake videos or audio, often to discredit public figures or fabricate events. Deepfakes can be difficult to detect without specialized tools, making it essential to educate users about their existence and how to verify the authenticity of video content ([[Encyclopedia Britannica]](https://www.britannica.com/technology/deepfake)).

*Social Engineering*: Social engineering scams exploit human psychology to manipulate users into divulging confidential information. Scammers may impersonate trusted authorities or create a sense of urgency to pressure individuals into making quick, uninformed decisions. Users should be taught to recognize common social engineering tactics and to verify requests before responding. The best way to limit the influence of social engineering is to set security policies that make malicious intrusion harder. 
Some of the most effective protections are: password management (set a strong password, change it regularly, and keep it private); multi-factor authentication (require additional verification when accessing social profiles and e-mail); and e-mail security with anti-phishing defenses (filters that route suspected phishing messages to the spam folder) ([[Cisco]](https://www.cisco.com/c/en/us/products/security/what-is-social-engineering.html)).

*Clickbait*: Clickbait refers to sensationalized or misleading titles designed to lure users into clicking through to unreliable or harmful websites. Teaching users to critically evaluate headlines can reduce the temptation to open such content ([[Sprinklr Social Media Glossary]](https://www.sprinklr.com/social-media-glossary/clickbait/)).

**Criteria for Assessing the Reliability of Sources**

One of the fundamental aspects of fact-checking is the ability to assess the reliability of sources. The following criteria can serve as guidelines for library users when evaluating information sources:

- *Authority*: Does the author have credentials or expertise in the subject area? Reliable sources are often written by subject-matter experts, scholars, or organizations with a reputable standing in the field. For instance, an article on climate science by a renowned climatologist is more authoritative than one by a layperson or an individual without relevant credentials.

- *Purpose and Objectivity*: What is the purpose of the source? Is it to inform, persuade, entertain, or sell something? A reliable source is usually objective and provides balanced information rather than promoting a specific agenda. Library users should be cautious of sources that exhibit bias, including political, ideological, or commercial motivations.

- *Publication Date*: Is the information current? Some topics, especially in fields such as science, technology, and health, require up-to-date information. 
For example, scientific discoveries or health guidelines may change over time, so it is essential to check when the source was published. - *Publisher and Website Domain*: Who is responsible for publishing the content? Trusted publishers include peer-reviewed journals, academic institutions, and well-known media outlets. Websites with domain extensions such as ".gov" (government), ".edu" (educational institutions), and ".org" (nonprofit organizations) tend to be more reliable compared to those with ".com," which may be driven by commercial interests. - *Evidence and References*: Does the source provide evidence for its claims? Reliable sources often include citations, references, or links to other reputable works. The inclusion of data, statistics, and research-backed findings strengthens the credibility of a source. - *Cross-Verification*: Can the information be corroborated by other reputable sources? Reliable information can often be verified through multiple independent sources. If several trustworthy sources provide consistent information, this increases the likelihood of its accuracy. ***The CRAAP Method*** The CRAAP method, developed by California State University, is a widely used framework for evaluating the reliability of information sources. It provides library users with five key criteria: - *Currency*: Is the information up-to-date? Depending on the topic, the timeliness of information can significantly impact its usefulness. This is especially important in fields such as technology, medicine, and science, where information can quickly become outdated. - *Relevance*: Does the information meet the needs of the user? Users should assess whether the information is appropriate for their research and if it covers the topic adequately in depth and scope. - *Authority*: Who is the author or publisher of the source? The credibility of the source is often tied to the qualifications and expertise of its creator. 
Information from experts in the field or reputable organizations holds more weight than content from unknown or non-expert sources.

- *Accuracy*: Is the information supported by evidence? Users should be encouraged to verify the information by cross-referencing it with other sources and checking whether the claims are backed by data, statistics, or research.

- *Purpose*: What is the purpose of the source? Users should evaluate whether the source is intended to inform, persuade, entertain, or sell something. A reliable source typically offers balanced information without pushing a particular agenda or exhibiting excessive bias (Blakeslee, 2004).

The CRAAP framework has served as the basis for information literacy games such as [*[Information Trap Manager]*](https://www.navigateproject.eu/itm/) and [*[The Navigator]*](https://www.navigateproject.eu/navigator/) (see the [[Navigate Project]](https://www.navigateproject.eu/)).

Teaching users effective fact-checking techniques is critical for ensuring they can accurately assess information. These techniques empower users to verify claims across multiple contexts and platforms, a skill also known as *transliteracy* (see the [[TLIT4U Project]](https://translit-eu.unibit.bg/)).

***RADAR Approach***

The RADAR Approach is another framework designed to help users critically evaluate online sources by focusing on five criteria: Relevance, Authority, Date, Accuracy, and Reason for writing. It combines a checklist format with reflective questioning, making it adaptable to digital sources, particularly in contexts where misinformation or agenda-driven content may be present.

- Reason for writing: Why did the author or publisher make this information available? Is there obvious and/or extreme bias or prejudice?

- Authority: Who authored or published the information, and what are their qualifications? 
- Date: How current is the information, and has it been updated recently?

- Accuracy: Was the information reviewed by editors or subject experts before it was published? Was it fact-checked? How do you know?

- Relevance: Does the information answer your research question? What is the intent behind the content: to inform, persuade, or sell something?

The RADAR approach is particularly useful for evaluating content from less traditional sources such as blogs, news sites, and social media, as it prompts users to think beyond surface-level information (Mandalios, 2013).

**Fact-Checking Techniques and Tools**

***Lateral Reading***: Lateral reading encourages users to leave a website and open new tabs to cross-check the information against other trusted sources. This technique lets users verify whether different outlets present similar information, ensuring it is not isolated or misleading.

***The SIFT Method***: Created by Mike Caulfield, SIFT is a rapid assessment method that users can follow when encountering unfamiliar or questionable information:

- Stop: Pause before sharing or believing the information.

- Investigate the source: Find out more about the website, author, or publisher.

- Find better coverage: Look for more reputable sources that cover the same topic.

- Trace claims, quotes, and media to the original context: Determine whether information has been taken out of context or manipulated (Caulfield, 2019).

*Example*: If a video clip of a public figure is being shared widely with an inflammatory caption, the user should pause and investigate where the clip originally came from. They might find a longer version of the video or an article providing context showing that the comment was edited or taken out of context.

Source: [[The University of Chicago Library]](https://guides.lib.uchicago.edu/c.php?g=1241077&p=9082322). All SIFT information on this page is adapted from its materials under a CC BY 4.0 license. 
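Checklist frameworks such as CRAAP and RADAR can even be captured as a simple scoring rubric, which some libraries turn into worksheets. The sketch below is only illustrative: the equal weighting of criteria is an assumption of this example, not part of the published tests, and the score merely structures the questions a librarian would ask.

```python
# Illustrative CRAAP-style rubric: each criterion is answered True/False
# after manual evaluation, and the share of satisfied criteria gives a
# rough reliability signal. Equal weighting is an assumption.

CRITERIA = ("currency", "relevance", "authority", "accuracy", "purpose")

def craap_score(answers: dict) -> float:
    """Return the fraction of CRAAP criteria the source satisfies (0.0-1.0)."""
    return sum(bool(answers.get(c)) for c in CRITERIA) / len(CRITERIA)
```

A score near 1.0 does not prove reliability; a blog can be current and relevant yet still inaccurate, which is why the Accuracy and Authority answers should carry the evaluation.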
***Fact-Checking Websites*** Fact-checking websites are an essential resource for verifying specific claims, especially around political or controversial topics. These sites are staffed by experts who meticulously verify information, often providing transparent sourcing and explanations. Librarians should familiarize users with some of the most respected fact-checking websites: - [[Snopes]](https://www.snopes.com/): A well-known website for debunking myths, urban legends, rumors, and viral content. It covers everything from bizarre claims to more serious political and health-related misinformation. - [[FactCheck.org]](https://www.factcheck.org/): A nonpartisan, nonprofit site that aims to reduce the level of deception in U.S. politics by checking the factual accuracy of claims made by political figures. - [[PolitiFact]](https://www.politifact.com/): Specializes in political fact-checking, using its "Truth-O-Meter" to rate the accuracy of political statements. It covers U.S. politics but also monitors global claims through partnerships. - [[Full Fact]](https://fullfact.org/): A UK-based fact-checking organization that scrutinizes claims made by politicians, the media, and social media. It provides detailed reports on the accuracy of specific claims. - [[AFP Fact Check]](https://factcheck.afp.com/): A global fact-checking service run by Agence France-Presse that tackles misinformation and disinformation across multiple languages and countries. - [[International Fact-Checking Network]](https://www.poynter.org/ifcn/) (IFCN): A global network of fact-checkers that adhere to rigorous verification standards. - [*[DISCERN Instrument]*](http://www.discern.org.uk/general_instructions.php): Originally developed to evaluate health-related information, the DISCERN tool assesses the quality and reliability of written information based on clarity, evidence, and balance. 
It helps users distinguish between trustworthy and misleading sources by analyzing how well the source presents information, especially when multiple perspectives or complex issues are involved. The tool can be adapted beyond health contexts for evaluating complex topics across various disciplines. Sample DISCERN questions include: Is the publication clear about its aims? Does it mention the sources of information used to compile the content? Does it discuss treatment options in a balanced and impartial way? Does it specify when the information was produced or updated? (Portillo, C. Johnson, & V. Johnson, 2021)

- [*[Health Feedback]*](https://healthfeedback.org/) is an academic fact-checking platform that specializes in verifying health-related news and claims. It relies on a network of scientists and experts who analyze the accuracy of public health information by comparing it against established scientific literature. The platform uses a transparent, evidence-based review system in which health professionals assess claims on their scientific merit and assign accuracy ratings.

**Verification Tools for Images and Video**

Librarians should introduce users to verification tools that help them check the authenticity of online content, particularly images and videos. These tools support a thorough evaluation of the media, helping prevent the spread of manipulated content.

***Reverse Image Search***: Manipulated or misleading images are often used to spread misinformation. Library users can employ reverse image search tools to find the original context of an image or to see where else it has been used. This helps users identify when images have been taken out of context, digitally altered, or falsely presented as new.

*Google Reverse Image Search* and *TinEye* are two primary tools for reverse image searches. Users can upload an image or paste its URL into these tools to check where else the image appears online and whether it is being used consistently or misleadingly. 
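Checking a publicly hosted image against these services can also be scripted by building their search links. The query-string formats below are assumptions about the services' current web interfaces and may change; they are shown only to illustrate how a URL-based reverse image search works.

```python
# Build reverse-image-search links for a publicly hosted image.
# NOTE: the URL formats are assumptions about TinEye's and Google Lens's
# public interfaces and may change over time.

from urllib.parse import quote

def reverse_image_search_urls(image_url: str) -> dict:
    encoded = quote(image_url, safe="")  # percent-encode the whole URL
    return {
        "TinEye": f"https://tineye.com/search?url={encoded}",
        "Google Lens": f"https://lens.google.com/uploadbyurl?url={encoded}",
    }

links = reverse_image_search_urls("https://example.com/protest-photo.jpg")
```

Opening either link in a browser shows where else the image appears online, which is exactly the lateral check described above.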
*Example:* A widely shared image on social media claims to show mass protests at a recent political rally. A reverse image search reveals that the image was actually taken several years earlier, in a different country, during a completely unrelated event.

***Image and Video Verification Methods***

*Digital Image Guide (DIG) Method*

The DIG Method focuses specifically on evaluating the credibility of digital images, a critical skill in an era when manipulated images and memes are widely used for misinformation. The DIG Method asks users to:

- Analyze: Describe what is in the image and any accompanying text (e.g., captions, date, headlines).

- Interpret: Investigate the source of the image, the message it conveys, and the context in which it was created.

- Evaluate: Cross-check the image with other sources to determine whether it has been manipulated, misrepresented, or altered.

- Comprehend: What judgments can you make about the image based on the evaluations above and the available information?

This method is crucial for teaching digital literacy, as images are increasingly weaponized to spread misinformation. It helps students develop skills in spotting alterations and understanding the biases behind visual content.

***Tools***

[*[InVID (In Video Veritas)]*](https://www.invid-project.eu/): This browser extension is designed for verifying video content. InVID helps users extract keyframes from videos, perform reverse image searches on those frames, and analyze video metadata.

[*[Exif Data Analysis]*](https://www.authentic8.com/static/media/uploads/resources/authentic8_fr__siloresearch_osint_exif_data_public.pdf): Exif data (metadata embedded in digital images) reveals details about when and where a photo was taken, along with camera settings. 
Tools like *[[Metadata2Go]](https://jimpl.com/)*, *[[PicsIO]](https://pics.io/photo-metadata-viewer)*, and *[[Jimpl]](https://jimpl.com/)* allow users to inspect this metadata, helping them verify whether an image has been altered or taken out of context.

[*[YouTube Data Viewer]*](https://citizenevidence.amnestyusa.org/): Created by Amnesty International, this tool allows users to extract metadata from YouTube videos, including the upload date and keyframes, making it easier to verify video content.

[*[Deepware Scanner]*](https://scanner.deepware.ai/): Deepfakes are AI-generated videos or audio that mimic real individuals. As these become more sophisticated, it is important to teach users how to detect them. Deepware Scanner analyzes videos for subtle inconsistencies in facial expressions, voice synchronization, or image artifacts that suggest manipulation.

***Games for Combating Fake News and Disinformation***

A notable example is the Bad News game, which flips the script by putting players in the role of someone spreading disinformation. This reversal helps players understand how disinformation is created and disseminated, better equipping them to recognize these tactics in real life. The game uses feedback and narrative-based scenarios to teach players how disinformation tactics such as impersonation, emotional manipulation, and conspiracy-theory creation work. A study by Roozenbeek and van der Linden on the Bad News game demonstrated that exposing players to disinformation tactics through gamification can "vaccinate" them against future misinformation. The concept, based on inoculation theory, posits that exposing people to weakened forms of misinformation tactics helps them build resistance to more sophisticated versions. 
This research supports the idea that gaming environments can teach users to recognize and reject misleading information (Roozenbeek & van der Linden, 2018).

The [[Navigate Project]](https://www.navigateproject.eu/) focuses on using serious games to teach information literacy. It promotes active learning through digital games that let students practice evaluating information and identifying fake content. The games, developed in collaboration with European universities, are based on real-world scenarios in which users must distinguish between reliable and unreliable sources, making the learning process more engaging. Criteria used in evaluating these games include playability, engagement, and storytelling, helping students build critical skills in a more interactive and enjoyable way (Encheva, Krüger & Zlatkova, 2023).

***Bibliography***

Agence France-Presse. AFP Fact Check. [[https://factcheck.afp.com/]](https://factcheck.afp.com/). \[Accessed: October 15, 2024\].

American Library Association. (2020). *Media literacy in an information age: A practical guide for libraries*. [[ala.org]](https://www.ala.org/sites/default/files/tools/content/%21%20Media-Lit_Prac-Guide_FINALWEB_112020_0.pdf). \[Accessed: October 15, 2024\].

Authentic8. EXIF data and OSINT: Public guide. [[https://www.authentic8.com/static/media/uploads/resources/authentic8\_fr\_\_siloresearch\_osint\_exif\_data\_public.pdf]](https://www.authentic8.com/static/media/uploads/resources/authentic8_fr__siloresearch_osint_exif_data_public.pdf). \[Accessed: October 15, 2024\].

Blakeslee, S. (2004). The CRAAP Test. *LOEX Quarterly,* 31, 4. [[https://commons.emich.edu/cgi/viewcontent.cgi?article=1009&context=loexquarterly]](https://commons.emich.edu/cgi/viewcontent.cgi?article=1009&context=loexquarterly)

Blakeslee, S. (2010). Evaluating Information: Applying the CRAAP Test. *Meriam Library, California State University, Chico*. 
[[https://library.csuchico.edu/sites/default/files/craap-test.pdf]](https://library.csuchico.edu/sites/default/files/craap-test.pdf). \[Accessed: October 15, 2024\]. Caulfield, M. (2019). Introducing SIFT, A Four Moves Acronym. *Hapgood (blog). May*, *12*, 2019. [[https://hapgood.us/2019/05/12/sift-and-a-check-please-preview/]](https://hapgood.us/2019/05/12/sift-and-a-check-please-preview/). \[Accessed: October 15, 2024\]. Cisco. What is social engineering? https://www.cisco.com/c/en/us/products/security/what-is-social-engineering.html \[Accessed: October 15, 2024\]. Encheva, M. (2016). Teaching information literacy courses in the context of library and information science education in Bulgaria: challenges and innovative approaches. *Journal of library administration*, *56*(5), 595-602. Encheva, M., Krüger, N., & Zlatkova, P. (2023). Playability of Serious Games in Information Literacy: The Findings of the European Project NAVIGATE. *International Information & Library Review*. Encheva, M., Tammaro, A. M., Yancheva, G., Zlatkova, P., Conti, G., & Maasilta, M. (2024). Towards a STEAM model for digital fluency skills: Perceptions of students and teachers. IFLA Journal. https://doi.org/10.1177/03400352231209491 Encheva, M., Zlatkova, P., Keskin, N. Ö., & Vatansever, İ. (2017). Mobile and information literacy perceptions and skills of library and information sciences and humanities students from Bulgaria and Turkey. *Internatio