Summary

This document is a review sheet for the first midterm exam in a sociology course (SOCI 221). It covers media, social structures and agency, and the relationship between individuals and the broader social contexts in which they live.

Full Transcript

Exam #1: 30 questions (15 points)
- 20 true-or-false & multiple-choice questions (10 points)
- 2 short-answer questions (5 points)

Structure & Agency:
- Social structures: social groups/organizations, durable systems, patterns of social life. They usually act as external forces that constrain or enable choices. Example: the traditional family structure.
- Agency: intentional actions; the capacity to make choices unconstrained by external structure. Example: choosing to have a non-traditional family.
- Structure and agency in the digital age:
  ○ Why are pop songs 3 minutes long?
  ○ Should digital platforms remove content deemed inappropriate?
  ○ How does social media transform the news industry?
  ○ Does generative AI have values?
  ○ Why do people sometimes believe conspiracy theories?
- Our choices are based on the interactions between individual choice, technology, and social factors.
  ○ Example: which one do you use, Facebook or Instagram? Why don't you use Facebook or Instagram at all?
- The push-and-pull interactions between the different elements of the social world are considered in terms of structure and agency.
  ○ Structure: stable arrangements of institutions that shape recurring patterns of social behavior; acts as an external force that constrains or enables choices.
  ○ Agency: intentional actions; the capacity to make choices unconstrained by external structure.

Digital Culture & Society:
- Sociological perspective: human behavior is shaped by social forces; we are social beings.
- We need to understand the relationships between individuals and the broader social contexts in which we live.
- Our social backgrounds influence our attitudes, behavior, and life chances.
- Although we all differ from one another, we share many basic understandings and values that make us similar. Where do we get these shared values?
  ○ Socialization: the process by which individuals acquire the social values, standards, and norms of a specific society and culture.
  ○ Social institutions that provide the experiences of socialization: family, school, peer group, work, media.
  ○ Media shapes socialization: it forms collective norms and expectations by delivering messages about them.

Mediated reality:
- What is media?
  ○ Communication outlets or tools used to deliver information.
  ○ A way to present the world and reality: a presentation is not equal to reality, but it is related to reality (we want to understand this relationship).
  ○ Your relationship to the media can shape your relationship to reality.
- Types of media:
  ○ Traditional media: TV, radio, print newspapers.
  ○ Digital media: social media, smartphones, video games.
  ○ Others: recorded music, smart speakers, outdoor ads.
- Pervasiveness of media in everyday life:
  ○ In Canada in 2023, people were estimated to spend about 5.5 hours/day on digital media and about 4.5 hours/day on traditional media.
  ○ Media used on a day-to-day basis: social media, smartphones, video games, TV, radio, print newspapers.
- Media also shapes perception of reality:
  ○ The material world (e.g., the pandemic and COVID vaccines).
  ○ Non-material culture: what is true (beliefs), what is important (values), and what is expected (norms).

The experiment of media censorship and free VPNs in China ("The Impact of Media Censorship: 1984 or Brave New World?"):
- Research question: does providing access to an uncensored internet lead citizens to acquire politically sensitive information?
- What they did: provided free VPNs (virtual private networks) to college students in China.
- Findings:
  ○ Access to an uncensored internet alone has little impact.
  ○ Incentives to visit Western news outlets lead to an increase in the acquisition of politically sensitive information.
  ○ Acquisition of politically sensitive information changes students' knowledge, beliefs, attitudes, and intended behaviors.

Mass media:
- Use the sociological perspective, specifically the concepts of structure and agency, to understand media, digital culture, and communication systems.
- Key players that influence the digital media system:
  ○ Common users regard digital media as a source of entertainment, information, and self-presentation.
  ○ Advocates believe that digital media can shape public opinion in their favor.
  ○ Digital media companies & workers generate content and profit from it.

Digital media and other social institutions:
- The digital media system is best understood by considering the social, economic, and political contexts surrounding it.
- Key players shaping it: platforms/technology (e.g., Instagram's recommendation algorithms), content producers/influencers, individual users, and others.
- Example: if a business owner knows Instagram is toxic for teen girls, why do they choose to do nothing?
  ○ Economic factor: the business model.
  ○ Legal factor: lack of regulation.
- Where to find solutions?
  ○ Public policy
  ○ Market
  ○ Culture
- How to understand digital media using a sociological perspective? Think of digital media as a complex system and situate it in a larger social context; structure and agency capture the interactions between key players and factors.

Search engine:
- When you want to search a term online:
  ○ Where does the result come from?
  ○ Are your results the same as others'? What external forces might shape the results?
  ○ Will the search result influence your opinions?

Case study: what's behind an online search?
- Factors that shape Google's search results:
  ○ Technology: the search algorithm.
  ○ User: your personal information, including location, browsing history, age, gender.
  ○ Economic incentives: advertisements.
  ○ Social factors: regulation/law, social norms.
- Searching "CEO" in Google Images in the U.S. in 2015, a study found:
  ○ Percentage of women in the search results: 11%.
  ○ Percentage of U.S. CEOs who were women: 27%.
  ○ Survey: exposure to biased information over time can have lasting effects on everything from personal preconceptions to hiring practices.
- What can we do about it? Public awareness can make a difference.

Autocomplete in search engines:
- A feature that predicts the rest of a word a user is typing: when the user types the first letters, the program offers possible words as choices.
- Designed to increase search efficiency; originally designed to help people with disabilities.
- The list of autocomplete predictions is generated based on the popularity of searches and your search history, but is adjusted to hide "inappropriate" results (a toy illustration follows below).
- Use your agency! What can we do if we find inappropriate predictions?
  ○ Report it to the public.
  ○ Report it to the search engine company.
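To make the autocomplete mechanics above concrete, here is a minimal, hypothetical Python sketch (not any real search engine's code): predictions are ranked by query popularity, boosted by the user's own search history, and filtered against a blocklist of "inappropriate" results. All names, queries, and numbers are invented for illustration.

```python
# Hypothetical autocomplete sketch: rank candidate completions by global
# popularity, boost the user's own past searches, and filter a blocklist.
# This illustrates the idea only; it is not a real search engine's algorithm.

def autocomplete(prefix, popularity, user_history, blocklist, k=5):
    """Return up to k predicted completions for the typed prefix."""
    candidates = []
    for query, count in popularity.items():
        if not query.startswith(prefix) or query in blocklist:
            continue  # skip non-matching or 'inappropriate' predictions
        score = count + (1000 if query in user_history else 0)  # personal boost
        candidates.append((score, query))
    return [q for _, q in sorted(candidates, reverse=True)[:k]]

popularity = {"weather today": 900, "weather radar": 400, "web design": 300}
print(autocomplete("we", popularity, user_history={"web design"}, blocklist=set()))
# -> ['web design', 'weather today', 'weather radar']
```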
Media and Technology:
Pop music's "three-minute rule":
- Why are pop songs 3 minutes long? One side of a 10-inch flat record could only accommodate about 3 minutes of recording.
- The 3-minute rule persists regardless of technological advancement (record players are hardly used anymore).
- Technology can shape the material and immaterial world, but society also has the power to shape technology.
  ○ Technology's development and application are the result of the people who create, deploy, regulate, and use it.
  ○ Technology is not always used as it was designed to be, and it may generate unintended effects. This is how history is made.

Technological determinism vs. social constructivism:
- Does the invention of digital technology change society? Or does society determine the invention and usage of digital technology?
- Thought experiment: assume your partner wants to break up with you. Do you mind the medium through which they tell you? Which of the following mediums is most and least acceptable, and why?
  ○ Social media post
  ○ SMS
  ○ Phone call
  ○ Letter
  ○ Face to face
- How important is the medium compared to the message? The medium through which a message is delivered plays a crucial role in how it is received and interpreted. As Marshall McLuhan famously said, "The medium is the message": the form of communication can influence its impact, sometimes even more than the content itself. While the message's substance remains important, the medium shapes how it is experienced, often influencing perception and behavior. In modern contexts like social media, the medium and message are deeply interconnected, as platforms dictate the way information is shared and consumed. Ultimately, both medium and message matter, but their significance depends on the context.

Technological determinism:
- Technology itself causes social change: "what technologies do to people."
- Overwhelming and inevitable effects of technologies on users, organizations, and societies.
- Technology is an external force imposed upon society; society is influenced by technology.
- Criticisms: no human agency is assumed; it ignores the effects of culture, norms, and regulation.

Social constructivism:
- Sees social reality as socially constructed: "what do people do with technologies?"
- Technology is made up of objects, and people choose how they use it.
- Technology is influenced by society.
- In Coded Bias, the issue of algorithmic bias is framed within the debate between technological determinism and social constructivism. Technological determinism sees technology as evolving independently, with biases as inevitable outcomes of progress. The documentary, however, aligns with social constructivism, which argues that technology reflects human design and social influences. It shows how algorithms, shaped by human biases, are not neutral, particularly as facial recognition technology fails to identify people of color accurately. The film asserts that AI systems mirror societal inequalities rather than acting as impartial forces.

Marshall McLuhan:
- The medium is more than a way to transmit messages; it is in fact the key to its own social impact.
- "The medium is the message": the message transmitted is less influential than the medium; format, not content, matters.
- A deterministic view of media type as shaping the content of a message; an optimistic view of technology.

Neil Postman:
- A pessimistic view: TV led to the decline in the seriousness of public life by encouraging particular ways of thinking and speaking, undermining democracy.

Hyperreality: representations that are no longer based on reality but are hardly distinguishable from reality.

Digital divide and the digital survival challenge:
- What digital skills do we need to survive in a digitalized world? And who does (and doesn't) have such skills?
- Digital Survival Challenge: a 72-hour network survival challenge in China in 1999. Twelve volunteers were each provided with a room with a bed, a dial-up network connection, a roll of toilet paper, and 1,500 CNY in cash (enough to buy 600 lb. of pork) plus 1,500 in electronic currency. They were asked to survive within the room for 72 hours.
- One volunteer with no experience using the internet quit after 26 hours, as he was unable to use e-commerce services to purchase anything.
- The same technology can be used in different ways and generates different impacts on individuals.
- The digital divide describes the differences between social groups in access to and use of digital tools:
  ○ Haves vs. have-nots
  ○ Skilled vs. unskilled
- Access to the Internet reflects existing inequalities. By 2020:
  ○ Access: 6% of Canadians did not have access to the Internet at home; among this population, 39% reported cost as the reason.
  ○ Speed: three-quarters (76%) of Canadians living in metropolitan areas had a download speed of 50 Mbps or more, compared with less than half (48%) of those living outside these areas.
- Basic digital skills can be a privilege and make a big difference during a crisis period (e.g., the pandemic).

Digital divide in Canada:
- Technology may increase inequality in some cases.
- Solutions:
  ○ Access divide: increase infrastructure capacity (e.g., boost connections).
  ○ Skill divide: provide skills training (e.g., free classes at the local library).

The documentary Coded Bias:
- Algorithmic bias: Coded Bias demonstrates that the biases in these systems are often unintentional but reflect the biases of the people who create them. The algorithms can perpetuate and even amplify existing inequalities. For instance, predictive policing algorithms can disproportionately target minority communities, and hiring algorithms may discriminate against women or certain racial groups based on patterns in historical data.
- Facial recognition: the film opens with Buolamwini's revelation that facial recognition algorithms are less accurate for people of color, especially women. Buolamwini's research shows that the AI systems are trained on biased datasets composed primarily of lighter-skinned, male faces, leading to higher error rates for darker-skinned people and women.
- Global impact: the film discusses how AI-powered surveillance technologies are being deployed worldwide, often without sufficient regulation or oversight. In China, facial recognition is used for mass surveillance, while in the United States these technologies are increasingly used in law enforcement. The film also covers the work of activists and researchers who push back against these developments, advocating for ethical standards and regulations to address bias.
- Calls for regulation: one of the central themes of the documentary is the need for greater regulation and transparency in AI development. Buolamwini and other activists featured in the film argue that governments must hold tech companies accountable and ensure that AI is developed with fairness and inclusivity in mind.
- Coded Bias ultimately highlights the risks of unchecked AI systems, especially in areas where they can affect civil rights and social justice, calling for greater scrutiny and ethical standards to prevent technology from exacerbating societal inequalities.

Social media, algorithms & AI:
Dr. Fatemeh Torabi Asr's research on how algorithms can help us detect fake news (assigned reading):
- The article discusses how algorithms can help detect fake news by analyzing the language used. Researchers found that fake news often uses more exaggerated, hyperbolic language and lacks specifics. Algorithms can be trained to spot these patterns and flag questionable content. By understanding linguistic cues, AI can assist in identifying misinformation and improving fact-checking efforts. The article emphasizes that while technology helps, human judgment is still essential in combating fake news. (A toy illustration of such linguistic cues follows below.)
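Below is a toy Python sketch of the kind of linguistic-cue scoring described above. It is not Dr. Torabi Asr's actual model; the word list, weights, and scoring formula are invented purely for illustration of how hyperbolic wording and a lack of specifics could be turned into a numeric signal.

```python
# Illustrative sketch only: a toy scorer for linguistic cues of fake news
# (hyperbolic wording, few specifics). Real detectors are trained classifiers;
# this word list and formula are invented for demonstration.
import re

HYPERBOLE = {"shocking", "unbelievable", "destroyed", "never", "always"}

def suspicion_score(text):
    """Higher score = more hyperbole and fewer concrete specifics."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hype = sum(w in HYPERBOLE for w in words) + text.count("!")
    specifics = len(re.findall(r"\d", text))        # digits as a crude proxy
    return (hype - 0.5 * specifics) / len(words)

print(suspicion_score("SHOCKING! They NEVER wanted you to see this!!"))
print(suspicion_score("The city council approved a $2.4M budget on March 3."))
```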
Algorithms for predicting domestic violence (assigned reading):
- The article discusses how law enforcement is increasingly using algorithms to predict domestic violence incidents. Tools range from simple questionnaires to advanced systems like Spain's VioGén, which assesses risk based on factors linked to violence. However, concerns about these tools include effectiveness, ethical issues, and potential bias. Some experts fear that machine learning could reinforce existing prejudices, raising questions about the fairness and transparency of these technologies. While the technology shows potential, there is still significant debate around its proper use and efficacy.

Five big ideas about AI (assigned reading):
1. Reinforcing inequalities: AI technologies can perpetuate existing social inequalities by reflecting biases in the data they are trained on, leading to discriminatory outcomes in fields like employment, criminal justice, and healthcare.
2. Power dynamics: the development of AI is influenced by those in power, which often means marginalized groups are excluded from decision-making, exacerbating their disadvantage.
3. Non-neutrality of AI: AI systems carry human biases because they are designed, implemented, and trained by humans, embedding societal prejudices into their operations.
4. Sociological approach: to understand AI's full societal impact, we must look beyond its technical aspects and examine how it intersects with issues like race, gender, and class.
5. Interdisciplinary research: addressing AI's challenges requires collaboration across fields like sociology, technology, and policy to develop ethical frameworks and solutions for its societal impact.
This highlights the social consequences of AI and the importance of a multi-faceted approach to addressing its influence.

Social media's business model:
Social media logic & surveillance capitalism:
- As a user of Instagram, stand up if you:
  ○ want an ad-free version of Instagram
  ○ are willing to pay $1/month for ad-free Instagram
  ○ are willing to pay $5/month for ad-free Instagram
  ○ are willing to pay $10/month for ad-free Instagram
- How much is a single user worth to Instagram? How much is your Instagram data worth?
  ○ Instagram's advertising revenue in 2022 was 107 billion CAD.
  ○ Instagram has 1.28 billion users.
  ○ Each user is worth about $84.12/year ($7/month).
- How does advertising on social media differ from advertising on traditional media?
  ○ Targeting: social media enables precise audience targeting based on user data, unlike traditional media's broader demographics.
  ○ Interactivity: social media ads encourage direct engagement through likes and shares, while traditional ads are more passive.
  ○ Cost: social media advertising is often more cost-effective with flexible pricing, compared to the higher fixed costs of traditional media.
  ○ Real-time feedback: social media offers instant analytics, whereas traditional media relies on delayed metrics.
- Data collected by Facebook:
  ○ Demographics
  ○ Behaviors
  ○ Interests
  ○ Life events
  ○ Political views

Datafication:
- The process of turning various aspects of everyday life, social interactions, and physical phenomena into quantifiable data that can be tracked, analyzed, and utilized by digital technologies.
- Datafication in the age of social media: digital activities are quantified; when digital activities are converted to data, they also become commodities. Who can purchase these commodities? (A small illustrative sketch follows below.)
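A minimal, hypothetical sketch of what datafication can look like in practice: raw platform events are aggregated into a quantified interest profile that advertisers could buy. The event names, topics, and weights are invented for illustration; the last line simply reproduces the back-of-envelope per-user value calculation above.

```python
# Hypothetical illustration of datafication: turning everyday digital activity
# into a quantified, sellable profile. Event names and weights are invented.
from collections import Counter

events = [
    {"action": "like",  "topic": "fitness"},
    {"action": "click", "topic": "fitness"},
    {"action": "like",  "topic": "travel"},
    {"action": "share", "topic": "fitness"},
]

def build_profile(events):
    """Aggregate raw activity into interest scores an advertiser could buy."""
    weights = {"like": 1, "click": 2, "share": 3}
    profile = Counter()
    for e in events:
        profile[e["topic"]] += weights.get(e["action"], 0)
    return dict(profile)

print(build_profile(events))   # {'fitness': 6, 'travel': 1}

# Back-of-envelope per-user value from the figures above:
print(107e9 / 1.28e9)          # ≈ 83.6, roughly the $84/year figure cited above
```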
Surveillance capitalism:
- Why is social media free of charge? Take Facebook's business model as an example:
  ○ Let users generate data (for free).
  ○ Sell user behavior data to advertisers.
  ○ Advertising accounts for over 90% of Facebook's annual revenue.
- Capitalism: an economic system based on the private ownership of the means of production and their operation for profit.
- Surveillance capitalism: the use of predicted user behavioral data as a product for trade.
- Features of surveillance capitalism:
  ○ Data collection is mostly invisible and of little cost; tech companies prefer not to inform users about what the collected data is used for.
  ○ Automatic collection is facilitated by big data and computing technologies.
- How to resist surveillance capitalism?
  ○ Technological breakthroughs: decentralized social media platforms?
  ○ Public policies: end monopoly powers to expand customer choice (e.g., block deals like Facebook's acquisition of Instagram), etc.
  ○ Individuals' digital literacy: stay informed and keep learning. Google collects data on more than 80% of measured web traffic, but you can install Google's "opt out" cookie in your browser and pause data collection in My Activity. Alternative browsers: Firefox (the only major browser that is fully open source and not controlled by one of the web giants), Opera, Vivaldi, Brave, etc.
  ○ Civic education and digital literacy programs.

Characteristics of the internet era:
- Designed and built to be an open, decentralized platform, accessible to anyone using its basic language and protocols.
- The Internet's structure was designed to give users control over their experience: it is a non-specialized platform made to accommodate whatever the user wants to do.
- The Internet is the first medium to embody digitization (a shift from analog to digital media) and convergence (the blurring of boundaries among types of media).
- A global system of communication whose governance structure transcends the regulatory reach of any single country.

Algorithmic bias:
- The recognition algorithm: how can an algorithm learn to recognize subjects? Algorithms learn from data; they find patterns, develop understanding, and make predictions. Most facial recognition algorithms are trained with images that can be freely accessed online (e.g., databases of celebrities).
- Any potential flaws? Algorithms learn from data and find relationships and patterns. What patterns can you find in the data? Which group can be recognized more accurately?
- Racial bias: racial minority groups are underrepresented in the public databases, so the algorithm has higher error rates in recognizing people from these groups (see the sketch at the end of this section).

The study measuring ChatGPT's political bias:
- Does ChatGPT have political bias? In the U.S. context, ChatGPT has a strong and systematic political bias toward the Democrats. In the Brazilian context, it systematically leans toward Lula. In the U.K. context, it systematically leans toward the Labour Party.
- For further research: what are the sources of the bias? Possibility 1: the training data. Possibility 2: the algorithm itself.
- Conclusion: ChatGPT has a strong and systematic political bias, leaning toward the left-libertarian side of the political spectrum.
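To illustrate the "biased data leads to biased errors" point from the facial recognition discussion above, here is a toy Python sketch that measures error rates separately for each group. It is not a real recognition system; the evaluation records are invented, and the point is simply that per-group error rates can reveal a gap that a single overall accuracy figure hides.

```python
# Toy illustration (not a real facial-recognition system): when one group is
# underrepresented in the data, per-group error rates expose a gap that an
# overall accuracy number hides. The records below are invented.
def error_rate_by_group(records):
    """records: list of (group, was_correct) pairs from some classifier."""
    totals, errors = {}, {}
    for group, correct in records:
        totals[group] = totals.get(group, 0) + 1
        errors[group] = errors.get(group, 0) + (0 if correct else 1)
    return {g: errors[g] / totals[g] for g in totals}

# Invented evaluation results: the majority group dominates the data.
records = [("majority", True)] * 95 + [("majority", False)] * 5 \
        + [("minority", True)] * 6 + [("minority", False)] * 4

print(error_rate_by_group(records))   # {'majority': 0.05, 'minority': 0.4}
```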
