Analyzing Digital Culture PDF
WEEK 10 – ALGORITHMS AND FICTIONS (1)

Algorithm: a process or set of rules to be followed in calculations or other problem-solving operations, especially by a computer.

Key Takeaways
- Algorithms have a strong hold on public imaginations, but they are also real socio-technical objects that can be methodically studied.
- How people think and feel about algorithms can matter as much as what they actually do.
- The rise of foundation models and generative AI both complicates and exacerbates our previous understanding of algorithmic systems.

‘The algorithm, simply put, is just another term for those carefully planned instructions that follow a sequential order (Knuth, 1998). However, when social scientists speak about algorithms, they tend to be less concerned with the mechanical term, and more with the ways in which “software conditions our very existence” (Kitchin & Dodge, 2011, p. ix).’ (Bucher 2017, 31)

‘In the context of this article, algorithms are defined as the codified step-by-step processes implemented by YouTube to afford or restrict visibility through the platform architecture. Technically, algorithms and algorithmic recommender systems “sort, manipulate, analyse, predict” (Willson, 2017, p. 3).’ (Bishop 2020, 2)

THE ALGORITHMIC IMAGINARY (Taina Bucher, 2017)
- How we feel about algorithms is no less meaningful than how they actually operate.
- Their opaqueness and power force us to narrativize and imagine their work; we act according to the fictions we envision.
- Method: searched Twitter for specific phrases, mostly regarding Facebook (e.g. + ”weird”, + ”creepy”), and attempted to make sense of the types of reactions people had.

The ‘whoa’ moment: ‘Sat down and opened up Facebook this morning while having my coffee, and there they were two ads for Nespresso. Kind of a “whoa” moment when the product you’re drinking pops up on the screen in front of you.’
‘Just like algorithms track behaviour in order to profile identity, they can be productive of what Jessa calls “whoa” moments – events in which the intimate power of algorithms reveals itself in strange sensations. Even for a tech-savvy journalist like Jessa, there is something peculiarly unexplainable about these whoa moments… Whoa moments arise when people become aware of being found.’ (Bucher 2017, 35)

Algorithmic Affect (focus on emotional/philosophical influence)
- ‘Using the theoretical lens of affect, understood as mood and intensity corresponding to “forces of encounter”… the aim is to understand how algorithms have the capacity “to affect and be affected”.’ (31)
- ‘…the personal algorithm stories illuminate how knowing algorithms might involve other forms of registers besides code. This is to say that what the algorithm does is not necessarily “in” the algorithm as such (Introna, 2016). Rather, we may begin to understand the performance of algorithms through the ways in which they are being articulated, experienced and contested in the public domain.’ (40)
- ‘As algorithms are becoming a ubiquitous part of contemporary life, understanding the affective dimensions – of how it makes people feel – seems crucial. If we are to consider the future of the algorithmic intensification, questions arise as to what possibilities for living with and alongside algorithms do these forces of encounter inspire? How does the algorithm perceive its subjects, and to what extent does it influence their sense of self? How, in turn, does the way in which people perceive algorithms affect the logic of the system?’ (42)

Moral panic: a widespread feeling of fear that some evil person or thing threatens the values, interests, or well-being of a community or society.

The Cathedral of Computation (Ian Bogost, 2015)
‘Here’s an exercise: The next time you hear someone talking about algorithms, replace the term with “God” and ask yourself if the meaning changes.
Our supposedly algorithmic culture is not a material phenomenon so much as a devotional one, a supplication made to the computers people have allowed to replace gods in their minds, even as they simultaneously claim that science has made us impervious to religion.’ (How do you communicate with gods?)

Algorithmic Lore (Sophie Bishop, 2020)
- Algorithmic experts and algorithmic lore (‘a mix of data-informed assumptions that are weaved into a subjective narrative’)
- SEO, but for YouTube: part motivational speakers, part influencers.
- They attempt to generalise across audiences, but often through gendered performance and by ignoring unsuitable data.
- While ostensibly independent, and thus more trustworthy, they ultimately train creators to align with YouTube and to ignore systemic issues on the platform.

Comparing the two studies:
              BUCHER            BISHOP
CONTEXT       Feeling           Mediating
FOCUS         ‘Ordinary’ users  Specialist mediators
GEOGRAPHY     Universal         USA, UK

URBAN ALGORITHMS (2)
An urban algorithm refers to a computational process or set of rules designed to analyze, optimize, or manage various aspects of urban environments. These algorithms are typically used in urban planning, transportation, infrastructure management, environmental monitoring, and smart city initiatives. They help process large datasets related to urban dynamics—such as traffic flow, population density, energy consumption, and air quality—to improve city planning, efficiency, sustainability, and the quality of life for residents.

Negative (or positive): these systems give whoever controls them the ability to affect our lives.

Positive (Russia example): As arrests of activists and politicians mounted, the ethics of NTechLab’s technology became a recurring topic at company meetings. NTechLab staff have resisted the use of the company’s face recognition in rallies and refused to sell the technology to the military, according to people familiar with these discussions.
Still, the NTechLab leadership concluded that the technology was ultimately positive—even if the occasional dissenting voice was arrested because of it. ‘We all saw these positive examples, we saw how it really catches criminals,’ says one former NTechLab employee. ‘Most people in NTechLab would say they were doing something very good, technologies that can help and save people’s lives. It really did.’

GENERATIVE AI (3)

Neural Networks
A neural network is a computational model inspired by the human brain, composed of layers of interconnected nodes (or ‘neurons’) that work together to process data and recognize patterns. Neural networks are commonly used in machine learning and artificial intelligence to solve complex tasks such as image and speech recognition, language processing, and predictive analytics. They learn from large amounts of data by adjusting the connections (or ‘weights’) between neurons, allowing them to improve performance on tasks over time.

Tokenization
Tokenization is the process of breaking down text into smaller units, called tokens. These tokens can be words, phrases, or even individual characters, depending on the application. Tokenization is a key step in natural language processing (NLP) and is used to prepare text data for analysis or machine learning models. By dividing text into manageable pieces, tokenization makes it easier for algorithms to analyze language structure, meaning, and context.

Embeddings/Vectors
Tokens acquire meaning in a multidimensional vector space. Word embeddings are vector representations of words in a multi-dimensional space where words with similar meanings or relationships are placed closer together. Specific relationships (such as gender or size) can be captured through consistent vector differences between words.
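The tokenization and vector-difference ideas above can be sketched in a few lines. This is a toy illustration only: the whitespace tokenizer and the two-dimensional, hand-picked vectors below are invented for the example; real systems learn subword vocabularies (e.g. BPE) and high-dimensional vectors from data.

```python
def tokenize(text):
    # Naive whitespace tokenizer; real tokenizers use learned subword schemes.
    return text.lower().replace(",", " ").replace(".", " ").split()

# Tiny hand-made embedding table. The two dimensions can be read roughly
# as "royalty" and "gender" -- purely illustrative values.
embeddings = {
    "king":  [0.9,  0.9],
    "queen": [0.9, -0.9],
    "man":   [0.1,  0.9],
    "woman": [0.1, -0.9],
}

def add(u, v): return [a + b for a, b in zip(u, v)]
def sub(u, v): return [a - b for a, b in zip(u, v)]

def nearest(vec, vocab):
    # Nearest neighbour (squared Euclidean distance) in the toy vocabulary.
    dist = lambda u, v: sum((a - b) ** 2 for a, b in zip(u, v))
    return min(vocab, key=lambda w: dist(vocab[w], vec))

tokens = tokenize("The king, and the queen.")
# The classic analogy: king - man + woman lands near queen.
analogy = add(sub(embeddings["king"], embeddings["man"]), embeddings["woman"])
print(nearest(analogy, embeddings))  # → queen
```

Because the relationship ‘male → female’ is a consistent vector difference in this toy space, the same arithmetic that maps man to woman maps king to queen.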
For example, the vector difference between ‘man’ and ‘woman’ is similar to the difference between ‘king’ and ‘queen’, capturing gender, while the difference between ‘small’ and ‘smallest’ captures degree or intensity. This allows models to understand semantic relationships between words.

Transformer (Attention)
Transformer models are a type of deep learning architecture that uses self-attention mechanisms to process and understand sequences of data (like text) efficiently. Each layer repeats two steps: 1) attention, 2) feed-forward. Unlike traditional sequential models, Transformers can analyze all parts of a sequence simultaneously, making them highly effective for tasks like language translation, summarization, and question answering. Their encoder-decoder structure allows them to capture complex relationships within the data.

Post-Training Alignment (Weights)
Post-training alignment refers to fine-tuning a pre-trained AI model to make its responses safer, more helpful, and better aligned with human values and intentions. After the initial training, alignment techniques (like reinforcement learning from human feedback) are applied to adjust the model’s behavior, ensuring it performs as desired in real-world applications. Examples: Reinforcement Learning from Human Feedback (RLHF); adjustment through backpropagation.

‘[LLM performance scales] as a power-law with model size, dataset size, and the amount of compute used for training, with some trends spanning more than seven orders of magnitude.’ (Kaplan et al. 2020)

Foundation models: large-scale AI models trained on vast amounts of data that can be adapted to a wide range of tasks. Examples include GPT, BERT, and CLIP. These models serve as a foundational base for various applications, allowing them to be fine-tuned or adapted to specific tasks (like language translation, image recognition, or summarization) with relatively little additional training.
Their broad capabilities come from training on diverse data, making them highly versatile and useful across different domains.

Are Foundation Models Algorithms?
- Yes: they operate by procedure; they are a sequence of rules (training data, (self-)supervision, alignment); they have significant societal effects.
- No: they are black boxed, and biased, but differently; a classic algorithm, by contrast, is tweakable and modular.

‘It is a technology that requires a mindset of manifest destiny, of dominion and conquest. It’s not stealing to build the future if you believe it has belonged to you all along.’ (Warzel 2024)

Summary
- The algorithmic imaginary is the way we think and feel about algorithms. It gets incorporated back into the systems we use, through datafication, public discourse, and regulation.
- Algorithms are present not only in social media, but in many emerging forms of digital culture.
- They are inherently political, as they regulate how information flows and resources are spent.

WEEK 11 – AI AND DISCRIMINATION

OUTLINE
- Why study AI and discrimination?
- What is AI and how does it work?
- How does AI encode discrimination?

KEY TAKEAWAYS
- Classification is a common task in machine learning – predicting the category of something – that has a long history in discrimination and racism. Today, classification systems reproduce discrimination and racism.
- AI, machine learning, and algorithms are all different processes. By the end of the lecture you should understand how and why these differ, and why the term ‘AI’ obscures how these systems work.
- Through an example, you should begin to understand how some of these systems work, how they are trained, and the different areas in which AI, machine learning, and algorithms are applied. You should be able to identify potential points at which bias can emerge through these processes.

CLASSIFICATION AND POWER
Great Depression, 1933: under new president FDR, the US Congress passes numerous laws to stabilise the economy, including the Home Owners’ Loan Act.
The HOLC conducts a ‘City Survey’ project from 1935–1940, which surveys real estate professionals’ judgements of neighborhoods, gives each a letter grade, and produces a series of maps. Low grades were given to neighborhoods with the presence of Black people and other “foreigners”, while wealthier, whiter neighborhoods were given the best grade. HOLC appraisals are associated with inequalities and isolation of Black neighborhoods that persist even in 2010.

REDLINING
Redlining is a discriminatory practice in which financial services are withheld from neighborhoods that have significant numbers of racial and ethnic minorities. Redlining has been most prominent in the United States, and has mostly been directed against African-Americans. The most common examples involve denial of credit and insurance, denial of healthcare, and the development of food deserts in minority neighborhoods.

Redlining, an algorithm of oppression?
HOLC sought to classify ‘risk’. This reflected how it was trained: on industry perceptions, mirroring similar trends in the private sector. A low-tech ‘software’ for classification:
1. Collect data from ‘objective’ sources
2. Evaluate the data and make inferences
3. Use inferences to make decisions

Technological Redlining
‘the power of algorithms in the age of neoliberalism and the ways those digital decisions reinforce oppressive social relationships and enact new modes of racial profiling’ (Noble 2018, p. 1).

What is an algorithm? (Week 9)
‘The algorithm, simply put, is just another term for those carefully planned instructions that follow a sequential order (Knuth, 1998). However, when social scientists speak about algorithms, they tend to be less concerned with the mechanical term, and more with the ways in which “software conditions our very existence” (Kitchin & Dodge, 2011, p. ix).’ (Bucher 2017, 31)
Society has always created technical systems that condition human existence!
Discriminatory ‘AI’ at Work: Facebook Ads Manager
An example of technological redlining. Facebook is not only monitoring us and selling these qualities and interests about us to advertisers so they can target us by race; it also values us differently (e.g., interest in Malcolm X is valued lower than interest in Donald Trump; it can be more expensive to target ads to Latin Americans than to people interested in purchasing weapons). ‘Reach the right people.’ (Cotter et al. 2020)

The coded gaze – ‘embedded views that are propagated by those who have the power to code systems’ (Buolamwini 2016, in Cotter et al. 2021, p. 8).

The unmarked user – white and male is the default category, and others are included in disjointed and haphazard ways. What Cottom refers to as ‘predatory inclusion’: diverse categories are included for a corporate, rather than public, interest, encoding ideology and politics with little regard to the effects of such categorization.

AI & Discrimination
- How does the utilization of social categories of difference (e.g. race, ethnicity, sexuality, and gender) by sociotechnical systems affect those categorized as minorities?
- How are existing systems of power (e.g. racism, homophobia, and patriarchy) encoded into these sociotechnical systems?
- AI datafies humans, classifies them, and infers their identity, behaviour, and values.

ARTIFICIAL INTELLIGENCE – 3 DEFINITIONS
1. ‘“artificial intelligence system” (AI system) means software that is developed with one or more of the techniques and approaches listed in Annex I and can, for a given set of human-defined objectives, generate outputs such as content, predictions, recommendations, or decisions influencing the environments they interact with’ (Draft AI Act, European Parliament, Article 3(1), 2021).
2. ‘An artificially intelligent computer system makes predictions or takes actions based on patterns in existing data and can then learn from its errors to increase its accuracy’ (Microsoft, 2023).
3. ‘Artificial intelligence refers to the development of computer systems that can perform tasks that typically require human intelligence, such as learning, reasoning, problem-solving, and understanding natural language’ (ChatGPT 3.5, 2023).

AI is better understood as a field of technological development that applies machine learning and computation to address a multitude of human-defined problems. Crawford (in the introduction) argues that it is neither artificial nor intelligent; it is a ‘registry of power’ (p. 8).

AI: products and services (e.g. ChatGPT, Google Search, Amazon Alexa); automation of tasks (e.g. sorting CVs, identifying offensive content, driving a car).

Machine Learning (powers AI): supervised and unsupervised techniques*; computational power and cloud services (hardware, electricity, time, budget); the application and creation of algorithms.

Data (there is no AI without data; data drives ML): data collection on users (e.g. cookies and social media platforms); training sets; crowdsourced annotation; proprietary data ownership.

*Supervised ML: making predictions based on annotated training data
- Object detection in images
- Predicting criminality in a city
- Training a car to drive
Unsupervised ML: learning patterns in large datasets
- Categorizing users of a platform to create advertising segments
- Product recommendation engines

AI systems are products that rely on machine learning to solve a problem.

How is an image classifier trained? Create an annotated dataset and train a CNN on about 80% of it, holding out the remaining 20% for testing. Creating an annotated dataset is extraordinarily expensive: imagine scraping hundreds, if not thousands, of images of corgis in all angles and contexts and then manually verifying them.
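The annotate → 80/20 split → train → test procedure can be sketched end-to-end. A minimal, self-contained example: a nearest-centroid classifier on synthetic 2-D points stands in for a CNN on images, and all data here is generated for illustration.

```python
import random

random.seed(42)

# 1. "Annotated dataset": synthetic 2-D points with labels 0 and 1.
def sample(cx, cy, label, n=100):
    return [((cx + random.gauss(0, 0.5), cy + random.gauss(0, 0.5)), label)
            for _ in range(n)]

data = sample(0.0, 0.0, 0) + sample(3.0, 3.0, 1)
random.shuffle(data)

# 2. 80/20 split: train on 80%, hold out 20% for testing.
cut = int(0.8 * len(data))
train, test = data[:cut], data[cut:]

# 3. "Training": compute one centroid per class from the annotated examples.
def centroid(points):
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

centroids = {lab: centroid([p for p, l in train if l == lab]) for lab in (0, 1)}

# 4. Prediction: assign each point the class of its nearest centroid.
def predict(p):
    dist = lambda a, b: (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    return min(centroids, key=lambda lab: dist(centroids[lab], p))

# 5. Evaluate on the held-out 20%.
accuracy = sum(predict(p) == l for p, l in test) / len(test)
print(f"test accuracy: {accuracy:.2f}")
```

Note that bias can enter at step 1: whatever patterns the annotators encode in the labels, the trained model will faithfully reproduce.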
ENCODING DISCRIMINATION
Discrimination is encoded through the data fed into these systems (e.g., image-recognition systems mislabelling Black people as animals). The fault lies with whoever feeds in the data.

Mechanical Turk, or how to train a machine…
The work of categorization is done by people who may have no idea of the social context of what they are categorizing. Amazon Mechanical Turk enables ‘crowdsourcing’ of the task of annotation. Human labour and judgement are used throughout AI, but their role is obscured. This often relies on highly exploitative labour systems and cheap labour in less wealthy countries. For example, ImageNet was built using crowdsourced labour to verify image categories and words.

Encoding Discrimination – Creating a Language Model
Vectorization (coding words into numbers): word embeddings reproduce the biases of the English language they are trained on.

Google Search Autocomplete
Constantly trained on people’s searches, from people shopping to doing research to looking for pornography. (Mostly) unsupervised, so it learns from how people use it, primarily through collocations of words, pretrained with language models.

CONCLUSIONS
- Discrimination is foundational to human reasoning and to AI. The way AI is trained means human-constructed social categories of difference (which are not inherently true) are translated into AI systems.
- Discrimination is a product of human knowledge encoded into technical systems created by humans. AI is as much a mirror of bias and power in society as it is an active constructor of discrimination, prejudice, bias, and difference.
- AI is trained by humans to automate tasks that are undertaken by humans, or AI is trained on human-created data (e.g. language). AI, at least as we know it, can never be objective, because it learns from us and we are subjective beings.
- When you hear the argument that it is ‘just math’, remember that the perceptions, beliefs, and biases of those who train these systems are encoded mathematically, not the other way around.
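The collocation-learning mechanism behind autocomplete can be sketched with a toy query log. The log below is invented for illustration; a real system learns from billions of user searches, which is exactly how the crowd's habits (and biases) get baked in.

```python
from collections import Counter, defaultdict

# Invented toy query log standing in for real search traffic.
query_log = [
    "why is the sky blue",
    "why is the sky dark at night",
    "why is the ocean salty",
    "why is the sky blue",
]

# Count word collocations (bigrams) across the log.
bigrams = defaultdict(Counter)
for query in query_log:
    words = query.split()
    for a, b in zip(words, words[1:]):
        bigrams[a][b] += 1

def suggest(prefix_word, k=2):
    # Suggest the k continuations most frequently seen after this word.
    return [w for w, _ in bigrams[prefix_word].most_common(k)]

print(suggest("sky"))  # → ['blue', 'dark']
```

Because the model simply mirrors frequency, whatever the crowd types most often becomes the suggestion; this is how an unsupervised system ends up reproducing collective bias without anyone explicitly programming it in.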
Who has the power to challenge AI? The immense capital required to train ML models means that only a very few actors with huge budgets are able to shape how AI is trained and what it is applied to.

WEEK 12 – CONTENT MODERATION*
*To monitor, screen and control content; there can also be the connotation of a calming force that avoids extremes. Content moderation is the practice of pre-screening content before it appears on a social media platform, or acting on content reported and flagged once it is already published.

1. What is content moderation?
2. Soft moderation
3. Commercial content moderation and digital labor
4. Hard moderation
5. Business model
6. How to study its consequences (and discontents)?
7. Speech arbiters and censorship
8. Politicisation and deplatforming
9. Algorithmic imaginaries
10. Community management and rehab

1. What is content moderation?
What is the content to be moderated (and why)? None, at first: social media platforms were born of the promise of the open web, the utopian dream of enabled voices, unedited, unredacted, free to speak without gatekeepers and, in fact, without moderation. That did not last, however…

The comment space: ‘hate speech, offence, profanity, personal attacks, sleights, defamatory claims, bullying and harassment’. ‘Conversational health’ versus ‘toxicity’ of comments.

Grimmelmann’s (2015) broad definition: the ‘governance mechanisms that structure participation in a community to facilitate cooperation and prevent abuse’.

A ‘reluctant project’ taken up by social media companies, because it admits the community is not functioning well – something social media companies HAVE to do but do not want to talk about. A ‘custodial task’ better ‘obscured’ and ‘disavowed’, at least on the removal side – a behind-the-scenes ‘cleaning up act’.

What is content moderation (and its study)? A problem.
The symbolic moment of recognition of the problem is Wired.com’s ‘open letter to the Internet’ (August 2016).

2. Soft moderation (performed by humans)
‘Artisanal’ content and user scoring (platform-crafted):
- The user receives karma points: identified (+1); anonymous ‘coward’ (−1).
- Content moderation labels posts as ”normal”, ”offtopic”, ”flamebait”, ”troll”, ”redundant”, ”insightful”, ”interesting”, ”informative”, ”funny”, ”overrated”, or ”underrated”, each corresponding to a −1 or +1 rating.

‘User or crowdsourced’ indicators (community-based systems in which users rate each other’s posts):
- Content upvotes/downvotes (e.g. YouTube / Reddit) (upvote = power to recommend)

‘Editorial’ (content moderation as a sort of editing: removing, adding, filtering, …):
- Newspaper comment spaces (editor-recommended comments)
- Moderated comment spaces where content is only occasionally removed by an editor (newspapers/blogs)
- Wikipedia’s hierarchy of users and their decision-making rights

3. ‘Commercial’ content moderation and digital labor on social media platforms
‘Commercial’ content moderation (how most human-based moderation works nowadays):
- A globalized, invisible, outsourced, sub-subcontracted workforce: some 150,000 content moderators worldwide (2017)
- A low-paid job; ‘dispensable’ employees ‘discarded’ after ‘burn-out’
- Not organised; no codes of practice
- Silenced by non-disclosure agreements

Digital labour: content moderators have similar employment demographics: people without work experience; first job; desperate for work; they do not appear to know what they are getting into.

Working conditions:
- Speed: 360 items per hour (one item every 10 seconds)
- Relentless: ‘real-time work’
- Pressure: ‘cannot commit a single error’
- Material: content is vulgar
- Mental health issues: moderators burn out or become inured or hardened (either outcome is undesirable)

4.
Hard moderation (algorithmic moderation systems)
- Algorithmic moderation displays a company’s technological prowess.
- Born of pressures to address copyright, terrorist content and toxic speech.
- Hard moderation as a response to business pressures: YouTube responds to Viacom’s copyright lawsuit; the cross-industry threat of terrorist content; Twitter responds to toxic speech (unable to sell the company).
- Hard moderation as a response to public policy pressures: the German NetzDG law requires content take-down within a short time frame.
- YouTube Content ID: now used by influencers with over 100,000 followers.
- Twitter Quality Filter: now the default for users.

AI techniques:
1) Matching (hash matching) finds matches between content, such as the Christchurch live shooting videos or videos under copyright.
2) Classification uses machine learning to categorise the unmatched and predict how offensive content is. Every picture and video uploaded to Facebook is given a predictive score that estimates how likely a post is to violate Facebook’s terrorism policies. Content moderation APIs (Perspective, Llama Guard, OpenAI content moderation, Mistral moderation API, etc.) assign scores for different types of controversial content.

5. Business model
‘Tech company’ versus media organisation: the publishing editorial practices of a news organisation versus the community rules of social media; the named editorial board with contact details versus anonymous content moderators and CEOs responding to major cases on Twitter (e.g. Jack Dorsey); an ombudsman versus an arduous appeals process.

Business model: how a platform moderates is how it attracts users and how it keeps them. How does content moderation reflect a business plan?
The ‘surprising’ content that is allowed so as to retain customers (advertisers), or the content that is disallowed in order to satisfy governments and retain access to their markets. ‘Competing’ content moderation policies: heavy vs. light moderation; anonymous, pseudonymous, or real-name users. Moderation is ‘foundational’ rather than peripheral (Gillespie’s argument).

6. How to study its consequences (and discontents)
- Transparency reports and investigative journalism
- The tech press and celebrated cases of demonetisation and deplatforming
- User accounts and algorithmic imaginaries (demotion and shadow banning)
- Alternative proposals for content moderation

7. Transparency and opacity
Study transparency reports, investigative reporting, and leaked documents. Transparency reports show that content moderation is political, but only in general terms (Kurdish workers’ party content removed after a request by the Turkish government; abortion videos allowed). Leaked moderation guidelines are revealing: rather than intermediaries or ‘platforms’, Facebook and others are ‘highly regulated sites of speech’. Investigative reporting’s slow trickle of information reveals the ‘underside’ of the business, after an initial refusal to acknowledge moderation practices. The politics of opacity: does transparency shift the focus of attention from Facebook to the content moderators? Leaked Facebook guidelines appeared in the news in 2018 (the guidelines were released to the public only after 2020).

8. The politicisation of speech arbiters, censorship, deplatforming
Empirical study of content moderation and especially ‘borderline content’, defined as ‘awful but lawful’, or content which ‘brushes up against’ but does not cross policies concerning removal.
Borderline content is often demoted, i.e. its visibility is reduced (shadowbanning: the user is not aware of it). Freedom of speech does not mean freedom of reach: you can say it, but it is not going to be algorithmically spread.

Methods: study rankings of posts and videos over time (by saving search returns daily); study moderation policies over time (with the Wayback Machine of the Internet Archive).

9. Content moderation and algorithmic imaginaries
Folk theories for explaining the felt effects of content moderation (Myers West, reporting on onlinecensorship.org):
- Users thought the flagger was someone specific that they already knew, that they were flagged in the midst of a contentious discussion, or that they were generally targeted.
- Users believed they were flagged owing to their group identity (Muslims / transgender people / those hostile to feminists and minorities / pro- and anti-Zionist sentiments).
- They evoked the discourse of censorship and argued that the platforms are not protecting free speech.
- They suspected they were ‘shadowbanned’, where their content is made invisible to other users without actually being removed entirely.

10. Community management and alternative proposals
How to create a healthy community?
○ More granular information
○ Rehab and restitution (instead of reporting, ‘punishing’, censoring)
○ Demoting as visibility reduction without removal (‘free speech is not the same as free reach’)
○ Demonetisation (publishing without earning) instead of deplatforming
○ Deplatforming as a last resort

Key takeaways
○ Content moderation history: from dirty secret to public debate
○ All platforms moderate (business model): it is never NO moderation, but low or heavy moderation, depending on the audience the platform wants to attract
○ How to study moderation?
From definition to operationalisation: leaks, transparency reports, rankings, platform policies, folk theories.
○ Alternatives to ‘censorship’: demotion, community building, rehab.
(cf. the ‘Napalm Girl’ photograph, a celebrated case of Facebook content moderation)

WEEK 13 – PLATFORMS AND CULTURAL PRODUCTION

2017 – ADPOCALYPSE (the collapse of YouTube’s monetization system), triggered by ads appearing alongside extreme content. The YouTube Partner Program excluded advertisements from certain types of content (sensitive social issues), giving advertisers more control over where their ads could appear. But this led to a collapse of user ad revenue (and of the number of videos in general). To repair the damage, YouTube tried to do better (e.g. around LGBT content), but many creators were still affected by the adpocalypse.

Observations
- Economic and cultural dependence: creators struggle to assert their rights, and there are not many platform alternatives (especially for monetizing content).
- Platforms reshape the conditions of cultural production: when they change the terms on which people can distribute and monetize content, it can have a large impact on the conditions under which cultural production takes place.
- Power is not exercised unilaterally: it is not wielded by just one actor, but results from a rather complicated struggle between a variety of actors (e.g. a social media platform’s monetization of content also depends on advertiser companies, which have a large impact on these decisions).

Structure of the lecture
1. Concepts
2. Markets
3. Infrastructure
4. Governance
5. Power & Change

1. Concepts
Platforms: data infrastructures that facilitate, aggregate, monetize, and govern interactions between end-users and content and service providers (Poell, 2021).
Infrastructure, market, governance framework. The App Store as platform market, infrastructure and governance framework:
- Market: aggregates and enables the monetization of interactions between developers and end users, afforded by the App Store’s data infrastructure, which allows creators to upload their apps.
- Governance: governs what is shared through the platform (no sexual or politically controversial content, etc.).

Platformization: the ‘penetration of economic, governmental, and infrastructural extensions of digital platforms into the web and app ecosystems, fundamentally affecting the operations of the (not only) cultural industries’ (Nieborg & Poell 2018, 4276). It is the process by which the platform extends itself and external actors connect to it (and, from the moment they connect, become dependent on it).

According to Nieborg and Poell (2018), platform-dependent and platform-independent describe how media and content production are tied to digital platforms:
- Platform-dependent refers to media or content that relies on the infrastructure, affordances, and rules of specific platforms. These platforms shape how content is produced, distributed, and monetized. For instance, creators on YouTube are dependent on its algorithms, monetization policies, and technical features.
- Platform-independent refers to media or content that operates outside the constraints of specific platforms, offering more flexibility in distribution and interaction. This kind of content is not reliant on the infrastructure or policies of any single platform and can exist across multiple channels or in open formats.
This distinction highlights the growing influence of platforms on media production and the challenges of maintaining independence in a platform-dominated digital ecosystem.
Platform-dependent: in terms of distribution, monetization, and increasingly creation > (e.g. Apple provides software to developers, who can then produce content that runs on iOS devices)
GAMES: HAVE ALWAYS BEEN PLATFORM DEPENDENT; HOWEVER, THE PLATFORMS ON WHICH GAMES DEPEND HAVE CHANGED (e.g. before > dedicated consoles / after > Play Store, App Store)
Platform-independent: e.g. newspapers; they run their own presses, produce their own articles, and connect readers to advertisers, + selling subscriptions. > News producers started distributing to platforms and slowly became platform-dependent.
(ARE JOURNALISTS WRITING ARTICLES FOLLOWING TRENDING TOPICS? CHANGE IN CULTURAL PRODUCTION)
> EXCEPT THOSE WHO STAYED INDEPENDENT > GROWING ONLINE SUBSCRIBER BASE (NO ADS) [variations within industries] e.g. NYTIMES, BuzzFeed

Super apps & global variations
Super apps are apps that do everything; megaplatforms unto themselves. They are particularly prevalent in East Asia. Like China’s WeChat or South Korea’s KakaoTalk, Japan’s LINE has evolved from a single-purpose chat app to the do-everything platform for everyday cultural and economic activities (Steinberg 2020, 1).

Historical and regional particularities
LINE and the regional convergences of super apps in East Asia are a potent reminder of the need to analyze platforms outside of the bi-polar hegemony of the United States versus Chinese tech world—which increasingly frames journalistic discourse and academic research—and of the need to attend to the historical and regional particularities of platforms and their cultural impacts (Steinberg 2020, 1).

Emoji creators: make money through the generation of stickers – a cultural commodity

2.
Markets – a platform is something that connects actors economically
OPENNESS
It is relatively* easy for content creators to upload content or offer services on platforms, in contrast to trying to get content published on radio, television, … (*platform guidelines and policies)
CHANGE IN THE SHARING OF CULTURAL PRODUCTION!

DEMOCRATIZATION
Explores whether platforms democratize cultural production or reinforce monopolistic tendencies. > Participatory culture

*NETWORK EFFECTS (Nieborg & Poell 2018, 4278)
Describes how user growth amplifies platform dominance, creating "winner-take-all" markets.
Platforms are not only neutral intermediaries, but are actually very powerful economic and financial actors in their own right.
Multi-sided markets: different sides of the market which are connected through the platform
*People tend to go to the same platform to connect to the people they know > more content and service providers > more attractive to end users (again)
Only a few platforms succeed in doing that > "winner-take-all"*
*The concept of "winner-take-all" by Nieborg & Poell (2018) describes how digital platforms benefit disproportionately from network effects, where the value of a platform increases as more users join. This dynamic creates a scenario where a single or a few dominant players capture the majority of the market share, leaving little room for competitors. For example, platforms like Spotify or Apple Music dominate the music streaming industry because their large user bases attract more content providers and advertisers, reinforcing their leadership. This results in market concentration rather than fostering a diverse, competitive environment.

Spotify & the music industry
Spotify and Apple Music are undoubtedly the principal means by which the challenge to the recorded music industry once afforded by digital technologies has been contained.
The status of music as property, threatened by the early “pirate” sites’ enabling of radical sharing, has now been restored and in recent years the revenues of the global recorded music industry have substantially recovered— though not to the levels achieved at the turn of the century. Revenues now increasingly derive from advertising and subscriptions, rather than the sales of ownable individual items such as “singles” or “albums” that once sustained the late twentieth-century industry (Hesmondhalgh et al. 2019, 10).
An industry in which legacy media companies were so strong that they could also start dictating the terms on which the platforms themselves operate

3. Infrastructures
Planetary-scale computation
Data centres that permit the functioning of platforms (consume an enormous amount of energy) - allow us to share and watch videos, have video calls, use streaming services, …
Smaller media companies rely on giant tech companies that host their data services in the cloud, inside these data centres – DEPENDENCY (e.g. Zoom > Amazon Web Services)

Boundary resources
All the resources that are provided by the platform to be able to access the infrastructure of the platform and to share your content. However, they also determine what you can or cannot do on the platform (regulations).

3. Infrastructures (Expanded)
This section explores the foundational technologies and resources that enable platforms to operate at scale and integrate with external systems:
1. Planetary-Scale Computation: Platforms rely on vast computational infrastructures that span the globe, including data centers, cloud computing, and content delivery networks. This infrastructure supports the high-speed processing, storage, and distribution of enormous amounts of data, enabling platforms to provide seamless services to users across different regions.
For example, platforms like YouTube or Facebook depend on these systems to handle billions of interactions daily, from video streaming to personalized content recommendations. o Impact on Cultural Production: These computational resources allow platforms to centralize and analyze user data, facilitating personalized content delivery and creating opportunities for targeted advertising. However, this also reinforces platform dependency, as creators must adapt to the infrastructure and algorithms of dominant platforms. 2. Boundary Resources: Boundary resources refer to the tools, APIs (Application Programming Interfaces), SDKs (Software Development Kits), and documentation provided by platforms to third-party developers. These resources enable external developers to create applications, services, or integrations that interact with the platform's core infrastructure. For instance: o Spotify provides APIs that allow developers to build apps showcasing playlists or analyzing listening habits. o Apple’s App Store acts as both a marketplace and a governance framework, setting rules for how third-party apps operate within its ecosystem. o Impact on Ecosystem Integration: By providing boundary resources, platforms create ecosystems that expand their reach and utility. However, they also maintain control over the rules and limits of integration, determining how and to what extent third-party developers can participate. This control can restrict innovation if platforms impose stringent policies or prioritize their own services over competitors’. o Dependence and Precarity: Creators and developers who rely on boundary resources often find themselves at the mercy of platform decisions, such as sudden API changes, policy updates, or monetization shifts. This creates a precarious environment for those dependent on platforms for their livelihoods. 
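The gatekeeping role of boundary resources can be sketched in a few lines of Python. This is a toy illustration only: the platform name, scopes, and data below are invented, not any real platform's API. The idea it shows is the one described above: the platform exposes a function to third-party developers, but only returns the slice of data that the developer's access token is scoped for.

```python
# Toy sketch of a platform "boundary resource": a hypothetical API endpoint
# that grants third-party developers access only to the data their token
# is scoped for. All names and data here are invented for illustration.

# The platform's internal data store (normally hidden behind its infrastructure).
PLATFORM_DATA = {
    "public_profile": {"name": "creator_42", "followers": 1200},
    "engagement_stats": {"views": 50_000, "avg_watch_time": 73.5},
    "ad_revenue": {"last_month_usd": 310.25},
}

# Scopes the platform chooses to expose -- revenue data stays off-limits.
TOKEN_SCOPES = {
    "demo-token": {"public_profile", "engagement_stats"},
}

def api_get(token: str, resource: str) -> dict:
    """Return the requested resource if (and only if) the token is scoped for it."""
    scopes = TOKEN_SCOPES.get(token, set())
    if resource not in scopes:
        # The platform, not the developer, decides what is reachable.
        raise PermissionError(f"scope '{resource}' not granted")
    return PLATFORM_DATA[resource]

profile = api_get("demo-token", "public_profile")
print(profile["followers"])  # 1200

try:
    api_get("demo-token", "ad_revenue")
except PermissionError as e:
    print("blocked:", e)
```

The point mirrors the precarity argument above: the same mechanism that lets developers build on the platform also encodes its rules, and the platform can narrow or revoke scopes at any time.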
Overall Significance: By combining planetary-scale computation with boundary resources, platforms establish themselves as indispensable intermediaries in cultural production and economic activity. They shape not only how content is distributed but also how ecosystems of developers and creators interact with and depend on their infrastructure. This dual control over infrastructure and integration cements their dominance, making it difficult for alternatives to emerge.

4. Governance
Governance by platforms
1. Moderation – filtering/removing content before or after it has been uploaded to the platform
2. Curation – (or recommendation) boosting or making invisible certain kinds of content (more/less visibility) (done by humans or algorithmically)
3. Regulation – software (like APIs) that sets particular standards: gives external actors access to a restricted amount of data
Resistance exercised by cultural producers against the ways in which the platforms govern (jailbreaking, alternative app stores) (providing their own boundary resources)

YouTube Adpocalypse, Demonetization & Governance
Discusses how YouTube's governance practices, particularly its demonetization policies, impact creators. It highlights events where advertisers pulled out due to concerns over ad placement on controversial content, leading YouTube to implement stricter content moderation and demonetization rules. These changes affected creators' revenue streams and raised questions about platform control, transparency, and the precariousness of platform-dependent cultural production.

5. Power & Change
- Decentralization and/or centralization?
- Citizen journalism
- Control
- Platform evolution
- Contingent cultural commodity
Products and services offered and circulated via digital platforms are contingent in the sense that they are malleable, modular in design, and informed by datafied user feedback, open to constant revision and recirculation.
As such, we will speak of contingent commodities, which appear not only in the news sphere but also across all domains of cultural production, including video, fashion blogging, and music (Nieborg & Poell 2018, 4276). 5. Power & Change (Expanded) This section delves into the shifting dynamics of power within the digital platform ecosystem, emphasizing the complex interplay between centralization and decentralization, as well as the emergence of contingent cultural commodities: Centralization vs. Decentralization: Centralization: o Large platforms like Google, Facebook, and Amazon wield significant political, economic, and cultural power by dominating key sectors of digital interaction, such as search, social media, and e-commerce. o These platforms centralize control over content distribution, monetization, and governance, often creating monopolistic environments. This concentration of power restricts diversity in cultural production, as creators and businesses must conform to platform policies and algorithms to reach audiences or generate income. o Centralized platforms impose rules for moderation, curation, and monetization, which may disproportionately impact smaller creators or those producing controversial content. Decentralization: o Despite this dominance, digital platforms also enable decentralized opportunities, particularly through tools for citizen journalism and grassroots movements. Individuals can produce and distribute content independently, bypassing traditional gatekeepers like media corporations. o Platforms like YouTube, Twitter, and TikTok allow underrepresented voices to emerge, democratizing access to audiences. However, these opportunities are often precarious, as they remain dependent on centralized platforms for infrastructure, visibility, and revenue. 
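The "winner-take-all" dynamic that drives this centralization (Nieborg & Poell 2018) can be made concrete with a deliberately simplified model: if each new user simply joins the platform where most people already are, a tiny early size advantage compounds into near-total dominance. The numbers below are invented; this is a sketch of the mechanism, not an empirical claim.

```python
# Minimal, deterministic sketch of a network-effects "winner-take-all" market.
# Simplifying assumption: every new user joins the currently largest platform,
# because that is where the people they know already are.

def simulate(initial_sizes, new_users):
    """Add users one at a time; each joins the largest platform (ties -> first)."""
    sizes = list(initial_sizes)
    for _ in range(new_users):
        largest = sizes.index(max(sizes))  # network effect: go where others are
        sizes[largest] += 1
    return sizes

# Three platforms start almost equal; a tiny head start decides everything.
final = simulate([101, 100, 100], 1000)
total = sum(final)
shares = [round(s / total, 3) for s in final]
print(final)   # [1101, 100, 100]
print(shares)  # the head-start platform ends up with roughly 85% of the market
```

Real markets add friction (multi-homing, regulation, differentiation), which is why only a few platforms succeed rather than exactly one; but the feedback loop itself is this simple.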
Contingent Cultural Commodities: Definition: o Cultural commodities on platforms are described as contingent, meaning they are flexible, modular in design, and shaped by continuous feedback loops generated through data analytics. o For instance, videos, music tracks, or articles are frequently revised, repackaged, or recirculated based on real-time user engagement data (e.g., likes, views, comments). Features of Contingent Commodities: 1. Malleability: Creators can adapt their content rapidly to align with platform trends or audience preferences. 2. Modularity: Content is often designed in fragments or smaller units (e.g., clips, highlights) that can be remixed or redistributed. 3. Feedback-Driven: Platforms analyze user data to determine which content to prioritize, influencing creators to produce content tailored to what is most likely to gain traction. Impact on Cultural Production: o This commodification transforms cultural production, making it highly adaptive but also heavily influenced by platform algorithms and audience metrics. o Creators face pressure to produce content that conforms to data-driven insights, often prioritizing popularity over originality or artistic integrity. o While contingent commodities allow for flexibility and innovation, they also reflect the precariousness of platform dependency, as shifts in platform algorithms or policies can disrupt creators’ work and income. Broader Implications: The tension between centralization and decentralization illustrates the duality of platforms as both enablers and gatekeepers. While they empower creators to reach global audiences and contribute to cultural innovation, their centralized control and data-driven frameworks fundamentally shape and constrain the nature of cultural production and distribution. This dynamic raises critical questions about sustainability, fairness, and the balance of power in the digital economy. Summary: 1. 
Key concepts: platforms, platformization, platform-dependent cultural production
2. Analyse platforms as markets, infrastructures, and governance frameworks
3. Concentration of political and economic power in traditional media companies and a few major platform companies

Key take-aways:
1. Power concentration rather than democratization
2. Platform-dependent cultural production is especially precarious because of continuous platform evolution
3. Contingent cultural commodities constitute a transformation in cultural production, distribution, and monetization.

WEEK 16 - PLATFORM LABOR
Outline of the lecture
- What is “platform labor” & why should we study it?
- Context: The Platform Labor project (2018-2023)
- Story: The rise (and fall?) of the gig economy
- Part I: The Golden Years
- Part II: The Tide Turns
- Part III: The Dawn of a New Age (?)
- Key takeaways & some comments on The Gig is Up
-------------------------------------------------------------------------------------------------------------
What is platform labor?
Platform labor is, simply put, (1) labor that is mediated through a digital platform (behind the screen), or (2) labor that helps build and maintain digital platforms (invisible labor).
Mostly paid per “gig” or “task” and performed by contracted workers (not employees)
In category (1), we can roughly distinguish four types of platform labor:
1. Creator/influencer labor
2. E-commerce labor (including ”live selling”)
3. Online gig work (freelancer platforms like Upwork and Fiverr; so-called “clickwork” or “crowdsourcing” platforms like Amazon Mechanical Turk)
4. Place/app-based gig work (ride-hailing, food & grocery delivery, cleaning)

Why should we study platform labor?
As students and researchers of Media & Information, we are interested in critically examining the societal impacts of new media technologies and algorithmic, data-driven practices.
Platform Society: Our everyday lives are increasingly mediated/governed by digital platforms, also affecting how we work and generate a livelihood
- Algorithmic management
- Data-generating labor → capital asset
- Precarious working conditions
- New opportunities for marginalized populations (migrants)
- Gig economy as “site of experimentation” and “future of work”(?)

Vallas & Schor: Platforms as “permissive potentates”*
“We argue that platforms govern economic transactions not by expanding their control over participants but by relinquishing important dimensions of control and delegating them to the other two parties to the exchange– hence the term permissive. The platform firm retains authority over important functions – the allocation of tasks, collection of data, pricing of services, and of course collection of revenues – but it cedes control over others, such as the specification of work methods, control over work schedules, and the labor of performance evaluation.” (Vallas & Schor 2020: 282)

*Permissive Potentates: Summary
Platforms operate as "permissive potentates" by combining worker autonomy with centralized control over critical aspects like task allocation, pricing, and data collection. They delegate responsibilities such as work schedules and methods to workers while externalizing costs (e.g., maintenance, insurance) and risks (e.g., job security) typically borne by employers. Although platforms advertise flexibility, this autonomy is often constrained by algorithmic oversight, rating systems, and fluctuating demand, leading to illusory freedom. Platforms profit by leveraging digital intermediation, avoiding traditional employer obligations, and exploiting a fragmented, heterogeneous workforce. This model disperses workers geographically, undermining solidarity and fostering competition.
Ultimately, platforms centralize power while appearing decentralized, creating unstable and precarious labor arrangements that blur the lines between independence and control.
According to V&S, platforms represent a “new economic architecture” that combines elements from the marketplace and the workplace.
Ex: delivery people - are they in a workplace, or are they participating in a marketplace, competing with other delivery workers?
---
Overview of my own subproject
- Multi-sited study of gig economies in Amsterdam, Berlin & New York City
- Focus on app-based food delivery & domestic cleaning services
- 8 months of ethnographic fieldwork in each city
- (Non-)Participant observation & 151 semi-structured interviews
- Lots of informal conversations & 4 multi-stakeholder workshops
- Analysis: policy reports & regulatory documents, industry reports, news sources, platform companies’ financial reporting, marketing materials, emails & User Agreements/ToS documentation, app screenshots
---
PART I – The Golden Years (2013-2017) - development of the gig economy
How can we account for the gig economy’s rise?
- Aftermath of the 2008 financial crisis (Great Recession): high unemployment and loss of personal/financial assets
- Accommodative monetary policy/low interest rates = cheap credit
- Governments embracing (tech-driven) innovation as a platform for economic development
- Cultural shift: power to the (working) people, demand for economic justice → “be your own boss” & “crowd-based capitalism”
- Introduction of iPhone & App Store; rapid expansion of mobile internet connectivity & cloud computing

Danny (28 yrs, Latino, NYC): “When Uber [Eats] started their thing, they broke in[to the market], they said "We need messengers, we need people on bikes." They gave us iPhones, they gave us batteries, they gave us bike racks, they gave us bags. They had us interview, we came in for an interview, and they said, "Boom, take this," because the app only worked with iPhones, so they just gave us an iPhone.
It was pretty much locked to only the Uber app, but still, they just handed out all this money to us. The base rate, if you were online and accepted an order it was guaranteed $30 an hour.”

How was this possible? Huge capital > supply; discounts and gifts > demand

Why did/do gig platforms attract so much capital?
The “Pump & Dump” explanation: Investors do not care about the utility or future of these companies, which are just another investment vehicle that can potentially generate large returns upon “exit”. It’s about market valuation.
The “Believers” explanation: Investors see enormous potential in these companies, due to their ability to scale quickly and leverage data analytics to create competitive advantages that result in growing market share and may lead to new products/services. They might become the next Amazon…

What Uber tells its investors about data:
“Managing the complexity of our massive network and harnessing the data from over 10 billion trips exceeds human capability, so we use machine learning and artificial intelligence, trained on historical transactions, to help automate marketplace decisions. We have built a machine learning software platform that powers hundreds of models behind our data-driven services across our offerings and in customer service and safety.” (IPO filing, 2019: 155-56)
→ The promise of automating gig work (i.e. eliminating labor costs) through Big Data & ML attracts investors.
→ It becomes possible to “convert data into money*” (Sadowski 2019, p. 11): data as a new asset class.
*Show data to investors to attract more capital

An Era of “Platform Exceptionalism”
Powered by investment capital, gig platforms took cities by storm, rolling out different operational strategies that were increasingly sensitive to national and local norms and regulations
- NYC: early phase of move fast and break things
- Amsterdam/Berlin: move more carefully, break selectively
Power of platform exceptionalism: “we’re not what you think we are, so we’re not subject to industry/sectoral regulations”
- we are a data-driven company, NOT a labor-driven company!
- Policymakers hesitated; emphasis on economic growth & competition
- The economic empowerment/opportunity narrative initially resonated widely
Resistance came mainly from industry leaders in transportation & hospitality (e.g. hotels protesting against Airbnb, or taxi companies protesting against Uber)
- In the case of food delivery and domestic cleaning, there were no such institutional actors → these were/are highly informal sectors!

During these first years, gig work was an exciting new space of economic opportunity
Erik (30 yrs, Colombian-Dutch, Amsterdam): “Deliveroo used to organize rider events, in the afternoon, once a month, where they would serve food and drinks and you could talk to your co-workers in an informal atmosphere. The general manager was also there. He’d usually give a talk, like “thanks for your effort, these are our plans for the coming time.” They also handed out small prizes, for riders who completed the most orders, or had biked the most kilometers, which was based on the statistics, their data, which they used to share with us weekly.
They gave you the feeling that they cared, that they also made an effort.” (also raised a sense of competition > subtle regulation)

PART II – The Tide Turns (2017-2020)
Investors pull back from 'gig economy' start-ups
Funding plunges as contract workers and policymakers become sceptical of business models

Some reasons for the turning tide
- Automation was obviously not going to happen (any time soon) - practical and regulatory impediments, legal troubles
- First gig worker protests started to scale as well
- Increasing critical scrutiny from media & regulators
- Increasing costs of lobbying & litigation
- Too many gig companies had lots of capital to burn, so competition was tougher than expected
- Funding was still available, but more conditional: “be more operationally strategic and cost-efficient…”
- These companies were burning through hundreds of millions of $$
- Labor costs were a primary target for “optimization”

Deliveroo introduces its “Distance-Based Fees” system
More transparency; however, workers started to be paid less and less due to regular algorithm adjustments – A WAY FOR COMPANIES TO SPEND LESS MONEY
Behavioural economics: how do people think on their feet, how can we influence/manipulate them? (e.g.
Accept = big button, Decline = small button, time limit)
As I wrote in my first blog post (2018): “More order transparency also means more data on riders’ situated decision-making processes: what will it take for a rider to accept this type of order?; how far is a rider willing to go for this payout – and what if we add this particular incentive?”

How Deliveroo uses rider data
Data are incredibly important not just as the intangible asset that you show to investors to get more money, but also in a practical sense: to control workers and to incentivize them to work.
As I wrote in that blog post: “As the regime of stable (hourly or piece-rate) wages gives way to dynamic earnings schemes, and as these dynamic earnings tend to drop over time, gamified incentives will become increasingly important to 1) keep workers from leaving the platform, 2) push them to work on specific times and in particular areas, and 3) motivate them to take on more work (i.e. work longer and faster).”

Ed (38, Puerto Rican, NYC) explains Caviar’s bait-and-switch: “Let me explain you this; shiny object syndrome is they show you what they want you to see. Let's say this key here, they show you this key, they dangle it in front of you and then they go right into your pocket […] Now what Caviar does is they show you, for example, $10 for every three orders. Right? [...]
But what they do is, a trip pays $7 and they're going to give it to you for $5, they lower the price […] Because basically what I see them doing is they cutting up your own money they was supposed to pay you, to make up for the bonus that they're giving you.”

Delivery apps like DoorDash are using your tips to pay workers' wages
Extending the predatory practice of tipped wages into the app economy
Platforms became less permissive, more manipulative, ‘authoritarian’

Gig work increasingly became migrant work
As wages decreased and working conditions worsened over time, those with alternative income options left the platforms (or put in fewer hours)
Food delivery platforms shifted to recruitment among immigrant communities, where they offer a welcome income opportunity for immigrants
→ Kayode (Nigerian doing delivery in NYC): “I have many people depend on me. I need something like easy to make the money, to start moving […] When they’re doing promo [promotional bonuses], anyone that’s promo or whatever, you swing with that one, go with that one.”
ENTREPRENEURIAL FOR MIGRANTS, EXPLOITATIVE FOR ‘PRIVILEGED PEOPLE’ (it works until it doesn’t)

What draws migrants to gig platforms?
- Lack of alternative/better labor market options
- Easy & quick (online) “onboarding” procedures (no specific requirements)
- “Indiscriminate” acceptance policies: relative lack of discrimination & low threshold for gaining platform access
- Informal waiving of formal/legal requirements: “turning a blind eye”
- Language barriers are less of an issue, both during onboarding and while working through the app
- Bonus pay (potential to make more than minimum wage) & option to “cash out” whenever you want; ability to set your own rates (cleaning)
- Relatively more work autonomy: “free login” & rejecting “offers” (flexibility)
- Referral bonuses make recruitment via diasporic networks appealing

Delivery workers are routinely required to identify themselves
Since companies have been subject to increased public scrutiny over the presence of undocumented workers on their platforms, they have come up with new surveillance tech.

PART III – The Dawn of a New Age (?) (2020 - …) – (the end of the gig economy?)
More risk and surveillance during the Covid-19 pandemic
“Contact-free delivery” was supposed to be the pinnacle of frictionlessness and health safety during the pandemic, but this new regime also introduced new burdens and risks for the rider. I was reported for not keeping to the “contact-free” rules.
Meanwhile, on the platform companies’ side > HUGE PROFITS (DELIVERY COMPANIES) > HUGE LOSSES FOR HOSTING AND TRANSPORTATION COMPANIES (E.G. AIRBNB OR UBER)
HOWEVER: MORE RISK FOR WORKERS > MORE PROTESTS > GOVERNMENT REFORM: RIDERS BECAME ACTUAL EMPLOYEES = MORE PROTECTION, LABOR RIGHTS
BUT…
1 - no more possibility to reject orders
2 - no more free login (working whenever you want)
3 - limited hours of work
4 - most riders will be deactivated (too expensive to hire them all)
5 - minimum wage
6 - students on visas will be able to work ONLY 16h per week

End of ’platform exceptionalism’: more regulatory oversight
One strategy: exit from unprofitable markets (e.g.
Deliveroo quit Germany)
An alternative model: subcontracting

The end of the gig economy (?)
No longer business as usual; the gig economy as it once existed has no future. But it has also grown “too big to fail”: many depend on it!
Gig work will increasingly look like a lot of other low-wage service work.
- The end of platform exceptionalism: increased regulatory oversight and the realization that the “gig economy” is not distinct from the larger economy
- Further consolidation: a handful of multinational firms will dominate
These firms will continue to rely on a mix of algorithmic management, gamification, and outsourced performance evaluation, but their power will become less “permissive” (Vallas & Schor)
Migration will remain a critical issue: gig platforms and migrants “need each other”, but new legislation puts pressure on this arrangement
Or maybe it doesn’t…? The ’last platforms standing’ in the industry appear, after all, to have reached profitability.

Key takeaways
- Platforms are both marketplaces and workplaces;
- This has enabled a situation I’ve called “platform exceptionalism”: platforms deviate from conventional forms of economic organization, which helps platform operators resist existing regulations (though not for very long, as we’ve seen);
- Vallas & Schor call platforms “permissive potentates”;
- Place/app-based gig work is one type of platform labor, reshaping low-wage service sectors across cities;
- Over the past decade, the gig economy has been a site of (data- and algorithm-driven) experimentation, exploitation & opportunity – especially for migrant workers.

WEEK 15 – ECOLOGY
From Jakko Kemper’s slides
Two definitions of ecology:
1. The environment, especially as it relates to living organisms. (What are a technology’s environmental effects? How does technology drive or mitigate environmental destruction?)
2. The totality of relations between an object and its environment.
Earth / Kate Crawford
The history of technology is connected to mining: San Francisco was built on gold, now runs on silicon. Echoes of this extractivism are found in modern uses of media, particularly in the training and deployment of AI.
Supply chains are opaque and “dirty”. The waste produced by digital media is comparable to older heavy industries in terms of scope and devastation.
The “cloud” hides material realities of energy and mineral use, particularly through data centres and logistical chains.
AI should be thought of as a “megamachine” comparable to the Manhattan Project to build the atom bomb.

Data and Oil / Taffel
The metaphor of the “new oil” hides data’s impact on the world. The metaphor envisions data as a natural force to be controlled or a resource to be consumed. Oil nonetheless remains very present in the digital through plastics and energy requirements.
Digital degrowth criticises:
- Bitcoin mining (based on proof-of-work, wasteful)
- Advertising platforms (requiring resource-intense ML for personalisation)
- Planned obsolescence (fuelled by marketing campaigns for the new)
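To see why Bitcoin-style proof-of-work is singled out as wasteful, here is a minimal sketch of the mechanism (the difficulty value is a toy number, nothing like Bitcoin's actual parameters): finding a valid block means brute-forcing hashes until one meets an arbitrary target, so security is purchased directly with burned computation, and therefore energy.

```python
# Minimal proof-of-work sketch: brute-force a nonce until the SHA-256 hash
# of (data + nonce) starts with a given number of zero hex digits.
# Difficulty 4 is a toy value; Bitcoin's real target is astronomically harder.
import hashlib

def proof_of_work(data: str, difficulty: int = 4) -> tuple[int, str]:
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest  # the "proof": expensive to find, cheap to verify
        nonce += 1  # every failed attempt is computation (energy) spent for nothing

nonce, digest = proof_of_work("block-data")
print(nonce, digest[:12])
```

Verifying a solution takes a single hash, while finding it takes on average 16**difficulty attempts; scaling the difficulty up across a global network of competing miners is what produces the energy consumption that degrowth critiques target.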