Regulating Big Tech Summary

Summary

This document summarizes the challenges of regulating big tech, focusing on the internet's core technologies and the role of legal expertise. It explores cyberlibertarianism, distinctions between the deep and dark web, and different regulation theories applied to online spaces. It highlights the complexities of enforcing regulations in a decentralized global network.

Full Transcript


**Regulating big tech summary**
===============================

**Week 1**
==========

### **Summary combined with the learning tasks of this week**

### **How the Internet Works: Core Technologies and the Role of Legal Expertise**

The internet operates on a complex network of technologies, including the Transmission Control Protocol/Internet Protocol (TCP/IP), the Domain Name System (DNS), and various data packet routing mechanisms. These technologies enable data to be broken into packets, routed through different nodes, and reassembled at the destination, creating a global, **decentralized information network**: *data and information are not managed by one single authority but rather spread widely. Information travels through multiple nodes, such as servers and routers distributed all across the globe, allowing an open and flexible flow of data. Because of this decentralization, the internet is resilient and cannot be brought down through one point of control.*

Legal experts do not need to be engineers, but Guadamuz, Ziewitz, and Brown agree that a basic understanding of these core technologies is essential. Knowledge of TCP/IP, DNS, and other protocols helps legal experts grasp how information travels, where jurisdictional boundaries might apply, and what technical challenges arise in enforcing laws online. Understanding these technologies also provides insight into the infrastructure that supports both the visible "surface web" and the hidden "deep" and "dark" layers of the internet, where jurisdiction and enforcement issues are especially complex. Lessig's work on code as a form of regulation highlights how the architecture of these protocols directly influences what is possible or permissible online, reinforcing the need for legal experts to understand the technological context when formulating or interpreting internet regulations.
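The packet-switching idea described above can be shown in a small, purely illustrative sketch (this is not real TCP/IP, and the function names are invented for illustration): a message is broken into numbered packets that may travel through different routes and arrive out of order, and the destination reassembles them by sequence number.

```python
# Illustrative sketch only -- not a real TCP/IP implementation.
import random

def to_packets(message: str, size: int = 4) -> list[tuple[int, str]]:
    """Break a message into (sequence_number, chunk) packets."""
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets: list[tuple[int, str]]) -> str:
    """Reorder packets by sequence number and join the chunks."""
    return "".join(chunk for _, chunk in sorted(packets))

packets = to_packets("Hello, decentralized internet!")
random.shuffle(packets)  # packets may arrive out of order via different routes
assert reassemble(packets) == "Hello, decentralized internet!"
```

No single node needs the whole message; each packet carries enough information (its sequence number) for any destination to rebuild the original, which is what makes the network resilient to the loss of individual routes.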
### **Cyberlibertarianism and Its Influence on Internet Regulation**

**Cyberlibertarianism** *is the belief that the internet should remain an open, unregulated space free from government interference, rooted in the idea that the internet can self-govern through community norms and decentralized control*. This ideology emerged in the early days of the internet, with advocates arguing that government regulation would stifle innovation, freedom of expression, and the potential for digital communities to manage themselves. As discussed by Guadamuz and Ziewitz & Brown, cyberlibertarianism significantly shaped the initial resistance to regulation and influenced early internet policies that prioritized freedom and openness over control. Cyberlibertarianism slowed the development of internet regulation because its proponents held that regulation interfered with innovation and freedom, and that regulation was unnecessary because the internet could regulate itself. However, as internet use expanded and issues like cybercrime, privacy violations, and content moderation arose, it became evident that some regulation was necessary to protect users and uphold laws. The cyberlibertarian influence still permeates debates around net neutrality, encryption, and data privacy, as proponents continue to argue for minimal intervention, while critics point out the need for rules to curb abuses and protect individual rights.

### **Deep and Dark Web: Distinctions and Legal Challenges**

The "deep web" and "dark web" refer to portions of the internet that are not easily accessible through standard search engines. The deep web includes content like academic databases, subscription-only websites, and corporate intranets that are hidden behind login requirements or otherwise not indexed. The dark web, however, is a small part of the deep web that requires special software, such as Tor, to access.
It provides users with greater anonymity, which is appealing for legitimate privacy concerns but also attracts illegal activities, such as drug trafficking, illegal trade, and cybercrime. The internet is divided into three layers:

- **Surface web**: *includes websites indexed by standard search engines* such as Google.
- **Deep web**: *all online content that is not publicly accessible or searchable on the surface web*; the parts of the internet that are not indexed by standard search engines like Google. This includes content hidden behind passwords or paywalls, such as private databases or subscription services.
- **Dark web**: *a small, hidden part of the deep web which is only accessible with specialized software like Tor, which provides anonymity*. Often used for private communication and for illegal activities where identity and location are concealed.

Guadamuz and other scholars discuss the legal challenges associated with the deep and dark web, particularly regarding jurisdiction, anonymity, and enforcement. Tracking criminal activities or implementing regulations in these hidden areas of the internet proves difficult due to the high level of anonymity and technical barriers. Legal experts face challenges in applying existing laws to the dark web, as it requires a deeper understanding of the technology and often involves international cooperation to address cross-border criminal activities.

### **Regulation Theory and Its Application to the Internet**

Regulation theory is a framework that explores *how laws, norms, and standards are created and enforced to guide behavior within societies*. In a traditional setting, governments create laws and policies to regulate society, but this becomes *more difficult when it comes to regulating the internet because of its decentralized and more complex global nature*.
**Regulation theory applied to the internet**: In the context of the internet, regulation theory *addresses the ways in which governments, private entities, and even technologies themselves attempt to control online activities*.

- **Government regulation**: governments create their own sets of rules to regulate the internet and combat cybercrime, but this is quite complex because the internet operates globally, meaning that countries need to work together.
- **Private regulation**: private entities on the internet, such as Facebook or Instagram, also set their own rules and policies on their platforms.
- **Lessig's concept of "code as law"** plays a central role in this discussion, suggesting that the *architecture of the internet itself can act as a regulatory force*. For instance, algorithms can restrict access to certain content, and platforms can establish terms of service that function as de facto laws for their users.

Guadamuz examines *several theories about how best to regulate the internet*, focusing on **Lawrence Lessig's** concept of "regulation by code" and the idea of multi-stakeholder governance.

- **Regulation by code**: Lessig's theory suggests that rather than using laws, we can control online behavior through technology itself. This "code" can regulate users by *embedding rules directly into software*, like filters that prevent certain types of content from being uploaded. However, critics argue that this approach can be too rigid and lacks transparency: rules embedded in code are not flexible, and users do not know exactly which rules within the software are applied or by whom, making the system difficult for users to understand.
- Guadamuz also looks at **multi-stakeholder governance**, a model that includes governments, private companies, and the public in the regulation process. This collaborative approach seeks to balance the needs and rights of different groups, creating policies that respect both user freedoms and public safety.
This approach aims to balance the interests of all the different groups and seeks to prevent excessive control by one entity. Guadamuz applies regulation theory to the internet by exploring different regulatory models, *including state-based legal approaches, self-regulation by private entities, and hybrid models where governments and companies collaborate*. The theory is also reflected in **multi-stakeholder governance**, which involves various groups, such as governments, companies, and civil society, in the decision-making process. This approach aims to balance competing interests and create regulations that are both effective and fair, addressing the internet's global and decentralized nature.

### **Characteristics of Private Governance in Platform Terms of Service**

Private governance on the internet *refers to rules set by private entities (such as Facebook) that regulate the internet*, and it is primarily enacted through platform terms of service (ToS). **Platform Terms of Service (ToS)** *are legal agreements that users must accept to access a website or app*. Platforms like Facebook, Google, and Twitter use these ToS to set rules about acceptable behavior, content restrictions, and consequences for violations. Guadamuz and Ranchordas & van 't Schip describe these agreements as a form of "private governance," *where companies establish their own regulatory frameworks independently of governments*. This private governance model gives platforms significant power to control user behavior, but it also raises questions about accountability and **transparency**. Since ToS are often complex and opaque, users may not fully understand the rules they are agreeing to or the consequences of violations. Furthermore, because companies are profit-driven, their governance priorities may not always align with the public interest, leading to criticisms that platforms enforce their rules inconsistently or censor legitimate speech.
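Lessig's "regulation by code" point, and the ToS rules platforms enforce through software, can be made concrete with a hypothetical upload filter (the function name and the blocked term are invented for illustration): the rule lives inside the code, the user only sees its effect, which is exactly the opacity critics object to.

```python
# Hypothetical "code as law" sketch: a platform rule embedded directly in
# software. The user never sees the rule itself -- only its effect.
BLOCKED_TERMS = {"spamlink.example"}  # the "rule", hidden inside the code

def may_upload(content: str) -> bool:
    """Return False if the content violates the embedded rule."""
    return not any(term in content.lower() for term in BLOCKED_TERMS)

assert may_upload("A perfectly ordinary post")
assert not may_upload("Buy now at SPAMLINK.example")
```

Note that nothing in the rejected upload tells the user *which* rule was applied or *who* set it; changing the policy means changing the code, not publishing a new law.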
### **Algorithmic Governance and Modes of Internet Governance**

Algorithmic governance refers to the *use of automated systems and algorithms to enforce rules and manage user behavior online*. Platforms use algorithms for content moderation, recommendations, and user engagement, shaping what users see and interact with. Lessig and other scholars acknowledge that these systems act as regulatory tools, as they determine what is visible and accessible, often without direct human oversight. Different modes of governance are recognized in the literature, including:

1. **Self-governance**: platforms create and enforce their own rules, such as Facebook's community guidelines or Twitter's terms of service.
2. **Co-governance**: *collaboration between private companies and governments* to establish guidelines, as seen with European Union regulations on data privacy.
3. **Government-imposed governance**: governments set mandatory regulations that platforms must follow, such as content takedown laws.

Algorithmic governance can be efficient but is criticized for its **lack of transparency** and **accountability**. Algorithms can make mistakes, be biased, or lack the nuance needed to understand context, leading to wrongful content removal or unintended discrimination. As Guadamuz, Ziewitz & Brown, and Ranchordas & van 't Schip highlight, *regulating these systems is challenging, as it requires balancing efficiency with fairness, transparency, and accountability.*

### **Conclusion**

The combined insights of Guadamuz, Ziewitz & Brown, Lessig, and Ranchordas & van 't Schip reveal the complexity of regulating the internet. *Legal experts must understand both the technical and social aspects of the internet to create effective regulations that protect users while respecting the unique qualities of online spaces*.
Concepts like cyberlibertarianism, private governance through platform terms of service, and algorithmic governance demonstrate the shifting balance of power between users, platforms, and governments. The deep and dark web complicate enforcement efforts, while regulation theory provides frameworks for addressing these challenges. As internet technologies and user behavior continue to evolve, so too must the regulatory approaches, adapting to ensure a safe, fair, and open internet for all.

**Week 2**
==========

**Summary combined with the learning tasks.**
---------------------------------------------

**What is the concept of *intermediary liability* and what are its legal implications?**

**Intermediary liability** refers to the *responsibility that online service providers, such as media platforms or e-commerce websites, may have for the unlawful activities or content uploaded on their platforms by their users*. The concept is important for determining how much responsibility intermediaries bear for third-party content and actions. The E-Commerce Directive (2000/31/EC) provides the legal framework for intermediary liability and distinguishes three types of intermediaries, *each with different responsibilities and liability exemptions*:

1. **Mere conduits (art. 12 Directive):** *these are intermediaries that simply transmit information of users without altering or interfering with its content*. For example, Internet Service Providers (ISPs) that provide internet connectivity. **Liability exemption**: mere conduits are not liable for the content they transmit if:
   - They do not initiate the transmission.
   - They do not select the recipient of the data.
   - They do not modify the transmitted content.
2. **Caching providers (art. 13 Directive):** *an intermediary that temporarily stores data or content to make delivery to the end-user faster*, enabling faster and more efficient data transmission.
Caching = temporary storage. **Liability exemption**: caching providers are not liable for the stored content if:

- They do not modify the content.
- They comply with conditions about updating and removing content set by the original source.
- They act expeditiously to remove or disable access to the content upon obtaining knowledge of its illegal nature.

**Purpose**: this exemption ensures that data can be delivered efficiently across the internet without holding caching providers responsible for every piece of content temporarily stored.

3. **Hosting providers (art. 14 Directive):** *intermediaries that offer services to store and manage the digital content of their users on their platforms, making it possible for people and companies to create their own websites or applications on the internet*. Examples of hosting providers are YouTube and eBay. Under Article 14 of the E-Commerce Directive, hosting providers are generally not liable for the content they store on behalf of their users, provided that:
   - They do not have **actual knowledge** of illegal content. In other words, they are unaware that the content stored on their servers is unlawful.
   - Upon obtaining knowledge of illegal content (e.g., through a notice from a rights holder or an affected party), they act **promptly to remove** or **disable access** to the content.

**Purpose**: this provision encourages innovation by protecting hosting providers from liability as long as they take action when illegal content is reported or identified, balancing the need for user-generated content platforms with the requirement to prevent illegal activities. These distinctions and exemptions ensure that intermediaries can operate freely without the constant fear of possible legal repercussions, while also setting clear responsibilities for handling illegal content.
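What a caching intermediary technically does, and why the art. 13 conditions map onto it, can be sketched in a few lines (a minimal illustration; the class and method names are invented, not drawn from any real library): content is stored unmodified for a limited time to serve repeat requests faster, and a takedown hook removes it once its illegality is known.

```python
# Minimal sketch of a caching intermediary (illustrative only).
import time

class Cache:
    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds                       # how long a copy stays fresh
        self.store: dict[str, tuple[float, str]] = {}

    def get(self, url: str, fetch) -> str:
        entry = self.store.get(url)
        if entry and time.monotonic() - entry[0] < self.ttl:
            return entry[1]                          # serve stored copy, unmodified
        content = fetch(url)                         # otherwise fetch from the source
        self.store[url] = (time.monotonic(), content)
        return content

    def takedown(self, url: str) -> None:
        """Remove content 'expeditiously' once its illegality is known."""
        self.store.pop(url, None)
```

The cache never alters content and expires it on the source's terms (the TTL), which is why the directive can exempt such a purely technical, automatic service from liability for what passes through it.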
**Legal implications:** the legal implications of intermediary liability involve balancing the protection of fundamental rights, such as freedom of expression and information, with enforcing the law against illegal conduct. An important case when discussing the legal implications is **L'Oréal v eBay (C-324/09)**: in this case the CJEU addressed the liability of online marketplaces (intermediaries) for the infringement of intellectual property. The court gives key insights into the responsibility of intermediaries under the E-Commerce Directive and into how the liability exemptions work. Users on eBay sold L'Oréal products and eBay actively promoted this. L'Oréal stated this was an infringement of its intellectual property. The CJEU ruled that eBay could not fall under the liability exemptions because *eBay had actual knowledge of the infringement and did not act promptly to remove or disable this illegal activity*. The case established the responsibility of intermediaries to actively address illegal conduct once they have knowledge of it; they cannot rely on a liability exemption if they are actively engaged in promoting the conduct. The case also makes a distinction between **passive and active hosts**: passive hosts that do not promote the conduct may qualify for a liability exemption. eBay, however, promoted the sales and played an active role, which directly meant that it could not rely on a liability exemption.

**Aleksandra Kuczerawy: "From Notice and Take Down to Notice and Stay Down"**

Aleksandra Kuczerawy, in her analysis "From 'Notice and Take Down' to 'Notice and Stay Down,'" explores the significant concerns surrounding intermediary liability, particularly how it can impact freedom of expression on the internet. The debate centers around the methods intermediaries use to deal with illegal or infringing content, and the potential risks associated with overly strict liability rules.
- **Notice and take down**: when an intermediary is notified of illegal activity, it must promptly remove the content. Its responsibility ends there; it is not required to monitor any further once the content is removed.
- **Notice and stay down**: a more stringent system. After removing the illegal content upon notice, the intermediary is required to prevent similar illegal content from appearing again. This would require more stringent monitoring and proactive measures to prevent reappearance.

**What are the policy objectives of the E-Commerce Directive?**

The E-Commerce Directive aims to foster the growth of electronic commerce across the EU by ensuring a harmonized and predictable legal framework. It focuses on:

1. **Facilitating the internal market:** by removing barriers to cross-border online services, the directive aims to promote the free movement of information society services.
2. **Legal certainty and trust:** the directive provides rules that enhance trust in online environments, establishing clear limitations of liability for intermediaries.
3. **Promoting innovation and competition:** it seeks to provide a balanced regulatory environment where intermediaries can flourish without being overburdened by liability risks. Because intermediaries are not responsible for all actions of their users, which saves a lot of time, they can focus more on innovating.
4. **Consumer protection and rights:** the directive protects users by ensuring online platforms handle illegal content appropriately while safeguarding fundamental rights. Under intermediary liability, platforms have an active obligation to handle illegal conduct once they are notified of it.

**Arno Lodder** focuses on the importance of a good *balance between regulatory requirements on the one hand and promoting innovation and expression on the other*. The E-Commerce Directive protects intermediaries, such as e-commerce websites and social media platforms, against **excessive liability** for user activity.
This protection is crucial because if intermediaries faced heavy legal burdens for every piece of content shared by their users, they would be forced to constantly monitor and possibly over-censor content to avoid legal consequences. *Such over-regulation could stifle innovation and restrict the open exchange of ideas, turning the internet into a heavily monitored and less creative environment*. By limiting the liability of intermediaries, the E-Commerce Directive ensures that the internet stays a dynamic space for communication and commerce, where new platforms can enter the market without the constant fear of legal repercussions.

**How has the CJEU interpreted some of these policy objectives?**

The Court of Justice of the European Union (CJEU) has played a key role in interpreting the E-Commerce Directive's policy objectives, balancing the interests of rightsholders, platforms, and users.

1. **C-324/09, L'Oréal v. eBay:** this case clarified the directive's limitations on liability, stating that platforms may not be held liable for third-party content if they are merely passive hosts without knowledge of illegal activity. However, if they are found to have played an active role (e.g., optimizing sales), they might lose their exemption. This ruling reinforced the objective of maintaining a predictable legal environment while protecting intellectual property rights.
2. **C-298/07, Bundesverband der Verbraucherzentralen v Deutsche Internet Versicherung:** this ruling focused on the obligation of intermediaries (online service providers) to comply with **consumer protection law**. The Bundesverband argued that Deutsche Internet Versicherung was not *transparent*, as its terms and conditions were unclear and hard to access. The CJEU held that online service providers must comply with consumer protection law, which requires transparency.
The relevance of the case is that intermediaries must not only respond promptly to illegal activities they are notified of, but also proactively ensure that their platforms meet consumer protection standards.

3. **C-18/18, Eva Glawischnig-Piesczek v Facebook:** a significant CJEU ruling that addressed the **obligation of social media platforms to remove defamatory content**. Eva Glawischnig-Piesczek, an Austrian politician, demanded that defamatory content about her, and equivalent content, be removed from Facebook. The CJEU ruled that a platform such as Facebook can be obliged to remove not only the defamatory content itself, but also **identical or equivalent content**, globally. Social media platforms could thus be obliged to proactively remove such content. The court's decision in this case raised questions about the risk of over-censorship and freedom of expression, because the court ruled that platforms have a proactive obligation to remove harmful content **(challenging the directive's objective of balancing rights)**.

**Do platforms have a general monitoring duty under the E-Commerce Directive?**

No, platforms do NOT have a general monitoring duty under the directive. **Art. 15 of the Directive** explicitly prohibits member states from imposing obligations on platforms to actively monitor content or seek out illegal activities. The idea behind this provision is to **protect consumer privacy** and to *prevent platforms from being burdened with excessive monitoring responsibilities*, thereby **protecting the free exchange of ideas**. However, the Glawischnig-Piesczek v Facebook case discussed above **suggested a shift towards a more proactive obligation for platforms**: the CJEU stated that identical or equivalent content must be removed by platforms. This is not the same as a monitoring duty, but it still requires a more proactive responsibility from the platform.
**What transparency obligations do intermediaries have in relation to e-commerce according to the E-Commerce Directive?**

The E-Commerce Directive sets out specific transparency conditions to promote **consumer protection and trust** in the online marketplace. These conditions are designed to ensure that consumers can make well-informed decisions and that intermediaries act in a reasonable and clear manner. There are several key transparency requirements:

1. **General information obligation (art. 5 Directive):** intermediaries must make their identity, registration and contact information easily accessible to users and authorities. Users must be able to easily find and contact the provider, ensuring accountability.
2. **Commercial communications (art. 6 Directive):** intermediaries must clearly identify any commercial communications, such as advertisements, to the user. *It should be obvious when content is sponsored or paid for by a third party, so consumers know when it is advertising*.
3. **Contractual information (art. 10 Directive):** platforms must provide users with clear and easily accessible terms and conditions before entering into a contract. This includes details about how contracts are formed, any applicable terms, and how users can correct errors before making a binding commitment. This obligation ensures that consumers are well informed about their rights and obligations when engaging in online transactions.
4. **Terms and conditions:** in **Bundesverband v. Deutsche Internet Versicherung** the CJEU emphasized the importance of transparency in terms and conditions, which should be in line with consumer protection law. Terms and conditions must be presented to the user in a transparent, clear and easily accessible way.
**Daphne Keller** emphasizes that the **E-Commerce Directive**, which sets rules for how online intermediaries handle content and ensure transparency, must also be consistent with other important EU laws, especially **data protection regulations** like the **General Data Protection Regulation (GDPR)**. This means that while platforms are obligated to be transparent and remove harmful content under the E-Commerce Directive, they must also be careful about **how they process and manage user data**, ensuring compliance with GDPR standards. Integrating Keller's framework is important to ensure that the enforcement of one set of rules (like removing illegal content) does not breach another set of rules (like protecting personal data). This balance is important to make sure that consumer rights and privacy are protected and that online intermediaries act responsibly and lawfully. **In short**: according to Daphne Keller, it is important to look at the E-Commerce Directive, but we should also look at other EU legal principles to make sure that living up to the rules of the directive does not breach another set of important digital EU rules.

**Conclusion:** Overall, the E-Commerce Directive plays a crucial role in shaping intermediary liability in the EU. Its provisions are carefully balanced to promote innovation, protect consumers, and uphold fundamental rights. The CJEU's interpretations have reinforced these objectives but have also introduced new complexities, especially in cases involving content moderation and global takedown orders.

**Week 3: Digital constitutionalism and Consumer protection** (Jakob van Kerkhof & Catalina Goanta)
====================================================================================================

**Abstract**

Traditionally, digital constitutionalism focused on regulating the power of private actors in the digital realm in a constitutional manner.
However, the New Social Media is characterized by a more commercial structure with **content monetization structures**. Therefore, it makes sense to turn to **consumer protection for digital constitutionalism**. Consumer protection seeks to protect consumers' rights and sets *mandatory limitations* on the freedom of private actors. Consumer protection law is used to *level the imbalance in power between consumer and platform and to provide a framework for safeguarding rights in the digital economy*. Consumer protection became an official right in **art. 38 CFREU**, recognizing its importance. This article investigates the role consumer protection law can have in digital constitutionalism, creating a new facet: **commercial constitutionalism**.

**Introduction**

Nowadays our society is mostly online. This fast growth brought with it concerns about the decay of classical government systems, such as the bilateral relationship between **citizen and state**. This relationship was governed by constitutional laws that aim to limit the power of the state and protect the rights of citizens. However, the internet is not governed primarily by states but by private actors (*new governors*), who intermediate and shape the internet; this requires a rethinking of constitutionalism, turning it into **digital constitutionalism**: *addressing the constitutional gap in governance performed by digital private actors*. In the digital realm, private actors such as social media companies govern online spaces quite similarly to how state governance works, but without the system of checks and balances that usually comes with it. Digital constitutionalism is a field that is usually dominated by themes such as the protection of free speech on online platforms. However, the internet is rapidly evolving with new ways to share activity and spread information, which leads to new constitutional problems.
A general trend that can be observed is **increased commercialization**. *More and more often, social media platforms function as online marketplaces for consumers*. Social media have become the most important consumer spaces. The social media platforms that developed into online marketplaces are filled with advertisements, product placement and subscriptions; **this commercialization is shaping the New Social Media**. In exploring how the New Social Media fits into digital constitutionalism, the paradigm of **commercial constitutionalism** is used. Consumer protection forms the most important pillar of this constitutionalism. This article focuses on digital constitutionalism in the **EU context**, where consumer protection is an important right citizens have under **art. 38 CFREU**. Digital constitutionalism is aimed at limiting private power in the digital realm; therefore consumer protection can form an important consolidation in protecting weaker parties against powerful technology companies. *The article explores how consumer protection can shape commercial constitutionalism as a form of digital constitutionalism that is better at dealing with the increase of monetization on social media*.

**Defining digital constitutionalism in the social media consumer context**

**Constitutional gap:** consumer protection is a right that is not well defined. Traditionally, the government has the task of enforcing laws and protecting individuals. However, society is digitalizing rapidly and moving more and more online. In the digital realm the traditional legislator still holds this responsibility, but faces certain limitations. Big tech companies and social media platforms have assumed roles that were originally held by governments; private social media platforms must fill in those gaps. However, these private actors are not held to the same constitutional principles states are bound by.
This shift leaves a **constitutional gap**, where private actors exert immense power but are not bound by constitutional principles that ensure fairness, transparency, and accountability. Digital constitutionalism can be seen as *efforts to fill this gap* by adapting constitutional principles to the digital realm. Its goal is to create frameworks that protect users' rights against private powers, just as constitutional frameworks protect citizens against the state. There are **three perspectives on digital constitutionalism**. This article proposes *commercial constitutionalism, which can be discussed from each of the three perspectives below*:

1. **Liberal (traditional) constitutionalism:** offline as well as online, consumers must be assured of a sufficient level of protection, which the legislator can strive towards by defining norms of consumer protection. This perspective extends existing constitutional principles into the digital realm, translating protections such as freedom of speech into the online world.
2. **Societal constitutionalism:** *a lens for looking at constitutional processes arising outside of the traditional constitutional sphere*. It examines how **quasi-public entities** such as Facebook or Google control infrastructures, set out rules for interaction and protection, and align those rules with public values. *That is why societal constitutionalism is a good way to examine the role social media platforms take in guaranteeing online consumer protection*.
3. **Global constitutionalism:** *local constitutions of locally rooted traditions have to compete with transnational organizations that create their own constitutional processes*. The internet is global, yet most constitutional frameworks are bound by national borders. *Global constitutionalism seeks solutions for applying territorial constitutional principles in a more global context.* This lens is useful because the article only looks at a limited area: the EU territory.
However, this consumer protection norm can easily be exported outside the EU due to the global nature of the internet, as well as the [Brussels effect]: European consumer protection norms can, for example, function outside the jurisdiction of the Union. This contribution takes [primarily a liberal constitutionalist and a societal constitutionalist] lens in formulating **commercial constitutionalism**. It seeks to build on existing norms of consumer protection as a fundamental right and translate those to the digital realm (liberal), as well as to examine how actors in the digital realm have developed their own processes for creating consumer protection that fits the reality of digital platforms (societal).

**Consumer protection and human rights in the European Union**

Consumer protection has been a policy and regulatory objective of the European Union and its predecessors for the past 50 years. Designed inter alia as a strategy to unify the markets of Member States in the [pursuit of a greater internal market], *consumer policy has generated regulatory instruments promoting consumer rights and reducing private power*. In this section, we discuss the conceptual development of consumer protection in the European Union from a private right to a more public, fundamental right.

**Consumer protection as a human right**

The adoption of consumer protection as a human right came in [three] waves. Consumer protection evolved from a *policy tool into a fundamental right* officially recognized in [art. 38 CFREU]. At first, consumer protection focused on ensuring fair competition; now it focuses mainly on individuals' dignity and autonomy. Including consumer protection among human rights has led to [academic debate:] some feared that the inclusion of consumer protection *diluted human rights (verwaterd)* by adding new rights that were only relevant for wealthy states.
Human rights are usually directed against state power, not against the power of big companies. On the other hand, all citizens are consumers when they participate in the free movement of goods and deserve some form of human dignity as consumers. To investigate whether consumer protection is a fundamental right, Deutch introduced two tests, a [procedural and a substantive test]: *the procedural test requires the fundamental right to be adopted in an international fundamental rights instrument, whereas the substantive test addresses the contents of the right as being generally 'important'*. Deutch concludes that consumer protection is, for now, a *[soft human right]*. A human right has three characteristics: (1) it should pertain to the entire human community and not one group, such as consumers, (2) human rights characterize the individual as the primary concern and (3) [human rights are rights of the individual against the power of governments]. Consumer protection protects the citizen/consumer against big companies, not governments. Although it cannot be characterized as a fundamental human right, consumer protection provides precisely the protection needed in the search for digital constitutionalism (protection against big private companies).

**Consumer protection and the EU Charter**

Whether consumer protection deserves recognition as a human right at the international level is outside the scope of this article. However, consumer protection has already been recognized as a fundamental right in the EU Charter (at the European level, in [art. 38 CFREU]). Under this article, states are obliged to ensure a high level of consumer protection. However, it is regarded more as a guideline than a subjective right: consumers cannot directly rely on it. It does reflect the Union's commitment to ensuring consumer welfare. [Harmonization efforts], outlined in [Article 114 TFEU], aim to unify consumer protection laws across Member States for a cohesive internal market.
EU consumer law draws from both economic integration and fundamental rights objectives, blending market efficiency with protections for individuals.

**The Charter and the consumer acquis**

The early idea of consumer protection was shaped around five basic rights. Over time this has developed into a *number of consumer protection instruments known as the **[consumer acquis]***. It encompasses key directives that protect consumers, for example against unfair conditions, and consists of norms that may not be deviated from. This includes instruments such as the Consumer Rights Directive (CRD), the Price Indication Directive and the **[Unfair Commercial Practices Directive (UCPD)]**. In this part we're focusing on the UCPD as an example of consumer protection at the European level in the digital realm. The UCPD is a [maximum harmonization] instrument for the business-to-consumer setting. The UCPD, a cornerstone of the acquis, prohibits commercial practices that distort consumer behavior through deceit or coercion. It applies broadly to digital issues like data misuse, unmarked sponsored content, and native advertising. The directive adapts to emerging technologies and commercial practices, [ensuring transparency and fairness]. *The notion of 'fairness' in the UCPD is NOT a moral stance but mainly serves to ensure proper competition*.

**Unfairness is defined in Article 5(2) in [the form of a test]:** *a practice is unfair [if it is contrary to the requirements of professional diligence], and if it materially distorts or is likely to materially distort the economic behaviour with regard to the product of the [average consumer]* whom it reaches or to whom it is addressed. The UCPD also offers heightened protection for the *'vulnerable consumer'*. Both requirements need to be satisfied to make a practice unfair. '*Professional diligence*' is the standard of special skill that may be expected of a trader acting in accordance with honest market practice and good faith.
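The cumulative, two-pronged structure of the Article 5(2) test can be sketched as a simple boolean check. This is only a study aid, not a legal implementation: the names and the `dataclass` model are our own, and real assessment of each limb is a fact-sensitive legal judgment.

```python
from dataclasses import dataclass

@dataclass
class CommercialPractice:
    """Hypothetical model of a B2C commercial practice assessed under the UCPD."""
    contrary_to_professional_diligence: bool  # limb 1: breaches honest-market-practice standard
    materially_distorts_behaviour: bool       # limb 2: (likely to) distort the average consumer's economic behaviour

def is_unfair_under_art_5_2(p: CommercialPractice) -> bool:
    # Both limbs of the Article 5(2) test must be satisfied for unfairness.
    return p.contrary_to_professional_diligence and p.materially_distorts_behaviour

# A practice that breaches diligence but does not distort behaviour fails the test.
print(is_unfair_under_art_5_2(CommercialPractice(True, False)))  # False
print(is_unfair_under_art_5_2(CommercialPractice(True, True)))   # True
```

The point of the sketch is the conjunction: neither limb alone makes a practice unfair under Article 5(2).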
**Challenges in the digital sphere:** platforms often act as intermediaries enabling unfair practices, raising questions about their accountability under consumer law. Issues like hidden costs ("free" services funded by personal data) and deceptive marketing practices *highlight the growing need for robust enforcement in digital marketplaces*. The UCPD is a crucial instrument for protecting consumers on online platforms: the directive covers *all stages of contact* between the user/consumer and the platform, from the agreement on processing personal data to the termination of a contractual relationship.

**The UCPD protects consumers on two fronts**:

- First, it safeguards their rights in contractual relationships with platforms, including pre- and post-contractual stages, addressing deceptive practices like misleading terms of service.

- Second, it regulates harmful content, such as unmarked native advertising, emphasizing fairness and transparency, and requiring platforms to align their practices with consumer protection standards.

As the UCPD example shows, in combination with the other mandatory norms of the Charter such as art. 38 CFREU, consumer protection makes a good instrument for rebalancing private power on the internet, aligning with digital constitutionalism.

**The rise of commercial constitutionalism**

As said before, in the pursuit of limiting the power of private platforms, consumer protection can fulfill a role in the search for digital constitutionalism when explored from a European perspective. Scholars of competition law have stated that 'the big business organizations should be treated less as consumer individuals, and more like governments who control the consumers'. Yet so far, the commercial dimension of digital constitutionalism has remained implicit in the market power exercised by social media platforms.
This section makes the commercial dimension explicit by introducing the concept of ***commercial constitutionalism***, which relies on the recognition of consumer protection as a fundamental right in the digital realm. Commercial constitutionalism is built on [two elements of digital constitutionalism:]

1. **Articulation of fundamental rights** (in the digital realm): *translating existing/analog consumer protection into digital contexts as a fundamental right ensures legal recognition and enforcement mechanisms*; this fits well with [liberal constitutionalism].

2. **Limitation of (private) power**: consumer protection traditionally aims to limit power in favor of the consumer. This element can be found in all three perspectives on digital constitutionalism discussed above. The big private actors on social media platforms govern and enforce rules on the internet much as governments would do in an offline society.

Platforms wield significant control over users, from data-driven decision-making to monopolistic practices. Consumer protection laws can counterbalance this power, ensuring transparency and fairness. The primary challenge of digital constitutionalism is limiting the power of (private) actors. *Commercial constitutionalism focuses on limiting private power, particularly that of powerful digital actors like social media platforms, which govern online spaces in ways similar to states in offline society*. This approach addresses the fundamental imbalance between platforms and users, where platforms derive significant power from practices like "pay-with-your-data" transactions and consumer contracts. These contracts often challenge fundamental rights, positioning consumer protection law as essential for safeguarding users' interests. Consumer protection law mitigates the vulnerabilities of consumers by addressing the opaque power structures and complex business models of social media platforms, especially in the context of [increasing commercialization].
Examples include influencer marketing, subscriptions, and social commerce features like Instagram's Checkout. These developments blur the lines between citizens and consumers, creating tensions around legal classifications and emphasizing the need for norms that protect users' rights in these monetized digital spaces. Commercial constitutionalism, which recognizes consumer protection as a fundamental right, addresses all these tensions. *Consumers require protection against the trader because they are the weaker party*; this is even more necessary in the [New Social Media], where the power imbalance is exacerbated (verergerd) by the monopolistic power social media platforms hold over their users. This power is twofold:

- Because of the dominance of these platforms and their connectivity, [users can hardly switch to other platforms].

- Secondly, platforms hold a lot of power over their users (consumers) by virtue of the users' lack of privacy, their dependence and the platform architecture.

Together this creates ***'vulnerability by design' for the user***: the nature of the internet and social media platforms causes consumers to be ['vulnerable by default'] when they act on the internet; consumer protection as a fundamental right is therefore highly needed. [Commercial constitutionalism] thus shapes itself as a new facet of digital constitutionalism with a [dual nature] that covers the debate of this chapter:

- **Consumer protection as [mandatory limitation]:** *clear rules and remedies that limit private power* and establish legal certainty, harmonizing business conduct across Member States.

- **[Constitutional recognition] of fundamental rights:** the EU Charter and national constitutions see consumer protection not merely as a tool to protect consumers but as a *holistic right that consumers have*.
*Treating consumer protection as a fundamental right ensures user welfare in an increasingly transactional online space.* This approach redefines digital constitutionalism to address the dual challenges of commercial practices and fundamental rights violations, fostering a digital space that protects both consumers and citizens.

**Conclusion**

This chapter has looked at a new facet of digital constitutionalism, which we penned as commercial constitutionalism, to better fit the New Social Media, an increasingly monetized space where commercialism becomes more visible. We've explored the [commercial facets of digital constitutionalism], as well as the [need for consumer protection as a fundamental right] due to the great power imbalance on digital platforms.

**Week 4: Dark patterns**
=========================

**What are dark patterns?**

The [definition of dark patterns] is that they are '***deceptive or manipulative design elements** in [user interfaces] (UI) that influence users to make choices they might not otherwise make if fully informed or free from manipulation*'. Dark patterns use a range of manipulative tactics that influence user behavior by [exploiting cognitive biases], often at the expense of the customer's well-being. A *cognitive bias* is a mental shortcut or error in thinking that influences how people perceive or process information, often leading to irrational decisions.

**Key characteristics of dark patterns:**

- Manipulation over choice: dark patterns steer consumers in a way that mostly benefits the business, such as making unnecessary purchases or sharing personal data. They often remove customer-friendly options.

- Deceptive presentation: information about choices is presented in ways that mislead the user. For example, the true cost of an action may be hidden until the last step of checkout.
- Exploitation of *cognitive biases*: dark patterns make use of the cognitive biases every person has to steer users a certain way without them being aware of it.

- Lack of transparency: they obscure important information or couch it in difficult legal language, so it is hard for the customer to understand.

Examples of dark patterns are [hidden charges] (in the last checkout step), [urgency messages] (creating a false sense of scarcity) or [privacy intrusions] (using customers' private data unless they opt out in a deliberately difficult way). Dark patterns are often used on [e-commerce platforms, social media and subscription services], where they influence consumers for business gain.

**An arising problem** is that as technology evolves, dark patterns become more sophisticated: the manipulative designs grow subtler, which makes it harder for the consumer to recognize them. For example, with **advances in data collection**, businesses can build more personalized interfaces, tailoring dark patterns to the specific consumer. Another example is that AI and deep learning make **behavior prediction** easier, which makes manipulation easier still. This evolution *makes it increasingly difficult for users to recognize when they are being manipulated, as dark patterns are crafted to feel intuitive and natural within the digital experience.* The more sophisticated these tactics become, the more challenging it is to draw the line between persuasive design and unethical manipulation.

**What is web measurement? Insights from the Princeton study**

**What is web measurement?** Web measurement involves the *[systematic collection and analysis of data] from websites to understand how they work, how they interact with users and how they influence user behaviour*. It is particularly useful for identifying online patterns such as the manipulative designs at issue with dark patterns.
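A minimal sketch of what the analysis step of such measurement can look like, assuming a crawler has already fetched a page's HTML. The function name and the list of signal phrases are our own illustrative assumptions, not the actual methodology or patterns used in any study.

```python
import re

# Hypothetical signal phrases often associated with urgency-style dark patterns.
URGENCY_PHRASES = [r"only \d+ left", r"offer expires", r"\d+ people (are )?viewing"]

def scan_for_urgency_signals(html: str) -> list[str]:
    """Return the urgency-style phrase patterns found in a fetched page."""
    # Crude tag stripping, sufficient for this sketch; real crawlers render pages.
    text = re.sub(r"<[^>]+>", " ", html).lower()
    return [p for p in URGENCY_PHRASES if re.search(p, text)]

page = "<div>Hurry! Only 3 left in stock. Offer expires in 10 minutes.</div>"
print(scan_for_urgency_signals(page))
```

Run over thousands of fetched pages, a scan like this yields the kind of aggregate prevalence data that makes large-scale studies possible.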
In the **[Princeton study]**, web measurement was achieved through **automated web crawlers**: *online bots that systematically visit websites and index their content*. The bots gathered information by simulating user interaction with the interfaces and observing how the platforms presented information and influenced user behaviour.

**Insights from the Princeton study: *"Dark patterns at scale: Findings of a crawl of 11K shopping websites"***

The study represents one of the largest empirical investigations into dark patterns on e-commerce websites. It [used automated web crawlers] to analyze 11,000 shopping websites, identifying over 1,800 instances of dark patterns across 15 types. This large-scale research provided unprecedented insights into the [prevalence (mate waarin iets voorkomt), design, and mechanisms of these manipulative tactics]. The study yielded the following insights on dark patterns:

a. **Prevalence of dark patterns:** the research led to the conclusion that 11% of the websites used at least one dark pattern. It also turned out that bigger, more visited websites used more dark patterns: the visibility of a website correlated with the use of manipulative designs.

b. **Categorization and taxonomy:** the study developed a taxonomy (indeling) of dark patterns in 15 types under 7 categories, including *sneaking, urgency, misdirection, obstruction and forced action*. Each category represents a different influence mechanism.

c. **Exploitation of cognitive biases:** dark patterns were found to rely [on the *cognitive shortcuts*] that consumers use when making decisions. These patterns often target shortcuts like the scarcity bias (fear of missing out) or the default effect (sticking with pre-selected options).

d. **Role of third-party enablers:** the Princeton study uncovered 22 third parties that offered dark-pattern designs to businesses.
These third-party services offered templates for implementing manipulative designs, making it easier for smaller companies to use these tactics as well. The study underscored (onderstreept) the importance of systematic web measurement in uncovering these practices at scale, providing data-driven evidence for regulators and researchers to address the issue. The Princeton study illustrates how automated, data-driven techniques (such as web measurement) can empower efforts to combat dark patterns, protect user autonomy, and foster ethical digital environments.

**The findings above have the following significance:** the study highlighted [the extensive use of dark patterns on e-commerce] websites and provided regulators with empirical information to enforce policy changes. It underscored the prevalence of dark patterns online and showed how vulnerable the consumer is when making informed decisions. The idea of the Princeton research was to stimulate action in mitigating online dark patterns.

**Understand the prevalence of dark patterns on shopping websites.**

Dark patterns are prevalent on shopping websites and are deliberately deployed by businesses to maximize revenue. These designs exploit the psychology of the consumer and the digital environment to influence the user's decisions. As the Princeton study found, 11% of e-commerce websites use dark patterns, which means this is not incidental but widespread across the internet. Dark patterns are mostly used on [popular, highly visited websites], where their impact reaches a larger group of users: there is a **correlation between visibility and dark patterns**.

**Common examples on shopping websites**

1. **Urgency tactics**:

- Many shopping websites use false **countdown timers** or messages such as "Offer expires in 10 minutes" to create artificial pressure on users to make immediate purchases.
These timers often reset upon refreshing the page, exposing their deceptive nature.

- Statements like "Only 2 items left in stock" leverage **scarcity bias** to make products seem more desirable, even if the stock information is fabricated.

2. **Hidden costs**:

- Websites often reveal additional fees, taxes, or shipping costs only at the final checkout stage, exploiting users' **commitment bias**: the reluctance to abandon a purchase after investing time in the process.

3. **Subscription traps**:

- Offers such as "free trials" that automatically convert into paid subscriptions are prevalent. These traps make cancellation intentionally difficult, with users often unaware of recurring charges until they occur.

4. **Social proof**:

- Fake or unverifiable activity messages like "5 people bought this item today" or "This is trending now" create an illusion of popularity and urgency, influencing users to follow perceived social trends.

**Why shopping websites are a key target**

There are a couple of reasons why shopping websites are a key target for dark patterns. Firstly, the [volume of transactions is very high] on these platforms: every day millions of users make purchases, making manipulative designs highly lucrative. Secondly, there is [minimal oversight] on these platforms: regulatory frameworks struggle to monitor them, which makes unethical practices like dark patterns easier to deploy. The widespread use of dark patterns has **consequences for the autonomy of the consumer**: the consumer is, first of all, [financially harmed] by these dark patterns, making more unnecessary purchases. Dark patterns also lead to more [mistrust in e-commerce platforms:] encountering unfair practices like hidden costs lowers trust in digital platforms.
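The countdown-timer reset described under "Urgency tactics" suggests a simple detection heuristic: load the page twice and check whether the timer counted down as a genuine one would. This is a simplified sketch under our own assumptions; the function name and tolerance are illustrative, not from any study.

```python
def timer_looks_fake(first_load_seconds: int, reload_seconds: int,
                     elapsed: int, tolerance: int = 5) -> bool:
    """Flag a countdown as suspicious if, after `elapsed` seconds, a reload
    shows a value far from the expected (first_load - elapsed) seconds."""
    expected = first_load_seconds - elapsed
    return abs(reload_seconds - expected) > tolerance

# Timer showed 600s; we reloaded 60s later and it reads 600s again: likely fake.
print(timer_looks_fake(600, 600, 60))  # True
# A genuine timer would read about 540s after 60 seconds.
print(timer_looks_fake(600, 541, 60))  # False
```

The same two-observation idea generalizes to other urgency claims, e.g. checking whether "Only 2 items left" ever changes.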
In conclusion, the high prevalence of dark patterns on shopping websites reflects a deliberate strategy to influence user behavior in favor of businesses. These manipulations exploit vulnerabilities in user psychology, **emphasizing the need for robust consumer protection** mechanisms and greater transparency in digital commerce.

**Different taxonomies (indeling) of dark patterns and the consumer protection applicable to them (UCPD)**

[Taxonomies] are *frameworks for categorizing and understanding dark patterns based on their characteristics, mechanisms and effects*. A taxonomy helps regulators and researchers to systematically identify these practices. The [Princeton study] developed a broader, more structured taxonomy to reflect the diversity of dark patterns:

- **Sneaking**: hiding or disguising actions that users might not consent to, such as hidden subscriptions or additional charges.

- **Urgency**: creating artificial time pressure, like fake countdown timers or "limited stock" notices.

- **Misdirection**: highlighting one option to draw attention away from alternatives, steering users toward the desired choice.

- **Obstruction**: making desired actions inconvenient, such as hiding cancellation options behind multiple steps.

- **Forced action**: requiring users to complete unnecessary tasks to achieve their goals, like mandatory account creation.

These taxonomies align with behavioral science insights, linking **[patterns to cognitive biases]** (e.g., scarcity bias, default effect) to explain their impact on user decisions.

**Consumer protection regime applicable to dark patterns**

The European Union has several legal frameworks designed to address manipulative commercial practices such as dark patterns. These frameworks aim to protect [user autonomy, transparency and fairness] in the digital environment.

**Unfair Commercial Practices Directive (UCPD):** the UCPD is the [primary legal instrument] regulating business-to-consumer practices in the EU.
It addresses deceptive, aggressive, and unfair practices that distort the decision-making of the consumer. The UCPD applies [before, during and after the conclusion of a contract]. It aims to ensure fairness in commercial practices by addressing [professional diligence, misleading actions and omissions, and aggressive practices]**:**

#### How the UCPD applies to dark patterns

1. **Article 5: professional diligence**

- This article prohibits unfair commercial practices that breach the standard of professional diligence and materially distort consumer decision-making.

- **Relevance to dark patterns**: user interfaces that mislead or restrict consumer choice (e.g., through hard-to-cancel subscriptions) may violate this standard, even if no intent to deceive is proven.

2. **Articles 6 and 7: misleading actions and omissions**

- **Article 6** covers practices that provide false or ambiguous information, even if technically correct, for instance countdown timers suggesting false urgency.

- **Article 7** addresses omissions, such as hiding critical costs until checkout, which significantly impair the consumer's ability to make informed decisions.

3. **Articles 8 and 9: aggressive practices**

- **Article 8** defines aggressive practices as those that significantly impair consumer choice through undue influence or psychological pressure. Examples include guilt-tripping language ("confirmshaming") or "click fatigue" designs that deter privacy-friendly options.

- **Article 9** provides criteria for assessing such practices, including exploitation of vulnerabilities or the imposition of excessive obstacles.

4. **Annex I: blacklisted practices**

- Specific unfair practices are outright banned, including "bait and switch" tactics. Dark patterns that promise one outcome but deliver another (e.g., hidden subscriptions) can fall under this annex.

#### **2. Consumer Rights Directive (CRD)**

The **CRD** complements the UCPD by promoting transparency and fairness in contractual relationships. Relevant provisions include:

- Requirements for clear and comprehensive information before a contract is agreed upon.

- Ensuring the **right to withdraw** from contracts within specified periods, supported by user-friendly cancellation mechanisms.

#### **3. Digital Markets Act (DMA) and Digital Services Act (DSA)**

These newer frameworks target dominant platforms and their control over digital markets. They address dark patterns by:

- Mandating transparency in choice architectures.

- Prohibiting manipulative designs that exploit user vulnerabilities.

- Introducing accountability measures for platform operators.

As we see above, the [UCPD offers tools to combat manipulative practices] such as dark patterns. However, there are some challenges and limitations in applying the UCPD to dark patterns. Firstly, [dark patterns are not specifically mentioned] in the UCPD, complicating direct application. Secondly, there exists an [information asymmetry between consumer and business:] businesses' access to data and behavioral insights creates an imbalance, making it harder for consumers to resist manipulative tactics. Thirdly, a [case-by-case assessment] is needed to determine whether a specific dark pattern breaches the directive; this can take a lot of time, leading to slow enforcement.

**Remaining legal uncertainty in applying the current legal framework.**

Despite progress in addressing dark patterns, significant uncertainties remain in applying current legal frameworks such as the UCPD. There are a couple of reasons why these uncertainties persist in applying the EU legal framework to combat the use of dark patterns by e-commerce platforms.
These uncertainties stem from the following reasons:

- **Ambiguity (meerduidigheid) in definitions:** a core issue is the [lack of a universal definition of dark patterns]. While the UCPD prohibits misleading, unfair and aggressive practices, it does not explicitly mention all manipulative interface designs. Due to the lack of a specific definition in regulatory frameworks like the UCPD, there is a reliance on interpretation, which can vary across jurisdictions and leads to **inconsistent application of the law:** some dark patterns will be penalized and others not.

- **Proof of harm:** another challenge is [proving that the use of dark patterns materially distorts consumer behaviour]. The manipulation is often very subtle, which makes it difficult to prove a direct link with the consumer's behaviour. Harm is also very subjective, which complicates matters further.

- **Adaptation to emerging tactics:** the rapid evolution of technology continuously [outpaces the legal framework.] With the use of AI, deep learning and data collection, businesses constantly refine their dark patterns. Regulatory frameworks cannot keep up with this rapid pace, leaving regulations unfit to address new dark patterns: **the law struggles to remain relevant in an innovating field**.

This legal uncertainty hampers the ability of regulators to uniformly address dark patterns, leaving consumers vulnerable to emerging forms of exploitation.

**Designing regulatory solutions:** LOOK AT *"DARK PATTERNS AND THE EU CONSUMER" FROM PAGE 12,* LARGE LIST WITH RECOMMENDATIONS.

**Week 5: gig work and the sharing economy**
============================================

**What is the sharing economy?**

The **sharing economy**, also known as the *socio-economy*, refers to an '*economic system that facilitates the [sharing] of underused resources through digital platforms*.' These platforms connect individuals for the short-term exchange of goods, services or transportation.
Examples of sharing economies are [Airbnb and Uber], which use technology to lower transaction costs and increase the shared use of resources and services. The sharing economy typically involves peer-to-peer (P2P) interaction facilitated by a technology-driven intermediary. In a sharing economy people often work in the form of ***'gig work'***: *short-term, flexible jobs where individuals are typically hired on a task-by-task or freelance basis, in contrast to traditional employees*. Gig workers are often classified as [independent workers], which means they do not receive traditional employee benefits like health care and insurance. A sharing economy has a couple of **key characteristics**:

- Digital platforms: a must for the sharing economy. It is essential that there is a digital platform where demand meets supply, such as Airbnb for accommodation.

- Underused resources: it often concerns underused resources like empty houses/rooms or idle cars.

- Access over ownership: participants often prefer temporary use of services over owning them. Users of Uber prefer temporary transportation over owning a car outright.

- Trust systems: trust is reinforced on these platforms through user ratings, reviews and profile verification.

**In which ways is the sharing economy innovative and what regulatory challenges does it pose?**

The sharing economy represents a significant shift in the way services and capital are exchanged, driven by innovations in technology, accessibility, and social trust. It enables providers to connect directly with customers through digital platforms, without the intervention of traditional intermediaries: this *reduces costs, enhances efficiency and opens up new markets*.
First, we'll discuss the **innovative aspects of the sharing economy:**

- **Democratization of access:** *platforms like Uber and Airbnb [lower the barriers to entry], making it possible for everyone who possesses an underutilized asset, like a car, a room, or a skill, to participate in the sharing economy*. This creates the possibility for providers to generate income flexibly.

- **Market efficiency**: *the sharing economy uses data analytics and algorithms [to match real-time demand with supply]*. For example, ride-sharing platforms such as Uber use surge pricing (piekprijzen) to balance driver availability with passenger demand, [optimizing resource allocation]. When the demand for rides is high and few Uber drivers are available, the platform raises the price to balance the demand for rides against the rides available.

- **Trust mechanisms**: unlike traditional markets, the [sharing economy relies heavily on trust mechanisms] such as user reviews and ratings. These mechanisms foster transparency and demand accountability, which encourages good behaviour on the platforms. If an Uber driver misbehaves he gets bad ratings and reviews, which leads to fewer work opportunities; he will therefore probably act in accordance with the expectations of the consumer and the platform.

- **Cultural shift:** by emphasizing [access over ownership], the sharing economy aligns with modern values of sustainability and resource conservation, challenging a consumer culture that prioritizes acquisition and ownership.

Overall, the innovations of the sharing economy make services more accessible, affordable and environmentally conscious while reshaping traditional economies.

Now, we'll discuss the **regulatory challenges of the sharing economy:** the sharing economy grows at a rapid pace, [outpacing] the existing legal regulatory framework. This creates regulatory challenges:

a.
**Legal classification:** one of the biggest challenges is how to classify these digital platforms: should they be [classified as intermediaries or as service providers] (for example, of transport)? *This classification affects the regulatory obligations they have*. For example, the CJEU ruled that Uber qualifies as a provider of transport services because it had a lot of influence in determining the price and operational conditions, whereas Airbnb was ruled to be an intermediary because of its limited influence on the transactions.

b. **Consumer protection and liability:** traditional industries like taxis or hotels operate under a clear regulatory framework that ensures public safety, quality and liability coverage. Sharing-economy platforms often bypass these legal frameworks and fall into *[legal gray areas]*. This can expose consumers to risks such as unsafe conditions or inadequate accountability, because it is not certain that hosts or drivers meet the same industry standards.

c. **Unfair competition:** traditional industries such as taxis or hotels are direct service providers which must comply with strict rules, including licensing and insurance, which raises their costs. *In contrast, sharing-economy platforms often argue that they are [intermediaries] rather than direct service providers, [so they can operate under less stringent rules]*. This gives the platforms an unfair advantage, because they can offer lower prices and more flexible services without bearing the same strict regulatory burden, creating [unfair competition] with the traditional industries.

d. **Adapting to rapid change:** law and regulation often evolve slowly, while technology and economic models develop very rapidly. *This creates problems because [traditional laws are not designed to address the new problems] that come with the new economies*. 
This lag can leave policymakers struggling to address new challenges, such as the rise of "gig work" and its implications for labor rights and benefits. **Regulators face a difficult balancing act:** encouraging the innovation and advantages of a sharing economy while at the same time ensuring public safety, fair competition and consumer protection. Over-regulation could stifle the growth of these platforms, while under-regulation could lead to loopholes and inequities (ongelijkheden). This asks for a flexible regulatory framework capable of accommodating the dynamic nature of the sharing economy.

**What is an information society service?** The **definition of an *information society service*** (***ISS***) is given in [Directive 2000/31/EC] (**E-commerce directive**): an information society service refers to *a service provided in return for [remuneration], at a [distance], by [electronic means] and [at the individual request of a recipient]*. So an information society service has [4 elements]:

- Provided in return for [remuneration]: it has to generate direct or indirect economic value.

- Provided at a [distance]: it has to be provided *remotely*; provider and consumer are not present in the same place simultaneously.

- By [electronic means]: the service is delivered over an electronic network, like the internet.

- At the [individual request of a recipient]: most important is that the service is *initiated by the user* and tailored to their needs, like booking a ride, renting a property or buying something online.

An information society service is a legal category under EU law that helps determine how digital platforms and online services are regulated. It plays an important role in the question whether platforms like Uber or Airbnb can benefit from specific protection given in the E-commerce directive. This protection includes, for example, *reduced liability* for content uploaded on their platforms and freedom from certain national regulations. 
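The four elements above are cumulative: a service qualifies as an ISS only if all of them are met. As a simple illustration of that checklist (names are hypothetical; this sketches the structure of the legal test, not an automated legal assessment):

```python
# Illustrative sketch of the cumulative 4-element ISS test.
# All field and function names are hypothetical, not legal terminology.

from dataclasses import dataclass

@dataclass
class Service:
    for_remuneration: bool       # generates direct or indirect economic value
    at_a_distance: bool          # provider and recipient not simultaneously present
    by_electronic_means: bool    # delivered over an electronic network
    on_individual_request: bool  # initiated by, and tailored to, the user

def is_information_society_service(s: Service) -> bool:
    """The test is cumulative: every element must be satisfied."""
    return (s.for_remuneration and s.at_a_distance
            and s.by_electronic_means and s.on_individual_request)

booking = Service(True, True, True, True)  # e.g. an online accommodation booking
print(is_information_society_service(booking))  # True

taxi_ride = Service(True, False, False, True)  # offline component dominates
print(is_information_society_service(taxi_ride))  # False
```

The real analysis is of course far more nuanced — as the Uber and Airbnb cases below show, the CJEU also weighs the platform's control over the underlying service — but the checklist captures why a single missing element defeats the classification.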
So, if a platform falls under the definition of an ISS given above, it can benefit from the harmonized rules at EU level and certain benefits. The category covers services like e-commerce platforms, digital intermediaries, and online marketplaces where the primary interaction between parties occurs digitally. Uber is classified as a transport service provider because it has a lot of control over the drivers. Airbnb is classified as an ISS/intermediary and benefits from the conditions laid down in the E-commerce directive.

**Are Uber and Airbnb Information Society Services according to the CJEU?** The CJEU has ruled in two cases on whether Airbnb and Uber can be classified as an ISS under EU law. In each case the CJEU addressed their distinct [business models] and [the level of control they exercise]. The following two cases are relevant:

- **Case Elite Taxi v. Uber Spain (UBER):** the CJEU ruled that Uber does NOT qualify solely as an ISS, but rather as a **[transport service provider]**. The main reason for this ruling was Uber's *significant involvement in the provision of transport services*, based on two arguments: a. **Decisive influence**: *Uber [exercises substantial control] over the drivers and the transport service they offer*. Uber, for example, sets the fares (decides the price) and handles payments through the platform, sets requirements drivers have to meet to become eligible, and operates rating systems that can exclude certain drivers from providing their service on the platform. b. **Market creation**: secondly, the court noted that *Uber's [platform was indispensable] for drivers to find and serve customers*. Without Uber's platform, drivers could not find customers and would not be able to provide their services. *So Uber goes beyond mediation and directly shapes and manages the service*. 
The CJEU ruled that the [main component] of Uber's service [was providing the transport, not the digital mediation between drivers and customers]. The physical transport service cannot be seen separately from the platform, so the physical transport is the main service Uber provides. According to the court, Uber does much more than merely matching riders and customers; it truly organizes the transport. Uber is classified by the CJEU as a **composite service:** *since offline transport is the main component of Uber's service, [it cannot be classified as an ISS and enjoy the benefits of the E-commerce directive], but must instead comply with the regulations concerning transport*. National member states can now impose requirements on Uber, such as licensing, and Uber must comply with stricter rules just like the traditional taxi industry.

- **Case *Airbnb Ireland* (AIRBNB):** in contrast to the Uber case, the CJEU decided that Airbnb keeps its status as an **[information society service (ISS)]**. *This was because Airbnb [mainly facilitates short-term rentals and primarily acts as an intermediary] without exercising a high level of control/influence over the services provided by its users*. The court considered the following: A. **Limited influence:** while Airbnb does recommend prices and standardized policies, it does NOT directly set prices; users determine their own price and availability, which Airbnb does not control. Hosts can independently manage their properties and decide their availability. B. **Intermediation role:** Airbnb primarily provides a platform that connects hosts with guests and facilitates transactions [without managing the actual provision of accommodation services]. This keeps it within the scope of the ISS (given its mainly intermediary role). C. 
**Transparency obligations:** Airbnb's classification as an ISS means it is subject to EU rules on transparency and consumer protection, but it is not required to meet the stricter regulations imposed on real estate brokers or hotels.

The implication of the case is that Airbnb is classified as an ISS under the E-commerce directive. As an intermediary, Airbnb can benefit from certain provisions in the E-commerce directive, such as the harmonized regulation and the reduced liability. It does not have to comply with the stricter rules set for the traditional hotel industry.

**Should there be major differences in the qualification of sharing economy services under EU law?** The question whether there should be major differences in the qualification of sharing-economy services under EU law is [important for the debate on how platforms like Uber and Airbnb should be regulated.] *The literature, together with the CJEU cases above, highlights that differences in the [level of control], the [nature of the service] and the [impact on traditional markets] can justify different regulatory approaches*. There are a couple of considerations that justify different regulatory approaches for different sharing-economy services:

a. **Nature of the service:**

- *Platforms that [facilitate purely digital transactions]*, such as matching supply and demand (e.g., Airbnb), *can generally be classified as **information society services*** **(ISS)**. These platforms serve as intermediaries, enabling peer-to-peer interactions without fundamentally altering or controlling the offline service.

- *Conversely, platforms like Uber, which deeply [integrate into and orchestrate the offline component of the service] (transport), go beyond intermediation*. The CJEU classified Uber as a **composite service**, where the primary service is transport, subjecting it to sector-specific regulations.

b. **Level of platform control:** a. 
Uber sets fares, determines driver eligibility, and influences work conditions, making it functionally similar to a transport operator. b. [Airbnb], while offering tools like pricing algorithms and standardized policies, *[leaves key decisions (pricing, availability, guest management) to hosts]*, retaining its intermediary status. *Platforms with significant control should face stricter regulations to ensure consumer protection and fair competition with traditional service providers*.

c. **Economic and social impact:** platforms like Uber and Airbnb disrupt traditional industries like the taxi and hotel industry, and can often bypass strict regulations and operational rules. *The extent of their integration into these sectors can influence their qualification:*

- Uber's model creates direct competition with taxis and requires regulatory parity to avoid unfair advantages.

- Airbnb's impact on housing markets, while significant, is less direct, allowing it to remain classified as an ISS.

SO: there should be differences in the qualification of different sharing-economy services under EU law when justified by the nature of their business models, their level of control and their impact on the sector. This nuanced approach is **in line with the goal of the E-commerce directive**: *foster innovation while ensuring fair competition and consumer safety*. Platforms like Airbnb that mainly act as intermediaries retain their status as ISS and get more room, while platforms that exercise more control over the services, like Uber, should comply with sector-specific regulations. Applying the same regulations to all sharing-economy services would sometimes stifle innovation unnecessarily. Platforms that operate on different models should not be subjected to the same strict regulations; this asks for more tailored approaches. 
**To what extent should differences in business models count?** The extent to which differences in business models should count under EU law is rooted in the core principles [of fairness, proportionality, and innovation]. As the literature and rulings demonstrate, the business model influences the degree of control a platform has over the service provided, its impact on markets, and the regulatory frameworks that apply.

1. **Decisive Influence and Control:** *Platforms that exert significant control over key aspects of the service, such as pricing, quality, and operations, effectively function as service providers rather than intermediaries.* The more a platform integrates into the service, the stronger the case for subjecting it to sector-specific regulations.

2. **Economic Independence:** Platforms like Airbnb enable hosts to operate independently, maintaining economic autonomy. Hosts can choose whether, when, and how to offer their services. *Conversely, Uber drivers are heavily reliant on the platform to generate business, and their operations are tightly regulated by Uber*. This dependency indicates a higher degree of integration and control.

3. **Market Creation vs. Facilitation:** Platforms like Uber create and sustain the supply chain for their services by recruiting and coordinating drivers, effectively shaping the market. Platforms like Airbnb primarily facilitate transactions within an already existing market, such as short-term rentals, without directly creating or managing the supply chain. The level of market creation influences the platform's responsibility and regulatory obligations.

Differences in business models should therefore significantly influence the regulatory treatment of sharing-economy platforms. Platforms that control essential service components and integrate deeply into operations, like Uber, should face stricter sector-specific regulations. 
Conversely, platforms that act primarily as facilitators, like Airbnb, can benefit from the flexibility of being classified as intermediaries under the E-Commerce Directive. This approach ensures fairness, promotes innovation, and protects consumers and markets.

**Week 6: Cryptography and the decentralized internet**
=======================================================

This week explores the intersection of cryptography, blockchain and the decentralized internet within the legal and regulatory framework. *It gives a better understanding of how cryptography and blockchain work and highlights the social and legal implications of a decentralized internet*.

**Cryptography and Crypto communities**

**What is Cryptography?** Cryptography is the science of *securing communication and data through [encryption], ensuring confidentiality*. It uses algorithms to encrypt messages with a key known only to the sender and the recipient. Its main goal is to guarantee the safe passage of information, shielded from unauthorized access. So cryptography ensures confidentiality, and also integrity, because the data remains unaltered in storage. Initially cryptography was developed for governmental and military purposes. Later, through the advent of [public-key encryption] like RSA, cryptography became widely accessible to the public and is now commonly used in technologies such as online banking or messaging.

**What are crypto communities?** Crypto communities are *virtual collectives created around the use of cryptography for protecting identity, safe communication, currency transactions or value transfer*. These crypto communities have evolved in [three distinct stages]:

- First generation: **cypherpunks**: early proponents (voorstanders) of cryptography who stood for [*individual* freedom], emphasizing privacy and resistance to state surveillance. Cryptography was mainly used for communication services and did not yet involve cryptocurrencies. 
- Second generation: **dark web marketplaces:** cryptographic technology expanded, creating *[decentralized marketplaces]* such as [Silk Road], one of the first darknet markets, which allowed users to buy and sell products anonymously. Platforms like Silk Road provided their users with: a. Anonymity: users could offer services and trade on these platforms without revealing their identity. b. Cryptocurrencies: users could pay for transactions with currencies like Bitcoin or other coins, making their transactions very hard to trace.

- Third generation: **Decentralized applications (DApps)**: built on [blockchain technology], these communities expanded cryptographic applications to *decentralized finance, gaming, virtual assets and more*. The main benefit of the rise of decentralized applications is the escape from monopolies/central authorities that determine which information you get to see on the web. Blockchain is a method to secure and record information without the interference of a centralized entity like a bank or government. This generation separates itself from earlier ones by using blockchain and eliminating the need for intermediaries. Using blockchain technology, it creates direct digital interaction between users, who can make transactions in all kinds of values such as 'land' in digital worlds (Decentraland) or art in the form of Non-Fungible Tokens (NFTs).

So each generation of crypto communities tries to create an [alternative system] that challenges conventional hierarchies and promotes [individual agency (keuzevrijheid)].

**Cryptography on the dark web:** *how has cryptography been used on dark web marketplaces?* Cryptography is foundational for dark web marketplaces because it ensures the anonymity of users, secure communication and hard-to-trace transactions in environments designed to evade traditional surveillance and regulation. 
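This kind of anonymity can be illustrated with a toy sketch of layered encryption, in the style of the onion routing used by networks like Tor: the sender wraps the message in one encryption layer per relay, and each relay can strip only its own layer, so no single relay sees both the source and the content. XOR with a repeating key stands in for a real cipher here; it is deliberately insecure and only shows the layering idea.

```python
# Toy illustration of layered ("onion") encryption. NOT real cryptography:
# XOR with a short repeating key stands in for an actual cipher.

def xor(data: bytes, key: bytes) -> bytes:
    """Apply a repeating-key XOR (same function encrypts and decrypts)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# One symmetric key per relay on the (hypothetical) path.
relay_keys = [b"key-entry", b"key-middle", b"key-exit"]

# The sender wraps the message in layers, innermost (exit) layer first.
message = b"meet at the usual place"
wrapped = message
for key in reversed(relay_keys):
    wrapped = xor(wrapped, key)

# Each relay in order peels off exactly one layer.
for key in relay_keys:
    wrapped = xor(wrapped, key)

print(wrapped == message)  # True: all layers removed at the exit
```

Each intermediate relay only ever handles ciphertext plus the address of the next hop, which is the property that lets users of such networks hide who is talking to whom.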
These cryptographic techniques shield users from surveillance and law enforcement. Here's how it works:

- **Anonymity via the Tor network:** dark web marketplaces operate via networks like [Tor (The Onion Router)], which uses cryptographic techniques to ensure anonymity. Traffic on the network is routed through different nodes and encrypted in several layers. Because each node can only see the next destination, and not the content or the original source, users can hide their identity and location without being tracked.

- **Secure transactions via cryptocurrencies:** cryptocurrencies like Bitcoin or Monero are essential for dark web marketplaces. Early currencies like Bitcoin were popular because of their [pseudonymity]. Although Bitcoin transactions are recorded on a public ledger, users could use certain techniques to combine transactions (mixing), making it harder to trace their identities.

By using cryptography to secure information and confidentiality, users on dark web marketplaces can transact with each other without surveillance or unauthorized access by other parties. Cryptography creates anonymity and hard-to-trace transactions, making illegal transactions easier.

**Blockchain and its academic pedigree (stamboom)**

**Blockchain** is *a [transparent and decentralized] system that acts as a [distributed ledger] to securely record transactions*. Blockchain works as an online ledger that keeps a record of every transaction in a way that everyone in the network can trust. It eliminates the need for intermediaries (a central authority) by relying on cryptographic algorithms and storing data across a widespread network of participants. Each transaction is grouped into a block, which is then cryptographically linked to the previous block, creating a (block) chain. Once a block is added to the chain, it can no longer be changed, ensuring the recorded information stays untampered. Here's how it works in simple terms: 1. 
When someone wants to make a transaction, the details (like who is sending money to whom and how much) are bundled into a block. 2. This block is sent to a network of computers, called nodes, which work together to check whether the transaction is valid (e.g., making sure the sender has enough money). 3. To approve the block, the nodes solve a complex puzzle or follow a specific rule (this process is called consensus). Once agreed upon, the block is added to the chain. 4. Every computer in the network keeps a copy of the entire blockchain, making it very hard to hack or alter.

Blockchain doesn't rely on a central authority, like a bank, but instead uses cryptography and these networked computers to ensure trust and security. This is what makes blockchain transparent, decentralized, and nearly impossible to tamper with. The **Bitcoin blockchain** was the first and best-known application of this technology. Its academic pedigree draws on multiple disciplinary influences such as cryptography, distributed computing, economics and decentralized consensus. It was introduced by Satoshi Nakamoto in a 2008 whitepaper. Nakamoto's Bitcoin blockchain offered a *decentralized mechanism for secure digital-currency transactions* and promoted a [peer-to-peer] electronic cash system. The idea was to use innovations in cryptography to create a decentralized currency free from central authority. The academic pedigree (stamboom) of the Bitcoin blockchain reflects decades of research and innovation in cryptography and distributed systems, making it a pioneering breakthrough that *laid the foundation for numerous applications beyond digital currencies*, including smart contracts, decentralized finance, and supply chain management. 
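The hash-linking described above can be sketched in a few lines: each block stores the hash of the previous block, so altering any earlier transaction invalidates every later link. This toy omits everything a real blockchain adds (consensus, proof-of-work, signatures, a peer-to-peer network), and the field names are illustrative.

```python
# Minimal sketch of a hash-linked chain of blocks. Tampering with any
# earlier block breaks the link to every block after it.

import hashlib
import json

def block_hash(block: dict) -> str:
    """Deterministic SHA-256 hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, transactions: list) -> None:
    """Append a block that commits to the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64  # genesis sentinel
    chain.append({"prev_hash": prev, "transactions": transactions})

def chain_is_valid(chain: list) -> bool:
    """Every block's stored prev_hash must match its predecessor's real hash."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
add_block(chain, [{"from": "alice", "to": "bob", "amount": 5}])
add_block(chain, [{"from": "bob", "to": "carol", "amount": 2}])
print(chain_is_valid(chain))   # True

chain[0]["transactions"][0]["amount"] = 500  # tamper with history
print(chain_is_valid(chain))   # False: block 1 no longer links to block 0
```

Because every participant holds a copy of the chain and can recompute these hashes, rewriting history would require changing every subsequent block on a majority of the network at once, which is what makes the ledger practically tamper-proof.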
**Selling LAND in Decentraland (Catalina Gooanta)** Gooanta introduces the concept of '***the Internet of Value'** (IoV)*: the concept of digital currencies as an alternative to national currencies gradually turned into a more general expression of value, the Internet of Value. It entails the digitalization of assets, such as digital intellectual property, equities and wealth, and their secure manner of transfer. Not only cryptocurrencies are traded online but also other kinds of informational value. This also includes interpretations of value beyond the exchange of money and currency, such as blockchain-based digital ecosystems like Ethereum's (Decentraland).

**Decentraland and its compliance with the Digital Content Directive:** Decentraland *is a blockchain-based 'virtual world' where users can buy, sell and develop virtual real estate called 'LAND', represented by Non-Fungible Tokens (NFTs)*. It is a virtual world built on the Ethereum blockchain. In Decentraland, 'LAND' is an NFT that can be traded between users to obtain digital space (land). Users on the platform can buy LAND using MANA, the cryptocurrency used in this virtual world. Decentraland functions as a [decentralized ecosystem], where users can obtain virtual property without interference of a central authority. So Decentraland presents itself as a virtual world claiming to engage in and facilitate the sale of virtual property. Despite its decentralized claims, the platform's governance involves entities like Metaverse Holdings Ltd., which provide tools such as the Decentraland Client, SDK, and Marketplace to facilitate interactions. Metaverse Holdings, for example, holds the intellectual property on the architecture of the platform.

**Decentraland's compliance with the Digital Content Directive (DCD):** The DCD regulates *contracts for the supply of digital content and services within the EU*, [aiming to protect consumer rights]. 
'LAND' in Decentraland falls within the scope of 'digital content' under the directive, so we can check whether the platform complies with the requirements laid down in the DCD. LAND parcels are digital assets that exist only in the virtual world, with ownership maintained on the Ethereum network. READ FOR THIS PART 'SELLING LAND' BY CATALINA GOOANTA W6 T1 RBT

### **Regulation of Decentralized Digital Services**

**How should decentralized digital services be regulated?** Regulation of decentralized digital services requires a balance between innovation and consumer protection:

1. **Transparency and Accountability:** Clear terms outlining rights and responsibilities for both providers and users are essential.

2. **Data and Asset Ownership:** The distinction between owning digital assets (e.g., LAND in Decentraland) and the associated intellectual property rights should be clarified and standardized.

3. **Dispute Resolution Mechanisms:** Accessible, fair, and enforceable mechanisms are needed to address consumer grievances.

4. **Compliance with Local Laws:** Platforms must adapt to regional legal frameworks like the DCD to ensure alignment with consumer rights and contractual fairness.

LAST TWO QUESTIONS NOT FULLY FINISHED, REVIEW THEM AGAIN YOURSELF.

**Week 7: internet and platform liability: the Digital Services Act**
=====================================================================

The EU Digital Services Act: Towards a more responsible internet (Andrej Savin)
===============================================================================

**Introduction: Why is reform necessary?** The rapid growth of the internet from a niche technology to a commonly used tool has *exposed flaws in the early regulations*. Early internet regulation, such as the [**E-commerce directive (ECD)**] adopted in 2000, combined minimal rules with liability exemptions for intermediaries, which [promoted *fast growth of the internet*]: the digital economy grew exponentially. 
However, today's more developed and dominant digital economy, driven by data and algorithms, presents new challenges for regulators. Platforms often prioritize user engagement over content integrity, amplifying illegal and harmful content. Therefore the **Digital Services Act** was created, *which builds on the foundation of the ECD but tries to address its shortcomings in the light of more modern digital challenges*. It redefines responsibilities in a **data-driven economy**.

**Structure and objectives of the DSA** The DSA is part of a broader European market strategy to update digital regulations, complementing the [Digital Markets Act (DMA)]. The DSA is the result of an attempt to redefine platform regulation: it *defines the responsibilities and obligations of online service providers*. [Two ideas] determined the need for a revised framework:

1. First, **economically more powerful actors (gatekeepers) should bear a higher burden of responsibility**. The DSA is an opportunity to induce platforms to engage in more moderation in order to "earn" their protected status.

2. Second, *there is a need to move away from the non-interventionist regulatory model, characterized by low levels of regulation and high tolerance, **to a more [proactive model], possibly distinguishing smaller from more prominent platforms*** and requiring more from dominant market players likely to engage in harmful practices.

The DSA preserves the core principles of the E-Commerce Directive (ECD) but introduces significant changes to address modern digital challenges. These include enhanced due diligence (zorgvuldigheid), increased transparency, and stricter obligations for very large platforms.

**Structure of the DSA:**

- [Chapter I] contains rules on the subject matter, scope and definitions.

- [Chapter II] contains rules on the exemption from liability for intermediaries. 
- [Chapter III] sets out due diligence obligations for a transparent and safe online environment.

**The new regime of intermediary liability** The liability exemptions laid down in [art. 12-15 ECD] are kept, and platforms can still benefit from liability exemptions under certain circumstances, *but under the DSA these platforms must now also comply with stricter rules on due diligence, transparency and responsibility*. So there is a **double regime**: the traditional set of liability exemptions together with extra responsibilities for platforms.

##### **1. Intermediary Liability**

The DSA builds on the ECD's framework, maintaining protections for intermediaries while adding new requirements:

- **Conditional Liability Exemptions**: [Articles 3-5] replicate the existing protections for "mere conduit," caching, and hosting services, as settled in art. 12-15 ECD. Platforms remain exempt unless they actively manage illegal content or fail to act on it when notified.

- **Good Samaritan Protection**: [Article 6] protects platforms that voluntarily address illegal content, incentivizing proactive moderation without fear of liability.

- **Procedural Requirements**: Articles 8-9 set clear rules for platforms on handling orders against illegal content and information requests from authorities, ensuring consistency across member states.

##### **2. General Due Diligence Obligations**

[Chapter III introduces] baseline requirements for all digital service providers, based on the idea that platforms should act more responsibly: with more diligence and transparency, and, in the case of very large platforms, by managing systemic risks through **proactive measures**.

- **Transparency**: Platforms must publish annual reports detailing their content moderation efforts and provide clear terms and conditions outlining moderation policies. 
- **Contact Points and Representatives**: Platforms must designate contact points for authorities and, if based outside the EU, legal representatives within it.

- **Notice-and-Action Mechanisms**: Platforms must create accessible processes for users to report illegal content, ensuring requests are clear, justified, and actionable.

**3. Rules for online platforms:** Detailed obligations for online platforms are dealt with in [section 3], starting at art. 19 DSA. The section starts with a *de minimis rule*: it does not apply to small enterprises. Additional obligations apply to online platforms that connect users with goods, services, or content:

- **Consumer Protection**: Platforms must verify the identity and legality of traders using their services, ensuring transparency and safety for users.

- **Trusted Flaggers**: Entities with expertise in identifying illegal content receive priority in flagging processes, streamlining content moderation.

- **Complaint Mechanisms**: Platforms must establish internal and external complaint systems, including human oversight of content removal decisions. Disputes can be escalated to certified out-of-court resolution bodies.

**4. Obligations for very large platforms:** In the EU there is the idea *that very large platforms ought to be regulated differently*. The DSA's section on very large platforms focuses primarily on **[risk mitigation in content moderation situations]**. While this may be competition-related, it is not primarily a competition matter (and to the extent that it is, it can also be dealt with through the DMA). **Very large platforms** are defined as platforms whose services reach an average of [45 million or more monthly recipients]. A platform does not face these extra responsibility requirements until a Digital Services Coordinator has designated it as such. 
- **Risk Assessment**: Platforms must identify and address systemic risks, such as the spread of illegal content, AI manipulation, and harm to fundamental rights like privacy and freedom of expression.

- **Mitigation Measures**: Examples include enhanced moderation, transparent advertising practices, and cooperation with trusted flaggers.

- **Audits and Transparency**: [article 28]. Platforms must undergo annual independent audits and disclose information about algorithms, targeting parameters, and systemic risks.

- **Crisis Protocols and Codes of Conduct**: [art. 35-37]. Platforms are encouraged to develop self-regulatory practices for regular operations and emergencies, such as public health crises or threats to public safety.

**Implementation, coordination, sanctions and enforcement** [Chapter IV] introduces enforcement measures for the new regime, divided into sections on measures by national authorities, the European board, supervisory measures, etc. The DSA introduces a robust **enforcement framework:**

- **Digital Services Coordinator (DSC)**: *Member states must designate national authorities to oversee compliance and take measures*; they are responsible for enforcement. DSCs have extensive powers, including on-site inspections and issuing fines (laid down in art. 41 DSA). The member state in which the main establishment of the provider of information society services is located has jurisdiction.

- **European Board for Digital Services (EBDS)**: This advisory body ensures consistent enforcement across the EU.

- **Sanctions**: Non-compliance can lead to fines of up to 6% of annual turnover, and persistent violations may result in service suspensions.

- **Enhanced Supervision for Very Large Platforms**: These platforms face additional scrutiny, with the [European Commission] empowered
