Abstract
Societal speech activities are increasingly conducted online, and free speech concerns therefore focus on the digital sphere. Major online service platforms operate speech-moderation practices that constrain digital speech. These platforms are run by a few multinational corporations—the so-called online giants. Online giants, in fact, control the backbone of democracies on a global scale. This article proposes a legal path to appropriately regulating digital speech while preserving a free and thriving global digital culture. The argument introduced here is that the online giants should, to a certain extent, emulate global organizations and, since they control an essential public utility, should operate under basic administrative law norms, including accountability, transparency, giving reasons for decisions, and objective review. To this end, the article examines the global administrative law movement and proposes to extend its overarching conceptualization of global public procedural principles facilitating “good governance” to the online giants’ procedures, despite their private ownership.
1. Introduction
Today, almost every aspect of societal life is conducted online. This environment includes one of the most fundamental elements of democratic life: the sphere in which everyone may actively or passively create and consume information. Free speech concerns are therefore increasingly focused on the online digital sphere. These concerns involve a variety of phenomena, reflecting different social and legal problems, all of which raise the question of speech and content moderation. Speech moderation refers to various practices involving the monitoring, removal, or blocking of speech or content.1 Speech moderation may be exercised in a wide range of situations: from concerns about harm to public values and human rights to potential harm to economic and personal rights. Although speech moderation is aimed at protecting other legitimate rights, grounded in public or private law, it nevertheless challenges the very notion of freedom of speech, since it involves silencing speech.
While the inception of the Internet was celebrated as the achievement of a utopian, ideal free speech environment,2 after almost three decades it seems clear that some regulation is inevitable. This regulation should be crafted carefully to maintain the thriving free speech culture online while, at the same time, designing the governance of digital speech moderation. The justification and legitimacy of speech moderation practices cannot be derived solely from the appropriateness of their final goal; these practices should also be carried out through appropriate processes and procedures. Public trust in the online digital speech sphere may be promoted by procedural guarantees regulating the vast range of speech and content moderation practices.
One of the most significant obstacles to regulating speech moderation is the fact that the online digital sphere is run entirely by private commercial corporations. These corporations, headed by US technology giants such as Meta (Facebook) and Alphabet (Google), have grown to massive proportions over the last couple of decades, both in financial terms and in terms of control over the online digital sphere. These online giants de facto govern the backbone of democracies on a global scale. Against this backdrop, this article aims to identify a legal path to appropriately and adequately regulating speech moderation exercised by the online giants, while preserving the free and thriving culture of the digital sphere. The argument advanced is that the online giants, which operate on a global scale, should to a certain extent emulate other global or supranational organizations. In addition, since they control an essential public utility, service, or resource, they should operate under basic administrative law norms, including accountability, transparency, giving reasons for decisions, and objective review or oversight. To this end, the article examines the global administrative law (GAL) movement, which emerged two decades ago and conceptualized, on both descriptive and normative levels, the administrative law principles facilitating “good governance” applied by various supranational organizations.3
The GAL movement originated in the growing global governance exercised by supranational bodies that have adopted semi-procedural public law standards emulating administrative law principles. By focusing on procedural justice and turning the spotlight onto the process of decision-making, these bodies sought to strengthen their legitimacy and enhance public trust in their governance. GAL, in that respect, serves as both a descriptive project and a normative stance, encouraging further and deeper compliance with public administration standards. This article proposes to extend GAL to the speech moderation practices conducted by the online giants.
GAL’s extension to online giants, which are for-profit corporations, reflects a major conceptual leap. The dividing line between public law and private law is increasingly blurred, and legal measures imposing public law norms in the private sphere are multiplying worldwide. GAL may serve as a gateway for such a move, since online giants govern a globalized digital sphere and should therefore be regulated by globalized legal means. Moreover, extending GAL to online giants may contribute importantly to the principled debate over the justification for regulating the digital sphere. Online speech moderation may be accused of silencing opinions and thus of constituting undemocratic censorship; at the same time, avoiding speech moderation may be seen as facilitating hate speech and other activities harmful to democratic values. Speech moderation as such, therefore, is neither inherently democratic nor antidemocratic. It is a procedure that should be carried out in a manner that promotes the protection of human rights and democratic values. The focus should accordingly shift to procedural justice measures, namely designing a deliberative global speech moderation method as a process subject to basic public law norms of accountability. Procedural justice may generate adequate guarantees that the digital speech sphere, though governed by private corporations, will remain free and supportive of democratic values. The position articulated here is that the time is ripe to take GAL one step further and extend it to the procedures adopted by online giants.
Recent legislative initiatives, aimed at imposing obligations on online giants with regard to content moderation practices, reflect an attempt to craft a digital governance regime. The GAL legacy may significantly contribute to these various initiatives by providing a legal framework to facilitate the imposition of public law norms on the online giants. GAL can equip policymakers with a supportive legal concept that may strengthen the justification for implementing full-fledged administrative law procedures, guaranteeing the protection of fundamental rights in the online sphere. Legislatures around the globe realize that there is a pressing need to adopt a digital governance agenda, and the choice of its legal framing has a significant effect on its application. GAL’s underlying rationale may facilitate the development of a structured and coherent digital governance regime, embedded in the broader context of public law, as opposed to an anecdotal, ad hoc response to specific hurdles.
This article will proceed as follows. Section 2 will briefly describe the centrality of online giants in the global democratic sphere, including their practices of content moderation and their potential conflict with fundamental rights and free speech. Section 3 will delve into the origins and major conceptualizations of the GAL movement and will explore the various supranational organizations that adopt GAL principles. These descriptive parts will be followed by Section 4, which will focus on the proposal to impose procedural public law standards on online giants. First, we discuss the potential path of applying such standards through semi-voluntary measures. Next, we present the proposal to extend GAL principles to online giants on a mandatory basis. Finally, we briefly address a third path of using competition laws. The article closes with a review of three legislative initiatives from the United States, the United Kingdom, and the European Union, which share the rationale of promoting accountable decision-making processes by online giants; however, as argued, these initiatives lack the upfront normative acknowledgment that digital governance should enter the realm of global public law.
2. Online giants and the global digital democratic sphere
2.1. Digital free speech
A growing portion of our societal life is conducted in the online digital sphere. While online, we converse, exchange information, express our thoughts and ideas, meet, shop, trade—and the list goes on. The digital sphere has therefore been compared to the traditional “market square,”4 as the current public forum in which major human interactions and communications take place. In fact, the digital sphere has grown into a global democratic culture.5 Various social media platforms, such as Facebook and YouTube, are major driving forces behind this culture. Online platforms serve essential public needs, offering a space for individuals to access information, express themselves, and thereby enjoy access to, and engage with, social and cultural life, all on a global scale.6 In line with this emerging reality, the US Supreme Court held in 2017, in Packingham v. North Carolina, that the right to freedom of speech encompasses access to Facebook and other online social media platforms.7 Yet, the current phase of digital free speech comes with the traditional legal problems concerning both the protection of free speech as a fundamental human right and its counterbalance with conflicting interests and rights.8 A growing body of scholarship describes the phenomenon of global digital democratic culture and addresses various aspects of digital free speech, its limits, and its adequate constraints.9
This article tackles a specific issue, which presents one of the most troubling obstacles to democracy: although the online digital sphere has become the backbone of public civil democratic society,10 it is nevertheless entirely operated by privately owned commercial corporations.11 Moreover, the market for online platforms in which social commerce and speech services are conducted is currently centralized, held by a small handful of gigantic corporations operating on a global scale. Thus, the global democratic market square is run by a few private online giants.12 The global reach and dominance of the online giants presents a new global challenge.13 The most urgent question is therefore whether the conduct of these online giants should be regulated and, if so, how. The various legal frameworks for tackling this new reality may be applied concurrently to promote a comprehensive solution, and indeed preliminary moves in this direction are being initiated, as described in Section 5. Yet, since legal regulation implicates freedom of speech, the threshold question is whether the online giants’ activities should be subject to some basic procedural and substantive public law principles.14
Thus far, the liability of the various online platforms for content disseminated on their “premises” has been regulated at the national level. In the United States, for example, the most significant legal framework on the matter is section 230 of the Communications Decency Act (CDA), which provides online platforms with broad immunity from liability for user-generated content.15 In the European Union, the e-Commerce Directive, adopted in 2000, establishes somewhat similar immunity for intermediaries on various legal grounds.16 The underlying rationale for these immunities is to encourage online platforms to voluntarily take an active role in removing offensive content,17 and to avoid the free speech problems of private censorship.18 Yet this legal framework gives online platforms vast discretion and provides no concrete guidelines. Voluntary speech moderation therefore remains, to a large extent, a “black box” of private governance.19 The challenges stemming from speech moderation practices in the digital democratic sphere are addressed further below.
2.2. Content moderation and human rights
Digital speech—usually referred to as “content”—moderation practices vary in many respects: the type of content moderated; the reason and motivation for exercising the practice; the manner in which the moderation is carried out; and its results. The content moderated may include text, images, and videos communicated online via various services. Content moderation may be carried out on either a voluntary or a nonvoluntary basis. Private, voluntary regulation of content moderation has proliferated in recent years. Various social and legal developments, triggered by both private and public stakeholders, have led the online giants to adopt speech policies and enforcement measures.20 These speech moderation practices were crafted gradually, occasionally on a case-by-case basis, in response to a business need, a social outcry, or nudging by public authorities. Various interests, such as business credibility and social legitimacy within consumer communities, incentivize the development of content moderation practices.21 Accordingly, the content moderated may comprise a wide array of undesirable speech, such as hate speech and speech encouraging violence;22 misleading or false information (known as “fake news”);23 or content that allegedly infringes copyright or amounts to other tortious conduct, such as defamation.24 Content moderation may also pertain to criminal activities, such as child abuse.25 Each type of content may merit different measures.
Nonvoluntary content moderation, in contrast, may be generated by coercive legislation or by a court order. A prominent example of mandatory content moderation is the 2017 German Network Enforcement Act (NetzDG), which imposes certain obligations on major social networks, such as the duty to remove hate speech and to provide a transparent decision-making process for its removal.26 Another example is the 2019 EU Digital Single Market Directive, which imposes content moderation obligations concerning copyright infringements.27 In addition, courts may order content to be removed or blocked on grounds of illegality.28
Speech moderation practices comprise a range of methods that include, for instance, monitoring and detecting, speech tagging, and speech removal or blocking.29 Tagging is aimed at elevating users’ awareness of the problematic aspects of the content, while removal is aimed at protecting users or other stakeholders from the harmful consequences of the content’s dissemination. Removal may be accompanied by stay-down measures aimed at preventing the re-upload of the content that was taken down.30 All of these practices may be executed either by individuals who check the content manually or through the use of computational technologies (including artificial intelligence). The latter is usually referred to as “algorithmic moderation,” which implicates challenges particular to non-human governance regimes.31
Whichever method is employed, it creates a privately made global speech governance regime that directly conflicts with freedom of speech and other fundamental human rights.32 This is the case even when content moderation is required by coercive regulation, since enforcement is conducted by private corporations pursuant to their own practices.33 The main concern is that private entities may moderate content too heavily, silencing legitimate speech and amplifying the chilling effect on free digital speech.34 The online giants tend to over-moderate speech based on legal risk assessments, lacking both counterincentives and any obligation to keep disputed speech on their platforms. In other words, the online giants have no incentive to invest time and effort in a profound legal assessment before removing content. These asymmetric incentives cause a well-documented tendency to opt for a speech-silencing default, and the outcome is massive and uncontrolled removal of content.35 Moreover, the fear is that digital speech may be moderated on a capricious or discriminatory basis, raising not only free speech concerns but also concerns for other values underlying fundamental rights.36 These outcomes are amplified by automated moderation systems, since the service providers’ asymmetric incentives are incorporated into the design of the algorithm through the setting of defaults. Thus, in the current phase of online governance, the silencing mechanism is an algorithmic one.37
The UN Commission on Human Rights established the mandate of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, a mandate that has been renewed periodically.38 The Special Rapporteur has published a series of reports addressing freedom of speech in the digital environment. An early report, published in 2011, gave special attention to safeguards of freedom of expression on social media platforms. This report reflected a comprehensive acknowledgment both of the important role online platforms serve in contemporary social structure and of the need to set rules and legal boundaries for the content moderation conducted by these bodies.39 Online platforms, it was further stressed, should serve as gatekeepers in protecting human rights.40 The report articulated concrete obligations stemming from the duty of online platforms to safeguard human rights in the context of content monitoring: content monitoring should be transparent both to the relevant individuals and to the public; it should apply proportionate measures, such as providing a forewarning whenever possible; and restrictions should be narrowly tailored to the content involved.41 These statements marked the first step in an institutional–public discourse regarding the need to regulate content moderation practices. Another report, issued in 2018, was based on a global survey and sought to sketch an empirical picture of voluntary and nonvoluntary content moderation practices.
The overall finding was that, on a global scale, the private sector does not adequately protect freedom of speech.42 The 2018 report emphasized in particular that content moderation is often operated by algorithmic decision-making processes, which are unaccountable for results affecting individuals’ human rights.43 The report thus concluded that there is a pressing need to make private companies subject to human rights obligations and to make compliance with basic human rights the default standard.44 Most recently, in 2019, the Special Rapporteur issued another report, focusing on “hate speech.” This report stressed that online hate speech moderation should be based on the public law principles of proportionality and necessity,45 and that states should promote a combination of features including transparency and enforcement by independent judicial authorities.46 The report concluded that, while states should “actively consider and deploy good governance measures” regarding content moderation practices,47 companies should “adopt content policies that tie their hate speech rules directly to international human rights law.”48
A prominent example illustrating the conflicts in the realm of global digital speech is the case of Facebook, now regarded as the largest online social media service in the world.49 The traffic of digital speech over the Facebook network is immense.50 Against this background, Facebook has not only voluntarily adopted its own policy concerning speech moderation,51 and more recently a general human rights policy;52 it has also established an oversight board, intended to be objective and independent, to review its decision-making process concerning speech moderation.53 The underlying motivation for establishing the oversight board was to promote public trust in Facebook’s conduct and foster its legitimacy.54 Yet, as this regime remains voluntary and private, it has been criticized for lacking the safeguards of an objective and independent oversight body,55 and for its inability to provide redress for all individuals challenging the company’s decisions. Further, the company has been subject to continual criticism regarding the inadequate notice and justification offered to users found to violate Facebook’s rules.56 At best, Facebook’s oversight board constitutes an improvement in the company’s content moderation policy.57 It is nevertheless a major leap for a private corporation to realize that its economic future and social legitimacy are intertwined with the adoption of measures emulating procedural administrative law principles, and this realization brings into the discussion the issue of supranational governance and the emerging global administrative law, described further below. As we argue, voluntary and eclectic measures do not provide solid ground for establishing a comprehensive and systematic digital governance agenda. Moreover, content moderation raises the complicated question of the private sector’s role in guaranteeing human rights.
Therefore, to accomplish the Special Rapporteur’s stated goal of directly tying content moderation practices to international human rights law, more profound legal developments are needed.
3. Global administrative law
3.1. The evolution of global administrative law
The new legal field of global administrative law emerged two decades ago, and it has been constantly and extensively developing ever since.58 GAL addresses the legal rules governing various supranational bodies that were established to monitor or coordinate global behavior. The development of GAL has attracted much legal attention.59 The inception of this legal movement was prompted by the observation that, paradoxically, although these supranational organizations were established to advance public goals, including human rights, they themselves were not operating in an accessible and transparent manner.60 Basic administrative law principles are headed by the general notion of “accountability,” which is translated into the sub-requirements of transparency, giving reasons for decisions, and objective oversight, whether by judicial review or by other mechanisms.61 Some supranational organizations even oblige domestic bodies to apply these basic administrative requirements while not necessarily adhering to the same standards themselves.62 The GAL discourse has therefore stressed that the notion of accountability, along with its sub-requirements, should be applied on a global scale and beyond the traditional, narrow scope of state-individual relations.63 The adoption of basic administrative law principles on a global scale thus aimed to strengthen the legitimacy of these supranational regulatory organizations.64 The notion of legitimacy may itself be broken down into the various values GAL generates for these organizations: adherence to democracy and the rule of law, which are fundamental principles in the supranational domain as well; promotion of social welfare; and promotion of a structured, systematic, and deliberative decision-making process.65 Another important value derived from the overarching goal of legitimization is building trust and fidelity to the purposes for which the decision-making power was allocated.66 Rigorous procedural justice measures, therefore, enhance the legitimacy of governance and the exercise of power, whether conducted by a state or by a supranational organization.67
The administrative requirements of transparency, giving reasons, and review reflect the most basic and essential elements of procedural justice guarantees.68 These requirements should be differentiated from substantive public law principles, such as proportionality and fair and equitable treatment, which are nonetheless gradually being introduced into the discourse of supranational organizations.69 However, when national administrative law principles are transplanted into the supranational sphere, they yield a different outcome, accommodated to the global context. In other words, the various traditional principles of administrative law are transformed within the GAL melting pot, and the incremental, decentralized development of the field has produced diverse administrative systems. Global administrative law is thus evolving into a differentiated branch of public law, and since it is still in the first stages of its implementation in the supranational arena, it is not yet unified.70 For instance, the principle of oversight or review of decisions is considered the bedrock of traditional administrative law.71 Yet there are many variations in the manner in which supranational organizations implement this principle; they differ with regard to elements such as the identity of the members of the reviewing tribunal or quasi-judicial body, the independence of these tribunals, the procedures of the review process (whether adversarial or not), and the nature of the decision, including its instructive or even educational character.72 The layered and disorderly development of GAL has generated diversity and fragmentation, which in turn create complexity in understanding and studying this newly emerging legal field.73 Moreover, the activities of supranational organizations have incrementally grown into a web of interactions among diverse organizations employing a wide range of regimes, which turns the entire field into a highly complex social and legal phenomenon.74
The scholarship pertaining to GAL is growing extensively, and new ramifications, new approaches, and fine-tuned observations are naturally emerging.75 This “aftermath” scholarship has included controversies over a wide range of issues,76 such as GAL’s legal grounding,77 its relation to substantive constitutional standards,78 and, most relevant to this article, the question of GAL’s coverage of non-governmental and private actors.79 This article aims to tackle an issue that reflects a major leap in GAL scholarship, namely “boundary crossing,” which GAL’s opponents have criticized.80 Nevertheless, it has been suggested that one of GAL’s important virtues is openness in seeking new legal frameworks, outside the classical dichotomies of national/international and public/private.81 Moreover, GAL encompasses multifaceted conceptions of various legal notions and terms, including the basic term “law” itself, which may include soft measures, self-regulation, and institutional conventions.82 Kingsbury, Krisch, and Stewart, in their seminal 2005 article encapsulating the GAL project, proposed to define GAL as comprising “the mechanisms, principles, practices, and supporting social understandings that promote or otherwise affect the accountability of global administrative bodies, in particular by ensuring they meet adequate standards of transparency, participation, reasoned decision, and legality, and by providing effective review of the rules and decisions they make.”83 The present article seeks to extend this trajectory by searching for a conduit through which the commercial companies that govern the digital sphere might comply with the basic elements of GAL, thus introducing solid underpinnings drawn from public law into the democratic digital environment.
3.2. Taxonomy of supranational bodies adopting administrative law principles
Over the last two decades, GAL’s reach has expanded, and various kinds of supranational bodies have applied administrative law principles. Examining and classifying the bodies currently complying with GAL standards may reveal the scope of GAL’s current spread and assist in evaluating the possibility of its further extension. Within this framework, it should particularly be asked whether there are private for-profit entities on this shortlist; if so, the manner in which these private for-profit bodies apply GAL principles should be analyzed further. After taking stock, the question of whether to move ahead and extend GAL to the online giants can be better addressed.
There are many international, supranational, or rather global,84 organizations, and their composition is still in its early stages of evolution.85 The various supranational bodies may initially be classified by their governmental or non-governmental nature.86 Governmental bodies may be further classified according to factors such as the source of their foundation: an international organization established by treaty (e.g. the United Nations) or by other intergovernmental agreements (e.g. the World Trade Organization), or a sub-organization of an already established international or intergovernmental organization (e.g. the World Health Organization, the World Intellectual Property Organization).87 These international organizations, linked in one way or another to states or governments, may be perceived as part of the state’s overarching function when they exercise any kind of “governance activity,” and they should therefore be subject to public law norms, at least to a certain extent.88 In this sense, domestic public law may be extended to cover state activities even when they are conducted through a state’s delegation to an extraterritorial arm.89 The legitimacy of “international public authority” activities, the argument continues, lies in their adherence to substantive and procedural public law standards.90 The enforcement of the norms produced by these bodies likewise usually depends upon the states.91 Yet these organizations, even if established by states, have become independent supranational bodies, and they therefore have complex and multifaceted relationships with their “founding” states or governments.92
Non-governmental international organizations, which do not exercise official governance activities or function as a state’s arm, deserve further classification. Turning the spotlight onto this group, a major classification would stress the divide between not-for-profit and for-profit bodies. The vast category of international non-governmental not-for-profit organizations (INGOs) includes a wide range of bodies, such as trade and professional associations, private regulatory bodies, and global funds aimed at promoting public goals on the international level.93 Common examples are the International Olympic Committee (IOC)94 and the International Organization for Standardization (ISO).95 These bodies may apply the common standards imposed on not-for-profit entities by domestic laws, and occasionally even adopt higher standards voluntarily.96 The domestic legal governance of not-for-profit entities is, to a certain extent, underpinned by public law values,97 and INGOs follow a similar path. Moreover, INGOs may adhere to public law standards, as stressed above, in order to foster their legitimacy. This not-for-profit “third” sector may be perceived as an intermediary social layer between the public and the private sectors, and it further reflects the transition of public services from the national state to the transnational arena.98 Yet, since there is no external binding source for imposing GAL standards on INGOs, and since INGOs are, at the end of the day, private initiatives, these bodies also conduct complex relations with the various players, whether in the global sphere or in the national field.99 Occasionally, however, states or governments are allied participants in INGOs, turning these bodies into hybrid public/private ones, which are then driven to comply with public law standards due to their robust public governance character.100
A particularly interesting example of an INGO that has come to resemble a full-fledged public agency, and therefore adhered to public law principles, is the Internet Corporation for Assigned Names and Numbers (ICANN). ICANN is involved in the allocation and approval of domain names and domain addresses worldwide. ICANN describes itself as a non-profit corporation based in the United States with global participation.101 Due to a minimal number of government representatives on an advisory committee, it may be classified as a hybrid INGO.102 Yet, in fact, it is a private entity, which controls the distribution of a major asset in the current digital sphere. In this sense, it functions as a supranational regulator controlling a global resource, which is an essential precondition for the function of the Internet.103 Its function, similar to that of a government agency controlling essential infrastructure, has driven ICANN to adopt GAL principles as an important pillar of its legitimacy.104
Another classification, which is of utmost interest for this research, is the category of non-governmental for-profit international organizations. This category mainly includes multinational corporations (MNCs). MNCs do not have a single definition; however, they are characterized as large for-profit corporations that produce or sell goods or services on a worldwide scale.105 In the last decade, the amplification of globalization has been accompanied by the proliferation of MNCs, operating all over the globe.106 Therefore, MNCs are also occasionally referred to as “stateless corporations,”107 a term that reflects concerns stemming from the lack of legal constraints upon the conduct of corporations without a particular state basis.108 The adherence of MNCs to human rights standards has long been a significant concern, and extensive academic discourse is devoted to this specific angle, whether under the auspices of “corporate responsibility”109 or under other schools of thought.110 As will be further discussed below, the most popular mechanism for generating compliance by MNCs with human rights as well as other public law standards, such as procedural ones, is through voluntary and semi-voluntary codes of conduct.
4. Extending public law principles to online giants
Considering the unique and central role of some of the online giants in democratic societal life, and the fact that today the major sphere for accomplishing the underlying values of freedom of speech is online, the question is whether some of the basic procedural public law measures should be applied by these online giants. The fact that the operators of the online sphere are private commercial entities creates the main obstacle to the imposition of public law principles, whether substantive or procedural. In this part, we address various potential paths for introducing public law principles into the content moderation practices operated by the online giants.
4.1. Semi-voluntary mechanisms for compliance with public law standards
The discourse concerning the role of MNCs in globalized societies often turns to “corporate social responsibility,” referring to various soft, non-coercive mechanisms that push for the adoption of higher standards of conduct by for-profit corporations. The underlying motivation for these measures is to promote corporate accountability towards both the shareholders111 and the public at large.112 Corporate social responsibility may relate to a wide array of themes, including environmental, labor, and financial ones.113 Following this trend, many MNCs have signed voluntary principles and codes of conduct embracing best practices in the field of human rights,114 such as those concerning child labor and fair labor conditions.115 Yet, it is important to emphasize that within social responsibility measures, MNCs do not undertake to meet GAL standards.
These legal mechanisms of voluntary compliance with higher standards are debated: while some praise their virtues as legal tools that achieve the best results possible in the absence of regulation,116 others criticize them as ineffective measures that are a fallback result of a failure to regulate the relevant behavior.117 The realm of voluntary mechanisms inducing compliance with higher standards has extensively evolved in the last two decades in a way that reflects the decline of the mandatory/voluntary dichotomy with respect to regulatory measures.118 An exemplary code of best practices, reflecting such a mixed mandatory/voluntary instrument, is found in corporate governance standards. Adequate corporate governance of publicly listed companies entails a set of core principles aimed at promoting accountability of the company to its investors. These core principles include transparency and disclosure of information, as well as many other managerial requirements.119
The mechanism recommended by the Organisation for Economic Co-operation and Development (OECD) for the corporate governance code of best practices, which is employed by many countries, is that companies must disclose the fact of their (non)compliance with a recommended code, and in case of noncompliance explain the reasons for their choice.120 This mechanism is known as “comply or explain.”121 Consequently, massive market pressure drives most companies to compliance.122 This mechanism could be conceptualized as semi-voluntary, since it incorporates a mandatory element of disclosure. This example of semi-voluntary adherence to higher standards of conduct demonstrates both its powerful impact on for-profit companies’ conduct and its ability to reach core business issues, beyond matters of traditional corporate social responsibility.123 Moreover, a major criticism raised against the various general codes concerning traditional social responsibility issues is that there is a wide range of codes from which companies may choose the one that best fits their needs, rendering the codes little more than self-serving tools.124 In contrast, the corporate governance “comply-or-explain” code presents a much firmer framework: it offers a clear and certain standard that reflects the adequate and effective threshold set by the regulator, and it leaves companies the freedom either to comply or to explain noncompliance. There is no “code shopping,” and the mechanism therefore aligns with higher standards of conduct.125 The virtues of the “comply-or-explain” mechanism are thus varied, from its relatively effective influence on corporate conduct to its potential flexibility, in the sense that the specificity of the code could be tailored by each country to its needs and legal tradition.126
This kind of instrument may be extended to introduce public law standards accommodated to the speech arena as well.127 Big technology corporations, being classified as MNCs, may be induced to comply with adequate standards through such semi-voluntary mechanisms. Along these lines, in 2019, the OECD stressed the important role of online platforms in the global digital sphere,128 and further published corporate social responsibility guidelines for online platforms addressing various potential conflicts with human rights, including the “right to free expression, non-discrimination, the right to information, and the safety and security of persons.”129 However, these guidelines do not include any concrete recommended measures, or any mandatory elements. Taking the conduct of online giants in the context of digital speech governance one step further requires domestic and international regulators to design precise, concrete, and effective codes of best practices concerning digital free speech “good governance,” pushing for a higher standard akin to an administrative law one.130 Online platforms should not be expected to develop meaningful and burdensome guiding principles by themselves.
4.2. Extending GAL to online giants on a mandatory basis
Another alternative path is to consider ways to extend GAL to the digital speech governance realm on a mandatory basis. This path gains significance in light of the various drawbacks of the semi-voluntary mechanisms, such as the lack of structured public law procedural standards, code shopping, and loose regulatory oversight concerning de facto compliance with the code’s provisions.131 If policy considerations aim to guarantee a very specific standard of conduct, emulating administrative law procedures, then why not make it a clear mandatory requirement?132
The argument introduced here is that the online giants effectively function as a public utility,133 and as such, they are appropriately subject to governance on a supranational level pursuant to GAL principles.134 The underlying rationale of the GAL project, including the overarching goal of facilitating legitimacy through procedural measures, fits well within the current reality of online giants’ control over digital speech. Incorporating basic elements of procedural justice to enhance trust, order, public welfare, and democracy may satisfy the requirements of “good governance” in the global digital sphere. GAL, therefore, may serve as a model for the potential legal order to govern some of the online giants’ activities implicating freedom of speech. This move does not envision the nationalization of private corporations; its impact is much narrower, as it aims to identify the specific activities that have a significant effect on free speech interests and should therefore be subject to basic procedural public law standards. In this sense, we do propose to take GAL one step further and extend its application to new global arenas, accommodating GAL’s application in view of contemporary societal developments and legal needs.135 As has been stressed, GAL is not only a descriptive project but also a prescriptive one.136 By the same token, GAL scholarship reflects an attempt not only to systematize existing global administration practices and name them as such, but also to further disseminate these principles in the global legal arena and encourage their adoption.137 In the following section, we explain the virtues of extending GAL to digital speech governance and address the justification for initiating such a move through mandatory measures.
a) The virtues of extending GAL to digital speech governance
The profound control of a few MNCs over the global digital speech sphere supports strong policy considerations favoring the mandatory imposition of basic public law standards. The governance of these private for-profit corporations over the backbone of democracies begs for a global, determined, and structured solution. GAL principles are an appropriate and adequate framework, both theoretically and practically, since they can provide procedural guarantees that speech moderation would be handled through basic procedural public law norms. The virtues of the GAL principles are varied. GAL provides the theoretical infrastructure, i.e. a conceptual framework, that may facilitate the imposition of genuine public law standards rather than ad hoc obligations on private corporations. As a result, extension of GAL principles over the digital speech environment may generate the needed societal legitimacy and trust in the global digital fora. Thus far, there have been only preliminary legislative initiatives aimed at establishing a digital speech governance regime, which we address below. These initiatives, which reflect different approaches, may yield significant advantages by explicitly leaning on GAL perceptions. A GAL framework may furnish an overarching legal conceptualization that would enhance the introduction of public law standards into the private corporate sphere, beyond mere sporadic duties lacking a legal context. GAL may provide the broader perspective concerning the reasons, justifications, and goals for regulating the online giants while encompassing the complexity of introducing public law measures into the private sphere. It is likely that digital speech governance will be an evolving project requiring legal interpretation and further refinement in view of various ramifications.138 A long-lasting digital speech governance regime would be better achieved by anchoring new obligations within a legal infrastructure that can support future legal developments. 
By identifying online giants’ regulation as a branch stemming from GAL, such regulation would benefit from a comprehensive “legal package” to support the introduction of true administrative procedural standards applicable to the relevant MNCs. Consequently, a GAL framework may promote effective development of this emerging legal field, guaranteeing its further growth along a systematic, delineated trajectory.139 This proposed path for generating “good governance” for global digital speech might follow GAL’s past patterns, characterized as incremental and adaptive processes of legal development. A further advantage of embedding the GAL legacy within digital governance regulations is that such a regime would acknowledge the growing power of online giants as global overarching rulers140 whose governance should be based on solid public law groundings.
The practical benefits of extending GAL to online giants lie in the provision of concrete guidelines regarding the scope and extent of the various essential procedures that should be adopted by any adequate digital speech governance regime. Administrative procedures comprise a complex web of rules and guidelines, and the development of digital speech governance can lean on these already existing practices. In other words, a GAL framework may assist in better incorporating public law procedures into the online giants’ day-to-day practices. This potential advantage can be exemplified by various procedural standards. To start with, the requirement of transparency, regarded as the bedrock of administrative procedural standards,141 can be implemented through a range of measures reflecting various disclosure obligations. Annual transparency reports containing aggregate data on the organizations’ operations represent the minimal threshold of transparency that is usually associated with minimal standards of corporate social responsibility. While such reports are important for shedding light on an organization’s policy and activities,142 they do not disclose sufficient information to serve as a basis for individual claims based on personal rights. Full-fledged administrative transparency relates to a very different level of disclosure, which requires public agencies to fully disclose to any affected individual the basis for their decisions.
Such requirements include, for example, an affirmative obligation to deliver information regarding a decision that has been taken that may affect an individual; disclosure of how the decision was processed and by whom; and information on who the individual may contact to inquire about the decision.143 Moreover, because many decisions taken by public agencies today are generated by computational systems, disclosure of the underlying computational system’s algorithm is required to enable affected individuals to understand and potentially dispute the decision-making process.144 There have been significant developments with regards to algorithmic transparency as part of administrative procedural standards.145 What is clear is that meaningful protection of individuals’ fundamental rights in the context of content moderation practices requires higher levels of transparency and disclosure.146 While greater transparency represents a major conceptual leap regarding procedural obligations of MNCs, adherence to a GAL framework may facilitate this advance because it would anchor the obligations within administrative law justifications on a global level, as opposed to mere ad-hoc regulation which is detached from a normative source.
Another basic element of administrative procedures relates to external, objective oversight over administrative decisions.147 Extending GAL to online giants necessitates the establishment of oversight tribunals, which should meet the objectivity and independence requirements. An oversight body that fails to provide for the necessary structural elements to guarantee public law procedural standards would not achieve an adequate digital-speech governance regime. Facebook’s voluntary oversight board has been subject to significant criticism in this context, as discussed above. Adhering to a GAL framework would make clear that, once accepted, online giants are subject to administrative-law-like standards because of their essential role in controlling the digital speech sphere, and that the procedures they implement must meet at least the basic thresholds of objectivity.
Extending a GAL framework, therefore, may facilitate a successful legal exportation of public law standards to the realm of private corporate governance in the emerging digital societal sphere. Two decades ago, scholars perceived that the concept of the “rule of law” was spreading from national to multinational agencies, and further to corporations, yet without a structured theoretical and practical framework.148 GAL may fill in this gap,149 at least by contributing a guiding legal framework for the design of a digital speech governance regime. After examining the potential virtues of extending GAL to online giants, the next question is how this legal move could be achieved.
b) Extending GAL to digital speech governance through coercive regulation
The proposed approach of extending GAL to online giants will certainly confront claims that public law principles cannot be imposed on purely private commercial entities,150 and in particular, that GAL principles have no intrinsic legal enforcement mechanisms.151 Compliance with GAL is usually generated either by a national anchor or else on a voluntary basis. Nevertheless, the following section will present some major arguments for allowing the extension of GAL to online giants. These points should be viewed as catalysts (rather than counterarguments) to encourage future development in the global digital sphere.
There are theoretical controversies in the GAL scholarship, as stressed above, with respect to the applicability of GAL to private bodies. However, it seems that there is some agreement with respect to the extension of GAL principles over private bodies if they exercise a public power.152 The taxonomy of the supranational bodies applying GAL as described above reveals that there is a spectrum of contingencies on the matter—from various kinds of governmental organizations compelled to meet public law standards to pure for-profit corporations that apply some public law standards on a voluntary/semi-voluntary basis. In between, we find international not-for-profit organizations, such as ICANN, which, despite being private entities, are bound by some public law standards. Policy considerations may justify the acknowledgement of another sub-classification of supranational bodies: for-profit corporations that operate on a global scale and control an essential public utility, service, or resource. This new category of MNCs should be reconceptualized as obligated to GAL principles on a firm regulatory basis. The proposed new sub-classification considers the relevant MNCs—although they are for-profit, akin to hybrid non-profit INGOs153—as obliged to follow basic GAL principles when engaging in activities that serve a major public function. This public capacity cannot be ignored and left to private market forces.
The question of global regulation over MNCs is not new.154 There are a few subject matters, such as the environment, in which states have set mandatory standards through multilateral treaty regimes.155 Though one may hope that a similar multilateral instrument will be developed for imposing adequate digital speech norms on online giants, it is not foreseeable in the near future.156 As stressed above, current “global regulation” perceived in its broadest meaning may be promoted by a variety of stakeholders—such as shareholders, inter-governmental bodies, civil society organizations, and other public organizations such as consumer groups—that collectively generate the necessary pressure.157 Nevertheless, states are still in the most effective position to impose norms on MNCs, as states have at their disposal a variety of measures encouraging territorial compliance, even of foreign corporations.158 States are also the effective and legitimate entities that can take steps to guarantee that public law standards are met.159 Moreover, domestic norms and standards imposed on corporations may have an extraterritorial impact and eventually generate an international benchmark.160 Therefore, it may be the case that the most effective and realistic vehicle for initiating the extension of GAL standards to online giants will be through coercive domestic measures, namely state legislation.
4.3. The competition law path
A third potential path for imposing public-law obligations on online giants is through competition laws that regulate monopolies. Yet, this path may serve mainly as a complementary measure alongside regulations governing content moderation and free speech.161 The origins of US competition law are rooted in the objective of limiting the power of dominant private corporations, which negatively affect the public good.162 Under section 2 of the Sherman Act,163 a monopoly formed through prohibited conduct has committed an offense subject to judicial remedies.164 These remedies may include forcing monopolies to be broken up or to be run subject to certain obligations, and massive penalties may be imposed.165 In view of the growing power of the online giants and their control over the digital environment, there is a significant societal movement that includes scholarly voices,166 civil organizations,167 and public representatives,168 calling to restrain the online giants through various competition laws. Such measures may include, for example, imposition of unbundling obligations that would limit online giants’ scope of activities,169 or interoperability mandates that require big tech companies to provide access to their systems to potential competitors.170 In line with such proposals, in December 2020, the European Union proposed a Digital Markets Act (DMA) aimed at tackling the problems stemming from the highly centralized digital services market, which includes, as part of the “package” of regulations discussed below, a new digital governance regime.171 The DMA was approved by the Council of the European Union in July 2022.172
This potential path, while important and effective, should not be viewed as a substitute for the proposed imposition of GAL standards on online giants, for several reasons. First, under current US antitrust law, the prevailing test for anti-competitive conduct is the “consumer welfare standard,” which emphasizes price over any other criterion.173 Accordingly, antitrust enforcement focuses on conduct that causes market harms, such as consolidations.174 Yet, the flaw addressed here concerns restrictions on free speech, or silencing practices, which may be caused by services given for free. Therefore, the problem is not monopolistic pricing or direct market restrictions, but rather injury to the democratic digital free-speech sphere. As long as antitrust laws are construed narrowly as concerning consumers’ economic welfare, this path is ill-equipped to tackle non-economic injuries inflicted by online giants’ control of digital speech.175 The same limitations apply to the EU DMA, which focuses on promoting competition among businesses, and not on promoting greater safeguards for end users’ digital human rights.176 Second, online silencing practices are not necessarily operated only by “giant” corporations in terms of market dominance, but may also be operated by smaller platforms, occasionally local ones, that still may affect digital freedom of speech. The fact that moderation practices are typically operated by online monopolies does not make the practices monopolistic per se.177 Third, on a pragmatic level, antitrust procedures may be long and complicated,178 and their final outcomes, such as criminal penalties, may be disproportionately severe.
Finally, the tension between online giants’ practices and individual digital human rights may be better addressed through an explicit legal framework based on an appropriately crafted standard that targets their problematic conduct, rather than through a path that merely treats the adverse consequences of the absence of clear regulation.
5. Current initiatives on both sides of the Atlantic
The problem of online speech moderation is one of the most troubling challenges democratic governments face. Civil society organizations were the first to identify the pressing need for measures to ensure accountability and transparency aimed at promoting digital free speech. A prominent example is the 2018 Santa Clara Principles, which articulated three core principles that should be implemented in content moderation practices: disclosure of information regarding content removal; providing advance notice to stakeholders; and providing a meaningful opportunity for appeal.179 More profound legislative initiatives from the United States, the United Kingdom, and the European Union have recently been presented, each one of them demonstrating a different approach. These initiatives, discussed briefly below, reflect a range of measures in an attempt to impose some aspects of accountability and transparency, either through a semi-voluntary measure (UK), a mandatory measure relying on deprivation of immunity (US), or a full regulatory measure that is mandatory but limited in the obligations it imposes (EU). Even the farthest-reaching initiative, that of the EU, stops short of taking the full-fledged path of imposing GAL principles on online giants, but each adopts some elements of procedural standards aimed at furthering accountability and transparency. As explained above, the absence of a true public-law framework may lead to deficiencies in the implementation of higher public-law standards, for example in cases requiring interpretation of the scope of the obligations imposed.
5.1. The US Platform Accountability and Consumer Transparency Act bill
The Platform Accountability and Consumer Transparency (PACT) Act bill,180 first introduced in June 2020 and reintroduced in March 2021 by Senators Brian Schatz and John Thune, reflects a partial move towards the imposition of certain limited duties resembling administrative law principles. The PACT Act bill opens by stressing the centrality of current online culture to societal life181 and by emphasizing the need to preserve this culture while promoting accountable and transparent measures for consumers’ engagement in online services.182 The actionable sections of the PACT Act bill include obligations to adopt a policy concerning content and speech over online platforms and to make this policy accessible,183 and to establish a complaint system of “notice and takedown” with regard to both illegal content and content that does not meet the policy standard, with various measures concerning accessibility, speed, giving a reason, and a right to appeal to an internal appellate body.184 Moreover, the PACT Act Bill proposes to oblige online platforms to publish quarterly transparency reports concerning these content moderation activities.185 Some exemptions are carved out of the proposed regulation, such as in the case of a small business.186 Finally, the PACT Act Bill proposes to amend section 230 of the CDA in a manner that deprives immunity in cases of knowledge of illegal content that was not removed, under specified conditions.187
The PACT Act Bill therefore reflects a move toward application of some minimal threshold of accountability and transparency in the process of speech moderation; however, by no means does it propose a straightforward application of administrative law standards. The PACT Act Bill in fact proposes to establish a very specific technical operation to adopt a policy and execute it within an accessible complaint system, motivated by a “stick” of liability in case the online operator fails to adhere to the mandatory procedures. The Bill does not address the substantive question of the policy itself and does not state that online operators must adhere to human rights per se. Moreover, the Bill does not explicitly subordinate the entire decision-making process to external objective oversight. In that respect, it does not represent an attempt to push digital governance into the public law sphere, yet it may be justified by a pragmatic approach, considering the generally rigid perception in US law regarding a strict divide between the private and public law realms.188 It should be further noted that the mechanism of a complaint system and the proposed exception to the Section 230 CDA immunity, which in fact reflect a grand “notice and takedown” regime, were criticized for being a double-edged sword that may eventually lead to more digital censorship and harm to free speech, because they may encourage online operators to carry out mass takedowns of content in order to avoid liability.189 A similar phenomenon was extensively described, discussed, and documented with regard to the notice-and-takedown regime set by §512 of the Copyright Act, which allows mass and easy removal of allegedly infringing copyrighted content and results in a significant chilling effect on freedom of speech.190
5.2. The UK “Online Harms White Paper” initiative
In February 2020, the UK government published an Online Harms White Paper (OHWP),191 and in December 2020, the government published its full Response to Online Harms White Paper consultation (R-OHWP).192 The OHWP opens by setting a vision for a “free, open and secure internet,” proposing to tackle various online harms in a “coherent, single regulatory framework.”193 The proposed solution is to impose a new “duty of care” on relevant online platforms,194 which aims to promote effective and proportionate measures, including technological ones, to accomplish the vision of the new law.195 It was clarified that only companies with direct control over the content and activity on a service, including search engines,196 will be subject to the proposed duty of care.197 The R-OHWP further identified various exemptions to the new proposed regime aimed at safeguarding freedom of speech. For example, it was clarified that content produced and published by news services on their own sites, including below-the-line comments of readers, is outside the scope of the duty.198 Moreover, in order to promote certainty, the R-OHWP refined the definition of the type of content that will be subject to the duty of care as limited to content that “gives rise to a reasonably foreseeable risk of a significant adverse physical or psychological impact on individuals.”199 The R-OHWP explicitly excluded additional subject matter content, such as harms resulting from breach of intellectual property rights, data protection regulation, consumer protection law, fraud, and hacking.200 Accordingly, the proposed regime will apply only to significantly high-risk content.
This narrower approach to harmful content is reflected, for example, in provisions stating that the duty of care would cover only “disinformation and misinformation that poses a reasonably foreseeable risk of significant harm to individuals (e.g., relating to public health).”201 The limitations in the scope of harmful content covered by the proposed duty of care are aimed at promoting greater certainty associated with business risk management in the private sector.202 Compliance with this new duty of care will be enforced by an independent regulator, the Office of Communications (Ofcom).203 The regulator will design a code of best practices that will translate the duty of care into clear measures to ensure that moderating content is transparent and effective.204
The OHWP further proposed that the companies will be subject to a nuanced “comply-or-explain” regime, in the sense that if they choose not to comply with the recommended code, they must explain what alternative measures they will take to ensure fulfillment of the set goals.205 The R-OHWP refined this aspect concerning the codes of best practices, explaining that they would be a voluntary measure;206 and to avoid confusion and over-removal of content by risk-averse companies, the regulator will adopt an integrated, overarching code after consultation with relevant stakeholders and subject to parliamentary approval.207 Moreover, the regulator may require annual transparency reports concerning compliance and other information regarding content moderation practices.208 Regarding oversight, and as part of the duty of care, the companies are expected to operate user complaint functions.209 Finally, regarding user redress, whether to establish a “super complaints” body to defend the interests of users was left open to further consultation.210
The OHWP initiative thus reflects a combination of two mechanisms: regulatory imposition of a new duty of care and elaboration of the new duty into concrete measures through codes of best practices, adopted voluntarily and overseen by a designated regulator. However, the OHWP does not propose to extend GAL measures or public law principles as such over the online platforms. Concerning transparency and oversight obligations, the OHWP proposes a very low threshold: annual transparency reports and effective user complaint functions. The R-OHWP reinforced this tendency by maintaining a minimal standard of transparency and accountability: information disclosure is required only to the degree necessary for the regulator to assess compliance and the effectiveness of the policy implemented by the relevant company; this standard does not, however, extend to a duty to disclose information relevant to an individual affected by the company’s decisions.211 Further, any “super complaints” function operated by the regulator will not serve as an external review of individual disputes, and companies are expected to develop their own internal bodies for such purposes.212
In contrast to the limited standard proposed by the UK initiative, this article proposes a broader governance regime emulating genuine administrative law standards to better guarantee individual digital human rights. Accordingly, transparency obligations should not be reduced to mere annual reports, and the proposed internal user complaint function should be supplemented by an external and objective review procedure that allows individuals to challenge the decision in each individual case. As long as oversight is conducted by the same operating body, a for-profit corporation, the fear is that it will not be independent, and it therefore will not gain public legitimacy and trust.213
The OHWP’s underlying rationale is to impose a new duty derived from tort law. A “duty of care” refers to the “reasonableness” standard of conduct, which requires avoiding foreseeable harm to others. A negligence cause of action is established when a person who is under a legal duty of care deviates from that reasonable standard of conduct.214 The OHWP policy is thus to rely on private tort-law mechanisms to promote societal policy goals relating to the digital sphere. There is an increasing use of tort law as a vehicle for promoting societal and public values rather than interpersonal justice.215 More specifically, tort is becoming an important path for establishing corporate accountability for human rights violations.216 Yet, the question is whether tort law is the appropriate and most effective legal vehicle for designing the substantive norms of conduct in the digital speech environment. If public policy considerations support corporate accountability for human rights, including free speech, the application of tort law norms might best be viewed as a complementary means of enforcement rather than as a sole path. The ability of tort law to promote adherence to human rights is not without limits, especially since its ex ante effect of guiding conduct is only a byproduct of imposing ex post liability on a tortfeasor to compensate for injuries he or she has caused.217 In contrast, the lesson drawn from the normative and doctrinal conceptualization of the GAL movement is that imposing on the private sector new duties that derive directly from the realm of public law might be an effective path toward guaranteeing adherence to substantive and procedural human rights. Such a path may build the needed trust in the digital societal sphere.218
5.3. The EU proposal for a Digital Services Act
In December 2020, the EU published a proposal for a Digital Services Act (DSA), aimed at ensuring a safe and accountable online environment.219 The Council and the European Parliament reached an agreement on approving the DSA in April 2022, and the DSA is therefore expected to be approved in the near future.220 The DSA reflects an attempt to establish comprehensive and systematic regulation of the various online services and to introduce some safeguards for fundamental rights in the online environment. This initiative is the first major piece of EU legislation for the digital sector since the e-Commerce Directive of 2000. The newly established regime will serve as a building block for the European community’s overall digital strategy, shaping digital governance principles in a harmonized way.221
The DSA is based on the principle of tailoring the obligations imposed to the nature of the service and the size of the service provider.222 The DSA differentiates among various digital service providers, and while some general obligations are imposed on most services, stricter obligations are imposed on online platforms, which are defined narrowly so as to encompass services such as those provided by social networks (e.g., Facebook) and content storage and dissemination platforms (e.g., YouTube).223 As was clarified at the outset of the DSA proposal, the proposed regulations “do not provide fully-fledged rules on the procedural obligations related to illegal content and they only include basic rules on transparency and accountability of service providers and limited oversight mechanisms.”224 It was further stipulated that there is no general obligation to monitor information or duty to actively search for illegal activity.225 In this respect, the DSA does not follow the German Hate Speech Act, and online providers may not be held liable for failure to remove illegal content, thus reducing fears of a massive chilling effect on freedom of speech.226 At the same time, however, the DSA reflects a significant move toward the expansion of various procedural obligations imposed on online service providers.
When a service provider serves 10% or more of the EU population, it is deemed justified to impose stricter procedural rules on its activity.227 With respect to such very large online platforms, while there are no monitoring obligations, additional obligations are imposed, such as risk assessment concerning the traffic of illegal content and the negative effects of content moderation on freedom of speech.228 In line with the general policy of setting only general accountability duties, these very large online platforms are obliged to “put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified.”229 Yet, the DSA does not specify exactly what these measures are.
The DSA covers the three core principles of transparency, giving reason, and external objective review: Chapter III of the DSA is focused on “due diligence obligations for a transparent and safe online environment,” and it includes obligations concerning the procedures for the operation of the various services. For example, service providers are obliged to provide accessible information regarding any policy and contractual terms relating to content moderation, including any measures and tools used for such purpose, whether based on algorithmic or human decision-making. While the DSA offers measures to open the “black box” of algorithmic content moderation with certain transparency obligations,230 it does not require human determination in the process, despite many concerns expressed on the matter. It is further specified that service providers are required to “act in a diligent, objective and proportionate manner,” and “with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter.”231 The term “due regard” reflects the overall policy of the DSA to impose only partial procedural obligations which do not rise to the level of imposing genuine public-law standards on service providers.
In line with this rather limited standard, the DSA requires the various service providers periodically to publish transparency reports, which, while very detailed, do not include explicit transparency obligations concerning individual entitlements.232 Regarding the reasoning of decisions to remove content, service providers are required to inform the recipient, “at the latest at the time of the removal or disabling of access, of the decision and provide a clear and specific statement of reasons for that decision.”233 This obligation specifies the content to be included in such a notification, ensuring its substantial basis. However, regarding the possibility of challenging the decision, it is stipulated that the service provider must provide information concerning the available options of “internal complaint-handling mechanisms, out-of-court dispute settlement and judicial redress.” In other words, external objective judicial review may be available in accordance with applicable laws but is not mandatory.234 With respect to online platforms in particular, the establishment of an easily accessible internal complaint-handling mechanism, as well as in some cases an out-of-court dispute settlement mechanism, is mandatory.235 The DSA further proposes to establish a “certified” out-of-court dispute settlement body that would satisfy basic standards of independence and apply “clear and fair rules of procedure.”236
To summarize, the DSA represents a landmark legal move by promoting a digital governance regime that imposes extensive obligations on various online service providers, aimed at enhancing their accountability as the digital sphere’s gatekeepers. However, this initiative stops short of imposing full-fledged public law principles or genuine administrative-law-like procedures. The most far-reaching duties are to act with “due regard” for human rights and to provide services that meet the threshold of procedural “objectivity” and “proportionality.” These standards are an important step in the process of introducing public law principles into the digital governance environment. Nevertheless, the DSA does not aspire to emulate the GAL framework, which implements a “fuller package” of administrative law norms and human rights guarantees in the MNCs arena. The DSA’s explicit objective is to present, at least as a first step, a regulation much less ambitious in scope. Though various reasons, including realpolitik considerations, may justify this policy, it is important to understand what the DSA does not purport to be, and to recognize its potential limitations.
6. Concluding remarks
How will the governance of the digital sphere look in a decade or two? Prophecy, it is said, was given to fools, but it seems that the rapid evolution of technology will inevitably be accompanied by a major legal leap, introducing procedural and substantive public-law standards into the digital sphere. Human interactions of all kinds are proliferating in the digital sphere. Therefore, the underlying rationale of protecting human rights, along with procedural justice guarantees, will necessarily be replicated in the digital locus. The focal point of this article is a major societal development: the evolution of a global digital speech sphere that functions as the backbone of contemporary democracies. Free speech has always triggered questions concerning its limits and its balance with conflicting interests. Yet, the global digital speech sphere has transferred control over the free speech environment into the hands of private corporations. Consequently, a handful of online giants govern global public societal life. This article belongs to a growing body of scholarship on the social and legal aspects of the evolving digital speech sphere, and it proposes to overcome the major obstacle to the imposition of public-law standards on online giants by locating the relevant legal field within global administrative law. GAL encompasses a dynamic legal field that advocates introducing basic public administration principles of “good governance” to a wide range of supranational entities, including multinational commercial corporations. Therefore, GAL may serve as a legal anchor and as a discourse initiator for a normatively and doctrinally adequate framework that would move the governance of the digital speech sphere to its appropriate location—the realm of public law.
This research was supported by the Research Authority, College of Management, Israel.
1 Jack Balkin, Old School/New School Speech Regulation, 127 Harv. L. Rev. 2296, 2306 (2014).
2 Hannah Bloch-Wehba, Global Platform Governance: Private Power in the Shadow of the State, 72 S.M.U. L. Rev. 27, 33–7 (2019).
3 Benedict Kingsbury, Nico Krisch, & Richard B. Stewart, The Emergence of Global Administrative Law, 68 Law & Contemp. Probs. 15 (2005).
4 Packingham v. North Carolina, 137 S. Ct. 1730, 1737 (2017).
5 Jack M. Balkin, Digital Speech and Democratic Culture: A Theory of Freedom of Expression for the Information Society, 79 N.Y.U. L. Rev. 1, 35 (2004); Kate Klonick, The New Governors: The People, Rules, and Processes Governing Online Speech, 131 Harv. L. Rev. 1598, 1664 (2018).
6 Evelyn Mary Aswad, The Future of Freedom of Expression Online, 17 Duke L. & Tech. Rev. 26, 31 (2018).
7 Packingham, supra note 4.
8 Kitsuron Sangsuvan, Balancing Freedom of Speech on the Internet Under International Law, 39 N.C.J. Int’l L. & Com. Reg. 701 (2014).
9 See, e.g., Danielle Keats Citron, Cyber Civil Rights, 89 B.U. L. Rev. 61 (2009); Rory Van Loo, The New Gatekeepers: Private Firms as Public Enforcers, 106 Va. L. Rev. 467 (2020); Evelyn Douek, Governing Online Speech: From “Posts-as-Trumps” to Proportionality & Probability, 121 Colum. L. Rev. 759 (2021); Thomas E. Kadri, Digital Gatekeepers, 99 Tex. L. Rev. 951 (2021); Orit Fischman Afori, Online Rulers as Hybrid Bodies: The Case of Infringing Content Monitoring, 23 U. Pa. J. Const. L. 121 (2021).
10 Eyal Benvenisti, Upholding Democracy Amid the Challenges of New Technology: What Role for Global Governance?, 29 Eur. J. Int’l L. 9, 55–6, 70 (2018).
11 Stefan Kulk, Internet Intermediaries and Copyright Law: EU and US Perspectives, 10–11 (2019).
12 Aswad, supra note 6, at 30.
13 In the pre-digital era, the traditional speech vehicles were mainly local, with less exposure. See Daniel L. Brenner, Ownership and Content Regulation in Merging and Emerging Media, 45 DePaul L. Rev. 1009 (1996); C. Edwin Baker, Media Concentration: Giving Up on Democracy, 54 Fla. L. Rev. 839, 902–19 (2002).
14 Similar to global commerce, which has always been attached to the public law realm. See Fabrizio Cafaggi, The Many Features of Transnational Private Rule-Making: Unexplored Relationships between Custom, Jura Mercatorum and Global Private Regulation, 36 U. Pa. J. Int’l L. 875, 878, 888 (2015).
15 Communications Decency Act, 47 U.S.C. § 230 (2012) [hereinafter CDA]. This immunity is without prejudice to any other law.
16 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market, 2000 O.J. (L 178) 1.
17 47 U.S.C. § 230.
18 Balkin, supra note 1, at 2309; Klonick, supra note 5, at 1602; Felix Wu, Collateral Censorship and the Limits of Intermediary Immunity, 87 Notre Dame L. Rev. 293, 347–9 (2011).
19 Klonick, supra note 5, at 1630–47, 1663.
20 Daphne Keller, Who Do You Sue? State and Platform Hybrid Power over Online Speech (Hoover Inst., Stanford Univ., Aegis Series Paper No. 1902, 2019).
21 Danielle Keats Citron, Extremist Speech, Compelled Conformity, and Censorship Creep, 93 Notre Dame L. Rev. 1035, 1047 (2018); Aswad, supra note 6, at 42. See also Věra Jourová, Code of Conduct on Countering Illegal Hate Speech Online: First Results on Implementation, Eur. Comm’n (Dec. 2016), http://ec.europa.eu/information_society/newsroom/image/document/2016-50/factsheet-code-conduct-8_40573.pdf.
22 Richard Ashby Wilson & Molly K. Land, Hate Speech on Social Media: Towards a Context-Specific Content Moderation Policy, 52 Conn. L. Rev. 1029 (2021).
23 Yochai Benkler, Robert Faris, & Hal Roberts, Network Propaganda (2018); Nathalie Maréchal et al., Tackling the “Fake” Without Harming the “News”: A Paper Series on Regulatory Responses to Misinformation, Wikimedia/Yale L. Sch. Initiative on Intermediaries & Info. (Mar. 8, 2021), https://ssrn.com/abstract=3804878.
24 See, e.g., Rules and Policies: Copyright, YouTube, www.youtube.com/howyoutubeworks/policies/copyright/ (last visited Sept. 28, 2022).
25 See, e.g., Parental Controls: Family-Friendly Experiences, Google, https://safety.google/families/ (last visited Sept. 28, 2022).
26 Netzwerkdurchsetzungsgesetz [BGBl. I p. 3352] [NetzDG] [Act to Improve Enforcement of the Law in Social Networks], art. 1(1), translation at www.bmj.de/SharedDocs/Gesetzgebungsverfahren/Dokumente/NetzDG_engl.pdf [hereinafter Hate Speech Act].
27 Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC, art. 17, 2019 O.J. (L 130) 92 [hereinafter Copyright Digital Single Market Directive].
28 See, e.g., Case C-314/12, UPC Telekabel Wien GmbH v. Constantin Film Verleih GmbH, ECLI:EU:C:2014:192 (discussing site-blocking order due to copyright infringement); Case C-18/18, Eva Glawischnig-Piesczek v. Facebook Ireland Ltd., ECLI:EU:C:2019:821. See also Martin Husovec, Injunctions against Intermediaries in the European Union: Accountable But Not Liable? (2017).
29 Daphne Keller, Internet Platforms: Observations on Speech, Danger, and Money 18 (Hoover Institution, Aegis Series Paper No. 1807, 2018), www.hoover.org/sites/default/files/research/docs/keller_webreadypdf_final.pdf.
30 Copyright Digital Single Market Directive, supra note 27, art. 17(4)(c).
31 Jennifer M. Urban, Brianna L. Schofield, & Joe Karaganis, Takedown in Two Worlds: An Empirical Analysis, 64 J. Copyright Soc’y U.S.A. 483 (2017).
32 Jack M. Balkin, Free Speech Is a Triangle, 118 Colum. L. Rev. 2011 (2018); Kyle Langvardt, A New Deal for the Online Public Sphere, 26 Geo. Mason L. Rev. 341, 349 (2018); Paul Schiff Berman, Cyberspace and the State Action Debate: The Cultural Value of Applying Constitutional Norms to “Private” Regulation, 71 U. Colo. L. Rev. 1263 (2000).
33 See, e.g., Rebecca Zipursky, Nuts About NETZ: The Network Enforcement Act and Freedom of Expression, 42 Fordham Int’l L.J. 1325, 1328 (2019); Mathias Hong, The German Network Enforcement Act and the Presumption in Favour of Freedom of Speech, Verfassungsblog (Jan. 22, 2018), https://verfassungsblog.de/the-german-network-enforcement-act-and-the-presumption-in-favour-of-freedom-of-speech/; Shoshana Zuboff, Big Other: Surveillance Capitalism and the Prospects of an Information Civilization, 30 J. Info. Tech. 75 (2015).
34 See, e.g., Barrie Sander, Freedom of Expression in the Age of Online Platforms: The Promise and Pitfalls of a Human Rights-Based Approach to Content Moderation, 43 Fordham Int’l L.J. 939, 952–3 (2020).
35 Jeffrey Cobia, The Digital Millennium Copyright Act Takedown Notice Procedure: Misuses, Abuses, and Shortcomings of the Process, 10 Minn. J. Sci. & Tech. 387, 390–3 (2009); Urban et al., supra note 31.
36 Langvardt, supra note 32.
37 Niva Elkin-Koren & Maayan Perel, Separation of Functions for AI: Restraining Speech Regulation by Online Platforms (Feb. 14, 2020), https://ssrn.com/abstract=3439261.
38 Hum. Rts. Council, Special Rapporteur Report on the Promotion and Protection of Freedom of Opinion and Expression, A/HRC/17/27 (May 16, 2011), https://www2.ohchr.org/english/bodies/hrcouncil/docs/17session/a.hrc.17.27_en.pdf.
39 Id. at 75.
40 Id. at 76.
41 Id. at 76–7.
42 Special Rapporteur, Overview of Submission Received in Preparation of the Report on the Protection and Promotion of the Right to Freedom of Opinion and Expression, A/HRC/38/35/Add.1, at 2–3 (June 6, 2018), https://digitallibrary.un.org/record/1638481?ln=zh_CN.
43 Id. at 40–9.
44 Id. at 44–8, 70–2.
45 Special Rapporteur on the Protection and Promotion of the Right to Freedom of Opinion and Expression, A/74/486, at 14 (Oct. 9, 2019), https://digitallibrary.un.org/record/3833657.
46 Id. at 15.
47 Id. at 22.
48 Id. at 23. See also David Kaye, Speech Police: The Global Struggle to Govern the Internet 112 (2019).
49 Esteban Ortiz-Ospina, The Rise of Social Media, Our World in Data (Sept. 18, 2019), https://ourworldindata.org/rise-of-social-media.
50 Facebook Reports Fourth Quarter and Full Year 2020 Results, PR Newswire (Jan. 27, 2021), https://www.prnewswire.com/news-releases/facebook-reports-fourth-quarter-and-full-year-2020-results-301216628.html.
51 We Are Committed to Protecting Your Voice and Helping You Connect and Share Safely, Meta, https://about.fb.com/actions/promoting-safety-and-expression/ (last visited Sept. 29, 2022); Protecting Privacy and Security, Meta, https://about.fb.com/actions/protecting-privacy-and-security/ (last visited Sept. 29, 2022); We Are Committed to Securing Our Platforms, Providing Transparency and Empowering People to Vote, Meta, https://about.fb.com/actions/preventing-election-interference/ (last visited Sept. 29, 2022).
52 Corporate Human Rights Policy, Meta, https://about.fb.com/wp-content/uploads/2021/03/Facebooks-Corporate-Human-Rights-Policy.pdf (last visited Sept. 29, 2022).
53 See Facebook Oversight Board Charter, Meta (Sept. 19, 2019), https://about.fb.com/wp-content/uploads/2019/09/oversight_board_charter.pdf; Kate Klonick, The Facebook Oversight Board: Creating an Independent Institution to Adjudicate Online Free Expression, 129 Yale L.J. 2419 (2020).
54 Klonick, supra note 53, at 2427.
55 Dipayan Ghosh, Facebook’s Oversight Board Is Not Enough, Harv. Bus. Rev. (Oct. 16, 2019), https://hbr.org/2019/10/facebooks-oversight-board-is-not-enough. See also Allana Akhtar, Donald Trump Reportedly Called Mark Zuckerberg and Asked Him to Make Changes to the Panel That’s Now Responsible for Reviewing Trump’s Ban from the Platform, Yahoo! News (Feb. 12, 2021), https://news.yahoo.com/donald-trump-reportedly-called-mark-174508229.html.
56 Evelyn Douek, Facebook’s Oversight Board: Move Fast with Stable Infrastructure and Humility, 21 N.C. J.L. & Tech. 1, 5 (2019).
57 Id. at 7. See also Evelyn Douek, The Facebook Oversight Board’s First Decisions: Ambitious, and Perhaps Impractical, Lawfare (Jan. 28, 2021, 11:23 AM), https://www.lawfareblog.com/facebook-oversight-boards-first-decisions-ambitious-and-perhaps-impractical.
58 Kingsbury, Krisch, & Stewart, supra note 3, at 20–3.
59 For various materials on GAL, see Inst. Int’l L. & Justice, www.iilj.org (last visited Sept. 29, 2022). See also Kingsbury, Krisch, & Stewart, supra note 3; Sebastian Lopez Escarcena, Contextualizing Global Administrative Law, 21 Gonzaga J. Int’l L. 57 (2018); Christoph Mollers, Ten Years of Global Administrative Law, 13 Int’l J. Const. L. 469 (2015); Susan Marks, Naming Global Administrative Law, 37 N.Y.U. J. Int’l. L. & Pol. 995 (2005); Richard B. Stewart, U.S. Administrative Law: A Model for Global Administrative Law, 68 Law & Contemp. Probs. 63 (2005).
60 Sabino Cassese & Elisa D’Alterio, Introduction: The Development of Global Administrative Law, in Research Handbook on Global Administrative Law 1, 8 (Sabino Cassese ed., 2016).
61 Kingsbury, Krisch, & Stewart, supra note 3, at 37–40.
62 Benedict Kingsbury & Richard B. Stewart, Legitimacy and Accountability in Global Regulatory Governance: The Emerging Global Administrative Law and the Design and Operation of Administrative Tribunals of International Organizations, in International Administrative Tribunals in a Changing World 1, 9 (Papanikolaou ed., 2008).
63 Kingsbury, Krisch, & Stewart, supra note 3, at 17. See also Danielle Hanna Rached, Doomed Aspiration of Pure Instrumentality: Global Administrative Law and Accountability, 3 Global Const. 338 (2014); David Dyzenhaus, Accountability and the Concept of (Global) Administrative Law, 2009 Acta Juridica 3; Simon Chesterman, Globalization Rules: Accountability, Power, and the Prospects for Global Administrative Law, 14 Global Governance 39 (2008).
64 Cassese & D’Alterio, supra note 60, at 8; Kingsbury, Krisch, & Stewart, supra note 3, at 16–17; Kingsbury & Stewart, supra note 62, at 15–19.
65 Daniel C. Esty, Good Governance at the Supernational Scale: Globalizing Administrative Law, 115 Yale L.J. 1490, 1515–21 (2006); Kingsbury, Krisch, & Stewart, supra note 3, at 44–51.
66 Benedict Kingsbury, Megan Donaldson, & Rodrigo Vallejo, Global Administrative Law and Deliberative Democracy, in Oxford Handbook of International Legal Theory 1, 5 (A. Orford & F. Hoffmann eds., 2016).
67 Esty, supra note 65, at 1522.
68 Stephen Breyer, Administrative Law and Regulatory Policy (3d ed. 1992); Administrative Procedure Act, Pub. L. No. 79-404 (1946).
69 Kingsbury & Stewart, supra note 62, at 8. See contra Escarcena, supra note 59, at 68.
70 Cassese & D’Alterio, supra note 60, at 8.
71 Kingsbury & Stewart, supra note 62, at 8.
72 Cassese & D’Alterio, supra note 60, at 9; Kingsbury & Stewart, supra note 62, at 8–9. For an in-depth taxonomy of the various international oversight bodies, see Cesare P.R. Romano, A Taxonomy of International Rule of Law Institutions, 2 J. Int’l. Dispute Settlement 252, 247, 251, 253 (2011).
73 Cassese & D’Alterio, supra note 60, at 1; Kingsbury & Stewart, supra note 62, at 5.
74 Kingsbury & Stewart, supra note 62, at 5.
75 See Research Handbook on Global Administrative Law, supra note 60; A. von Bogdandy, Philipp Dann, & Matthias Goldmann, Developing the Publicness of Public International Law: Towards a Legal Framework for Global Governance Activities, 9 Ger. L.J. 1375, 1377 (2008); Escarcena, supra note 59, at 70–5.
76 For a review of the various controversies, see Escarcena, supra note 59, at 70–5.
77 Edoardo Chiti, Where Does GAL Find Its Legal Grounding?, 13 Int’l J. Const. L. 486 (2015).
78 Christoph Mollers, Ten Years of Global Administrative Law, 13 Int’l J. Const. L. 469, 471 (2015); Escarcena, supra note 59, at 70.
79 Jose Alvarez, “Beware: Boundary Crossings”: A Critical Appraisal of Public Law Approaches to International Investment Law, 17 J. World Invest. Trade 171, 184–5, 186–90 (2016).
80 Alvarez, supra note 79, at 227.
81 Kingsbury, Donaldson, & Vallejo, supra note 66, at 2. See also Escarcena, supra note 59, at 80; Alvarez, supra note 79, at 227.
82 Kingsbury, Donaldson, & Vallejo, supra note 66, at 3–4.
83 Kingsbury, Krisch, & Stewart, supra note 3, at 17.
84 For the difference between these three notions, see Cassese & D’Alterio, supra note 60, at 2.
85 Cassese & D’Alterio, supra note 60, at 6.
86 For a different taxonomy, see Kingsbury, Krisch, & Stewart, supra note 3, at 20.
87 Cassese & D’Alterio, supra note 60, at 5; Kingsbury & Stewart, supra note 62, at 5.
88 Von Bogdandy, Dann, & Goldmann, supra note 75, at 1376; Benedict Kingsbury & Lorenzo Casini, Global Administrative Law Dimensions of International Organizations Law, 6 Int’l Org. L. Rev. 319, 324 (2009).
89 Von Bogdandy, Dann, & Goldmann, supra note 75, at 1377.
90 Von Bogdandy, Dann, & Goldmann, supra note 75, at 1380.
91 Cassese & D’Alterio, supra note 60, at 7.
92 Cassese & D’Alterio, supra note 60, at 7; Kingsbury, Krisch, & Stewart, supra note 3, at 29–31.
93 Cassese & D’Alterio, supra note 60, at 5. For a review of supranational private regulatory bodies, see Cafaggi, supra note 14, at 899–903.
94 IOC Principles, Int’l Olympic Comm., https://olympics.com/ioc/principles (last visited Sept. 29, 2022). For the special acknowledgement of an international legal entity under Swiss law, see David J. Ettinger, The Legal Status of the International Olympic Committee, 4 Pace Y.B. Int’l L. 97 (1992).
95 About Us, Int’l Org. for Standardization, www.iso.org/about-us.html (last visited Sept. 29, 2022).
96 Kingsbury & Stewart, supra note 62, at 11.
97 Klaus J. Hopt & Thomas Von Hippel, Preface to Comparative Corporate Governance of Non-Profit Organizations at xxxv (Klaus J. Hopt & Thomas Von Hippel eds., 2010); Helmut K. Anheier, What Kind of Nonprofit Sector? What Kind of Society? Comparative Policy Reflections, in Comparative Corporate Governance of Non-Profit Organizations, supra, at 3, 4.
98 Hopt & Von Hippel, supra note 97, at xl.
99 Mark James & Guy Osborn, The Olympics, Transnational Law and Legal Transplants: The International Olympic Committee, Ambush Marketing and Ticket Touting, 36 Legal Stud. 93 (2016).
100 Kingsbury, Krisch, & Stewart, supra note 3, at 22. See, e.g., World Anti-Doping Agency, www.wada-ama.org/en/who-we-are (last visited Sept. 29, 2022).
101 The History of ICANN, Internet Corp. for Assigned Names & Numbers, https://www.icann.org/history (last visited Sept. 29, 2022).
102 Kingsbury, Krisch, & Stewart, supra note 3, at 22.
103 Matthias Hartwig, ICANN: Governance by Technical Necessity, in The Exercise of Public Authority by International Organizations 575 (Armin von Bogdandy, Rudiger Wolfrum, Jochen von Bernstorff, Philipp Dann, & Matthias Goldmann eds., 2010).
104 For ICANN’s explanation as to why they have adopted voluntarily public law standards as binding principles, see Accountability Mechanisms, Internet Corp. for Assigned Names & Numbers, www.icann.org/resources/accountability (last visited Sept. 29, 2022).
105 John Gerard Ruggie, The Paradox of Corporate Globalization: Disembedding and Reembedding Governing Norms 5 (M-RCBG Faculty Working Paper Series No. 2020-01, Mar. 2020), www.hks.harvard.edu/sites/default/files/centers/mrcbg/FWP_2020-01v2.pdf.
106 Sean D. Murphy, Taking Multinational Corporate Codes of Conduct to the Next Level, 43 Colum. J. Transnat’l L. 389, 397 (2005); Allison M. Snyder, Holding Multinational Corporations Accountable: Is Non-Financial Disclosure the Answer?, Colum. Bus. L. Rev. 565, 566 (2007).
107 Robert Reich, Who Is Them?, Harv. Bus. Rev. 77 (1991).
108 Beth Stephens, The Amorality of Profit: Transnational Corporations and Human Rights, 20 Berkeley J. Int’l L. 45, 54–9 (2002).
109 See Research & Publications, Harv. Kennedy School, Mossavar-Rahmani Ctr. for Bus. & Gov’t, www.hks.harvard.edu/centers/mrcbg/programs/cri/research (last visited Sept. 29, 2022); Tawny Aine Bridgeford, Imputing Human Rights Obligations on Multinational Corporations: The Ninth Circuit Strikes Again in Judicial Activism, 18 Am. U. Int’l L. Rev. 1009 (2003); Sukanya Pillay, And Justice for All: Globalization, Multinational Corporations, and the Need for Legally Enforceable Human Rights Protections, 81 U. Det. Mercy L. Rev. 489 (2004).
110 Fabrizio Cafaggi, The Regulatory Functions of Transnational Commercial Contracts: New Architectures, 36 Fordham Int’l L.J. 1557 (2013).
111 Simon Chesterman, The Turn to Ethics: Disinvestment from Multinational Corporations for Human Rights Violations: The Case of Norway’s Sovereign Wealth Fund, 23 Am. U. Int’l L. Rev. 577 (2007); Snyder, supra note 106, at 573.
112 Cynthia A. Williams & John M. Conley, Is There an Emerging Fiduciary Duty to Consider Human Rights?, 74 U. Cin. L. Rev. 75, 77 (2005).
113 Ruggie, supra note 105; Org. for Econ. Co-operation & Dev., Guidelines for Multinational Enterprises: Responsible Business Conduct, http://mneguidelines.oecd.org/; ISO 26000: Guidance on Social Responsibility (2010), www.iso.org/standard/42546.html.
114 Stephens, supra note 108, at 47; Reuven S. Avi-Yonah, The Cyclical Transformations of the Corporate Form: A Historical Perspective on Corporate Social Responsibility, 30 Del. J. Corp. L. 767 (2005).
115 Murphy, supra note 106, at 397–9.
116 Rachel Kyte, Balancing Rights with Responsibilities: Looking for the Global Drivers of Materiality in Corporate Social Responsibility and the Voluntary Initiatives that Develop and Support Them, 23 Am. U. Int’l L. Rev. 559 (2008); Williams & Conley, supra note 112, at 102.
117 Joseph E. Stiglitz, Regulating Multinational Corporations: Towards Principles of Cross-border Legal Frameworks in a Globalized World: Balancing Rights with Responsibilities, 23 Am. U. Int’l L. Rev. 451 (2008); Usha Rodrigues & Mike Stegemoller, Placebo Ethics: A Study in Securities Disclosure Arbitrage, 96 Va. L. Rev. 1 (2010).
118 Stephens, supra note 108, at 78–81.
119 See, e.g., G20/OECD Principles of Corporate Governance (2015), http://dx.doi.org/10.1787/9789264236882-en.
120 Id. In the United States, most disclosure rules are mandatory; however, some limited “comply or explain” provisions have been introduced, such as under the Sarbanes–Oxley Act of 2002, 17 C.F.R. § 229.406(a)–(b), 15 U.S.C. § 7264 (2012).
121 The term was first coined by the Cadbury Report, and has been incorporated in the 2003 and more recent 2006 amendments to the UK Companies Act. See Report of the Committee on the Financial Aspects of Corporate Governance (1992), www.ecgi.org/codes/documents/cadbury.pdf; Companies Act 2006 c. 46, art. 13 (U.K.).
122 See Org. for Econ. Co-operation & Dev., Corporate Governance Factbook—2021 (2021), www.oecd.org/corporate/corporate-governance-factbook.htm; Virginia Harper Ho, Comply or Explain and the Future of Nonfinancial Reporting, 21 Lewis & Clark L. Rev. 317, 334 (2017); Fin. Reporting Council, What Constitutes an Explanation Under “Comply Or Explain”? Report of Discussions between Companies and Investors 5 (Feb. 2012), www.frc.org.uk/getattachment/a39aa822-ae3c-4ddf-b869-db8f2ffe1b61/what-constitutes-an-explanation-under-comply-or-exlpain.pdf.
123 Andrew Keay, Comply or Explain in Corporate Governance Codes: In Need of Greater Regulatory Oversight?, 34 Legal Stud. 279, 302 (2014).
124 Julia Black, Decentering Regulation: Understanding the Role of Regulation and Self-Regulation in a “Post-Regulatory” World, 54 Current Legal Probs. 103, 115 (2001).
125 Harper Ho, supra note 122, at 334. However, it has been argued that the quest for a single, global corporate governance metric is misguided. See Lucian A. Bebchuk & Assaf Hamdani, The Elusive Quest for Global Governance Standards, 157 U. Pa. L. Rev. 1263, 1269 (2009).
126 Harper Ho, supra note 122, at 332–44; Snyder, supra note 106, at 576–86.
127 Harper Ho, supra note 122, at 329.
128 Org. for Econ. Co-operation & Dev., An Introduction to Online Platforms and Their Role in the Digital Transformation (2019), https://doi.org/10.1787/53e5f593-en.
129 Org. for Econ. Co-operation & Dev., Companies & Responsible Business Conduct 5, http://mneguidelines.oecd.org/RBC-and-platform-companies.pdf (last visited Sept. 29, 2022).
130 Murphy, supra note 106, at 431. For a call to establish global digital governance through corporate regulations applying to the major giant technology companies based in the US, see Orit Fischman Afori, Global Digital Governance Through the Back Door of Corporate Regulation, Fordham Intell. Prop. Media & Ent. L.J. (forthcoming 2022) (manuscript), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4215774.
131 Keay, supra note 123, at 280, 284–5.
132 For a similar criticism, see Rodrigues & Stegemoller, supra note 117, at 64.
133 K. Sabeel Rahman, The New Utilities: Private Power, Social Infrastructure, and the Revival of the Public Utility Concept, 39 Cardozo L. Rev. 1621, 1668–70 (2018).
134 See similarly Benvenisti, supra note 10, at 71, 79–81.
135 This proposed stance takes into consideration the criticism that GAL has become a popular term, occasionally used in irrelevant contexts, see Lorenzo Casini, Global Administrative Law Scholarship, in Research Handbook on Global Administrative Law 548, 563 (Sabino Cassese ed., 2016).
136 Kingsbury, Donaldson, & Vallejo, supra note 13, at 2.
137 Kingsbury, Donaldson, & Vallejo, supra note 66, at 2.
138 For the various schools of legal interpretation, see Richard H. Fallon Jr., The Meaning of Legal Meaning and Its Implications for Theories of Legal Interpretation, 82 U. Chi. L. Rev. 1235 (2015).
139 Much has been written on the importance of theory to practice. Theory has maintained its significance in the age of globalization, in which far more complex forms of new legal ordering are developing, see William Twining, Globalisation and Legal Theory 51 (2000).
140 Klonick, supra note 5.
141 Stephen Breyer, Administrative Law and Regulatory Policy (3d ed. 1992); Administrative Procedure Act, 5 U.S.C. §§ 551–9 (1946); Freedom of Information Act, 5 U.S.C. § 552 (2012).
142 Transparency Reporting Index, Access Now, https://www.accessnow.org/transparency-reporting-index/ (last visited Sept. 29, 2022).
143 Administrative Procedure Act, supra note 141.
144 Paul Daly, Artificial Administration: Administrative Law in the Age of Machines (Nov. 25, 2019), https://ssrn.com/abstract=3493381; Monika Zalnieriute, Lyria Bennett Moses, & George Williams, The Rule of Law and Automated Government Decision-Making, 82 Mod. L. Rev. 425, 444 (2019); Cary Coglianese & David Lehr, Regulating by Robot: Administrative Decision Making in the Machine-Learning Era, 105 Geo. L.J. 1147, 1205–13 (2017).
145 Joshua A. Kroll et al., Accountable Algorithms, 165 U. Pa. L. Rev. 633, 657–60 (2017). See for example Canadian initiatives for transparent algorithms used by the government: Responsible use of artificial intelligence (AI), Gov’t Can., www.canada.ca/en/government/system/digital-government/digital-government-innovations/responsible-use-ai.html (last visited Sept. 29, 2022); Digital Nations Charter, Gov’t Can., www.canada.ca/en/government/system/digital-government/improving-digital-services/digital9charter.html (last visited Sept. 29, 2022).
146 For a similar view based on fiduciary duties, see Jack M. Balkin, Free Speech in the Algorithmic Society: Big Data, Private Governance, and New School Speech Regulation, 51 U.C. Davis L. Rev. 1149, 1162 (2018).
147 Breyer, supra note 141; Administrative Procedure Act, supra note 141.
148 Bryant G. Garth & Yves Dezalay, Introduction to Global Prescriptions: The Production, Exportation, and Importation of a New Legal Orthodoxy 1, 2–3 (Bryant G. Garth & Yves Dezalay eds., 2002).
149 It was observed that one of the challenges of legal ordering in the age of globalization is the need for “construction of conceptual framework and a meta-language of legal theory that can transcend legal cultures.” Twining, supra note 139, at 53.
150 Kingsbury, Krisch, & Stewart, supra note 3, at 23.
151 Cassese & D’Alterio, supra note 60, at 7.
152 Escarcena, supra note 59, at 74.
153 For the classification of online giants as hybrid bodies, see Fischman-Afori, supra note 9.
154 Liesbeth Enneking, Foreign Direct Liability and Beyond: Exploring the Role of Tort Law in Promoting International Corporate Social Responsibility and Accountability 448 (May 11, 2012), https://ssrn.com/abstract=2206836.
155 Jennifer A. Zerk, Extraterritorial Jurisdiction: Lessons for the Business and Human Rights Sphere from Six Regulatory Areas 30–6, 176–80 (Corporate Social Responsibility Initiative Working Paper No. 59, 2010), www.hks.harvard.edu/m-rcbg/CSRI/publications/workingpaper_59_zerk.pdf.
156 Enneking, supra note 154, at 460–3.
157 Enneking, supra note 154, at 449.
158 Zerk, supra note 155, at 60–3, 82–6; Stephens, supra note 108, at 82–3. See also Fischman Afori, supra note 130.
159 Enneking, supra note 154, at 463–8.
160 Zerk, supra note 155, at 60–6.
161 For the holistic approach, see EU: More Ambitious DMA Needs to Shape Digital Markets of Our Future, Article19 (Mar. 11, 2021), https://www.article19.org/resources/eu-dma-needs-to-shape-digital-markets-future/.
162 George J. Stigler, The Origin of the Sherman Act, 14 J. Legal Stud. 1 (1985).
163 Sherman Act, 15 U.S.C. § 2 (2018).
164 State Oil Co. v. Khan, 522 U.S. 3, 10 (1997). See also Gregory J. Werden, Antitrust’s Rule of Reason: Only Competition Matters, 79 Antitrust L.J. 713, 726–37 (2014).
165 Federal Trade Commission Act, 15 U.S.C. § 45 (2018); Jonathan B. Baker et al., Unlocking Antitrust Enforcement, 127 Yale L.J. 1916 (2018).
166 Lina M. Khan, The Ideological Roots of America’s Market Power Problem, 127 Yale L.J. F. 960 (2017); Tim Wu, Antitrust & Corruption: Overruling Noerr (Colum. Pub. L. Res. Paper No. 14-663, June 18, 2020), https://ssrn.com/abstract=3630610.
167 See, e.g., Elec. Frontier Found., www.eff.org/ (last visited Sept. 29, 2022); Cory Doctorow, Competitive Compatibility: Year in Review 2020 (Dec. 30, 2020), https://www.eff.org/deeplinks/2020/12/competitive-compatibility-year-review.
168 Subcomm. on Antitrust, Commercial & Admin. Law of the H. Comm. on the Judiciary, Investigation of Competition in Digital Markets: Majority Staff Report and Recommendations (2020), https://judiciary.house.gov/uploadedfiles/competition_in_digital_markets.pdf [hereinafter Antitrust Committee Report].
169 Id. at 19–21; Lina M. Khan, The Separation of Platforms and Commerce, 119 Colum. L. Rev. 973 (2019).
170 Antitrust Committee Report, supra note 168, at 19–21; Doctorow, supra note 167.
171 Proposal for a Regulation of the European Parliament and of the Council on Contestable and Fair Markets in the Digital Sector (Digital Markets Act), COM(2020) 842 final (Dec. 12, 2020), https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52020PC0842&from=en [hereinafter DMA Proposal].
172 Press Release, Council Eur. Union, DMA: Council Gives Final Approval to New Rules for Fair Competition Online (July 18, 2022), https://www.consilium.europa.eu/en/press/press-releases/2022/07/18/dma-council-gives-final-approval-to-new-rules-for-fair-competition-online/.
173 Fed. Trade Comm’n Hearings on Competition and Consumer Protection in the 21st Century, Comments on the September 21 Hearing, Topic 1 Updating the Consumer Welfare Standard (Nov. 15, 2018), https://www.eff.org/document/ftc-hearings-competition-and-consumer-protection-21st-century-comments-september-21-hearing [hereinafter EFF Hearing]; Michael Katz & Jonathan Sallet, Multisided Platforms and Antitrust Enforcement, 127 Yale L.J. 2142, 2143–5 (2018).
174 See, e.g., Fed. Trade Comm’n v. Meta Platforms, Case No. 1:20-cv-03590 (D.D.C. 2021), www.ftc.gov/system/files/documents/cases/051_2021.01.21_revised_partially_redacted_complaint.pdf.
175 Khan, supra note 166; Terrell McSweeny, FTC 2.0: Keeping Pace with Online Platforms, 32 Berkeley Tech. L.J. 1027, 1038–9 (2017).
176 EU: More Ambitious DMA Needs to Shape Digital Markets of Our Future, supra note 161. The DMA proposal is explicitly “concerned with economic imbalances”: see DMA Proposal, supra note 171, at 3, art. 1.
177 For example, the German Hate Speech Act applies to services with 2 million users, which do not necessarily meet the market-dominance threshold.
178 McSweeny, supra note 175, at 1034.
179 Santa Clara Principles, https://santaclaraprinciples.org (last visited Sept. 29, 2022).
180 Platform Accountability and Consumer Transparency Act, S. 4066, 116th Cong. (2020).
181 Id. § 3.
182 Id. § 4(2).
183 Id. § 5(a).
184 Id. § 5(b)–(c).
185 Id. § 5(d).
186 Id. § 5(e).
187 Id. § 6.
188 Fischman-Afori, supra note 9.
189 See Aaron Mackey, The PACT Act’s Attempt to Help Internet Users Hold Platforms Accountable Will End Up Hurting Online Speakers, Elec. Frontier Found. (July 21, 2020), https://www.eff.org/deeplinks/2020/07/pact-acts-attempt-help-internet-users-hold-platforms-accountable-will-end-hurting.
190 See, e.g., Jennifer M. Urban, Joe Karaganis & Brianna L. Schofield, Notice and Takedown: Online Service Provider and Rightsholder Accounts of Everyday Practice, 64 J. Copyright Soc’y 371 (2017); Daniel Etcovitch, DMCA § 512 Pain Points: Music and Technology Industry Perspectives in Juxtaposition, 30 Harv. J. L. & Tech. 547 (2017).
191 Dep’t for Digital, Culture, Media & Sport, Online Harms White Paper (last updated Dec. 15, 2020), https://www.gov.uk/government/consultations/online-harms-white-paper/online-harms-white-paper [hereinafter OHWP].
192 Online Harms White Paper: Full Government Response to the Consultation, Command Paper No. 354 (last updated Dec. 15, 2020), www.gov.uk/government/consultations/online-harms-white-paper/outcome/online-harms-white-paper-full-government-response [hereinafter R-OHWP].
193 OHWP, supra note 191, para. 2.17.
194 Id. para. 3.16.
195 Id. para. 3.9.
196 R-OHWP, supra note 192, at 17, para. 1.3.
197 Id. at 16, para. 1.2.
198 Id. at 20, para. 1.10.
199 Id. at 24, para. 2.2.
200 Id. at 25, para. 2.4.
201 Id. at 49.
202 Id. at 26.
203 Id. at 5.
204 OHWP, supra note 191, para. 7.5.
205 Id. para. 3.20.
206 R-OHWP, supra note 192, at 16.
207 Id. at 41, 65–6.
208 OHWP, supra note 191, para. 3.23; R-OHWP, supra note 192, at 67–8.
209 OHWP, supra note 191, para. 3.25.
210 Id. para. 3.26; R-OHWP, supra note 192, at 71.
211 R-OHWP, supra note 192, at 71.
212 Id. at 71.
213 Enneking, supra note 154, at 452.
214 Restatement (Second) of Torts § 4 (Am. L. Inst. 1965); John C. P. Goldberg & Benjamin C. Zipursky, The Restatement (Third) and the Place of Duty in Negligence Law, 54 Vand. L. Rev. 657 (2001).
215 Ernest J. Weinrib, The Monsanto Lectures: Understanding Tort Law, 23 Val. U. L. Rev. 485 (1989).
216 Enneking, supra note 154, at 523–4.
217 Vladislava Stoyanova, Common Law Tort of Negligence as a Tool for Deconstructing Positive Obligations under the European Convention on Human Rights, 24 Int’l J. Hum. Rts. 632, 634 (2020).
218 See supra Section 3.
219 Proposal for a Regulation of the European Parliament and of the Council on a Single Market for Digital Services (DSA) and amending Directive 2000/31/EC, COM(2020) 825 final, at 1 (Dec. 15, 2020), https://eur-lex.europa.eu/legal-content/en/TXT/?uri=COM%3A2020%3A825%3AFIN [hereinafter DSA]; Eur. Comm’n, The Digital Services Act: Ensuring a Safe and Accountable Online Environment, https://ec.europa.eu/info/strategy/priorities-2019-2024/europe-fit-digital-age/digital-services-act-ensuring-safe-and-accountable-online-environment_en (last visited Sept. 29, 2022) [hereinafter Eur. Comm’n, The Digital Services Act].
220 Press Release, Digital Services Act: Council and European Parliament Provisional Agreement for Making the Internet a Safer Space for European Citizens (Apr. 23, 2022), https://www.consilium.europa.eu/en/press/press-releases/2022/04/23/digital-services-act-council-and-european-parliament-reach-deal-on-a-safer-online-space/.
221 Christoph Schmon & Cory Doctorow, EU and the Digital Services Act: 2020 Year in Review, Elec. Frontier Found. (Dec. 26, 2020), https://www.eff.org/deeplinks/2020/12/eu-and-digital-services-act-year-review.
222 Eur. Comm’n, The Digital Services Act, supra note 219.
223 DSA, supra note 219, art. 2 (Definitions).
224 Eur. Comm’n, The Digital Services Act, supra note 219, at 4.
225 DSA, supra note 219, art. 7.
226 Schmon & Doctorow, supra note 221.
227 Eur. Comm’n, The Digital Services Act, supra note 219, at 3.
228 DSA, supra note 219, arts. 25, 26.
229 Id. art. 27.
230 Eur. Comm’n, The Digital Services Act, supra note 219, at 33–4.
231 DSA, supra note 219, art. 12.
232 Id. arts. 13, 23. Very large online platforms are subject to additional periodic transparency reporting obligations: id. art. 33.
233 Id. art. 15.
234 Id. art. 18(1).
235 Id. arts. 17, 18(1).
236 Id. art. 18.
© The Author(s) 2022. Oxford University Press and New York University School of Law. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
This article is published and distributed under the terms of the Oxford University Press, Standard Journals Publication Model (https://academic.oup.com/pages/standard-publication-reuse-rights)