Recently, I had a discussion with a friend about Internet platforms and free speech. The core issue was whether, if Section 230 of the Communications Decency Act1 did not exist, companies like Facebook, Twitter, YouTube, and the like could be sued for removing users' speech from their platforms. My friend made a reasonable argument that repealing Section 230 would make it possible for users to sue Internet platforms for the removal of speech. However, during the course of the discussion it became clear to me that I did not understand Section 230 or the rights of Internet platforms when it came to speech.
The First Amendment to the United States Constitution states in part: "Congress shall make no law … abridging the freedom of speech, or of the press…."2 The free speech clause applies to the Federal government and, through the Fourteenth Amendment, to the States. It is a negative right; that is, it is a prohibition against government action, Federal and State. Are Internet platforms, which are private companies, legally obligated to protect speech in the same way as the government? No, they are
nongovernmental entities.3 As such, Internet platforms are not constrained by the Constitutional limitations of the First Amendment. To this point, courts have ruled that Internet platforms are not state actors for First Amendment purposes.4 In order to be considered a state actor, a private company would have to fulfill the same functions as a small town.5 While Internet platforms provide places for the exchange of ideas, they are more like living rooms than town squares. If a homeowner invites an individual into their living room and the individual expresses ideas or uses language the homeowner objects to, the homeowner has every right to tell the individual to leave the house and not come back. Private companies have the same right when it comes to speech.6
Marsh v. Alabama (1946), Hudgens v. National Labor Relations Board (1976), and Lugar v. Edmondson Oil Company, Inc. (1982) established the state action doctrine and defined the meaning of state actor.7 They also permit companies to discriminate with respect to the speech they will allow. Not all discrimination is illegal; for example, a restaurant has the right to deny service to anyone not wearing a shirt or shoes.8 These cases predate Section 230 of the Communications Decency Act and give private companies such as Facebook, Twitter, and YouTube wide latitude to moderate speech on their platforms. In a January 12, 2021 conversation with Rene Ritchie, Devin Stone suggested, in essence, that if Section 230 did not exist the actions taken by Facebook, Twitter, and YouTube after the events of January 6, 2021 would have been taken faster.9 In United States v. Morrison (2000), the Supreme Court made the
violation of Constitutional rights by a private actor a conceptual impossibility.10 Morrison is the most recent case dealing with the state action doctrine; thus only the Federal and State governments bear the responsibility not to infringe upon the Constitutional rights of individuals. A private company may discriminate unless such discrimination is prohibited by law, such as discrimination on the basis of race, age, disability, and the like. The moderation of users by Internet platforms is separate and distinct from those protections.
So, Section 230 has nothing to do with the actions taken by Facebook, Twitter, and YouTube. What, then, is its purpose? In 1995, the authors of Section 230, Representatives Chris Cox (R-CA) and Ron Wyden (D-OR),
wanted Internet companies to be able to regulate themselves in some instances—keeping offensive material away from children, for example—without fear of being culpable for everything their users post.11 Cox stated,
"… it will be the policy of the United States that we do not wish to have content regulation by the Federal Government of what is on the Internet."12 Wyden made clear that the laws of the time discouraged the very companies that could help control content on the Internet from doing so.13
Section 230 was the response to two separate New York cases, one Federal and one State. Both cases, Cubby, Inc. v. CompuServe Inc. (1991) and Stratton Oakmont, Inc. v. Prodigy Services Co. (1995), were defamation lawsuits.14 In CompuServe, the company was sued for libel because of defamatory statements made in a newsletter on its journalism forum. CompuServe did not monitor the forum or edit the newsletter. The Court concluded that CompuServe neither knew nor had reason to know that the newsletter contained defamatory statements; therefore, it could not be held liable. The Court classified CompuServe as a distributor.15 In Prodigy, the Court classified the company as a publisher because Prodigy monitored and edited the contents of its bulletin boards. By such actions, the Court concluded, Prodigy had taken on a role similar to that of a
newspaper or television network; therefore, Prodigy could be held liable.16 CompuServe did not moderate or edit, and was not held liable; Prodigy moderated and edited, and was held liable. Wyden and Cox did not want the Federal government to create a
Federal Computer Commission to regulate the Internet.17 However, under the rule of those two cases, Internet platforms would have been better off not moderating or editing user content at all, so as to be treated as distributors.
Basically, Section 230 allows Internet platforms to police themselves without being held liable for everything their users post. Additionally, it protects smaller companies and individual users in the same way.18 Section 230(c) states:
47 U.S.C. § 230(c)
(1) TREATMENT OF PUBLISHER OR SPEAKER.—No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
(2) CIVIL LIABILITY.—No provider or user of an interactive computer service shall be held liable on account of—
(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or
(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).19
This quasi-laissez-faire approach allows the market to control what is on the Internet in terms of speech without regulation by government, which was the intent of Congress. The civil liability subsection merely gives companies an additional protection beyond those described in the court cases above.
If the additional protections of Section 230 are too broad, it is in the following phrases: "otherwise objectionable" and "whether or not such material is constitutionally protected." "Otherwise objectionable" gives the provider or user absolute discretion as to what is and is not offensive and what they will tolerate. The removal of constitutionally protected material is covered by Morrison in general, and the phrase only clarifies existing case law. In the end, Internet platforms such as Facebook, Twitter, and YouTube can moderate their platforms in whatever manner they choose, and users who object need to spend their time and money on other platforms. But as the examination of case law has shown, these Internet platforms cannot and do not infringe upon users' free speech rights, regardless of the existence of Section 230. Users need to be extremely careful about what they post online; and if a user's content is removed by a platform, they should remember that, because of Section 230, a lawsuit over that removal is unlikely to succeed.
Notes

1. Protection for private blocking and screening of offensive material, 47 U.S.C. § 230, https://www.govinfo.gov/content/pkg/USCODE-2011-title47/html/USCODE-2011-title47-chap5-subchapII-partI-sec230.htm.
2. Constitution of the United States of America, Amendment One, National Archives, https://www.archives.gov/founding-docs/bill-of-rights-transcript.
3. Jonathan Peters, "The 'Sovereigns of Cyberspace' and State Action: The First Amendment's Application (or Lack Thereof) to Third-Party Platforms," Berkeley Technology Law Journal 32.989 (2017): 991, https://keegan.wiki/.
4. Peters, 992.
5. Marsh v. Alabama, 326 U.S. 501 (1946), https://www.oyez.org/cases/1940-1955/326us501.
6. Hudgens v. National Labor Relations Board, 424 U.S. 507 (1976), https://supreme.justia.com/.
7. Supra notes 5 and 6; Lugar v. Edmondson Oil Company, Inc., 457 U.S. 922 (1982), https://www.oyez.org/cases/1981/80-1730.
8. Rene Ritchie, "Parler v. Apple & Twitter — Real Lawyer Reacts! (Feat. Legal Eagle)," January 12, 2021, 06:40-09:00, https://youtu.be/Xzh-ppO7oes.
9. Ritchie, 04:18-06:39, https://youtu.be/Xzh-ppO7oes. See Devin Stone, Legal Eagle, https://www.youtube.com/legaleagle. See also Stone Law, https://stonelawdc.com/.
10. Peters, 999.
11. Anshu Siripurapu, "Trump and Section 230: What to Know," Council on Foreign Relations, last modified December 2, 2020, https://www.cfr.org/in-brief/trump-and-section-230-what-know; Dictionary.com, s.v. "culpable," accessed January 18, 2021, https://www.dictionary.com/: "deserving blame or censure; blameworthy."
12. 141 Congressional Record H8470 (August 4, 1995) (Rep. Chris Cox), https://keegan.wiki/.
13. 141 Congressional Record H8469.
14. Cubby, Inc. v. CompuServe Inc., 776 F. Supp. 135 (S.D.N.Y. 1991), https://law.justia.com/; Stratton Oakmont, Inc. v. Prodigy Services Co., 1995 WL 323710 (N.Y. Sup. Ct. May 24, 1995), https://h2o.law.harvard.edu/cases/4540; Dictionary.com, s.v. "defamation," accessed January 18, 2021, https://www.dictionary.com/: "the act of defaming; false or unjustified injury of the good reputation of another, as by slander or libel: She sued the magazine for defamation of character."
15. Jessica R. Friedman, "Defamation," Fordham Law Review 64.3 (1995): 797, https://keegan.wiki/.
16. Friedman, 797-798.
17. 141 Congressional Record H8469 (Cox).
18. David Greene, "Section 230 Is Not A 'Special Tech' Company Immunity," Electronic Frontier Foundation, last modified May 1, 2019, https://www.eff.org/.
19. Supra note 1.