==Introduction==
*See also [[Texas v. Tech Lords]].

*[https://blog.ericgoldman.org/archives/2021/12/the-u-s-department-of-justice-defends-section-230s-constitutionality.htm "The U.S. Department of Justice Defends Section 230's Constitutionality,"] Technology and Marketing Law Blog (December 2021).

* I should organize a Zoom conference of Common Carrier people for late January. Invite Goldman too. Private conference, no media, just learn. Like publishing each other's amicus briefs in advance.

*[https://www.lawfareblog.com/are-facebook-and-google-state-actors "Are Facebook and Google State Actors?"] Lawfare, Jed Rubenfeld, November 4, 2019.

*[https://twitter.com/AriCohn/status/1466061393233420289 Ari Cohn: "Going to be tweeting through the House Energy & Commerce 'Big Tech' hearing"]

*[https://www.law.cornell.edu/uscode/text/47/230 47 U.S. Code § 230], the big federal statute on Big Tech:
(a) Findings
The Congress finds the following:
(1) The rapidly developing array of Internet and other interactive computer services available to individual Americans represent an extraordinary advance in the availability of educational and informational resources to our citizens.
(2) These services offer users a great degree of control over the information that they receive, as well as the potential for even greater control in the future as technology develops.
(3) The Internet and other interactive computer services offer a forum for a true diversity of political discourse, unique opportunities for cultural development, and myriad avenues for intellectual activity.
(4) The Internet and other interactive computer services have flourished, to the benefit of all Americans, with a minimum of government regulation.
(5) Increasingly Americans are relying on interactive media for a variety of political, educational, cultural, and entertainment services.
(b) Policy
It is the policy of the United States—
(1) to promote the continued development of the Internet and other interactive computer services and other interactive media;
(2) to preserve the vibrant and competitive free market that presently exists for the Internet and other interactive computer services, unfettered by Federal or State regulation;
(3) to encourage the development of technologies which maximize user control over what information is received by individuals, families, and schools who use the Internet and other interactive computer services;
(4) to remove disincentives for the development and utilization of blocking and filtering technologies that empower parents to restrict their children’s access to objectionable or inappropriate online material; and
(5) to ensure vigorous enforcement of Federal criminal laws to deter and punish trafficking in obscenity, stalking, and harassment by means of computer.
(c) Protection for “Good Samaritan” blocking and screening of offensive material
(1) Treatment of publisher or speaker
No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

(2) Civil liability
No provider or user of an interactive computer service shall be held liable on account of—
(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or
(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).
(d) Obligations of interactive computer service
A provider of interactive computer service shall, at the time of entering an agreement with a customer for the provision of interactive computer service and in a manner deemed appropriate by the provider, notify such customer that parental control protections (such as computer hardware, software, or filtering services) are commercially available that may assist the customer in limiting access to material that is harmful to minors. Such notice shall identify, or provide the customer with access to information identifying, current providers of such protections.

(e) Effect on other laws
(1) No effect on criminal law
Nothing in this section shall be construed to impair the enforcement of section 223 or 231 of this title, chapter 71 (relating to obscenity) or 110 (relating to sexual exploitation of children) of title 18, or any other Federal criminal statute.

(2) No effect on intellectual property law
Nothing in this section shall be construed to limit or expand any law pertaining to intellectual property.

(3) State law
Nothing in this section shall be construed to prevent any State from enforcing any State law that is consistent with this section. No cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.

(4) No effect on communications privacy law
Nothing in this section shall be construed to limit the application of the Electronic Communications Privacy Act of 1986 or any of the amendments made by such Act, or any similar State law.

(5) No effect on sex trafficking law
Nothing in this section (other than subsection (c)(2)(A)) shall be construed to impair or limit—
(A) any claim in a civil action brought under section 1595 of title 18, if the conduct underlying the claim constitutes a violation of section 1591 of that title;
(B) any charge in a criminal prosecution brought under State law if the conduct underlying the charge would constitute a violation of section 1591 of title 18; or
(C) any charge in a criminal prosecution brought under State law if the conduct underlying the charge would constitute a violation of section 2421A of title 18, and promotion or facilitation of prostitution is illegal in the jurisdiction where the defendant’s promotion or facilitation of prostitution was targeted.
(f) Definitions
As used in this section:
(1) Internet
The term “Internet” means the international computer network of both Federal and non-Federal interoperable packet switched data networks.

(2) Interactive computer service
The term “interactive computer service” means any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions.

(3) Information content provider
The term “information content provider” means any person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet or any other interactive computer service.

(4) Access software provider
The term “access software provider” means a provider of software (including client or server software), or enabling tools that do any one or more of the following:
(A) filter, screen, allow, or disallow content;
(B) pick, choose, analyze, or digest content; or
(C) transmit, receive, display, forward, cache, search, subset, organize, reorganize, or translate content.
(June 19, 1934, ch. 652, title II, § 230, as added Pub. L. 104–104, title V, § 509, Feb. 8, 1996, 110 Stat. 137; amended Pub. L. 105–277, div. C, title XIV, § 1404(a), Oct. 21, 1998, 112 Stat. 2681–739; Pub. L. 115–164, § 4(a), Apr. 11, 2018, 132 Stat. 1254.)
----

==Ohio case==
*[https://rasmusen.org/special/common_carriers/ohio_case.pdf Here] it is.
==Evidence==
*[https://amgreatness.com/2021/12/29/dr-robert-malone-renowned-physician-and-inventor-of-mrna-technology-permanently-banned-from-twitter/ "Dr. Robert Malone, Renowned Physician and Inventor of mRNA Technology, Permanently Banned from Twitter,"] American Greatness (Dec. 29, 2021).

----

==Facts==
*[https://pjmedia.com/news-and-politics/victoria-taft/2021/11/15/youtube-cuts-off-the-best-real-time-legal-coverage-of-rittenhouse-trial-and-immediately-regrets-it-n1533042 "YouTube Cuts Off the Best Real-Time Legal Coverage of Rittenhouse Trial, and Immediately Regrets It,"] PJMedia (Nov. 15, 2021):
{{Quotation|
Right after Assistant District Attorney Thomas Binger began his closing statement in the Kyle Rittenhouse Trial, YouTube cut off channels that were beating legacy media channels. Coincidence?
The Rekieta Law channel, which features multiple lawyers doing real-time analysis of the trial, often beat the number of people watching the PBS stream. The PBS stream is one of the more reliable ones available to YouTube users and was being used by several outlets.
After getting cut off, Nick Rekieta reminded YouTube that ten lawyers considered it a breach of contract.

...

Ticking off channels featuring dozens of lawyers seemed like a bad business plan. Within a few minutes, the stream was put back up after Rekieta reminded the tech giant that the courtroom coverage was public property and therefore not under copyright.}}
----

*[https://www.zerohedge.com/political/stunning-facebook-court-filing-admits-fact-checks-are-just-matter-opinion Stossel defamation case admission] by Facebook that its fact-checkers are just opinion-listers. See the [https://digitalcommons.law.scu.edu/cgi/viewcontent.cgi?article=3543&context=historical complaint] and the [https://wattsupwiththat.com/wp-content/uploads/2021/12/Facebook-admits-its-fact-check-is-opinion-page-2.pdf motion to dismiss].

----

==Statutory History==
*[https://s3.documentcloud.org/documents/7213938/2020-09-17-Cox-Wyden-FCC-Reply-Comments-Final-2.pdf Cox-Wyden FCC Reply Comments] (Sept. 17, 2020).
----

==Regulations==
----
==Commentary==
*[https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3961703 "Section 230’s Application to States’ Regulation of Social Media,"] Eric Goldman, Santa Clara Univ. Legal Studies Research Paper (Nov. 14, 2021).

*[https://www.techdirt.com/articles/20211030/01585247847/scale-content-moderation-is-unfathomable.shtml "The Scale of Content Moderation Is Unfathomable,"] Mike Masnick, TechDirt.com (Nov. 2, 2021).
----

==Litigation Strategy==
* The idea is for someone to sue Facebook or Twitter for defamation. The plaintiff would be somebody, preferably not a public figure in any way, who was clearly defamed by someone posting on the defendant's website (let's use Twitter for concreteness). Twitter would move to dismiss on the grounds that it is immune by statute from liability for defamation by somebody posting on its website, and perhaps immune for some common-law reason, such as that it can't be expected to police content any more than somebody polices the graffiti on the wall of their warehouse. The response would be that Twitter has forfeited its immunity because it actually ''does'' police content, and not just for obscenity but for correctness of political content, so it has shown itself willing and able to do so. Thus, Twitter is actually a content provider with editorial control, like a magazine, not a neutral software provider. This is a question of fact, which must be decided by a jury.

The purpose would not be to win the particular lawsuit, but to establish in a court of law that Twitter is liable for defamatory content. If the case could get beyond a motion to dismiss, it would allow discovery of Twitter's internal documents and practices, and publicity about them at trial, even if the suit lost in the end. It would prepare the way for Twitter to be treated by courts and regulated by the government as a common carrier (though that of course raises the additional issue of whether it is a natural monopoly).

A big thing is to find a test case which would not be dismissed for other reasons, so it would be good to find defamation per se (a false accusation of a crime, say) of a private figure, with a false statement of fact rather than opinion, widely circulated, and with a plaintiff in a favorable state and federal circuit.
----

==Caselaw==
*[https://blog.ericgoldman.org/ Eric Goldman's blog]

*[https://twitter.com/Section_230 Section 230 Twitter account]

*[https://courts.delaware.gov/Opinions/Download.aspx?id=316680 ''Page v. Oath''], 2021 WL 528472 (Del. Super. Ct. Feb. 11, 2021).

*[https://cases.justia.com/federal/district-courts/california/candce/5:2006cv03926/181461/117/0.pdf ''IO Group v. Veoh Networks''], No. 5:2006cv03926 (N.D. Cal. Aug. 27, 2008).

*[https://scholar.google.com/scholar_case?case=1958442027646479582 ''Blumenthal v. Drudge''], 992 F. Supp. 44 (D.D.C. 1998).

*[https://digitalcommons.law.scu.edu/cgi/viewcontent.cgi?article=2950&context=historical ''Downs v. Oath''], 2019 WL 2209206 (S.D.N.Y. May 22, 2019).

*''UMG Recordings, Inc. v. Veoh Networks, Inc.'', 2008 WL 5423841 (C.D. Cal. Dec. 29, 2008).
----
===Justice Thomas (2021)===
[https://www.supremecourt.gov/opinions/20pdf/20-197_5ie6.pdf Thomas, J., concurring] in ''Biden v. Knight'' (2021):
{{Quotation|
If part of the problem is private, concentrated control over online content and platforms available to the public, then part of the solution may be found in doctrines that limit the right of a private company to exclude. Historically, at least two legal doctrines limited a company’s right to exclude. First, our legal system and its British predecessor have long subjected certain businesses, known as common carriers, to special regulations, including a general requirement to serve all comers. Candeub, Bargaining for Free Speech: Common Carriage, Network Neutrality, and Section 230, 22 ''Yale J. L. & Tech.'' 391, 398–403 (2020) (Candeub); see also Burdick, The Origin of the Peculiar Duties of Public Service Companies, Pt. 1, 11 ''Colum. L. Rev.'' 514 (1911). Justifications for these regulations have varied. Some scholars have argued that common-carrier regulations are justified only when a carrier possesses substantial market power. Candeub 404. Others have said that no substantial market power is needed so long as the company holds itself out as open to the public. Ibid.; see also ''Ingate v. Christie'', 3 Car. & K. 61, 63, 175 Eng. Rep. 463, 464 (N. P. 1850) (“[A] person [who] holds himself out to carry goods for everyone as a business . . . is a common carrier”). And this Court long ago suggested that regulations like those placed on common carriers may be justified, even for industries not historically recognized as common carriers, when “a business, by circumstances and its nature, . . . rise[s] from private to be of public concern.” See ''German Alliance Ins. Co. v. Lewis'', 233 U. S. 389, 411 (1914) (affirming state regulation of fire insurance rates). At that point, a company’s “property is but its instrument, the means of rendering the service which has become of public interest.” Id., at 408.
This latter definition of course is hardly helpful, for most things can be described as “of public interest.” But whatever may be said of other industries, there is clear historical precedent for regulating transportation and communications networks in a similar manner as traditional common carriers. Candeub 398–405. Telegraphs, for example, because they “resemble[d] railroad companies and other common carriers,” were “bound to serve all customers alike, without discrimination.” ''Primrose v. Western Union Telegraph Co.'', 154 U. S. 1, 14 (1894). [footnote 2]

Footnote 2: This Court has been inconsistent about whether telegraphs were common carriers. Compare ''Primrose'', 154 U. S., at 14, with ''Moore v. New York Cotton Exchange'', 270 U. S. 593, 605 (1926). But the Court has consistently recognized that ...

... “immunit[y] from certain types of suits” [footnote 3] or to regulations that make it more difficult for other companies to compete with the carrier (such as franchise licenses). Ibid. By giving these companies special privileges, governments place them into a category distinct from other companies and closer to some functions, like the postal service, that the State has traditionally undertaken. Second, governments have limited a company’s right to exclude when that company is a public accommodation. This concept—related to common-carrier law—applies to companies that hold themselves out to the public but do not “carry” freight, passengers, or communications. See, e.g., ''Civil Rights Cases,'' 109 U. S. 3, 41–43 (1883) (Harlan, J., dissenting) (discussing places of public amusement). It also applies regardless of the company’s market power. See, e.g., 78 Stat. 243, 42 U. S. C. §2000a(a).

Footnote 3: Telegraphs, for example, historically received some protection from defamation suits. Unlike other entities that might retransmit defamatory content, they were liable only if they knew or had reason to know that a message they distributed was defamatory. Restatement (Second) of Torts §581 (1976); see also ''O’Brien v. Western Union Tel. Co.'', 113 F. 2d 539, 542 (CA1 1940).

B.

Internet platforms of course have their own First Amendment interests, but regulations that might affect speech are valid if they would have been permissible at the time of the founding. See ''United States v. Stevens'', 559 U. S. 460, 468 (2010). The long history in this country and in England of restricting the exclusion right of common carriers and places of public accommodation may save similar regulations today from triggering heightened scrutiny—especially where a restriction would not prohibit the company from speaking or force the company to endorse the speech. See ''Turner Broadcasting System, Inc. v. FCC'', 512 U. S. 622, 684 (1994) (O’Connor, J., concurring in part and dissenting in part); ''PruneYard Shopping Center v. Robins'', 447 U. S. 74, 88 (1980). There is a fair argument that some digital platforms are sufficiently akin to common carriers or places of accommodation to be regulated in this manner.

1.

In many ways, digital platforms that hold themselves out to the public resemble traditional common carriers.
Though digital instead of physical, they are at bottom communications networks, and they “carry” information from one user to another. A traditional telephone company laid physical wires to create a network connecting people. Digital platforms lay information infrastructure that can be controlled in much the same way. And unlike newspapers, digital platforms hold themselves out as organizations that focus on distributing the speech of the broader public. Federal law dictates that companies cannot “be treated as the publisher or speaker” of information that they merely distribute. 110 Stat. 137, 47 U. S. C. §230(c). The analogy to common carriers is even clearer for digital platforms that have dominant market share. Similar to utilities, today’s dominant digital platforms derive much of their value from network size. The Internet, of course, is a network. But these digital platforms are networks within that network. The Facebook suite of apps is valuable largely because 3 billion people use it. Google search—at 90% of the market share—is valuable relative to other search engines because more people use it, creating data that Google’s algorithm uses to refine and improve search results. These network effects entrench these companies. Ordinarily, the astronomical profit margins of these platforms—last year, Google brought in $182.5 billion total, $40.3 billion in net income—would induce new entrants into the market. That these companies have no comparable competitors highlights that the industries may have substantial barriers to entry.

To be sure, much activity on the Internet derives value from network effects. But dominant digital platforms are different. Unlike decentralized digital spheres, such as the e-mail protocol, control of these networks is highly concentrated. Although both companies are public, one person controls Facebook (Mark Zuckerberg), and just two control Google (Larry Page and Sergey Brin). No small group of people controls e-mail.

Much like with a communications utility, this concentration gives some digital platforms enormous control over speech. When a user does not already know exactly where to find something on the Internet—and users rarely do—Google is the gatekeeper between that user and the speech of others 90% of the time. It can suppress content by deindexing or downlisting a search result or by steering users away from certain content by manually altering autocomplete results. Grind, Schechner, McMillan, & West, How Google Interferes With Its Search Algorithms and Changes Your Results, ''Wall Street Journal'', Nov. 15, 2019. Facebook and Twitter can greatly narrow a person’s information flow through similar means. And, as the distributor of the clear majority of e-books and about half of all physical books, [footnote 4] Amazon can impose cataclysmic consequences on authors by, among other things, blocking a listing.

Footnote 4: As of 2018, Amazon had 42% of the physical book market and 89% of the e-book market. Day & Gu, The Enormous Numbers Behind Amazon’s Market Reach, Bloomberg, Mar. 27, 2019.

It changes nothing that these platforms are not the sole means for distributing speech or information. A person always could choose to avoid the toll bridge or train and instead swim the Charles River or hike the Oregon Trail. But in assessing whether a company exercises substantial market power, what matters is whether the alternatives are comparable. For many of today’s digital platforms, nothing is.

If the analogy between common carriers and digital platforms is correct, then an answer may arise for dissatisfied platform users who would appreciate not being blocked: laws that restrict the platform’s right to exclude. When a platform’s unilateral control is reduced, a government official’s account begins to better resemble a “government-controlled spac[e].” ''Mansky'', 585 U. S., at ___ (slip op., at 7); see also ''Southeastern Promotions'', 420 U. S., at 547, 555 (recognizing that a private space can become a public forum when leased to the government). Common-carrier regulations, although they directly restrain private companies, thus may have an indirect effect of subjecting government officials to suits that would not otherwise be cognizable under our public-forum jurisprudence. This analysis may help explain the Second Circuit’s intuition that part of Mr. Trump’s Twitter account was a public forum. But that intuition has problems. First, if market power is a predicate for common carriers (as some scholars suggest), nothing in the record evaluates Twitter’s market power. Second, and more problematic, neither the Second Circuit nor respondents have identified any regulation that restricts Twitter from removing an account that would otherwise be a “government-controlled space.”

2.

Even if digital platforms are not close enough to common carriers, legislatures might still be able to treat digital platforms like places of public accommodation. Although definitions between jurisdictions vary, a company ordinarily is a place of public accommodation if it provides “lodging, food, entertainment, or other services to the public . . . in general.” Black’s Law Dictionary 20 (11th ed. 2019) (defining “public accommodation”); accord, 42 U. S. C. §2000a(b)(3) (covering places of “entertainment”). Twitter and other digital platforms bear resemblance to that definition. This, too, may explain the Second Circuit’s intuition. Courts are split, however, about whether federal accommodations laws apply to anything other than “physical” locations. Compare, e.g., ''Doe v. Mutual of Omaha Ins. Co.'', 179 F. 3d 557, 559 (CA7 1999) (Title III of the Americans with Disabilities Act (ADA) covers websites), with ''Parker v. Metropolitan Life Ins. Co.'', 121 F. 3d 1006, 1010–1011 (CA6 1997) (en banc) (Title III of the ADA covers only physical places); see also 42 U. S. C. §§2000a(b)–(c) (discussing “physica[l] locat[ions]”).
Once again, a doctrine, such as public accommodation, that reduces the power of a platform to unilaterally remove a government account might strengthen the argument that an account is truly government controlled and creates a public forum. See ''Southeastern Promotions'', 420 U. S., at 547, 555. But no party has identified any public accommodation restriction that applies here.

II.

The similarities between some digital platforms and common carriers or places of public accommodation may give legislators strong arguments for similarly regulating digital platforms. “[I]t stands to reason that if Congress may demand that telephone companies operate as common carriers, it can ask the same of ” digital platforms. ''Turner'', 512 U. S., at 684 (opinion of O’Connor, J.). That is especially true because the space constraints on digital platforms are practically nonexistent (unlike on cable companies), so a regulation restricting a digital platform’s right to exclude might not appreciably impede the platform from speaking.
...
For example, although a “private entity is not ordinarily constrained by the First Amendment,” ''Halleck,'' 587 U. S., at ___, ___ (slip op., at 6, 9), it is if the government coerces or induces it to take action the government itself would not be permitted to do, such as censor expression of a lawful viewpoint. Ibid. Consider government threats. “People do not lightly disregard public officers’ thinly veiled threats to institute criminal proceedings against them if they do not come around.” ''Bantam Books, Inc. v. Sullivan'', 372 U. S. 58, 68 (1963). The government cannot accomplish through threats of adverse government action what the Constitution prohibits it from doing directly. See ibid.; ''Blum v. Yaretsky'', 457 U. S. 991, 1004–1005 (1982). Under this doctrine, plaintiffs might have colorable claims against a digital platform if it took adverse action against them in response to government threats.

But no threat is alleged here. What threats would cause a private choice by a digital platform to “be deemed . . . that of the State” remains unclear. Id., at 1004. [footnote 5] And no party has sued Twitter. The question facing the courts below involved only whether a government actor violated the First Amendment by blocking another Twitter user. That issue turns, at least to some degree, on ownership and the right to exclude.

Footnote 5: Threats directed at digital platforms can be especially problematic in the light of 47 U. S. C. §230, which some courts have misconstrued to give digital platforms immunity for bad-faith removal of third-party content. ''Malwarebytes, Inc. v. Enigma Software Group USA, LLC,'' 592 U. S. ___, ___–___ (2020) (THOMAS, J., statement respecting denial of certiorari) (slip op., at 7–8). This immunity eliminates the biggest deterrent—a private lawsuit—against caving to an unconstitutional government threat.

For similar reasons, some commentators have suggested that immunity provisions like §230 could potentially violate the First Amendment to the extent those provisions pre-empt state laws that protect speech from private censorship. See Volokh, Might Federal Preemption of Speech-Protective State Laws Violate the First Amendment? ''The Volokh Conspiracy, Reason'', Jan. 23, 2021. According to that argument, when a State creates a private right and a federal statute pre-empts that state law, “the federal statute is the source of the power and authority by which any private rights are lost or sacrificed.” Railway Employees v. ...}}

----
==Stossel v. Meta==
Stossel, represented by Harmeet Dhillon, sued Facebook (the Meta company) for defamation because it said its fact-checkers found he lied, in connection with posts at Climate Feedback. Facebook moved to dismiss, on grounds including that its fact-checking was just opinion and that Section 230 gave it immunity anyway.

Consider this in relation to the Texas ''NetChoice v. Paxton'' case. There, the defense of the Big Tech companies is that they exercise editorial discretion, just like magazines, and hence for them to be required to be viewpoint-neutral would be compelled speech. In ''Stossel'', Facebook's defense is that Climate Feedback provides the content and Facebook just passively accepts it. These two positions are incompatible.

I hope that someone squarely presents the argument that Facebook has forfeited its Section 230 immunity by becoming an editor. Facebook will argue that it is immune anyway. That is to take the position that it could operate as a webzine and invite free content that it admits it knows is false and defamatory, and still be immune. Indeed, it could even invite people to submit defamatory posts, advertising:

:"Frustrated that you can't libel your private enemies because they'll sue? Publish here on Facebook's Libelzine site! We're immune from suit, by Section 230. Do it anonymously, with ProtonMail, and you'll be safe too. Note: we do exercise some editorial discretion. If you libel a Democrat, we won't let you publish."
This, I believe, is Big Tech's current interpretation of Section 230, though they do not advertise this implication.

It goes even further. Under this interpretation, I think, it is not just Facebook that is immune; Climate Feedback is immune too, if it sets itself up cleverly. Facebook says Climate Feedback provided the content. But did it? If Climate Feedback did not pay its authors, and merely let them post there, or even invited them but did not tell them what to write, then they are not agents of Climate Feedback, and so Climate Feedback can push the liability off on the authors. Libelzine can be a stand-alone operation. I, Eric Rasmusen, could set up a website specifically devoted to defaming non-public persons, advertising that fact, and I would be immune from suit, by Section 230.

We could perhaps go a step further, though I haven't looked closely enough at the exceptions in Section 230 in writing this. How about if Eric Rasmusen sets up a site called Hit Man Central, where people could anonymously advertise their services as murderers for hire? I couldn't profit from it (I would do it from pure malice), and I could be subpoenaed for any info I had, but I'd be sure to explain to the content providers how to submit secretly and how to tell clients to get in touch with them.

I do not know Section 230 law well, though I am rapidly learning. The statute itself is poorly written and ambiguous. One literal interpretation is that Facebook can indeed set up Libelzine. An opposite interpretation is that Section 230 says it does not displace state law, so it does not displace defamation law; it only clarifies that Facebook isn't liable for unintentionally allowing the posting of defamatory materials by someone unrelated to Facebook and unencouraged by it.

The purpose of Section 230 seems pretty clear, though: to let internet service providers provide arm's-length hosting without having to check all the content in advance. It was to improve information flow, not to worsen it by legalizing defamation. It did not contemplate service providers who operated like newspapers.

*[https://digitalcommons.law.scu.edu/cgi/viewcontent.cgi?article=3543&context=historical Stossel complaint]

*[https://wattsupwiththat.com/wp-content/uploads/2021/12/Facebook-admits-its-fact-check-is-opinion-page-2.pdf Meta (Facebook) motion to dismiss]

----
==Examples of Tech Lord Censorship==
*[https://twitter.com/seanmdav/status/1510055681545719810 Google and Ukraine] (2022)

==Political Bias==
*[https://twitter.com/HillelNeuer/status/1518736198361100290 Massive imbalance in campaign contributions]. Source: Data from Open Secrets, a nonprofit political spending database by the Center for Responsive Politics, published in Vox, October 31, 2018.

==Parler's Destruction==
*[https://greenwald.substack.com/p/how-silicon-valley-in-a-show-of-monopolistic?s=w Greenwald on Parler]

==Miscellaneous==

*[https://www.nationalreview.com/corner/is-big-tech-censorship-unconstitutional/ National Review on Cato]

*[https://www.cato.org/blog/all-roads-lead-big-government-heritage-takes-big-tech Cato on Heritage on the Lords of Tech]
+ | *""Public Forum" is a term of constitutional significance - it refers to the public space that the govt provides - not a private website at which people congregate. Courts have repeatedly held that social media platforms are not subject to the "public forum doctrine." But what if the government bans private companies from doing something, and then sets up their own public space? What about air waves TV and the FCC? | ||
+ | |||
+ | *It would be quite possible to issue new versions of Windows or the Apple operating system that do not allow you to read articles criticizing the Biden Administration. In fact, they could be written to disallow writing such criticism on them. Do we want to allow that? Think of secondary boycotts in labor law, or the prohibition on cartels in antitrust. | ||
+ | |||
+ | |||
+ | ---- |
Latest revision as of 11:21, 29 July 2023
Contents
Introduction
- See also Texas v. Tech Lords.
- I should organize a Zoom conference of Common Carrier people for late January. Invite Goldman too. Private conference, no media, just learn. Like publshing each others' amicus briefs in advance.
- "Are Facebook and Google State Actors?" Lawfare, Jed Rubenfeld , November 4, 2019.
- 47 U.S. Code 230, the big federal statute on Big Tech.
(a)Findings The Congress finds the following: (1)The rapidly developing array of Internet and other interactive computer services available to individual Americans represent an extraordinary advance in the availability of educational and informational resources to our citizens. (2)These services offer users a great degree of control over the information that they receive, as well as the potential for even greater control in the future as technology develops. (3)The Internet and other interactive computer services offer a forum for a true diversity of political discourse, unique opportunities for cultural development, and myriad avenues for intellectual activity. (4)The Internet and other interactive computer services have flourished, to the benefit of all Americans, with a minimum of government regulation. (5)Increasingly Americans are relying on interactive media for a variety of political, educational, cultural, and entertainment services. (b)Policy It is the policy of the United States— (1)to promote the continued development of the Internet and other interactive computer services and other interactive media; (2)to preserve the vibrant and competitive free market that presently exists for the Internet and other interactive computer services, unfettered by Federal or State regulation; (3)to encourage the development of technologies which maximize user control over what information is received by individuals, families, and schools who use the Internet and other interactive computer services; (4)to remove disincentives for the development and utilization of blocking and filtering technologies that empower parents to restrict their children’s access to objectionable or inappropriate online material; and (5)to ensure vigorous enforcement of Federal criminal laws to deter and punish trafficking in obscenity, stalking, and harassment by means of computer. (c)Protection for “Good Samaritan” blocking and screening of offensive material (1)Treatment of publisher or speaker No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
(2)Civil liability No provider or user of an interactive computer service shall be held liable on account of— (A)any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or (B)any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).[1] (d)Obligations of interactive computer service A provider of interactive computer service shall, at the time of entering an agreement with a customer for the provision of interactive computer service and in a manner deemed appropriate by the provider, notify such customer that parental control protections (such as computer hardware, software, or filtering services) are commercially available that may assist the customer in limiting access to material that is harmful to minors. Such notice shall identify, or provide the customer with access to information identifying, current providers of such protections.
(e)Effect on other laws (1)No effect on criminal law Nothing in this section shall be construed to impair the enforcement of section 223 or 231 of this title, chapter 71 (relating to obscenity) or 110 (relating to sexual exploitation of children) of title 18, or any other Federal criminal statute.
(2)No effect on intellectual property law Nothing in this section shall be construed to limit or expand any law pertaining to intellectual property.
(3)State law Nothing in this section shall be construed to prevent any State from enforcing any State law that is consistent with this section. No cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.
(4)No effect on communications privacy law Nothing in this section shall be construed to limit the application of the Electronic Communications Privacy Act of 1986 or any of the amendments made by such Act, or any similar State law.
(5)No effect on sex trafficking law Nothing in this section (other than subsection (c)(2)(A)) shall be construed to impair or limit— (A)any claim in a civil action brought under section 1595 of title 18, if the conduct underlying the claim constitutes a violation of section 1591 of that title; (B)any charge in a criminal prosecution brought under State law if the conduct underlying the charge would constitute a violation of section 1591 of title 18; or (C)any charge in a criminal prosecution brought under State law if the conduct underlying the charge would constitute a violation of section 2421A of title 18, and promotion or facilitation of prostitution is illegal in the jurisdiction where the defendant’s promotion or facilitation of prostitution was targeted. (f)Definitions As used in this section: (1)Internet The term “Internet” means the international computer network of both Federal and non-Federal interoperable packet switched data networks.
(2)Interactive computer service The term “interactive computer service” means any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions.
(3)Information content provider The term “information content provider” means any person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet or any other interactive computer service.
(4)Access software provider The term “access software provider” means a provider of software (including client or server software), or enabling tools that do any one or more of the following: (A)filter, screen, allow, or disallow content; (B)pick, choose, analyze, or digest content; or (C)transmit, receive, display, forward, cache, search, subset, organize, reorganize, or translate content. (June 19, 1934, ch. 652, title II, § 230, as added Pub. L. 104–104, title V, § 509, Feb. 8, 1996, 110 Stat. 137; amended Pub. L. 105–277, div. C, title XIV, § 1404(a), Oct. 21, 1998, 112 Stat. 2681–739; Pub. L. 115–164, § 4(a), Apr. 11, 2018, 132 Stat. 1254.)
Ohio case
- Here it is.
Evidence
Facts
- "youtube-cuts-off-the-best-real-time-legal-coverage-of-rittenhouse-trial-and-immediately-regrets-it," PJMedia (Nov. 15, 2021):
Right after Assistant District Attorney Thomas Binger began his closing statement in the Kyle Rittenhouse Trial, YouTube cut off channels that were beating legacy media channels. Coincidence? The Rekieta Law channel, which features multiple lawyers doing real-time analysis of the trial, often beat the number of people watching the PBS stream. The PBS stream is one of the more reliable ones available to YouTube users and was being used by several outlets. After getting cut off, Nick Rekieta reminded YouTube that ten lawyers considered it a breach of contract.
...
Ticking off channels featuring dozens of lawyers seemed like a bad business plan. Within a few minutes, the stream was put back up after Rekieta reminded the tech giant that the courtroom coverage was public property and therefore not under copyright.
- Stossel defamation case admission by Facebook that its factcheckers are just opinion-listers. The complaint is this. The motion to dismiss.
Statutory History
Regulations
asdsdfsd
Commentary
Santa Clara Univ. Legal Studies Research Paper, 14 Nov 2021).
- "The Scale Of Content Moderation Is Unfathomable: from the it's-way-more-than-you-think dept," TechDirt.com, (Nov 2nd 2021) Mike Masnick.
Litigation Strategy
- The idea is for someone to sue Facebook or Twitter for defamation. The plaintiff would be somebody, preferably not a public figure in any way, who was clearly defamed by someone posting on defendant's website (let's use Twitter for concreteness). Twitter would move to dismiss on the grounds that they are immune by statute from liability for defamation by somebody posting on their website, and perhaps immune for some common law reason such as that they can't be expected to police content any more than somebody polices the graffiti on the wall of their warehouse. The response would be that Twitter has forfeited its immunity because it actually *does* police content, and not just for obscenity, but for correctness of political content, so it has shown itself willing and able to do so. THus, Twitter is actually a content provider with editorial control like a magazine, not a neutral software provider. This is a question of fact, which must be decided by a jury.
The purpose would not be to win the particular lawsuit, but to establish in a court of law that Twitter is liable for defamatory content. If this could get beyond motion to dismiss, it would allow Discovery of Twitter's internal documents and practices and publicity about them at trial, even if it lost in the end. It would prepare the way for Twitter to be treated by courts and regulated by the government as a common carrier (though that of course raises the additional issue of whether it is a natural monopoly).
A big thing is to find a test case which would not be dismissed for other reasons--- so it would be good to find defamation per se (false accusation of a crime, say), of a private figure, with a false statement of fact not opinion, etc., which was widely circulated, with a plaintiff in a favorable state and federal circuit.
Caselaw
- Page v. Oath, 2021 WL 528472 (Del. Superior Ct. Feb. 11, 2021).
- IO Group v. Veoh Networks, 5:2006cv03926 (N.D. Cal. Aug. 27, 2008) .
- Blumenthal v. Drudge (1998)
- Downs v. Oath, 2019 WL 2209206 (S.D.N.Y. May 22, 2019)
- [UMG Recordings, Inc. v. Veoh Networks, Inc]., 2008 WL 5423841 (C.D. Cal. Dec. 29, 2008)
Judge Thomas (2021)
Thomas, J. in Biden v. Knight (2021):
If part of the problem is private, concentrated control over online content and platforms available to the public, then part of the solution may be found in doctrines that limit the right of a private company to exclude. Historically, at least two legal doctrines limited a company’s right to exclude. First, our legal system and its British predecessor have long subjected certain businesses, known as common carriers, to special regulations, including a general requirement to serve all comers. Candeub, Bargaining for Free Speech: Common Carriage, Network Neutrality, and Section 230, 22 Yale J. L. & Tech. 391, 398–403 (2020) (Candeub); see also Burdick, The Origin of the Peculiar Duties of Public Service Companies, Pt. 1, 11 Colum. L. Rev. 514 (1911). Justifications for these regulations have varied. Some scholars have argued that common-carrier regulations are justified only when a carrier possesses substantial market power. Candeub 404. Others have said that no substantial market power is needed so long as the company holds itself out as open to the public. Ibid.; see also Ingate v. Christie, 3 Car. & K. 61, 63, 175 Eng. Rep. 463, 464 (N. P. 1850) (“[A] person [who] holds himself out to carry goods for everyone as a business . . . is a common carrier”). And this Court long ago suggested that regulations like those placed on common carriers may be justified, even for industries not historically recognized as common carriers, when “a business, by circumstances and its nature, . . . rise[s] from private to be of public concern.” See German Alliance Ins. Co. v. Lewis, 233 U. S. 389, 411 (1914) (affirming state regulation of fire insurance rates). At that point, a company’s “property is but its instrument, the means of rendering the service which has become of public interest.” Id., at 408.
This latter definition of course is hardly helpful, for most things can be described as “of public interest.” But whatever may be said of other industries, there is clear historical precedent for regulating transportation and communications networks in a similar manner as traditional common carriers. Candeub 398–405. Telegraphs, for example, because they “resemble[d] railroad companies and other common carriers,” were “bound to serve all customers alike, without discrimination.” Primrose v. Western Union Telegraph Co., 154 U. S. 1, 14 (1894).footnote 2
Footnote 2 This Court has been inconsistent about whether telegraphs were common carriers. Compare Primrose, 154 U. S., at 14, with Moore v. New York Cotton Exchange, 270 U. S. 593, 605 (1926). But the Court has consistently recognized that ]y from certain types of suits”Footnote3 or to regulations that make it more difficult for other companies to compete with the carrier (such as franchise licenses). Ibid. By giving these companies special privileges, governments place them into a category distinct from other companies and closer to some functions, like the postal service, that the State has traditionally undertaken. Second, governments have limited a company’s right to exclude when that company is a public accommodation. This concept—related to common-carrier law—applies to companies that hold themselves out to the public but do not “carry” freight, passengers, or communications. See, e.g., Civil Rights Cases, 109 U. S. 3, 41–43 (1883) (Harlan, J., dissenting) (discussing places of public amusement). It also applies regardless of the company’s market power. See, e.g., 78 Stat. 243, 42 U. S. C. §2000a(a).
Footnote 3T elegraphs, for example, historically received some protection from defamation suits. Unlike other entities that might retransmit defamatory content, they were liable only if they knew or had reason to know that a message they distributed was defamatory. Restatement (Second) of Torts §581 (1976); see also O’Brien v. Western Union Tel. Co., 113 F. 2d 539, 542 (CA1 1940).
B.
Internet platforms of course have their own First Amendment interests, but regulations that might affect speech are valid if they would have been permissible at the time of the founding. See United States v. Stevens, 559 U. S. 460, 468 (2010). The long history in this country and in England of restricting the exclusion right of common carriers and places of public accommodation may save similar regulations today from triggering heightened scrutiny—especially where a restriction would not prohibit the company from speaking or force the company to endorse the speech. See Turner Broadcasting System, Inc. v. FCC, 512 U. S. 622, 684 (1994) (O’Connor, J., concurring in part and dissenting in part); PruneYard Shopping Center v. Robins, 447 U. S. 74, 88 (1980). There is a fair argument that some digital platforms are sufficiently akin to common carriers or places of accommodation to be regulated in this manner.
1.
In many ways, digital platforms that hold themselves out to the public resemble traditional common carriers.
Though digital instead of physical, they are at bottom communications networks, and they “carry” information from one user to another. A traditional telephone company laid physical wires to create a network connecting people. Digital platforms lay information infrastructure that can be controlled in much the same way. And unlike newspapers, digital platforms hold themselves out as organizations that focus on distributing the speech of the broader public. Federal law dictates that companies cannot “be treated as the publisher or speaker” of information that they merely distribute. 110 Stat. 137, 47 U. S. C. §230(c). The analogy to common carriers is even clearer for digital platforms that have dominant market share. Similar to utilities, today’s dominant digital platforms derive much of their value from network size. The Internet, of course, is a network. But these digital platforms are networks within that network. The Facebook suite of apps is valuable largely because 3 billion people use it. Google search—at 90% of the market share—is valuable relative to other search engines because more people use it, creating data that Google’s algorithm uses to refine and improve search results. These network effects entrench these companies. Ordinarily, the astronomical profit margins of these platforms—last year, Google brought in $182.5 billion total, $40.3 billion in net income—would induce new entrants into the market. That these companies have no comparable competitors highlights that the industries may have substantial barriers to entry.
To be sure, much activity on the Internet derives value from network effects. But dominant digital platforms are different. Unlike decentralized digital spheres, such as the e-mail protocol, control of these networks is highly concentrated. Although both companies are public, one person controls Facebook (Mark Zuckerberg), and just two control Google (Larry Page and Sergey Brin). No small group of people controls e-mail.
Much like with a communications utility, this concentration gives some digital platforms enormous control over speech. When a user does not already know exactly where to find something on the Internet—and users rarely do— Google is the gatekeeper between that user and the speech of others 90% of the time. It can suppress content by deindexing or downlisting a search result or by steering users away from certain content by manually altering autocomplete results. Grind, Schechner, McMillan, & West, How Google Interferes With Its Search Algorithms and Changes Your Results, Wall Street Journal, Nov. 15, 2019. Facebook and Twitter can greatly narrow a person’s information flow through similar means. And, as the distributor of the clear majority of e-books and about half of all physical books,Footnote 4 Amazon can impose cataclysmic consequences on authors by, among other things, blocking a listing.
Footnote 4: As of 2018, Amazon had 42% of the physical book market and 89% of the e-book market. Day & Gu, The Enormous Numbers Behind Amazon’s Market Reach, Bloomberg, Mar. 27, 2019.
It changes nothing that these platforms are not the sole means for distributing speech or information. A person always could choose to avoid the toll bridge or train and instead swim the Charles River or hike the Oregon Trail. But in assessing whether a company exercises substantial market power, what matters is whether the alternatives are comparable. For many of today’s digital platforms, nothing is.

If the analogy between common carriers and digital platforms is correct, then an answer may arise for dissatisfied platform users who would appreciate not being blocked: laws that restrict the platform’s right to exclude. When a platform’s unilateral control is reduced, a government official’s account begins to better resemble a “government-controlled spac[e].” Mansky, 585 U. S., at ___ (slip op., at 7); see also Southeastern Promotions, 420 U. S., at 547, 555 (recognizing that a private space can become a public forum when leased to the government). Common-carrier regulations, although they directly restrain private companies, thus may have an indirect effect of subjecting government officials to suits that would not otherwise be cognizable under our public-forum jurisprudence. This analysis may help explain the Second Circuit’s intuition that part of Mr. Trump’s Twitter account was a public forum. But that intuition has problems. First, if market power is a predicate for common carriers (as some scholars suggest), nothing in the record evaluates Twitter’s market power. Second, and more problematic, neither the Second Circuit nor respondents have identified any regulation that restricts Twitter from removing an account that would otherwise be a “government-controlled space.”
2.
Even if digital platforms are not close enough to common carriers, legislatures might still be able to treat digital platforms like places of public accommodation. Although definitions between jurisdictions vary, a company ordinarily is a place of public accommodation if it provides “lodging, food, entertainment, or other services to the public . . . in general.” Black’s Law Dictionary 20 (11th ed. 2019) (defining “public accommodation”); accord, 42 U. S. C. §2000a(b)(3) (covering places of “entertainment”). Twitter and other digital platforms bear resemblance to that definition. This, too, may explain the Second Circuit’s intuition. Courts are split, however, about whether federal accommodations laws apply to anything other than “physical” locations. Compare, e.g., Doe v. Mutual of Omaha Ins. Co., 179 F. 3d 557, 559 (CA7 1999) (Title III of the Americans with Disabilities Act (ADA) covers websites), with Parker v. Metropolitan Life Ins. Co., 121 F. 3d 1006, 1010–1011 (CA6 1997) (en banc) (Title III of the ADA covers only physical places); see also 42 U. S. C. §§2000a(b)–(c) (discussing “physica[l] locat[ions]”).
Once again, a doctrine, such as public accommodation, that reduces the power of a platform to unilaterally remove a government account might strengthen the argument that an account is truly government controlled and creates a public forum. See Southeastern Promotions, 420 U. S., at 547, 555. But no party has identified any public accommodation restriction that applies here.
II.
The similarities between some digital platforms and common carriers or places of public accommodation may give legislators strong arguments for similarly regulating digital platforms. “[I]t stands to reason that if Congress may demand that telephone companies operate as common carriers, it can ask the same of ” digital platforms. Turner, 512 U. S., at 684 (opinion of O’Connor, J.). That is especially true because the space constraints on digital platforms are practically nonexistent (unlike on cable companies), so a regulation restricting a digital platform’s right to exclude might not appreciably impede the platform from speaking.
See id., at 675, 684 (noting restrictions on one-third of a cable company’s channels but recognizing that regulation may still be justified); PruneYard, 447 U. S., at 88. Yet Congress does not appear to have passed these kinds of regulations. To the contrary, it has given digital platforms “immunity from certain types of suits,” Candeub 403, with respect to content they distribute, 47 U. S. C. §230, but it has not imposed corresponding responsibilities, like nondiscrimination, that would matter here.
None of this analysis means, however, that the First Amendment is irrelevant until a legislature imposes common carrier or public accommodation restrictions—only that the principal means for regulating digital platforms is through those methods. Some speech doctrines might still apply in limited circumstances, as this Court has recognized in the past.
For example, although a “private entity is not ordinarily constrained by the First Amendment,” Halleck, 587 U. S., at ___, ___ (slip op., at 6, 9), it is if the government coerces or induces it to take action the government itself would not be permitted to do, such as censor expression of a lawful viewpoint. Ibid. Consider government threats. “People do not lightly disregard public officers’ thinly veiled threats to institute criminal proceedings against them if they do not come around.” Bantam Books, Inc. v. Sullivan, 372 U. S. 58, 68 (1963). The government cannot accomplish through threats of adverse government action what the Constitution prohibits it from doing directly. See ibid.; Blum v. Yaretsky, 457 U. S. 991, 1004–1005 (1982). Under this doctrine, plaintiffs might have colorable claims against a digital platform if it took adverse action against them in response to government threats.
But no threat is alleged here. What threats would cause a private choice by a digital platform to “be deemed . . . that of the State” remains unclear. Id., at 1004.Footnote 5 And no party has sued Twitter. The question facing the courts below involved only whether a government actor violated the First Amendment by blocking another Twitter user. That issue turns, at least to some degree, on ownership and the right to exclude.
Footnote 5: Threats directed at digital platforms can be especially problematic in the light of 47 U. S. C. §230, which some courts have misconstrued to give digital platforms immunity for bad-faith removal of third-party content. Malwarebytes, Inc. v. Enigma Software Group USA, LLC, 592 U. S. ___, ___–___ (2020) (THOMAS, J., statement respecting denial of certiorari) (slip op., at 7–8). This immunity eliminates the biggest deterrent— a private lawsuit—against caving to an unconstitutional government threat.
For similar reasons, some commentators have suggested that immunity provisions like §230 could potentially violate the First Amendment to the extent those provisions pre-empt state laws that protect speech from private censorship. See Volokh, Might Federal Preemption of Speech-Protective State Laws Violate the First Amendment?, The Volokh Conspiracy, Reason, Jan. 23, 2021. According to that argument, when a State creates a private right and a federal statute pre-empts that state law, “the federal statute is the source of the power and authority by which any private rights are lost or sacrificed.” Railway Employees v.
==Stossel v. Meta==
Stossel, represented by Harmeet Dhillon, sued Facebook (the Meta company) for defamation because Facebook, relying on fact-checks by Climate Feedback, labeled his posts as containing falsehoods, in effect saying he lied. Facebook moved to dismiss, on grounds including that its fact-checking labels were mere opinion and that Section 230 gave it immunity anyway.
Consider this in relation to the Texas case, NetChoice v. Paxton. There, the Big Tech companies' defense is that they exercise editorial discretion, just like magazines, and hence requiring them to be viewpoint-neutral would be compelled speech. In Stossel, Facebook's defense is that Climate Feedback provides the content and Facebook just passively accepts it. These two positions are incompatible.
I hope that someone squarely presents the argument that Facebook has forfeited its Section 230 immunity by becoming an editor. Facebook will argue that it is immune anyway. That is to take the position that it could operate as a webzine and invite free content that it admits it knows is false and defamatory, and still be immune. Indeed, it could even invite people to submit defamatory posts, advertising something like this:
- "Frustrated that you can't libel your private enemies because they'll sue? Publish here on Facebooks's Libelzine site! We're immune from suit, by Section 230. Do it anonymously, with protonmail, and you'll be safe too. Note: we do exercise some editorial discretion. If you libel a Democrat, we won't let you publish."
This, I believe, is Big Tech's current interpretation of Section 230, though they do not advertise this implication.
It goes even further. Under this interpretation, I think, it is not just Facebook that is immune; Climate Feedback is too, if it sets itself up cleverly. Facebook says Climate Feedback provided the content. But did it? If Climate Feedback did not pay its authors, and merely let them post there, or even invited them but did not tell them what to write, then the authors are not agents of Climate Feedback, and Climate Feedback can push the liability off onto them. Libelzine could be a stand-alone operation. I, Eric Rasmusen, could set up a website specifically devoted to defaming non-public persons, advertise that fact, and still be immune from suit under Section 230.
We could perhaps go a step further, though I have not looked closely at the exceptions in Section 230 while writing this. What if Eric Rasmusen set up a site called Hit Man Central, where people could anonymously advertise their services as murderers for hire? I couldn't profit from it (I would do it from pure malice), and I could be subpoenaed for any information I had, but I'd be sure to explain to the content providers how to submit secretly and how to tell clients to get in touch with them.
I do not know Section 230 law well, though I am rapidly learning. The statute itself is poorly written and ambiguous. One literal interpretation is that Facebook can indeed set up Libelzine. An opposite interpretation is that Section 230 says it does not displace state law, so it does not displace defamation law; it only clarifies that Facebook is not liable for unintentionally allowing the posting of defamatory material by someone unrelated to Facebook and unencouraged by it.
The purpose of Section 230 seems pretty clear, though: to let internet service providers offer arm's-length hosting without having to check all the content in advance. It was meant to improve information flow, not to worsen it by legalizing defamation. It did not contemplate service providers who operate like newspapers.
==Examples of Tech Lord Censorship==
* Google and Ukraine (2022)
==Political Bias==
* Massive imbalance in tech employees' campaign contributions, which go overwhelmingly to Democrats. Source: data from OpenSecrets, the nonprofit political-spending database run by the Center for Responsive Politics, as published in Vox, October 31, 2018.
==Parler's Destruction==
==Miscellaneous==
- ""Public Forum" is a term of constitutional significance - it refers to the public space that the govt provides - not a private website at which people congregate. Courts have repeatedly held that social media platforms are not subject to the "public forum doctrine." But what if the government bans private companies from doing something, and then sets up their own public space? What about air waves TV and the FCC?
* It would be quite possible to issue new versions of Windows or the Apple operating system that do not allow you to read articles criticizing the Biden Administration. In fact, they could be written to prevent you from writing such criticism on them. Do we want to allow that? Think of secondary boycotts in labor law, or the prohibition on cartels in antitrust.