Written Statement
Jonathan Turley
Shapiro Professor of Public Interest Law
The George Washington University Law School
“Censorship Laundering:
How the U.S. Department of Homeland Security Enables the Silencing of Dissent”
Subcommittee on Oversight, Investigation, and Accountability
Committee on Homeland Security
United States House of Representatives
May 11, 2023
I. INTRODUCTION
Chairman Bishop, Ranking Member Ivey, and members of the Subcommittee, my
name is Jonathan Turley, and I am a law professor at George Washington University,
where I hold the J.B. and Maurice C. Shapiro Chair of Public Interest Law.
1
It is an honor
to appear before you today to discuss free speech and government censorship.
For the purposes of background, I come to this subject as someone who has
written,
2
litigated,
3
and testified
4
in the areas of congressional oversight and the First
1
I appear today on my own behalf, and my views do not reflect those of my law school or the
media organizations that feature my legal analysis.
2
In addition to a blog with a focus on First Amendment issues (www.jonathanturley.org), I have
written on First Amendment issues as an academic for decades. See, e.g., Jonathan Turley, THE
INDISPENSABLE RIGHT: FREE SPEECH IN THE AGE OF RAGE (forthcoming 2024); Jonathan
Turley, The Unfinished Masterpiece: Speech Compulsion and the Evolving Jurisprudence of Religious
Speech, 82 MD. L. REV. (forthcoming 2023); Jonathan Turley, Rage Rhetoric and the Revival of American
Sedition, 65 WM. & MARY L. REV. (forthcoming 2023); Jonathan Turley, The Right to Rage in
American Political Discourse, GEO. J.L. & PUB. POL'Y (forthcoming 2023); Jonathan Turley, Harm and
Hegemony: The Decline of Free Speech in the United States, 45 HARV. J.L. & PUB. POL'Y 571 (2022);
Jonathan Turley, The Loadstone Rock: The Role of Harm in The Criminalization of Plural Unions, 64
EMORY L.J. 1905 (2015); Jonathan Turley, Registering Publius: The Supreme Court and the Right to
Anonymity, 2002 SUP. CT. REV. 57-83.
3
See, e.g., Eugene Volokh, The Sisters Wives Case and the Criminal Prosecution of Polygamy,
WASH. POST, Aug. 28, 2015 (discussing challenge on religious, speech, and associational rights); Jonathan
Turley, Thanks to the Sisters Wives Litigation, We have One Less Morality Law, WASH. POST, Dec. 12,
2013.
4
See, e.g., “Hearing on the Weaponization of the Federal Government,” United States House of
Representatives, House Judiciary Committee, Select Subcommittee on the Weaponization of the Federal
Government, February 9, 2023 (statement of Jonathan Turley); Examining the ‘Metastasizing’ Domestic
Terrorism Threat After the Buffalo Attack: Hearing Before the S. Comm. on the Judiciary, 117th Cong.
(2022) (statement of Jonathan Turley); Secrecy Orders and Prosecuting Leaks: Potential Legislative
Responses to Deter Prosecutorial Abuse of Power: Hearing Before H. Comm. on the Judiciary, 117th
Cong. (2021) (statement of Jonathan Turley); Fanning the Flames: Disinformation and Extremism in the
Media: Hearing Before the Subcomm. on Commc’n & Tech. of the H. Comm. on Energy & Com., 117th
Cong. (2021) (statement of Jonathan Turley); The Right of The People Peacefully to Assemble: Protecting
Amendment for decades. I have also represented the United States House of
Representatives in litigation.
5
My testimony today obviously reflects that past work and I
hope to offer a fair understanding of the governing constitutional provisions, case law,
and standards that bear on this question.
As I recently testified before the House Judiciary Committee, the growing
evidence of censorship and blacklisting efforts by the government raises serious and
troubling questions over our protection of free speech.
6
There are legitimate
disagreements on how Congress should address the role of the government in such
censorship. The first step, however, is to fully understand the role played in prior years
and to address the deep-seated doubts of many Americans concerning the actions of the
government to stifle or sanction speech.
The Twitter Files and other recent disclosures raise serious questions of whether
the United States government is now a partner in what may be the largest censorship
system in our history. That involvement cuts across the Executive Branch, with
confirmed coordination with agencies ranging from the Department of Homeland Security to the State
Department to the Federal Bureau of Investigation (FBI). Even based on our limited
knowledge, the size of this censorship system is breathtaking, and we only know of a
fraction of its operations through the Twitter Files, congressional hearings, and pending
litigation. Most of the information has come from the Twitter Files after the purchase of
the company by Elon Musk. Notably, Twitter has 450 million active users
7
but it is still
only ranked 15th in the number of users, after companies such as Facebook, Instagram,
TikTok, Snapchat, and Pinterest.
8
The assumption is that the government censorship
program dovetailed with these other companies, which continue to refuse to disclose their past
communications and coordination with the government. Assuming these efforts extended to the
larger platforms, we have a government-supported censorship system that is unparalleled
in history.
We now have undeniable evidence of a comprehensive system of censorship that
stretches across the government, academia, and corporate realms. Through disinformation
offices, grants, and other means, an array of federal agencies has been active
“stakeholders” in this system. This includes the Department of Homeland Security, the State Department, the
FBI, and other federal agencies actively seeking the censorship of citizens and groups.
The partners in this effort extend across social media platforms. The goal is not just to
remove dissenting views, but also to isolate those citizens who voice them. We recently
Speech By Stopping Anarchist Violence: Hearing Before the Subcomm. on the Const. of the S. Comm. on
the Judiciary, 116th Cong. (2020) (statement of Jonathan Turley); Respect for Law Enforcement and the
Rule of Law: Hearing Before the Commission on Law Enforcement and the Administration of Justice,
(2020) (statement of Jonathan Turley); The Media and The Publication of Classified Information: Hearing
Before the H. Select Comm. on Intelligence, 109th Cong. (2006) (statement of Jonathan Turley).
5
See U.S. House of Representatives v. Burwell, 185 F. Supp. 3d 165 (D.D.C. 2016),
https://casetext.com/case/us-house-of-representatives-v-capacity-1.
6
Some of today’s testimony includes material from that earlier hearing. “Hearing on the
Weaponization of the Federal Government,” United States House of Representatives, House Judiciary
Committee, Select Subcommittee on the Weaponization of the Federal Government, February 9, 2023
(statement of Jonathan Turley).
7
Twitter Revenue and User Statistics, BUSINESS OF APPS, Jan. 31, 2023,
https://www.businessofapps.com/data/twitter-statistics/.
8
Most Popular Social Networks, STATISTA, https://www.statista.com/statistics/272014/global-
social-networks-ranked-by-number-of-users/.
learned that this effort extended even to companies like LinkedIn.
9
New emails
uncovered in the Missouri v. Biden litigation reportedly show that the Biden
Administration’s censorship efforts extended to Facebook to censor private
communications on its WhatsApp messaging service.
10
The effort to limit access, even to
professional sites like LinkedIn, creates a chilling effect on those who would challenge
majoritarian or official views. It was the same chilling effect experienced by scientists
who tried to voice alternative views on vaccines, school closures, masks, or the Covid
origins. The success of this partnership may surpass anything achieved by direct state-run
systems in countries like Russia or China.
The recent disclosures involving the Cybersecurity and Infrastructure Security
Agency (CISA) are chillingly familiar. CISA is part of an ever-expanding complex of
government programs and grants directed toward the censorship or blacklisting of
citizens and groups. In just a matter of weeks, the size of this complex has come into
greater focus and has confirmed the fears held by many of us over the use of private
actors to do indirectly what the government is prohibited from doing directly. I have
called it “censorship by surrogate” and CISA appears to be the latest agency to enlist
private proxy actors.
The focus of this hearing is particularly welcome, as it reminds us that the cost
of censorship is not just the loss of the right to free expression. Those costs can include
the impact of reducing needed public debate and scrutiny in areas like public health. For
years, government and corporate figures worked to silence scientists and researchers who
opposed government policies on mask efficacy, universal vaccinations, school closures,
and even the origin of Covid-19. Leading experts Drs. Jayanta Bhattacharya (Stanford
University) and Martin Kulldorff (Harvard University), as well as a host of others, faced
overwhelming attacks for questioning policies or views that later proved questionable or
downright wrong. Those doctors were the co-authors of the Great Barrington
Declaration, which advocated for a more focused Covid response that targeted the most
vulnerable populations, rather than widespread lockdowns and mandates.
Dr. Kulldorff was censored in March 2021 when he tweeted “Thinking that
everyone must be vaccinated is as scientifically flawed as thinking that nobody should.
COVID vaccines are important for older high-risk people and their care-takers. Those
with prior natural infection do not need it. Nor children.” Every aspect of that tweet was
worthy of scientific and public debate. However, with the support of political, academic,
and media figures, such views were suppressed at the very moment in which they could
have made the most difference. For example, if we had a true and open debate, we might
have followed other countries in keeping schools open for young children. Agencies and
the media now recognize that these objections had merit. We are now experiencing an
educational and mental health crisis associated with a lockdown that might have been
avoided or reduced (as in other countries). Millions died as government agencies enlisted
9
Jonathan Turley, “Connect to Opportunity”: State Department Pushed LinkedIn to Censor
“Disinformation,” Res Ipsa Blog (www.jonathanturley.org), Apr. 12, 2023,
https://jonathanturley.org/2023/04/12/connect-to-opportunity-new-evidence-shows-state-department-
pushing-linkenin-to-censor-disinformation/.
10
Jonathan Turley, New Documents Expose Government Censorship Efforts at Facebook and
WhatsApp, Res Ipsa Blog (www.jonathanturley.org), March 26, 2023,
https://jonathanturley.org/2023/03/26/new-documents-expose-government-censorship-efforts-at-facebook-
and-whatsapp/.
companies to silence dissenting viewpoints on best practices and approaches. We do not
know how many of those deaths or costs might have been avoided because this debate
was delayed until after the pandemic had largely subsided.
The purpose of my testimony today is to address the legal question of when
government support for censorship systems becomes a violation of the First Amendment
and, more broadly, when it contravenes free speech principles. To that end, I hope to briefly
explore what we know, what we do not know, and why we must know much more about
the government’s efforts to combat speech deemed misinformation, disinformation, and
malinformation (MDM).
Regardless of how one comes out on the constitutional ramifications of the
government’s role in the censorship system, there should be no serious debate over the
dangers that government-supported censorship presents to our democracy. The United
States government may be outsourcing censorship, but the impact is still inimical to the
free speech values that define this country. This should not be a matter that divides our
political parties. Free speech is the core article of faith of all citizens in our constitutional
system. It should transcend politics and, despite our deepening divisions, unite us all in a
common cause to protect what Justice Louis Brandeis once called “the indispensable
right.”
11
II. MDM AND CENSORSHIP BY SURROGATE
It is a common refrain among many supporters of corporate censorship that the
barring, suspension, or shadow banning of individuals on social media is not a free
speech problem. The reason is that the First Amendment applies to the government, not
private parties. As a threshold matter, it is important to stress that free speech values are
neither synonymous with, nor contained exclusively within, the First Amendment. The
First Amendment addressed the most prevalent danger of the time in the form of direct
government regulation and censorship of free speech and the free press. Yet, free speech
in society is impacted by both public and private conduct. Indeed, the massive censorship
system employed by social media companies presents the greatest loss of free speech in
our history. These companies, not the government, now control access to the
“marketplace of ideas.” That is also a free speech threat that needs to be taken seriously
by Congress. While the Washington Post has shown that the Russian trolling operations
had virtually zero impact on our elections,
12
the corporate censorship of companies like
Twitter and Facebook clearly had an impact by suppressing certain stories and viewpoints
in our public discourse. It was the response to alleged disinformation, not the
disinformation itself, that manipulated the debate and issues for voters.
The First Amendment addresses actions by the government, but there are certainly
actions taken by these agencies to censor the views of citizens. While one can debate
whether social media executives became effective government agents, public employees
are government agents. Their actions must not seek to abridge the freedom of speech. It is
possible that a systemic government program supporting a privately-run censorship
11
Whitney v. California, 274 U.S. 357, 375-76 (1927) (Brandeis, J., concurring).
12
Tim Starks, Russian Trolls on Twitter Had Little Influence on 2016 Election, WASH. POST, Jan. 9,
2023, https://www.washingtonpost.com/politics/2023/01/09/russian-trolls-twitter-had-little-influence-2016-
voters/.
system is sufficient to justify injunctive relief based on the actions of dozens of federal
employees to target and seek the suspension of citizens due to their viewpoints. However,
this program can also run afoul of the First Amendment if the corporate counterparts in
the system are considered effective government agents themselves. The most common
example occurs under the Fourth Amendment where the government is sometimes
viewed as acting through private security guards or snitches performing tasks at its
request.
The same agency relationship can occur under the First Amendment, particularly
on social media. The “marketplace of ideas” is now largely digital. The question is
whether the private bodies engaging in censorship are truly acting independently of the
government. There is now ample reason to question that separation. Social media
companies operate under statutory conditions and agency review. That relationship can
allow or encourage private parties to act as willing or coerced agents in the denial of free
speech. Notably, in 1946, the Court dealt with a town run by a private corporation in
Marsh v. Alabama.
13
It was that corporation, rather than a government unit, that
prevented citizens from distributing religious literature on a sidewalk. However, the
Court still found that the First Amendment was violated because the corporation was
acting as a governing body. The Court held that, while the denial of free speech rights
“took place, [in a location] held by others than the public, [it] is not sufficient to justify
the State’s permitting a corporation to govern a community of citizens so as to restrict
their fundamental liberties.”
14
Congress has created a curious status for social media companies in granting
immunity protections in Section 230. That status and immunity have been repeatedly
threatened by members of Congress unless social media companies expanded censorship
programs in a variety of different areas. The demands for censorship have been
reinforced by letters threatening congressional action. Many of those threats have
centered around removing Section 230 immunity, pursuing antitrust measures, or other
vague regulatory responses. Many of these threats have focused on conservative sites or
speakers. The language of the Section itself is problematic in giving these companies
immunity “to restrict access to or availability of material that the provider or user
considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or
otherwise objectionable, whether or not such material is constitutionally protected.”
15
As
Columbia Law professor Phil Hamburger has noted, the statute appears to permit what is
made impermissible under the First Amendment:
16
“Congress makes explicit that it is
immunizing companies from liability for speech restrictions that would be
unconstitutional if lawmakers themselves imposed them.”
17
As Hamburger notes, that
does not mean that the statute is unconstitutional, particularly given the judicial rule
favoring narrow constructions to avoid unconstitutional meanings.
18
However, there is
13
Marsh v. Alabama, 326 U.S. 501 (1946).
14
Id. at 509.
15
47 U.S.C. § 230(c).
16
Philip Hamburger, The Constitution Can Crack Section 230, WALL STREET JOURNAL (Jan. 30,
2021).
17
Id.
18
Id. See, e.g., Republican Party of Hawaii v. Mink, 474 U.S. 1301, 1302 (1985) (narrowly
interpreting the recall provisions of the Honolulu City Charter).
another lingering issue raised by the use of this power to carry out one party’s clear preferences on
“content moderation.”
The Court has recognized that private actors can be treated as agents of the
government under a variety of theories. Courts have found such agency exists when the
government has exercised “coercive power” or “provided such significant encouragement,
either overt or covert, that the choice must in law be deemed to be that of the
State.”
19
The Court has also held that the actions of a private party may be “fairly treated
as that of the State itself” where there is a sufficiently “close nexus between the State and the
challenged action.”
20
I will return to the case law below, but first it is useful to consider what is currently
known about the government-corporate coordination revealed by the Twitter Files.
I will not lay out the full array of communications revealed by Twitter and recent
litigation, but some are worth noting as illustrative of a systemic and close coordination
between the company and federal officials, including dozens reportedly working within
the FBI. The level of back-channel communications at one point became so
overwhelming that a Twitter executive complained that the FBI was “probing & pushing
everywhere.” Another official referred to managing the government censorship referrals
as a “monumental undertaking.” At the same time, dozens of ex-FBI employees were
hired, including former FBI General Counsel James Baker. There were so many FBI
employees that they set up a private Slack channel and a crib sheet to allow them to
translate FBI terms into Twitter terms more easily. The Twitter Files have led groups
from the right to the left of our political spectrum to raise alarms over a censorship
system maintained by a joint government-corporate effort.
21
Journalist Matt Taibbi was
enlisted by Elon Musk to present some of these files and reduced his findings to a simple
header: “Twitter, the FBI Subsidiary.”
As discussed today, these disclosures show that the FBI is not alone among the
federal agencies in systemically targeting posters for censorship. Indeed, emails reveal
FBI figures, like San Francisco Assistant Special Agent in Charge Elvis Chan, asking
Twitter executives to “invite an OGA” (or “Other Government Agency”) to an
upcoming meeting. A week later, Stacia Cardille, a senior Twitter legal executive,
indicated the OGA was the CIA, an agency under strict limits regarding domestic
activities. Much of this work apparently was done through the multi-agency Foreign
Influence Task Force (FITF), which operated secretly to censor citizens. Cardille
referenced her “monthly (soon to be weekly) 90-minute meeting with FBI, DOJ, DHS,
ODNI [Office of the Director of National Intelligence], and industry peers on election
threats.” She detailed long lists of tasks sent to Twitter by government officials. The
censorship efforts reportedly included “regular meetings” with intelligence officials. This
included an effort to warn Twitter about a “hack-and-leak operation” by state actors
targeting the 2020 presidential election. That occurred just before the New York Post
19
Blum v. Yaretsky, 457 U.S. 991, 1004 (1982).
20
Jackson v. Metro. Edison Co., 419 U.S. 345, 351 (1974).
21
Compare Yes, You Should Be Worried About the FBI’s Relationship with Twitter, THE FIRE, Dec. 23,
2022, https://www.thefire.org/news/yes-you-should-be-worried-about-fbis-relationship-twitter with Branko
Marcetic, Why the Twitter Files Are In Fact a Big Deal, JACOBIN, Dec. 29, 2022,
https://jacobin.com/2022/12/twitter-files-censorship-content-moderation-intelligence-agencies-surveillance.
story on Hunter Biden’s laptop was published and then blocked by Twitter. It was also
blocked by other social media platforms like Facebook.
22
The files also show the staggering size of government searches and demands. The
FBI reportedly did key word searches to flag large numbers of postings for possible
referral to Twitter. On November 3, 2020, Cardille told Baker that “[t]he FBI has “some
folks in the Baltimore field office and at HQ that are just doing keyword searches for
violations. This is probably the 10th request I have dealt with in the last 5 days.” Baker
responded that it was “odd that they are searching for violations of our policies.” But it
was not odd at all. Twitter had integrated both current and former FBI officials into its
network and the FBI was using the company’s broadly defined terms of service to target
a wide array of postings and posters for suspensions and deletions.
At one point, the coordination became so tight that, in July 2020, Chan offered to
grant temporary top-secret clearance to Twitter executives to allow for easier
communications and incorporation into the government network.
23
This close working
relationship also allowed the government to use accounts covertly, reportedly with the
knowledge of Twitter. One 2017 email sent by an official from United States Central
Command (CENTCOM) requested that Twitter “whitelist” Arabic-language Twitter
accounts that the government was using to “amplify certain messages.” The government
also asked that these accounts be granted the “verified” blue checkmark.
The range of available evidence on government coordination with censorship
extends beyond the Twitter Files and involves other agencies. For example, recent
litigation brought by various states over social media censorship revealed a back-channel
exchange between defendant Carol Crawford, the CDC’s chief of digital media, and a
Twitter executive.
24
The request for the meeting was made on March 18,
2021. Twitter senior manager for public policy Todd O’Boyle asked Crawford to help
identify tweets to be censored and emphasized that the company was “looking forward to
setting up regular chats.” However, Crawford said that the timing that week was “tricky.”
Notably, that week, Twitter CEO Jack Dorsey and other CEOs were to appear at a House hearing to discuss
“misinformation” on social media and their “content moderation” policies. I had
just testified on private censorship circumventing the First Amendment as a type of
censorship by surrogate.
25
Dorsey and the other CEOs were asked at the March 25, 2021,
hearing about my warning of a “little brother problem, a problem which private entities
do for the government which it cannot legally do for itself.”
26
Dorsey insisted that there
was no such censorship office or program.
22
Mark Zuckerberg has also stated that the FBI clearly warned about the Hunter Biden laptop as
Russian disinformation. David Molloy, Zuckerberg Tells Rogan that FBI Warning Prompted Biden Laptop
Story Censorship, BBC, August 26, 2022, https://www.bbc.com/news/world-us-canada-62688532.
23
Twitter executives Vijaya Gadde and Yoel Roth both testified that they do not know if anyone took up this offer for clearances.
24
The lawsuit addresses how experts, including Drs. Jayanta Bhattacharya (Stanford University) and
Martin Kulldorff (Harvard University), have faced censorship on these platforms.
25
Fanning the Flames: Disinformation and Extremism in the Media: Hearing Before the Subcomm.
on Commc’n & Tech. of the H. Comm. on Energy & Com., 117th Cong. (2021) (statement of Jonathan
Turley, Shapiro Professor of Public Interest Law, The George Washington University Law School).
26
Misinformation and Disinformation on Online Platforms: Hearing Before the Subcomm. on
Commc’n & Tech. and Subcomm. on Consumer Protection of the H. Comm. on Energy & Com., 117th
Cong. (2021).
The pressure to censor Covid-related views was also coming from the White
House, which targeted Alex Berenson, a former New York Times reporter, who had
contested agency positions on vaccines and underlying research. Rather than push
information to counter Berenson’s views, the White House wanted him banned. Berenson
was eventually suspended.
These files show not just a massive censorship system but a coordination and
integration of the government to a degree that few imagined before the release of the
Twitter Files. Congressional hearings have only deepened the alarm for many in the free
speech community. At one hearing, former Twitter executive Anika Collier Navaroli
testified on what she called the “nuanced” standard used by her and her staff on
censorship, including the elimination of “dog whistles” and “coded” messaging. She then
said that they balanced free speech against safety and explained that they sought a
different approach:
“Instead of asking just free speech versus safety to say free speech for whom and
public safety for whom. So whose free expression are we protecting at the expense
of whose safety and whose safety are we willing to allow to go the winds so that
people can speak freely?”
The statement was similar to the statement of the former CEO Parag Agrawal. After
taking over as CEO, Agrawal pledged to regulate content as “reflective of things that we
believe lead to a healthier public conversation.” Agrawal said the company would “focus
less on thinking about free speech” because “speech is easy on the internet. Most people
can speak. Where our role is particularly emphasized is who can be heard.”
The sweeping standards revealed at these hearings were defended by members as
necessary to avoid “insurrections” and other social harms. What is particularly distressing
is to hear members repeatedly defending censorship by citing Oliver Wendell Holmes’
famous statement on “shouting fire in a crowded theater.” This mantra has been grossly
misused as a justification for censorship. From statements on the pandemic to climate
change, anti-free speech advocates are claiming that opponents are screaming “fire” and
causing panic. The line comes from Schenck v. United States, a case that discarded the
free speech rights of citizens opposing the draft. Charles Schenck and Elizabeth Baer
were leading socialists in Philadelphia who opposed the draft in World War I. Fliers were
distributed that encouraged men to “assert your rights” and stand up for their right to
refuse such conscription as a form of involuntary servitude. Writing for the Court, Justice
Oliver Wendell Holmes dismissed the free speech interests in favor of protecting the war effort and the
draft. He then wrote one of the most regrettable and misunderstood judicial soundbites in
history: “the character of every act depends on the circumstances in which it is done . . .
The most stringent protection of free speech would not protect a man in falsely shouting
fire in a theater and causing a panic.” “Shouting fire in a crowded theater” quickly
became a mantra for every effort to curtail free speech.
Holmes sought to narrow his clear and present danger test in his dissent
in Abrams v. United States. He warned that “we should be eternally vigilant against
attempts to check the expression of opinions that we loath and believe to be frought (sic)
with death, unless they so imminently threaten immediate interference with the lawful
and pressing purposes of the law that an immediate check is required to save the
country.” Holmes’ reframing of his view would foreshadow the standard
in Brandenburg v. Ohio, where the Supreme Court ruled that even calling for violence is
protected under the First Amendment unless it is directed to inciting “imminent lawless
action and is likely to incite or produce such action.” However, members are still
channeling the standard from Schenck, a curious choice for most Democrats, given that
the standard was used against socialists and anti-war protesters.
Even more unnerving is the fact that Navaroli’s standard and those referencing
terms like “delegitimization” make the Schenck standard look like a model of clarity.
Essentially, they add that you also have to consider the theater, movie, and audience to
decide what speech to allow. What could be treated as crying “Fire!” by any given person
or in any given circumstances would change according to their “nuanced” judgment.
III. CISA WITHIN THE GOVERNMENT-CORPORATE ALLIANCE
The role of CISA in this complex of government-corporate programs only
recently came into closer scrutiny. The Department of Homeland Security was previously
the focus of public controversy with the disclosure of the creation of the Department’s
Disinformation Governance Board and the appointment of Nina Jankowicz as its head.
Jankowicz was a longtime advocate for censorship in the name of combating disinformation.
At the time, White House press secretary Jen Psaki described the board as intended “to
prevent disinformation and misinformation from traveling around the country in a range
of communities.”
27
While the Department ultimately yielded to the public outcry over the
board and disbanded it, the public was never told of a wide array of other offices doing
much of the same work in targeting citizens and groups for possible censorship.
In January 2017, the Department of Homeland Security declared that election infrastructure
would be treated as “critical infrastructure.” CISA took a lead in supporting election
infrastructure integrity and countering election misinformation. In 2018, CISA and its
Countering Foreign Influence Task Force (CFITF) reportedly assumed a greater role in
monitoring and counteracting foreign interference in U.S. elections. In 2020, this work
appears to have expanded further to pursue allegations of “switch boarding” by domestic
actors, or individuals thought to be acting as conduits for information undermining
elections or critical infrastructure. Much about this work remains unclear and I am no
expert on CISA or its operational profile. However, the expanding mandate of CISA
follows a strikingly familiar pattern.
The Twitter Files reference CISA’s participation in these coordination meetings.
Given a mandate to help protect election integrity, CISA plunged into the monitoring and
targeting of those accused of disinformation. Infrastructure was interpreted to include
speech. As its director, Jen Easterly, declared, “the most critical infrastructure is our
cognitive infrastructure,” adding that “building that resilience to misinformation
and disinformation, I think, is incredibly important.”
28
She pledged to continue that work
with the private sector, including social media companies. We do not need
the government in the business of building our “cognitive infrastructure.” Like content
27
Press Briefing by Press Secretary Jen Psaki, April 29, 2022, https://www.whitehouse.gov/briefing-
room/press-briefings/2022/04/28/press-briefing-by-press-secretary-jen-psaki-april-28-2022/.
28
Maggie Miller, Cyber Agency Beefing Up Disinformation, Misinformation Team, THE HILL, Nov.
10, 2022, https://thehill.com/policy/cybersecurity/580990-cyber-agency-beefing-up-disinformation-
misinformation-team/.
moderation, the use of this euphemism does not disguise the government’s effort to direct
and control what citizens may read or say on public platforms.
Over the years, the range of information deemed harmful has expanded to the
point that even true information is now viewed as harmful for the purposes of censorship.
Some of the recent disclosures from Twitter highlighted the work of Stanford’s Virality
Project, which insisted that “true stories … could fuel hesitancy” over taking the vaccine or
other measures.
29
It is reminiscent of the sedition prosecutions under the Crown before
the American Revolution, where truth was no defense. Even true statements could be
viewed as seditious and criminal. Once the government gets into the business of speech
regulation, the appetite for censorship becomes insatiable as viewpoints are deemed
harmful, even if true. CISA shows the same broad range of suspect speech:
• “Misinformation is false, but not created or shared with the intention of causing harm.
• Disinformation is deliberately created to mislead, harm, or manipulate a person, social group, organization, or country.
• Malinformation is based on fact, but used out of context to mislead, harm, or manipulate. An example of malinformation is editing a video to remove important context to harm or mislead.”
30
MDM regulations offer the government the maximal space for censorship based on how
information may be received or used. The inclusion of true material used to “manipulate”
others is particularly chilling as a rationale for speech controls.
According to the Election Integrity Partnership (EIP), “tickets” flag material for
investigation that can be “one piece of content, an idea or narrative, or hundreds of URLs
pulled in a data dump.”
31
These tickets reportedly include those suspected of
“delegitimization,” which includes speech that undermines or spreads distrust in the
political or electoral system. The ill-defined character of these categories is by design. It
allows for highly selective or biased “ticketing” of speech. The concern is that
conservative writers or sites were subjected to the greatest targeting or ticketing. This pattern
was evident in other recent disclosures from private bodies working with U.S. agencies.
For example, we recently learned that the U.S. State Department funding for the National
Endowment for Democracy (NED) included support for the Global Disinformation
Index (GDI).
32
The British group sought to discourage advertisers from supporting sites
29
Jonathan Turley, “True Stories … Could Fuel Hesitancy”: Stanford Project Worked to Censor Even
True Stories on Social Media, Res Ipsa Blog (www.jonathanturley.org), March 19, 2023, at
https://jonathanturley.org/2023/03/19/true-stories-could-fuel-hesitancy-stanford-project-worked-to-censor-
even-true-stories-on-social-media/.
30
Foreign Influence Operations and Disinformation, https://www.cisa.gov/topics/election-
security/foreign-influence-operations-and-disinformation.
31
ELECTION INTEGRITY PARTNERSHIP, THE LONG FUSE: MISINFORMATION AND THE
2020 ELECTION 9 (2021), https://stacks.stanford.edu/file/druid:tr171zs0069/EIP-Final-Report.pdf
32
Jonathan Turley, Scoring Speech: How the Biden Administration has been Quietly Shaping Public
Discourse, Res Ipsa Blog (www.jonathanturley.org), Feb. 20, 2023,
https://jonathanturley.org/2023/02/20/scoring-speech-how-the-biden-administration-has-been-quietly-
shaping-speech/.
deemed dangerous due to disinformation. GDI warned companies about “risky” sites
that pose “reputational and brand risk” and asked them to avoid “financially
supporting disinformation online.” All ten of the “riskiest” sites identified by the GDI are
popular with conservatives, libertarians, and independents, including Reason, a site
featuring legal analysis by conservative law professors. Liberal sites like HuffPost were
ranked as the most trustworthy. The categories were as ill-defined as those used by CISA.
RealClearPolitics was blacklisted due to what GDI considers “biased and sensational
language.” The New York Post was blacklisted because “content sampled from the Post
frequently displayed bias, sensationalism and clickbait, which carries the risk of
misleading the site’s reader.” After the biased blacklisting was revealed, NED announced
that it would withdraw funding for the organization. However, as with the Disinformation
Board, the Disinformation Index was just one of a myriad of groups being funded or fed
information from federal agencies. These controversies have created a type of “Whack-a-
mole” challenge for the free speech community. Every time one censorship partnership is
identified and neutralized, another one pops up.
EIP embodies this complex of groups working with agencies. It describes itself as
an organization that “was formed between four of the nation’s leading institutions
focused on understanding misinformation and disinformation in the social media
landscape: the Stanford Internet Observatory, the University of Washington’s Center for
an Informed Public, Graphika, and the Atlantic Council’s Digital Forensic Research
Lab.” The EIP has referred to CISA as one of its “stakeholders” and CISA has used the
partnership to censor individuals or groups identified by the agency. We still do not know the
full extent of the coordination between CISA and other agencies with private and
academic groups in carrying out censorship efforts. However, the available evidence
raises legitimate questions over an agency relationship for the purposes of the First
Amendment.
IV. OUTSOURCING CENSORSHIP: THE NEED FOR GREATER
TRANSPARENCY AND ACCOUNTABILITY
In recent years, a massive censorship complex has been established with
government, academic, and corporate components. Millions of posts and comments are
now being filtered through this system in arguably the most sophisticated censorship
system in history. This partnership was facilitated by the demands of the First
Amendment, which bars the government from directly engaging in forms of prior
restraint and censorship. If “necessity is the mother of invention,” the censorship complex
shows how inventive motivated people can be in circumventing the Constitution. It has
been an unprecedented challenge for the free speech community. The First Amendment
was designed to deal with the classic threat to free speech of a government-directed
system of censorship. However, the traditional model of a ministry of information is now
almost quaint in comparison to the current system. It is possible to have an effective state
media by consent rather than coercion. There is no question that the work of these
academic and private groups limits free speech. Calling opposing views disinformation,
malinformation, or misinformation does not sanitize the censorship. It is still censorship
being conducted through a screen of academic and corporate entities. It may also
contravene the First Amendment.
The government can violate the Constitution through public employees or private
actors. As I testified recently before the Judiciary Committee, this agency relationship
can be established through consent or coercion. Indeed, the line can be difficult to discern
in many cases. There is an argument that this is a violation of the First Amendment.
Where the earlier debate over the status of these companies under Section 230 remained
mired in speculation, the recent disclosures of government involvement in the Twitter
censorship program present a more compelling and concrete case for arguing agency
theories. These emails refer to multiple agencies with dozens of employees actively
coordinating the blacklisting and blocking of citizens due to their public statements.
There is no question that the United States government is actively involved in a massive
censorship system. The only question is whether it is in violation of the First
Amendment.
Once again, the Twitter Files show direct action from federal employees to censor
viewpoints and individual speakers on social media. The government conduct is direct
and clear. That may alone be sufficient to satisfy courts that a program or policy abridges
free speech under the First Amendment. Even if a company like Twitter declined
occasionally, the federal government was actively seeking to silence citizens. Any
declinations only show that that effort was not always successful.
In addition to that direct action, the government may also be responsible for the
actions of third parties who are partnering with the government on censorship. The
government has long attempted to use private parties to evade direct limits imposed by
the Constitution. Indeed, this tactic has been part of some of the worst chapters in our
history. For example, in Lombard v. Louisiana,
33
the Supreme Court dealt with a restaurant’s refusal
to serve three black students and one white student at a lunch counter in
New Orleans reserved for white people. The Court acknowledged that there was no state
statute or city ordinance requiring racial segregation in restaurants. However, both the
Mayor and the Superintendent of Police had made public statements that “sit-in
demonstrations” would not be permitted. The Court held that the government cannot do
indirectly what it cannot do directly. In other words, it “cannot achieve the same result by
an official command which has at least as much coercive effect as an ordinance.”
34
As the Court said in Blum v. Yaretsky (where state action was not found), “a State
normally can be held responsible for a private decision only when it has exercised
coercive power or has provided such significant encouragement, either overt or covert,
that the choice must in law be deemed to be that of the State.”
35
Past cases (often dealing
with state action under the Fourteenth Amendment) have produced different tests for
establishing an agency relationship, including (1) public function; (2) joint action; (3)
governmental compulsion or coercion; and (4) governmental nexus.
36
Courts have noted
33
373 U.S. 267 (1963).
34
Id. at 273.
35
Blum v. Yaretsky, 457 U.S. 991, 1004-05 (1982).
36
Pasadena Republican Club v. W. Justice Ctr., 985 F.3d 1161, 1167 (9th Cir. 2021); Kirtley v.
Rainey, 326 F.3d 1088, 1092 (9th Cir. 2003). Some courts reduce this to three tests.
that these cases “overlap” in critical respects.
37
I will not go into each of these tests but
they show the highly contextual analysis performed by courts in finding private conduct
taken at the behest or direction of the government. The Twitter Files show a multilayered
incorporation of government information, access, and personnel in the censorship
program. One question is “whether the state has so far insinuated itself into a position of
interdependence with [the private entity] that it must be recognized as a joint participant
in the challenged activity.”
38
Nevertheless, the Supreme Court noted in Blum that “[m]ere
approval of or acquiescence in the initiatives of a private party is not sufficient to justify
holding the State responsible for those initiatives.”
39
Courts have previously rejected claims that social media companies acted as agents of the
government.
40
However, these cases often cited the lack of evidence of coordination and
occurred before the release of the Twitter Files. For example, in Rogalinski v. Meta
Platforms, Inc.,
41
the court rejected a claim that Meta Platforms, Inc. violated the First
Amendment when it censored posts about COVID-19. However, the claim was based
entirely on a statement by the White House Press Secretary and “all of the alleged
censorship against Rogalinski occurred before any government statement.” It noted that
there was no evidence of any government input to challenge the
assertion that Meta’s message was “entirely its own.”
42
There is an interesting comparison to the decision of the United States Court of
Appeals for the Sixth Circuit in Paige v. Coyner, where the Court dealt with the
termination of an employee after a county official called her employer to complain about
comments made in a public hearing.
43
The court recognized that “[t]his so-called state-
actor requirement becomes particularly complicated in cases such as the present one
where a private party is involved in inflicting the alleged injury on the plaintiff.”
44
However, in reversing the lower court, it still found state action due to the fact that a
government official made the call to the employer, which prompted the termination.
Likewise, in Dossett v. First State Bank, the United States Court of Appeals for
the Eighth Circuit ruled that the termination of a bank employee was the result of state
action after school board members contacted her employer about comments made at a
public-school board meeting.
45
The Eighth Circuit ruled that the district court erred by
instructing a jury that it had to find that the school board members had “actual authority”
to make these calls. In this free speech case, the court held that there could be state
action under color of law where a school official “was purporting to act in the
performance of official duties but was acting outside what a reasonable person would
believe the school official was authorized to do.”
46
In this case, federal officials are
clearly acting in their official capacity. Indeed, that official capacity is part of the concern
37
Rogalinski v. Meta Platforms, Inc., 2022 U.S. Dist. LEXIS 142721 (August 9, 2022).
38
Gorenc v. Salt River Project Agr. Imp. & Power Dist., 869 F.2d 503, 507 (9th Cir. 1989).
39
Blum, 457 U.S. at 1004-05.
40
O’Handley v. Padilla, 579 F. Supp. 3d 1163, 1192-93 (N.D. Cal. 2022).
41
2022 U.S. Dist. LEXIS 142721 (August 9, 2022).
42
Id.
43
Paige v. Coyner, 614 F.3d 273, 276 (6th Cir. 2010).
44
Id.
45
399 F.3d 940 (8th Cir. 2005).
46
Id. at 948.
raised by the Twitter Files: the assignment of dozens of federal employees to support a
massive censorship system.
Courts have also ruled that there is state action where government officials use
their positions to intimidate or pressure private parties to limit free speech. In National
Rifle Association v. Vullo, the United States Court of Appeals for the Second Circuit
ruled that a free speech claim could be made on the basis of a state official’s pressuring
companies not to do business with the NRA.
47
The Second Circuit held “although
government officials are free to advocate for (or against) certain viewpoints, they may not
encourage suppression of protected speech in a manner that ‘can reasonably be
interpreted as intimating that some form of punishment or adverse regulatory action will
follow the failure to accede to the official’s request.’”
48
It is also important to note that
pressure is not required to establish an agency relationship under three of the prior tests.
It can be based on consent rather than coercion.
We have seen how censorship efforts began with claims of foreign interference
and gradually expanded into general efforts to target harmful or “delegitimizing” speech.
The Twitter Files show FBI officials warning Twitter executives that their platform was
being targeted by foreign powers, including a warning that an executive cited as a basis
for blocking postings related to the Hunter Biden laptop. At the same time, various
members of Congress have warned social media companies that they could face
legislative action if they did not continue to censor social media. Indeed, after Twitter
began to reinstate free speech protections and dismantle its censorship program, Rep.
Adam Schiff (D-Calif.) (joined by Reps. André Carson (D-Ind.), Kathy Castor (D-Fla.), and Sen. Sheldon
Whitehouse (D-R.I.)) sent a letter to Facebook, warning it not to relax its censorship
efforts. The letter reminded Facebook that some lawmakers are watching the company
“as part of our ongoing oversight efforts” — and suggested they may be forced to
exercise that oversight into any move by Facebook to “alter or rollback certain
misinformation policies.” This is only the latest such warning. In prior hearings, social
media executives were repeatedly warned about any failure to remove viewpoints
considered “disinformation.” For example, in a November 2020 Senate hearing, then-
Twitter CEO Jack Dorsey apologized for censoring the Hunter Biden laptop story. But
Sen. Richard Blumenthal, D-Conn., warned that he and his Senate colleagues would not
tolerate any “backsliding or retrenching” by “failing to take action against dangerous
disinformation.”
49
Senators demanded increased censorship in areas ranging from the
pandemic to elections to climate change.
These warnings do not necessarily mean that a court would find that executives
were carrying out government priorities. An investigation is needed to fully understand
the coordination and the communications between the government and these companies.
In Brentwood Academy v. Tennessee Secondary School Athletic Assn.,
50
the Supreme
Court noted that state action decisions involving such private actors are highly case
specific:
47
National Rifle Association of America v. Vullo, 49 F.4th 700, 715 (2d Cir. 2022).
48
Id. (quoting Hammerhead Enters., Inc. v. Brezenoff, 707 F.2d 33, 39 (2d Cir. 1983)).
49
Misinformation and Disinformation on Online Platforms: Hearing Before the Subcomm. on
Commc’n & Tech. and Subcomm. on Consumer Protection of the H. Comm. on Energy & Com., 117th Cong.
(2021).
50
531 U.S. 288 (2001).
What is fairly attributable is a matter of normative judgment, and the criteria lack
rigid simplicity. From the range of circumstances that could point toward the
State behind an individual face, no one fact can function as a necessary condition
across the board for finding state action; nor is any set of circumstances
absolutely sufficient, for there may be some countervailing reason against
attributing activity to the government…
Our cases have identified a host of facts that can bear on the fairness of such an
attribution. We have, for example, held that a challenged activity may be state
action when it results from the State’s exercise of “coercive power,” …when the
State provides “significant encouragement, either overt or covert,” … or when a
private actor operates as a “willful participant in joint activity with the State or its
agents,” … We have treated a nominally private entity as a state actor when it is
controlled by an “agency of the State,” … when it has been delegated a public
function by the State, … when it is “entwined with governmental policies,” or
when government is “entwined in [its] management or control.”
51
Obviously, many of these elements appear present. However, the Twitter Files also show
executives occasionally declining to ban posters targeted by the government. They also
show such pressure coming from the legislative branch. For example, the Twitter Files
reveal censorship requests from at least one member
targeting a columnist and critic. Twitter declined, and one of its employees simply wrote,
“no, this isn’t feasible/we don’t do this.”
52
There were also requests from Republicans to
Twitter for action against posters, including allegedly one from the Trump White House
to take down content.
53
We simply do not know the extent of what companies like Twitter did do, nor
for whom. We do not know how often demands were declined when material was flagged by CISA, the FBI,
or other agencies. The report from Twitter reviewers selected by Elon Musk suggests that
most requests coming from the Executive Branch were granted. That is one of the areas
that could be illuminated by this select subcommittee. The investigation may be able to
supply the first comprehensive record of the government efforts to use these companies
to censor speech. It can pull back the curtain on America’s censorship system so that both
Congress and the public can judge the conduct of our government.
Whether the surrogate censorship conducted by social media companies is a form
of government action may be addressed by the courts in the coming years. However,
certain facts are well-established and warrant congressional action. First, while these
companies and government officials prefer to call it “content moderation,” these
companies have carried out the largest censorship system in history, effectively
51
Id. at 296.
52
Jonathan Turley, We Don’t Do This: Twitter Censors Rejected Adam Schiff’s Censorship
Request, THE HILL, Jan. 5, 2023, https://thehill.com/opinion/judiciary/3800380-we-dont-do-this-even-
twitters-censors-rejected-adam-schiffs-censorship-request/.
53
This included the Trump White House allegedly asking to take down derogatory tweets from the
wife of John Legend after the former president attacked the couple. Moreover, some Trump officials
supported efforts to combat foreign interference and false information on social media. It has been reported
that Twitter has a “database” of Republican demands. Adam Rawnsley and Asawin Suebsaeng, Twitter Kept
Entire “Database” of Republican Requests to Censor Posts, ROLLING STONE, Feb. 8, 2023,
https://www.rollingstone.com/politics/politics-news/elon-trump-twitter-files-collusion-biden-censorship-
1234675969/.
governing the speech of billions of people. The American Civil Liberties Union, for
example, maintains that censorship applies to both government and private actions. It is
defined as “the suppression of words, images, or ideas that are ‘offensive,’ [and] happens
whenever some people succeed in imposing their personal political or moral values on
others.”
54
Adopting Orwellian alternative terminology does not alter the fact that these
companies are engaging in the systemic censoring of viewpoints on social media.
Second, the government admits that it has supported this massive censorship
system. Even if the censorship is not deemed government action for the purposes of the
First Amendment, it is now clear that the government has actively supported and assisted
in the censorship of citizens. Objecting that the conduct of government officials may not
qualify under the First Amendment does not answer the question of whether members
believe that the government should be working for the censorship of opposing or
dissenting viewpoints. During the McCarthy period, the government pushed blacklists for
suspected communists and the term “fellow travelers” was rightfully denounced
regardless of whether it qualified as a violation of the First Amendment. Even before Joe
McCarthy launched his un-American activities hearings, the Justice Department created
an effective blacklist of organizations called “Attorney General’s List of Subversive
Organizations” (AGLOSO) that was then widely distributed to the media and the public.
It became the foundation for individual blacklists.
55
The maintenance of the list fell to the
FBI. Ultimately, blacklisting became the norm with both legislative and executive
officials tagging artists, writers, and others. As Professor Geoffrey Stone observed,
“Government at all levels hunted down ‘disloyal’ individuals and denounced them.
Anyone so stigmatized became a liability to his friends and an outcast to society.”
56
At
the time, those who raised the same free speech objections were also attacked as “fellow
travelers” or “apologists” for communists. It was wrong then and it is wrong now. It was
an affront to free speech values that have long been at the core of our country. It is not
enough to say that the government is merely seeking the censorship of posters like any
other user. There are many things that are more menacing when done by the government
rather than individuals. Moreover, the government is seeking to silence certain speakers
in our collective name and using tax dollars to do so. The FBI and other agencies have
massive powers and resources to amplify censorship efforts. The question is whether
Congress and its individual members support censorship, whether carried out by corporate
or government officials on social media platforms.
57
Third, the government is engaged in targeting users under the ambiguous
mandates of combating disinformation or misinformation. These are not areas
traditionally addressed by public affairs offices to correct false or misleading statements
54
American Civil Liberties Union, What is Censorship?, https://www.aclu.org/other/what-
censorship.
55
Robert Justin Goldstein, Prelude to McCarthyism, PROLOGUE MAGAZINE, Fall 2006,
https://www.archives.gov/publications/prologue/2006/fall/agloso.html. Courts pushed back on the listing to
require some due process for those listed.
56
Geoffrey R. Stone, Free Speech in the Age of McCarthy: A Cautionary Tale, 93 CALIF. L. REV.
1387, 1400 (2005).
57
The distinction between these companies from other corporate entities like the NFL or Starbucks
is important. There is no question that businesses can limit speech on their premises and by their own
employees. However, these companies constitute the most popular communication platforms in the
country. They are closer to AT&T than Starbucks in offering a system of communication.
made about an agency’s work. The courts have repeatedly said that agencies are allowed
to speak in their own voices without viewpoint neutrality.
58
As the Second Circuit stated,
“[w]hen it acts as a speaker, the government is entitled to favor certain views over
others.”
59
Here, by contrast, the effort was to secretly silence others. Courts have emphasized that “[i]t
is well-established that First Amendment rights may be violated by the chilling effect of
governmental action that falls short of a direct prohibition against speech.”
60
These public
employees were deployed to monitor and target users spreading “disinformation” on a
variety of subjects, from election fraud to government corruption. The Twitter Files show
how this mandate led to an array of abuses, from targeting jokes to barring opposing
scientific views.
These facts already warrant bipartisan action from Congress. Free speech
advocates have long opposed disinformation mandates as an excuse or invitation for
public or private censorship. I admittedly subscribe to the view that the solution to bad
speech is better speech, not speech regulation.
61
Justice Brandeis embraced the view of
the Framers that free speech was its own protection against false statements: “If there be
time to discover through discussion the falsehood and the fallacies, to avert the evil by
the processes of education, the remedy to be applied is more speech not enforced
silence.”
62
We have already seen how disinformation was used to silence dissenting
views on subjects like mask efficacy and Covid policies such as school closures, views that
are now being recognized as legitimate.
We have also seen how claims of Russian trolling operations may have been
overblown in their size or their impact. Indeed, even some Twitter officials ultimately
concluded that the FBI was pushing exaggerated claims of foreign influence on social
media.
63
The Twitter Files refer to sharp messages from the FBI when Twitter failed to
find evidence supporting the widely reported foreign trolling operations. One Twitter
official referred to finding “no links to Russia.” This was not for want of trying. Spurred
on by the FBI, another official promised “I can brainstorm with [redacted] and see if we
can dig even deeper and try to find a stronger connection.” The pressure from the FBI led
Roth to tell his colleagues that he was “not comfortable” with the agenda of the FBI and
said that its demands seemed “more like something we’d get from a congressional
committee than the Bureau.”
The danger of censorship is not solely a concern of one party. To his great credit,
Rep. Ro Khanna (D., Cal.) said in October 2020 that he was appalled by the censorship
58
Pleasant Grove City v. Summum, 555 U.S. 460, 467-68 (2009); Johanns v. Livestock Mktg.Assn,
544 U.S. 550, 553 (2005).
59
Wandering Dago, Inc. v. Destito, 879 F.3d 20, 34 (2d Cir. 2018).
60
Zieper v. Metzinger, 474 F.3d 60, 65 (2d Cir. 2007).
61
See generally Jonathan Turley, Harm and Hegemony: The Decline of Free Speech in the United
States, 45 HARV. J.L. & PUB. POLY 571 (2022).
62
Whitney, 274 U.S. at 375, 377.
63
In his testimony, Roth stated that they found substantial Russian interference impacting the
election. Protecting Speech from Government Interference and Social Media Bias, Part 1: Twitter’s Role in
Suppressing the Biden Laptop Story: Hearing Before the H. Comm. on Oversight & Accountability, 118th
Cong. (2023) (statement of Yoel Roth, Former Head of Trust and Safety, Twitter). That claim stands in
conflict with other studies and reports, but it can also be addressed as part of the investigation into these
communications.
and was alarmed by the apparent “violation of the 1st Amendment principles.”
64
Congress can bar the use of federal funds for such disinformation offices. Such
legislation can require detailed reporting on agency efforts to ban or block public
comments or speech by citizens. Even James Baker told the House Oversight Committee
that there may be a need to pass legislation to limit the role of government officials in
their dealings with social media companies.
65
Legislation can protect the legitimate role
of agencies in responding to and disproving statements made about their own programs or
policies. It is censorship, not disinformation, that has damaged our nation in recent years.
Free speech, like sunshine, can be its own disinfectant. In Terminiello v. City of Chicago,
the Supreme Court declared that:
The right to speak freely and to promote diversity of ideas . . . is . . . one of the
chief distinctions that sets us apart from totalitarian regimes . . . [A] function
of free speech under our system of government is to invite dispute. . . . Speech is
often provocative and challenging. . . [F]reedom of speech, though not absolute, is
nevertheless protected against censorship.
66
Disinformation does cause divisions, but the solution is not to embrace government-
corporate censorship. The government effort to reduce speech does not solve the problem
of disinformation. It does not change minds but simply silences voices in national
debates.
V. CONCLUSION
There is obviously a deep division in Congress over censorship, with many
members supporting the efforts to blacklist and remove certain citizens or groups from
social media platforms. That is a debate that many of us in the free speech community
welcome. However, let it be an honest and open debate. The first step in securing such a
debate is to support transparency on the full extent of these efforts by federal agencies.
The second step is to allow these questions to be discussed without attacking journalists
and witnesses who come to Congress to share their own concerns over the threat to both
free press and free speech values. Calling reporters “so-called journalists” or others
“Putin lovers” represents a return to the rhetoric used against free speech advocates during
the Red Scare.
67
We are better than that as a country and our Constitution demands more
from this body. If members want to defend censorship, then do so with the full record
before the public on the scope and standards of this government effort.
64
Democratic Rep. Ro Khanna Expressed Concerns Over Twitter’s Censorship of Hunter Biden
Laptop, FOX NEWS, Dec. 2, 2022, https://www.foxnews.com/politics/democratic-rep-ro-khanna-expressed-
concerns-twitters-censorship-hunter-biden-laptop-story.
65
Protecting Speech from Government Interference and Social Media Bias, Part 1: Twitter’s Role in
Suppressing the Biden Laptop Story: Hearing Before the H. Comm. on Oversight & Accountability, 118th
Cong. (2023) (statement of James Baker, Former General Counsel, FBI).
66
Terminiello v. City of Chicago, 337 U.S. 1, 4 (1949) (citations omitted).
67
Jonathan Turley, Is the Red Scare Turning Blue?, Res Ipsa Blog (www.jonathanturley.org), Feb. 12,
2023, https://jonathanturley.org/2023/02/12/is-the-red-scare-going-blue-democrats-accuse-government-
critics-of-being-putin-lovers-and-supporting-insurrectionists/.
The public understands the threat to free speech and strongly supports an
investigation into the FBI’s role in censoring social media. Despite the push for
censorship by some politicians and pundits, most Americans still want free-speech
protections. It is in our DNA. This country was founded on deep commitments to free
speech and limited government – and that constitutional tradition is no conspiracy theory.
Polls show that 73% of Americans believe that these companies censored material for
political purposes.
68
Another poll showed that 63% want an investigation into FBI
censorship allegations.
69
Adlai Stevenson famously warned of this danger: “Public confidence in the
integrity of the Government is indispensable to faith in democracy; and when we lose
faith in the system, we have lost faith in everything we fight . . . for.” Senator
Stevenson’s words should resonate on both sides of our political divide so that we
might, even now, find common ground and common purpose. The loss of faith in our
government creates political instabilities and vulnerabilities in our system. Moreover,
regardless of party affiliation, we should all want answers to these questions. We
can differ on our conclusions, but the first step for Congress is to force greater
transparency on controversies involving bias and censorship. One of the greatest values of
oversight is to allow greater public understanding of the facts behind government actions.
Greater transparency is the only course that can help resolve the doubts that many have
over the motivations and actions of their government. I remain an optimist that it is still
possible to have a civil and constructive discussion of these issues. Regardless of our
political affiliations and differences, everyone in this room is here because of a deep love
and commitment to this country. It is what brought us together from vastly different
backgrounds and areas of our country. We share a single article of faith in our
Constitution and the values that it represents. We are witnessing a crisis of faith today
that must be healed for the good of our entire nation. The first step toward that healing is
an open and civil discussion of the concerns that the public has with our government. We
can debate what measures are warranted in light of any censorship conducted with
government assistance. However, we first need to get a full and complete understanding
of the relationship between federal agencies and these companies in the removal or
suspension of individuals from social media. At a minimum, both parties should be able to
support the full disclosure of past government conduct and communications with these
companies.
Once again, thank you for the honor of appearing before you to discuss these
important issues, and I would be happy to answer any questions from the Committee.
Jonathan Turley
J.B. & Maurice C. Shapiro Chair of Public Interest Law
George Washington University
68
Sean Burch, Nearly 75% of Americans Believe Twitter, Facebook Censor Posts Based on
Viewpoints, Pew Finds, THE WRAP, Aug. 19, 2020, https://www.thewrap.com/nearly-75-percent-twitter-
facebook-censor/.
69
63% Want FBI’s Social Media Activity Investigated, RASMUSSEN REPORTS, Dec. 26, 2022,
https://www.rasmussenreports.com/public_content/politics/partner_surveys/twittergate_63_want_fbi_s_soc
ial_media_activity_investigated.
CENSORSHIP CAN BE DEADLY
Censorship can be deadly. Freedom of speech is always important, but it is especially important during a
national emergency such as a pandemic. No authority is infallible, and when a new virus emerges, it is
impossible for politicians and public health officials to get things right without listening to discussions
between a wide cast of scientists with different areas of expertise and thoughts.
I am an epidemiologist, a biostatistician and a professor of medicine at Harvard, on leave. For over two
decades I have done research on the detection and monitoring of infectious disease outbreaks and on the
safety evaluation of vaccines and drugs. I helped build the nation’s disease surveillance systems. Despite
this, I was censored and blacklisted during the pandemic, by Twitter, LinkedIn, YouTube and Facebook.
In early 2020, we already knew from Wuhan data that there is more than a thousand-fold difference in
Covid mortality between the old and the young
1
. During the pandemic, we failed to adequately protect older Americans while school closures and other
lockdown measures generated enormous collateral public health damage that we now must live with, and
die from, for years to come.
During the spring of 2020, Sweden was the only major Western country to keep schools and daycare open
for children ages 1 to 15. Among those 1.8 million children, there were zero covid deaths and the covid
risk for teachers was less than the average of other professions
2
. This showed that it was safe to keep schools open. It was important for America to know that, but a
July 2020 New England Journal of Medicine article on school closure did not even mention Sweden
3
. That’s like reporting on a new medical treatment without including information from the comparison
control group.
Unable to publish my thoughts about the pandemic in US English language media, in the summer of 2020
I used my Twitter account to share the Swedish data and argue for open schools. But in July 2020, Twitter
put me on their “trends blacklist” to limit the reach of my open school posts
4
.
In October 2020, I authored the Great Barrington Declaration with two fellow epidemiologists, Dr. Sunetra
Gupta at Oxford and Dr. Jay Bhattacharya at Stanford
5
. We argued for better protection of high-risk older people while keeping schools open and letting young
people live more normal lives. This was shadow banned by Google and censored by Reddit
6
. After posting in favor of prioritizing the elderly for vaccination, our Facebook page was “unpublished”.
(Figure 1)
In March 2021, Twitter censored a post when I wrote that “Thinking that everyone must be vaccinated is
as scientifically flawed as thinking that nobody should. COVID vaccines are important for older high-risk
people and their care-takers. Those with prior natural infection do not need it. Nor children.” Twitter falsely
claimed that the tweet was misleading, and it could not be replied to, shared or liked. We have known
about infection-acquired immunity since the Athenian Plague in 430 BC, and the questioning, denial and
censoring of such natural immunity is the most stunning denial of scientific facts during the pandemic.
1
Kulldorff M. COVID-19 Counter Measures Should be Age Specific. LinkedIn, April 10, 2020.
2
Public Health Agency of Sweden. Covid-19 in schoolchildren – A comparison between Finland and Sweden, July 7, 2020.
3
Levinson M, Cevik M, Lipsitch M. Reopening Primary Schools during the Pandemic. New England Journal of Medicine, July 29, 2020.
4
Bhattacharya J. What I discovered at Twitter headquarters, UnHerd, December 22, 2022.
5
Bhattacharya J, Gupta S, Kulldorff M. Great Barrington Declaration, October 4, 2020.
6
Young T. Why can’t we talk about the Great Barrington Declaration? The Spectator, October 17, 2020.
With deadly consequences. At a time when vaccines were in short supply, we were vaccinating young
adults and people with natural immunity, who did not need it, before many older Americans whose lives
could have been saved by it. (Figure 2)
Through randomized studies and reviews
7
8
, we know that face masks provide only marginal or no protection against Covid. A randomized study in
Denmark showed no significant benefit
9
while a Yale University study conducted in Bangladesh showed a reduction between 0 and 18 percent
10
. It is then dangerous to make older high-risk Americans believe that masks will protect them when they
will not, as they may go to crowded restaurants or supermarkets thinking that their mask is keeping them
safe. In May 2021 I was temporarily suspended by Twitter for three weeks for writing that: “Naively fooled
to think that masks would protect them, some older high-risk people did not socially distance properly, and
some died from Covid19 because of it. Tragic. Public health officials/scientists must always be honest with
the public”. (Figure 3)
Twitter also censored me for quoting and linking to an article about masks written by an Associate
Professor of Black Studies at the University of California. (Figure 4)
In April 2021 I participated in a scientific roundtable hosted by Florida governor Ron DeSantis. This
roundtable was censored by YouTube after being posted by a CBS affiliated television station in Florida
11
. YouTube is owned by Google.
Many nurses were infected while heroically taking care of covid patients, and some of them were later
fired for not taking a vaccine even though they had stronger immunity than the vaccinated. In October
2021, I wrote an article urging hospitals to hire instead of fire nurses with natural immunity, as they are
the least likely to infect older frail patients
12
. That was censored by LinkedIn, which is owned by Microsoft
13
. Hospitals and nursing homes could have better protected patients if they had actively hired personnel
with infection acquired immunity. That would have saved lives.
LinkedIn censored me multiple times, one other example being a repost of an interview with the chief
epidemiologist of Iceland
14
15
(Figures 5-9). That is, LinkedIn censored not only public health academics but also government public
health officials who did not conform to LinkedIn’s view on the pandemic.
7
Jefferson T, et al. Physical interventions to interrupt or reduce the spread of respiratory viruses. Cochrane Library, January 30, 2023.
8
Liu AT, Prasad V, Darrow JJ. Evidence for Community Cloth Face Masking to Limit the Spread of SARS-CoV-2: A Critical Review, Cato Working Paper, November 8, 2021.
9
Bundgaard H, et al. Effectiveness of Adding a Mask Recommendation to Other Public Health Measures to prevent SARS-CoV-2 Infection in Danish Mask Wearers: A Randomized Controlled Trial. Annals of Internal Medicine, November 18, 2020.
10
Abaluck J, et al. Impact of community masking on COVID-19: A cluster-randomized trial in Bangladesh. Science, December 2, 2021.
11
Wilson K, Ross A. YouTube removes video of DeSantis coronavirus roundtable. Tampa Bay Times, April 9, 2021.
12
Kulldorff M. Hospitals Should Hire, not Fire, Nurses with Natural Immunity. Brownstone Institute, October 1, 2021.
13
Harvard Epidemiologist Censored by LinkedIn for Defending Healthcare Jobs. Brownstone Institute, October 4, 2021.
14
LinkedIn Censors Harvard Epidemiologist Martin Kulldorff, Brownstone Institute, August 12, 2021.
15
Tucker JA. Kulldorff Deleted: Famed Epidemiologist and Early Opponent of Lockdowns Banned by LinkedIn, Brownstone Institute, January 28, 2022.
As a leading expert on vaccine safety, CDC asked me to serve on their COVID-19 Vaccine Safety Technical
Work Group. In April 2021, CDC fired me from that group
16
. If you think I was fired for questioning the vaccines, you are wrong. I am probably the only person fired
by CDC for being too pro-vaccine. On April 13, 2021, CDC paused the Johnson & Johnson vaccine after
reports of blood clots in a few women under age 50. There were no reported cases among older people,
who benefit the most from the vaccines. Since there was a general vaccine shortage at the time, I argued
in an op-ed in The Hill that the vaccine should not be paused for older high-risk Americans
17
. That got me fired, although CDC did lift the pause four days later. Tragically, some older Americans died
because of this vaccine “pause”.
The primary victim of censorship is not me and others being censored, but the public. As politicians, to
properly serve your constituents, you have both the right to hear from and a duty to listen to a range of
scientists. The public also has that right. For example, how many of you knew that Sweden kept their
schools open in the spring of 2020 without a single covid mortality among its 1.8 million children? How
many of you know now, that for 2020-2022 Sweden’s focused protection strategy led to the lowest excess
mortality among western countries?
18
Censorship deprives both you and the public of vital information needed to save lives.
Censorship inevitably leads to self-censoring. Some of my public health colleagues did not speak up for
fear of being censored, silenced or slandered, like I was. I don’t blame them. I was also forced to self-
censor, to avoid being permanently banned from social media. (Figure 10)
I have a question for you: Do we have freedom of speech because of the First Amendment or do we have
the First Amendment because freedom of speech is important to preserve society and life?
When the Bill of Rights was written, Americans had lived through troubled times. They did not use that as
an excuse for censorship. I think they understood that freedom of speech is especially important during
difficult times when difficult decisions must be made. I hope that the 118th Congress is just as wise as the
1st Congress was when it adopted the First Amendment as part of the Bill of Rights.
Thank you for listening.
Martin Kulldorff
16
Pullman J. CDC Punishes ‘Superstar’ Scientist For COVID Vaccine Recommendation The CDC Followed 4 Days Later, The Federalist, April 28, 2021.
17
Kulldorff M. The dangers of pausing the J&J vaccine. The Hill, April 17, 2021.
18
Simmons M. Sweden, Covid and ‘excess deaths’: a look at the data. The Spectator, March 10, 2023.
_________________________________________________________________________________
FIGURES
Figure 1: Tweet with Screenshot of “Unpublished” Facebook Page.
Figure 2: Twitter, March 2021
Figure 3: Twitter, May 2021
Figure 4: Twitter, November 2021
Figure 5: LinkedIn, August 2021
Figure 6: LinkedIn, August 2021
Figure 7: LinkedIn, January 2022
Figure 8: LinkedIn, January 2022
Figure 9: LinkedIn Last Post Before Account Suspension, January 2022
Figure 10: Twitter, March 2022
Censorship Laundering: How the U.S. Department of Homeland Security Enables the
Silencing of Dissent
Subcommittee on Oversight, Investigations, and Accountability
Committee on Homeland Security
United States House of Representatives
May 11, 2023
Statement for the Record
Benjamin Weingarten
Investigative Journalist & Columnist
I. Introduction
Chairman Bishop, Ranking Member Ivey, and members of the Subcommittee, thank you for the
opportunity to testify today.
1
It is an honor and a privilege to appear before you to discuss the
Department of Homeland Security’s (DHS) enabling of the silencing of dissent.
Government’s first charge is to defend the life and limb of the governed. DHS generally, and the
Cybersecurity and Infrastructure Security Agency (CISA) specifically, have vital roles to play in
this regard. Given the criticality of their mission to protect the homeland, these agencies must be
held to exacting standards. Should they experience mission creep, wielding powers in ways that violate
the constitutional rights they are meant to secure for all Americans, such conduct compels good-faith
scrutiny. I offer today’s testimony in this spirit.
Our republic rests on the inalienable right to free speech. That right is currently under assault by
those working to consign their political foes to the digital gulag “in defense of our democracy.”
Disturbingly, the federal government itself appears to be a key culprit. Overwhelming evidence
2
suggests that federal agencies led by, among others, CISA
3
,
4
buoyed by senior executive
branch officials and lawmakers, colluding with Big Tech, and a coterie of often government-
coordinated and government-funded
5
“counter-disinformation” organizations, have imposed
nothing less than a mass public-private censorship
6
regime on the American people.
1
I appear today on my own behalf, and my views do not necessarily reflect those of the media or other
organizations with which I am affiliated.
2
See generally Missouri v. Biden and Special Assistant Attorney General for the Louisiana
Department of Justice D. John Sauer’s related testimony before the House Judiciary Committee’s
Weaponization Subcommittee at https://judiciary.house.gov/sites/evo-subsites/republicans-
judiciary.house.gov/files/2023-03/Sauer-Testimony.pdf; Hines v. Stamos; and
https://report.foundationforfreedomonline.com/11-9-22.html.
3
DHS’ Inspector General has reported that the agency’s Office of Intelligence and Analysis (I&A) was also
involved in counter-disinformation efforts during the 2020 election season. Other DHS components in the
last several years have also worked to “counter disinformation originating from foreign and domestic
sources.” [Emphasis mine] See https://www.oig.dhs.gov/sites/default/files/assets/2022-08/OIG-22-58-
Aug22.pdf#page=7. These efforts extend to other agencies including the Federal Bureau of Investigations
(FBI), Department of Justice (DOJ), and Office of the Director of National Intelligence (ODNI). Senior
executive branch officials and federal lawmakers have also publicly and privately exerted pressure on
social media companies to more aggressively police speech – at times under threat of adverse regulatory or
legislative action. See generally Missouri v. Biden; https://www.wsj.com/articles/save-the-constitution-
from-big-tech-11610387105; https://www.newsweek.com/taxpayer-dollars-must-not-fund-government-led-
censorship-regime-opinion-1792828.
4
https://foundationforfreedomonline.com/dhs-censorship-agency-had-strange-first-mission-banning-
speech-that-casts-doubt-on-red-mirage-blue-shift-election-events/.
5
https://foundationforfreedomonline.com/bidens-national-science-foundation-has-pumped-nearly-40-
million-into-social-media-censorship-grants-and-contracts/.
6
I use censorship herein broadly to encompass terminating speakers’ accounts, deplatforming speakers,
temporarily suspending accounts, imposing warnings or strikes against accounts to chill future disfavored
speech, “shadow banning” speakers, demonetizing content or speakers, adjusting algorithms to suppress or
de-emphasize speakers or messages, deboosting speakers or content, promoting or demoting content,
placing warning labels or explanatory notes on content, suppressing content in other users’ feeds,
promoting negative comments on disfavored content, and requiring additional click-through(s) to access
content, and other methods,” as plaintiffs in Hines v. Stamos define it. See: https://aflegal.org/wp-
content/uploads/2023/05/Doc-1-Complaint.pdf#page=9.
Authorities, led by the federal government, tell us this censorship is for our own good – that we
suffer from a pandemic of “mis-, dis-, and mal-information” (MDM);
7
that MDM fuels domestic
terrorism;
8
,
9
and therefore that America must undertake a whole-of-society effort to combat
MDM.
10
For its part, the censorship regime has equated MDM with Wrongthink – dissenting
opinions from its orthodoxy, and even facts
11
inconvenient to its agenda, on an ever-growing
number of subjective and contentious issues.
12
It conflates, cynically and purposefully, genuine
political difference with “extremism,” which it links to danger and violence to justify speech
policing.
13
In turn, the regime has surveilled the wide expanse of the digital public square to
identify such Wrongthink, and proceeded to suppress it under guise of national security and
public health.
14
Notably, the public-private speech police have targeted, for example, skepticism about the
integrity of mass mail-in balloting that used to be shared on a bipartisan basis and was never
linked to “domestic violent extremism;”
15
and skepticism about COVID-19 mitigation efforts that
often proved not only justified, but which in some instances, if more widely heard and
understood, might have saved lives and liberties. Given authorities have asserted, but not
necessarily established, a clear and compelling nexus between the mere expression of such views
and widespread or dire threats of violence – and certainly not threats justifying suspension of the
First Amendment to quell them, for which this non-lawyer witness finds little precedent; and
given that authorities show little equivalent concern or zeal for suppressing a virtually limitless
array of other views that can be linked to violence – anti-cop sentiment to attacks on law
7
CISA has defined “Misinformation” as that which “is false, but not created or shared with the intention of
causing harm.” It has defined “Disinformation” as that which “is deliberately created to mislead, harm, or
manipulate a person, social group, organization, or country.” It has defined “Malinformation” as that which
“is based on fact, but used out of context to mislead, harm, or manipulate.” See:
https://www.cisa.gov/sites/default/files/publications/mdm-incident-response-guide_508.pdf. Setting aside
the question of who is to be the arbiter of truth in CISA’s MDM paradigm, on what grounds, and whether
and to what extent government ought to intervene accordingly, the matter of intent baked into these
definitions makes MDM a largely subjective concept.
8
https://www.dhs.gov/ntas/advisory/national-terrorism-advisory-system-bulletin-february-07-2022.
9
https://storage.courtlistener.com/recap/gov.uscourts.lawd.189520/gov.uscourts.lawd.189520.268.0.pdf#pag
e=88.
10
See for example the Biden administration’s National Strategy for Countering Domestic Terrorismat
https://www.whitehouse.gov/wp-content/uploads/2021/06/National-Strategy-for-Countering-Domestic-
Terrorism.pdf.
11
https://nypost.com/2023/03/17/private-federal-censorship-machine-targeted-true-misinformation/.
12
The targeting began largely with a focus on skepticism of the integrity and outcome of the 2020 election;
it expanded to encompass derogatory views to those of federal authorities – including those ultimately
proving true and even known to be true contemporaneously – concerning virtually every aspect of COVID-
19, and particularly around mitigation efforts and their efficacy; since, federal officials have shown their
intent to expand such targeting to cover “abortion, climate-related speech, gendered disinformation,
economic policy, the financial services industry, the U.S. withdrawal from Afghanistan, the war in Ukraine,
and other[]” topics, per recent testimony from litigation counsel in Missouri v. Biden, Special Assistant
Attorney General for the Louisiana Department of Justice D. John Sauer. See:
https://judiciary.house.gov/sites/evo-subsites/republicans-judiciary.house.gov/files/2023-03/Sauer-
Testimony.pdf.
13
https://www.newsweek.com/biden-domestic-terror-strategy-codifies-woke-war-wrongthink-opinion-
1605341.
14
https://foundationforfreedomonline.com/dhs-censorship-agency-had-strange-first-mission-banning-
speech-that-casts-doubt-on-red-mirage-blue-shift-election-events/.
15
https://www.wsj.com/articles/heed-jimmy-carter-on-the-danger-of-mail-in-voting-11586557667.
enforcement and widespread riots,
16
pro-abortion sentiment to attacks on pro-life centers and
threats to judges,
17
environmentalist sentiment to attacks on relevant targets by eco-terrorists, etc. – this indicates
the speech-muzzling is rooted in politics, not the public good. Understood in this light, the
censorship regime’s efforts start to look like they are intended more for its own benefit than ours.
The regime’s systematic speech-stifling, targeting core political speech and intensifying during
recent federal election cycles, seems tantamount to a conspiracy to violate the First Amendment,
18
viewpoint discrimination, and running domestic election interference – ironically borne of claims
of foreign election interference.
In short, Americans have unknowingly and unwittingly been paying unelected and unaccountable
bureaucrats to, directly and by proxy, silence ourselves.
CISA has been described as a “nerve center” of these federal government-led censorship efforts.
It has served as a key facilitator of, and participant in, meetings between federal authorities and
technology companies aimed at encouraging the latter to combat purported misinformation and
disinformation. It has served as a clearinghouse for social media content flagged for censorship
by third parties – governmental and non-governmental – relaying the parties’ censorship requests
on to social media companies, and flagged perceived problematic speech for the platforms
directly.
19
And it has helped foster the development of the broader public-private censorship
architecture through consulting, partnering with, and networking often government-linked third-
party organizations to themselves serve as First Amendment-circumventing,
20
mass-surveillance
and mass-censorship clearinghouses for content flagged by, among others, government partners.
21
It is perhaps incalculable how many people have been bereft of their right to speak, and listen, by
way of these censorship efforts – and at what cost.
Remarkably, we would know little of such efforts were it not for a billionaire’s decision to
purchase a social media platform,
22
and then empower a handful of journalists to expose the
government-tied censorship efforts in which it had been implicated;
23
and the legal action of
vigilant state attorneys general, who, alongside the silenced, sued implicated federal authorities,
and through discovery began to untangle this twisted censorship web.
24
16
https://archive.is/SA9H1.
17
https://www.realclearinvestigations.com/articles/2023/01/25/frustrated_by_police_inaction_the_pro-
life_movement_takes_up_the_work_of_law_enforcement_877348.html.
18
https://storage.courtlistener.com/recap/gov.uscourts.lawd.189520/gov.uscourts.lawd.189520.268.0.pdf#pag
e=7.
19
https://storage.courtlistener.com/recap/gov.uscourts.lawd.189520/gov.uscourts.lawd.189520.214.1_1.pdf#p
age=278.
20
As Justice Clarence Thomas wrote in his concurring opinion in Biden v. Knight, a private entity violates
the First Amendment if the government coerces or induces it to take action the government itself would
not be permitted to do, such as censor expression of a lawful viewpoint.” Further, “The government cannot
accomplish through threats of adverse government action what the Constitution prohibits it from doing
directly.” See: https://www.supremecourt.gov/opinions/20pdf/20-197_5ie6.pdf#page=11.
21
https://ago.mo.gov/docs/default-source/press-releases/212-3-proposed-findings-of-
fact.pdf?sfvrsn=739f8cbf_2.
22
https://www.wsj.com/articles/elon-musk-completes-twitter-takeover-11666918031.
23
https://www.racket.news/p/capsule-summaries-of-all-twitter.
24
See Missouri v. Biden.
As its role in the censorship regime has started to come into focus, CISA has gone about
scrubbing evidence of its associated efforts;
25
it has reorganized related entities;
26
and it has
stonewalled congressional investigators
27
– while maintaining that, as the agency’s Director, Jen
Easterly put it in recent congressional testimony, “We don’t censor anything” or “flag anything
for social media organizations at all.”
28
It is hard to fully square this position with what we have learned to date. Congress can and should
help resolve this seeming dispute. At minimum, the troubling evidence suggests the national
security apparatus’s apparent interest in Americans’ speech warrants oversight, without which, if
merited, there can be no accountability and reform. This Subcommittee’s engagement, therefore,
alongside other committees with relevant jurisdiction,
29
is most welcome and necessary. It is also
most urgent, with the 2024 elections looming, censorship tools becoming more sophisticated and
powerful,
30
and the censorship regime’s ambitions only growing – alongside its footprint.
31
,
32
,
33
To help inform this Subcommittee’s efforts, I will briefly address how CISA came to take on a
pivotal role in this censorship regime, detail its associated actions, and offer recommendations for
further oversight.
II. How CISA Became a “Nerve Center” of America’s Censorship Regime
The plaintiffs in the landmark pending case, Missouri v. Biden, allege, and have revealed a trove
of information substantiating the claim that there is a massive, sprawling federal “Censorship
Enterprise,” which includes dozens of federal officials across at least eleven federal agencies and
components, who communicate with social-media platforms about misinformation,
disinformation, and the suppression of private speech on social media – all with the intent and
effect of pressuring social-media platforms to censor and suppress private speech that federal
officials disfavor,” in violation of the First Amendment.
34
The plaintiffs identify CISA
specifically as a “nerve center” of federal government-led speech policing, which began in
earnest in the run-up to the 2020 election.
35
Several key developments help to explain how a DHS sub-agency tasked with preventing
cyberattacks and defending physical infrastructure would come to occupy a central role in this
censorship effort. Among them are that: (i) Donald Trump won the 2016 presidential election. (ii)
His victory came to be seen by many as being enabled by (a) Social media and (b) Russian
25
https://foundationforfreedomonline.com/flash-report-dhs-quietly-purges-cisa-mis-dis-and-
malinformation-website-to-remove-domestic-censorship-references-2/.
26
https://www.racket.news/p/homeland-security-reorganizes-appearing.
27
https://judiciary.house.gov/sites/evo-subsites/republicans-judiciary.house.gov/files/evo-media-
document/2023-04-28-jdj-to-easterly-cisa-subpoena-cover-letter.pdf.
28
https://www.youtube.com/watch?v=JnbWb5ZFN8s&t=4673s.
29
https://judiciary.house.gov/media/press-releases/chairman-jordan-subpoenas-cdc-cisa-and-gec-
documents-and-communications.
30
https://foundationforfreedomonline.com/the-national-science-foundations-convergence-accelerator-track-
f-is-funding-domestic-censorship-superweapons/.
31
https://twitter.com/shellenberger/status/1651355243722973186?s=20.
32
https://twitter.com/DFRLab/status/1654500447816654849?s=20.
33
https://theintercept.com/2023/05/05/foreign-malign-influence-center-disinformation/.
34
https://storage.courtlistener.com/recap/gov.uscourts.lawd.189520/gov.uscourts.lawd.189520.268.0.pdf/
35
https://ago.mo.gov/docs/default-source/press-releases/212-3-proposed-findings-of-
fact.pdf?sfvrsn=739f8cbf_2.
interference on social media aimed at elevating Trump’s candidacy. These developments would
both escalate to a matter of national security “content moderation” – a euphemism for speech
regulation up to and including deplatforming – and fuel the creation of America’s mass public-
private censorship regime.
36
(iii) In partial response, in January 2017 outgoing DHS Secretary Jeh
Johnson designated election infrastructure as a critical infrastructure subsector, putting elections
ultimately under CISA’s purview.
37
,
38
(iv) That same year, the State Department established the
Global Engagement Center (GEC), tasked with leading federal efforts to “counter foreign state
and non-state propaganda and disinformation efforts aimed at undermining United States national
security interests.”
39
The FBI also established its Foreign Influence Task Force (FITF) to
“identify and counteract malign foreign influence operations targeting the United States,” with an
explicit emphasis on voting and elections.
40
(v) Following suit, in 2018 DHS stood up a
Countering Foreign Influence Task Force comprised of CISA’s Election Security Initiative
division, and Office of Intelligence and Analysis (I&A) staff. Its purpose, according to a recent
DHS Office of Inspector General (OIG) report, was to focus on election infrastructure
disinformation.
41
(vi) On top of this counter-disinformation mobilization, certain federal
lawmakers increasingly chided social media platforms for dithering on “content moderation,”
including but not exclusively pertaining to foreign adversaries.
42
(vii) Amid the government’s
growing counter-disinformation push, a constellation of sometimes state-funded non-
governmental counter-disinformation organizations grew alongside it.
43
This by no means exhaustive list of developments, combined with two shifts in the posture of key
players within the looming censorship regime, would create the conditions for, and leave CISA
uniquely positioned to serve as a linchpin of it. First, federal authorities and their future private-
sector partners
44
would train their sights increasingly on domestic Wrongthinkers over foreign
adversaries as key disinformation “threat actors” – or at minimum focus on the content of speech
over the country of origin of the speaker. Second, they would begin to treat words critical of
institutions as threats to those institutions.
In CISA’s case, under its first Director Chris Krebs, who served through the 2020 election cycle,
that meant targeting speech dubious of election administration and outcomes as a threat to
election infrastructure. Under his successor, infrastructure would come to comprise nearly every
36
For a more comprehensive treatment on both the theory and practice of our censorship regime, see
https://www.tabletmag.com/sections/news/articles/guide-understanding-hoax-century-thirteen-ways-
looking-disinformation.
37
See https://www.dhs.gov/news/2017/01/06/statement-secretary-johnson-designation-election-
infrastructure-critical and https://www.intelligence.senate.gov/sites/default/files/documents/os-jjohnson-
032118.pdf. In the designation, Sec. Johnson describes election infrastructure as storage facilities, polling
places, and centralized vote tabulations locations used to support the election process, and information and
communications technology to include voter registration databases, voting machines, and other systems to
manage the election process and report and display results on behalf of state and local governments.
38
https://www.cisa.gov/topics/election-security.
39
https://www.congress.gov/bill/114th-congress/senate-bill/2943/text.
40
https://www.fbi.gov/news/press-releases/the-fbi-launches-a-combating-foreign-influence-webpage.
41
https://www.oig.dhs.gov/sites/default/files/assets/2022-08/OIG-22-58-Aug22.pdf#page=7.
42
See https://twitter.com/mtaibbi/status/1610372352872783872?s=20 and
https://www.wsj.com/articles/save-the-constitution-from-big-tech-11610387105.
43
For an extensive accounting of the theory and practice behind this burgeoning disinformation industrial
complex, see https://www.tabletmag.com/sections/news/articles/guide-understanding-hoax-century-
thirteen-ways-looking-disinformation and https://judiciary.house.gov/sites/evo-subsites/republicans-
judiciary.house.gov/files/evo-media-document/shellenberger-testimony.pdf#page=8.
44
https://rumble.com/v1gx8h7-dhss-foreign-to-domestic-disinformation-switcheroo.html/.
significant institution, and now, even our brains. Director Easterly would argue that the American
mind – “our cognitive infrastructure” – is “the most critical infrastructure,” obligating authorities
to “protect” such infrastructure.
45
One way to do so would be through controlling the information space by suppressing disfavored
narratives – hence the efforts she would take to grow and strengthen “my misinformation and
disinformation team.”
46
,
47
Accordingly, CISA would come to equate first the American public’s skeptical tweets on subjects
like mail-in voting with attacks on election infrastructure, and later a growing list of dissident
views on other issues as threats to relevant infrastructure, and arrogate unto itself the power to
neutralize the threats through helping orchestrate a public-private censorship regime.
III. CISA’s Leadership in the Censorship Regime
In fact, CISA would not only help orchestrate widespread censorship efforts, but would actively
participate in them. During the 2020 election, and in some instances continuing and expanding
thereafter, findings from Missouri v. Biden and additional supporting evidence demonstrate that
CISA officials contribute to censorship efforts directly and by proxy.
CISA’s Direct Censorship-Related Efforts
Among other direct actions CISA officials have taken with respect to countering MDM,
personnel:
48
Convene and coordinate meetings between national security and law enforcement
agencies, and technology companies – including not just social media platforms
Facebook/Meta, Google, Twitter, and Reddit, but also Microsoft, Verizon Media,
Pinterest, LinkedIn, and Wikimedia Foundation
49
– aimed at combating purported misinformation and disinformation. These meetings
occur more frequently in the run-up to elections.
50
CISA is party to at least five sets of recurring confabs with social media platforms
touching on MDM and/or policing of speech on said platforms, separate and apart from
the many bilateral such meetings CISA hosts.
o In 2020 meetings with social media companies, CISA and other officials warned
of potential foreign “hack-and-leak” operations to come during the election.
Major social media companies would proceed to censor the New York Post’s
45
https://thehill.com/policy/cybersecurity/580990-cyber-agency-beefing-up-disinformation-misinformation-team/.
46
Ibid.
47
The Biden administration in fact would incorporate this view into its first-of-its-kind National Strategy
for Countering Domestic Terrorism, in calling for government to “accelerat[e] work to contend with an
information environment that challenges healthy democratic discourse” as part of its effort to “confront
long-term contributors to domestic terrorism.” See: https://www.whitehouse.gov/wp-content/uploads/2021/06/National-Strategy-for-Countering-Domestic-Terrorism.pdf#page=29/
48
Since much of the available record details CISA activities prior to the 2022 midterm elections, it is not
entirely clear in some instances whether certain activities persist. This only further underscores the need for
congressional oversight.
49
See https://twitter.com/MSFTIssues/status/1293623288262987777?s=20. While much of this testimony
focuses on the actions of social media platforms, the inclusion of other technology companies in
conversations with U.S. government agencies about MDM suggests oversight need be done on the actions
of these companies in conjunction with the federal government as well.
50
Officials from CISA, DHS’s I&A division, ODNI, FITF, and other agencies attend the meetings. See:
https://ago.mo.gov/docs/default-source/press-releases/212-3-proposed-findings-of-fact.pdf?sfvrsn=739f8cbf_2#page=218.
reporting on the contents of Hunter Biden’s laptop – indicating Biden family
influence peddling – weeks from the 2020 presidential election, on false grounds
that it was the product of such a “hack-and-leak.”
51
o It has been reported that government warnings about “hack-and-leaks” led
platforms to change their terms of service in the run-up to the 2020 election to
suppress related content. In CISA-convened industry meetings, content
moderation policies are a regular topic, and CISA regularly communicates with
social media platforms about such policies.
52
“Switchboard” reports of purported misinformation and disinformation from state and
local authorities, among others, beginning in 2018 and expanding through the 2020
election. Switchboarding entails receiving and then forwarding reports of offending
content to social media platforms for censorship. Officials did so without assessing
whether the content came from foreign or domestic speakers. Among other notable points
about these efforts:
o CISA staff switchboarded misinformation reports, for example, flagging tweets
for censorship alleging election fraud, that ballots were not counted, and mail-in
voting was implemented to benefit Democrats. One such report forwarded by a
CISA official to Twitter called for “swift removal of…posts and continued
monitoring of the user’s account” because said user had “claimed…that mail-in
voting is insecure,” and that “conspiracy theories about election fraud are hard to
discount.” Twitter reported back to CISA it had taken action pursuant to its
policy on Civic Integrity.
53
o Staffers also switchboarded misinformation reports flagging obviously satirical
social media accounts for censorship, including one Colorado Twitter account
with 56 followers, “UnOfficialCOgov.” The user’s biographical information read:
“dm us your weed store location (hoes be mad, but this is a parody account).”
54
o A CISA switchboard tracking spreadsheet from 2020 suggests that in certain
instances, officials from both CISA and DHS I&A were the originators of
flagged content ultimately conveyed by CISA staff to social media companies for
review.
55
o Switchboarding efforts at times would even touch on private postings on social
media platforms.
56
o Social media companies would often report that they would “escalate” CISA-
switchboarded requests and revert to CISA once addressed.
57
Brief state officials about content CISA considers misinformation, which those officials
often then flag for social media platforms for censorship; fact-check misinformation
51
https://ago.mo.gov/docs/default-source/press-releases/212-3-proposed-findings-of-fact.pdf?sfvrsn=739f8cbf_2#page=274.
52
https://judiciary.house.gov/sites/evo-subsites/republicans-judiciary.house.gov/files/2023-03/Sauer-Testimony.pdf#page=18.
53
https://storage.courtlistener.com/recap/gov.uscourts.lawd.189520/gov.uscourts.lawd.189520.214.1_1.pdf#page=267.
54
https://storage.courtlistener.com/recap/gov.uscourts.lawd.189520/gov.uscourts.lawd.189520.209.15.pdf#page=11.
55
https://storage.courtlistener.com/recap/gov.uscourts.lawd.189520/gov.uscourts.lawd.189520.214.35.pdf.
56
https://ago.mo.gov/docs/default-source/press-releases/212-3-proposed-findings-of-fact.pdf?sfvrsn=739f8cbf_2#page=271
57
Ibid.
reports for social media platforms;
58
and publish debunks of social-media narratives,
“knowing...platforms will use this information to censor,” per litigation counsel in
Missouri v. Biden.
59
Coordinate with public and private sector partners, including social media companies on
a variety of projects “to build resilience against malicious information activities,” as well
as supporting private sector partners’ COVID-19 response efforts via “regular reporting
and analysis of key pandemic-related MDM trends.”
60
This is part and parcel of what
CISA’s Cybersecurity Advisory Committee has described as a burgeoning MDM effort
that includes directly engaging with social media companies to flag MDM.
61
The coordination referenced above comes from a bulletin CISA posted on its website detailing
the work of its MDM team – the successor to its Countering Foreign Influence Task Force. The
creation of that team formally codified the transition that had already taken place during the 2020
election cycle, from a focus on foreign to domestic speech.
62
In February 2023, CISA pulled
down that site, redirecting viewers to a “Foreign Influence Operations and Disinformation” page
that makes no mention of domestic actors. One can only speculate as to why CISA made this
change.
CISA would also expand its focus to encompass not just MDM around elections, or COVID-19
vaccine efficacy under the banner of defending public health infrastructure,
63
but “all types of
disinformation, to be responsive to current events,” according to an official quoted in an August
2022 DHS OIG Report.
64
Evidence collected in Missouri v. Biden indicates CISA has been
involved in combatting “misinformation” with respect to the ongoing Russo-Ukrainian War,
65
and on an initiative in conjunction with the Treasury Department to address MDM regarding the
financial services industry.
66
In a January 2023 deposition taken in connection with Missouri v. Biden, the chief of CISA’s
MDM Team, Brian Scully, asserted that his team had a mandate that was almost limitless, in
pursuing MDM that could affect “critical infrastructure in a number of ways,” including causing
“reputational risk [that] could come about if the integrity or the public confidence in a particular
sector was critical to that sector’s functioning.”
67
It is also possible CISA’s efforts have extended beyond social media companies, and perhaps the
other technology companies with which it and other federal agencies have regularly met in
58
https://ago.mo.gov/docs/default-source/press-releases/212-3-proposed-findings-of-
fact.pdf?sfvrsn=739f8cbf_2#page=269.
59
https://judiciary.house.gov/sites/evo-subsites/republicans-judiciary.house.gov/files/2023-03/Sauer-
Testimony.pdf#page=19.
60
https://web.archive.org/web/20211231181148/https://www.cisa.gov/mdm.
61
https://s3.documentcloud.org/documents/23175380/dhs-cybersecurity-disinformation-meeting-
minutes.pdf.
62
https://web.archive.org/web/20211231181148/https://www.cisa.gov/mdm.
63
https://judiciary.house.gov/sites/evo-subsites/republicans-judiciary.house.gov/files/2023-03/Sauer-
Testimony.pdf#page=25.
64
https://www.oig.dhs.gov/sites/default/files/assets/2022-08/OIG-22-58-Aug22.pdf#page=9.
65
https://ago.mo.gov/docs/default-source/press-releases/212-3-proposed-findings-of-
fact.pdf?sfvrsn=739f8cbf_2#page=280.
66
https://ago.mo.gov/docs/default-source/press-releases/212-3-proposed-findings-of-
fact.pdf?sfvrsn=739f8cbf_2#page=283.
67
https://ago.mo.gov/docs/default-source/press-releases/212-3-proposed-findings-of-
fact.pdf?sfvrsn=739f8cbf_2#page=282.
connection with combatting MDM. A June 2022 report from CISA’s Cybersecurity Advisory
Committee Subcommittee on Protecting Critical Infrastructure from Misinformation and
Disinformation (MDM Subcommittee) suggests that CISA should approach the mis- and dis-
information problem “with the entire information ecosystem in view.” This means focusing not
just on social media platforms, but mainstream media, cable news, hyper partisan media, talk
radio, and other online resources.
68
CISA would, as with its MDM webpage, scrap its MDM Subcommittee, as first publicized in a
late 2022 summary of an advisory board meeting.
69
As significant as CISA’s MDM efforts have been, DHS’ counter-disinformation operations
spread far beyond the sub-agency. According to the aforementioned August 2022 DHS OIG
report, numerous components inside DHS have in recent years been targeting MDM foreign and
domestic. What’s more, the report details that DHS planned to target “inaccurate information” on
myriad topics including “the origins of the COVID-19 pandemic and the efficacy of COVID-19
vaccines, racial justice, U.S. withdrawal from Afghanistan, and the nature of U.S. support to
Ukraine.”
70
Corroborating the OIG Report, one document revealed in connection with congressional inquiries
into DHS’ stunted Disinformation Governance Board (DGB) indicated that myriad “DHS
components are already engaged in countering disinformation,” alongside “excellent work being
done by interagency partners, the private sector, and academia – particularly concerning
identifying and analyzing disinformation,” which “DHS should leverage.”
71
A subsequent memorandum would indicate that the DGB would “support and coordinate” MDM
work with “other departments and agencies, the private sector, and non-government actors.” The
purpose of creating the DGB, in other words, was not so much to establish a “Ministry of Truth,”
but, as plaintiffs in Missouri v. Biden aptly describe it, “to impose a bureaucratic structure on the
enormous censorship activities already occurring involving dozens of federal officials and many
federal agencies” – that is, to oversee many such ministries.
72
CISA’s Proxy Censorship-Related Efforts
Not all of these ministries are to be found within the federal government. CISA officials
coordinate and partner with non-governmental entities who both mass-surveil social media
content for purported MDM, and serve as clearinghouses for receipt of flagged content, which
they then relay to social media platforms for censorship in an apparent bid to circumvent the
First Amendment via cutout.
CISA has primarily partnered with three non-governmental entities, beginning during the 2020
election cycle, to facilitate the flow of problematic content for potential censorship to social
media platforms: The Center for Internet Security (CIS) and its CISA-funded Election
Infrastructure Information Sharing & Analysis Center (EI-ISAC); and two consortia: The
68
https://www.cisa.gov/sites/default/files/publications/June%202022%20CSAC%20Recommendations%20%E2%80%93%20MDM_0.pdf#page=2.
69
https://www.racket.news/p/homeland-security-reorganizes-appearing?utm_source=post-email-title&publication_id=1042&post_id=110070633&isFreemail=true&utm_medium=email.
70
https://theintercept.com/2022/10/31/social-media-disinformation-dhs/.
71
https://www.hawley.senate.gov/sites/default/files/2022-06/2022-06-07%20DOCS%20ONLY%20CEG%20JH%20to%20DHS%20%28Disinformation%20Governance%20Board%29%5B1%5D.pdf.
72
https://storage.courtlistener.com/recap/gov.uscourts.lawd.189520/gov.uscourts.lawd.189520.268.0.pdf#page=106.
Election Integrity Partnership (EIP), and a successor organization folding in additional partners,
the Virality Project (VP).
CIS is a nonprofit that collects and forwards reports of disinformation from state and local
government officials to social media platforms, and which continued to do so during the 2022
election cycle.
73
As CISA’s switchboarding activities became too labor-intensive for it, CISA would direct
election officials to report content to be flagged for social media platforms to CIS. CISA would
also help connect CIS, and various election official groups, with EIP.
EIP is a non-governmental “anti-disinformation” consortium that was conceived by and created in
consultation with CISA officials in the run-up to the 2020 election. Its stated purpose was to fill
the “critical gap” created by the fact no federal agency “has a focus on, or authority regarding,
election misinformation originating from domestic sources within the United States.”
74
That lack of “authority” may have included both an inability for government agencies to access
social media platform data – as EIP did – as well as “very real First Amendment questions”
regarding EIP’s work, as a key player in the consortium, Renee DiResta, would acknowledge.
75
EIP’s four partner organizations, “leading institutions focused on understanding misinformation
and disinformation in the social media landscape,”
76
sharing pervasive ties to the federal government, include the:
Stanford Internet Observatory (SIO) – Founded in June 2019 by former Facebook chief
security officer Alex Stamos, several of SIO’s students came up with the idea for EIP
while serving as CISA interns.
77
Stamos serves on CISA’s Cybersecurity Advisory Committee. He and Chris Krebs,
CISA’s director through the 2020 election, formed a consultancy in late 2020 called the
Krebs/Stamos Group. CISA’s top election official through 2020, Matt Masterson, who
was involved in the establishment of EIP, joined SIO as a fellow after leaving CISA in
January 2021. SIO’s Research Manager, the aforementioned DiResta, served as a Subject
Matter Expert for CISA’s Cybersecurity Advisory Committee’s since-abolished MDM
Subcommittee.
78
University of Washington’s Center for an Informed Public – Founded in December 2019,
its cofounder Kate Starbird served as the chairperson of the since-abolished MDM
Subcommittee – serving incidentally alongside former Twitter executive Vijaya Gadde, a
leader of its censorship efforts prior to her ouster under new owner Elon Musk.
79
UW’s Center, along with SIO, would share in a $3 million National Science Foundation
grant awarded in August 2021 to “study ways to apply collaborative, rapid-response
research to mitigate online disinformation.”
80
73
https://judiciary.house.gov/sites/evo-subsites/republicans-judiciary.house.gov/files/2023-03/Sauer-Testimony.pdf#page=19.
74
https://stacks.stanford.edu/file/druid:tr171zs0069/EIP-Final-Report.pdf#page=9.
75
https://ago.mo.gov/docs/default-source/press-releases/212-3-proposed-findings-of-fact.pdf?sfvrsn=739f8cbf_2#page=265.
76
https://stacks.stanford.edu/file/druid:tr171zs0069/EIP-Final-Report.pdf#page=20.
77
Ibid.
78
https://www.racket.news/p/homeland-security-reorganizes-appearing?r=5mz1&utm_campaign=post&utm_medium=web.
79
https://www.politico.com/news/magazine/2020/10/28/twitter-vijaya-gadde-free-speech-policies-technology-social-media-429221.
80
https://www.cip.uw.edu/2021/08/15/national-science-foundation-uw-cip-misinformation-rapid-response-research/.
12
• The Atlantic Council’s Digital Forensics Research Lab – Founded in 2016, it receives substantial taxpayer funding from a variety of agencies.81

• Graphika – Founded in 2013, it reportedly has historically received funding from DARPA and the Defense Department’s Minerva Initiative.82

Collectively, these groups sought to “fill the gap” by creating a mass-surveillance and censorship-flagging platform aimed at “content intended to suppress voting, reduce participation, confuse voters as to election processes, or delegitimize election results without evidence.”83
In practice, this meant targeting for suppression speech expressing doubt about an unprecedented election, given the sweeping, pandemic-driven changes made to the voting system that cycle, whereby the razor-thin final results in key states did not materialize for days.84 EIP did so in part through lobbying social media platforms to adopt more aggressive content moderation policies around election rhetoric, and through flagging relevant content – including entire narratives – via “tickets” for suppression by social media platforms under their often EIP-influenced terms. EIP analysts – some 120 of whom worked on the project in the waning days of the 2020 election – both identified content for flagging via tickets and incorporated requests from “trusted external stakeholders.”85 EIP lists three such governmental stakeholders: CISA,86 CISA-backed EI-ISAC, and the State Department’s GEC. EIP in fact connected “government partners” with “platform partners” – understood to be the social media companies – to enable the former to debunk flagged content directly for the latter.87
Some raw numbers from the 2020 election cycle alone illustrate the size and scope of EIP’s effort. EIP:

• Collected 859 million tweets for “misinformation” analysis.

• Flagged for Twitter tweets – ultimately labeled “misinformation” – that were shared 22 million times, a disproportionate percentage of which were dinged for “delegitimization,”88 which Twitter adopted as a standard for suppression.89
81 https://foundationforfreedomonline.com/dhs-censorship-agency-had-strange-first-mission-banning-speech-that-casts-doubt-on-red-mirage-blue-shift-election-events/.
82 https://foundationforfreedomonline.com/dhs-censorship-agency-had-strange-first-mission-banning-speech-that-casts-doubt-on-red-mirage-blue-shift-election-events/.
83 https://stacks.stanford.edu/file/druid:tr171zs0069/EIP-Final-Report.pdf#page=23.
84 Former Trump State Department Cyber official Mike Benz would observe that CISA, “tasked with election security,” via EIP “also gained the power to censor any questions about election security.” See: https://foundationforfreedomonline.com/dhs-censorship-agency-had-strange-first-mission-banning-speech-that-casts-doubt-on-red-mirage-blue-shift-election-events/.
85 https://judiciary.house.gov/sites/evo-subsites/republicans-judiciary.house.gov/files/2023-03/Sauer-Testimony.pdf#page=19.
86 It is worth noting that CISA and EIP’s relationship went both ways. At times, evidence suggests, CISA would forward reports of misinformation received directly from EIP on to social media platforms for their review.
87 https://storage.courtlistener.com/recap/gov.uscourts.lawd.189520/gov.uscourts.lawd.189520.209.2.pdf#page=47.
88 EIP cites as an example of delegitimization “Claims of fraud or malfeasance with inaccurate or missing evidence.” See: https://storage.courtlistener.com/recap/gov.uscourts.lawd.189520/gov.uscourts.lawd.189520.209.2.pdf#page=25.
89 Alex Stamos has challenged this characterization in terms of tweets EIP ensnared. The competing arguments can be seen here: https://twitter.com/MikeBenzCyber/status/1644110224150736897?s=20.
• Influenced platforms to take action on 35% of all URLs flagged – 21% slapped with a warning label where content remained visible, 13% removed, and 1% “soft-blocked” with a warning one would have to bypass to view the content.90

• Pushed platforms to target dozens of “misinformation narratives” for throttling.

• Impacted hundreds of millions of posts and videos across major social media platforms via the terms of service policy changes for which EIP lobbied. EIP members openly boasted that technology companies would never have modified their terms accordingly without EIP’s insistence and “huge regulatory pressure” from government.91
Further demonstrating the interconnection between EIP and CISA, the group featured former
CISA Director Chris Krebs at the launch seminar associated with the report in which it divulged
some of these figures.
Of note, EIP coded less than one percent of its tickets as having an element of foreign interference. EIP characterized all 21 of the “most prominent repeat spreaders” of election integrity “misinformation” on Twitter as “conservative or right-wing.”92 Of the civil society groups that submitted tickets to the EIP, many had a left-leaning bent – including the DNC itself.93
None appear to have been right-leaning.
Mike Benz, a former State Department Cyber official during the Trump administration, has found that many principals in EIP leadership were heavily invested in the idea that Russia interfered in the 2016 presidential election, to President Trump’s benefit, and that they or the organizations with which they were affiliated were generally critical of Trump and Western populist movements. In an associated report, he concludes that, given the backgrounds of EIP’s principals, when originally conceived in June 2020 it should have been understood to be “a partisan, powerfully connected political network, panicked that Americans might push back on the use of mail-in ballots months in the future,” convened “to stop that pushback from happening by unleashing censorship of the Internet on a scale never before seen in American history.”94
Though the EIP’s efforts would re-emerge in the 2022 election, in the interim it also launched a successor effort called the Virality Project, targeting MDM spreading in relation to COVID-19, such as “narratives that questioned the safety, distribution, and effectiveness of the vaccines.” Its leaders, including Stamos, communicated with CISA officials about their efforts, as they did during the original EIP operation. DiResta would serve as principal Executive Editor of its final April 2022 report, whose contributors included Kate Starbird and Matt Masterson alongside DiResta herself. Several current and former CISA interns are also listed as “researchers and analysts” who monitored social media platforms in connection with the project.

The VP’s stakeholders included federal health agencies, working alongside social media platforms to combat, for example, vaccine-related “misinformation.” All told, the Virality Project tracked content with 6.7 million engagements on social media per week – or over 200 million during the seven months over which the project ran.
90 https://stacks.stanford.edu/file/druid:tr171zs0069/EIP-Final-Report.pdf#page=57.
91 https://foundationforfreedomonline.com/dhs-censorship-agency-had-strange-first-mission-banning-speech-that-casts-doubt-on-red-mirage-blue-shift-election-events/.
92 https://stacks.stanford.edu/file/druid:tr171zs0069/EIP-Final-Report.pdf#page=205.
93 https://oversight.house.gov/wp-content/uploads/2022/11/DHS-Censorship-Letter-11022022.pdf/.
94 https://foundationforfreedomonline.com/dhs-censorship-agency-had-strange-first-mission-banning-speech-that-casts-doubt-on-red-mirage-blue-shift-election-events/.
Much of what the VP cast as “misinformation” included true facts, to the extent that they conveyed narratives of which the project’s leaders – and certainly its government partners – disapproved, from reports of vaccine injuries to discussion of “breakthrough” cases and “natural immunity,” to discussion of potential, then-hypothetical vaccine mandates. VP particularly targeted the speech of “health freedom” groups and, like EIP, overwhelmingly targeted right-leaning figures.
IV. Conclusion
We may find much of the speech that social media platforms have suppressed in recent years
under government coercion, cajoling, and/or collusion to be wrongheaded or objectionable. But
infinitely more wrongheaded, objectionable, and indeed dangerous for a free society than the
proliferation of “bad ideas” is perhaps the worst idea of all: That government should be the arbiter
of what we are allowed to think and speak.
The notion that, to ensure the health and safety of the country, the public and private sectors must work together to silence those who express unauthorized opinions, that such opinions are to be treated as threats to an infinitely flexible definition of “critical infrastructure,” and that those who hold them are to be treated as actual or would-be domestic terrorists, is the stuff of tyranny.
That the state itself has treated as dangerous MDM claims that often ultimately became settled science – meaning that, by their own standards, government officials and their partners should have been deplatformed en masse – illustrates the folly of this project.
To turn over to the state and its private sector ancillaries a monopoly on narrative would
ultimately give these partners a monopoly on power, reducing us from citizens with agency to
hapless subjects.
We are a free people, capable of evaluating information and ideas for ourselves to discern fact from fiction and to separate good ideas from bad.
Historically, we would have held in utter contempt authorities who would suggest that we are incapable of thinking for ourselves and that, for our own benefit, since the authorities know best, they will do the thinking for us while silencing those who dare dissent.
No American should stand for it today.
If, as the foregoing suggests, CISA, and perhaps other DHS components, have played an integral
role in imposing a mass public-private censorship regime on the American people, it is incumbent
upon this and other relevant congressional bodies to get to the bottom of it.
This Subcommittee can help develop a comprehensive picture of the “public” side of the regime
within DHS by using its oversight powers to, over a timeline beginning from CISA’s inception in
November 2018, pursue the following questions:
• Which offices and personnel within CISA95 are or have been engaged in social media censorship efforts, or related efforts to impact any other part of the “information ecosystem” as CISA has defined it?
95 Plaintiffs in Missouri v. Biden assert that “On information and belief, CISA maintains a number of task forces, working groups, and similar organizations as joint government-private enterprises, which provide avenues for government officials to push for censorship of disfavored viewpoints and speakers online.” See: https://storage.courtlistener.com/recap/gov.uscourts.lawd.189520/gov.uscourts.lawd.189520.268.0.pdf#page-87.
• Which other DHS agencies, and/or federal, state, county, and local government entities, has CISA coordinated with in connection with social media censorship efforts, or related efforts to impact any other part of the “information ecosystem” as CISA has defined it?

• Which entities within DHS, independent of CISA, if any,96 engaged in social media censorship efforts, or related efforts to impact any other part of the “information ecosystem” as CISA has defined it?
• What specific policies and practices has each DHS entity developed and undertaken in connection with each respective censorship effort?

• Is there a comprehensive list of all communications, technology, media, educational, non-profit, and any other non-governmental entities with which DHS broadly engaged in fostering its censorship efforts?

• What level of federal funding has each DHS entity received to carry out such censorship efforts?

• What level of federal funding has each private sector entity with which DHS interacted in its censorship efforts received?

• What have been the qualitative and quantitative impacts of such censorship efforts during periods leading up to and immediately following the 2020 and 2022 elections?

• What censorship efforts are CISA and/or any other DHS agencies engaging in at present, and/or planning for in anticipation of the 2024 elections?
Only with full transparency can Congress and the American people understand the full size and scope of this portion of the censorship regime and determine what, if anything, Congress ought to do about it – be it in terms of withholding funding, curtailing operations, and/or holding malefactors to account.
If indeed we have had a mass public-private censorship regime foisted upon us, defunding it, dismantling it, and deterring government officials from ever again participating in or funding such an apparatus would seem to be of the utmost importance.
Congress should be commended for efforts already underway to prevent such behavior.97 I hope it will do more.
Once again, thank you for the honor of appearing before you to discuss these important issues,
and I would be happy to answer any questions from the Committee.
96 An August 10, 2022 DHS Office of Inspector General report indicates that several other DHS components have engaged over “the last three years to counter disinformation originating from foreign and domestic sources.” [Emphasis mine] See https://www.oig.dhs.gov/sites/default/files/assets/2022-08/OIG-22-58-Aug22.pdf#page=10.
97 https://www.congress.gov/bill/118th-congress/house-bill/140/text.
Congressional Testimony
Statement of Dr. Cynthia Miller-Idriss
Professor, School of Public Affairs and School of Education, American University
Founding Director, Polarization and Extremism Research and Innovation Lab (PERIL)
Washington, D.C.
Hearing
“Censorship Laundering:
How the U.S. Department of Homeland Security Enables the Silencing of Dissent”
The House Committee on Homeland Security
Subcommittee on Oversight, Investigations, and Accountability
U.S. House of Representatives
310 Cannon House Office Building
Washington, D.C. 20515
Thursday, May 11, 2023, 2:00 PM ET
Chairman Green, Ranking Member Thompson, and Members of the Committee: I would
like to thank you for your service to our country and for calling attention to the critical issue of
disinformation. My name is Cynthia Miller-Idriss, and I am a Professor in the Department of
Justice, Law, and Criminology and in the School of Education at the American University in
Washington, D.C., where I also direct the Polarization and Extremism Research and Innovation
Lab (PERIL) – an applied research lab in the School of Public Affairs. I have been studying education-based solutions to the prevention of violent extremism – including through early prevention related to disinformation and propaganda – for over twenty years. I want to acknowledge the support of my research team at PERIL, whose assistance was invaluable in preparing my testimony today.1
The Polarization and Extremism Research and Innovation Lab, PERIL, develops evidence-based initiatives – such as short-form videos, trainings and train-the-trainer programs, research studies, and community toolkits and guides – to build social cohesion, reverse political polarization, and prevent violent extremism. Utilizing a public health framework and multi-disciplinary, pre-preventative approaches, we design, test, and scale up evidence-based tools and intervention strategies to help people recognize and reject harmful online and offline content, propaganda, supremacist ideologies, conspiracy theories, misinformation, and disinformation while safeguarding their freedom of speech. As widely recognized experts and leaders in the field of preventing extremism and radicalization, we have created effective, evidence-based resources to inoculate against propaganda and extremist content, as well as to empower individuals to intervene and interrupt early radicalization and keep their loved ones safe from online manipulation – all as an alternative to security-based approaches that rely on surveillance, monitoring, censorship, or banning.
PERIL’s work focuses specifically on equipping people with tools to recognize online
manipulation in order to protect themselves and their loved ones from disinformation that seeks
to harm them (see below for definitions of disinformation and related terms). We do not teach
people what to think; our work is nonpartisan and rooted in evidence. Our focus is on responding
to community needs and on providing resources to help people understand the kinds of
persuasive techniques that bad actors often use to manipulate others. For example, foreign
influence operations, domestic and international extremist and terrorist groups, and scammers
seeking a profit will often use rhetorical strategies, propaganda, and emotional tactics that are
designed and used to convince others to believe, think, or act in a specific manner. These
persuasive techniques manipulate observers for the purposes of grooming, recruiting, and
building support for violent ideologies, tactics, strategies, or actions. Our research has
demonstrated with consistent statistical significance that people can learn to recognize persuasive
and manipulative tactics in order to make more informed choices in their lives, especially online.
SCOPE AND SCALE: COMMUNITIES IN NEED
The national crisis facing communities across the country is all too evident. Over the past three years, PERIL has fielded a constant stream of emails and calls from individuals and communities across the country – all asking for help confronting the impacts of disinformation and propaganda in their lives. In Michigan, a grandfather and military veteran wrote to ask what he could do about his grandson, who had joined an armed militia. In Texas, faith leaders asked for ways to support pastors whose congregations were torn apart by partisan polarization and conspiracy theories. In Washington State, a local government needed training for city employees to prevent polarization and reject online manipulation. In Vermont, a local entrepreneur asked if the school system could do more to ensure that his future employees – most of whom he hired straight from the local high school – would stop espousing so much propaganda and conspiracy theories, which had become a problem for his business. A local mom wanted help with her middle school son, who during the pandemic had consumed so much online misogyny that he said he did not need to respect her authority as a parent, because she is a woman.
These stories illustrate what research evidence has also demonstrated: we face a national
crisis rooted in the rampant circulation of propaganda, dis/mis/malinformation, and other
harmful online content. American communities are coming to us because they feel threatened by
online disinformation. Some fall prey to hostile foreign influence operations by people who try
to manipulate Americans for profit or to disrupt our democratic process. People give their bank
information to scammers pretending to be from the IRS. Teenagers share intimate details of their
lives with people online who they think are friends their own age, but who are not. Others come
to believe propaganda and disinformation that lures them into what they think is heroic action to
save their racial or ethnic group after going down rabbit holes of antisemitic conspiracies about
demographic change and a supposed orchestrated replacement of white people.
The data on this is clear. The pace, scope, and scale of violent extremism have probably
increased and are escalating rapidly. The Anti-Defamation League reports that white supremacist
propaganda efforts are at the highest level it has ever recorded, jumping 38% above 2021 levels to 6,751 reported cases in 2022.2
These incidents include distribution of racist, antisemitic,
and anti-LGBTQIA+ fliers, graffiti and posters, stickers, banners, and laser projections that have
heavily targeted houses of worship and other community institutions.
The repercussions of so much circulation of propaganda, conspiracy theories, and
disinformation are abundantly clear. Between 2013 and 2021, the number of open domestic
terrorism-related cases in the U.S. jumped 357% to 9,049 cases, with the most violent incidents being committed by racially or ethnically motivated violent extremists during the same years.3 Of the 444 people killed by extremism in the U.S. between 2013 and 2022, the significant majority of deaths were at the hands of right-wing extremists (335 deaths, or 75%).4 Of those killed by right-wing extremists in 2021, 73% were affiliated with white supremacy, 5% with incel/toxic masculinity extremism, and 17% with anti-government extremism.5
The 2022 racist
shooting that killed 10 people in a grocery store in a predominantly Black neighborhood in
Buffalo, motivated by the false Great Replacement conspiracy theory, is just one tragic recent
example.
Non-lethal attacks have also risen significantly. More than 50 bomb threats were made to HBCUs (Historically Black Colleges and Universities) and predominantly Black churches in 2022. And the problem goes well beyond white supremacist extremism.6 Antisemitism, conspiracy theories, anti-LGBTQ+ hate, and misogynistic content have spiked across online platforms. Before he was banned from social media platforms in mid-2022, violent and deeply misogynistic videos from one content creator were viewed 12 billion times on TikTok alone.7
Violent outcomes often show a toxic mix of ideological hatred. Just this week, eight people lost
their lives at a Texas shopping mall at the hands of a man with a swastika tattoo who had posted
both violent misogynistic and neo-Nazi content online.
In October 2020, the U.S. Department of Homeland Security under President Trump issued a threat assessment report declaring domestic violent extremism in general and white supremacist extremists (WSEs) in particular the ‘most persistent and lethal threat in the Homeland.’8 The Biden administration issued a similar assessment in spring 2021,9 followed by the first-ever national strategy to counter domestic terrorism, noting the rising threat from white supremacist extremism and anti-government and unlawful militias that threaten civilians, elected officials, and democratic institutions.10
Much of this violence is motivated by disinformation, propaganda, and conspiracy theories. According to the Global Terrorism Database, conspiracy theory extremists were responsible for 119 terrorist attacks in 2020 – a jump from 6 attacks the year before – in Australia, New Zealand, the United States, Canada, the United Kingdom, and Germany.11 Meanwhile, hate crimes in the U.S. are at the highest level in decades,12 despite persistent underreporting. In sum, the U.S. and our allies have seen rising violent extremism and hate-fueled and political violence driven by antisemitism, conspiracy theories, propaganda, disinformation, and other harmful online content – a pattern of violence that has been escalating for years.
WHAT WORKS TO PREVENT AND COUNTER DISINFORMATION
The good news is there is a growing body of evidence about what works to equip the
public with tools that shore up their capacity to intervene in pathways to violent extremism,
while protecting their right to free speech and reducing the need for security-based approaches.
We have found that it only takes 7-12 minutes of reading one of our intervention guides for its
audience to be significantly better informed about harmful online content and the risks of
radicalization to violence; to feel more empowered and confident about intervening; to build
their own capability to intervene; and to know where to get more help.* This is the case across
our research with parents and caregivers, including grandparents, uncles, and cousins; with educators and youth mentors; with local governments and small businesses; and more. For
example, in just 12 minutes of reading one of our intervention tools, 85% of our participants
understood the process by which youth become radicalized, and 83% felt that they knew where
to get help if they suspect a young person to be engaging in extremist ideas.
Importantly, across our work, we found that both prior to and after reading our parents
and caregivers guide, Democrats and Republicans did not significantly differ in their knowledge
of extremism. Republicans scored significantly better (5% better) than Democrats did in terms of
knowledge of extremism after having read the guide, and members of both political parties
reported being satisfied with the guide’s contents and equally willing to intervene with a young
person they suspect is coming into contact with radicalizing content. We have also found that
education alone doesn’t solve our problem of disinformation. Our research has shown that higher
levels of education do not necessarily mean people have the skills to consistently recognize
harmful manipulation tactics online. On the contrary, we found that parents with higher levels of
education were overconfident in their ability to help children distinguish trustworthy and
untrustworthy news sources. After reading our guide, their confidence went down as they
realized how tricky online disinformation and harmful content can be.
There is also strong emerging evidence that even short interventions can have a lasting
impact on local communities. We are currently studying a group of 1500 parents and caregivers
in three-month intervals for a full year after reading our intervention tool. Three months after
reading our guide for parents and caregivers, over 11% (135 individuals) of respondents said that
after the intervention, they either joined or created a group that discusses issues of youth
radicalization and extremism. Six percent of our participants, or about 75 people, told us that
within the three months after reading our guide, they used what they learned to take direct action to prevent youth from radicalizing further or being recruited through additional online manipulation. Overall, three months after reading our guide, parents and caregivers retained the vast majority of the knowledge and skills they had learned. Seventy-five percent of participants reported understanding the process by which youth become radicalized online – a 23% increase from the initial survey – and 70% felt prepared to talk with youth about online extremism – only a 5% drop from the initial survey. Over a third of participants told us they had shared or used the information with their biological children, while nearly 13% shared it with other young people in their family, including grandchildren, nephews and nieces, and cousins.
Taken together, our evidence shows that it is possible to provide communities with tools
to be safer online. Parents, grandparents, teachers, coaches, mental health professionals, and others deserve help confronting an unprecedented amount of harmful online content, and support in becoming more confident and capable of keeping their families safe and protected from it. All communities need information and tangible action steps for how to help their loved
ones resist manipulative rhetoric, propaganda, conspiracy theories, and disinformation they are
exposed to online and offline in ways that help them make better choices while avoiding
censorship, surveillance, monitoring, or other security-based approaches.
ADOPTING A PUBLIC HEALTH APPROACH
PERIL advocates for a holistic public-health mode of prevention consisting of
investments at the primary, secondary, and tertiary prevention levels to prevent violent
extremism and the components that contribute to it, including disinformation and propaganda.
Primary prevention refers to efforts to address radicalization before it takes root, including through broad civic education and media literacy focused on helping the public build resilience
in ways that do not infringe on their right to free speech or free association, and that work as an
alternative to security-based approaches that surveil, monitor, censor, or ban content. Secondary
prevention refers to efforts to mitigate the impacts of already radicalized people and groups,
primarily through surveillance, monitoring, arrests, and interruption of plots. Tertiary prevention
refers to focused deradicalization efforts, including through prison deradicalization programs and
“exit”-type counseling services that help radicalized individuals disengage from extremism.
An effective public health approach to countering disinformation builds prevention and
intervention across all three of those levelswith the significant majority of efforts and
resources on the primary prevention side—and would require four things. First, it must be nimble
and responsive to communities’ needs depending on regional areas of concern. Second, it must
be holistic and whole-of-community in ways that broaden engagement of a wide range of
government offices, agencies, and organizations beyond the security and law enforcement
sectors, such as the education, health and human services, and mental health sectors. It would
include primary prevention efforts through the arts, community organizations, faith communities,
or other community-based non-profits. Third, an effective public-health prevention model rests
on evidence at all levels of intervention. This means moving beyond outcome evaluations that
describe only outputs, or the numbers of people trained, the numbers of downloads of a
particular tool, or other descriptive metrics that do not actually provide evidence of impact.
Finally, a holistic public health approach focuses on building resilient systems as well as resilient
individuals. Resilience to propaganda and disinformation is not merely a technical skill, in other
words: it is also rooted in national and community values and commitment to an inclusive
democracy that must be reinforced, emphasized, and modeled in all aspects of life across the life course. The aim is to reduce the fertile ground in which disinformation, propaganda, hate and
anti-democratic ideas thrive.
This is a vision of a public health-style prevention system that works to prevent violence
and counter harm while simultaneously promoting concrete steps toward inclusive equity,
respect, coexistence, and real and symbolic recognition of difference. Such a prevention system
gives us the best chance of building community social cohesion, reducing violent outcomes, and
strengthening our democracies.
POLICY RECOMMENDATIONS
1. Invest in a holistic, community-based, public-health approach to preventing the spread of
supremacist ideologies, mis/dis/malinformation, conspiracy theories, and propaganda.
This includes creating avenues to fund both pilot testing of innovative approaches and national scale-up of what is proven to be effective in primary-level prevention strategies, including digital literacy and civic education that equips educators,
parents and caregivers, youth mentors, faith leaders, coaches, mental health counselors,
and others with better tools to recognize and “offramp” individuals who are persuaded by
disinformation from further radicalization to violence. The federal government can
support the creation of impact driven networks that bring together government agencies
from well beyond the security sector; civil society institutions like schools, mental health
professionals, sports leagues and after school programs; local NGOs and advocacy efforts
that enhance community wellness; and others. At the local level, people need to hear and
see pathways for their own engagement, to spark their imaginations about ways to act, to be moved to change their behaviors, to know there are resources to support their learning,
and to want to know more in ways that make a difference in their families’ and
communities’ wellbeing.
2. Incentivize and prioritize rigorous impact assessment and evaluation frameworks to
ensure policies and programs are implemented as intended and are effective beyond
descriptive metrics. Evaluation frameworks and results funded with public dollars should
be made publicly available to ensure transparency and reduce the need for every initiative
to reinvent the wheel.
3. Ensure that prevention initiatives focus on equipping the public with better tools for their
own decision-making, while not interfering with any individual’s freedom of speech,
conscience, or association. We cannot repeat the mistakes of historical civil liberties
violations or promote censorship as a solution to disinformation.
4. Continue to work with the tech sector to remove harmful and dangerous content, while
understanding that banning and content removal are after-the-fact solutions that do not, on their own, solve the crisis of disinformation and propaganda. Addressing the problem of
disinformation must begin with upstream prevention that reduces the production of and
receptivity to harmful content to begin with.
5. Create a central, national, nonpartisan center for prevention to provide federal, state and
local governments and all local communities with tools, resources, training, capacity-
building, and evidence about what works.
DEFINITIONS
We define disinformation as false, untrue, or incorrect information spread to intentionally deceive, manipulate, misinform, and erode an individual’s or group’s belief in established facts, often with a specific interest or goal. This includes efforts from hostile foreign influence operations, profiteers, and international extremist and terrorist groups that aim to harm American democracy and U.S. elections, or to scam unsuspecting Americans for profit. It also includes domestic efforts that undermine inclusive democracy, such as antisemitism or anti-immigrant conspiracy theories, or compromise the physical health and wellbeing of communities. Disinformation is similar to, but distinct from, misinformation, which is the unintentional sharing of
false or incorrect information or untrue claims spread without the aim to deceive, manipulate, or
harm. It also differs from malinformation, which refers to true claims spread with the intent to
deceive, manipulate, or harm. Propaganda refers to manipulative persuasive techniques that
seek to make people believe true or untrue information, or values and opinions, sometimes using
dis/mis/malinformation, persuasive narratives (stories that help audiences imagine themselves as
heroes, villains, victors, or victims), or rhetoric (emotionally stirring language, images, and sounds), which lend manipulative power. Extremism is the belief that one group of people is in
dire conflict with other groups who don’t share the same racial or ethnic, gender or sexual,
religious, or political identity. This “us” versus “them” framework positions the ‘other’ as an
existential threat and calls for total separation, domination, or other forms of violence.
Notably, terrorist violence from domestic violent extremists does not usually link back to
specific groups. Instead, it’s most often perpetrated by individuals who have experienced networked online radicalization through exposure to propaganda. Groups are still an important
source of much of the propaganda that circulates in extremist scenes and subcultures, including
online. Finally, it is important to note that the spread of online propaganda and disinformation is
fueled by how people spend time online. Online radicalization happens in part when people
spend time in echo chambers, where extreme content is self-reinforcing across platforms. There
is also significant algorithmic radicalization through recommendation systems that suggest
content that is related, but more salacious or more extreme than the content the viewer just
watched. This can lead to “rabbit holes” of disinformation, conspiracy theories and propaganda
consumption that are difficult to climb out of.
CONCLUSION
The crisis of domestic violent extremism that is fueled by disinformation and propaganda
cannot be solved by law enforcement and security-based approaches. We must invest in
upstream strategies to keep communities safe from online harms. We seek a world in which
every community is equipped with the tools they need to reject harmful online propaganda,
conspiracy theories, and manipulative content without the need for censorship, surveillance,
banning, or security-based solutions. Thank you for your attention and I look forward to your
questions.
ENDNOTES
1. With gratitude to researchers and staff at American University’s Polarization and
Extremism Research and Innovation Lab (PERIL) who helped prepare this written
testimony: Dr. Pasha Dashtgard, Dr. Brian Hughes, Laura Kralicky, Wyatt Russell, and to
colleagues at the Southern Poverty Law Center for their support and partnership on many
of the intervention tools and guides cited above.
2. Anti-Defamation League, Center on Extremism. “Audit of Antisemitic Incidents, 2022.”
March, 2023. https://www.adl.org/sites/default/files/pdfs/2023-03/ADL-2022-Audit-of-
Antisemitic-Incidents-2021.pdf
3. United States Government Accountability Office, Report to the Ranking Member,
Committee on Homeland Security, House of Representatives. “DOMESTIC
TERRORISM Further Actions Needed to Strengthen FBI and DHS Collaboration to
Counter Threats.” GAO-23-104720. February, 2023. https://www.gao.gov/assets/gao-23-
104720.pdf
4. Anti-Defamation League, a Report from the ADL Center on Extremism. “Murder &
Extremism in the United States in 2022.” February, 2023.
https://www.adl.org/sites/default/files/pdfs/2023-02/Murder-and-Extremism-in-the-
United-States-in-2022.pdf
5. Anti-Defamation League, a Report from the ADL Center on Extremism. “Murder &
Extremism in the United States in 2022.” February, 2023.
https://www.adl.org/sites/default/files/pdfs/2023-02/Murder-and-Extremism-in-the-
United-States-in-2022.pdf
6. Weissman, Sara. “Suspect Identified in Bomb Threats Against HBCUs.” Inside Higher
Ed., November 16, 2022. https://www.insidehighered.com/news/2022/11/17/fbi-says-
most-bomb-threats-against-hbcus-made-minor
7. Miller-Idriss, Cynthia. “How to Counter Andrew Tate’s Growing Subculture of Violent
Toxic Masculinity.” MSNBC, February 7, 2023. https://www.msnbc.com/opinion/msnbc-
opinion/counter-andrew-tates-growing-subculture-violent-toxic-masculinity-rcna69411.
8. U.S. Department of Homeland Security. “Homeland Threat Assessment.” October, 2020.
https://www.dhs.gov/sites/default/files/publications/2020_10_06_homeland-threat-
assessment.pdf
9. Office of the Director of National Intelligence, “Domestic Violent Extremism Poses
Heightened Threat in 2021.” March 1, 2021.
https://www.dhs.gov/sites/default/files/publications/21_0301_odni_unclass-summary-of-
dve-assessment-17_march-final_508.pdf.
10. Executive Office of the President, National Security Council. “National Strategy for
Countering Domestic Terrorism.” June, 2021. https://www.whitehouse.gov/wp-
content/uploads/2021/06/National-Strategy-for-Countering-Domestic-Terrorism.pdf
11. START (National Consortium for the Study of Terrorism and Responses to Terrorism).
(2022). Global Terrorism Database 1970-2020 [data file]. https://www.start.umd.edu/gtd
12. U.S. Department of Justice, Federal Bureau of Investigation, Criminal Justice
Information Services Division. “Supplemental Hate Crime Statistics, 2021”. March,
2023. https://cde.ucr.cjis.gov/LATEST/webapp/#/pages/explorer/crime/hate-crime.
*Findings from the lab are included in the following works:
PERIL (Polarization & Extremism Research & Innovation Lab) & SPLC (Southern Poverty
Law Center). “Empowered to Intervene: An Impact Report on the PERIL/SPLC Guide
to Youth Radicalization.” 2021. https://www.splcenter.org/peril-assessments-
impact#impact-parents
PERIL & SPLC. “Building Networks & Addressing Harm: A Community Guide to Online
Youth Radicalization, Impact Study.” November, 2023.
https://www.splcenter.org/peril-assessments-impact#impact-networks
PERIL & SPLC. “Parents & Caregivers Longitudinal Impact Study.”
PERIL & Rosa Luxemburg Stiftung. “Resources to Combat Extremism: Impact Study
Report.” 2022.
PERIL & The Lumina Foundation. “Building Resilient & Inclusive Communities of
Knowledge.” July, 2022. https://perilresearch.com/wp-content/uploads/2022/12/PERIL-
Building-Inclusive-and-Resilient-Campuses-v15.2-FINAL.pdf