Does your website allow comments?  President Trump may be talking to you in his Executive Order on Preventing Online Censorship


In the wake of a whirlwind week on Twitter, during which President Donald Trump was challenged for sharing a debunked allegation regarding former Congressman and current MSNBC Morning Joe host Joe Scarborough and was fact-checked by Twitter itself over two voter fraud statements labeled as “potentially misleading,” the President responded with a strong rebuke, issuing an Executive Order on Preventing Online Censorship (the “EO”).  The Order seeks to affect social media platforms like Twitter and Facebook while also raising potential concerns for newspapers and businesses of all sizes that allow open comments on a website or app. 

The Order and its challenges – in a nutshell

The EO attempts to use creative means to curb what the President believes are biases by several Internet platforms.  These efforts include (1) filing a petition for rulemaking with the Federal Communications Commission (“FCC”) to issue a re-interpretation of Section 230; (2) directing the Federal Trade Commission (“FTC”) to examine whether the Internet platforms are engaging in deceptive practices; (3) working with states’ Attorneys General to determine whether state laws might be relevant; and (4) threatening to withhold any federal funds spent on advertising on these Internet platforms.  The Department of Justice may also conduct investigations as to whether these allegedly-biased Internet platforms are violating antitrust laws.

Back to the beginning: The history behind the Executive Order’s target - Section 230 of the Communications Decency Act

Section 230 (47 U.S.C. § 230) was a response to two New York cases decided in the 1990s.  In Cubby, Inc. v. CompuServe, Inc., 776 F. Supp. 135 (S.D.N.Y. 1991), the court held that an internet service provider could not be held responsible for defamation when it had not reviewed the content before it was posted.  In Stratton Oakmont, Inc. v. Prodigy Servs. Co., ‎23 Media L. Rep. 1794 (N.Y. Sup. Ct. 1995), the court determined that a web services company with moderated online bulletin boards had become a publisher that could be held responsible for defamatory postings even if it merely moderated some posts.  In an effort to balance these decisions, and particularly to respond to the ruling in Stratton Oakmont, Representatives Ron Wyden (D-OR) and Chris Cox (R-CA) drafted a bipartisan amendment to the Communications Decency Act to distinguish “providers of an interactive computer service” from traditional publishers of content in order to encourage free speech on the internet while allowing for the creation of standards for policing content and providing for the safety of children.  In fact, the initial bill was called the “Internet Freedom and Family Empowerment Act.”

Section 230 comprises two key components providing separate rationales for immunity.  The first component has been dubbed “the 26 words that created the internet.”  Under this first section, “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” § 230(c)(1).

The second component, supporting the family empowerment aspect of the bill, is the protection from liability for any provider and user of interactive computer services for “any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.”  § 230(c)(2)(A).   

Over the years, Section 230’s scope and interpretation have varied, with many calls for a “refresh” based on modern approaches to the Internet.  With such legislative changes out of sight at this time, courts have balanced a plethora of interesting facts and challenges as they have frequently upheld this defense for service providers since 1996.

The Directives Under the Executive Order


Section 1 provides the rationale for the EO.  Noting that the freedom of speech is “the bedrock of American democracy,” the EO asserts that “a limited number of online platforms” were hand-picking “the speech that Americans may access and convey on the internet.”  The EO maintains that any action taken by the platforms means that the platforms have ceased functioning “as passive bulletin boards, and ought to be viewed and treated as content creators” in a manner that alters and potentially diminishes the current protections within Section 230.   

Specifically, the EO maintains that platforms are “engaging in selective censorship” by flagging content that doesn’t violate platform terms of service, changing company policies in ways perceived as favoring certain viewpoints, and deleting accounts without warning or recourse.  As an example, the Order notes how recent tweets, such as the President’s assertions on voter fraud, were greeted with a warning label while Representative Adam Schiff, as the EO suggests, has been “peddling the long-disproved Russian Collusion Hoax,” thereby demonstrating “political bias.”   

Protection Against Online Censorship

In Section 2 of the Executive Order, the President outlines the policy to foster “clear, nondiscriminatory ground rules promoting free and open debate on the Internet.”  The goal of the EO is to assure that Section 230 immunity “should not extend beyond its text and purpose to provide protection for those who purport to provide users a forum for free and open speech, but in reality, use their power over a vital means of communication to engage in deceptive or pretextual actions stifling free and open debate by censoring certain viewpoints.”  Indeed, the EO concludes that Section 230 should assure that the internet is a “forum for a true diversity of political discourse” and that the statute should be construed to support that purpose. 

The EO, tracking the language in Section 230, notes that the law provides immunity when a provider does not moderate content or when it acts in “’good faith’ to restrict access to content that it considers to be ‘obscene, lewd, lascivious, filthy, excessively violent, harassing or otherwise objectionable.’”  The EO states that the provisions do not apply to nor extend to “deceptive or pretextual actions restricting online content or actions inconsistent with an online platform’s terms of service.”

As a result, the EO orders all executive departments and agencies to narrow the scope of interpretation of Section 230(c) in accordance with the White House interpretation.  It orders the Secretary of Commerce, in consultation with the Attorney General and through the National Telecommunications and Information Administration (NTIA), to file a petition for rulemaking with the FCC within 60 days that requests proposed regulations clarifying several asserted issues:

  • whether a provider who restricts content in a manner not related to obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable material (whether or not such material is constitutionally protected) can claim immunity
  • the meaning of “taken in good faith,” especially in the context of actions that appear to be (1) “deceptive, pretextual, or inconsistent with a provider’s terms of service” or (2) taken after inadequate notice, without a reasoned explanation, or without a meaningful opportunity to be heard

The President has also asked for any proposed regulations that the NTIA concludes should be appropriate to advance the policy noted in the Executive Order.

Protecting Federal Taxpayer Dollars from Financing Online Platforms That Restrict Free Speech

In Section 3 of the Order, the head of each executive agency and department is ordered to review that organization’s advertising and marketing spending paid to online platforms within 30 days and report the findings to the Director of the Office of Management and Budget.  The Department of Justice is ordered to undertake a review of “viewpoint-based speech restrictions imposed by each online platform” that is identified and assess “whether any online platforms are problematic vehicles for government speech due to viewpoint discrimination, deception to consumers, or other bad practices.”

Federal Review of Unfair or Deceptive Acts or Practices

Section 4 suggests that the United States has adopted the policy that a large social media platform is a traditional public forum.  To support its goals of promoting free expression and debate, the White House Office of Digital Strategy is ordered to redeploy the White House Tech Bias Reporting Tool to collect complaints of online censorship and other potentially unfair and deceptive acts and to submit them to the Department of Justice and the FTC.  The Order then commands the FTC to consider taking action for unfair and deceptive practices against any platform under 15 U.S.C. § 45.  Moreover, the Order maintains that “unfair or deceptive acts or practice may include practices by entities covered by section 230 that restrict speech in ways that do not align with those entities’ public representations about those practices.”  For large online platforms, the FTC is authorized to develop a report describing the claims and to make the report public. 

State Review of Unfair or Deceptive Acts or Practices and Anti-Discrimination

The EO orders the Attorney General to create a working group to assess the potential enforcement of state statutes relating to unfair and deceptive practices.  This mandate includes the development of model legislation where “existing statutes do not protect Americans from such unfair and deceptive acts and practices.”  The EO establishes that the working group will also collect information on:

  • increased scrutiny of certain users based on those they choose to follow or interact with
  • algorithms that may suppress content based on political alignment or viewpoints of users
  • policies allowing behavior asserted to be impermissible when committed by the Chinese Communist Party or other “anti-democratic” associations or governments
  • reliance on the use of third parties to review content that may demonstrate “indicia of bias”
  • acts that limit the ability of individuals of certain viewpoints to earn as much money as others similarly situated


The EO instructs the Attorney General to develop a legislative proposal to support the policy objectives of the Order.

Does this only apply to social media platforms?  If not, am I included?

The EO defines an “online platform” as any website or app where users can create and share content, engage in social networking, or perform a search on a search engine.  It includes Twitter, Facebook, Instagram, and other traditional social media entities, but it also includes newspapers with active comment sections, business websites that encourage customers to post comments or feedback, and interactive apps that allow for sharing information. 

What does it mean if providers and online platforms are considered “the public square?”

The EO states that “these platforms function in many ways as a 21st century equivalent of the public square.”  Public squares have historically constituted “public forums.”  Where a space is a public forum, blocking or prohibiting certain speech constitutes viewpoint discrimination that violates the First Amendment.  But the Supreme Court has held that to potentially qualify as a public forum for purposes of First Amendment analyses, the space must be owned or controlled by the government.

Can the President compel the FCC, an independent agency, to enforce or apply this interpretation of Section 230?

Many legal scholars have already opined that it may be a violation of the traditional view of separation of powers for the Executive Branch to interpret or command the interpretation of the law.  The EO may, therefore, find itself bearing little weight as it commands the FCC to interpret Section 230 in a distinct manner.  In Reno v. ACLU, 521 U.S. 844 (1997), the Supreme Court established the principle that courts, not an executive agency, definitively interpret the law.  Opponents may therefore argue that the White House cannot compel an independent agency to do anything, much less to definitively interpret the law, as that is the purview of the courts. 

Do we have to provide some form of “due process” if we are removing content?

The EO maintains that a lack of adequate notice, reasoned explanation, and meaningful opportunity to be heard are emblematic problems in today’s internet marketplace of ideas.  In order to support this, the EO directs the NTIA to create regulations that may impose such guidelines or mandates.

How can platforms and providers address potentially false and misleading content if it may be perceived as being discriminatory against a viewpoint?

Section 230 allows platforms and providers to address and restrict content that is “obscene, lewd, lascivious, filthy, excessively violent, harassing or otherwise objectionable.”  The EO seeks to protect against what it considers deceptive or pretextual actions that are “often contrary to their stated terms of service.”  The question then becomes whether false and misleading content meets the definition of “otherwise objectionable” under Section 230. 

Are there any other potential challenges that could block the enforcement of the Executive Order?

These various approaches suffer from significant legal shortcomings, including, most prominently, the First Amendment.  But the Executive Order cannot simply be ignored.  There are likely to be multiple lawsuits challenging the Order.  And assuming a petition for rulemaking is filed at the FCC, there will be a formal process with comments and reply comments at the FCC as to whether to grant the petition for rulemaking and begin a proceeding to re-examine Section 230.  The FTC may also open an investigation, although as an independent agency, the President cannot direct it to do so.  In addition, there may be attempts in Congress to revisit Section 230.  Ultimately, any real substantive change is unlikely to occur, but in the short term, there will be a great deal of activity and uncertainty as a result of this Executive Order.

Is there anything I should do in response to this Executive Order?

  1. Determine if this impacts you. Initially, you should consider whether you have a website or app that falls within the scope of an online platform.  Although the EO may be a direct response to Twitter, it can impact newspapers or any business with an online platform.
  2. Prepare to comment on the petition for rulemaking. As the rules roll out, assuming that a lawsuit does not limit or negate the scope of the Executive Order before the rulemaking is drafted, your Butzel attorney is ready to provide assistance in preparing a response to this potentially significant regulatory change.
  3. Depending on the landscape, prepare for a review and revision of your Terms of Use. If and when these concepts become regulation, it’s clear that an organization’s strategy for monitoring inappropriate content must be assessed.  At that time, reach out to your Butzel attorney.
  4. Consult with your Butzel attorney. Reach out with any specific questions on the President’s Executive Order on Preventing Online Censorship and how any of its policies might impact you or your business.

Jennifer Dukarski

Robin Luce Herrmann

Steve Goodman
