As a lead up to the launch of the Citizen Media Law Project's Legal Guide in January, we'll be putting up longer, substantive blog posts on various subjects covered in the guide. This first post in the series stems from a talk I gave at the Legal Risk Management in the Web 2.0 World conference in Washington, DC. As the token academic, I had the task of providing a general overview of the liability that publishers might face if they allow users to comment on or submit content to their sites. I've adapted this post from that talk.
I'll provide some brief background on section 230 of the Communications Decency Act ("CDA 230") and highlight the types of claims and online activities it covers as well as the types of activities that might fall outside CDA 230's immunity provisions.
Publisher and Distributor Liability
Before I discuss the ins and outs of CDA 230, however, I want to highlight the difference between publisher and distributor liability. Under standard common-law principles, a person who publishes a defamatory statement by another bears the same liability for the statement as if he or she had initially created it. Thus, a book publisher or a newspaper publisher can be held liable for anything that appears within its pages. The theory behind this "publisher" liability is that a publisher has the knowledge, opportunity, and ability to exercise editorial control over the content of its publications.
Distributor liability is much more limited. Newsstands, bookstores, and libraries are generally not held liable for the content of the material that they distribute. The concern is that it would be impossible for distributors to read every publication before they sell or distribute it, and that as a result, distributors would engage in excessive self-censorship. In addition, it would be very hard for distributors to know whether something is actionable defamation; after all, speech must be false to be defamatory.
Not surprisingly, the first websites to be sued for defamation based on the statements of others argued that they were merely distributors, and not publishers, of the content on their sites. One of the first such cases was Cubby v. CompuServe, Inc., 776 F.Supp. 135 (S.D.N.Y. 1991). CompuServe provided subscribers with access to over 150 specialty electronic "forums" that were run by third parties. When CompuServe was sued over allegedly defamatory statements that appeared in the "Rumorville" forum, it argued that it should be treated like a distributor because it did not review the contents of the bulletin board before it appeared on CompuServe’s site. The court agreed and dismissed the case against CompuServe.
Four years later, a New York state court came to the opposite conclusion when faced with a website that held itself out as a "family friendly" computer network. In Stratton Oakmont v. Prodigy, 23 Media L. Rep. 1794 (N.Y. Sup. Ct. 1995), the court held that because Prodigy was exercising editorial control over the messages that appeared on its bulletin boards through its content guidelines and software screening program, Prodigy was more like a "publisher" than a "distributor" and therefore fully liable for all of the content on its site.
The perverse upshot of the CompuServe and Stratton opinions was that an online information provider that made any effort to restrict or edit user-submitted content on its site faced a much higher risk of liability if it failed to eliminate all defamatory material than if it simply made no attempt to control or edit third-party content at all.
Passage of the Communications Decency Act
This prompted Congress to pass the Communications Decency Act in 1996. The Act contains deceptively simple language under the heading "Protection for Good Samaritan blocking and screening of offensive material":
No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
CDA 230 further provides that "[n]o cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section."
So is an "interactive computer service" some special type of website? No. For purposes of CDA 230, an
"interactive computer service" means any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server.
Most courts have held that through these provisions, Congress granted interactive services of all types, including blogs, forums, and listservs, immunity from tort liability so long as the information is provided by a third party.
As a result of CDA 230, Internet publishers are treated differently from publishers in print, television, and radio. Let's look at these differences in more detail.
Claims Covered by CDA 230
CDA 230 has most frequently been applied to bar defamation-based claims. In the typical case, a plaintiff who believes she has been defamed sues both the author of the statement and the website that provided a forum or otherwise passively hosted the material. Courts have held with virtual unanimity that such claims against a website are barred by CDA 230.
But immunity under CDA 230 is not limited to defamation or speech-based torts. Courts have applied CDA 230 immunity to bar claims such as invasion of privacy, misappropriation, and most recently in a case brought against MySpace (Doe v. MySpace, 474 F.Supp.2d 843 (W.D. Tex. 2007)), a claim asserting that MySpace was negligent for failing to implement age verification procedures and to protect a fourteen-year-old from sexual predators.
I should note, however, that CDA 230 explicitly exempts from its coverage criminal law, communications privacy law, and "intellectual property claims." In interpreting these exclusions, courts agree that Congress meant to exclude federal intellectual property claims, such as copyright and trademark, but they disagree whether state-law intellectual property claims (or claims that arguably could be classified as intellectual property claims, such as the right of publicity) are also exempted from the broad immunity protection CDA 230 provides.
Finally, CDA 230 does not immunize the actual creator of content. The author of a defamatory statement, whether he is a blogger, commenter, or anything else, remains just as responsible for his online statements as he would be for his offline statements.
Online Activities Covered by CDA 230
Courts have consistently held that exercising traditional editorial functions over user-submitted content, such as deciding whether to publish, remove, or edit material, is immunized under CDA 230. As one moves farther away from these basic functions, immunity may still exist, but the analysis becomes more fact-specific. While the case law addressing some of these activities is still developing, generally speaking CDA 230 provides immunity for the following actions:
- Screening objectionable content prior to publication. This is the quintessential activity that CDA 230 was meant to immunize, and courts have consistently held that screening content prior to publication does not make an interactive computer service liable for defamatory material it does publish on its site.
- Correcting, editing, or removing content. A website operator may take an active role in editing content, whether for accuracy or civility, and it will still be entitled to CDA 230 immunity so long as the edits do not substantially alter the meaning of the content (i.e., make an otherwise non-defamatory statement defamatory). In an interesting case involving New Jersey politics, Stephen Moldow ran a website and forum where users criticized local elected officials. Moldow regularly deleted offensive messages, gave guidelines for posting, and edited and re-posted messages to remove obscenities. Although the plaintiffs argued that Moldow participated in the creation of the defamatory content and should therefore be held liable, the court concluded that Moldow's activities were nothing more than the exercise of traditional editorial functions and thus immunized under CDA 230. Donato v. Moldow, 865 A.2d 711 (N.J. Super. Ct. 2005).
- Soliciting, encouraging, or selecting content for publication. Two cases illustrate the scope of this immunity. In Corbis Corporation v. Amazon.com, Inc., 351 F.Supp.2d 1090 (W.D. Wash. 2004), the court immunized Amazon.com from Washington State Consumer Protection Act and tortious interference with business relations claims even though Amazon solicited and encouraged third parties to post images and other content on its site. And in Batzel v. Smith, 333 F.3d 1018 (9th Cir. 2003), the court granted immunity to a museum administrator who selected, edited, and then published on the museum's listserv and website emails he had personally received that claimed Batzel possessed paintings looted by the Nazis during WWII.
- Paying a third party to create or submit content. So long as the author of the material is not your employee (typically a question of state agency law), you will not lose CDA 230 immunity if you pay for the content. One of the first cases to test this involved Matt Drudge, who in the late nineties received all of his income from AOL, which paid him for his popular gossip column and exercised "certain editorial rights with respect to the content." When Sidney Blumenthal sued Drudge and AOL for defamation, the court concluded that the payments to Drudge did not make him an AOL employee nor did they make AOL responsible for his postings and held that CDA 230 immunized the service. Blumenthal v. Drudge, 992 F. Supp. 44 (D.D.C. 1998).
- Providing forms or drop-downs to facilitate user submission of content. Most courts have held that a website will not lose immunity if it facilitates the submission of user content through forms and drop-downs. For example, in a case involving Matchmaker.com, a B-list actress sued the operator of the site after a user created a fake profile that listed her name and home address and indicated an interest in finding a sexual partner. The Ninth Circuit Court of Appeals held that the website was immune from the actress' claims of invasion of privacy, misappropriation, defamation, and negligence, noting that while "the questionnaire facilitated the expression of information by individual users[,] the selection of the content was left exclusively to the user." Carafano v. Metrosplash.com, 339 F.3d 1119 (9th Cir. 2003). (As I discuss below, a recent decision by another panel of the Ninth Circuit casts some doubt on the Carafano decision. See Fair Housing Council of San Fernando Valley v. Roommates.com.) In a similar case involving a publisher of business databases, an anonymous user used an entry form to submit false information about David Prickett and his wife indicating that they were in the adult entertainment and lingerie business. The court rejected the Pricketts' argument that infoUSA should lose its immunity because it helped create the information by supplying a form and drop-down boxes, concluding that it was the anonymous third party who actually made the choice and submitted the information. Prickett v. infoUSA, Inc., 2006 WL 887431 (E.D. Tex. Mar. 30, 2006).
- Leaving content up after you have been notified that the material is defamatory. CDA 230's immunity provisions do not require that you remove content from your site after you have been notified that the material is defamatory. In the well-known Zeran v. America Online case, an AOL user posted messages purporting to offer for sale items that supported the Oklahoma City bombing and falsely included Kenneth Zeran's contact information. Despite Zeran's repeated demands that AOL take down the messages, they remained on the site until he filed a lawsuit. In an early test of CDA 230's scope, the U.S. Court of Appeals for the Fourth Circuit held that CDA 230 immunizes interactive computer services even if they have been notified that the material is defamatory. See Zeran v. America Online, Inc., 129 F.3d 327 (4th Cir. 1997).
Online Activities Not Covered by CDA 230
While most courts have held that CDA 230 grants interactive computer services broad, expansive immunity, they have often recognized this immunity with some reluctance and have occasionally looked for ways around it. Early on, most courts that tried to hold service providers liable were trial courts that eventually found themselves reversed on appeal.
Lately, however, some appellate courts have been willing to limit CDA 230's immunity. This has primarily involved two types of activities by online publishers:
- Editing of content that materially alters its meaning. If you edit content created by a third party and those edits make an otherwise non-defamatory statement defamatory, you will likely lose your immunity under CDA 230. Where this line is, however, remains unclear. Obviously, if you remove the word "not" from a sentence that reads "Jim Jones is not a murderer," you will have substantially altered the meaning of the sentence and made an otherwise non-defamatory statement defamatory.
- Engaging with users through drop-down forms to create discriminatory content. In a case that appears to be in direct conflict with the Carafano decision mentioned earlier, the Ninth Circuit Court of Appeals held that Roommates.com was not immune from claims under the Fair Housing Act and related state laws because it "created or developed" the forms and answer choices that those seeking to use the service had to fill out. For example, anyone seeking a roommate had to provide information about themselves, such as "male" or "female," and indicate who else lived in the house (e.g., "straight males," "straight females," "gay males," or "lesbians"). All prospective users had to choose from a drop-down menu to indicate whether they were willing to live with "straight or gay males," only "straight males," only "gay males," or "no males" and had to make comparable selections pertaining to females. In a fractured opinion, the court reasoned that by requiring members to answer questions, Roommates.com was essentially causing users to make discriminatory statements. In addition, the court held, Roommates.com also bore liability because it permitted users to search the profiles of other members with certain compatible preferences (e.g., search only for females with no children). Fair Housing Council of San Fernando Valley v. Roommates.com, CV-03-09386-PA (9th Cir. 2007). The Ninth Circuit recently agreed to rehear this case en banc, so we can expect some clarification, or a possible reversal, soon. (For a recent update on the en banc rehearing, see Eric Goldman's Technology & Marketing Law Blog.)
It has now been more than ten years since Congress enacted section 230 of the Communications Decency Act. During that time courts have held that CDA 230 grants interactive online services of all types, including blogs, forums, and listservs, broad immunity from tort liability so long as the information at issue is provided by a third party. Relatively few court decisions, however, have analyzed the scope of this immunity in the context of "mixed content" that is created jointly by the operator of the interactive service and a third party through significant editing of content or shaping of content by submission forms and drop-downs. Accordingly, this is an area that we will be watching carefully and reporting on in the future.
So what are the practical things you can take away from this discussion? Here are five:
- If you passively host third-party content, you will be fully protected against defamation and defamation-like claims under CDA 230.
- If you exercise traditional editorial functions over user-submitted content, such as deciding whether to publish, remove, or edit material, you will not lose your immunity unless your edits materially alter the meaning of the content.
- If you pre-screen objectionable content or correct, edit, or remove content, you will not lose your immunity.
- If you encourage or pay third parties to create or submit content, you will not lose your immunity.
- If you use drop-down forms or multiple-choice questionnaires, you should be cautious about allowing users to submit information through these forms that might be deemed illegal.