On December 21, 2012, Dr. Guy Hingston, a cancer surgeon from Port Macquarie in New South Wales, Australia, filed suit against Google in the U.S. District Court for the Central District of California. Dr. Hingston's complaint alleges that Google portrayed him in a "false light" through its "autocomplete" feature, because for at least some users entering his name into Google's search engine has triggered the option to search for the phrase "guy hingston bankrupt."
Dr. Hingston, of course, denies that he is bankrupt. This is not the first lawsuit against Google based upon autocomplete results; we have written before about suits in other countries. However, this is the first such suit filed in a United States court, raising the question of whether the suit could succeed under the laws of California and the United States, including the First Amendment.
"False light" is a notoriously ambiguous tort, but for the purposes of this post I will focus on California's interpretation of the tort. California's version of "false light" is very similar to defamation, except that while defamation involves false statements of fact about the plaintiff, false light involves false implications of fact. (For more on California's false light tort, see our Legal Guide entry). Nevertheless, false light is subject to limitations similar to those applicable to a defamation claim, several of which are fatal to this claim.
"Of and Concerning"
Let's start with the fact that the allegedly actionable publication must identify, or be "of and concerning," the plaintiff. See New York Times v. Sullivan, 376 U.S. 254, 288 (1964) (finding plaintiff's evidence constitutionally defective where it was "incapable of supporting the jury's finding that the allegedly libelous statements were made 'of and concerning'" him). In this case, yes, the autocomplete suggestion incorporates the name "Guy Hingston." But while this name is not quite as common as John Smith, it is a safe bet that there is more than one Guy Hingston in the world. Given that Google's autocomplete suggestions are drawn from actual searches by all of Google's millions of users, it is a stretch to presume that every autocomplete result must relate to a prior search directed at this particular Guy Hingston.
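To see why this matters, it helps to picture how a system of this general kind works. The sketch below is a deliberately simplified, hypothetical illustration (the class and method names are my own, and Google's actual system is vastly more sophisticated): suggestions are nothing more than frequency-ranked prefix matches over a log of queries typed by all users, so a suggestion containing a name carries no information about which person bearing that name earlier searchers had in mind.

```python
# Hypothetical, simplified autocomplete: NOT Google's actual algorithm.
# It only illustrates that suggestions are drawn from prior user queries,
# aggregated across all users, rather than from facts about any one person.
from collections import Counter


class Autocomplete:
    def __init__(self):
        self.query_log = Counter()  # aggregate query counts from ALL users

    def record_search(self, query: str) -> None:
        """Log a query exactly as a user typed it."""
        self.query_log[query.lower()] += 1

    def suggest(self, prefix: str, limit: int = 3) -> list:
        """Return the most frequent past queries starting with `prefix`."""
        prefix = prefix.lower()
        matches = [(q, n) for q, n in self.query_log.items()
                   if q.startswith(prefix)]
        matches.sort(key=lambda qn: -qn[1])  # rank by popularity
        return [q for q, _ in matches[:limit]]


ac = Autocomplete()
# These queries may concern entirely different people who share a name:
ac.record_search("guy hingston bankrupt")
ac.record_search("guy hingston surgeon")
ac.record_search("guy hingston surgeon")

print(ac.suggest("guy hingston"))
# → ['guy hingston surgeon', 'guy hingston bankrupt']
```

Nothing in the ranking step consults any fact about the world; the system simply echoes back what earlier users typed, which is the nub of the "of and concerning" problem for Dr. Hingston.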
By way of example, my name is Jeff Hermes, not a particularly common name. Entering my name into Google's search engine produces the autocomplete suggestion "jeff hermes merck." I have no connection to Merck & Co., nor would I have any reason to believe that someone had previously searched for me in connection with Merck. Rather, I would guess (and, as it turns out, be correct in guessing) that there is another Jeff Hermes who has some tie to the pharmaceutical company. Even if I had intended to search for information about myself when I entered my own name, it would not be reasonable to assume that the autocomplete result was "of and concerning" me.
Similarly, if Dr. Hingston were to enter his own name and see "guy hingston bankrupt," when he claims that he himself has no connection to a bankruptcy, the reasonable interpretation would be that there is another Guy Hingston out there somewhere whom some Google user thought might be involved in a bankruptcy. In this context, the mere fact that the name is identical does not mean that a reasonable user would understand the term in the search suggestion to refer to him. See Tamkin v. CBS Broadcasting, Inc., 193 Cal.App.4th 133, 146-47 (2011) (fact that character in work of fiction shared same name as plaintiff, "Scott Tamkin," was insufficient to show that work could be reasonably understood to refer to plaintiff).
Falsity
Next, to be actionable for "false light," the publication at issue must convey an implication that is provably false. If the statement "cannot be read to imply the assertion of an objective fact," the plaintiff's claim will fail. Partington v. Bugliosi, 56 F.3d 1147, 1157 (9th Cir. 1995). Dr. Hingston interprets the autocomplete result as implying that he personally is bankrupt or associated with a bankruptcy. But even if there were only one Guy Hingston in the world, the most that an autocomplete suggestion implies is that someone in the world once thought that they might find content of interest by searching the terms "guy hingston bankrupt." It does not, however, indicate that there is any content on the Internet actually responsive to that search, let alone any content that is damaging to Dr. Hingston.
It is possible that earlier searchers wondered whether Dr. Hingston was solvent and checked online using these search terms. Dr. Hingston alleges that such searchers would see no search results connecting him to a bankruptcy (see Complaint, ¶ 10), presumably resolving that question in his favor. Search terms by their nature do not constitute an assertion of objective fact, but rather a request submitted to Google for relevant content.
But even if Google's autocomplete suggestion were somehow treated as a factual statement, there would be interesting questions as to whether Dr. Hingston could prove falsity. Again, the most that the autocomplete suggestion might indicate is that Dr. Hingston was potentially connected to a bankruptcy in the mind of someone who has searched Google in the past. TechDirt has done some legwork on this issue, finding possible origins of the suggested search terms that might undermine a claim of falsity.
47 U.S.C. § 230
Even if the autocomplete suggestions can be interpreted as referring to Dr. Hingston, and even if they can be construed as both factual and false, the suggestions are the unadulterated search terms entered into Google's search engine by third-party users. As such, the claim would appear to run afoul of our old friend, Section 230 of the Communications Decency Act (47 U.S.C. § 230). Let us recite the mantra:
No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
Google is an interactive computer service. The search terms that users input are information provided by another information content provider. Google presents those search terms to other users based upon text matches with the later users' entries, but does not alter those terms in any way that would change their "meaning" (to the extent that they have meanings) and turn Google itself into an "information content provider." Therefore, Section 230 preempts state law liability for Google based upon the autocomplete suggestions.
In false light and defamation actions, context is king. The situation would have been very different if the words "guy hingston bankrupt" had been splashed across the front page of Dr. Hingston's hometown newspaper, but that's not what we have here. The same words appearing as suggested search terms on Google might be disturbing, but would convey virtually nothing about Dr. Hingston to any reasonable user. Toss in Section 230 for good measure, and this claim seems dead on arrival.
The case is being referred to mediation, and if that fails, I'm sure Google will move to get this case kicked out promptly. But for now, all that one finds when searching for "guy hingston bankrupt" on Google are page after page of reports about this lawsuit. If at any time these search terms might have led to damaging information about the plaintiff's finances, that information is now well hidden from the casual searcher.
Jeff Hermes is the Director of the Digital Media Law Project.