There are always lawsuits against big targets, and initially the suit filed by KinderStart against Google, in which the parenting site/vertical search engine complained that its ranking had been intentionally lowered, felt like a nuisance and nothing more. But sometimes interesting things come out of these complaints.
I was reading last Friday’s CNET coverage, for instance, and a Google lawyer was explaining Google’s ranking. OK, well, I know enough about that to notice when things are being spun, and the spin was clearly on. Because it suited the defense, the lawyer decided to argue that Google’s index was subjective – i.e., that Google made editorial decisions about each site’s quality.
Now, that strikes me as a bit askew from public perception, and from what Google encourages us to think, in general, about how it ranks. Here’s how CNET covered it:
David Kramer, a Wilson Sonsini attorney also representing Google, said the search giant’s PageRank system is subjective, using a combination of reviews into whether a Web site is adhering to its guidelines and is worth a user’s time to view.
“Google is constantly evaluating Web sites for standards and quality, which is entirely subjective,” Kramer said.
The judge probed Kramer on the topic of whether Google engages in misleading behavior, and whether it uses objective criteria to evaluate sites–rather than solely relying on subjective reasoning.
“What if, say, Google says it uses facts one through 10 to evaluate a site, but actually uses number 11 to decide its rank. Isn’t that misleading?” the judge asked.
Kramer, however, said Google readers understand that the site’s ranking system is subjective and based on Google’s opinion about whether a site is worth viewing.
Google’s opinion? Really? Huh. That’s a new one. More on this as I grok it.
48 thoughts on “Now, This Reads Wrong to Me”
Wow. Here’s one “Google Reader” who had no idea Google’s subjective “opinion” had anything to do with rank.
John, they try to be objective, but the factors they use are subjective, since they determine what those factors are. When you point out the universal factors that determine a website’s quality, then I’ll believe that Google is truly objective.
Of course, they don’t review every single website, but they use their own editorial criteria in their algorithms to automatically rank websites.
I may think my commercial website should be number one because of X, Y, Z; Google may say it’s not because of factors A, B, C – Google argues that their factors are objective, I argue that my factors are objective. Who’s right? They are all subjective.
I sure thought PageRank was a computerized algorithm. I would have bet large amounts of (someone else’s) money on it. If it turns out an “11th factoid” IS used then I suspect the triumvirate will have to circle the wagons quickly.
I kinda doubt there’s anything other than a new algo in the wild.
John, has anyone successfully sued a search engine for loss of placement? I recall the case in your book but… anything that can be traced back to editorial discretion for placement? DMOZ and the like…
Or: There is a difference between subjectivity in feature selection, and subjectivity post hoc, in the rankings. The expectation is that, for two web pages, if their subjectively-selected objective features are all “equal”, they will be ranked similarly.
And that’s the rub. Sure, everyone knows that feature selection is a subjective process. That is what made Google great.. they subjectively chose to use inter-document links as a feature when the only other person doing it was Eugene Garfield. And thus PageRank was born. But once they subjectively chose to use the link structure feature, there is still the expectation that Google lets the objective link data speak for itself. (You know…Marissa Mayer’s whole spiel about data, not politics, ruling the Google roost?)
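To make “letting the objective link data speak for itself” concrete, here is a minimal PageRank sketch in Python. The tiny graph and the 0.85 damping factor are illustrative assumptions, not Google’s actual parameters; the point is that once the subjective choices are fixed (use links as the feature, pick a damping value), every page’s score follows mechanically from the same computation:

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page gets the same teleportation share...
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # ...plus an equal split of each inlinking page's rank.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
            else:
                # Dangling page: distribute its rank evenly to everyone.
                for target in pages:
                    new[target] += damping * rank[page] / n
        rank = new
    return rank

# Tiny illustrative web: a -> b, c; b -> c; c -> a.
scores = pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"]})
```

Here “c”, which collects links from both “a” and “b”, ends up ranked highest – with no per-page editorial input anywhere in the loop.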
Now, we have quotes like this above: “[Google’s ranking system] us[es] a combination of reviews into whether a Web site is adhering to its guidelines and is worth a user’s time to view. Google is constantly evaluating Web sites for standards and quality, which is entirely subjective.”
To me, this flies in the face of everything I heard coming out of Google since 1998. Google used to brag about how all of its rankings were exclusively machine generated, that there is no human interference in the loop (beyond, of course, the initial feature selection).
Maybe it is good for Google to do this. One reason to do this is to fight adversarial IR (i.e. website owners and/or SEOs trying to “game” the subjective set of features Google chooses with objective feature values that make the page appear more relevant than it really is). But when you add this to a growing list of “editorializations” of the web.. such as China censorship.. you begin to wonder how neutral the whole ranking system really is anymore.
Successfully combating link farms and other forms of web spam always involves humans judging the quality of sites.
How many sites are judged in this way? It varies. Sometimes a very small set just to build algorithms which then perform these judgments on web scale (1). Sometimes a somewhat larger but still small set to enable automated identification of “bad neighborhoods” (2). At the other end, sometimes armies of workers will manually flag as much spam as they can (3). Even if Google doesn’t practice 3, it’s quite possible they practice 2, and certain that they practice 1.
I wonder if this is a lawyer not expressing things clearly. As Gabe says, there’s some human judgement in determining whether a site is spammy or not (see Matt Cutts’ blog). I imagine that Google would try to find generalizable rules for this, but as black hat SEOs try new things to game the system, there will be new edge cases that humans will have to examine.
Google’s duality on this point is at its core. Google perceives itself as an algorithms company that spits out objective results, but for some time, its search results have been the product of subjective decisions. I’ve documented numerous examples of this duality in this article. However, I reach a different policy conclusion–I don’t have any problem with search engine subjectiveness, which I think is unavoidable for maximum relevancy.
IMHO, Google PageRank is technically subjective because, although no person is involved during operation, algorithms collectively called PageRank were selected by people.
However, I don’t think most Google users understand this.
I agree with Don. The fact that PageRank is always being tweaked to reveal the “best” results means that it’s always being tweaked further into subjectivity.
“Best” or even “relevant” is a subjective distinction on its own.
PageRank, as a set of algorithms, may be automated, but not objective.
I think every computerized and automated algorithm is subjective -> every one of them was written by a human!
It’s just lawyerspeak!
I think John and most commenters are looking at it from the wrong angle. This is a lawsuit, hence, this is lawyerspeak! Lawyers, and I assume David Kramer is no exception, don’t philosophize about ‘subjectivity’ or ‘opinions’, at least not outside the courtroom before an actual trial. It may ‘read wrong’ to you and me, but Kramer is only referring to prior case decisions, in particular SearchKing v. Google three years ago. Back then, Judge Vicki Miles-LaGrange ruled, according to CNET:
“PageRanks are opinions–opinions of the significance of particular Web sites as they correspond to a search query,” according to the decision filed in the U.S. Western District Court of Oklahoma.
“The court simply finds there is no conceivable way to prove that the relative significance assigned to a given Web site is false,” the decision said. “Accordingly, the court concludes Google’s PageRanks are entitled to full constitutional protection.”
A curious ‘opinion’ to say the least, but one for which Miles-LaGrange should be queried, not Google or Google’s attorney. BTW, David Utter of WebProNews explained this case law more thoroughly back in March when Kinderstart filed their suit against Google.
Andreas – an excellent note.
JG – certainly Matt Cutts can and (I think rarely) does exclude sites on the basis of his (somewhat) subjective decisions. There’s nothing wrong with this and he’s explained the decisions about it at some length on his blog.
But I think Kinderstart’s problem was the same as a travel site I run – downranking for mysterious reasons with continued inclusion in the index. This happens for many, many reasons most of which are poorly explained by Google support and some I think capricious on Google’s part.
I’ll just me-too the sentiment that you’re reading too much into a lawyer’s phrasing. Explaining a complicated algorithm is difficult, explaining it in legal conceptual terms even more so.
Here’s a simple way of putting it: The algorithm’s *factors* are subjective (granted many people don’t grasp this, and believe an algorithm itself has no values). The algorithm’s *result* is objective, once the *factors* are set. More or less, Google sets some wide factors, and doesn’t tweak individual sites except for spam and government censorship.
That’s all he’s saying.
There might also be a reference there to TrustRank, with a relatively small set of sites marked as trusted seed sites.
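For reference, TrustRank is essentially PageRank with the teleportation mass restricted to a hand-picked set of trusted seed pages, so trust flows outward (with decay) from the human-vetted sites. A minimal sketch, with an invented graph, seed set, and damping value:

```python
def trustrank(links, seeds, damping=0.85, iterations=50):
    """Biased PageRank: teleportation mass goes only to trusted seeds.

    links: dict mapping each page to its list of outlinked pages
    (every page appears as a key, with a non-empty outlink list here).
    """
    seed_mass = 1.0 / len(seeds)
    trust = {p: (seed_mass if p in seeds else 0.0) for p in links}
    for _ in range(iterations):
        new = {p: ((1.0 - damping) * seed_mass if p in seeds else 0.0)
               for p in links}
        for page, outlinks in links.items():
            # Trust propagates along links, attenuated by the damping factor.
            for target in outlinks:
                new[target] += damping * trust[page] / len(outlinks)
        trust = new
    return trust

web = {"seed.example": ["good.example"], "good.example": ["seed.example"],
       "spamA": ["spamB"], "spamB": ["spamA"]}
trust = trustrank(web, {"seed.example"})
```

The spam cluster, unreachable from the seed, ends up with zero trust no matter how densely it links to itself – which is the appeal of the approach for fighting link farms.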
There is actually an objective way of scoring websites: you give tasks to a large group of users and see which sites give them the best solutions. Of course, for each individual user “best” may be subjective, but aggregated across enough users to achieve statistical significance you will end up with objective scores.
For example, in a recent eyetracking study one of our test tasks was “tie a bowline knot.” We gave users a rope and watched them tie the knot. The site that scored highest on Google for most of the users’ queries was not the best at helping people tie the knot correctly (i.e., the knot didn’t hold). So in this case Google’s ranking was not optimal.
In practice, of course, you can’t perform such user testing with hundreds of users for each of the millions of tasks that people turn to the Internet to perform. However, one could collect objective data to help guide the automated scoring.
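That aggregation step can be sketched in a few lines: record each user’s task outcome per site and compute a per-site success rate. The observations below are invented purely for illustration:

```python
from collections import defaultdict

def success_rates(observations):
    """observations: list of (site, task_succeeded) pairs, one per user trial."""
    wins = defaultdict(int)
    trials = defaultdict(int)
    for site, succeeded in observations:
        trials[site] += 1
        wins[site] += int(succeeded)
    # Individual outcomes are noisy and subjective; the aggregate,
    # over enough users, is a measurable statistic.
    return {site: wins[site] / trials[site] for site in trials}

obs = [("knots.example", True), ("knots.example", True),
       ("toprank.example", True), ("toprank.example", False)]
rates = success_rates(obs)
```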
All of this is irrelevant for the lawsuit: In the United States, there should be a clear First Amendment case that any publisher has the right to provide any recommendations they like, whether or not they are correct. If you ask me what’s the best Indian restaurant in Silicon Valley, I’ll tell you the one I prefer, whether or not it’s the one preferred by the largest number of other people. Same for any guidebook publisher. In the long run, customers will buy those guidebooks that recommend the best restaurants, just as they will use those search engines that recommend the best websites. But there can never be an obligation to recommend anybody, or we will have lost our freedom of speech.
It’s clear to me that Google uses hand interventions on certain occasions.
In the past few months I see enormous problems in Google’s infrastructure. They might need an army of reviewers to fix all the damage the automatic filters have done.
Joe, I agree with what you say: complete removal from the index (Cutts’ job) is a different creature than capricious reranking of items still in the index. I think the latter is what the lawsuit is about.
Seth, if by “factors” you mean “features”, then I think we are saying the same thing…in that the features are subjectively chosen, but objectively (except in the case of spam and Chinese censorship) applied.
However, I disagree that this is all the lawyer is saying. It really sounds like he is saying that humans go in and tweak the rankings. Here, let us read it again: “[Google’s ranking is done] using a combination of reviews into whether a Web site is adhering to its guidelines and is worth a user’s time to view.”
The word “review” makes it sound very human-done, not machine-done. Determining whether it is “worth a user’s time to view” also sounds very human-done. It does not sound like a subjective selection of an objective feature. Objective features would be things like term frequencies, or the decision to match morphological variations (stemming, i.e. car/cars) or not. Those are objective features. How much to weight term frequency, or whether to turn stemming on or off is the subjective part of the decision. I think everybody agrees that it is ok for Google to make those sorts of subjective decisions using objective features.
But going in and having a human do a review, and say “this page is worth more than that page”, maybe because it has a better known brand or whatever.. THAT goes against everything I thought Google stood for.
Finally, Jakob: You make a good point about freedom of speech and recommendation engines. But since when did Google become a web recommendation engine? I thought it was a web search engine. If I wanted a recommendation engine, I’d go to Digg. My impression was that Google was for searching, not for recommending.
But maybe that’s what Google really is. Just an opinionated recommender. In that case, I certainly hope that the Google Print project does not get very far. What good is all that access to libraries full of knowledge, when some Google opinion-maker is just sitting there at the backend, determining which books are “worth a user’s time to view”?
No, thank you. I’d rather just go the real library and look through the card catalogue, and find all the references, than have Google downrank (past 1000 where nothing more is shown) certain items it thinks are not “worthy”.
So typical – in Germany we say:
I only trust statistics that I have forged myself!
Seems like we need a search engine neutrality law. It is dangerous and against online liberty if a single company can decide which websites we visit.
Search engine neutrality really is web neutrality.
JG – I think the lawyer was merely saying on that specific point, in English: “Google sometimes makes a human review of a site to determine if it’s a spam-type site”.
I don’t think they e.g. change a site from PageRank 5 to PageRank 7, or vice-versa, based on a human-reviewed quality score. But they will set a site to PageRank 0 based on a human-reviewed determination that it’s playing spam games. And apparently that’s what happened to KinderStart.
Of course page rank is subjective. Most Google users don’t understand just how subjective the results are. This in itself is not bad, but I don’t believe there is enough transparency in Google’s stated methodology to escape accusations of deception.
My site suffered a similar fate to Kinderstart. 17,000 unique pages indexed went down to 600. All keywords dropped below the first 5 pages of results. Zero response from any human at Google.
What is interesting is that out of all these pages, the one page that now comes up top when searching on the business name is an article we wrote on how Google shares rose on the news of their partnership with Adobe. The only page on the whole site that mentions Google. Hmmm … imo a clear and amusing example of positive site-unique bias.
And thus explains how Google has monopolized the search engine (industry?)
JG, I’m afraid you’re mistaken about both search engines and libraries:
But since when did Google become a web recommendation
engine? I thought it was a web search engine.
All search engines not only find pages containing the keywords you entered, but also order (rank) the results. The ordering is some sort of “recommendation” of which results are more worthy of the user’s attention. Search engines use a variety of factors to determine this, including the contents of the page, the links to the page, etc. Determining how important the various factors are requires judgement, and there is no objective standard for that.
> I’d rather just go the real library and look through
> the card catalogue, and find all the references….
No library has infinite resources, so librarians select books using (again) a variety of factors, some fairly objective (bestsellers) and others less so (reviews in various journals, local interests, etc.). A few libraries (Library of Congress etc.) do try to get a copy of everything, but very few library patrons need or want that….
The patina of scientific objectivity has fueled the massive interest in search engines by users and the press, from the early days on. More recently it has underpinned huge valuations that allowed search engine tools to grow into media powerhouses capable of swallowing other businesses.
The reality has always been that these are publishers with subjective methodologies. As others here have shown and as the court will show, such publishers’ judgments must be protected under the constitution, regardless of how cleverly they’ve used the aura of mathematical objectivity to tap into the hearts and minds of millions… and the capital markets. Nor is it relevant from a legal standpoint that the CEO wants us to believe that it’s “just computers,” when the company needs to dodge tough questions about its editorial role in Google News, for example (though such inconsistencies and misdirection surely don’t endear Google to their loyal audience).
Remember Yahoo!?! They started as a directory of recommended web pages.
The point, of course, is not that Google regularly makes real editorial judgments and uses humans in quality review processes, though they do. It’s that any ranking method is subjective. Google could introduce a randomizer if it feels like it, and users could choose to use or not use the search tool as long as there is some basic clarity to how a search tool works. Or could it? What factors *can* they introduce? Could they stray far enough from how they publicly promote themselves that they could be in legal trouble? I’d like to hear more enlightened legal opinions on that one (as opposed to knee jerk ranting by aggrieved litigants).
I suppose that may become a problem. Search engines must be secretive about their algorithms, but must avoid deep deception of consumers (which can become a legal issue) or “corruption” (quietly helping some companies or causes based on personal agendas or profit motives – which can be at least a moral and practical business issue and possibly a legal issue).
In conclusion, Google will win this one, but not all cases are going to be clear cut if search engines become too “corrupt” or deceptive. There seems to be a certain cachet of science that has been promoted by these companies that does not resemble precedents with simple listings businesses like the Yellow Pages. And this could be at odds with specific chatter that may circulate inside search engine companies about the need to tweak algorithms specifically to “get” or “knock out” specific companies — especially competitors. How far will the law protect such “subjective” decisions if consumers believe in a different model of how these listings are arrived at?
Stavros: You don’t need to explain to me how search engines work; I’ve been coding all sorts of search algorithms for a decade.
Yes, I know that search engines rank results. So maybe this is just a quibble over semantics, but they don’t “recommend” documents. They “return” documents. Think about the earliest days of electronic search, before there was even a World Wide Web. Back then, search engines were boolean, which means that they didn’t even rank documents. They just returned documents. They didn’t “recommend” a set of documents. They just “returned” a set of documents. That is why it is called information retrieval and not information recommendation.
But look, no matter what we call it, what you are saying is true. In a search engine, objective factors are combined in a subjective manner, and there is no standard for that.
But the key point in all this is that Google’s combination has always purportedly been consistent. Just because there is no standard way to do the combination doesn’t mean there is one subjective combination for one web page, and a different subjective combination for another web page. Google has always led us to believe that the subjectivity stays the same from page to page. It is only the values of the objective factors that change. In other words, Google believes in the almighty Algorithm.
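A toy sketch of that distinction, with made-up feature names and weights: the weights are the subjective choice, the per-page feature values are the objective measurements, and applying the same formula to every page is what “consistency” means here:

```python
# Subjective part: which features matter, and how much (chosen once).
WEIGHTS = {"term_frequency": 0.4, "inlink_count": 0.5, "title_match": 0.1}

def score(features):
    # Objective part: the same formula applied to every page's measured values.
    return sum(WEIGHTS[name] * value for name, value in features.items())

def rank(pages):
    """pages: dict mapping page -> feature dict. One formula, uniformly applied."""
    return sorted(pages, key=lambda p: score(pages[p]), reverse=True)

pages = {
    "pageA": {"term_frequency": 0.9, "inlink_count": 0.2, "title_match": 1.0},
    "pageB": {"term_frequency": 0.3, "inlink_count": 0.8, "title_match": 0.0},
}
```

Changing WEIGHTS changes everyone’s rank at once; what the comment objects to is a different, per-page formula.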
So what it sounds like to me is that, from the lawyerspeak, Google has admitted to inconsistent application of its Algorithm from page to page, through human reviews. This is a departure from the almighty Algorithm, into truly inconsistent “opinion” territory. (Seth, I know you disagree.) So, at least to my understanding, that is what the lawsuit is about…the inconsistency.
JG – Again, Google has always made it clear that they will remove or zero sites they deem spammers, no matter what else The Algorithm says. This is not new. They’re not inconsistent on the basis that this disclaimer doesn’t appear on every thumbnail PR description.
The real concern is the stated policy by one of Google’s Engineers – that certain SEO tactics – frowned upon by Google – could mean the BANNING of an entire DOMAIN – even if it boils down to JUST one page – or a Paragraph on one page – or even certain types of redirects.
Counter arguments were voiced about just limiting the Banning to just that one page – or just changing the Algos to neutralize the questionable tactics from being influential in the SERPS.
This type of Lawsuit is welcomed; there must be open arguments and debates about Algos and about equity.
Search has become extremely influential in society.
John I’d spotted those comments from their lawyer as well, and was surprised as it’s the first time publicly they admitted it, although I thought most people were already aware of this because of the eval lab.
I also looked at the Kinderstart website last year and it has some technical errors, so I don’t think they have a case against Google.
Interestingly though the lawyer admitting that there is a human review process, might work against Google as I believe a few years ago Google won a similar case by saying that the whole process was automated and no human intervention. I’ll try and find the case if I can.
Think it might have been the SearchKing case?
Anyway – PageRank plays a big role for all SEOs. Whether it is very important for most users? I don’t know.
Seth: I am not talking about removal of spammers from the index. I am talking about the sort of situation Joe Hunkins describes above:
“But I think Kinderstart’s problem was the same as a travel site I run – downranking for mysterious reasons with continued inclusion in the index. This happens for many, many reasons most of which are poorly explained by Google support and some I think capricious on Google’s part.”
..and that Chris St. Cartmail describes above:
“All keywords dropped below the first 5 pages of results. Zero response from any human at Google.”
In both those cases, as well as (perhaps) in the Kinderstart case, it doesn’t sound like we’re talking about spammers. Otherwise, the pages would be gone from the index. No, in both cases above, the pages are still in the index, but went from results page 1 to results page 6.
If it really was a matter of SEO tactics and banning an entire page or even domain (as Search Engines Web says above), that would be one thing. But again, we’re talking about pages that are still in the index, pages that have not been banned!
So, Seth, it really does sound like they are using human reviews to bump things around in the rankings.
And for those pages, if you are using humans to determine quality of content and whether the pages are “worth a user’s time to read”, as the lawyers have said, then what you are no longer doing is relying exclusively on the power of The Algorithm. And if you are doing that, I think there needs to be a bit more truth in advertising on Google’s part.. they need to publicly and clearly let everyone know that not everyone’s web page will be treated consistently.
As Andrew Goodman eloquently states above, there may be nothing we can do against Google’s right to state their ranking “opinions”. But then, Google needs to be absolutely clear about that to the consumer. Judging from many of the reactions above, I think it is patently false what Google’s attorney said, about how “Google readers understand that the site’s ranking system is subjective and based on Google’s opinion about whether a site is worth viewing.”
(As just another quick aside: Note that the lawyer explicitly says that the “ranking system” is based on “whether a site is worth viewing”. He does not say “indexing system”. Removal of spammers from the index is part of the indexing system, not the ranking system. The lawyer says ranking system. So we are indeed talking about using human reviews to bump things around in the rankings.)
Google readers do not understand this. Rather, they understand that Google employs 1,000 PhDs to come up with clever, subjective, but consistently applied Algorithms for ranking.
I do not think Google readers consider Google a Zagat for the web.
Andrew: Very interesting post. Gives me a lot to think about. Thank you.
One comment. You write: “The point, of course, is not that Google regularly makes real editorial judgments and uses humans in quality review processes, though they do. It’s that any ranking method is subjective.”
I just want to distinguish between using editorial judgments and human quality review to establish a baseline against which you measure your subjective combination of objective factors, versus using editorial judgments to establish the actual rank of a particular item. I don’t think this distinction has been made clear yet in this discussion.
Naturally everybody needs to do the former. You first come up with a (subjective) algorithm (using objective features), and then you need human feedback on how well that algorithm does. Based on the feedback, you tweak the algorithm, so that it hopefully also does better for the queries you have not yet seen. This is a very natural, normal approach to developing a search system. (There are entire standards bodies built around this idea, e.g. http://trec.nist.gov/overview.html)
What is not natural is if you were to take those human judgments, and directly transfer them into ranking. That breaks the (subjective) objectivity of The Algorithm.
This reminds me of the debate over hard coding. Recall JB’s post on this matter from last November. (See also my comment quoting a Wired news article. What does it mean to “attach” a better site for a future query??)
I’ve heard numerous Googlers swear up and down that Google does not hard code. Why don’t they? Because of their belief in the almighty Algorithm, and the fear of the loss of trust from the user that the results the user is seeing are based solely on the (subjective but consistent mixture of) objective merits of the pages being returned.
But what is hard coding, if not an inconsistent application of the subjective mixing of objective factors…i.e. an “opinion”?
Hard coding says “I don’t care if the term frequency for the query term is low on page X (something that is otherwise very important for determining rank). I am going to rank it first, anyway, because I did a “review” and found that it is “worth a user’s time to view”.
So what it sounds like Google’s lawyers are saying is that not only is Google no longer opposed to hard coding, Google is actually in full support of hard coding…if it will provide the best “recommendation” possible for helping users find what they want. If the hard code is “relevant”, what does it matter? Everyone’s happy, right? Well, except for those of us that believe in SERP integrity and consistency.
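Mechanically, hard coding amounts to an editorial lookup table consulted on top of the algorithmic ordering. A hypothetical sketch – the query, domain, and override table below are all invented for illustration:

```python
# Hypothetical editorial overrides: (query, page) -> forced position (1-based).
OVERRIDES = {("tie a bowline", "handpicked.example.com"): 1}

def final_ranking(query, algorithmic_order):
    """Apply editorial overrides on top of an algorithmically ranked list."""
    order = [p for p in algorithmic_order if (query, p) not in OVERRIDES]
    # Insert overridden pages at their forced positions, lowest position first.
    for (q, page), position in sorted(OVERRIDES.items(), key=lambda kv: kv[1]):
        if q == query:
            order.insert(position - 1, page)
    return order
```

For the overridden query the hand-picked page jumps to the top regardless of its algorithmic score; for every other query the algorithmic order passes through untouched – precisely the page-to-page inconsistency at issue.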
“ranking system” is based on “whether a site is worth viewing”.
Who determines if something is “WORTH” viewing? Are we not part of the open internet? This sounds like the censorship Google performs for China happening here.
Next thing you know we get penalized for the colors or text fonts we use on our websites. What’s next?
Yes, font size and colors could certainly become part of the ranking criteria.
The next frontier in search engines is to produce rankings based on how useful each site is likely to be for the users, and not just how popular it is. If text is difficult to read, because of tiny type or low-contrast typography, that’s a reason to avoid visiting a website (in user testing, we see people leave such sites), and thus also a reason for a search engine to rank the site lower.
I heard a rumor that the next step is the trusted link – only links from trusted Google sites will count toward a trusted PageRank.
JG: Google has TWO different penalties. TWO. It’s like the difference between felony and misdemeanor. For “felony spamming”, sites are removed. For “misdemeanor spamming”, sites are stripped of their Pagerank. Felons aren’t ever coming back. Misdemeanors can possibly clean up their act and get their Pagerank back. A human often makes a judgment call, which is in fact a good thing.
Kinderstart is in the misdemeanor category. If you look at the site, it uses a link-exchange structure that does look sleazy.
Google has done nothing unclear to anyone familiar with the details – not the PR summary – the details, of how they treat sites.
I grant they could be a lot clearer to the targeted webmasters. But none of this is a confession of any sort.
Seth: I did not realize there was a tiered punishment system at Google. On that detail, and on the link-exchange detail, I stand fully corrected. Most of my hot air here is running off of what the Google lawyers are saying, and how it just doesn’t jibe with prior Google rhetoric, rather than on the specific details of any one case. As such, I welcome any reining in of my flailing.
But I’m still a bit confused, in that lawyers say Google is doing (human) reviews into whether a “site is worth viewing”. Not reviews on whether or not a site conforms to Google’s indexing policies.
Whether or not Kinderstart’s link exchange structure looks sleazy, is Kinderstart worth viewing? Whether or not BMW employs cloaking, is the BMW page worth viewing? I would say yes to the latter, if not the former as well. Google disagreed. Their human reviews somehow found BMW not “worth viewing”.
That’s just silly.
I would say that, as long as Google is employing human reviewers anyway, they could come up with better judgements than the one by which they delist someone like BMW. I think they’re forgetting that the ultimate goal here is dissemination of relevant information. Whether or not something conforms with your own internal corporate way of parsing things is a worthy subgoal along that path, but should not be confused with the main task.
It just feels a bit like Google just wanting to punish webmasters for not doing things its way, rather than worrying about whether the content is relevant, no matter what SEOs are doing.
Too much cart before the horse…?
Seth: Although Google does indeed mention in their webmaster guidelines that sites involved in spamming may be penalized, and although you’ve hinted that Kinderstart looks like such a spam site, something that Google’s lawyer says in the transcript seems to go beyond advertised considerations of quality.
Google’s lawyer says specifically (page 24):
“Even if Google has a malicious intent with respect to its expression of page rank, it’s still expressing its opinion, even if Google is improperly motivated in that opinion.”
Granted, this still doesn’t amount to a confession, but it goes well past the message conveyed in their guidelines. We’re no longer talking about whether a site was penalized because of spamming. The argument seems to imply that Google has the right to penalize sites for any reason whatsoever, be they consistent or inconsistent, fair-minded or malicious. And while Google may indeed have that right, it’s questionable as to whether they have a right to do so while advertising to the world otherwise.
John, I'm glad you are being persistent on this topic, because I think it's possible (though I still think unlikely) that the lawsuit will reveal that some pretty subjective / punitive / questionable filtering is done under many sets of circumstances. If Google is not acting with the algorithmic objectivity they've claimed for some time, it is important that people understand that.
The Google Labs incident comes to mind. Human review processes were revealed in that case, but Google insisted the reviewer feedback was only used to compare result sets for algorithm tweaking. I still believe them, though I sure wish they'd provide more direct feedback regarding downrankings.
TL: Take it up with _Fox News_ and "Fair And Balanced". People are simply reading a straw man into what Google says, and then going crazy demolishing that straw man. What the lawyer said is completely expected in this context; it's standard legal tough-talk.
Go look at Kinderstart.com. At this moment:
“Last Few Sites Added:
… Console Gaming … Buy Belize Real Estate … College Degree Direct …”
That screams SPAM.
Seth: You're right. Those sites displayed on Kinderstart do look kind of suspect. I was a little surprised to see that. About what the Google lawyer said, though: tough-talk or not, it's still part of Google's argument. There aren't straw men being set up here; it's all straight from the mouth of Google's defense attorney.
In both your comments here and in the entries on your own weblog, it looks like you’re saying that the key part in all the stuff Google has said about “subjectivity” and “opinions” is that they are simply pursuing their right to shut out sites “playing web-spam games” without having to explain themselves endlessly every time they do it.
While that’s all well and good, I think it’s important to note that the reasoning they use about “subjectivity” to argue for that extends well beyond web-spam games. The model case in point, perhaps, is in the part I quoted earlier. It looks like Google could use this line of reasoning to justify shutting out sites maliciously and arbitrarily. Of perhaps equal significance, it was their own defense attorney who made this point.
As I said earlier, Google may indeed have the right to treat sites the way they want. But it doesn’t seem right for them to be able to do that without being honest and upfront about it, especially if so much of their success comes from their reputation for “objectivity” and “fairness.”
Titus: I completely agree with you that it wouldn’t be morally right for Google to treat sites in (my phrasing) an arbitrary and capricious manner. What I’m saying is that, here, Kinderstart wasn’t treated unfairly (all in all – apart from the general weaknesses of not notifying spammish sites – but they aren’t an unclear case!). And that the tough-talk argument doesn’t imply that Google is in fact being arbitrary and capricious. The strawman is that people are reading too much into the short description of Google’s algorithm, and then saying human judgment in penalizing spammers makes that short description a lie.
There's a certain type of legal tough-talk argument that one sees in these situations. It runs like this (I'll exaggerate somewhat, to illustrate the points):
Plaintiff: Google says they’re completely objective. But they *ADMITTED* they use human judgment in changing rankings. That’s subjective.
Lawyer: Google uses both objective and subjective factors. We do penalize spammers, and we sometimes make human judgment calls about who is a spammer.
Plaintiff: So, you admit Google lied in saying it was completely objective?
Lawyer (going on the attack, talking tough): Even if we were in fact lying, it's *our right* as *free speech* to lie through our teeth. As long as it's not libel/defamation/etc., we have a legal right to be liars. [Sometimes further, if they want to be nasty: Our audience can make up their own minds as to whether or not we are telling the truth. Who are YOU, huh, huh, huh, to arrogate to yourself the right to determine truth from lie? That's totalitarianism, religious fanaticism – you think you know what's true, which is a sign of INTOLERANCE! You should be open to alternative views, you closed-minded, arrogant True Believer!]
Sadly, I have had entirely too much experience with that sort of lawyer conduct … But anyway, the discussion can go down the path that it's morally wrong to tell a lie even if there's a legal right to do so, and I certainly concur. What I'm saying is that you can't derive from the tough-talk legal argument that Google is in fact lying; it's a common argument in these situations. The factual answer is that the plaintiff is making a straw man out of a public relations summary of Google's procedures, but it would be a weak rhetorical argument to get into that discussion too much in court. So, yes, Google is in fact arguing in court that it can lie if it pleases, and I agree that doing so would be morally wrong. But that argument is not any sort of confession either. Rather, it's a completely expected legal tactic in this case.
New Google motto: “Don’t be evil…be legal!”
But seriously... I understand the point you are making, Seth, and you may very well be correct in assessing the situation. However, it still leaves a bad taste in my mouth. Was there really no other legal way for Google to make its case without resorting to this tactic? It feels like Google is willing to say anything, to anyone, at any time, in order to get what it wants. And while that may be par for the course in 99% of the business world, it wasn't what I thought Google stood for. I switched to Google not because its results were any better (back in 1998 I was using SavvySearch, and it was just as good as, if not better than, the Goog), but because of its mottos and ideals.
More and more, the things that attracted me to Google as a user are disappearing, if not already gone. This particular example is just one more item among many. Anyway, thanks for the discussion.
Kinderstart vs. Google – Suit DISMISSED!
I think the “with leave to amend” part of the decision looks pretty interesting. Particularly since in the CNET article quoted above Kinderstart’s attorney also has a fairly confident sounding statement, saying something to the effect that the door has been left open for a second amended complaint and that Google now seems more at risk to the defamation claim than before.
It’s also important to note that nowhere in the decision or in the article does it mention any of the counts being dismissed “with prejudice” (i.e. with no chance to modify/refile the complaint whatsoever). A few are simply dismissed and most are dismissed with leave to amend.
As Eric Goldman (quoted in the above CNET article) writes on his blog:
One side note: it’s unclear from the opinion if all of the claims are amendable, or only those where the judge expressly said that KinderStart could amend–the Sherman Act claim, the common carrier claim, the 17045 claim, the good faith and fair dealing claim, the defamation claim and the negligent interference with prospective economic advantage claim. My reading is that KinderStart can amend all of its claims, but the opinion is ambiguous on that point.
So, it seems like because of a relatively soft dismissal, this case may still have a ways to go.
Wow, I hadn't noticed all this follow-up until today, and based on how many people are challenging John's legitimate concerns, it's clear to me that this well-informed community is … out to lunch in their lack of concern.
Google engineers have publicly explained the 30+ day ranking penalty process for deceptive practices. This "defense of the algorithm" activity can't be squared with a fully objective approach to ranking, which would put the 'best' site first regardless of that site's former practices. Arguably, this approach also cannot be squared with Google's odd claims of PageRank objectivity paired with reasonable ranking subjectivity, which I think will come back to bite them eventually.
Kinderstart will be refiling and the judge appeared to leave a very interesting door open if Kinderstart can show that Google ranks sites capriciously. I don’t think they do it often, but they certainly do it sometimes.
Do You Feel Your Business Has Been Misled by Google's Search Indexing or Google Adwords?
It appears that Google has reached the ethical apex of the American corporate sector. What is sad about all the perplexing information on Google and the many recent lawsuits against the company is that we continue to let corporations like Google and the Enrons of the world eat away at what was once a country with strong ethical codes of conduct, professionalism, and sturdy, honest leadership.
Though we have not knowingly been a victim of corporate deception up to this point, we sadly admit that our company feels Google is misrepresenting its advertising and search engine ranking services. After a two-and-a-half-week vigilant analysis of Google's Adwords "bidding" program, we have found that Google is, in short, "misleading" its customers and depriving users of a free and open internet experience in the name of profit and corporate greed.
While we are unclear on what action to take and which bureaucratic body to approach in dealing with this matter, we are resolved to gather more facts and information about Google and its business dealings, and we will seek to have damages awarded to cover the offense the company has caused. This statement is not intended to disparage or sway any of the currently pending judgments against Google; but since we have not been able to resolve this matter with the company directly, and since they choose not to cooperate and continue their current business methods and subjective business practices, we have no recourse but to align with other parties with similar interests against the company and take action. With that being said, we are seeking to initiate a class action lawsuit against the Google Corporation for ethics violations and for misleading advertising customers by implementing a rather precarious and subjective search engine and advertising program.
If you feel you may have fallen into any of these items, we would be interested in speaking with you about your experience and the possible offenses the company has caused your business.
Of course google is being subjective. Use their “news” search, and click on the “Blogs” after you search for “Iraq”.
Only left-leaning, Democrat-friendly blogs are included.