Search Engine Loyalty, A9 Prognostications…

Mediapost reports fresh search engine loyalty numbers, concluding that “Google gets the gold again” – 65 percent of Google users use only Google, as opposed to just 55 percent of Yahoo users. I’m not sure I buy the whole search engine loyalty thing. I think folks aren’t loyal, they’re lazy. As Yoda might say, not until a compelling choice they have, switch will they.

Which brings me to A9. Over at his blog, Rex points out I missed the most compelling potential A9 feature, that of collaborative filtering.

To me, A9 is not designed as an Internet search engine, but as a knowledge-searching tool to end all knowledge-searching tools… As you look for information, Amazon will provide you the results that “people like you” have found most helpful when searching for the same information, product, place, answer, etc.

He’s right, of course, and I should have made a bigger point of this, but I sort of presumed that was the sex in A9 to begin with – it’s Amazon, after all. A knowledge-searching tool is exactly what a search engine should evolve towards. But I disagree with this:

I don’t think Amazon wants to compete with Google. Google admitted recently that it was a content business. Amazon has no such designs. Amazon, rather, wants to connect you with something you can purchase.

While I agree that Amazon wants to connect you to a purchase, I disagree with the implication that Google does not. The advertising business = the marketing business, and the marketing business = the ecommerce business. This loop, somewhat blurry in the physical world, is nearly seamless online. As I said in my posts, it’s two ends (search, commerce) working toward the middle here. That’s what makes it so interesting. And I am pretty sure that Google spiffed Froogle up and put it on the home page last month for a reason…and that the announcement of GMail, which will offer a compelling user lock-in, was not a coincidence. Rex points out, and I second wholeheartedly:

If A9 incorporates the collaborative filtering algorithms that power Amazon’s predictive recommendations to customers, it will (and I know this from very, very expensive first-hand experience) produce search results that will astound the user. Just think about it: Your search results will be filtered first by Google algorithms and then through Amazon’s collaborative filtering algorithms. In the simplistic metaphor we used at, “Your search results will be based on those results found most helpful by people like you. It will be cool. Promise.”

Damn straight. That sounds like one hell of a search engine, and if I were Google, I’d be most attentive.
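To make the idea concrete, here is a minimal sketch of that kind of “people like you” re-ranking. Everything here is an illustrative assumption on my part — A9 and Amazon have never published their algorithms — but the shape is standard collaborative filtering: measure how similar other users are to you, then let their endorsements re-order the engine’s raw results.

```python
# Hypothetical sketch: re-rank search results using other users'
# "helpful" clicks, weighted by how similar those users are to you.
# All names, thresholds, and data shapes here are assumptions.

def jaccard(a, b):
    """Similarity between two users' sets of helpful URLs (0.0 to 1.0)."""
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

def rerank(results, me, click_log, weight=0.6):
    """Blend the engine's original rank with similar users' endorsements.

    results:   URLs in the engine's original order (best first)
    me:        set of URLs the current user has found helpful
    click_log: dict mapping other users to their helpful-URL sets
    weight:    how much the social signal counts vs. the raw rank
    """
    scores = {}
    for i, url in enumerate(results):
        base = 1.0 / (i + 1)  # decaying credit for the engine's own rank
        # Sum similarity-weighted "votes" from users who found this URL helpful
        social = sum(jaccard(me, liked)
                     for liked in click_log.values()
                     if url in liked)
        scores[url] = (1 - weight) * base + weight * social
    return sorted(results, key=lambda u: -scores[u])
```

With an empty click log this degrades gracefully to the engine’s original ordering; as the log fills in, results endorsed by users who overlap with you float upward.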

5 thoughts on “Search Engine Loyalty, A9 Prognostications…”

  1. User behavior is one of the most underutilized sources of information for web search, primarily because the content is separate from the index. For a long time, search engines were reluctant to put in tracking redirects, even though they only need a small sample to get viable results.

    There’s a paper I read a few months ago that explores some of this concept in an intranet environment (and, BTW, shows why the Google search appliance is a bad idea):

    Current search engines generally utilize link analysis techniques to improve the ranking of returned web-pages. However, the same techniques applied to a small Web, such as a website or an intranet Web, cannot achieve the same level of performance because the link structure is different from the global Web. In this paper we proposed a novel method of generating implicit link structure based on users’ access patterns, and then apply a modified PageRank algorithm to produce the ranking of web-pages. Our experimental results indicate that the proposed method outperforms keyword-based method by 16%, explicit link-based PageRank by 20% and DirectHit by 14%, respectively.

    While I’m at it, here’s a link to the blog I got it from:

  2. Here’s the thing for me: Amazon can tell me “People who bought this item also bought….” because those users have taken a definitive action and made a definitive statement to say, “Yes, I’m buying this product, too.” The fact that a purchase is made is an endorsement, and can be assumed to mean “this is what I’m looking for.”

    When you port that over to A9, the only definitive statement the user makes in the SERPs is to click a link. It doesn’t cost the user any money, so there’s no definitive endorsement going on like you have with an actual purchase. And — very often, the user clicks to a site that is NOT what s/he’s looking for.

    So, when A9 starts saying, “People who searched for this term visited these web sites…”, how do they know if the user actually found those sites helpful? How do they know if that’s actually what the user was looking for? Clicking a link in a SERP is not necessarily the same level of recommendation as buying a related product.

  3. Matt,

    I couldn’t help but mention that your observation is really insightful. However,

    1) Search engines like google include excerpts of the web pages returned in results, so to an extent the user knows what to expect in a page before clicking the link.
    2) If a user clicks the back button less than a particular number of seconds after visiting a page, and then clicks another link on the same results page, we can assume that he wasn’t satisfied with the original link he clicked. A search engine can track this.

    Best Regards,
    Seun Osewa

  4. Seun — thanks for the reply.

    I understand your points, and agree with you to some degree, but:

    1) If the excerpts that a search engine shows were always enough to tell you whether that’s the page you’re looking for, we’d all be a lot happier with search engines in general, wouldn’t we? Plus, with the way some folks rig their content to take advantage of the SERP excerpt, what you see there isn’t always the most accurate barometer of what the page is really about.

    2) I was wondering if there would be some sort of measuring going on like you suggest. But even if there is, it’s still not reliable enough:

    a) How quickly would you have to click BACK for A9 to know that site wasn’t what you wanted? 10 seconds? 30 seconds? A minute? I don’t always know that quickly if the site is what I’m looking for, and I often go several pages into a site before I know, rendering the BACK button useless.

    b) What about a situation where I click on an A9 link, spend a minute or two at the site, decide it’s not what I’m looking for, and then leave to go try another search engine? I never return to A9. Yet, A9 thinks I found what I was looking for and starts suggesting this site to other users making the same query.

    I could go on, but I do understand your points. I’m still unconvinced about the reality of using clicks in the SERPs as a tool to recommend sites. A click isn’t always a recommendation, and A9 would be hard-pressed (IMO) to start determining when a click is or isn’t with any level of accuracy.

    One solution might be: when you revisit your past searches, in the space near where they tell you “You clicked this site 24 hours ago”, they could add a quick little “Do you recommend this site for this search?” and let me choose Yes/No. Then again, that idea is ripe for abuse as Company B tells all its people to click “NO” about Company A’s web site. Hehehehe.

  5. I’m surprised none of the search engines has added a proxy service to gather these statistics more objectively. The google cache is almost a proxy already, and they could keep it up to date, if they were smart.

    Some of those old free ISP’s like NetZero used to gather stats like that. Their users were chained to a proxy for ad serving, and every click was tracked. Any pretense of privacy was signed away.
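The dwell-time heuristic Seun and Matt debate above can be sketched in a few lines. The threshold and labels are illustrative assumptions, not any engine’s published behavior, and the `unknown` branch is exactly Matt’s objection: when a user never comes back to the results page, the engine can’t tell satisfaction from abandonment.

```python
# Hedged sketch of dwell-time click classification from SERP-visible
# signals only. QUICK_BOUNCE is an assumed cutoff, not a known value.

QUICK_BOUNCE = 15  # seconds; would need tuning against real behavior

def classify_click(dwell_seconds, returned_to_serp):
    """Label a single result click.

    dwell_seconds:    time between the click and the next SERP event
    returned_to_serp: True if the user came back and clicked again
    """
    if returned_to_serp and dwell_seconds < QUICK_BOUNCE:
        return "unsatisfied"        # fast bounce back to the results
    if not returned_to_serp:
        return "unknown"            # user may have found it -- or left for
                                    # another engine; the SERP can't tell
    return "possibly_satisfied"     # long dwell before returning
```

Even this crude three-way split shows why raw clicks are a weaker endorsement than a purchase: one of the three outcomes is unresolvable from the engine’s side.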
