Almost immediately after the Web 2.0 Summit last month, Tim O’Reilly and I sat down at an FM event and debriefed each other on what we learned. Here’s the video.
Wow, I’ve never seen this before. Check out Google’s post, responding to the New York Times story about a bad actor who had figured out a way to make a living leveraging what he saw as holes in Google’s approach to ranking.
How Google ranks is the subject of increasing scrutiny, including and particularly in Europe.
From Google’s blog:
Even though our initial analysis pointed to this being an edge case and not a widespread problem in our search results, we immediately convened a team that looked carefully at the issue. That team developed an initial algorithmic solution, implemented it, and the solution is already live.
What I find fascinating is the way Google handled this. Read this carefully:
Instead, in the last few days we developed an algorithmic solution which detects the merchant from the Times article along with hundreds of other merchants that, in our opinion, provide an extremely poor user experience. The algorithm we incorporated into our search rankings represents an initial solution to this issue, and Google users are now getting a better experience as a result.
What word stands out? Yep, “opinion.”
Think on that for a second. If ever there was an argument that algorithms are subjective, there it is.
(Oh, and by the way, the last paragraph in the blog post clearly is directed at the regulators in Europe, if you think about it….)
One of the many reasons I find Twitter fascinating is that the company seems endlessly at an inflection point. Eighteen months ago I was tracking its inflection point in usage (holy shit, look how it’s growing! Then, holy shit, has it stopped?!), then its inflection in business model (hey, it doesn’t have one! Wait, yes it does, but can it scale?!), and more recently, its inflection point in terms of employees (as in growing from 80+ staff to 350+ in one year – necessitating a shift in management structure….).
Twitter now faces yet another inflection point – one I’ve been tracking for some time, and one that seems to be coming to a head. To me, that inflection has to do with usefulness – can the service corral all the goodness that exists in its network and figure out a way to make it useful to its hundreds of millions of users?
To me, this inflection point is perhaps its most challenging, and its greatest opportunity, because it encompasses all the others. If Twitter creates delightful instrumentations of the unique cacophony constantly crossing its servers, it wins big time. Users will never leave, marketers will never get enough, and employees will pine to join the movement (witness Facebook now, and Google five years ago).
Now, I’m not saying Twitter isn’t already a success. It is. The service has a dedicated core of millions who will never leave (I’m one of them). And I’m going to guess Twitter gets more resumes than it knows what to do with, so hiring isn’t the problem. And lastly, I’ve been told (by Ev, onstage at Web 2) that the company has more marketing demand than it can fulfill.
But therein lies the rub. Twitter has the potential to be much more, and everyone there knows it. It has millions of dedicated users, but it also has tens of millions who can’t quite figure out what the fuss is all about. And you can’t hire hundreds of engineers and product managers unless you have a job for them to do – a scaled platform that has, at its core, a product that everyone and their mother understands.
As for that last point – the surfeit of marketing demand – well that’s also a problem. Promoted tweets, trends, and accounts are a great start, but if you don’t have enough inventory to satisfy demand, you’ve not crossed the chasm from problem to opportunity.
In short, Twitter has a publishing problem. Put another way, it has a massive publishing opportunity.
Oh, I know, you’re saying “yeah Battelle, there you go again, thinking the whole world fits neatly into your favorite paradigm of publishing.”
Well yes, indeed, I do think that. To me, publishing is the art of determining what is worth paying attention to, by whom. And by that definition, Twitter most certainly is a publishing platform, one used by nearly 200 million folks.
The problem, of course, is that while Twitter makes it nearly effortless for folks to publish their own thoughts, it has done far too little to help those same folks glean value from the thoughts of others.
It was this simple truth that led FM to work with Microsoft to create ExecTweets, and with AT&T to create the TitleTweets platform. It’s the same truth that led to the multi-pane interface of Tweetdeck as well as countless other Twitter apps, and it was with an eye toward addressing this problem that Twitter introduced Lists on Twitter.com and its associated APIs.
But while all those efforts are worthy, they haven’t yet solved the core problem or addressed the massive opportunity. At its core, publishing is about determining signal from noise. What’s extraordinary about Twitter is the complexity of that challenge – one man’s noise is another man’s signal, and vice versa. And what’s signal now may well be noise tomorrow – or two minutes from now. Multiply this by 200 million or so, then add an exponential element. Yow.
There is both art and science to addressing this challenge. What we broadly understand to be “the media” have approached the problem with a mostly one-to-many approach: We define an audience, determine what topics that audience most likely will want to pay attention to, then feed them our signal, one curated and culled from the noise of all possible information associated with those topics. Presto, we make Wired magazine, Oprah, or Serious Eats.
Facebook has done the same with information associated with our friend graph. The newsfeed is, for all intents and purposes, a publication created just for you. Sure, it has its drawbacks, but it’s pretty darn good (though its value is directly determined by the value you place in your Facebook friend graph. Mine, well, it don’t work so well, for reasons of my own doing).
So how might Twitter create such a publication for each of its users? As many have pointed out (including Twitter’s CEO Dick Costolo), Twitter isn’t a friend graph, it’s more of an interest graph, or put another way, an information graph – a massive set of points interconnected by contextual meaning. To the uninitiated, this graph is daunting.
Twitter’s current approach to navigating this graph centers around following human beings – at first with its “suggested users” list, which simply didn’t scale. Twitter soon replaced suggested users with “Who to follow” – a more sophisticated, algorithm-driven list of folks who seem to match your current set of followers and, to some extent, your interests. When you follow someone who’s a big foodie, for example, Twitter will suggest other folks who tweet about food. It does so, one presumes, by noting shared interests between users.
The question is, does Twitter infer those interests via the signal of who follows who, or does it do it by actually *understanding* what folks are tweeting about?
Therein, I might guess, lies the solution. The former is a proxy for a true interest graph – “Hey, follow these folks, because they seem to follow folks who are like the folks you already follow.” But the latter *is* an interest graph – “Hey, follow these folks, because they tweet about things you care about.”
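To make the distinction concrete, here’s a minimal sketch of the two approaches with toy, made-up data – the account names, tweets, and logic are purely illustrative, not anything Twitter actually runs:

```python
from collections import Counter

# Hypothetical toy data -- purely illustrative, not Twitter's actual model.
follows = {
    "alice": {"chef_a", "chef_b"},
    "bob": {"chef_a", "chef_b", "chef_c"},
}

tweets = {
    "chef_a": ["new sourdough recipe", "knife skills"],
    "chef_c": ["ramen broth", "sourdough starter tips"],
}

def suggest_by_graph(user):
    """Follow-graph proxy: recommend accounts followed by users
    whose follow sets overlap with this user's."""
    counts = Counter()
    mine = follows[user]
    for other, theirs in follows.items():
        if other != user and mine & theirs:
            counts.update(theirs - mine)
    return [acct for acct, _ in counts.most_common()]

def suggest_by_content(user, topic):
    """Interest graph: recommend accounts whose tweets actually
    mention the topic, regardless of who follows them."""
    return [acct for acct, msgs in tweets.items()
            if any(topic in msg for msg in msgs)]

print(suggest_by_graph("alice"))                 # ['chef_c']
print(suggest_by_content("alice", "sourdough"))  # ['chef_a', 'chef_c']
```

The first function never looks at what anyone says – it only exploits overlap in follow sets. The second surfaces accounts the follow graph alone would miss, because it works from the content itself.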
From that logically comes the ability to filter folks’ streams based on interests, and once you can do that, well, things get really…interesting. You could follow interests, instead of people, for example. It’s like search meets social! And hey – isn’t that kind of the holy grail?
If Twitter can make the interest graph explicit to its users, and develop products and features which surface that graph in real time, it wins on all counts. That is a very big problem to solve, and a massive opportunity to run after.
For more on this, read Making Twitter an Information Network, by Mike Champion, and “The unbearable lameness of web 2.0”, by Kristian Köhntopp, as well as the wonderful but too short The Algorithm + the Crowd are Not Enough, by Rand Fishkin. These and many more have informed my ongoing thinking on this topic. What do you think?
ATD is reporting that Google is offering well more than twice what had been previously offered – $6 billion, instead of $2.5 billion. That sounds more like it. As I wrote yesterday, $2.5 billion sounded very low for this particular asset. (And this from the guy who thought YouTube was overpriced.)
Clearly, that leak to Vator.tv last weekend was timed to push a deal point, I’m guessing.
Key to this deal is Marissa Mayer, who recently took over local for Google and was promoted to boot. This would be her defining deal, and the integration of the acquisition would be critical. Google has had mixed results in this department so far – YouTube is clearly on its way to being a winner, but took far too long to get there. Some small pickups have proven to be big winners – Applied Semantics comes to mind – and DoubleClick was a huge win, but Blogger, FeedBurner, and many other small acquisitions never really found their footing.
Today’s big rumor, so far at least, is that Google may be buying Groupon for a reported $2.5 billion. This sale has been rumored for some time, but the figure – and story – is based on a rather thin piece by VatorNews which broke early this morning. The headline is honest in its lack of, er, definitiveness – Google buys Groupon for $2.5 billion? – but it does claim one “reliable source.”
Whether or not this story plays out, my first thought was that the company is worth far more than $2.5 billion. If, as many have reported, Groupon is doing $50mm in revenue a month, that’s a mere 4.2x multiple on current run rate. Given the insanely strategic nature of the purchase to not only Google, but just about every other major player in the game (Microsoft, Yahoo, eBay, Amazon, hell, American Express if they wanted to play, not to mention Facebook), I’d be utterly stunned if Google won the company for that price. Location is a key signal connecting commerce, search, social, and small business. It’s a big, big deal, and Groupon is the leader in the space.
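For the record, the multiple math works out like this (using the reported figures above):

```python
# Back-of-the-envelope check of the rumored Groupon multiple.
monthly_revenue = 50_000_000            # reported ~$50mm/month
annual_run_rate = monthly_revenue * 12  # $600mm annualized
rumored_price = 2_500_000_000           # the rumored $2.5 billion

multiple = rumored_price / annual_run_rate
print(round(multiple, 1))  # 4.2
```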
My spidey senses tell me someone is shopping this deal through a leak to the press, seeking to drive an auction. We’ll see.
China’s Politburo directed the intrusion into Google’s computer systems in that country, a Chinese contact told the American Embassy in Beijing in January, one cable reported. The Google hacking was part of a coordinated campaign of computer sabotage carried out by government operatives, private security experts and Internet outlaws recruited by the Chinese government. They have broken into American government computers and those of Western allies, the Dalai Lama and American businesses since 2002, cables said.
However, there is nothing in this reporting that justifies how TechCrunch headlined its coverage:
No, guys, it does not confirm it. Re-read that paragraph. It says one source told another that it was the government. That does not qualify as a “confirmation” in any journalistic sense.
I searched for the original cable, but it has not been released yet. All we have is the summary above.
Ev recently turned over CEO duties to Dick Costolo, but it’s clear he’s still very, very engaged. Highlights for me included when Ev spoke of his mission to lower barriers to publishing, avoided talking about financing, and needled me a few times, in a humorous way, of course.
A highlight for me at Web 2 was watching John Heilemann interview Fred and John, two giants of the VC world. This was a pretty historic pairing, and I’m very pleased we made it happen. For your enjoyment:
If I could sum up the overarching theme of our conference this year, it’s that “this sh*t is getting real.” Plucky startups with funny names have consolidated power, and are disrupting the entire global economy. This, of course, means things are “getting real” from the point of view of government and policy as well. Here’s a candid conversation with one of the key policy chiefs, FCC Chair Julius Genachowski.
Carol Bartz has been under fire for nearly two years, and it shows in her responses to my questions – she’s ready to look forward and not defend what Yahoo has been in the past. My sense is that it will take another year or so before the real changes at Yahoo – in key areas of infrastructure, management, and products – will really take hold.