I was thinking about what Google might do with the huge platform it has and is continuing to build. What might be a profitable and deeply cool use of such a platform? Something Wayne Rosing said in Alex’s piece struck me, when crossed with Simson’s Akamai insights:
Engineering Vice-President Wayne Rosing has on several occasions emphasized that Google’s primary expertise is in so-called distributed computing. That’s a fancy way of talking about delivering applications to a computer user’s browser or to remote locations.
So, what if Google becomes an application server cum platform for business innovation? I mean, a service, a platform service, that any business can build upon? In other words, an ecologic potentiality – “Hey guys, over here at Google Business Services Inc. we’ve got the entire web in RAM and the ability to mirror your data across the web to any location in real time. We’ve got plug-in services like search, email, social networking, and commerce clearing, not to mention a shitload of bandwidth and storage, cheap. So…what do you want to build today?”
If I had that opportunity, I’d take a percentage of revs or profits on the businesses that got built, rather than just service fees. It’s Google as incubator to Web 2.0.
Yahoo is already doing this, though for a fee and in the SMB market. So is MSN. The traces are laid. Both of them were also doing mail. But neither of them has more than 100,000 servers and the GFS. Hmmmm.
OK, back to writing the book, damnit.
4 thoughts on “The Incubation Platform”
Again with this “web in RAM” thing! I guess it doesn’t take long to create an urban myth in Internet time :-)
I still think it’s hokum :-)
Someone (convincingly) tell me I’m wrong, if I’m wrong, please.
Hmm, I wonder how long before this (metalink?) works too? :-)
Um, that wasn’t really your main point, I realise …
This looks like hooey to me too. If they do 300M searches per day, each returning 10 snippets, and the average page is 10 kilobytes, then they’re serving one snippet per day for every
4,285,199,774 pages * 10 kilobytes / (300 million queries * 10 snippets)
= 14.3 kilobytes
That’s a fairly sparse hit rate, and assuming a Zipfian distribution, most of the pages are probably hit less than once a month.
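The arithmetic above is easy to check with a quick script (a sketch; the 10-snippets-per-query figure and the 10 KB average page size are the comment's assumptions, not published numbers):

```python
# Back-of-envelope check of the snippet hit rate argued above.
PAGES = 4_285_199_774         # pages Google claimed to index at the time
PAGE_KB = 10                  # assumed average page size, in kilobytes
QUERIES_PER_DAY = 300_000_000
SNIPPETS_PER_QUERY = 10       # assumption: 10 results (snippets) per query

snippets_per_day = QUERIES_PER_DAY * SNIPPETS_PER_QUERY
kb_per_snippet = PAGES * PAGE_KB / snippets_per_day
hits_per_page_per_day = snippets_per_day / PAGES

print(f"{kb_per_snippet:.1f} KB of indexed content per snippet served")  # ~14.3
print(f"{hits_per_page_per_day:.2f} average snippet hits per page per day")  # ~0.70
```

So even on average a page is hit less than once a day; under a Zipfian distribution the long tail is hit far less often than that.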
Sure, they could fit the “web in RAM”, but why would they? They clearly have enough RAM to hold compressed copies of the 5B HTML pages they index, if that qualifies as putting the “web in RAM”. But I’d guess they have better things to do with all that RAM. Search-related accesses to individual pages run at about 10x query traffic, i.e., probably less than 100k/second. There’s no need to put that in RAM when you’ve got 100k disks.
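The disk-vs-RAM point can be made concrete with the same kind of arithmetic (a sketch; the per-disk IOPS figure is an assumed ballpark for commodity drives of the era, and one disk per server is assumed from the post's 100,000-server figure):

```python
# Rough check: can ~100k disks absorb search-related page accesses without RAM caching?
PAGE_ACCESSES_PER_SEC = 100_000   # upper bound cited above: ~10x query traffic
DISKS = 100_000                   # assumption: roughly one disk per server
RANDOM_IOPS_PER_DISK = 100        # assumed ballpark for a commodity 7200rpm drive

accesses_per_disk = PAGE_ACCESSES_PER_SEC / DISKS
utilization = accesses_per_disk / RANDOM_IOPS_PER_DISK

print(f"{accesses_per_disk:.0f} access/sec per disk")        # ~1
print(f"{utilization:.0%} of each disk's random-IO budget")  # ~1%
```

At roughly one access per second per disk, the disks are nearly idle, which is the comment's point: caching whole pages in RAM buys little here.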
Using the Google Web API they already provide a “plugin” for others to use their search. Would be nice if they’d plug in the rest of their apps as well, though.