My coverage of Paul’s post has prompted some very thoughtful discussion in the comments, and I wanted to point it out. An employee from Google and one from Yahoo are discussing the value and approach of R&D, with some great comments thrown in by other readers. Excerpts:
(JG@Yahoo) “Google treats research as an engineering task. And thus really only comes up with engineering solutions. They see some problem that’s slightly broken, so they engineer a slightly better solution. With MS on the other hand, they’ve allowed funding for more pie-in-the-sky, long-term projects, such as those that used to happen at PARC and Bell Labs.”
(Random Googler) “I work at Google, and I see an amazing amount of research going on. The entire company is staffed with people with academic backgrounds in disciplines like computer science, computer engineering, mathematics, and so on. To imagine that we’re not doing research constantly seems bizarre to me. The question of “yes, but how much basic research are you doing” also seems weird to me. When running your company involves solving fundamental problems in computer science and mathematics, that’s what you do as your bread and butter.”
(JG) “You mention the hordes of academics who have joined Google. I know, they’re there. But if they’re all busy launching products, who is creating the seeds for the next generation?”
Update: My bad. JG has a Yahoo mail address, but is not at Yahoo; he’s a researcher at another Valley firm.
3 thoughts on “Thoughtful Discussion”
I think the traditional wisdom that sees R&D as a pursuit external to ‘bread and butter’ product development is broken. If you make R&D external to your product dev, then that’s where most of the R&D innovation will remain. If you make it an integral part (say 20%?) of your engineers’ jobs, keeping them onsite and engaged in daily product dev, then any innovation will feed directly and immediately back into the product.
Ah, MarkM, therein lies the false dichotomy. Traditional wisdom doesn’t see R&D as a pursuit external to ‘bread and butter’…in that it is planting the seeds to grow the wheat to make the bread. It is tending the cows to milk the milk to churn the butter. The pursuit is not external. It just looks ten feet ahead, instead of two inches ahead.
I think everyone here who has said that most companies, most of the time, do an extremely poor job of transitioning from seed to germ to dough to bread, or from Bessie to butter, is correct. I don’t disagree, and I can understand the rationale and motivation behind wanting to behave like Google in this matter, rather than like IBM, AT&T, MS, HP, etc. etc.
But what most people do not see is that there is a flip-side hidden danger to everyone switching to the Google model. When you stop looking ten feet ahead, and only look two inches ahead, to the projects at the end of your nose that are being picked at, you’re left with very shallow, narrow, short-term improvements. You never break out of your “local maximum”. Eventually you will need to back up, look around, and move in a fundamentally different direction. But if you’re only ever looking two inches ahead, by poking around at things only 20% of the time, it is hard to really know where to back up to.
I wish someone would show me something different about the Google model. Maybe I’m too blind or cynical. But after eight years of “research” and thousands of different “researchers” (all that 20% time from day one onwards), all I see is…Google Suggest?
Compare that against IBM Research. From a recent ZDNET article: ‘Mr Wladawsky-Berger said that it was IBM’s research labs, the largest in the world, that helped save the company. “In labs, we were able to see a few years ahead and we could predict the disruptive effect of the PC but our management wasn’t able to react fast enough.” He said that making the necessary changes at IBM, the cuts in staff and projects was very difficult to do.‘
So, because you had folks working 100% of the time on research, they were able to see something coming years ahead. Echoing Deep’s comments from yesterday, management still didn’t “get it” right away. But because research had something cooking years ahead of time, it was able to help the company start steering, and, according to Wladawsky-Berger, save itself.
I would say that is a huge payoff for “old style” research. And I really don’t see how it would have happened if IBM had just been “throwing things against the wall and seeing what stuck” ala Google.
Just a quick update on this. A case in point.
The 29th annual ACM SIGIR conference started yesterday. It is the world’s premier venue for information retrieval (“search”) related research, i.e. Google’s core business.
Here is the program.
So let us count the number of papers Google has gotten accepted/published: 2. And let’s count the number of papers Microsoft has accepted/published: 13.
So, you can add up all the numbers you want about revenue, money spent, etc. But there is something to be said when Microsoft has 6.5 times (650% of) the number of research papers that Google has. In Google’s core area of expertise!
SIGIR is an extremely high quality, peer-reviewed conference. You can see the list of almost 250 paper reviewers here, from a wide range of top universities and research institutions around the world.
So we can add up however many years and/or billions of dollars from the graph above. But when it comes down to it, I think the fact that Microsoft has 13 papers and Google has 2 speaks volumes. And let’s not forget that Microsoft is also doing all sorts of other research in many areas beyond search. So these numbers from the other day do make sense to me, comparatively.