The rate at which the Google computing system has grown is as remarkable as its size. In March 2001, when the company was serving about 70 million Web pages daily, it had 8,000 computers, according to a Microsoft researcher granted anonymity to talk about a detailed tour he was given at one of Google's Silicon Valley computing centers. By 2003 the number had grown to 100,000.
Today even the closest Google watchers have lost precise count of how big the system is. The best guess is that Google now has more than 450,000 servers spread over at least 25 locations around the world. The company has major operations in Ireland, and a big computing center has recently been completed in Atlanta. Connecting these centers is a high-capacity fiber optic network that the company has assembled over the last few years.
Last year I wrote about "Google data centers and dark fiber connections," explaining how Google was buying up unused "dark" fiber from failing telecom companies and using it to tie together its massive data centers around the world. Danny Hillis says "Google has constructed the biggest computer in the world, and it's a hidden asset."
Chris Gulker has an interesting take on the story:
Markoff and Hansell peg Google's 'computer' at 450,000 servers in 25 centers worldwide. If that's true, and positing 900 million computer users in the world, then each Google server supports some 2,000 users.
2,000 users is no big deal for a web server. But it can be a heavy load for heavier, more traditional server-based networked applications. So one wonders how scalable Google's applications, Google Spreadsheets, Writely et al. might be as more customers flock to them - especially since Google will want to keep its search engine and attached ad-serving processes humming along at top speed as well.
This sounds about right. I remember at Napster we were able to support about 8,000 users per server, but that was more like a simple P2P search engine connecting users to search results. No heavy-duty computing. Web-wide search engines eat up enormous compute cycles. Web-based applications like maps, spreadsheets, and word processors will chew up cycles too. A plan of 2,000 users per server is probably adequate today, but more servers per user will be needed if application use takes off.
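The back-of-envelope math above can be sketched in a few lines of Python. The 900 million users and 450,000 servers are the article's estimates; the 500-users-per-server figure in the second example is purely hypothetical, just to show how quickly the fleet must grow if heavier web apps cut the sustainable per-server load.

```python
import math

# Capacity arithmetic from the post (all figures are estimates, not
# measured values).

def users_per_server(total_users: int, servers: int) -> float:
    """Average number of users each server must support."""
    return total_users / servers

def servers_needed(total_users: int, target_users_per_server: float) -> int:
    """Servers required to keep the load at a target per-server level."""
    return math.ceil(total_users / target_users_per_server)

# Google today, per Markoff and Hansell's figures:
print(users_per_server(900_000_000, 450_000))  # -> 2000.0

# Hypothetical: if web apps cut the sustainable load to 500 users
# per server, the fleet would have to grow fourfold:
print(servers_needed(900_000_000, 500))        # -> 1800000
```

The point of the second calculation is that server count scales inversely with users-per-server: halve the sustainable load and you must double the fleet.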
Microsoft stunned Wall Street last month when it announced a plan to spend up to $2 billion more than expected next year. Much of that money will go to build out a world-class computing infrastructure to rival, and exceed, Google's. The NYT story says "Microsoft's Internet computing effort is currently based on 200,000 servers, and the company expects that number to grow to 800,000 by 2011 under its most aggressive forecast, according to a company document."
This is a multi-billion dollar battle of the titans in a fight for the world's Internet users. There is no doubt that more and more applications and services will be hosted on the web and served to a browser. This infrastructure is what enables that to happen at the speed of light.