Robert X. Cringely wrote a fascinating story today, "Google Mart," about Google's plans for all the data centers and dark fiber they have been buying. Google has tens of thousands of servers and over 50 data centers scattered all over the world. This infrastructure, coupled with the JavaScript and code they already download to your PC in the background, is what will allow them to deliver Web-based software as a service (SaaS) in a new and interesting way. Google has been quietly buying up huge amounts of dark fiber to connect all of their distributed data centers. The plan will take years to unfold, but all the pieces are falling into place.
Dark fiber is the term for unused fiber optic cable in the communications infrastructure; it is "dark" because no traffic lights it up. During the telecom boom of the '90s, carriers built out infrastructure with ten times the necessary capacity. The theory was that the fiber itself was cheap; nearly all the cost was in the labor of trenching and burying the cable, so it was much cheaper to lay everything at once than to add capacity every few years. When those telecom companies later fell on hard times, Google quietly bought up the unused fiber capacity for pennies on the dollar. A very shrewd move.
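To make the "lay it all at once" logic concrete, here is a rough back-of-the-envelope sketch. The per-mile dollar figures are purely hypothetical; the point is the shape of the math, which holds whenever trenching dwarfs the cost of the cable itself:

```python
# Back-of-the-envelope sketch of the dark-fiber build-out economics.
# All dollar figures are hypothetical, chosen only for illustration.

trenching_cost_per_mile = 100_000  # labor: digging and burying (assumed)
fiber_cost_per_mile = 2_000        # the cable itself (assumed)

# Option A: bury 10x the needed fiber in a single pass.
one_pass = trenching_cost_per_mile + 10 * fiber_cost_per_mile

# Option B: bury only what's needed now, then re-trench later to double up.
two_passes = 2 * (trenching_cost_per_mile + 5 * fiber_cost_per_mile)

print(f"One pass with 10x fiber: ${one_pass:,}/mile")    # $120,000/mile
print(f"Two separate builds:     ${two_passes:,}/mile")  # $220,000/mile
```

Excess fiber laid in one pass is nearly free, which is exactly why so much of it ended up dark.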
So, why has Google been buying up lots of dark fiber?
"The probable answer lies in one of Google's underground parking garages in Mountain View. There, in a secret area off-limits even to regular GoogleFolk, is a shipping container. But it isn't just any shipping container. This shipping container is a prototype data center.
Google hired a pair of very bright industrial designers to figure out how to cram the greatest number of CPUs, the most storage, memory, and power support into a 20- or 40-foot box. We're talking about 5000 Opteron processors and 3.5 petabytes of disk storage that can be dropped-off overnight by a tractor-trailer rig. The idea is to plant one of these puppies anywhere Google owns access to fiber, basically turning the entire Internet into a giant processing and storage grid.
While Google could put these containers anywhere, it makes the most sense to place them at Internet peering points, of which there are about 300 worldwide.
Two years ago Google had one data center. Today they are reported to have 64. Two years from now, they will have 300-plus. The advantage to having so many data centers goes beyond simple redundancy and fault tolerance. They get Google closer to users, reducing latency. They offer inter-datacenter communication and load-balancing using that no-longer-dark fiber Google owns. But most especially, they offer super-high bandwidth connections at all peering ISPs at little or no incremental cost to Google.
Where some other outfit might put a router, Google is putting an entire data center, and the results are profound."
Cringely goes on to say that Google will place these "cargo container data centers" near network access points, thereby improving performance for end users and creating its own reliable, redundant supercomputing grid.
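The performance claim is easy to sanity-check. Light in fiber covers roughly 200 kilometers per millisecond (about two-thirds the speed of light in a vacuum), so distance alone puts a hard floor under round-trip time; real round trips are worse once routing and queuing are added. A minimal sketch, with illustrative distances:

```python
# Rough physics behind "closer data centers mean lower latency".
# Light in fiber travels at roughly 200,000 km/s, about 2/3 of c.

FIBER_SPEED_KM_PER_MS = 200.0  # ~200 km per millisecond

def min_round_trip_ms(distance_km: float) -> float:
    """Best-case round trip over fiber, ignoring routing and queuing."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_MS

for label, km in [("Nearby peering point", 50),
                  ("Cross-country data center", 4_000),
                  ("Transatlantic data center", 6_000)]:
    print(f"{label:>25}: {min_round_trip_ms(km):5.1f} ms minimum round trip")
```

No amount of server horsepower can beat that floor; the only fix is to move the data center closer, which is precisely what 300 peering-point containers would do.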
My guess is that Google will not compete head-to-head with Microsoft. That would be foolish, and it is entirely unnecessary. Remember my favorite cliché: "In a fight between a grizzly bear and an alligator, the terrain determines the victor." I think Google and Microsoft will, for the most part, stick to the terrain where each is strong, and only occasionally venture onto the other's turf.
GYMA (Google, Yahoo, Microsoft, AOL) will be important players in the next generation of the web, sometimes referred to as Web 2.0. It really is a whole new thing. The idea of SaaS (Software as a Service) is to move computing resources off individual PCs and onto the network. To do this you need high-bandwidth connections, and nearly everyone has those now. But if you can get data storage near the network access points, and a computing infrastructure there as well... wow! This changes the dynamics significantly.
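One way to picture what that buys a SaaS provider: with hundreds of replicated data centers, each user can simply be routed to the nearest copy of the service. Here is a minimal sketch of that idea, with hypothetical hostnames; a real deployment would steer users with DNS or anycast routing rather than client-side probing:

```python
# Minimal sketch: route a user to the nearest of several replicated
# data centers. The hostnames below are hypothetical placeholders.
import socket
import time

ENDPOINTS = ["us-west.example.com", "us-east.example.com", "europe.example.com"]

def connect_time_ms(host: str, port: int = 80, timeout: float = 2.0) -> float:
    """Use TCP connect time as a crude proxy for network distance."""
    start = time.monotonic()
    with socket.create_connection((host, port), timeout=timeout):
        return (time.monotonic() - start) * 1000.0

def pick_nearest(hosts):
    """Return the endpoint that answered fastest, or None if all failed."""
    timings = {}
    for host in hosts:
        try:
            timings[host] = connect_time_ms(host)
        except OSError:
            continue  # endpoint unreachable; try the next one
    return min(timings, key=timings.get) if timings else None

print("Nearest endpoint:", pick_nearest(ENDPOINTS))
```

When the nearest copy of your data lives at the local peering point instead of across the continent, Web-based software starts to feel like local software.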
We could be on the verge of another web innovation cycle reminiscent of 1995. Some entrepreneurs have spent the last two years building products and services in anticipation of this wave. The fun is about to begin.