Robert X. Cringely wrote a fascinating story today, "Google Mart," about Google's plans for all the data centers and dark fiber they have been buying. Google has tens of thousands of servers and over 50 data centers scattered all over the world. This infrastructure, coupled with the JavaScript and code they already download to your PC in the background, is what will allow them to deliver Web-based software services (SaaS) in a new and interesting way. Google has been quietly buying up huge amounts of dark fiber to connect all of their distributed data centers. The plan will take years to unfold, but the pieces are falling into place.
Dark fiber is unused fiber optic cable in the communications infrastructure; it is "dark" because no traffic is lighting it up. During the telecom boom of the 1990s, carriers built out an infrastructure with roughly ten times the necessary capacity. The reasoning was that the fiber itself was cheap; nearly all of the cost was in the labor and trenching required to bury the cable, so it was much cheaper to lay it all at once than to add capacity every few years. When those telecom companies later fell on hard times, Google quietly bought the unused fiber capacity for pennies on the dollar. A very shrewd move.
So, why has Google been buying up lots of dark fiber?
"The probable answer lies in one of Google's underground parking garages in Mountain View. There, in a secret area off-limits even to regular GoogleFolk, is a shipping container. But it isn't just any shipping container. This shipping container is a prototype data center.
Google hired a pair of very bright industrial designers to figure out how to cram the greatest number of CPUs, the most storage, memory and power support into a 20- or 40-foot box. We're talking about 5000 Opteron processors and 3.5 petabytes of disk storage that can be dropped off overnight by a tractor-trailer rig. The idea is to plant one of these puppies anywhere Google owns access to fiber, basically turning the entire Internet into a giant processing and storage grid.
While Google could put these containers anywhere, it makes the most sense to place them at Internet peering points, of which there are about 300 worldwide.
Two years ago Google had one data center. Today they are reported to have 64. Two years from now, they will have 300-plus. The advantage to having so many data centers goes beyond simple redundancy and fault tolerance. They get Google closer to users, reducing latency. They offer inter-datacenter communication and load-balancing using that no-longer-dark fiber Google owns. But most especially, they offer super-high bandwidth connections at all peering ISPs at little or no incremental cost to Google.
Where some other outfit might put a router, Google is putting an entire data center, and the results are profound."
Cringely goes on to say that Google will place these "cargo container data centers" near network access points, thereby improving performance for end users and creating its own reliable, redundant supercomputing grid.
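To get a feel for the scale those quoted numbers imply, here is a quick back-of-envelope calculation. It uses only the per-container figures and the roughly 300 peering points from the article, so it is illustrative arithmetic, not actual Google capacity:

    # Back-of-envelope arithmetic using only the figures quoted above:
    # ~5,000 CPUs and 3.5 petabytes per container, ~300 peering points.
    # Illustrative only -- not actual Google numbers.

    cpus_per_container = 5_000
    petabytes_per_container = 3.5
    peering_points = 300

    total_cpus = cpus_per_container * peering_points
    total_storage_pb = petabytes_per_container * peering_points

    print(f"CPUs across the grid:    {total_cpus:,}")          # 1,500,000
    print(f"Storage across the grid: {total_storage_pb:,.0f} PB "
          f"(~{total_storage_pb / 1000:.2f} exabytes)")        # 1,050 PB, about 1.05 EB

In other words, if even a fraction of those containers actually get deployed, the aggregate grid dwarfs anything a single conventional data center could hold.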
My guess is that Google will not compete head to head with Microsoft. That would be foolish, and it is entirely unnecessary. Remember my favorite cliché: "In a fight between a grizzly bear and an alligator, the terrain determines the victor." I think Google and Microsoft will, for the most part, stay with the terrain where they are strong, and only occasionally venture onto the other's turf.
GYMA (Google, Yahoo, Microsoft, AOL) will be important players in the next generation of the web, sometimes referred to as Web 2.0. It really is a whole new thing. The idea of SaaS (Software as a Service) is to move computing resources off individual PCs and onto the network. To do this you need high-bandwidth connections, and nearly everyone has those now. But if you can also put data storage and computing infrastructure right at the network access points... wow! This changes the dynamics significantly.
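To make the "near the access points" point concrete, here is a tiny illustrative calculation of best-case round-trip time over fiber. The 200,000 km/s figure is the approximate speed of light in glass; real latency is higher once routing, queuing, and server time are added, so these numbers are lower bounds:

    # Why proximity matters: best-case propagation delay over fiber.
    # Light travels roughly 200,000 km/s in glass (about two-thirds of c),
    # so every kilometer of distance adds delay you cannot engineer away.

    SPEED_IN_FIBER_KM_PER_MS = 200.0  # ~200,000 km/s expressed as km per millisecond

    def round_trip_ms(distance_km: float) -> float:
        """Minimum round-trip time to a server distance_km away."""
        return 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS

    for label, km in [("same metro area", 50),
                      ("cross-country (US)", 4_000),
                      ("trans-Pacific", 10_000)]:
        print(f"{label:>20}: {round_trip_ms(km):6.1f} ms round trip, best case")

A server in the same metro area is roughly half a millisecond away at best, while a trans-Pacific hop costs about 100 ms before any processing happens, which is exactly why putting storage and compute at the peering points changes what Web-based software can feel like.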
We could be on the verge of another web innovation cycle reminiscent of 1995. Some entrepreneurs have spent the last two years building products and services in anticipation of this wave. The fun is about to begin.
I went and read the article. As I like what you and Robert X. write, I'll try not to be snarky, even if Robert X.'s article leaves an enormous credibility gap.
These Internet Content Server bombs are just that. They are concentrated masses of capital expenditure that will be obsolete two years after they are deployed. Frankly, this is some gadget guru's wet dream gone horribly wrong.
And the whole idea ignores currently deployed equipment that all the major players have. Is it so woefully inadequate that Google will do a quantum leap on everyone else? Robert X. did not really establish that key point.
I really love the hook about the cordoned area in the parking garage. I understand the TV series "Alias" puts most of their scenes in that garage ;-)
We are certainly living in exciting times in software. I just hope the big players aren't paying too much attention to fiction when the facts are exciting enough.
Posted by: Walter Lounsbery | November 18, 2005 at 10:03 AM
And since Google has hired engineers who worked on Firefox, they could introduce a browser that more easily routes traffic over this private "Internet," providing faster, more reliable service for Google products.
It could also serve as the carrier for certain bandwidth-heavy or security-sensitive products that should not run on the public Internet (instead, you'd only have to reach the edge of Google's private "Internet").
Posted by: John | November 18, 2005 at 07:05 PM
See my blog posting entitled "Rail Roads versus Car Roads": http://itscout.blogspot.com/2005/11/rail-roads-versus-car-roads.html
What if we had free public Internet highways instead of fee-based private Internet service providers?
Broadband is an abundant resource; it used to be scarce. Sea changes of this magnitude always require major paradigm shifts.
Posted by: Jeff Tash | November 18, 2005 at 08:00 PM
I posted a while ago about Google's partnership with Sun. While many people were wondering whether it meant they were going to release StarOffice as a service, I wondered something else: whether they were going to take advantage of Google's network and Sun's experience selling CPU time by the hour to create the world's largest grid-for-hire. (See http://westcoastgrid.blogspot.com/2005/10/not-done-googling-after-all.html)
Posted by: Dan Ciruli | November 29, 2005 at 12:08 PM
When Google acquires small, innovative companies such as writely.com, they are not doing so, in my opinion, to compete with Microsoft, but rather to create more buzz about the company. Remember, a 2% increase in their stock price due to buzz means hundreds of millions of dollars of market cap, even if the product remains free and in beta for years to come.
When Google says they are buying dark fiber, and when their job postings say they need people who have negotiated dark fiber deals before, that in itself generates buzz, even if they really have bought a great deal of it.
Now let's say they actually do have lots of unused fiber. What could they do with it?
Google will use it to keep track of the growing web, which will be ten times bigger in a few years and will offer much richer multimedia content.
Google, in my opinion, will be the database and storage system of choice for all the SaaS (software-as-a-service) vendors. While you might get your CRM from Salesforce, NetSuite, or Salesboom.com, the actual data will be hosted on Google's network. The recent nasty outages at Salesforce suggest that if the data were hosted at Google and the software service at the CRM vendor, the affected companies most likely would still have had access to their data.
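A minimal sketch of the separation being described, assuming purely hypothetical endpoints for both the CRM application and the storage provider: the application tier is preferred, but the raw records stay reachable if it goes down.

    # Minimal sketch of hosting the data separately from the application.
    # Both URLs are hypothetical placeholders, not real services.
    import urllib.request
    import urllib.error

    CRM_APP_URL = "https://crm.example-vendor.com/api/accounts"            # hypothetical
    DATA_STORE_URL = "https://storage.example-host.com/tenant42/accounts"  # hypothetical

    def fetch(url, timeout=5.0):
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.read()

    def get_accounts():
        """Prefer the CRM application; fall back to the raw data if it is down."""
        try:
            return fetch(CRM_APP_URL)
        except (urllib.error.URLError, OSError):
            # Application tier unavailable -- read the records straight from storage.
            return fetch(DATA_STORE_URL)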
I am not suggesting Google never goes down; they have had outages in the past too. But if any one company in the world has the resources and the vested interest to be up all the time, it is Google.
The partnership with Sun could mean just that, but it could also be pure fiction.
Posted by: Tom Stefano | July 28, 2006 at 01:06 PM