

Who's Watching the Data?

In the half-century since the invention of the integrated circuit, computer scientists and engineers have raced to discover smaller, faster and cheaper ways to turn electrons into information. In testament to their success, ICs and the computers they populate are everywhere today. And in one way or another, they are all connected.


Data centers are the points at which the data in our collective integrated circuits meet up. When you do a Google search, make a phone call, change a cable channel or scan a shopping card, you're tapping into a data center. Data centers now account for about one and a half percent of our annual electric power consumption. Today, the cost of powering and cooling a computer over its lifetime equals the cost of buying it in the first place, and the cost of installing, networking and maintaining it exceeds the purchase price by a factor of four to seven. The situation has become serious enough to warrant the creation of Carnegie Mellon University's Data Center Observatory (DCO), the first facility of its kind to address the problem head-on.
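To put those ratios in perspective, here is a back-of-the-envelope sketch of what they imply for total cost of ownership. The $3,000 purchase price is a hypothetical figure chosen only for illustration; the multipliers come from the article.

```python
# Back-of-the-envelope total-cost-of-ownership sketch using the article's ratios.
# The purchase price is a hypothetical figure, not a quoted one.
purchase_cost = 3000                              # hypothetical server price, dollars
power_and_cooling = purchase_cost * 1.0           # lifetime power/cooling roughly equals purchase
install_network_maintain = purchase_cost * 5.5    # midpoint of the four-to-seven-times range

total_cost = purchase_cost + power_and_cooling + install_network_maintain
print(f"Purchase price:                        ${purchase_cost:,}")
print(f"Lifetime power and cooling:            ${power_and_cooling:,.0f}")
print(f"Installation, networking, maintenance: ${install_network_maintain:,.0f}")
print(f"Total cost of ownership:               ${total_cost:,.0f} "
      f"({total_cost / purchase_cost:.1f}x the purchase price)")
```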


Ironically, the successful quest for better computers has produced such a profusion of data that processor speed and storage capacity have become less crucial measures of performance than energy consumption. That energy takes numerous forms: the electrical energy needed to switch billions of transistors on and off; the mechanical energy needed to keep the circuits cool enough to function properly; the standby energy needed to keep all those computers running while awaiting their next set of instructions; and last but far from least, the human energy it takes to keep the systems and networks happy.


The heart of the DCO's physical plant is a 2,000-square-foot room containing a pair of 10-foot-wide metal enclosures, one 24 feet long, the other 26. The enclosures are lined on both sides with racks of computer servers interspersed at intervals with battery backups and cooling units known as CRACs (computer room air conditioners).


The servers draw air from the main room and vent their warm exhaust air into the enclosure's central chamber, where the CRACs draw it in, cool it, and blow it back to the main room none the worse for wear. Concentrating the warm exhaust air in the central chamber reduces the system's energy consumption by obviating the need to cool all the air in the main room. Beyond these large-scale efficiencies, the DCO is fitted with sensors and instruments to monitor and control power consumption, temperature, humidity, fan speed and cooling-water consumption rate.
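As a rough illustration of what that instrumentation does, here is a minimal monitoring-loop sketch. The sensor names, the read_sensor() stand-in and the threshold values are illustrative assumptions, not the DCO's actual instrumentation.

```python
# Hypothetical sketch of a data-center monitoring poll; all names and limits
# are assumptions for illustration, not the DCO's real sensor suite.
import random

THRESHOLDS = {
    "power_kw": 250.0,           # total rack power draw
    "exhaust_temp_c": 35.0,      # warm-aisle air temperature
    "humidity_pct": 60.0,        # relative humidity
    "fan_speed_rpm": 12000.0,    # CRAC blower speed
    "chilled_water_lpm": 400.0,  # cooling-water flow rate
}

def read_sensor(name: str) -> float:
    """Stand-in for a real sensor query; returns a plausible random value."""
    return THRESHOLDS[name] * random.uniform(0.7, 1.1)

def poll_once() -> None:
    """Read every sensor once and flag any reading above its threshold."""
    for name, limit in THRESHOLDS.items():
        value = read_sensor(name)
        status = "ALARM" if value > limit else "ok"
        print(f"{name:>18}: {value:10.1f}  [{status}]")

if __name__ == "__main__":
    poll_once()  # in practice this would run on a schedule, e.g. every minute
```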


But the biggest promise of the DCO resides in its role as a university research facility for exploring data-center efficiency techniques, such as virtualization, in a realistic environment with real users. Virtualization is the process whereby one computer is programmed to simulate another. “When scaled up to data center size, virtualization allows a system to operate more efficiently by sharing a set of machines,” Dr. Greg Ganger said. “For instance, the average computer cluster around campus is in use about twenty-five percent of the time. So combining the clusters and increasing the utilization to one hundred percent would reduce the quantity of machines and power by a factor of four. The DCO gives us the building blocks to do that.”
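Ganger's factor-of-four figure follows directly from the arithmetic of consolidation, sketched below. The cluster sizes are hypothetical; only the twenty-five-percent utilization figure comes from his example.

```python
# Consolidation arithmetic behind the factor-of-four claim.
# Cluster sizes are hypothetical; the 25% utilization figure is from the article.
cluster_machines = [40, 60, 100]   # hypothetical campus clusters
current_utilization = 0.25         # each cluster busy about 25% of the time
target_utilization = 1.00          # goal after consolidating onto shared machines

total_machines = sum(cluster_machines)
useful_work = total_machines * current_utilization    # machine-equivalents of real work
machines_needed = useful_work / target_utilization    # servers required after consolidation

print(f"Machines today:               {total_machines}")
print(f"Machines after consolidation: {machines_needed:.0f}")
print(f"Reduction factor:             {total_machines / machines_needed:.1f}x")
```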


It’s good to know not only who’s watching all that data but why they’re watching it.


This article first appeared in Tom Imerito’s TEQ column, Science Fare

© Copyright 2009, Thomas P. Imerito / dba Science Communications


