Data Center Efficiency: Look Before You Leap

Arthur Cole of IT Business Edge covers the range of monitoring tools on the market, from AUDIT-BUDDY to DCIM platforms to wireless sensors.

From the article: If you are looking for a less-invasive approach, a company called Purkay Labs offers the Audit-Buddy, a single device that gathers input from data center “white space” that can then be used to revamp energy practices. The device does not need to integrate into existing infrastructure and can be moved from place to place to gain a broad view of data center conditions. Functions include temperature and humidity tracking, 3D thermal baseline profiling, hot/cold loss measurement and containment, and overall air quality measurement. Although it does not provide a deep-dive view of the data center, it should give managers some initial insight into whether a full-blown DCIM platform is warranted.

The Article:

Efficiency in the data center is a big thing now, with organizations of all sizes working to develop both the infrastructure and the practices that can help lower the energy bill. But while analysis of data flows and operating characteristics within equipment racks is fairly advanced, the ability to peek under the covers to see how energy is actually being used is still very new.

To be sure, there is a variety of tools on the market these days, from simple measurement devices to full Data Center Infrastructure Management (DCIM) platforms, but more often than not the question revolves around not only what to measure, but how.

Without adequate insight into what is going on, it is nearly impossible to execute an effective energy management plan, says UK power efficiency expert Dave Wolfenden. Many standard tests, in fact, fail in this regard because they attempt to gauge the upper capabilities of power and cooling equipment, not how to maintain maximum efficiency during normal operation. New techniques like computational fluid dynamics (CFD) can help in this regard, but they must be employed with proper baselines in order to give a realistic indication of actual vs. projected results.
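
As a rough illustration of the baseline idea, here is a minimal sketch, not tied to any particular CFD or monitoring product; the rack names, readings and 2 C threshold are illustrative assumptions. It compares measured rack-inlet temperatures against a recorded baseline and flags locations drifting from the projected values.

```python
# Minimal illustration of comparing measured conditions against a baseline.
# The rack names, readings, and 2 degree C threshold are illustrative
# assumptions, not values from any specific product or standard.

baseline_inlet_c = {"rack_a1": 22.0, "rack_a2": 22.5, "rack_b1": 23.0}
measured_inlet_c = {"rack_a1": 22.4, "rack_a2": 25.1, "rack_b1": 23.2}

DRIFT_THRESHOLD_C = 2.0  # flag anything this far from its baseline value

for rack, baseline in baseline_inlet_c.items():
    measured = measured_inlet_c[rack]
    drift = measured - baseline
    status = "OK" if abs(drift) < DRIFT_THRESHOLD_C else "INVESTIGATE"
    print(f"{rack}: baseline {baseline:.1f} C, measured {measured:.1f} C, "
          f"drift {drift:+.1f} C -> {status}")
```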

When performed properly, monitoring and control processes can produce dramatic results. Yahoo recently reduced energy consumption at its Quincy, Wash., facility by 7.5 million kWh per year after deploying Panduit’s SynapSense platform. This represents a 44 percent decrease and is nearly double the company’s original target when the project launched in March. The platform incorporates wireless sensors, advanced metrics and automation to first optimize airflow and cooling capacity and then make adjustments as data and environmental conditions change. At the same time, the facility is able to improve its utilization rates and reliability because the risk of system overload is diminished.
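
For a sense of scale, the reported figures allow a rough back-of-the-envelope check; the arithmetic below is our own inference, assuming the 44 percent decrease is measured against the facility's prior annual consumption, which the article does not state explicitly.

```python
# Back-of-the-envelope arithmetic implied by the reported figures.
# Assumes "44 percent decrease" is measured against prior annual consumption;
# the article does not state the baseline explicitly.

savings_kwh = 7_500_000          # reported annual savings
reduction_fraction = 0.44        # reported percentage decrease

implied_prior_consumption = savings_kwh / reduction_fraction   # ~17 million kWh
# "nearly double the original target" suggests a target near half the savings
implied_original_target = savings_kwh / 2                      # ~3.75 million kWh

print(f"Implied prior consumption: {implied_prior_consumption / 1e6:.1f} million kWh/yr")
print(f"Implied original target:   {implied_original_target / 1e6:.2f} million kWh/yr")
```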

If you are looking for a less-invasive approach, a company called Purkay Labs offers the Audit-Buddy, a single device that gathers input from data center “white space” that can then be used to revamp energy practices. The device does not need to integrate into existing infrastructure and can be moved from place to place to gain a broad view of data center conditions. Functions include temperature and humidity tracking, 3D thermal baseline profiling, hot/cold loss measurement and containment, and overall air quality measurement. Although it does not provide a deep-dive view of the data center, it should give managers some initial insight into whether a full-blown DCIM platform is warranted.
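
As a generic illustration of what a portable spot-measurement workflow can yield (the readings and field names below are hypothetical and do not reflect AUDIT-BUDDY's actual output format), pairing cold-aisle and hot-aisle readings at each location gives a simple delta-T profile:

```python
# Generic sketch of building a hot/cold delta-T profile from portable spot
# readings. The readings, field names, and thresholds are hypothetical and
# do not reflect any vendor's actual output format.

readings = [
    {"location": "aisle_1", "cold_c": 21.5, "hot_c": 33.0, "rh_pct": 45},
    {"location": "aisle_2", "cold_c": 24.0, "hot_c": 31.0, "rh_pct": 52},
    {"location": "aisle_3", "cold_c": 22.0, "hot_c": 36.5, "rh_pct": 48},
]

for r in readings:
    delta_t = r["hot_c"] - r["cold_c"]
    # A small delta-T often hints at hot/cold air mixing (containment leakage);
    # a large one may indicate restricted airflow. Thresholds here are arbitrary.
    if delta_t < 8:
        note = "possible mixing"
    elif delta_t > 14:
        note = "check airflow"
    else:
        note = "nominal"
    print(f"{r['location']}: dT {delta_t:.1f} C, RH {r['rh_pct']}% -> {note}")
```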

Ultimately, however, it is likely that the Internet of Things will put to rest many of the questions that currently surround data center operating conditions. As Cisco’s Gordon Feller pointed out to Intelligent Utility recently, data from a thousand devices in the data center – everything from the server rack to the coffee maker – will provide a thorough view of what is happening out there and how it can be improved. This can range from improved power management tools and practices to complete redesign of the data infrastructure and employee workspace. Once individual components are contributing their own tiny portion of the broader energy picture, facilities and data managers will have the fine-grained visibility they need to make real and lasting changes.
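
Put another way, that fine-grained visibility is largely an aggregation problem; the toy sketch below (device names and wattages are invented purely for illustration) rolls per-device power readings up into a facility-level breakdown.

```python
# Toy aggregation of per-device power readings into a facility-level view.
# Device names, categories, and wattages are invented purely for illustration.

from collections import defaultdict

readings_watts = [
    ("server_rack_03", "it_load", 4200),
    ("server_rack_04", "it_load", 3900),
    ("crac_unit_01", "cooling", 2600),
    ("office_coffee_maker", "plug_load", 900),
]

by_category = defaultdict(int)
for device, category, watts in readings_watts:
    by_category[category] += watts

total = sum(by_category.values())
for category, watts in sorted(by_category.items()):
    print(f"{category}: {watts} W ({100 * watts / total:.0f}% of measured load)")
```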

While it is true that a journey of a thousand miles begins with a single step, that step is hard to take if you can’t see where you are going. Finding out the truth behind energy consumption in the data center should be the priority before any actual changes to infrastructure or practices are made. Without visibility, efforts at improving efficiency could be ineffectual at best and detrimental at worst.

Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.
