
Humidity control and Chillers

Wednesday, September 29th, 2010

As recently reported, Yahoo has opened their Lockport, NY data center with an air-economization focus. While air economization has been talked about much and a few data centers have utilized it, for example the EDS (now HP) Wynyard data center in the U.K. and the Advanced Data Center in Sacramento (both Rumsey Engineers designs), not many have utilized air economization in fairly warm and humid environments. (The climate at Wynyard is humid yet cool, while ADC in Sacramento is warm yet dry.) Yahoo can do this because they are removing data center humidity control.

Many folks balk at that idea, yet server (plus network and most storage equipment) specs allow 5-95% RH, often broader, and the NEBS standard, which has governed telco equipment for decades, has never had a humidity requirement. (So why does ASHRAE? That's a topic for a future blog.) Despite the hesitation, there is no known loss of IT equipment from lack of humidity control. (Intel has published a paper showing that even dramatic humidity swings, 10-90% within one hour, had minimal effect on equipment failures, and IBM has published a study on high humidity combined with unusually high concentrations of gaseous pollutants.)

I ran a data center in dry Reno for a co-lo company in 2003-2005, and then again for BEA (now Oracle) in 2006-2007, with very broad humidity ranges in order to save energy. I slowly expanded the range until no humidification was done at all; humidity ranged from 10-30% year-round without any equipment failures, ESD issues, etc. The Yahoo data center will be another excellent test bed to show the effects of no humidity control on hardware at large scale.
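To make the point concrete, here is a minimal sketch (not Yahoo's actual control logic, and the envelope values are illustrative assumptions) of why dropping humidity control expands economizer hours: an outside-air reading qualifies far more often when checked against the wide RH tolerance of server specs than against a tight legacy band.

```python
# Hypothetical sketch: how often outside air qualifies for direct use when
# the humidity window is widened to the ~5-95% RH that server specs allow,
# versus a tight legacy band. All thresholds below are illustrative.

def can_use_outside_air(temp_f, rh_pct,
                        temp_range=(59.0, 80.6),   # illustrative dry-bulb band, F
                        rh_range=(5.0, 95.0)):     # wide RH band per server specs
    """Return True if outside air falls inside the allowable envelope."""
    t_lo, t_hi = temp_range
    rh_lo, rh_hi = rh_range
    return t_lo <= temp_f <= t_hi and rh_lo <= rh_pct <= rh_hi

# Example hourly readings (temp F, RH %) -- made-up values for illustration.
readings = [(72, 85), (95, 40), (65, 12), (78, 92)]

wide = sum(can_use_outside_air(t, rh) for t, rh in readings)
tight = sum(can_use_outside_air(t, rh, rh_range=(40.0, 55.0)) for t, rh in readings)
print(f"Economizer hours (wide RH band):  {wide} of {len(readings)}")
print(f"Economizer hours (tight RH band): {tight} of {len(readings)}")
```

With the wide band, three of the four sample hours qualify for free cooling; with the tight band, none do.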

Air economization is a great way to reduce energy use, and while I led data center strategy and development for Yahoo (2007-2008) I championed many of these ideas. Still, I've found on recent data center design projects (by Rumsey Engineers) that water economization leads to a lower PUE in most climates. Yahoo is forecasting an excellent 1.08 PUE; I hope they'll share actual usage data after months or years of operation, as well as lessons learned.
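For readers less familiar with the metric: PUE is total facility power divided by IT equipment power, so a 1.08 PUE means only 8% of the IT load is spent on cooling, power distribution, and everything else. A quick sketch of what that implies, compared against an assumed 1.5 for a conventional facility (my comparison number, not Yahoo's):

```python
# PUE = total facility power / IT equipment power.
# Compare the overhead implied by a forecast 1.08 PUE against an assumed
# conventional 1.5 PUE (the 1.5 is my illustrative baseline).

it_load_mw = 1.0  # per MW of IT load

for pue in (1.08, 1.5):
    total_mw = it_load_mw * pue
    overhead_mw = total_mw - it_load_mw
    annual_overhead_mwh = overhead_mw * 8760  # hours in a year
    print(f"PUE {pue}: {overhead_mw:.2f} MW of overhead, "
          f"{annual_overhead_mwh:,.0f} MWh/yr beyond the IT load")
```

Per MW of IT load, that's roughly 700 MWh/yr of overhead at 1.08 versus about 4,400 MWh/yr at 1.5.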

Perhaps the most novel aspect of this Yahoo design is the decision to forgo chillers. Yahoo chose cooling towers and water economization for those hot days (which raises the question: why not water economization all of the time? The benefits would be a single cooling system instead of two, plus lower fan horsepower). Nonetheless, this is an idea I floated when I was with Yahoo and have since implemented on three other data center projects over the last year, including a design we recently developed for a large financial data center at very high redundancy: without chillers, while still meeting Tier IV availability and high density, we achieved a 1.08 PUE. The really impressive part is that our total construction budget is about $3 million per MW of IT load, a total construction cost lower than any I have seen before. In many climates, chillers are not needed to maintain an ASHRAE allowable supply temperature range, and I commend Yahoo for going chiller-less in a humid and warm climate. If they can do this, we should all ask why we need humidity control and such tight supply air ranges. Less equipment not only means lower capex and opex, but also higher availability. We don't need more equipment to gain efficiency or availability; often we need less equipment to achieve those things, and that is what all of our designs consider.
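The chiller-less reasoning can be sketched in a few lines: an evaporative system can produce water a few degrees above the outdoor wet-bulb temperature, so if the design wet-bulb plus the tower approach and heat-exchanger losses still lands below the allowable supply-air limit, no chiller is required. The approach values and the limit below are assumptions for illustration, not figures from the Yahoo or Rumsey designs.

```python
# Chiller-less feasibility sketch. Approach temperatures and the allowable
# supply-air limit are illustrative assumptions, not design figures.

def supply_air_f(wet_bulb_f,
                 tower_approach_f=7.0,   # tower leaving water above wet-bulb
                 hx_approach_f=3.0,      # plate heat exchanger loss
                 coil_approach_f=5.0):   # cooling coil air-side loss
    """Estimate achievable supply-air temperature from outdoor wet-bulb."""
    return wet_bulb_f + tower_approach_f + hx_approach_f + coil_approach_f

ALLOWABLE_SUPPLY_F = 89.6  # roughly the top of an ASHRAE allowable range

for wb in (65, 70, 75):  # candidate design wet-bulb temperatures, F
    sa = supply_air_f(wb)
    verdict = "OK without chillers" if sa <= ALLOWABLE_SUPPLY_F else "needs trim cooling"
    print(f"Wet-bulb {wb}F -> supply air ~{sa:.0f}F: {verdict}")
```

Under these assumed approaches, a design wet-bulb up to about 70F still lands within the allowable limit, which is why wide allowable supply ranges, rather than more cooling equipment, do the heavy lifting.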

On to other topics: a client of mine is looking to fill some senior electrical and mechanical engineering positions; let me know if you or someone you know is interested.

Pardon my lack of recent blog posts; a wonderful one-week holiday in France (a few selected photos I took on this trip here) combined with many new projects and project advancements has kept me working 20-hour days for several weeks now. Nonetheless, thank you for reading; I'll try to post more frequently about the many exciting projects I am working on. You can hear more from me at the SVLG Data Center Efficiency Summit on Oct 14th, or at my data center efficiency workshop in NYC in October. Au revoir!