
Asia – the Wild, Wild West of Opportunity

July 11th, 2013

When I led Global Data Center Strategy and Development for Yahoo! (a position I immensely enjoyed and would love to do again), I developed, sought out and negotiated data center capacity in over 20 countries. Of all the regions, Asia was by far the most challenging place to find large blocks of good-quality data center capacity. I spent much time looking for data center capacity to lease in Hong Kong, Taiwan, Singapore, Japan, Korea, and India. (China was a separate entity.) I have taken countless trips to Asia since 2004, and during my tenure at Yahoo!, I traveled to Asia every month (as well as elsewhere).

Due to this limited supply of high-quality data center capacity, I was willing to take down futures on good-quality data centers to be built in Asia, especially those with network-neutral connectivity, which can be difficult to find in many Asian countries because access to cross-country networks is often limited to incumbent network providers. My counterparts at Google and I were almost always racing each other into the available data centers in these regions, so I was always surprised that the US data center providers were not rushing to build strategic capacity in Asia to take advantage of these business opportunities.

Although data center capacity in Asia has been added faster than in probably any other global region, demand persists for good-quality, network-neutral data centers throughout Asia. The correlation to network growth is evident in this statement about Equinix expanding in Osaka: “Osaka is Japan’s second largest economy after Tokyo and saw internet traffic grow a staggering 68 percent and bandwidth increase at a compound annual growth rate (CAGR) of 56 percent from 2008 to 2012 rivaling Tokyo.” http://www.datacenterknowledge.com/archives/2013/07/09/equinix-plans-data-center-in-osaka-cloudsigma-begins-deploying-across-its-footprint/

In large, mature and well-developed markets such as Osaka, intrinsic organic market demand doesn’t grow at a 56% CAGR, so this indicates to me that there is likely unmet demand finally being absorbed by new capacity. I believe the same is true for other recent data center expansions in Asia by Equinix, Digital Realty Trust, and others.
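As a quick sanity check on what a 56% CAGR implies, here is a back-of-the-envelope sketch (purely illustrative, using only the growth rate and time span quoted above):

```python
def growth_multiple(cagr: float, years: int) -> float:
    """Total growth multiple implied by a compound annual growth rate."""
    return (1.0 + cagr) ** years

# Bandwidth growing at a 56% CAGR over the four years from 2008 to 2012
# compounds to roughly a 5.9x total increase.
print(growth_multiple(0.56, 4))
```

Nearly a 6x jump in four years is far beyond organic growth in a mature market, which supports the pent-up-demand reading.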

Other locations, particularly Brazil and Mexico, have also been seeing tremendous growth that has gone largely unmet by high-quality US data center providers, and the same challenge exists in other global locations. So even though US and non-US data center providers have been adding capacity and finally expanding into new global markets, it hasn’t been enough to meet existing demand, let alone the new demand emerging in these non-US markets.

It’s well beyond time for US data center providers to change their US-centric focus and take advantage of the many opportunities outside the US, which can provide not only excellent revenue and margin growth, but also customer retention, new customer acquisition, network interconnectivity and market share dominance. All of these are immensely valuable in a marketplace far more competitive and global than it was even just a year ago. It’s time to expand globally and smartly, and to drop the ultra-conservative approach that costs high-quality data center providers market share, revenue and margin growth opportunities.

I hope that I can help guide some of these providers with these new opportunities and also the end-users find the best providers in these growing data center locations. Growth opportunities are nearly boundless in this global market. Now get out there and grow!

Apple+Reno+Solar = “Controllable Power”

July 8th, 2013

Some of you know that I have developed the Reno Technology Park along with a few others. I am the sole data center expert in the group, and when I first viewed the property, I saw its potential as a site for data centers: it is laced with electricity and natural gas transmission lines, main fiber routes cross the property, and clean power plants are nearby. However, that infrastructure alone was not enough to sway me to get involved. The project needed lower-cost power and tax options.

At my insistence, we created some unique tax incentives, but as a data center power guy who has spent nearly two decades negotiating power deals and developing power plants, I saw that the real potential was for clean, “controllable” power. I brought Apple to the site last spring, and they saw the same potential.

Fast forward just over a year, and Apple has one operational data center building, a second fast approaching commissioning, and now an announced 18-megawatt solar project adjacent to the Reno Technology Park. Here are some links to public articles about these announcements:
http://www.macrumors.com/2013/03/27/first-phase-of-apples-new-reno-nevada-data-center-ready-to-open/
http://www.datacenterknowledge.com/archives/2013/03/27/apple-ready-to-roll-in-reno-with-a-coop/
http://www.rgj.com/videonetwork/2264915824001?odyssey=mod%7Ctvideo2%7Carticle
http://www.datacenterknowledge.com/archives/2013/07/02/apple-planning-solar-farm-next-to-planned-reno-nevada-data-center/
http://www.computerworld.com/s/article/9240559/Apple_unveils_18_megawatt_solar_farm_to_power_cloud_data_center?source=CTWNLE_nlt_pm_2013-07-03

Being under NDA with Apple, I cannot expand upon these articles with information from other sources. So let’s talk about what I mean by “controllable power”: the ability to take control of what I call the “Three C’s” of cost, capacity and control.

Control means the deliverability, schedule and mix of that power, as well as controlling the future cost of the electricity.

Cost means both current and future costs. When we plan to operate a data center, we must account for the total electricity cost over the expected life, usually 10-20 years. Ideally, we don’t just want a low cost today but, more importantly, a low average cost over that life cycle. I see too many folks run to a market with low-cost electricity today without realizing that those low costs will go up, often within 1-3 years, to an average much higher than other location options. Predicting these future costs is one of the key advantages of using MegaWatt Consulting for your data center site selections, as I do not see any other company looking at all of the factors that will influence future data center costs the way we do. Would you rather choose a site with great costs before you start constructing but high costs by the time you fill it, and be surprised that it is no longer a low-cost site a few years from now, or a site that will continue to provide low costs for years to come?
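To make the “low cost today versus low average cost” point concrete, here is a minimal sketch. All prices, loads, escalation rates and the discount rate are hypothetical, not figures from any actual site:

```python
def npv_energy_cost(start_price: float, annual_escalation: float,
                    load_mw: float, years: int = 10,
                    discount: float = 0.08) -> float:
    """Net present value of electricity spend for a constant IT load.

    start_price: $/kWh in year 1; annual_escalation: yearly price growth rate.
    """
    hours_per_year = 8760
    total = 0.0
    for year in range(years):
        price = start_price * (1 + annual_escalation) ** year
        annual_cost = price * load_mw * 1000 * hours_per_year  # dollars/year
        total += annual_cost / (1 + discount) ** year
    return total

# Hypothetical sites: A is cheap today but escalates 8%/year; B costs more
# today but escalation is locked in at 2%/year.
site_a = npv_energy_cost(0.04, 0.08, load_mw=10)
site_b = npv_energy_cost(0.05, 0.02, load_mw=10)
```

With these assumed numbers, the site that is cheaper on day one ($0.04/kWh) ends up with a higher 10-year NPV of energy spend than the $0.05/kWh site with slow escalation, which is exactly the trap described above.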
And capacity is key, as there is a cost to bringing power capacity to a project, and sometimes it is enormous. For example, a few years ago I was consulting for Equinix, and the cost the utility quoted to bring power capacity to a site was equal to nearly one-third of the construction cost for an entire new, large data center, which would have added nearly a third to the total construction budget! I was able to negotiate that down to less than 10% of the total project budget, but it was still a very large expense, and one that is often not accounted for in site selection TCO estimates. All of this proves the point that controllability of power over time, in its cost, capacity, mix and deliverability, provides significant benefits to a company and its costs over time.
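A small illustration of why the interconnection quote belongs in the TCO model from day one (the dollar figures here are invented round numbers, not Equinix’s actuals):

```python
def interconnect_share(interconnect_cost: float,
                       construction_cost: float) -> float:
    """Fraction of the total project budget consumed by bringing
    utility power capacity to the site."""
    return interconnect_cost / (construction_cost + interconnect_cost)

# Hypothetical: a $150M build quoted $50M (one-third of construction cost)
# for utility interconnection, later negotiated down to $15M.
before = interconnect_share(50e6, 150e6)   # 0.25 of the combined budget
after = interconnect_share(15e6, 150e6)    # ~0.09 of the combined budget
```

A site-selection spreadsheet that omits the `before` line item misstates the project budget by a quarter in this hypothetical case.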

Whether Apple is responding to pressure from Greenpeace, New York Times articles, its stockholders, consumers or other stakeholders, having a data center site that provides flexibility across these many factors is key to adjusting to changing needs. Whether those needs concern the cost, fuel mix, deliverability or reliability of that power, all provide significant benefits when they can be controlled to meet changing needs over time. And all needs do change over time; since electricity cost drives a 10-year net present value analysis of data center ownership, “controllable power” is essential to good data center cost management.

If you’d like to take control of a key driver of your data center’s current and future costs, and to address changing pressures from stakeholders, markets and other factors, let’s talk about some options.

How to save on water costs in your data center

April 15th, 2012

Two weeks ago I spoke at the Recycled Water Use and Outreach Workshop in Sacramento. I know what you’re asking: “Why is a data center guy talking at a recycled water conference?” Well, funny you should ask.

First of all, most of my ultra-efficient designs use water for cooling, often indirect evaporative systems; hence, we trade energy use for water use. Water is far less costly than energy and often has a much lower carbon footprint and other environmental impact per unit of cooling than electricity. But it is always a bonus to use recycled water, as it has an even lower environmental impact than standard potable supply. Of course, all water IS recycled: there is only a finite number of water drops on this wonderful planet that sustains us, and every one of them has been around the water-cycle block at least a few times.

As we use water to help cool, or entirely cool, our data centers, water plays an ever greater role in achieving the greatest efficiency. Hence water quality, capacity, cost and reliability of service are just as important as any other valuable input into our system of operations, making these factors, and the future cost of water, even more important in our site selection decisions. I’ve seen water cost anywhere from $0.10 to $10.00 per 1,000 gallons. What a spread! And I’ve seen it increase at 40% per year! Wouldn’t it be nice to have a consistent price from a non-profit water system that YOU control, with full visibility into all costs? One built to meet the high-availability and quality standards of data centers, and DEDICATED to data center use? That is what you get at the Reno Technology Park!
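To show how wide that rate spread really is in dollar terms, a quick sketch. The annual volume is a made-up round number; only the per-1,000-gallon extremes come from the text above:

```python
def annual_water_cost(gallons_per_year: float, rate_per_kgal: float) -> float:
    """Annual water spend, with the rate quoted in $ per 1,000 gallons."""
    return gallons_per_year / 1000.0 * rate_per_kgal

# A hypothetical 100-million-gallon/year evaporative cooling load:
cheap = annual_water_cost(100e6, 0.10)   # $10,000/year at the low extreme
dear = annual_water_cost(100e6, 10.00)   # $1,000,000/year at the high extreme
```

The same cooling load costs 100x more per year at the high end of the range, which is why water rates deserve the same site-selection scrutiny as electricity rates.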

And it’s not just the supply but also the discharge of water. I learned much about water discharge challenges in Quincy, WA, when building the Yahoo! data center there: the local water utility wanted Microsoft and Yahoo! to pony up $10-15 million for a new water treatment plant to handle the QUANTITY of our discharge water. Our quality was fine, but the quantity was too much for the existing systems. This led me to find solutions to reduce cooling tower blowdown and avoid this $10+ million unplanned cost to our project.

I’ve always been a fan of chemical-free water treatment systems, but while looking for new solutions to our problem, I came across WCTI, which makes a chemical-free system quite different from others, one that could get our cycles of concentration up over 200! Yes, over 200 cycles of concentration, which means nearly zero blowdown, a 30-50% reduction in water consumption, and no need to pay for a new water treatment plant for the city. And it’s truly chemical-free (not even biocides), which makes it safer for people and the environment, as well as much lower cost. Keep those chiller tubes and/or pipes clean!
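The water-savings math behind cycles of concentration is simple mass balance. Here is a sketch with a hypothetical evaporation load; the 200-cycle figure comes from the WCTI discussion above, while the 3-cycle baseline is a common conventional value I am assuming for comparison:

```python
def makeup_water(evaporation: float, cycles: float) -> float:
    """Cooling tower makeup = evaporation + blowdown, where
    blowdown = evaporation / (cycles_of_concentration - 1)."""
    return evaporation + evaporation / (cycles - 1)

# Hypothetical 1,000,000 gal/month evaporation load:
conventional = makeup_water(1_000_000, 3)    # 1.5M gal/month at 3 cycles
high_cycle = makeup_water(1_000_000, 200)    # ~1.005M gal/month at 200 cycles
savings = 1 - high_cycle / conventional      # ~33% less water consumed
```

That lands at the low end of the 30-50% reduction mentioned above; the exact figure in practice also depends on drift losses and the baseline cycles the site was running before.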

This is one of the comprehensive solutions that we provide for our clients at MegaWatt Consulting. It’s about saving money, and water is just another critical part of our system. Reach out to us to learn more!

Coal-Burning Power Plants Must Finally Reduce Mercury Emissions

March 1st, 2012

Coal-burning power plants account for the vast majority of the mercury we come into contact with. I’ve read statistics that 80-95% of the mercury we are exposed to comes from coal-burning power plants. In the US, coal-fired power plants are estimated to be responsible for half of the nation’s mercury emissions.

The mercury in these emissions literally rains down on the oceans and land, falling on the crops we eat, into the rivers and oceans where we fish, onto our backyards and into our lungs. Mercury exposure leads to many very serious mental and physical disorders.

“According to the U.S Environmental Protection Agency, mercury is responsible for thousands of premature deaths and heart attacks. It can also damage children’s nervous systems and harm their ability to think and learn. The mercury, in essence, falls back to earth where it gets into the food chain.” (energy biz, “Obama Showers Coal with Mercury Rule”, Jan 3, 2012–http://www.energybiz.com/article/12/01/obama-showers-coal-mercury-rule). I’ve read in EPA reports that an estimated 50,000 premature deaths occur every year in the US due to the emissions from coal-burning power plants. Imagine losing an entire city of 50,000 people every year. That is a population not much different from Palo Alto, CA. And that figure does not count the lung-related issues, such as asthma, that develop from these emissions.

Well, the Clean Air Act provides each of us the right to clean air. As such, in December, 2011, “the EPA carried out its obligation under the 1990 Clean Air Act and demanded that coal-fired power plants implement the available technologies to reduce their emissions by 90 percent.”

These regulations are not a shock to most utilities, which have been aware of the pending rules for some time (since the Clean Air Act was put into law), and most utilities actually support the rule, as it allows them to shut down old coal-fired power plants, which are a financial, legal and environmental liability, in exchange for building new, cleaner-burning and more efficient power plants. These new regulations really only affect coal plants constructed 30 to 50 years ago. Operators can choose to bring them up to the new requirements or shut them down and replace them with new, more efficient and less polluting plants, a decision compelled not just by the new regulations but also by the need to compete with lower-cost shale gas. Since most utilities in the US earn a return on building new infrastructure, it is good business to build new power plants. Essentially, the rule sets a more level playing field for the 1,400 coal-fired US power plants and ends 20 years of uncertainty about these regulations.

Will these new regulations cause electricity prices to increase? Yes, but likely not significantly, as the “EPA estimates that the cost of carrying out the new mercury rules will be about $9.6 billion annually. But it also says that payback will be as much as $90 billion by 2016 when all power plants are expected to be in compliance, or closed. The agency expects “small changes” in the average retail electricity rates, noting that the shift to abundant shale-gas will shield consumers.” I agree with that assessment, as shale gas will keep prices down. Even though “The American Coalition for Clean Coal Electricity says that the new mercury rule, in combination with other pending coal-related regulations, will increase electricity prices by $170 billion” through 2020, that estimate is not much different from the EPA’s, and it too is likely to have a very minimal effect on electricity prices, since it is such a small percentage of total annual electricity spend.

“Coal helps make electricity affordable for families and businesses,” says Steve Miller, chief executive of the same coal group. “Unfortunately, this new rule is likely to be the most expensive rule ever imposed on coal-fueled power plants which are responsible for providing affordable electricity.” Of course, when one accounts for health-related costs, the new emissions rules are far less costly than paying for your son’s asthma medicine and your father’s lung cancer treatments. Finally, we are getting slightly cleaner air, something the Clean Air Act promised us by law over 40 years ago.

Call for Case Studies and Data Center Efficiency Projects

February 15th, 2012

As many of you know, I have chaired what has become known as the SVLG Data Center Efficiency Summit since the end of its first year’s program, in the fall of 2008, a wonderful summit held at Sun Microsystems’ Santa Clara campus. This has been a customer-focused, volunteer-driven project with case studies presented by end-users about their efficiency achievements. The goal is for all case studies to share actual savings results: what works, the best ways to improve efficiency, and ideas and support for all kinds of efficiency improvements within our data centers. We’ve highlighted software, hardware and infrastructure improvements, as well as new technologies and processes, in the belief that we all gain when we share. Through collaboration we all improve. And as an industry, if we all improve, we avoid over-regulation, help preserve our precious energy supplies and keep their costs from escalating as quickly, reduce the emissions we generate, and drive innovation. In essence, we all gain when we share ideas with each other.

As such, I consider this program immensely valuable as an industry tool for efficiency and improvement for all. Consequently, I have volunteered hundreds of hours of my time and forgone personal financial gain to chair and help advance this program, along with many other volunteers who have also given much of their time to this successful and valuable program. I do not have the resources to give my volunteer time indefinitely (I wish I did), but I do hope to provide more support or time with future corporate sponsorship.

I do hope that you can participate in this valuable program and the corresponding event held in the late fall every year since 2008. Below is more information from the SVLG. You can also call me for more info.

Attention data center operators, IT managers, energy managers, engineers and vendors of green data center technologies: A call for case studies and demonstration projects is now open for the fifth annual Data Center Efficiency Summit to be held in November 2012.

The Data Center Efficiency Summit is a signature event of the Silicon Valley Leadership Group in partnership with the California Energy Commission and the Lawrence Berkeley National Laboratory, which brings together engineers and thought leaders for one full day to discuss best practices, cutting edge new technologies, and lessons learned by real end users – not marketing pitches.

We welcome case studies presented by an end user or customer. If you are the vendor of an exciting new technology, please work with your customers to submit a case study. Case studies of built projects with actual performance data are preferred.

Topics to consider:
Energy Efficiency and/or Demand Response
Efficient Cooling (Example: Liquid Immersion Cooling)
Efficient Power Distribution (Example: DC Power)
IT Impact on Energy Efficiency (Example: Energy Impact of Data Security)
Energy Efficient Data Center Operations
In the final version of your case study, you will need to include:
Quantifiable savings in terms of kWh savings, percentage reduction in energy consumption, annual dollar savings for the data center, or CO2 reduction
Costs and ROI including all implementation costs with a breakdown (hardware, software, services, etc) and time horizon for savings
Description of site environment (age, size or load, production or R&D use)
List of any technology vendors or NGO partners associated with project
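For the “Costs and ROI” item, the simplest form of the arithmetic can be sketched as follows (all numbers here are hypothetical, purely to illustrate the breakdown a case study should include):

```python
def simple_payback_years(implementation_cost: float,
                         annual_kwh_saved: float,
                         rate_per_kwh: float) -> float:
    """Years to recover an efficiency project's cost from energy savings."""
    return implementation_cost / (annual_kwh_saved * rate_per_kwh)

# Hypothetical retrofit: $120,000 spent to save 500,000 kWh/year at $0.10/kWh.
payback = simple_payback_years(120_000, 500_000, 0.10)   # 2.4 years
```

A full submission would break the $120,000 into hardware, software and services, and state the time horizon over which the savings were actually measured.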
Please submit a short (1 page or less) statement of interest and description of your project or concept by March 2, 2012 to asmart@svlg.org with subject heading: DCES12. Final case studies will need to be submitted in August 2012. Submissions will be reviewed and considered in the context of this event.
Interested in setting up a demonstration project at your facility? We may be able to provide technical support and independent evaluation. Please call Anne at 408-501-7871 for information.