David Chernicoff's post "Planning a datacenter? Learn from Washington's mistakes" got me thinking about the conversations my former colleagues at the Uptime Institute would have about data center design at cross-functional team meetings. It was always amazing how many layers of technical, business, political, economic and general sociological elements had to be taken into account.
If what Mr. Chernicoff has to say is correct, it appears that the data center designers in the state of Washington didn't take everything into account when working on the design and implementation plans for their new facility.
Here is an abbreviated list of things that must be considered to make sure that a data center matches organizational, technical and operational requirements:
- How much uptime is enough for the applications and workloads that will be housed in the data center? A Tier IV data center, one that must never allow a failure to stop processing, must have redundant everything. This can be very, very expensive. A Tier I data center is one that can allow some systems to be offline for a time; these data centers can be much less complex and costly.
- What are the environmental concerns for the data center's location? Is it hot? Then extensive plans for cooling must be in place. Is it humid? Then extensive plans for controlling moisture must be in place. Is it subject to earthquakes? Then plans must be in place to deal with the aftermath of such an event. You get the idea.
- What are the business objectives for this facility? This goes hand in hand with how much uptime is enough. A facility that houses workloads for many different organizations might need a great deal more flexibility in networking, cooling and power design than one that houses a single organization's IT infrastructure.
- How reliable are the local power and communications suppliers? Multiple suppliers are going to be needed to make sure those precious resources are always available. It is also likely that power generation equipment, such as diesel generators, will be needed as well. How many? How much capacity is required?
- What type of systems will be housed in the data center? Mainframes often require liquid cooling, but most midrange and industry standard systems don't have that requirement.
- What type of networking media will be used? Local area networking has distance limitations that must be taken into account, and some high-speed communications gear works well only over very short distances.
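To make the uptime trade-off concrete: the availability percentages commonly associated with the Uptime Institute's tier classifications (the specific figures below are illustrative assumptions, not quoted from the standard) translate directly into an annual downtime budget. A quick sketch of the arithmetic:

```python
# Illustrative availability targets often cited for each tier.
# These percentages are assumptions for the sake of the arithmetic,
# not official figures from the Uptime Institute standard.
MINUTES_PER_YEAR = 365.25 * 24 * 60

TIER_AVAILABILITY = {
    "Tier I": 99.671,
    "Tier II": 99.741,
    "Tier III": 99.982,
    "Tier IV": 99.995,
}

def allowed_downtime_minutes(availability_pct: float) -> float:
    """Annual downtime budget implied by an availability percentage."""
    return MINUTES_PER_YEAR * (1 - availability_pct / 100)

for tier, pct in TIER_AVAILABILITY.items():
    hours = allowed_downtime_minutes(pct) / 60
    print(f"{tier}: {pct}% availability -> about {hours:.1f} hours of downtime per year")
```

The gap is striking: under these assumed figures, a Tier I facility can tolerate more than a day of downtime per year, while a Tier IV facility gets well under an hour. That difference is what all the redundant power, cooling and networking is buying.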
This list goes on and on. Understanding all of the electrical, cooling, security and IT issues often requires a team of highly skilled people. If that type of expertise isn't on staff, outside consultants, such as those from the Uptime Institute, are going to be needed.
As the old saying, often attributed to Winston Churchill, goes: "He who fails to plan is planning to fail."