Whether your organisation runs a small server closet, a large server room or an on-site datacentre, ambient temperature control is critically important to prevent system downtime.
The electronics and microprocessors within a server or computer generate heat energy which must be managed in a confined space such as a server rack cabinet, server room and datacentre.
Cooling fans are built into the server or computer casing to draw air in through the front of the unit and exhaust it through rear panel grilles. If the internal fan(s) fail, or the temperature of the ambient air drawn in through the front panel is too high, heat will build up with the potential for component failure and fire. This risk is always present and increases significantly where server cabinets or racks (into which more than one server is placed) are installed.
In addition, whilst virtualisation technologies have reduced the number of physical servers required to perform computing tasks, the power drawn by a typical server, and the energy it gives off as heat (measured in BTU/hr), has risen sharply.
A fully populated server rack can draw 15kW or more. Server technologies have become more efficient in their design, with lower heat outputs, but at 15kW and, say, 95% operating efficiency this still leaves around 2,560 BTU/hr (750W) of heat to deal with.
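The arithmetic behind the figure above can be sketched as follows, using the article's own assumption that only the inefficiency fraction of the rack's draw is counted as waste heat, and the standard conversion of roughly 3.412 BTU/hr per watt:

```python
# Rough heat-output estimate for a server rack, following the figures in
# the text: a 15kW draw at 95% operating efficiency. The conversion
# factor 3.412 BTU/hr per watt is the standard one.

WATTS_TO_BTU_HR = 3.412

def rack_heat_btu_hr(draw_kw: float, efficiency: float) -> float:
    """Heat (BTU/hr) from the inefficiency fraction of the rack's draw."""
    wasted_watts = draw_kw * 1000 * (1 - efficiency)
    return wasted_watts * WATTS_TO_BTU_HR

print(round(rack_heat_btu_hr(15, 0.95)))  # ≈ 2559 BTU/hr
```

Note that, strictly, nearly all electrical power consumed by IT equipment ends up as heat in the room; the 95%-efficiency framing here simply follows the worked example in the text.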
Most server rooms and datacentres should be run at around 20-25˚C to provide a sufficiently cool environment for the servers and any associated UPS systems and their batteries, whilst also maintaining a comfortable environment for the IT team to work in. ASHRAE (American Society of Heating, Refrigerating and Air-Conditioning Engineers) also advocates running the IT room at higher temperatures, with changes to the operational infrastructure e.g. moving UPS valve-regulated lead acid batteries to a separate plant room. Most server, IT peripheral and UPS electronics will operate at temperatures above 30˚C without detriment to their long-term performance, reducing the need for cooling and the associated energy usage. For more information see the Data Centre ANSI/ASHRAE Standard 90.4-2016 (https://www.ashrae.org/about/news/2016/data-center-standard-published-by-ashrae).
Datacentres are buildings dedicated to providing managed and controlled environments for server facilities. Most organisations will now operate some form of Cloud-based servers in addition to their on-site server rooms or server closets, a trend that will increase with the adoption of Internet of Things (IoT) technologies and connectivity. The key issue is to manage your on-site server room or closet as a controlled environment. It is in effect a mini-datacentre, without which your organisation may be incapable of operating.
The Server Room Environments team provide power, cooling and energy efficiency audits as a free of charge service. Here is our top-10 energy efficiency audit checklist for computer rooms, IT closets, server rooms and datacentres:
Cooling Temperature Settings: cooling systems are closed control loops whose operation is governed by thermostat settings. Measure the room ambient temperature at several points within the server room and compare these to your thermostat/temperature control settings. The larger the differential, the less efficient the cooling. Ideally there should only be a 1-2˚C difference.
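The differential check above can be sketched in a few lines. The setpoint and sensor readings below are illustrative values, not real data:

```python
# Sketch of the thermostat differential check: compare ambient readings
# taken at several points in the room against the thermostat setpoint.
# Values below are hypothetical, for illustration only.

setpoint_c = 22.0
ambient_readings_c = [22.8, 23.4, 22.5, 24.1]  # several measurement points

worst_differential = max(abs(t - setpoint_c) for t in ambient_readings_c)

if worst_differential <= 2.0:
    print(f"OK: worst differential {worst_differential:.1f}C is within the 1-2C target")
else:
    print(f"Investigate cooling: worst differential {worst_differential:.1f}C exceeds 2C")
```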
Air Conditioner Sizing: audit the load (server requirements) and size of air conditioning in the room and its positioning. Ideally the air conditioner system should be sufficiently sized to cool the ambient to 20-25˚C and leave a 20% safety margin.
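The sizing rule above reduces to simple arithmetic: total the server load, add the 20% safety margin, and express the result in the units your air conditioner is rated in. The per-rack loads below are hypothetical:

```python
# Sketch of the AC sizing rule: sum the server load, add a 20% safety
# margin, and report the required cooling capacity in kW and BTU/hr.
# The per-rack loads are illustrative values.

WATTS_TO_BTU_HR = 3.412
SAFETY_MARGIN = 0.20  # 20% headroom, as suggested in the checklist

server_loads_kw = [3.5, 4.0, 2.5]  # hypothetical per-rack loads

required_kw = sum(server_loads_kw) * (1 + SAFETY_MARGIN)
required_btu_hr = required_kw * 1000 * WATTS_TO_BTU_HR

print(f"Required cooling capacity: {required_kw:.1f}kW ({required_btu_hr:,.0f} BTU/hr)")
```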
Cooling Airflow and Containment: cool air is normally drawn into the front of a server and expelled through the rear vents. The same format holds when servers are positioned inside a server cabinet. Multiple cabinets should be arranged into a hot-aisle/cold-aisle arrangement to optimise cooling efficiency. Hot air should be expelled from the rear of the cabinets into the ‘hot-aisle’ which is then collected and drawn into the cooling system air conditioner (computer room air conditioner – CRAC) unit. The front of the server racks should face the ‘cold-aisle’ and be supplied with conditioned/cooled air.
It is often all too easy to overlook potential energy savings within a server room and the cooling system itself. Over the life of an air conditioning system, its operational load and demands will change; these should be reviewed at least every 2-3 years, with a view to a full system refresh around years 5-7. This reflects the pace at which air conditioning and cooling systems are evolving to keep pace with changes in IT server technologies.
With the UK experiencing one of its hottest summers since 1976, many server room air conditioners and datacentre cooling systems are facing extreme workloads. The issue is not the ambient temperature itself but the length of time the higher-than-average temperatures have persisted. For future upgrades and new system installations it may well become the norm to build in a 'heatwave' factor to prevent disruption and potential system failures. So how do you size an air conditioner for a server room or datacentre?
In 2015 it became illegal to use Hydrochlorofluorocarbons (HCFCs), including the ozone-depleting refrigerant gas R22, in refrigeration, heat pump and air conditioning (AC) systems. R22 was commonly used in cooling systems pre-dating 2004, and its ban has had a major effect on air conditioning costs.