02/04/2018

How to Cool Your Server Room and Reduce Your Energy Costs

Whether your organisation runs a small server closet, a large server room or an onsite datacentre, ambient temperature control is critically important to prevent system downtime.

The electronics and microprocessors within a server or computer generate heat, which must be managed within a confined space such as a server rack cabinet, server room or datacentre.

Cooling fans are built into the server or computer casing to draw air in through the front of the unit and exhaust it through rear panel grilles. If the internal fan(s) fail or the ambient air drawn through the front panel is too warm, heat will build up, with the potential for component failure and fire. This risk is always present and increases significantly where server cabinets or racks (housing more than one server) are installed.

In addition, whilst virtualisation technologies have reduced the number of physical servers required to perform computing tasks, the power drawn by typical servers and the energy they generate as heat (measured in BTU/hr) have risen significantly.

A fully populated server rack can draw 15kW or more. Server technologies have become more efficient in their design, with lower heat outputs, but at 15kW and, say, 95% operating efficiency, the remaining 5% (0.75kW) still leaves around 2,560 BTU/hr of heat to deal with.
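
As a rough illustration of that arithmetic, the short Python sketch below converts a rack power draw and an assumed operating efficiency into a residual heat load in BTU/hr. The 15kW draw and 95% efficiency are the example figures used above, not measured values.

```python
# Rough conversion of a rack's residual heat load from kW to BTU/hr,
# following the article's example figures (15 kW draw, 95% efficiency).

BTU_HR_PER_KW = 3412.14  # 1 kW of heat is roughly 3,412 BTU/hr


def residual_heat_btu_hr(rack_power_kw: float, operating_efficiency: float) -> float:
    """Heat (BTU/hr) left to remove after allowing for operating efficiency."""
    heat_kw = rack_power_kw * (1.0 - operating_efficiency)
    return heat_kw * BTU_HR_PER_KW


print(round(residual_heat_btu_hr(15.0, 0.95)))  # ~2,559 BTU/hr
```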

Most server rooms and datacentres should be run at around 20-25˚C to provide a sufficiently cool environment for the servers and any associated UPS systems and their batteries, whilst also maintaining a comfortable working environment for the IT team. ASHRAE (the American Society of Heating, Refrigerating and Air-Conditioning Engineers) also advocates running IT rooms at higher temperatures, provided changes are made to the operational infrastructure, e.g. moving UPS valve-regulated lead acid batteries to a separate plant room. Most server, IT peripheral and UPS electronics will operate at temperatures above 30˚C without detriment to their long-term performance, and this reduces the need for cooling and the associated energy usage. For more information see the data centre standard ANSI/ASHRAE 90.4-2016 (https://www.ashrae.org/about/news/2016/data-center-standard-published-by-ashrae).

Computer Room Cooling Efficiency Checklist

Datacentres are buildings dedicated to providing managed and controlled environments for server facilities. Most organisations now operate some form of Cloud-based servers in addition to their on-site server rooms or server closets, and this trend will only increase with the adoption of Internet of Things (IoT) technologies and connectivity. The key issue is to manage your on-site server room or closet as a controlled environment: it is, in effect, a mini-datacentre, without which your organisation may be incapable of operating.

The Server Room Environments team provide power, cooling and energy efficiency audits as a free-of-charge service. Here is our top-10 energy efficiency audit checklist for computer rooms, IT closets, server rooms and datacentres:

  1. Cooling Temperature Settings: cooling systems are control loops driven by their thermostat settings. Measure the room ambient at several points within the server room and compare these readings to your thermostat/temperature control settings. The larger the differential, the less efficient the cooling. Ideally there should only be a 1-2˚C difference.
  2. Air Conditioner Sizing: audit the load (server requirements), the size of the air conditioning in the room and its positioning. Ideally the air conditioning system should be sized to cool the ambient to 20-25˚C and leave a 20% safety margin (see the sizing sketch after this checklist).
  3. Cooling Airflow and Containment: cool air is normally drawn into the front of a server and expelled through the rear vents. The same applies when servers are positioned inside a server cabinet. Multiple cabinets should be arranged in a hot-aisle/cold-aisle layout to optimise cooling efficiency. Hot air should be expelled from the rear of the cabinets into the ‘hot-aisle’, where it is collected and drawn into the computer room air conditioner (CRAC) unit. The front of the server racks should face the ‘cold-aisle’ and be supplied with conditioned/cooled air.
  4. Server Cabinet Efficiency: it is common to find temperature variations within a server cabinet. Heat rises, and the highest temperatures are generally found towards the top of the cabinet. Blanking panels should be used to fill any unused spaces within the server rack, so that cold air cannot ‘escape’ through the front of the unit and hot exhaust air cannot recirculate to the front. A typical server rack cabinet can see a 20% temperature differential between the top and bottom of the rack.
  5. Ambient Temperature Monitoring: even in a small facility it is important to have some form of remote ambient and environmental monitoring and alarm system. This can take several forms: there are dedicated environmental monitoring systems as well as accessories that can be connected to power distribution units (PDUs) and access control systems. The important point is to ensure that they are IP-enabled so that measurements and alarms can be generated and sent over the IT network (see the monitoring sketch after this checklist). Sudden or alarm-triggering changes should be investigated immediately.
  6. Additional and Redundant Cooling: air conditioners require regular routine maintenance, and this can mean a complete IT power down. Where this is the case and there is no redundancy built into the cooling system, either the entire IT operation has to shut down or additional cooling in the form of portable air conditioners must be installed. Temporary and portable cooling systems are available from Server Room Environments on hire contracts.
  7. Air Conditioner Maintenance: regular maintenance is required to ensure your cooling system is fully operational and energy efficient. There are consumable items, including filters, within an air conditioner that will require replacement. Other components have a defined working life, e.g. cooling fans. During a preventative maintenance visit, an HVAC (heating, ventilation and air conditioning) engineer should carry out visual inspections of the entire cooling system, including checking vents, ducts and exhausts for blockages and dust, dirt and grime build-up.
  8. Secure IT Facility: server rooms and closets need to be well planned, with an efficient use of space. This will mean ‘empty’ floor space which, to the untrained eye, looks available for general storage. Avoid this at all costs and secure the room to prevent unauthorised access. Anything stored within the room can block air flow and, if it is operational (electrical/electronic), adds to the cooling demand.
  9. Energy Efficiency: it is important to consider the age and efficiency of all the components within the room, not just the servers. Lighting generates heat and should only be switched on when the room is occupied, and/or upgraded to energy-efficient LED lighting, which has a far lower heat output than traditional halogen bulbs or fluorescent strip lights. For a typical halogen light bulb, only 10% of the energy consumed converts to light; the other 90% is emitted as heat. Air conditioners and uninterruptible power supplies (UPS systems) should also be upgraded to the latest designs to maintain the highest levels of operational efficiency.
  10. Growth Factor: plan for the unexpected. IT technology continues to evolve at a rapid rate and it is important to build growth into the server room cooling design. This can go both ways: additional kit may be added, including more powerful servers, or IT equipment may be removed through virtualisation or as services are outsourced to a Cloud datacentre. The cooling system needs to be able to accommodate load changes while maintaining energy efficiency.
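
As referenced in the Air Conditioner Sizing item above, here is a minimal sketch of one way to turn an audited heat load into a target cooling capacity with the 20% safety margin mentioned in the checklist. The individual heat-source figures in the example are illustrative assumptions, not survey data.

```python
# Simple air conditioner sizing estimate: sum the heat sources in the room
# (in kW) and add the 20% safety margin suggested in the checklist above.

BTU_HR_PER_KW = 3412.14


def required_cooling_kw(heat_sources_kw: dict, safety_margin: float = 0.20) -> float:
    """Total cooling capacity (kW) needed to cover all heat sources plus a margin."""
    total_heat_kw = sum(heat_sources_kw.values())
    return total_heat_kw * (1.0 + safety_margin)


# Illustrative room: these values are assumptions for the example only.
room = {
    "servers_and_racks": 10.0,   # IT load
    "ups_losses": 0.6,           # UPS running losses
    "lighting_and_people": 0.4,  # lights, occupants, other equipment
}

capacity_kw = required_cooling_kw(room)
print(f"Required cooling: {capacity_kw:.1f} kW "
      f"({capacity_kw * BTU_HR_PER_KW:,.0f} BTU/hr)")
```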

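As referenced in the Ambient Temperature Monitoring item above, the sketch below outlines a minimal polling loop that flags readings outside the 20-25˚C band discussed earlier. The read_sensor_celsius() and raise_alert() functions are placeholders; a real deployment would query your IP-enabled sensors or PDU accessories and feed your existing alarm system.

```python
# Minimal ambient temperature polling sketch. The sensor read and alert
# functions are placeholders for whatever IP-enabled monitoring kit is in use.
import random
import time

LOW_C, HIGH_C = 20.0, 25.0   # target ambient band from the article
POLL_SECONDS = 60


def read_sensor_celsius(sensor_id: str) -> float:
    """Placeholder: replace with a query to a real IP-enabled sensor or PDU."""
    return random.uniform(18.0, 28.0)


def raise_alert(sensor_id: str, reading: float) -> None:
    """Placeholder: replace with email/SNMP trap/ticketing integration."""
    print(f"ALERT: {sensor_id} reads {reading:.1f} C (outside {LOW_C}-{HIGH_C} C)")


def poll_once(sensor_ids: list[str]) -> None:
    for sensor_id in sensor_ids:
        reading = read_sensor_celsius(sensor_id)
        if not LOW_C <= reading <= HIGH_C:
            raise_alert(sensor_id, reading)


if __name__ == "__main__":
    while True:
        poll_once(["rack-top", "rack-bottom", "cold-aisle"])
        time.sleep(POLL_SECONDS)
```
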
It is often all too easy to overlook potential energy savings within a server room and the cooling system itself. Over the life of an air conditioning system its operational load and demands will change, so it should be reviewed at least every 2-3 years, with a view to a full system refresh around years 5-7. This reflects the pace at which air conditioning and cooling systems are evolving to keep pace with changes in IT server technologies.


Related blog posts

06/08/2018
How to Calculate Server Room Air Conditioner Sizes

With the UK experiencing one of its hottest summers since 1976, many server room air conditioners and datacentre cooling systems are facing extreme workloads. The issue is not the ambient temperature itself but the length of time the higher-than-average temperatures have persisted. For future upgrades and new system installations it may well become the norm to build in a ‘heatwave’ factor to prevent disruption and potential system failures. So how do you size an air conditioner for a server room or datacentre?

14/02/2018
The Replacement of Server Room Air Conditioners Refrigerants

From 2015, it became illegal to use Hydrochlorofluorocarbons (HCFCs), including the ozone-depleting refrigerant gas R22, in refrigeration, heat pump and air conditioning (AC) systems. R22 was commonly used in cooling systems pre-dating 2004 and its ban has had a major effect on air-conditioning costs.
