As confidence has increased in Cloud-based operations, many organisations in the private and public sectors are outsourcing some of their services and facilities. The typical providers of these Cloud-based services include colocation and enterprise datacentres, some of which are in the hyperscale size range. Those services and facilities that are not outsourced remain in-house and are run from on-site server rooms.
Critical infrastructure equipment suppliers are responding to the challenges of a shifting IT paradigm by innovating throughout their product ranges. Most rack, power and cooling systems manufacturers offer small, medium and large scale solutions which tend to share common architecture, firmware, software and features. Newer developments that are also migrating into the server room facility include hot and cold aisle containment and self-contained micro data centres.
These innovations are intended to enable the provision of right-sized solutions, capable of expansion and meeting the latest specifications and standards in terms of energy efficiency, resilience, performance and size. They are also influencing the design and installation of modern on-site server rooms which are becoming more sophisticated as the power density of the IT servers operated within them increases.
With limited space for expansion, server room operators can face greater challenges than their datacentre counterparts. More attention may need to be given to optimising the available space (in both new and existing facilities) for a raised access floor, server cabinets, in-row or perimeter cooling, UPS provision and network connections.
As with the design of a datacentre, the design of a computer or server room starts with a floor plan based on a standard floor tile and the selection of one or more server racks. Small computer rooms may start out as almost closet-like installations that grow over time as more equipment is added to power a network. Some of the products may be floor standing towers that tend to take up a lot of floor space and limit air flow. Security, trip hazards, tampering and disconnection can also be a problem in a poorly arranged computer room.
As in a datacentre environment, rack mount equipment is the preferred choice for server rooms. The use of rack mount products overcomes many issues, as they can be used within server rack cabinets and scaled vertically within a right-sized rack rather than horizontally over a larger floor area. Where equipment cannot be purchased in a rack mount format, most rack manufacturers offer shelves and drawers in place of bolt-on slider rails.
Sizing for server racks is relatively easy. Server racks tend to be either 600 or 1000mm deep, and rack mount equipment is made to fit one or both depths, including a bend radius for rear-panel cables. The real benefit of using rack mount equipment lies in the use of standard heights and widths. All rack mount equipment is made to fit into a 19-inch wide space with screw-in fascia slots. Where the equipment body is smaller than this width, the product may be supplied with a 19-inch wide fascia plate.
In terms of height, rack mounted equipment is measured in U, where 1U equals 1.75 inches or 44.45mm. Most IT servers are 1 to 2U in height, as are network switches and other IT peripherals. Rack mount UPS systems and transfer switches tend to be 2U or more in height. Rack sizing is therefore based on adding up the U-height of all the equipment and selecting a suitable rack cabinet. For example, assuming a maximum depth of 1000mm and the need to house 25U of equipment, the nearest sized server rack may be 27U, leaving 2U of unused space. One point to note is that this will not leave much room for future expansion, and the compactness of equipment within the rack could lead to ‘hot-spots’. It would be better practice to choose a larger cabinet, say 33U in height, to remove these issues.
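The sizing approach above can be sketched as a simple calculation. The list of equipment U-heights, the set of cabinet sizes and the 30% headroom allowance below are illustrative assumptions, not fixed industry figures:

```python
# Common cabinet heights in U (illustrative assumption; vendors vary).
RACK_SIZES_U = [12, 18, 24, 27, 33, 42, 47]

def required_rack_u(equipment_u, headroom=0.30):
    """Return the smallest standard cabinet that fits the summed
    U-heights plus a headroom allowance for expansion and airflow."""
    total = sum(equipment_u)
    target = total * (1 + headroom)
    for size in sorted(RACK_SIZES_U):
        if size >= target:
            return size
    raise ValueError("Load exceeds largest cabinet; split across racks")

# Illustrative equipment list totalling the 25U used in the text.
equipment = [2, 2, 1, 1, 1, 2, 4, 2, 2, 1, 1, 2, 2, 2]  # sums to 25U

print(required_rack_u(equipment, headroom=0))  # 27U: nearest fit, no spare room
print(required_rack_u(equipment))              # 33U: allows growth, avoids hot-spots
```

With no headroom the function returns the tight 27U fit; with an expansion allowance it selects the larger 33U cabinet, matching the better-practice choice described above.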
Small computer rooms with one IT server, some network switches and a NAS storage device may not require cooling, provided there is no other equipment in the room and no additional thermal gain from a window (sunlight) or south-facing wall. Larger computer and server rooms with more equipment will have to consider not just precision cooling but also temperature monitoring. Heat kills equipment, and wherever air conditioning is installed within a server environment, temperature monitoring is a must. Should the air conditioning cooling a high-density server rack fail, the high amount of power being drawn (10-30kW) could lead to a sudden and critical rise in temperature, with server CPU failure within minutes and the potential for a fire to break out.
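A back-of-envelope energy balance shows why the temperature rise is so rapid. The sketch below assumes all IT load heats the room air directly and ignores the thermal mass of walls and equipment, so it is a worst-case figure; the room volume and load used in the example are illustrative:

```python
def air_temp_rise_rate(power_w, room_volume_m3,
                       air_density=1.2, air_cp=1005.0):
    """Approximate air temperature rise in degrees C per minute after a
    cooling failure: power divided by the heat capacity of the room air
    (density ~1.2 kg/m3, specific heat ~1005 J/kg.K at room conditions)."""
    joules_per_degree = air_density * room_volume_m3 * air_cp
    return power_w / joules_per_degree * 60.0

# Illustrative case: a 20kW rack (mid-range of the 10-30kW cited above)
# in a 50 cubic metre room.
rate = air_temp_rise_rate(20_000, 50)
print(f"{rate:.1f} degC per minute")  # roughly 20 degC per minute
```

Even allowing for the thermal mass the calculation ignores, minutes rather than hours are available before temperatures become destructive, which is why continuous monitoring with alarms is essential.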
A dedicated cooling and temperature monitoring system is the preferred option, rather than one that supplies the entire building. This is because building-wide systems tend to be programmed to operate when buildings are open (working days) and not 24/7 including weekends. Why is this important? IT servers are generally left powered even when not fully utilised, and room ambient temperatures can rise outside working hours, for example during a weekend or bank holiday.
Critical power is as important a consideration as cooling. Starting with the server rack it is important to install the right power distribution unit (PDU). This will generally be a vertical unit installed into the rear of the server cabinet. All the equipment within the rack is plugged into this. Hardwiring may also be an option.
The PDU or server rack PDUs (if there are multiple racks) will require some source of uninterruptible power. This could be provided by in-rack UPS systems or a centralised server room-sized UPS with its own internal or external battery cabinet. The UPS system may also be protected by an upstream standby power generator for extended power outages.
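When sizing the UPS feeding the rack PDUs, the real load in watts is converted to an apparent-power (VA) rating, with headroom for growth. The power factor and headroom values below are illustrative assumptions; always check the figures for the actual equipment:

```python
def ups_va_required(load_watts, power_factor=0.9, headroom=0.25):
    """Minimum UPS apparent-power rating (VA) for a given real load.

    Divides the real load by an assumed load power factor to get VA,
    then adds a headroom allowance for future growth."""
    return load_watts / power_factor * (1 + headroom)

# Illustrative rack drawing 4kW of real power:
va = ups_va_required(4_000)
print(f"{va:.0f} VA")  # ~5556 VA, suggesting a 6kVA UPS as the next standard size
```

Sizing against VA rather than watts matters because a UPS rated at, say, 5kVA may only deliver a lower real-power figure depending on its output power factor.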
As with the cooling and temperature monitoring, it is important to connect the communications capabilities of the PDU, UPS system(s) and standby power generator to the local IP network for monitoring. The power protection devices may be capable of SNMP communications, with the facility to issue alarms via SMS text message and email.
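The escalation logic a monitoring script might apply to readings polled from a UPS (for example over SNMP) can be sketched as below. The field names, thresholds and alert levels are illustrative assumptions, not vendor values:

```python
def classify_ups_reading(on_battery, battery_charge_pct, load_pct):
    """Map a set of UPS readings to an alert level for SMS/email escalation.

    Thresholds are illustrative: below 20% charge while on battery is
    treated as critical; any mains loss or near-overload as a warning."""
    if on_battery and battery_charge_pct < 20:
        return "critical"   # imminent shutdown: page on-call staff
    if on_battery or load_pct > 90:
        return "warning"    # mains lost, or UPS close to overload
    return "ok"

print(classify_ups_reading(on_battery=True, battery_charge_pct=15, load_pct=60))
print(classify_ups_reading(on_battery=False, battery_charge_pct=100, load_pct=95))
```

In practice the readings would come from the device's SNMP agent or vendor management card, with the classification driving the SMS and email alarms mentioned above.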
Structured cable management, including correct installation and labelling, is a fundamental requirement within any IT environment, including server rooms. The use of patch panels helps to organise cabling runs but can take some time to install in terms of cable stripping, punching down and testing for connectivity. However, the effort will result in a far tidier installation and one that should be easier to manage in the future as new equipment is added to the network or decommissioned for removal. Good cable installation and management also helps to remove potential hazards, including accidental disconnections, and helps with ventilation.
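Consistent labelling is easiest to sustain when labels are generated rather than written ad hoc. The rack/U/port scheme below is one illustrative convention, not a standard mandated by the article:

```python
def port_label(rack, u_position, port):
    """Generate a consistent patch panel label such as 'R01-U12-P04'
    from rack number, U position and port number (illustrative scheme)."""
    return f"R{rack:02d}-U{u_position:02d}-P{port:02d}"

# Label every port on a hypothetical 24-port panel at U12 of rack 1:
labels = [port_label(1, 12, p) for p in range(1, 25)]
print(labels[0], "...", labels[-1])  # R01-U12-P01 ... R01-U12-P24
```

Generating both ends of each run from the same function keeps the labelling scheme uniform as equipment is added or decommissioned.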
Raised access floors can hide a range of issues, but the underfloor plenum should be viewed as another route to ensuring good cable management and distribution. Where there is no raised access floor, overhead cable trays can be used, or the ceiling void if there is a suspended ceiling.
As mentioned, server rooms are housing higher power density equipment and server cabinets than ever before. Containerised Edge computing and micro data centres also increase the potential for a fire to quickly take hold if left unmonitored. At the very least, smoke and fire detection should be installed as part of an environment monitoring system. The best practice approach is to include some form of fire suppression, whether this is an automatic fire suppression system or local fire extinguishers and trained personnel to operate them.
The design of server rooms is being heavily influenced by the best practice approach adopted for datacentres. This is not surprising considering the commonality in equipment being used and the place of each within the IT spectrum. Server room operations are also becoming more sophisticated within an ever more connected world, one that faces a growing threat of power outages, rising temperatures and cyber security breaches. In order to guarantee the resilience and future scalability of a server room, it is important to consider each critical infrastructure element and how they are to interface and operate within what is, in effect, a dedicated mini-datacentre environment.
Our projects engineers come across many types of IT closets and computer room installations and are given the task of turning these into best practice server rooms. Many of these rooms start out with the best of intentions with equipment added sporadically to support new services and growth within an organisation. So how do we approach the task of achieving the best server room layout design?
At Server Room Environments we see a growing importance for on-site server rooms, even with more businesses turning to Cloud-based services. What’s driving the demand for on-site IT facilities is the need to prepare for the Internet of Things, Edge Computing and the level of critical infrastructure that organisations will soon need.