When and where was the first data centre installed? Some reports state that the first official data centre was built in the US in 1946 to house the ENIAC (Electronic Numerical Integrator and Computer). However, the UK may have been earlier if you include Alan Turing and the team at Bletchley Park, who built electromechanical code-breaking machines known as Bombes to crack the Enigma code.
Data centres have of course evolved extensively since the 1940s, driven by emerging technologies that have helped to push computing power from mainframe buildings into computer and server rooms, enterprise and colocation data centres, and even microdatacentre environments.
Whichever type you use to securely house and run your IT, they share common characteristics and decisions that have to be made regarding critical power and cooling system resilience.
A computer room is quite literally a general-purpose room within a building into which a company places its IT network server and connected networking devices. The network server is more than likely going to be a floor-standing tower, powered by a floor-standing uninterruptible power supply (UPS). The room itself is relatively small. If it is too small and there is a build-up of heat, the room may be air conditioned. A computer room is sometimes referred to as a network room, network closet, IT room or IT closet.
Most small businesses and organisations will have an area that they use as a computer room and their IT investment will be relatively small. The room is typically secured with a lock and key or push-button keypad.
A server room provides a more structured approach to providing a secure and managed environment for IT servers. Server rooms tend to use server racks to house 19-inch rack-mount servers and IT networking devices. Power is protected and provisioned either by a centralised UPS system that powers the complete room or by a decentralised approach where a rack-mount UPS is installed within each of the server cabinets. The room will be cooled using wall-mounted air conditioners, and there may be a fire suppression system and environmental monitoring (temperature and humidity). A server room can house a single rack or several racks arranged into rows or aisles.
Server rooms tend to be used by businesses and organisations with larger investments in their IT networks. The size of the organisation justifies this approach, and the server room supports the IT network across the enterprise, which could include remote locations and workers. For security, the room is connected to the building access control system and there may be additional monitoring using a local CCTV/security camera system.
A data centre is an entire building that is dedicated to running one or more server rooms or server halls. A data centre can be one of two types: an enterprise data centre, owned and run by the organisation whose IT it houses, or a colocation data centre, which rents space and services to multiple client organisations.
Colocation data centres can provide space in terms of a single server, a rack, or an entire server room or data hall. The space they provide is often referred to in terms of Public or Private cloud provision. A small website hosted cheaply by a web hosting company will be using shared virtual servers within a Public cloud. An organisation that requires a more sophisticated approach in terms of resilience, storage and security, such as a government department or blue-light service, will use a Private cloud.
A data centre will be designed to provide a specific level of resilience, normally an Uptime Institute Tier rating from I to IV, and a specific energy-efficiency measure such as Power Usage Effectiveness (PUE).
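As a rough sketch of how PUE works, the ratio divides the total power drawn by the facility by the power delivered to the IT equipment alone; the figures below are purely illustrative, not from any specific facility:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power.

    A PUE of 1.0 would mean every watt reaches the IT load; real facilities
    are higher because of cooling, lighting and power-distribution losses.
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Illustrative figures only: a 500 kW facility draw supporting a 350 kW IT load.
print(round(pue(500, 350), 2))  # 1.43
```

The lower the PUE, the less energy is spent on overheads relative to useful IT work.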
To achieve these, data centre consultants and architects are employed to complete a design that will meet specifications and provide the security and resilience required. Critical infrastructure systems will be deployed to support this in terms of critical power and cooling.
Over the next 5 to 10 years, what will the role be for server rooms and data centres? To answer this question, one has to consider the growth of 'hybrid cloud' solutions.
For some organisations, a move to cloud computing makes perfect sense, but only if the provider can offer a private cloud to ensure security and resilience. Such an organisation may be widely dispersed and operate software that is suited to cloud hosting. Examples include sales and marketing organisations that rely on their customer relationship management (CRM) software and need to use the internet to share information.
Other organisations may decide that third-party suppliers (colocation data centres) are not yet well developed enough to provide the services they need. Such an organisation may have a substantial investment in its on-site server room and IT, and may only be willing to move some aspects of its IT services off-site. A classic example is off-site backup, where a company no longer relies on local backup routines but instead signs up with a remote backup provider over the internet. This is referred to as a hybrid data centre solution.
Innovation never stops within the IT industry, and often it shapes how we work and live. The Internet of Things (IoT) and the Industrial Internet of Things (IIoT) are connecting vast quantities of devices together, and this means more data to process and store. By 2020, 70% of data will be created outside a data centre or Cloud, and data centre technologies must find ways to move and process that data. Artificial intelligence and machine-to-machine (M2M) learning will increase the data mass and the speed required for instant processing. Edge computing is one such approach, which localises data traffic and processing near the point of use without having to transport it back to a data centre for processing and re-transmission.
What this means is that there will always be a need for both. Data centres and onsite server rooms will continue into the next decade, and technologies such as 5G, IoT, Edge computing and microdatacentres will evolve to support this.
A microdatacentre is a compact and complete data centre that can be installed relatively quickly and provides a secure and managed environment into which an organisation can install its IT servers and operations. A microdatacentre can be a small, containerised solution or even a cabinet-based one. The point is that the modular or cabinet-based microdatacentre includes all the critical infrastructure systems of a server room or data centre environment, including those in the table below.
| IT Facility | Critical Power | Cooling | Monitoring | Fire Suppression | Security |
|---|---|---|---|---|---|
| Computer Room | UPS | Window or wall-mounted air conditioner | – | – | Door lock or keypad |
| Server Room | LV switchboards, UPS, PDUs, possibly a generator | Wall-mounted or floor-standing air conditioner | Temperature/humidity monitoring | In-rack or room possible | Access control system and cameras possible |
| Data Centre | Substations, LV switchboards, UPS, PDUs, generators, renewable power | Computer room air conditioners, computer room air handlers, liquid cooling, precision cooling and free cooling systems | Temperature, humidity and water-leakage monitoring, DCIM platform possible | Room and building fire suppression | Access control and CCTV systems |
| Microdatacentre | UPS, PDUs | Racks with liquid cooling or air conditioners | Temperature monitoring via local PDU plug-in sensors | In-rack possible | Access-control-system-controlled cabinet handles |
Microdatacentres take advantage of innovations in power densities. A single cabinet can house servers, a UPS system, network routers and switches and cooling. Once deployed to a site, the microdatacentre simply needs ‘plumbing in’ in terms of its power and cooling circuits and connecting to the local network.
Microdatacentres offer an alternative to an onsite server room and support Edge computing. They may well become the preferred choice for organisations that need to push their computing nearer to the point of data collection and processing. An example would be a global car manufacturer, which could have a central data centre with a mixture of cloud, server rooms and microdatacentres located throughout its global manufacturing facilities and supply chain.
Within any IT space, the two main critical infrastructure systems are power and cooling. Decisions on how best to implement critical power and cooling systems scale depending on the type of IT space deployed and the budget available. Using the Uptime Institute's Tier rating system for availability and resilience, the table below shows a comparison between the different types of IT facility and their availability levels.
| IT Space | Tier | Delivery Paths | Redundancy | Maintenance | Fault Tolerant | Availability |
|---|---|---|---|---|---|---|
| Computer Rooms, Server Rooms, Microdatacentres | I | 1 | No | No | No | 99.671%, no more than 28.8 hours of downtime per annum |
| Server Rooms, Data Centres, Microdatacentres | II | 1 | N+1 | No | No | 99.741%, no more than 22 hours of downtime per annum |
| Data Centres | III | 1 Active / 1 Passive | N+1 | Concurrent | No | 99.982%, no more than 1.6 hours of downtime per annum |
| Data Centres | IV | Multiple | 2N | Concurrent | Yes | 99.995%, no more than 26.3 minutes of downtime per annum |
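The downtime figures in the table follow directly from the availability percentages: the unavailable fraction of the year, multiplied by the hours in an average year (8,766, i.e. 365.25 days). A quick sketch of the arithmetic:

```python
HOURS_PER_YEAR = 8766  # average year of 365.25 days

def annual_downtime_hours(availability_pct: float) -> float:
    """Maximum downtime per annum implied by an availability percentage."""
    return (1 - availability_pct / 100) * HOURS_PER_YEAR

for tier, pct in [("I", 99.671), ("II", 99.741), ("III", 99.982), ("IV", 99.995)]:
    print(f"Tier {tier}: {annual_downtime_hours(pct):.1f} hours")
# Tier I gives 28.8 hours, Tier IV gives about 0.44 hours (26.3 minutes)
```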
As the Tier-levels increase, so does the financial investment required. Critical infrastructure systems become more complex and the design process more comprehensive and encompassing.
A small computer room looking for power availability may only require a single UPS system with a short-runtime battery pack, providing enough time for an orderly server shutdown in the case of a prolonged mains power failure. A server room may use modular UPS systems to introduce N+X availability, and there may be a standby power generator to keep systems running through long-duration power failures.
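As a very simplified sketch of how that shutdown runtime translates into battery capacity (real UPS sizing must also account for battery ageing, depth of discharge, power factor and temperature derating, and the efficiency figure here is an assumption):

```python
def battery_kwh_required(load_kw: float, runtime_minutes: float,
                         inverter_efficiency: float = 0.9) -> float:
    """Rough battery energy needed to hold a load for a given runtime.

    Energy = load x time, inflated to cover inverter conversion losses.
    The 0.9 efficiency is an illustrative assumption, not a datasheet value.
    """
    return load_kw * (runtime_minutes / 60) / inverter_efficiency

# e.g. a 3 kW server load needing 10 minutes for an orderly shutdown
print(round(battery_kwh_required(3, 10), 2))  # 0.56 kWh
```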
For a data centre, cooling is one of the most wasteful of its energy-related systems. The amount of heat generated by the servers must be tackled using the most energy-efficient cooling techniques, including the use of free cooling where available. Redundancy (N+X) must be built into the cooling system to prevent a fire hazard if there is a cooling system component breakdown. For a server room, single or multiple wall-mounted air conditioners may be installed, with multiple units providing cycling and redundancy. For a computer room, a portable air conditioner may be used on hot days.
Of course, when comparing whether to run an onsite server room or migrate to a Cloud-hosted solution, financial considerations also have to be taken into account. Running onsite hardware and software will require a site-specific level of investment, some of which will be upfront and some of which will be leased.
For a Cloud-hosted solution, the hardware and services are generally provided by the Cloud data centre and are charged for on a monthly basis to the client. The monthly charge is based on various factors including the number of virtual servers and storage space required, resilience and backup services. Whether a Private or Public cloud is required is also a major consideration. Private clouds are more expensive due to dedicated client specific resources. Public clouds provide shared systems.
Cloud costs will invariably be higher, but any organisation considering whether to use onsite server rooms, cloud data centre services, or a hybrid combination of the two should make a complete financial comparison. Running services in the cloud can, for some organisations, reduce local onsite operating costs including leasing, maintenance, staffing and energy. However, once an organisation has completed a full cloud migration, it is very hard to go back to onsite systems. This is another reason why most organisations will run hybrid solutions for the foreseeable future.
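The shape of that financial comparison can be sketched as a simple total-cost-over-horizon calculation; the figures below are entirely hypothetical and ignore factors such as inflation, hardware refresh cycles and staff costs:

```python
def onsite_tco(capex: float, annual_opex: float, years: int) -> float:
    """Total cost of an onsite server room: upfront spend plus running costs."""
    return capex + annual_opex * years

def cloud_tco(monthly_charge: float, years: int) -> float:
    """Total cost of a cloud subscription over the same planning horizon."""
    return monthly_charge * 12 * years

# Hypothetical figures for illustration only, over a 5-year horizon.
print(onsite_tco(capex=60_000, annual_opex=15_000, years=5))  # 135000
print(cloud_tco(monthly_charge=2_500, years=5))               # 150000
```

Even a back-of-envelope model like this makes clear why the comparison must be made over a multi-year horizon rather than month by month.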
As Edge computing, 5G and Cloud technologies develop over the next 5 to 10 years, organisations looking to refresh their IT will have to face the question of how much to push to the Cloud and how much IT hardware to keep on premise. Most will adopt a hybrid Cloud approach. Data security will be one deciding factor, but so too will the speed of data transport and the amount of data to process. Microdatacentres will continue to develop, but so will the technologies supporting on-premises computer and server rooms. No matter how much IT an organisation moves to the Cloud, it will always run some form of local network system for speed, security, resilience and backup.
If you are considering a microdatacentre or want to review your current approach to server room or data centre design, please contact us for a free site survey and review by one of our design consultants.
The rack power density calculation is one of the most fundamental in server room and data centre design. The calculation is based on a summation of the total kilowatts (kW) of power consumed by all the devices within each server cabinet. Multiplied by the total number of cabinets or server racks in the room, the total provides the basis for capacity planning and for sizing critical power protection and cooling systems.
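The calculation described above can be sketched in a few lines; the device wattages below are illustrative examples, not recommendations:

```python
def rack_power_kw(device_loads_w: list[float]) -> float:
    """Sum the power draw of every device in one cabinet, converted to kW."""
    return sum(device_loads_w) / 1000

def room_power_kw(per_rack_kw: float, rack_count: int) -> float:
    """Total room load: per-rack density multiplied by the number of racks."""
    return per_rack_kw * rack_count

# One cabinet: two 800 W servers, a 300 W switch and a 150 W router.
rack = rack_power_kw([800, 800, 300, 150])
print(round(rack, 2))                              # 2.05 kW per rack
print(round(room_power_kw(rack, rack_count=10), 2))  # 20.5 kW for a 10-rack room
```

That room total is the starting point for sizing both the UPS capacity and the cooling plant, since almost all of the electrical load re-emerges as heat.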