Data Centre Refurb

As part of a contract with a travel client, we were asked to look at the layout of one of their data centres. Provantage provided a design service that created a more configurable space.

The Brief

The data centre was a typical space: racks installed in loose rows, power supplied from several different points, and a network that had grown over time with little control. We were given a free hand to redesign the data centre to free up space and make it more flexible.


The team looking after the data centre were open about the issues. It was the usual mix of uncontrolled growth, mismatched rack types and sizes, messy networking, and poor power load balancing. We also needed to find out from the business leaders what their plans were.


We began the refurb from a business point of view. What were their intentions for the on-premises data? How much would be moved to the cloud, and on what timescale? Which systems were due to be shut down? The answers would shape the new layout.

A detailed plan of the layout, air conditioning, power, network, and fire suppression was drawn up. We took an inventory of the racks and their contents, and linked each item back to its system's roadmap, which allowed us to consolidate every system approaching end of life in one place.

Power came in through an in-line UPS and fed three distribution panels, with pigtails running from individual circuit breakers. To map circuit breakers to racks, we had to pull up floor panels, and what we discovered beneath them was quite interesting. Four-way extension leads, plugged into other extension leads, provided the power distribution. Half the circuit breakers had nothing on the end of them.
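An audit like this boils down to a breaker-to-load map. A minimal sketch of the idea — the breaker IDs and rack names below are invented for illustration, not our actual records:

```python
# Hypothetical audit data: breaker ID -> rack it feeds (None = nothing on the end).
breaker_map = {
    "DP1-01": "Rack A1",
    "DP1-02": None,
    "DP1-03": "Rack A2",
    "DP1-04": None,
    "DP2-01": "Rack B1",
    "DP2-02": None,
}

# Breakers with no load can be reclaimed for the new layout.
unused = sorted(b for b, rack in breaker_map.items() if rack is None)
print(f"{len(unused)} of {len(breaker_map)} breakers unused: {unused}")
```

Keeping this map up to date as pigtails are added or swapped is what makes later maintenance predictable.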

Networks were patched from a panel in another room with overlong cables, some wrapped in tight loops, some tangled in knots, and some cut and abandoned under the floor.

Cabling was a big problem for the air conditioning too: it impeded airflow under the floor and caused the majority of the overheating alerts.

The project began by knocking down a wall and extending the false flooring. Power distribution needed to be refreshed, so the new part of the room had a new distribution panel installed with hot-swap circuit breakers. Each breaker had a pigtail cut to length to reach its rack position and laid in a cable tray. New racks were installed, each with two power distribution strips. These had environmental sensors to monitor power consumption, rack temperature, and humidity. The strips we chose could also power cycle each outlet individually.
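The per-rack telemetry those strips expose lends itself to simple threshold alerting. A hedged sketch of the pattern — the metric names, sample values, and limits below are assumptions for illustration, not the vendor's API or the site's real design envelope:

```python
# One reading per rack strip: watts drawn, temperature (deg C), relative humidity (%).
readings = {
    "Rack 01-A": {"power_w": 3100, "temp_c": 24.0, "humidity_pct": 45},
    "Rack 01-B": {"power_w": 5200, "temp_c": 31.5, "humidity_pct": 41},
}

# Example alert limits; real values come from the site's design envelope.
LIMITS = {"power_w": 5000, "temp_c": 27.0, "humidity_pct": 60}

def check(readings):
    """Return (rack, metric, value) for every reading over its limit."""
    return [
        (rack, metric, value)
        for rack, metrics in readings.items()
        for metric, value in metrics.items()
        if value > LIMITS[metric]
    ]

for rack, metric, value in check(readings):
    print(f"ALERT {rack}: {metric} = {value} (limit {LIMITS[metric]})")
```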

Each rack had a top-of-rack switch cabled over fibre to a new network core switch. This gave each rack 32 configurable copper ports and 16 fibre ports, allowing a mixture of server and storage equipment depending on requirements at the time.

The first set of migrations involved bringing the newer servers across. We carefully planned which servers could be brought down, and ensured the rack mounts would fit, ordering parts as necessary.

Servers were migrated and powered up, and application teams tested them before sign-off. With the new racks populated, we could work our way along the data centre, ripping out the old racks, power, and network cables, and installing new power distribution boards and power cables, all laid in new cable trays.

This process was repeated six more times until the whole data centre had been renovated. The client was very happy with the results and appreciative of the hard work it took.


Easier to use

With adequate power and network at every rack, any type of device can be installed quickly and without friction.

Extended capacity

By arranging the racks in the most efficient way, we were able to get more servers per square metre than before, over and above the additional space we created.


Electrical distribution is now secure and easily maintained. A rack can have one side powered down while the other stays up, and new pigtails can be installed, or existing ones swapped out, without cutting power to the whole room.
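Before dropping one side of a rack, it is worth confirming that nothing in it is single-fed. A hypothetical pre-check along those lines — the device names and feed labels are invented for illustration:

```python
# Which PDU feeds ("A" and/or "B") each device in a rack draws from.
rack_devices = {
    "srv-web-01": {"A", "B"},
    "srv-db-02": {"A", "B"},
    "switch-tor": {"A"},  # single-fed: would go dark if side A drops
}

def single_fed_on(side, devices):
    """Devices that would lose power entirely if `side` were shut down."""
    return sorted(name for name, feeds in devices.items() if feeds == {side})

at_risk = single_fed_on("A", rack_devices)
if at_risk:
    print("Do NOT power down side A; single-fed devices:", at_risk)
else:
    print("Side A can be powered down safely.")
```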

Easier to maintain

Engineers now have better visibility of the equipment, how it is networked, and how racks are consuming power.


Faults can be found quickly and easily, and devices can be routed to many different networks without worrying about physical patching. All of this adds up to saved time and money.

Additional capacity in networks and power can be added without taking down the entire data centre.

Environmental conditions can be monitored and reported to identify hotspots. These can be attended to before they become a problem.
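Hotspot detection of this kind can be as simple as watching for racks whose recent average temperature creeps past a limit. A minimal sketch, with invented sample data and an assumed 27 °C threshold:

```python
from statistics import mean

# Recent temperature samples (deg C) per rack, oldest first -- illustrative numbers.
history = {
    "Rack 03": [24.1, 24.3, 24.0, 24.2],
    "Rack 07": [26.8, 27.4, 27.9, 28.3],  # creeping upward
}

def hotspots(history, limit_c=27.0, window=3):
    """Racks whose mean temperature over the last `window` samples exceeds limit_c."""
    return sorted(
        rack for rack, temps in history.items()
        if mean(temps[-window:]) > limit_c
    )

print("Hotspots:", hotspots(history))
```

Averaging over a short window rather than alerting on single samples keeps transient spikes from paging anyone unnecessarily.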