Office 365’s data centres

Office 365’s low price points and deceptive ease of use mask a multi-billion dollar global investment in fibre optics, hardware, and corporate-class software tools, which makes Microsoft’s Office products the dominant choice for corporates and governments, and a compelling offering for small and medium businesses. It is not so much a matter of adulation for Microsoft as a matter of fact: the rest of Microsoft’s competition combined does not match its dominance in business email and productivity apps. So, what does Microsoft do with everyone’s data? Let’s take a look at a Microsoft data centre.

Microsoft’s Senior Operations Program Manager, Alistair Speirs, said in June 2014 that one reason clients decided to “onboard” their services from their own on-premises installations was as simple as the concern that adding more appliances to the server room carried a real risk of the floor collapsing on the staff underneath. The reasons IT departments move to Microsoft’s cloud services are not clear cut, then, but customers are relying on Microsoft’s provisioning for critical data management.

Speirs used this anecdote to explain Microsoft’s approach of building its system from the ground up. If something is already on the ground, it is hard for it to fall through the floor. That much is common sense, and Office 365’s architecture really is rooted to the ground.

The data centres and server farms are built modularly: they are designed for ease of access and uniformity in all respects. Power and power supply back-up services are built for “failover”: redundant equipment is already operational in the event of a breakdown, and faulty units are simply swapped out and rotated so that repairs can be handled elsewhere.

data centre power supply
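To make the “failover” idea concrete, here is a minimal, purely illustrative sketch in Python. The unit names and the health flag are assumptions for demonstration; the article does not describe Microsoft’s actual monitoring tooling.

```python
# A minimal sketch of the "failover" idea described above: the standby power
# unit is already running, so a fault simply shifts the load while the faulty
# unit is rotated out for repair elsewhere. Unit names and the health flag are
# hypothetical; this is illustrative only, not Microsoft's tooling.
from dataclasses import dataclass


@dataclass
class PowerUnit:
    name: str
    healthy: bool = True


def serve_load(primary: PowerUnit, standby: PowerUnit) -> PowerUnit:
    """Return whichever unit should carry the load right now."""
    if primary.healthy:
        return primary
    # Failover: the standby is already energised, so the switch is immediate.
    return standby


def swap_out(unit: PowerUnit) -> PowerUnit:
    """Replace a faulty unit; the old one is repaired off-site."""
    return PowerUnit(name=f"replacement-for-{unit.name}")


ups_a, ups_b = PowerUnit("UPS-A"), PowerUnit("UPS-B")
print(serve_load(ups_a, ups_b).name)   # UPS-A carries the load
ups_a.healthy = False                  # simulated breakdown
print(serve_load(ups_a, ups_b).name)   # UPS-B takes over immediately
ups_a = swap_out(ups_a)                # faulty unit swapped out and rotated
print(serve_load(ups_a, ups_b).name)   # replacement-for-UPS-A
```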

Modularity plays an important role in the server arrays, too. Microsoft takes delivery of completely configured server racks from hardware vendors. If more disk space is needed, engineers do not install more drives. Instead, the facility is “stamped out” and another module, or ITPAC, is installed. Here, engineers lower an air handling unit on top of a pre-assembled rack at a facility in Quincy, eastern Washington.

ITPAC cooling unit

When fully assembled, these installations are called “ITPACs”. ITPACs are built from four components: an IT load, an evaporative cooling unit, an air handling unit, and a mixing unit. The evaporative cooling unit has a mesh screen through which water slowly drips to maintain a consistent level of humidity. Air blows naturally through the evaporative cooling unit. The IT load sucks that air through, and the air handling unit provides a pressure difference between the outside and the inside of the data centre, so air is pulled through the evaporative cooling units to cool the servers without having to be driven by fans. The air handling unit pulls air out and pushes some of it back to the mixer; that is how engineers control temperature. The attraction of this model is that air conditioning is no longer a consideration, and Microsoft’s server farms and data centres live outside.

Below is the Quincy data centre in eastern Washington, where the climate is similar to Madrid’s. Once the concrete is laid, an ITPAC takes about four hours to install and needs only three connections: the ping, the pipe, and the power. It is remotely monitored, with everything operating from this one unit. Microsoft’s installation incorporates some nifty tricks to mitigate energy consumption further and keep end users’ costs down. In this installation one building block comprises about 25,000 servers.

an assembled ITPAC
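The temperature-control loop described above, where some exhaust air is recirculated back into the mix, can be sketched as a toy proportional controller. All of the temperatures and the gain below are assumed values for illustration, not figures from Microsoft.

```python
# A toy proportional controller for the air-mixing step: blend cool outside
# air with recirculated exhaust until the server intake temperature hits a
# target. All temperatures and the gain are assumed values for illustration.
OUTSIDE_TEMP_C = 8.0    # cool air arriving via the evaporative cooling unit (assumed)
EXHAUST_TEMP_C = 35.0   # warm air leaving the IT load (assumed)
TARGET_INTAKE_C = 22.0  # desired server intake temperature (assumed)


def intake_temperature(recirculated: float) -> float:
    """Intake temperature as a simple blend of outside air and recirculated exhaust."""
    return (1 - recirculated) * OUTSIDE_TEMP_C + recirculated * EXHAUST_TEMP_C


def adjust_mix(recirculated: float, gain: float = 0.01) -> float:
    """Nudge the recirculated fraction towards the target intake temperature."""
    error = TARGET_INTAKE_C - intake_temperature(recirculated)
    return min(1.0, max(0.0, recirculated + gain * error))


mix = 0.0  # start with pure outside air
for _ in range(200):
    mix = adjust_mix(mix)
print(f"recirculated fraction: {mix:.2f}, intake: {intake_temperature(mix):.1f} C")
```

In this toy loop the mix settles at roughly half recirculated air; in practice the attraction of the design is that the control is passive and pressure-driven rather than fan-powered.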

Inside the data centre, the uniformity of hardware is unmistakable. Access to hardware is restricted by “rack” and granted on a need-to-know basis. For instance, non-Office 365 engineers from Microsoft’s own corporate installations would not have access to these kinds of facilities.

server room rack
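As a rough illustration of per-rack, need-to-know authorisation, here is a toy check in Python. The role and rack names are made up; the article does not detail Microsoft’s actual access-control system.

```python
# A toy "need to know" check: access is granted per rack, per role. The role
# and rack names are made up; Microsoft's actual authorisation system is not
# described in this article.
RACK_PERMISSIONS = {
    "o365-rack-17": {"office365-dc-engineer"},
    "corp-rack-03": {"corporate-it-engineer"},
}


def may_access(role: str, rack: str) -> bool:
    """Grant access only if the role is explicitly authorised for that rack."""
    return role in RACK_PERMISSIONS.get(rack, set())


print(may_access("office365-dc-engineer", "o365-rack-17"))  # True
print(may_access("corporate-it-engineer", "o365-rack-17"))  # False: need to know only
```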

Lastly, in addition to uniform hardware and components, seemingly peripheral matters such as the co-ordination of cabling colours and cabling runs follow strict protocols, all in the interest of avoiding what engineers call configuration drift:

data centre cabling
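Configuration drift is usually caught by comparing what is actually installed against a known-good baseline. The sketch below shows that idea with made-up field names and values; it is not a description of Microsoft’s own tooling.

```python
# A small sketch of configuration-drift detection: compare each unit against a
# golden baseline and report any settings that have drifted. The field names
# and values are assumptions for illustration.
BASELINE = {"cable_colour": "blue", "firmware": "1.4.2", "rack_layout": "standard"}


def find_drift(observed: dict) -> dict:
    """Return {setting: (expected, actual)} for every value that differs from the baseline."""
    return {
        key: (BASELINE[key], observed.get(key))
        for key in BASELINE
        if observed.get(key) != BASELINE[key]
    }


unit = {"cable_colour": "blue", "firmware": "1.4.1", "rack_layout": "standard"}
print(find_drift(unit))  # {'firmware': ('1.4.2', '1.4.1')}
```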

Microsoft’s data centre in Quincy is part of a global network that comprises a core of 10–100 data centres at the time of writing, with subsidiary “Edge Nodes” and other “Metro Solutions” providing a mechanism for delivering content to last-mile user end points within its Office 365 wide area network.

Office 365’s global infrastructure
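To illustrate the general role of edge nodes, the sketch below simply routes a user to the node with the lowest measured latency. The node names and latency figures are invented for the example, and this is not a description of Microsoft’s actual routing.

```python
# An illustrative sketch of why edge nodes exist: route each user to the node
# with the lowest observed latency. Node names and latency figures are
# invented; this is not Microsoft's routing logic.
EDGE_NODE_LATENCY_MS = {
    "edge-sydney": 18.0,
    "edge-singapore": 92.0,
    "edge-dublin": 280.0,
}


def choose_edge_node(latencies: dict) -> str:
    """Pick the edge node with the lowest measured latency for this user."""
    return min(latencies, key=latencies.get)


print(choose_edge_node(EDGE_NODE_LATENCY_MS))  # edge-sydney
```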
