Cambridge University Case Study

University Information Services (UIS) has just taken ownership of its new data centre on the West Cambridge Site, which will enable many of the University's disparate machine rooms to be brought together into one industry-leading, energy-efficient, secure, fully managed facility. By adopting a highly efficient 'chilled water' hybrid cooling technology that is unique amongst multi-user data centres in the University sector, the data centre is expected to significantly reduce power consumption and deliver a 10% reduction in the University's carbon emissions compared with 2013 levels.

Global Context

By early 2013, it was estimated that 90% of the world's digital data had been created during the previous two years. The power required to store digital data accounts for 30-40% of a data centre's running costs, and McKinsey estimates that carbon emissions from data centres will quadruple by 2020, overtaking the airline industry in their impact on climate change. As a result, data centre design is rapidly evolving to place energy efficiency at the heart of the design process. The University of Cambridge is committed to reducing its environmental impact. Its Carbon Management Plan for 2010-2020 commits to reducing the University's energy-related emissions by 34% by 2020 against 2005/06 baseline levels, so when it came to commissioning a new data centre, energy efficiency and carbon reduction were prime motivators. Rather than attempting to refurbish its existing, diverse data storage infrastructure, the University and Cambridge Assessment formed a partnership to invest £20M in a bespoke, world-class facility to support business operations, teaching and learning, and research communities for years to come.

Multi-user requirement

Initially, the West Cambridge Data Centre will serve the current and future needs of UIS and the institutions for which it manages IT infrastructure, the High Performance Computing Service (HPCS) supporting the University's research activities, and the administrative needs of Cambridge Assessment, which manages the University's three examination boards. Cambridge University Press is also planning to share the Cambridge Assessment data hall. The different activities of these three user groups generate varying amounts of IT load, ranging from low densities of 3.5 kW per cabinet up to high densities of 30 kW per cabinet for intensive research-based data processing.

Determining the right approach

Traditional design approaches would advocate using three separate systems to support the three users' IT load types. After significant analysis, however, a 'one system' approach using the most appropriate new technology emerged as the best design solution. The early decision to supply air at the elevated ASHRAE A2 temperature range for all three user types unlocked the potential for a highly efficient design. The 'chilled water' solution pushes the industry towards a more flexible, yet still highly efficient, system delivering '100% free' cooling. This would not have been possible without the University trusting that the benefits of this novel 'chilled water' system would far outweigh the risks – a decision many data centre clients would have shied away from.

Innovative hybrid cooling technology

Many approaches to cooling were explored, including all-air indirect evaporative systems. Power Usage Effectiveness (PUE) and costing exercises were undertaken to assist with the tough decision-making. To meet the University's aspirations, however, it became clear that the right solution for our data centre would need to go beyond the capabilities of all-air evaporative cooling. Instead, a 'chilled water' system was developed to deliver the same benefits as evaporative cooling, but without the use of chillers. This has allowed us to support both the low IT densities, which use hot-aisle containment and CRAH cooling, and the high IT densities, which rely on rear-door cooling. To support the high-density cabinets for High Performance Computing, a back-of-rack cooling solution was adopted following the success of the system in a trial environment. Working with local supplier ColdLogik, the University has been able to experiment and determine the optimum system settings to deliver the highest efficiency levels.

Designed for high availability and resilience

The centre has two independently routed Point of Presence rooms providing the incoming communications to the University. All the data cabinets, CRAHs, back-of-rack coolers, and electrical equipment in the building have dual power feeds. Main power is supplied via dual 11 kV feeds from UK Power Networks (UKPN) via separate substations, and a single 3,150 kVA transformer. The centre has an initial capacity of 2,200 kVA from our provider, which can rise to 3,000 kVA when this becomes available from UKPN. Backup power is guaranteed by three 1,100 kVA generator sets configured as N+1 (only two of which are required to supply sufficient backup power), with enough fuel to run for 72 hours. The three hybrid dry cooling towers are also designed for N+1 resilience, and only two of the three are required for normal operation. Three 1,000 kVA modular UPSs – also configured as N+1 – each comprising five 200 kVA modules with intelligent controls, deliver 98% power efficiency. Two UPS output panels deliver separate A and B feeds to each cabinet via an overhead track busbar system, chosen for its flexibility. Power distribution is controlled and metered by intelligent cabinet power strips.
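As a rough sketch of the redundancy arithmetic implied by the figures above (the ratings are those quoted in this case study; the calculation itself is ours, not an official UIS design document):

    # Firm (usable) capacity of an N+1 set is the total minus the redundant unit.
    def firm_capacity_kva(units, unit_kva, redundant=1):
        return (units - redundant) * unit_kva

    generators = firm_capacity_kva(3, 1100)   # 2,200 kVA firm generator capacity
    ups        = firm_capacity_kva(3, 1000)   # 2,000 kVA firm UPS capacity
    supply_kva = 2200                         # initial UKPN supply quoted above

    print(f"Generators: {generators:,} kVA firm vs {supply_kva:,} kVA supply")
    print(f"UPS:        {ups:,} kVA firm")

In other words, losing any one generator or any one UPS still leaves enough firm capacity to cover the initial 2,200 kVA supply and the connected load.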

Four Data Halls

The new two-storey, 2,200 m² steel-and-block facility houses four data halls, designed for the different IT density requirements of its key stakeholders. There are currently 60 racks across three halls: Hall 1 accommodates the HPCS's high-density IT load of up to 900 kW; Hall 2 provides 201 kW for Cambridge Assessment's needs; and Hall 3, 240 kW for UIS's servers. Hall 4 remains deliberately unallocated and has not been fully fitted out. This forward-thinking decision allows us to install another 40-50 racks as future demand increases, leaving us with maximum flexibility to incorporate the latest technologies as they emerge. The purpose-built unit has a dedicated build room for engineering work, an operations room, a security office and meeting room space. It also has a hoist for unloading deliveries, a large service elevator designed to accommodate even the largest pieces of kit, and an argonite gas fire suppression system.

Low power, low carbon

Power usage effectiveness (PUE) is the ratio of the total power used by the data centre to the power consumed by the IT equipment itself; a perfect PUE would be 1. In January 2013, the European average was still more than 2.5. Once fully operational, the West Cambridge Data Centre aims to deliver an overall PUE of 1.2, not far off that of large global players such as Google (1.11 average for Q2 2014) and Facebook (1.06 and 1.08 averages for its two data centres, May 2013), who have made significant investments in their green data centres. As a result of this power efficiency, we expect the University's carbon emissions from its various machine rooms to fall by 10% compared with current levels.
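To illustrate the ratio with the figures quoted above (the worked numbers are ours, derived from the targets in the text):

    PUE = total facility power ÷ IT equipment power

At the target PUE of 1.2, a 1,000 kW IT load implies roughly 1,200 kW drawn by the whole facility, i.e. about 200 kW of cooling, power-distribution and other overhead. At the January 2013 European average of 2.5, the same IT load would draw around 2,500 kW.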


Challenges and Solutions

Challenges

The data centre operator Bulk faced the following challenges:

  • How to design a solution for a colocation customer when the specific IT technology, density or rack configuration is not known.
  • How to cater for the current GPU trend requiring 40-50 kW+ per rack, while also allowing for future upgrades that could scale to even higher densities, including Direct Liquid Cooling (DLC).
  • How to find a flexible solution that lets customers select and bring in their own racks of different sizes and manufacturers.
  • How to maximise operational efficiency through optimised airflow and heat rejection.
  • How to reduce the time needed for white-space fit-out and customer deployment by pre-installing and testing all cooling components.

Solutions

The ColdLogik solution delivered the following (a worked sketch of how these efficiency figures relate follows the list):

  • An Energy Efficiency Ratio (EER) of over 100 at maximum capacity.
  • On average, 15% of power reclaimed for compute compared with traditional cooling.
  • A potential cooling PUE of 1.035 with the RDC (rear door cooler).
  • Over 50,000 trees' worth of carbon saved per 1 MW ColdLogik deployment.
  • Adaptive Intelligence that controls the whole room temperature.
  • Higher water temperatures that reduce the need for mechanical cooling, whilst maintaining ASHRAE A1 temperatures.
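A minimal sketch of how an EER figure translates into cooling power draw and a cooling-only (partial) PUE. The 1 MW IT load and the comparison figure for traditional cooling are illustrative assumptions of ours, not vendor data, and the quoted 1.035 cooling PUE will also reflect pumps and heat-rejection plant that are not modelled here:

    it_load_kw = 1000.0              # assumed IT heat load (1 MW, illustrative)
    eer = 100.0                      # EER claimed at maximum capacity in the list above

    cooling_kw = it_load_kw / eer    # electrical power drawn by the cooling at that EER
    cooling_ppue = (it_load_kw + cooling_kw) / it_load_kw   # cooling-only partial PUE

    print(f"Cooling power draw:       {cooling_kw:.0f} kW")     # 10 kW
    print(f"Cooling-only partial PUE: {cooling_ppue:.2f}")      # 1.01

    # A traditional system drawing, say, ~17% of the IT load for cooling would use
    # ~170 kW here; that difference is the sort of 'reclaimed power' the list refers to.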

How elevated water temperatures can dramatically reduce water usage

WUE-USA

What happens when you implement a ColdLogik Rear Door? The graphs for San Francisco and the Bay Area compare a traditional cooling system (marked in yellow) with the ColdLogik cooling requirement (marked in grey); the full comparison is set out under 'What happens when you implement a ColdLogik rear door?' below.
With the current situation the world finds itself in, one thing has become abundantly clear: data centers have provided people a safe haven in their own homes whilst lockdowns have been enforced across the globe – at one point half of the human race was in lockdown in one form or another. There have been both positives and negatives arising from this, from the 'key worker' status held by data center employees and their primary suppliers, highlighting how governments across the world perceive the industry and its need, through to sterner examination from the wider world of energy consumption and water usage. Uptime and reliability have always driven data center design philosophy, and trade-offs have understandably been made so that operators and owners can be comfortable that consistent, proven designs keep down the risk of misalignment or miscalculation.
Whilst data centers are more efficient than they have ever been as a whole, there is still a vast amount of improvement to be made, in particular on both the energy consumed for cooling and the water consumed for adiabatic cooling.



In 2014, Lawrence Berkeley National Laboratory in California reported that 639 billion litres of water were used on data center cooling in the USA alone, and the forecast figure for 2020 was a startling 674 billion litres.

What is adiabatic cooling?

Water is traditionally used in these data center cooling solutions to obtain the lowest possible air temperature entering the external plant, extracting as much heat from the data center as possible using the natural environment before the mechanical chiller needs to be brought in. Any body of air has two temperature points: Dry Bulb (DB) and Wet Bulb (WB). The dry bulb is what you would feel if you were dry; the best way to describe wet bulb is the moment you step out of the shower before you reach the towel. The water on your body allows the air moving past you to cool you towards the wet-bulb temperature, which is always equal to or lower than the DB temperature.

For example, if the DB temperature in a room is 20°C/68°F and the WB temperature is 14°C/57°F, then a wet object (or air pushed through a wet membrane) can be cooled towards the WB temperature – until the object warms up or dries out.
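As a rough way to see how the WB temperature described above relates to DB temperature and relative humidity, the sketch below uses Stull's (2011) empirical approximation (valid roughly for 5-99% RH and -20 to 50°C DB). It is illustrative only; psychrometric charts or dedicated libraries give exact values:

    import math

    def wet_bulb_c(dry_bulb_c, rh_percent):
        # Stull (2011) approximation of wet-bulb temperature from DB and RH.
        t, rh = dry_bulb_c, rh_percent
        return (t * math.atan(0.151977 * math.sqrt(rh + 8.313659))
                + math.atan(t + rh) - math.atan(rh - 1.676331)
                + 0.00391838 * rh ** 1.5 * math.atan(0.023101 * rh)
                - 4.686035)

    # At 20°C and ~50% RH this gives roughly 13-14°C, in line with the
    # 20°C DB / 14°C WB example in the text.
    print(round(wet_bulb_c(20.0, 50.0), 1))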

Why is this usage so high?

Water usage is inversely related to the temperature of the water supplied to the data centre's internal cooling equipment: the lower the flow temperature into the data centre, the higher the water usage by the external plant. Traditional plant has a typical water flow temperature of 7°C/45°F, which means the highest ambient temperature you could exploit naturally to reach that flow temperature is around 5°C/41°F.

How can you improve this usage?

The best way to reduce this usage is to elevate the water temperature that the data centre requires to cool its equipment efficiently and effectively. The rear door cooler is a great example: unlike traditional CRAC systems, instead of mixing colder air with warm air to produce an ambient supply, it neutralises the heat at the rack itself, so a higher water temperature achieves the same result. The graphs below show the average high temperatures for DB and WB over a thirty-year period.

What happens when you implement a ColdLogik rear door?

In the graphs above you can see the marked difference between a traditional cooling system, marked in yellow, and the ColdLogik cooling requirement, marked in grey. In the case of San Francisco and the Bay Area, it's clear that with the traditional approach you would, on average, need the adiabatic system all year round and require mechanical assistance, in varying load, all year round. Moreover, as most chillers have a minimum run of 25%, less free cooling may actually be available. With the ColdLogik door, on average, you would not need to use any additional water for 7 months of the year to provide adiabatic cooling, and you would not require any mechanical assistance through the remaining 5 months either. Chillers would normally remain on site to provide redundancy on the rare occasions that a heat wave outside the average occurs, but they may never actually need to run, giving an energy saving too.

Conclusion

In conclusion, even without counting the lower water usage across the remaining 5 months – which could be substantial – the ColdLogik door would likely save a minimum of 58% of the additional water that would otherwise be consumed by traditional cooling methods.
Translated into physical water usage over the year, this could drop the projected figure of 674 billion litres of water to 283 billion litres – a reduction of 391 billion litres.

This is the equivalent of filling 156,400 Olympic swimming pools, which would cover an area 1.5 times that of the city of San Francisco.
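A back-of-envelope check of the figures quoted in this conclusion (all inputs come from the text above; the pool volume is the standard 2,500 m³, i.e. 2.5 million litres, for an Olympic pool):

    projected_litres = 674e9        # forecast US data-centre cooling water, 2020
    saving_fraction = 0.58          # minimum saving claimed for the ColdLogik door

    saved = projected_litres * saving_fraction
    remaining = projected_litres - saved
    pools = saved / 2.5e6

    print(f"Remaining usage: {remaining/1e9:.0f} billion litres")   # ~283 billion
    print(f"Water saved:     {saved/1e9:.0f} billion litres")       # ~391 billion
    print(f"Olympic pools:   {pools:,.0f}")                         # ~156,400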
If you are looking to improve your water usage with a product that is tried and tested and deployed into the market worldwide then get in touch with USystems today.

Conventional air cooling traditionally consumes significant energy when mechanical chillers are used; one way to reduce, and potentially eliminate, that additional energy is adiabatic cooling. But whilst it significantly improves efficiency, it dramatically increases water usage to feed the evaporative process. The major downside is the growing scarcity of water in certain geographical locations: a typical large-scale data center consumes as much water as around 2,500 people, which is putting pressure on local governments to drive water usage down.

By utilising liquid cooling you can effectively increase the water temperature to the point where adiabatic cooling is no longer needed, giving the best of both worlds: no excess water wasted and better energy efficiency, with a simpler site set-up. It really is a win-win-win.

WUE-Northern Europe

What happens when you implement a ColdLogik Rear Door in Stockholm and the Nordics? The graphs below compare a traditional cooling system (marked in yellow) with the ColdLogik cooling requirement (marked in grey); the full comparison is set out under 'What happens when you implement a ColdLogik rear door?' below.


One of the major active equipment manufacturers has openly said that a realistic figure for water use per MW can be 68,000 litres of water a day.

Whilst public information is scarce, a very conservative figure for water used for cooling in the Nordics is around 20 million litres a day. Importantly, a large proportion of data centre owners have used the area's climate to reduce the mechanical power requirement, which, whilst increasing water usage, provides greater overall efficiency for traditional systems.

The thirty-year average DB and WB graphs for the region show very low dry and wet bulb temperatures for a large proportion of the year, which helps efficiency as a whole.
The important factor here is that anything above the blue line can be handled using the DB alone and therefore requires no additional water. Anything between the blue line and the orange line can be cooled using an adiabatic system, and this is where the water usage comes in. Anything beneath the orange line requires additional mechanical cooling, such as a traditional chiller system, which uses maximum water plus additional power for the mechanical equipment.
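A minimal sketch of the classification just described. The water setpoint, approach temperature and monthly design highs are illustrative assumptions of ours, not the actual graph data; the thresholds simply play the role of the blue line (dry free cooling) and orange line (adiabatic cooling):

    def cooling_mode(db_c, wb_c, water_setpoint_c=24.0, approach_c=4.0):
        # Warmest ambient temperature at which the plant can still reject heat
        # into the chosen water flow temperature.
        limit = water_setpoint_c - approach_c
        if db_c <= limit:
            return "free dry cooling (no water, no chiller)"
        if wb_c <= limit:
            return "adiabatic cooling (water used, no chiller)"
        return "mechanical cooling (water plus chiller power)"

    # Hypothetical monthly design highs (DB, WB) for a temperate site.
    months = {"Jan": (8, 6), "Apr": (15, 11), "Jul": (27, 19), "Oct": (16, 12)}
    for month, (db, wb) in months.items():
        print(month, cooling_mode(db, wb))

Raising the water setpoint (as a rear door allows) raises the limit, pushing more of the year into the free-cooling and adiabatic bands.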

What happens when you implement a ColdLogik rear door?

In the graphs above you can see the marked difference between a traditional cooling system, marked in yellow, and the ColdLogik cooling requirement, marked in grey. In the case of the Nordic region, it's clear that with the traditional approach you would, on average, need the adiabatic system for two thirds of the year and require mechanical assistance, in varying load, for just under half of the year. Moreover, as most chillers have a minimum run of 25%, less free cooling may actually be available.
With the ColdLogik door, on average, you would not need to use any additional water for 9 months of the year to provide adiabatic cooling, and you would not require any mechanical assistance through the remaining 3 months either. Chillers would normally remain on site to provide redundancy on the rare occasions that a heat wave outside the average occurs, but they may never need to run, giving an additional operational saving.

Conclusion

In conclusion, even without counting the lower water usage across the remaining 3 months – which could be substantial – the ColdLogik door would likely save a minimum of 50% of the additional water that would otherwise be consumed by traditional cooling methods.

Translated into physical water usage over the year, and based on the publicly available information for the Nordic region, this could drop the current projected usage figure of 4.86 billion litres of water to 2.43 billion litres – a 50% drop. This is the equivalent of filling Iceland's famous Blue Lagoon 270 times, which really does put it in perspective.

WUE-London-FLAP

What happens when you implement a ColdLogik Rear Door in London and the FLAP markets? The graphs below compare a traditional cooling system (marked in yellow) with the ColdLogik cooling requirement (marked in grey); the full comparison is set out under 'What happens when you implement a ColdLogik rear door?' below.
One of the major active equipment manufacturers has openly said that a realistic figure for water use per MW can be 68,000 litres of water a day.

Even taking only the 13 largest data centre operations in the UK, this would equate to 58,412,000 litres of water consumed each day.

As someone who lives in the UK, I can safely say our weather isn't always the best – but that gives a wonderful opportunity to eliminate excess water use.

What happens when you implement a ColdLogik rear door?

In the graphs above you can see the marked difference between a traditional cooling system, marked in yellow, and the ColdLogik cooling requirement, marked in grey. In the case of the United Kingdom, and in particular the London area, it's clear that with the traditional approach you would, on average, need the adiabatic system all year round and require mechanical assistance, in varying load, for over half of the year. Moreover, as most chillers have a minimum run of 25%, less of the free cooling is actually available.
With the ColdLogik door, on average, you would not need to use any additional water for 8 months of the year to provide adiabatic cooling, and you would not require any mechanical assistance through the remaining 4 months either. Chillers would normally remain on site to provide redundancy on the rare occasions that a heat wave outside the average occurs, but they may never need to run, giving an additional operational saving.

Conclusion

In conclusion, even without counting the lower water usage across the remaining 4 months – which could be substantial – the ColdLogik door would likely save a minimum of 66% of the additional water that would otherwise be consumed by traditional cooling methods.

Translated into physical water usage over the year, and based on the 13 largest publicly identifiable data centres in the UK, this could drop the current projected usage figure of 21.32 billion litres of water to 7.11 billion litres – a 14.21 billion litre drop. This is the equivalent of filling 5,550 Olympic swimming pools, which would cover an area more than 130 times that of Windsor Castle and its grounds.

WUE-India

What happens when you implement a ColdLogik Rear Door in Bangalore and the Indian market? The graphs below compare a traditional cooling system (marked in yellow) with the ColdLogik cooling requirement (marked in grey); the full comparison is set out under 'What happens when you implement a ColdLogik rear door?' below.


One of the major active equipment manufacturers has openly said that a realistic figure for water use per MW can be 68,000 litres of water a day.

Whilst public information is scarce, a very conservative figure for water used for cooling in the Indian market is around 34 million litres a day, based on 500 MW of cooling capacity across the country.

The thirty-year average DB and WB graphs show that India provides a challenging environment for any cooling requirement, with high DB temperatures and correspondingly high WB temperatures.

What happens when you implement a ColdLogik rear door?

In the graphs above you can see the marked difference between a traditional cooling system, marked in yellow, and the ColdLogik cooling requirement, marked in grey. In India, it's clear that with the traditional approach you would, on average, need the adiabatic system for the whole year and require mechanical assistance, in varying load, for the whole year. Moreover, as most chillers have a minimum run of 25%, less free cooling may actually be used.
With the ColdLogik door, on average, you would not require any additional mechanical cooling on site for standard operation – that is, chillers with refrigeration circuits. Whilst these systems would normally remain on site to maintain redundancy in case of exceptional need, they would not be required on a regular basis. Water usage would be lower for 6 months of the year on the ColdLogik system, which would most likely account for a drop in water usage of around 20% across this period.

Conclusion

In conclusion, considering the lower water usage across those 6 months, the ColdLogik door would likely save a minimum of 10% of the additional water that would otherwise be consumed by traditional cooling methods.

Translated into physical water usage over the year, and based on the publicly available information for India, this could drop the current projected usage figure of 12.37 billion litres of water to 11.13 billion litres – a 10% drop. In future, as operation is pushed further into the ASHRAE allowable limits, the potential water savings will only grow.

WUE-China

What happens when you implement a ColdLogik Rear Door in the Chinese market? The graphs below compare a traditional cooling system (marked in yellow) with the ColdLogik cooling requirement (marked in grey); the full comparison is set out under 'What happens when you implement a ColdLogik rear door?' below.


One of the major active equipment manufacturers has openly said that a realistic figure for water use per MW can be 68,000 litres of water a day.

Unfortunately information is scarce, so a conservative figure of 1,000 MW of cooling capacity can be assumed across the country, which would give a usage of around 68 million litres of water per day.

The thirty-year average DB and WB graphs show that China provides a challenging environment for any cooling requirement, particularly in summer, with high DB temperatures and correspondingly high WB temperatures.

What happens when you implement a ColdLogik rear door?

In the graphs above you can see the marked difference between a traditional cooling system, marked in yellow, and the ColdLogik cooling requirement, marked in grey. In China, it's clear that with the traditional approach you would, on average, need the adiabatic system for almost the whole year and require mechanical assistance, in varying load, for half of the year. Moreover, as most chillers have a minimum run of 25%, less of the free cooling may actually be available.
With the ColdLogik door, on average, you would not need to use any additional water for 6 months of the year to provide adiabatic cooling, and you would only require mechanical cooling assistance for around 1-2 months. Chillers would normally remain on site to provide redundancy on the rare occasions that a heat wave outside the average occurs, but they may not need to run for 10 months of the year, giving an additional operational saving.

Conclusion

In conclusion, even without counting the lower water usage across the remaining 4 months – which could be substantial – the ColdLogik door would likely save a minimum of 25% of the additional water that would otherwise be consumed by traditional cooling methods.

Translated into physical water usage over the year, and based on the conservative 1,000 MW figure, this could drop the current projected usage figure of 24.82 billion litres of water to 18.6 billion litres – a 6.2 billion litre drop. That is the equivalent of filling Beijing's Bird's Nest stadium, the centrepiece of the 2008 Olympic Games, with water twice over.

OUR PRODUCTS

Data Center Products that Exceed Expectations

Discover our high-performing data center products, including the ColdLogik Rear Door Heat Exchangers (RDHx), ColdLogik InRow Coolers, and ColdLogik & EDGE LX plant.
  • ColdLogik Rear Door Heat Exchanger – CL21 Passive Rear Door Heat Exchanger: high-performance cooling at zero operational cost.
  • EDGE Solutions – EDGE-3 Soundproof MDC: enables quiet computing in limited space, with advanced intelligence and security features.
  • ColdLogik Rear Door Heat Exchanger – CL20 Active Rear Door Heat Exchanger: efficient cooling for data centers, optimising performance and reliability.
  • USpace Server Cabinet – USpace 4210 Cabinet: cost-effective and versatile rack that adapts to diverse applications, offering ease and flexibility.
  • ColdLogik InRow Coolers – CL80 600mm InRow Cooler: precision cooling for aisle containment, ensuring optimal performance.
  • ColdLogik InRow Coolers – CL80 300mm InRow Cooler: the highest cooling capacity available in the footprint of a chilled-water InRow.
  • USpace Server Cabinet – USpace 5210 Cabinet: high-density design with flexible configuration for evolving IT environments.

USystems

Data Center Solutions that Exceed Expectations

Get in touch with us at USystems Ltd to join the journey towards more efficient and sustainable data centers. Our leading and innovative technologies are designed to help you use less energy and reduce your carbon footprint on a global scale. Contact us now to explore how we can work together for a greener future.