STULZ – THE NATURAL CHOICE
Best Practice – Data Centre Cooling
01/2010

Contents

1. INTRODUCTION

2. AIR FLOW LEAKAGE

3. PERFORATED TILES: NUMBER AND OPENING FACTOR

4. PERFORATED TILES: WITH ADJUSTABLE DAMPER

5. PERFORATED TILES: PLACEMENT

6. CLOSE UNUSED UNITS IN THE RACKS WITH BLANKING PANELS TO AVOID AIRFLOW SHORT CIRCUIT INSIDE THE RACK

7. AIRFLOW PHILOSOPHIES

8. RAISED FLOOR HEIGHT

9. RETURN AIR CONDITIONS

10. CHILLED WATER SYSTEM WATER CONDITIONS

11. STANDBY UNIT OPERATION

12. USE CRAC UNITS WITH STATE-OF-THE-ART COMPRESSOR AND FAN TECHNOLOGY


1. Introduction

This presentation summarises today's best practice for data-centre cooling systems.

It gives a rough overview of all parts of the data centre concerned.

The list is not exhaustive, but it covers the most important points on how to design, build and run a data centre as energy-efficiently as possible.


2. Airflow Leakage

Airflow leakage ("short-circuit" air) leads to dramatic inefficiencies, because air circulates back to the CRAC unit without taking up heat from the computer equipment.

• Close all unwanted openings in the raised floor

• Close all unwanted openings below the racks

• Close all cable cut-outs – use cable seals

• Seal all gaps (near walls and CRAC units)


The target is to create an overpressure under the raised floor, so that air is supplied evenly to all areas of the data centre.

This overpressure can only be achieved by keeping unwanted airflow leakage as low as possible.

A maximum loss of 2–6% of the total airflow of the data centre is acceptable.
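As a quick illustration of this rule of thumb (a minimal sketch; the function name is illustrative, and the 50,000 m³/h figure is reused from the tile example below):

```python
def max_leakage_m3h(total_airflow_m3h: float, loss_fraction: float) -> float:
    """Upper bound on acceptable raised-floor leakage (rule of thumb: 2-6%)."""
    return total_airflow_m3h * loss_fraction

# For a 50,000 m3/h design airflow:
print(max_leakage_m3h(50_000, 0.02))  # 1000.0 m3/h - tight target
print(max_leakage_m3h(50_000, 0.06))  # 3000.0 m3/h - acceptable upper limit
```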


3. Perforated Tiles: Number and Opening Factor

[Chart: airflow per perforated tile, tiles 1–15]

Example (data centre):

Design airflow: 50,000 m³/h at 20 Pa ESP

Chosen tile: 500 m³/h at 20 Pa

⇒ 100 tiles are required

Actual airflow: 30,000 m³/h

⇒ Tile number to be reduced to 60

Only a raised floor with sufficient height and perforated tiles with limited openings allow an even air distribution!

The number of perforated tiles must be in line with:

a) the design airflow

b) the actual / real airflow
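The arithmetic behind the example is simply the airflow divided by the rated airflow per tile at the design plenum pressure; a minimal sketch (function name is illustrative):

```python
import math

def required_tiles(airflow_m3h: float, tile_airflow_m3h: float) -> int:
    """Perforated tiles needed to pass the airflow at the design plenum pressure."""
    return math.ceil(airflow_m3h / tile_airflow_m3h)

# Tile rated at 500 m3/h @ 20 Pa, as in the example above:
print(required_tiles(50_000, 500))  # 100 tiles at the design airflow
print(required_tiles(30_000, 500))  # 60 tiles at the actual airflow
```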


4. Perforated Tiles: With Adjustable Damper

Perforated tiles with integral adjustable dampers can be used to avoid having to replace perforated tiles with solid tiles.

In this case the number of perforated tiles can remain unchanged, but all tiles need to be adjusted according to the actual requirements in the room.

It is also possible to work with different damper settings to vary the amount of air in different areas of the data centre.

In any case, the static pressure under the raised floor has to be kept at the design level.


5. Perforated Tiles: Placement

Perforated tiles should only be placed at positions where cold air is actually required to cool equipment.

Do not place perforated tiles directly next to the CRAC unit; keep a minimum distance of 2 m. Perforated tiles near CRAC units can induce warm room air into the raised floor (negative airflow).


6. Close Unused Units in the Racks with Blanking Panels

Recirculation of hot exhaust air inside the rack leads to overheating of servers.

Blanking panels installed in unused slots of a rack eliminate this possible internal recirculation.


7. Airflow Philosophies

A) The "old/traditional" way:

• Uncontrolled placement of perforated supply air tiles anywhere in the room

• Cold air is supplied in an uncontrolled way to the low-density equipment

• Uncontrolled recirculation

• Supply and return air mixing takes place

• Very inefficient – not recommended any more


B) Hot Aisle, Cold Aisle Concept:

[Diagram: hot aisle / cold aisle concept]

• Cold supply air distribution only in the cold aisles

• Warm return air only in the hot aisles

• Open tiles only in the cold aisle

• Risk of mixing cold and hot air is high

• This risk can be reduced by using suspended ceilings


C) Hot Aisle Containment:

[Diagram: hot aisle containment]

• The hot aisle between the racks is covered at the top and at the ends of the rack rows

• Cold supply air is delivered into the room

• Open tiles anywhere in the room

• Full separation between supply and return air

• The room itself stays at a low temperature level


D) Cold Aisle Containment:

[Diagram: cold aisle containment]

• Cold supply air distribution only in the cold aisles

• The cold aisle is separated from the hot aisle

• Warm return air only in the hot aisles

• Open tiles only in the cold aisle

• No risk of mixing cold and hot air

• The room itself will be at high temperatures


Debate: hot aisle or cold aisle containment?

Hot aisle containment:

• Keeps the hot air in a confined area; the room outside is cold.

• Servers are stable and can be operated for an extended period in case of loss of airflow.

• Hot spots in the cold aisle disappear.

• Overall stability can be increased.

Disadvantage:

• CRACs have to be sized carefully to deal with the high return air temperatures.

Cold aisle containment:

• Keeps the cold air in a confined area; no need to cool the entire space.

• Floor space around the cold aisle is warm.

Disadvantages:

• Air balancing can be difficult if CRACs are sequencing and the raised floor pressure is changing.

• In case of a power outage the cold aisle is exposed to a high risk due to loss of airflow.

• Control of the supply air temperature with DX units is only possible to a certain extent.


E) Room Supply, Direct Rack-Out Return Concept:

[Diagram: room supply, ducted return air]

• Cold supply air enters the rack through the room

• Open tiles only in the cold aisle

• Warm return air leaves the rack through a duct and the suspended ceiling

• Full separation of hot and cold air

• The room itself will be at a low temperature level


F) Close Coupling of CRAC units and Racks on Supply and Return Side:

[Diagram: ducted supply and return air concept]

• Cold supply air into ducts fitted to the supply air side of the racks

• Warm return air through ducts fitted to the exhaust side of the racks

• Open tiles only in the area of the supply air ducts

• No risk of mixing cold and hot air


Summary:

• The target of all these variants is the complete separation of warm return air and cold supply air

• This separation increases the temperature difference between the return and supply air of the CRAC unit

• A high temperature difference increases the efficiency of the CRAC unit and therefore the efficiency of the data centre in general

• Which kind of separation can be used in a data centre depends on the situation and the equipment in use

• The highest level of efficiency is reached if each m³ of circulating air takes the design amount of heat from the IT equipment (see the sketch below)
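The last bullet is just the sensible-heat balance Q = ρ · V · cp · ΔT. A minimal sketch (standard air properties, not figures from the slides):

```python
AIR_DENSITY_KG_M3 = 1.2   # kg/m3 at roughly room temperature
AIR_CP_KJ_KG_K = 1.005    # kJ/(kg*K)

def sensible_cooling_kw(airflow_m3h: float, delta_t_k: float) -> float:
    """Sensible heat carried by an airflow for a given supply/return temperature split."""
    return AIR_DENSITY_KG_M3 * (airflow_m3h / 3600.0) * AIR_CP_KJ_KG_K * delta_t_k

# Doubling the return/supply temperature difference doubles the heat
# each m3/h of circulating air removes:
print(sensible_cooling_kw(39_000, 8))   # ~104.5 kW
print(sensible_cooling_kw(39_000, 16))  # ~209.0 kW
```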


8. Raised Floor Height

[Chart: airflow per perforated tile, tiles 1–15]

Rule of thumb: the higher, the better!

The raised floor height has a major influence on the efficiency of the whole data centre.

The required free height depends on the room size, the heat density, and the number and position of the installed CRAC units.

A certain obstacle-free area is required for a proper supply of cold air to every area of the room.


9. Return Air Conditions

The leading and most important temperature is that of the air entering the IT equipment (the servers).

According to the new ASHRAE environmental envelope, supply air temperatures up to 27°C are possible. That would lead to return air temperatures of approx. 37°C.

Please note: a low return air set-point does not "cure" problems with heat load. It is the other way around: the lower the return air set-point, the lower the usable cooling capacity of the unit. Furthermore, the operating costs increase.

High return air temperatures and a lower return air humidity do not change the water content of the air in the room; the risk of ESD remains just as low.

[Diagram: 27°C supply air, 8 K temperature rise, 35°C return air]
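The envelope referred to is, to our reading, the ASHRAE (2008) recommended range of roughly 18–27°C server inlet temperature; a small illustrative check (the bounds are our assumption, not from the slides):

```python
# Assumed ASHRAE (2008) recommended inlet envelope, dry bulb:
RECOMMENDED_INLET_C = (18.0, 27.0)

def inlet_in_envelope(inlet_c: float) -> bool:
    """True if a server inlet temperature sits inside the recommended range."""
    low, high = RECOMMENDED_INLET_C
    return low <= inlet_c <= high

print(inlet_in_envelope(27.0))  # True: upper edge of the recommended range
print(inlet_in_envelope(37.0))  # False: that is return air, not inlet air
```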


Advantages of high return air temperatures:

(1) CW units: high return air temperatures allow higher water temperatures (at the same sensible cooling capacity). Free cooling then starts at a much higher ambient temperature.

(2) CW units: to get the same cooling capacity with high return air temperatures and high water temperatures, the airflow of the unit can be reduced. Lower airflow leads to lower fan power consumption and a lower noise level.

(3) DX units: higher return air temperatures lead to a higher evaporating pressure. With a constant condensing pressure, the delta between evaporating and condensing pressure is reduced, so the compressor power consumption decreases. Furthermore, the fan speed can also be reduced. But be aware that too high an evaporating pressure can damage the compressor.


10. Chilled Water System Water Conditions

Heat loads in data centres are nearly all sensible cooling load, with only a small latent cooling load due to fresh air ventilation.

In line with the increased return air temperatures, the chilled water temperatures can be increased as well.

• Chiller efficiency increases due to the increased evaporating temperature.

• The higher the chilled water temperature, the higher the ambient temperature at which chiller free cooling starts (i.e. more free-cooling hours).

• The balance between the cooling capacity of the CRAC unit and the chiller efficiency has to be evaluated.


11. Standby Unit Operation

Fan laws dictate that air volume is directly proportional to fan speed, and that fan power is proportional to the cube of the fan speed.

[Chart: airflow vs. absorbed fan power – at 1/2 airflow, only 1/8 of the power is absorbed]
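A minimal sketch of the affinity laws just quoted (pure arithmetic; the 39,000 m³/h / 12.6 kW rating is taken from the example on the next slide):

```python
def fan_at_speed_ratio(speed_ratio: float, rated_airflow_m3h: float,
                       rated_power_kw: float) -> tuple[float, float]:
    """Affinity laws: airflow scales with speed, absorbed power with speed cubed."""
    return rated_airflow_m3h * speed_ratio, rated_power_kw * speed_ratio ** 3

airflow, power = fan_at_speed_ratio(0.5, 39_000, 12.6)
print(airflow, power)  # 19500.0 m3/h at 1.575 kW, i.e. 1/8 of the rated 12.6 kW
```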


Therefore, by running the standby CRAC unit at reduced air volume, the overall fan power is greatly reduced.

Example: 4x ASD1900CW at 26°C / 40% return air, water 10/15°C

Without standby management (3 duty units at full speed, 1 standby unit off):

• Cooling capacity: 3x 146.8 kW = 440.4 kW net sensible

• Airflow: 3x 39,000 m³/h

• Fan power: 3x 12.6 kW = 37.8 kW

• Lpa,2m: 3x 71.6 dB(A) = 76.4 dB(A)

With standby management (all 4 units running at reduced speed):

• Cooling capacity: 4x 114.6 kW = 458.6 kW net sensible

• Airflow: 4x 29,000 m³/h

• Fan power: 4x 5.3 kW = 21.2 kW

• Lpa,2m: 4x 62.7 dB(A) = 68.7 dB(A)

⇒ Energy savings: 16.6 kW x 8,760 h x 0.13 €/kWh = 18,904 €/year
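The savings figure is just the fan-power difference priced over a year; a small sketch reproducing it (tariff of 0.13 €/kWh from the slide):

```python
HOURS_PER_YEAR = 8_760
TARIFF_EUR_PER_KWH = 0.13

def annual_savings_eur(fan_power_before_kw: float, fan_power_after_kw: float) -> float:
    """Annual cost saving from a reduction in continuously running fan power."""
    return (fan_power_before_kw - fan_power_after_kw) * HOURS_PER_YEAR * TARIFF_EUR_PER_KWH

print(round(annual_savings_eur(37.8, 21.2)))  # ~18904 EUR/year (this example)
print(round(annual_savings_eur(37.8, 18.8)))  # ~21637 EUR/year (constant-capacity variant below)
```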


Advantages of standby management:

• Reduction of fan power consumption

• Reduction of noise level

• Increase of net sensible cooling capacity

• And even more energy can be saved if the net sensible cooling capacity is instead kept constant:

All 4 units running, matched to the original 440.4 kW duty:

• Cooling capacity: 4x 110.3 kW = 441.2 kW net sensible

• Airflow: 4x 27,800 m³/h

• Fan power: 4x 4.7 kW = 18.8 kW

• Lpa,2m: 4x 61.5 dB(A) = 67.5 dB(A)

⇒ Energy savings: 19.0 kW x 8,760 h x 0.13 €/kWh = 21,637 €/year


12. Use CRAC Units with State-of-the-Art Technology

EC fan technology – advantages:

1. Lower power consumption due to a better overall degree of efficiency

2. No inrush current: the speed is ramped up, so the fan never draws more than full-load amps

3. Low vibration – reduced sound level

4. Long maintenance-free operation due to direct-drive technology

5. Scalable airflow – easy adaptation to the requirements on site


Example: Munich airport data centre

• 600 m³ data centre, 75 server cabinets arranged in 3 rows

• 5 CRAC units (4+1), each with 32 kW cooling capacity

Changes ("best practice" measures only):

• Installation of hot aisle / cold aisle containment

• Re-routing of cables inside the racks

• Reconfiguration of the floor tile layout, sealing of the raised floor

• Installation of 1U blanking panels at the front of the racks

• Installation of ventilated front doors

• Increase of the room temperature to 28°C (supply air = 22°C)

⇒ Due to these changes, the power consumption of the CRAC units was reduced by 35%.

⇒ Cabinet temperatures were reduced by 4°C – extended server lifetime and higher reliability.