
Page 1

University of Florida Data Center at Eastside Campus: Key Facts

• Design: KlingStubbins Architectural Engineering, Philadelphia, PA

• Construction: Whiting-Turner; Baltimore, MD

• Commissioning: Hanson Professional Services; Springfield, IL

• Budget: $15.7 million

• Final Construction Cost: approximately $14 million

• Gross Size: 25,390 square feet

• Data Hall 1: University Systems: 5,000 square feet, “Near Tier 3”; On-Line Courses (Sakai), Web-Servers, E-mail, PeopleSoft, Hyperion, & Other Administrative Systems, and Hosting Services

• Data Hall 2: High-Performance Research Computing: 5,000 square feet, Tier 1

• Initial Power Capacity: 675kW (300kW “Near Tier 3,” 375kW Tier 1)

• Ultimate Power Capacity: 2,250kW (750kW “Near Tier 3,” 1,500kW Tier 1)

• Storm-Rated for Hurricane Category 3 (129 mph)

• 2.5 Megawatt Diesel Backup Generator with 72-hour Fuel Tank (at full load)

Page 2

University of Florida Data Center at Eastside Campus: Purposes

• Secondary site for University Systems, providing redundancy, continuity of operations & disaster recovery

• Provide significantly expanded capacity for Research Computing/HPC Center

• Facility to accommodate consolidation of college & department servers from various campus buildings into a central site, allowing better power-management of campus buildings, reduction in overall power usage, and consequent university budget savings

• Growth/expansion of existing services (University Systems)

• Long-range provision for growth/expansion of all services via both additional floor-space and increased usage (power) densities

Page 3

Estimated Annual Operating Cost

As facility usage increases, so does Operations & Maintenance cost, the largest component of which will be electric power.

[Chart: Total Annual Operating Cost, Annual Power Cost, and Annual Operating & Maintenance Cost ($0 to $6,000,000) versus Total Net IT kW (Tier 1 + Tier 3), at 625, 1000, 1500, 1750, 2000, and 2250kW, the Tier 3 maximum build-out.]

Excludes salaries. Includes depreciation/lifecycle replacement for infrastructure equipment.

Page 4

Growth Projections: High-Performance Research Computing Data Hall

[Chart: Kilowatts (0 to 1,800) versus Years (1 to 11), plotting Capacity (1,500kW max) against HPC Need/Demand (updated Feb 2013).]

Page 5

Growth Projections: University Administrative Systems & Hosting Data Hall

[Chart: Kilowatts (0 to 700) versus Years (1 to 11), plotting Capacity (750kW max) against Total Enterprise Need/Demand, broken out as OSG & Virtual Hosting; ERP (PeopleSoft, Cognos, Hyperion, etc.); Networking; Mainframe; UFAD/Exchange; and Co-Lo Hosting, with the Historical Trend (8%/year).]

Page 6

Connectivity

2 Separate, Independent, Fully-Redundant High-Speed Fiber-Optic Pathways Back to Main UF Campus

[Diagram: the two fiber paths between UFDC and the UF Core; paths shown are approximate.]

The ‘red’ path is 96 strands (48 pairs) of fiber-optic cable, and was built by UF specifically for this data center.

The ‘green’ path is 48 strands (24 pairs) of ‘dark fiber,’ on a 25-year lease from GRUCom.

The redundant connection is essential to ensuring that no “backhoe accident” or other mishap can disrupt communications and, consequently, the facility’s ability to provide mission-critical services.

Page 7

Data Center Capacity: How Big Is It?

The UFDC has two ‘Data Halls’ (Server Rooms) of 5,000 square feet each, for a total of 10,000 square feet. Each room can hold about 100 racks of servers. An additional 15,390 square feet is needed for the support spaces - the Electrical Rooms, Mechanical Room, and Equipment Yard, in addition to the relatively small ‘human-spaces.’

However, floor-space is only one of the 3 dimensions which describe a data center’s capacity: Floor-Space, Power, and Cooling.

UFDC Power & Cooling Capacity:

           HPC/Research   University Systems   Total
Day 1      375kW          300kW                675kW
Ultimate   1,500kW        750kW                2,250kW

Power and Cooling are directly related: all the power that goes into the room ends up as HEAT, which must be removed by the cooling systems.

Page 8

Data Center Reliability: “N+1” and “Concurrently Maintainable”

One of the most important characteristics of a data center is Reliability: its ability to keep its “payload” equipment running, supplied with adequate power and cooling.

N+1

The primary infrastructure systems at UFDC are configured with “N+1 Redundancy.” This means that, if it takes “N” components to fully meet the needs, then N+1 units (one extra) are provided. This allows for continued operation if:

• One unit fails, or
• One unit needs to be taken off-line for maintenance

For the University Systems Data Hall (Data Hall 1), the electrical, cooling, and networking systems are all “N+1 Redundant.” UFDC Data Hall 2, used by the UFIT High-Performance Research Computing group, was not designed to have N+1 Redundancy, because their research work does not have as stringent ‘up-time’ requirements as the Student and Administrative Systems. Instead, the emphasis in this room is on greater expandability of CAPACITY (kilowatts of computing payload).

Concurrently Maintainable

Not all UFDC infrastructure is suitable for “N+1 Redundancy.” In some cases it would be prohibitively expensive (for example, the backup power generator, or the chilled-water piping). In those cases, we have designed the systems to be Concurrently Maintainable: it is possible to work on the system, or part of the system, without taking that (entire) system off-line.
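As a rough illustration of the “N+1” rule described above, here is a minimal Python sketch; the function name is hypothetical, and the 945kW-per-chiller figure is borrowed from the cooling discussion later in this deck (Page 10), purely for illustration:

```python
import math

def units_required(load_kw: float, unit_capacity_kw: float) -> int:
    """N+1 sizing: enough units to carry the load, plus one spare."""
    n = math.ceil(load_kw / unit_capacity_kw)  # "N" units to meet the need
    return n + 1                               # the "+1" redundant unit

# Day-1 chilled-water example: one 945kW (270-Ton) chiller covers the
# 675kW initial IT load, so N = 1 and N+1 = 2 chillers are installed.
print(units_required(675, 945))    # -> 2
print(units_required(2250, 945))   # -> 4 at the ultimate 2,250kW build-out
```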

Page 9

Energy-Efficiency

Consideration has been given at every step in the design and construction process to making UFDC, as much as practical, a “Good Energy Citizen.”

• Chillers set to run 10°F warmer than typical; preliminary estimates suggest this may yield up to 2X normal efficiency

• Overhead power & cabling eliminates underfloor airflow congestion, reducing fan-power requirement

• High-efficiency, variable speed fans in CRAHs

• High-Efficiency Trane chillers – 32% more efficient than US Govt. Recommended Standard

• Hot-aisle/Cold-aisle layout using under-floor and over-ceiling plenums to maximize hot/cold-air separation; minimizing mixing and maximizing return-air temperature, for better cooling efficiency

• Room design allows for “chimney racks” ducted directly into ceiling plenum for high power-density applications

• Taps into the chilled water distribution system allow for future use of closely-coupled cooling systems for efficiently handling high power-density applications

• Incremental expansion is more efficient, due to operational characteristics of the devices; they are more efficient at near-full rated capacity

• PUE Projections: Data center efficiency is frequently described in terms of Power Usage Effectiveness (PUE):

PUE = Total Power Consumption / IT Power Consumption

A PUE of 1.0 would be “perfect,” and is impossible: some power is required for cooling, lighting, etc., and some is lost to overall inefficiencies of the equipment.

UFDC is projected to have an annual PUE of 1.39, which is 12.3% better than the ASHRAE Baseline model.
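To make the ratio concrete, here is a minimal sketch; pairing the day-1 675kW IT capacity with the projected PUE is our illustration, not a figure from the deck:

```python
def pue(total_facility_kw: float, it_payload_kw: float) -> float:
    """PUE = Total facility power / IT payload power (1.0 is the ideal)."""
    return total_facility_kw / it_payload_kw

# At the projected annual PUE of 1.39, every 1kW of IT load implies
# roughly 0.39kW of cooling, lighting, and conversion-loss overhead.
it_kw = 675                  # day-1 IT capacity, used here for illustration
total_kw = it_kw * 1.39      # ≈ 938kW total facility draw
print(round(total_kw), round(pue(total_kw, it_kw), 2))   # -> 938 1.39
```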

Page 10

How Much Heat? How Much Cooling?

The fundamental law of Data Centers is: “All the power you put into the room is converted to heat, which must be removed.” A single rack of servers or storage can require anywhere from 3.5kW (3,500 Watts) to as much as 28kW. 1 Ton of cooling = approximately 3.5kW, so a rack of servers or storage requires from 1 to 8 (or more) Tons of A/C. With floor-space for 100+ racks per room, this adds up to hundreds of Tons.

UFDC has 2 air-cooled Trane water chillers, each of which can provide up to 270* Tons (945kW) of cooling. Because this is an “N+1” design, only one of the chillers is needed (initially) for cooling; the other is on stand-by/back-up.

Future Growth & Expansion

When the needs of the facility exceed 270 Tons, a third chiller will be added, increasing the capacity to 270 Tons + 270 Tons = 540 Tons (1,890 kW), with the third chiller being the “+1” back-up unit. Finally, when demand exceeds 540 Tons, the 4th chiller will be added to raise the “N+1” capacity to 810 Tons (2,835kW).

*The chillers are rated for 300 Tons, but we will only run them at a maximum of 90% rated load, to improve reliability.
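The arithmetic on this page follows directly from the 1 Ton ≈ 3.5kW rule of thumb; a minimal sketch using only the figures quoted above:

```python
KW_PER_TON = 3.5   # this page's rule of thumb: 1 Ton of cooling ≈ 3.5kW

def tons_for(it_load_kw: float) -> float:
    """All power into the room becomes heat the cooling must remove."""
    return it_load_kw / KW_PER_TON

print(tons_for(3.5))    # a light 3.5kW rack  -> 1.0 Ton
print(tons_for(28))     # a dense 28kW rack   -> 8.0 Tons
print(tons_for(945))    # one chiller's share -> 270.0 Tons
print(2 * 270 * KW_PER_TON)   # two chillers: 540 Tons -> 1890.0 kW
print(3 * 270 * KW_PER_TON)   # three chillers: 810 Tons -> 2835.0 kW
```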

Page 11

Future Growth & Expansion

Throughout the data center, you will see signs like this one, pointing out design/engineering decisions which provide a path for future growth.

The UF Data Center is designed and engineered to meet today’s needs today, but to be expandable/upgradeable to over 3X its original (current) capacity.

We chose to signal these features in green, not only to represent GROWTH, but because these design choices make UFDC more ENERGY-EFFICIENT.

The large infrastructure components, such as Uninterruptible Power Supplies, Chillers, Pumps, and Computer Room Air Handlers (CRAHs), perform at higher efficiency when run near their maximum rated loads.

By only having enough of these to handle the current load, and adding them as required for growth, the University not only saves initial construction costs, but also has a more efficient data center throughout its lifetime.

Page 12

How It Works: Heat Removal

[Diagram: cross-section of a Data Hall.]

• Cold air is supplied under the raised floor and comes up through vented tiles in the Cold Aisle floor; the floor in the Hot Aisles is solid.

• IT equipment is racked “face to face” across the Cold Aisles and “back to back” across the Hot Aisles; it draws in cold air at the front and blows out hot air to the rear.

• Hot air blown out from the rear of the IT equipment racks rises to vents in the ceiling and returns above the ceiling to the “CRAHs” (Computer Room Air Handlers), which draw it in via ducts.

Maximizing the separation between cold air supply and hot air exhaust improves efficiency by reducing the amount of power needed to cool the equipment.

Page 13

About Data Center “Tiers”

The data center industry generally recognizes four classes of facility, based on their level of resistance to outages. UFDC is a ‘Hybrid’ Tier 3/1 Design.

Tier 4
• 2 independent utility paths
• 2N power and cooling systems
• Able to sustain 96 hour power outage
• Stringent site selection
• 24/7 onsite maintenance staff

Tier 3
• 2 utility paths
• N+1 power and cooling systems
• Able to sustain 72 hour power outage
• Redundant service providers
• Careful site selection planning
• Allows for concurrent maintenance

Tier 2
• Some redundancy in power and cooling systems
• Generator backup
• Able to sustain 24 hour power outage
• Minimal thought to site selection

Tier 1
• Numerous single points of failure
• No generator
• Unable to sustain more than 10 minute power outage

Tier 4
• VERY EXPENSIVE TO IMPLEMENT

Tier 3
• The UFDC Data Hall 1, used by UFIT-CNS to host UF administrative systems (including Student Systems, Sakai, PeopleSoft, Email, Web-servers, etc.) is considered a “NEAR-Tier 3” facility.
• There are not “2 utility paths” available at this site, and site-selection was pre-determined, based on the fact that UF already owned the land.

Tier 2
• The Bryant Hall/SSRB Data Center and CSE Data Center on UF Main Campus are generally considered to be “Tier 2+.”

Tier 1
• UFDC Data Hall 2, used by UFIT High-Performance Research Computing, was designed to be a “Tier 1” facility, because their research work does not have as stringent ‘up-time’ requirements as the Student and Administrative Systems. Instead, the emphasis in this room is on greater expandability of CAPACITY (kilowatts of computing payload).

Page 14

Network Entrance Room 2: The ‘red’ route

• 96-strand Optical Fiber to UF Main Campus

• Wholly-owned by UF

• Installed by USI, under direction of UFIT/CNS

• On Main Campus, this terminates in the SSRB data center

[Diagram: the ‘red’ route from UFDC to the UF Core.]

Page 15

Network Entrance Room 1: The ‘green’ route

• 48-strand Optical Fiber to UF Main Campus

• 25-year lease from GRUCom

• “Dark Fiber” allowing UF to use any technologies needed

• On Main Campus, this splits into two separate 24-strand runs:
• One to SSRB
• One to Centrex

...making it dually-connected to the UF Core, to provide additional redundancy

[Diagram: the ‘green’ route from UFDC to the UF Core.]

Page 16

Data Hall 1: University Administrative Systems and Hosting

• 5,000 square feet

• Space for approximately 100 standard “Racks” of computing equipment, in addition to the networking racks
• Plus an additional row reserved for non-standard size/shape equipment (at the rear of the room)

• Initial power capacity = 300kW

• Ultimate power capacity = 750kW

• Ultimate power density = 150 Watts/square-foot (~7.5kW/Rack; see the sketch after this list)

• Dual-feed, fully-redundant power to all points, coming from 2 separate UPS systems

• Power from GRU, backed by 2,500kW diesel generator with minimum 72 hours of fuel

• Dual-feed, fully-redundant networking to all points, connected back to the main UF Campus via 2 physically diverse fiber paths
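The density figures above are easy to verify from the page’s own numbers; a minimal, purely illustrative sketch:

```python
# Data Hall 1 figures quoted on this page (illustrative check only).
AREA_SQFT = 5_000      # floor area of the hall
RACKS = 100            # approximate standard-rack count
ULTIMATE_KW = 750      # ultimate power capacity

watts_per_sqft = ULTIMATE_KW * 1_000 / AREA_SQFT   # -> 150.0 W/square-foot
kw_per_rack = ULTIMATE_KW / RACKS                  # -> 7.5 kW/rack
print(watts_per_sqft, kw_per_rack)
```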

Page 17

Data Hall 1 Networking

[Diagram: the redundant cross-town fiber routes enter UFDC at a Main Distribution Area, which feeds a Horizontal (Row) Distribution Area in each row; each row distribution area in turn serves its row of racks, and so on for additional rows and racks.]

Page 18

Hosting Services

Virtual Machine Hosting with VMware:

• Virtual Machine (VM) hosting provides “bare-metal” VMs so that your organization’s IT staff can run dedicated and highly customized Linux or Windows systems. This allows your IT staff to focus on your computing needs without the worry of purchasing and maintaining hardware resources.

• Your VM will live in UFIT’s secure private cloud, which leverages:

• Multiple enterprise-class datacenters

• Secure enterprise-class network – public or UF private IP space available

• SAN/NAS-backed failover to prevent downtime due to hardware failure or maintenance

• VMware vSphere 5 environment

CNS provides:

• Everything up to the hypervisor (virtualization layer). This includes all physical resources such as computing hardware, networking, datacenter resources, and VMware software. CNS also provides VPN access to allow you to connect to your VM for management purposes (console access, power-on/off, etc.).

Customer provides:

• You provide IT staff to install, configure, and maintain all software on your VM – OS and application software. This includes maintenance of proper licensing for any software installed on your VM. Your staff will be responsible for working with CNS Network Services to maintain network ACLs pertaining to your VM’s IP address(es). Your staff will also be responsible for responding to and working with UFIT’s Office of Information Security and Compliance.

Page 19

Data Hall 2: High-Performance Research Computing

• 5,000 square feet

• Space for approximately 154 standard “Racks” of computing, storage, and networking equipment

• Initial power capacity = 375kW

• Ultimate power capacity = 1,500kW

• Overall average ultimate power density = 9.7kW/Rack or 300 Watts/square-foot

• Connected to Main UF Campus Research Network via 200* gigabit/second fiber-optic link, and from there via Florida LambdaRail to the Internet2 Innovation Platform

*Initially 100Gbps, increasing to 200Gbps by mid-2013

Page 20

PDU – Power Distribution Unit

Feeds power to the computing and networking equipment, via the overhead power BUSWAY system. The units in this room range in size/capacity from 150kVA (roughly 142,000 Watts) to 300kVA (~284,000 Watts) each.

As part of the Tier 3 reliability design, each equipment row is fed by 2 power Busways, each connected to separate Power Distribution Units (PDUs), helping to ensure that no row is dependent on a single Busway, or a single PDU.

Page 21

Busway Power Distribution Systems(large aluminum rectangular bars running overhead)

• “Track-Lighting on Steroids”

• Allows for quick connection of new equipment (using specialized connectors)

• No Electrician needed

Each equipment rack contains an internal power-distribution system for the equipment in that rack, so racks are connected to the Busways, eliminating the need for many long power cables.

As part of the Tier 3 reliability design, each equipment row is fed by 2 power Busways, each connected to separate Power Distribution Units (PDUs), helping to ensure that no row is dependent on a single Busway, or a single PDU.

To further capitalize on this redundancy principle, most individual servers and network devices have 2 separate power supplies, so that each device can be connected to BOTH Busways feeding its row.

Page 22

Mechanical Room

This room contains:

• Pumps which move the chilled water from the chillers (in the equipment yard) to the CRAHs

• Substantial piping and control valves for the chilled water system

• Air handler for the ‘human-spaces’ of the data center

Page 23

Pumps 1 & 2

These pumps direct chilled water from the 2 chillers (in the equipment yard) to the Computer Room Air Handlers (CRAHs) which cool the data halls.

The pumps are sized to match the capacity of the chillers; each chiller requires one pump to handle the chilled water it produces.

The piping system is designed so that the pumps are not specifically coupled to any chiller; any combination of the pumps can work in conjunction with any combination of chillers to supply the needs of the data halls.

Each pump can handle the entire ‘phase 1’ requirements of the Data Center; like the chillers, we have “N+1” redundant capacity in pumps.

During normal operation, the pumps are rotated, to achieve even wear, provide routine exercise of each unit, and allow each unit to undergo scheduled maintenance.

For improved reliability, they will be operated at a maximum of 90% of rated capacity.

Page 24

Future Growth & Expansion: Pads & Connections for Future Pumps 3 & 4

When the Data Center load increases to where a single chiller can no longer meet the demand for chilled water, a third (and, eventually, a fourth) chiller and pump will be added, to maintain “N+1” redundancy.

Page 25

Concurrently Maintainable Chilled Water Loop: How It Works

We need to be able to isolate any component, including any stretch of pipe, for maintenance, without shutting down the chilled water feed (or return)!

The solution is that both the Feed and Return lines are LOOPS; water travels in both directions around each loop, crossing from the Feed loop to the Return loop through the CRAH units.

And Valves – A LOT of VALVES!

[Diagram: two chillers and two pumps supply the Chilled H2O Feed loop; Feed and Return loops ring the data hall, with the CRAH units bridging from Feed to Return.]

Page 26

Electrical Room 1

This room contains:

• Main Power Switchboards

• Uninterruptible Power Supply systems for the Data Halls

• 2 CRAH units to handle the heat produced by the electrical systems

Page 27

Main Electrical Switchboard 1

Size/Capacity: 480 Volt; 2,500 Amp

Function: Divides the main utility (GRU) electric power current into smaller currents for further distribution, and provides switching, current protection and metering for these various currents. Switchboards distribute power to transformers, panelboards, control equipment, and ultimately to system loads.

Redundancy: The University Systems Data Hall (Data Hall 1) is fed by two separate Main Electrical Switchboards, MSB1 and MSB3. Those connect to alternate PDUs (Power Distribution Units) within the Data Hall, feeding alternate Busways, such that each row of equipment is fed by two separate Busways, fed by two separate PDUs, fed by two separate UPSs, fed by two separate MSBs.

This configuration provides the maximum possible protection against failure of the electrical distribution system, since each IT equipment rack is fed by two independent paths.

Page 28

Main Electrical Switchboard 2

Size/Capacity: 480 Volt; 2,500 Amp

Function: Divides the main utility (GRU) electric power current into smaller currents for further distribution, and provides switching, current protection and metering for these various currents. Switchboards distribute power to transformers, panelboards, control equipment, and ultimately to Data Hall Power Distribution Units (PDUs) and system loads.

This MSB feeds:

• The High-Performance Research Computing Data Hall (Data Hall 2)

• Much of the “people space”

Page 29

Main Electrical Switchboard 3

Size/Capacity: 480 Volt; 2,500 Amp

Function: Divides the main utility (GRU) electric power current into smaller currents for further distribution, and provides switching, current protection and metering for these various currents. Switchboards distribute power to transformers, panelboards, control equipment, and ultimately to system loads.

Redundancy: The University Systems Data Hall (Data Hall 1) is fed by two separate Main Electrical Switchboards, MSB1 and MSB3. Those connect to alternate PDUs (Power Distribution Units) within the Data Hall, feeding alternate Busways, such that each row of equipment is fed by two separate Busways, fed by two separate PDUs, fed by two separate UPSs, fed by two separate MSBs.

This configuration provides the maximum possible protection against failure of the electrical distribution system, since each IT equipment rack is fed by two independent paths.

Page 30

UPS 1

Size/Capacity: 500kVA (450 kilowatts); 7 minutes @ 400kW (see the sketch below)

Function:

• Filters Utility (GRU) power, to provide smooth, constant power to IT equipment in Data Hall 1 (University Administrative Systems)

• Supplies momentary backup power in the event of a brief power loss (‘flicker’)

• Supplies backup power until generator starts up (takes about 15 seconds) in the event of a longer power outage

• Connected to backup generator
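The nameplate figures above imply a few useful relationships; a minimal sketch, noting that the 0.9 power factor is inferred from the 500kVA/450kW pairing rather than stated in the deck:

```python
KVA_RATING = 500
POWER_FACTOR = 0.9        # inferred: 500kVA x 0.9 = 450kW
RIDE_THROUGH_MIN = 7      # battery ride-through at a 400kW load
GENERATOR_START_S = 15    # approximate generator start time (per this deck)

kw_rating = KVA_RATING * POWER_FACTOR                # -> 450.0 kW
energy_kwh = 400 * RIDE_THROUGH_MIN / 60             # ≈ 46.7 kWh delivered
margin = RIDE_THROUGH_MIN * 60 / GENERATOR_START_S   # -> 28x the start time
print(kw_rating, round(energy_kwh, 1), round(margin))
```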

Page 31

UPS 2

Size/Capacity: 625kVA (562 kilowatts); 5 minutes @ 500kW

Function:

• Filters Utility (GRU) power, to provide smooth, constant power to IT equipment in Data Hall 2 (High-Performance Research Computing)

• Supplies momentary backup power in the event of a brief power loss (‘flicker’)

• Not connected to backup generator

Page 32

UPS 3

Size/Capacity: 500kVA (450 kilowatts); 7 minutes @ 400kW

Function:

• Filters Utility (GRU) power, to provide smooth, constant power to IT equipment in Data Hall 1 (University Administrative Systems)

• Supplies momentary backup power in the event of a brief power loss (‘flicker’)

• Supplies backup power until generator starts up (takes about 15 seconds) in the event of a longer power outage

• Connected to backup generator

Page 33

Future Growth & Expansion: Electrical Room 2

Provides expansion space with pre-built connections for addition of 2nd electrical feed from GRU, via:

• 3 additional pad-mounted transformers - see empty pads in Equipment Yard

• 3 additional Main Switch Boards - In this room

• 3 additional UPS systems - In this room

...essentially duplicating the equipment and capacity currently in Electrical Room 1

Page 34

Connection Point for Temporary External Generator

In the event of a failure of the permanent diesel backup power generator (in the equipment yard), a portable, trailer-mounted generator can be parked outside and connected to the data center electrical system at this point, to take over until the main generator is back in service.

Page 35

Back-Up Electric Power Generator

Engine Configuration: “V-20” (20 cylinders)

Engine Size (Displacement): 5,822 cubic inches

Engine Power: 3,673 brake horsepower

Generator Output: 2,500,000 Watts (2.5 Megawatts)

Fuel/Run-Time: 12,100 gallon Diesel fuel tank; 72 hours run-time at full load (168 gallons/hour; 2.8 gal/min)

“Algae-X” Fuel-Polishing System processes 600 gallons/hour, filtering the entire tank volume each 24 hours (checked in the sketch below).
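A minimal sketch verifying the run-time and fuel-polishing figures above (illustrative arithmetic only):

```python
TANK_GALLONS = 12_100
BURN_GAL_PER_HR = 168       # consumption at the full 2.5MW load
POLISH_GAL_PER_HR = 600     # Algae-X fuel-polishing throughput

run_time_hr = TANK_GALLONS / BURN_GAL_PER_HR    # ≈ 72 hours at full load
gal_per_min = BURN_GAL_PER_HR / 60              # = 2.8 gallons/minute
turnover_hr = TANK_GALLONS / POLISH_GAL_PER_HR  # ≈ 20 hours, within each day
print(round(run_time_hr), gal_per_min, round(turnover_hr))
```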

In the event of a power outage, the large Uninterruptible Power Supply (UPS) systems in the Electrical Room can provide power for at least 5 MINUTES (assuming the data center is fully-loaded; longer at partial load). The backup generator takes approximately 15 SECONDS to start and begin delivering power. The generator feeds power to the chiller(s), pumps, and the Data Hall 1 UPSs and CRAH (cooling) units.

Restrictions: Does not supply power to Data Hall 2 (High-Performance Research Computing). Instead, the emphasis in that room is on greater expandability of CAPACITY (kilowatts of computing payload).

Page 36

Load Bank

Often described as a “Giant Hair-Dryer,” the Load Bank is largely just heating coils and a large fan, serving as a ‘dummy load’ for testing the generator.

Capacity: 800 kW

Function: To allow regular, periodic testing of the diesel backup generator under load, without actually taking the data center off main power.

The 800kW capacity of the load bank is approximately 32% of the maximum rated output of the generator.

Page 37

Chillers 1 & 2

Capacity: 270 Tons of cooling (each) (964 kW)

Function: Provide chilled water to Computer Room Air Handlers (CRAHs)

For initial operations, the entire facility can be supported by a single chiller. Consequently, following the “N+1 Redundancy” principle, we have 2 chillers, so that if one fails or needs to be taken off-line for maintenance, the other can support the data center's needs.

UFDC’s Trane RTAC300 Chillers use Trane’s ‘high-efficiency’ configuration, providing a 32% energy-efficiency improvement over ASHRAE Standard 90.1, adopted by the US Government as its recommended efficiency goal for modern chillers.

During normal operation, the chillers are rotated, to achieve even wear, provide routine exercise of each unit, and allow each unit to undergo scheduled maintenance.

The chillers are nominally rated at 300 Tons, but for improved reliability, they will be operated at a maximum of 90% capacity.

Page 38

Future Growth & Expansion: Pads & Connections for Future Chillers 3 & 4

When the Data Center load increases to where a single chiller can no longer meet the demand for chilled water, a third (and, eventually, a fourth) chiller and pump will be added.

Each additional chiller provides 270 Tons of cooling capacity, sufficient for an additional 964 kilowatts of IT equipment. When fully expanded, the chiller bank will provide:

3 x 270 Tons = 810 Tons

or

3 x 964 kW = 2,892 kW

... of cooling, with the 4th chiller being the “+1” (of “N+1”) needed to meet the data center’s reliability requirements.

Page 39

Configuration/Setup Room

• Provides an environment as close as possible to that of the main Data Halls: temperature, humidity, etc.

• Overhead power Busway and data cabling mimic the main rooms

• Allows racks of equipment to be built, configured, and tested, prior to moving them into place

• Minimizes need for people to do work inside the Data Halls, reducing the chance of mistakes/accidents with production IT equipment

• Helps keep the Data Halls clean and dust-free, by keeping packing materials out of the production rooms