TRANSCRIPT
1
Electricity use and efficiency of servers and data centers: A review of
recent data and developments
Jonathan G. Koomey, Ph.D.
http://www.koomey.com
See podcast at http://www.smartenergyshow.com/
Lawrence Berkeley National Laboratory &
Stanford University
Presented at Google
December 4, 2007
2
What can we do about
climate?
3
Our options
• Adapt–modify human systems to make them more flexible and resilient
• Suffer–accept what comes (but what comes is likely to be costly in lives, ecosystem damage, and economic disruption)
• Mitigate–reduce emissions
4
Foster innovation in
• Individual behavior and attitudes
– Purchasing of goods and services
– Purchasing of energy-using equipment
– Operating energy-using equipment
• Technologies
• Institutions
• In each case, information technology and networks are our most potent allies
5
Innovation for Technologies
• Whole systems redesign
• Think about tasks
• Ignore illusory & historical constraints (but heed real ones)
• Apply existing organizational techniques and technologies in new ways
• Build things better all the way around so people want them for more than efficiency
6
Innovation for Institutions
• Whole systems redesign (e.g. IT + productivity)
• Redefine business processes (e.g. Six Sigma)
– Assess opportunities
– Create local business cases
– Assign responsibility
– Measure results
– Reward good results (use prizes)
• Rethink underlying assumptions (e.g. extended producer responsibility)
• Foster + reward innovation (e.g. Mutual Fun)
7
Harness the power of business
• Environmentalists and business must learn to work together
• Give consumers information about companies (e.g. scorecard.org and the Carbon Disclosure Project) and products (e.g. Energy Star)
• Create internal and external pressure for continuous improvement and reorganization
• Use the power of the supply chain
– Demand supplier responsibility
– Use purchasing power to move the market
8
Organic food is now mainstream
• Berkeley Bowl organic bananas (Dole): 79¢/lb
• Safeway (!) organic bananas (Bonita): 79¢/lb
• Trader Joe’s organic bananas (Dole): ~95¢/lb
9
Introduction to data centers
• Much confusion about data center electricity use (see Mitchell-Jackson et al., 2002 and 2003)
• Review recent data from
– Uptime facilities
– Koomey study (released 15 Feb 2007)
– Report to Congress (July 2007)
• Discuss implications for industry growth
10
Data centers in the news
• Recent activity in the Southeast, Texas, and the Northwest
– Google
– Yahoo
– Microsoft
– Others
• Announcements often don’t give relevant details like electrically-active floor area, so use caution in interpreting them
• Can’t build them all in one place (constraints)
• Separate hosting, search, corporate, and supercomputers (all different markets)
11
Electricity Flows in Data Centers
[Diagram: local distribution lines to the building (480 V); backup diesel generators; UPS feeding PDUs feeding the computer racks (computer equipment, the uninterruptible load); separate feeds for the HVAC system and for lights, office space, etc.]
UPS = Uninterruptible Power Supply
PDU = Power Distribution Unit
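To make the flow concrete, here is a minimal Python sketch of the power accounting the diagram implies. Every load and efficiency figure in it is an assumption chosen for illustration, not data from the talk:

```python
# Illustrative power accounting for the diagram above.
# All loads and efficiencies are assumed, not measured values.

it_load_kw = 1000.0      # computer equipment in the racks (assumed)
ups_efficiency = 0.90    # UPS conversion efficiency (assumed)
pdu_efficiency = 0.97    # PDU/distribution efficiency (assumed)
hvac_kw_per_it_kw = 0.7  # HVAC load per kW of IT load (assumed)
lights_misc_kw = 50.0    # lights, office space, etc. (assumed)

# Work backward from the racks to the 480 V building feed.
pdu_input_kw = it_load_kw / pdu_efficiency
ups_input_kw = pdu_input_kw / ups_efficiency
hvac_kw = hvac_kw_per_it_kw * it_load_kw
building_feed_kw = ups_input_kw + hvac_kw + lights_misc_kw

print(f"UPS input:       {ups_input_kw:7.1f} kW")
print(f"HVAC:            {hvac_kw:7.1f} kW")
print(f"Building feed:   {building_feed_kw:7.1f} kW")
print(f"Total per IT kW: {building_feed_kw / it_load_kw:.2f}")
```

With these made-up numbers, nearly half the utility draw goes to cooling, lighting, and conversion losses rather than to the computers themselves.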
12
IT Equipment
• Servers
– High end enterprise servers
– Pedestal/tower servers
– Rack mount servers
– Blade servers
• Storage
• Network equipment
– Routers
– LAN/WAN switches
– Hubs
[Photos: enterprise servers, a rack-mount server, a blade server]
13
Watch out for floor area
Source: ASHRAE
14
Uptime Institute data
• The Site Uptime Network is a technical membership organization for data center operators and designers
• Uptime first reported data for 1.6 to 1.9 M sf of facilities in 2002 (for 1999-2001)
• Uptime has tracked 19 data centers for 8 years (total = 0.92 M sf in 1999)
– Total IT load, floor area, and computer power densities (W/sf)
15
Trends in 19 Site Uptime
Network facilities
W/sf measured as watts of IT power divided by electrically active floor area
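A trivial Python sketch of that metric, with hypothetical values:

```python
# W/sf as defined above: watts of IT power divided by
# electrically active floor area. Inputs are hypothetical.

def watts_per_sf(it_power_w: float, active_floor_sf: float) -> float:
    """Computer power density in watts per square foot."""
    return it_power_w / active_floor_sf

# Example: a 500 kW IT load on 10,000 sf of electrically active floor.
print(watts_per_sf(500_000, 10_000))  # 50.0 W/sf
```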
16
Study on total server power
• Details
– Published 15 February 2007
– Funded by AMD
– Authored by Jonathan Koomey
• Download it at http://enterprise.amd.com/us-en/AMD-Business/Technology-Home/Power-Management.aspx
• Reviewers from all major industry players
17
Server power study
methods and data
• Estimated power use for servers
– 2000 and 2005
– Volume, mid-range, and high end servers
– U.S. and World
• Used IDC data for total installed base and most popular models
• Used manufacturer data/estimates for typical power used per unit (the arithmetic is sketched below)
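The core of the method is bottom-up arithmetic: installed base times typical power per unit, summed over server classes. Here is a minimal Python sketch of that calculation; the counts, wattages, and the cooling/auxiliary factor below are placeholders, not the study's actual inputs:

```python
# Bottom-up server electricity estimate (illustrative inputs only).
HOURS_PER_YEAR = 8760

# class: (installed base, typical watts per unit); placeholder values
server_classes = {
    "volume":    (9_000_000, 200),
    "mid-range": (1_000_000, 500),
    "high-end":  (100_000, 5_000),
}

direct_kwh = sum(
    units * watts * HOURS_PER_YEAR / 1000.0
    for units, watts in server_classes.values()
)
total_kwh = 2.0 * direct_kwh  # cooling/auxiliary overhead (assumed factor)

print(f"Direct server use:        {direct_kwh / 1e9:.1f} billion kWh/year")
print(f"With cooling/auxiliaries: {total_kwh / 1e9:.1f} billion kWh/year")
```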
18
Summary results for server
electricity use
19
US and Europe dominate
electricity for servers in 2005
Source: Koomey 2007
20
Asia Pacific growing fast!
21
EPA report to Congress
• Released August 3, 2007
• Download at http://www.energystar.gov/datacenters
• Purposes of the report
– Assess data center electricity use
– Identify barriers to efficiency
– Compile opportunities for industry collaboration and future research
– Propose policies and identify federal leadership opportunities
22
Report to Congress: US
data center electricity use
23
Trends pushing total data center power use up
• Increasing demands for
– E-commerce
– VOIP
– Internet search
– software as a service
– video downloads
– resilience in the face of disaster
– regulatory compliance (e.g. Sarbanes-Oxley)
– IT-enabled business transformation
• More transistors on a chip + more RAM + more volume servers
24
Trends pushing total data
center power use down
• Virtualization/consolidation
• Cooling and power constraints
• Recognition of constraints at the C-level
• Metrics
– Servers + other IT equipment (E*)
– Site infrastructure
• Utility rebates (PG&E)
25
Misplaced incentives throughout
• Energy efficiency metrics not standardized
• 90% of site infrastructure costs are related to kW, not to floor area (Uptime), but costs are almost always charged per square foot (see the sketch after this list)
• Utility bills and infrastructure costs are paid by the facilities department; the cost of servers is paid by the IT department
• IT, facilities, CFO, and real estate folks don’t talk (hierarchy and culture differences)
• Piling safety factor upon safety factor
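A toy Python illustration of the square-foot problem; the dollar figures and loads are invented:

```python
# If ~90% of site infrastructure cost scales with kW but tenants are
# billed per square foot, dense and sparse racks pay the same.
# All numbers below are invented for illustration.

annual_infra_cost = 10_000_000.0  # site infrastructure $/year (assumed)
total_floor_sf = 100_000.0        # electrically active floor area (assumed)
total_it_kw = 5_000.0             # total IT load (assumed)

def charge_per_sf(tenant_sf: float) -> float:
    """Common practice: allocate cost by floor area."""
    return annual_infra_cost * tenant_sf / total_floor_sf

def charge_per_kw(tenant_kw: float) -> float:
    """Cost-reflective practice: allocate cost by IT load."""
    return annual_infra_cost * tenant_kw / total_it_kw

# Two tenants with identical floor area but 10x different density:
print(charge_per_sf(1_000), charge_per_kw(20))   # sparse: same sf charge
print(charge_per_sf(1_000), charge_per_kw(200))  # dense: 10x the kW charge
```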
26
Compute total costs to
understand incentives
• Simple model of total costs, including
– Cooling and other infrastructure costs
– IT capital costs
– Energy costs
– Other operating costs
• Based on current industry practice for
high performance computing facilities
27
Site infrastructure capital
costs are 2/3 of IT cap. costs
Based on a simple model that calculates annualized total costs of ownership
of an HPC data center for the financial industry: Koomey, Jonathan, Kenneth G. Brill, W. Pitt Turner, John R. Stanley, and Bruce Taylor. 2007. A simple model for determining true total cost of ownership for data centers. Santa Fe, NM: The Uptime Institute. September.
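In the spirit of that simple model (though with none of its actual parameters), here is a minimal annualized-TCO sketch in Python; the discount rate, lifetimes, loads, and costs are all assumptions:

```python
# Annualized total cost of ownership, sketched: annualize capital with
# a capital recovery factor, then add energy. All inputs are assumed.

def crf(rate: float, years: int) -> float:
    """Capital recovery factor: converts a lump sum to level annual payments."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

it_capital = 30_000_000.0     # servers and other IT gear (assumed)
infra_capital = 20_000_000.0  # site infrastructure, ~2/3 of IT capital (assumed)
it_kw = 2_000.0               # IT load (assumed)
overhead = 2.0                # total power / IT power ratio (assumed)
price_kwh = 0.08              # $/kWh (assumed)

annual_cost = (
    it_capital * crf(0.10, 3)        # IT refreshed roughly every 3 years
    + infra_capital * crf(0.10, 15)  # infrastructure lasts ~15 years
    + it_kw * overhead * 8760 * price_kwh
)
print(f"Annualized total cost: ${annual_cost / 1e6:.1f} M/year")
```

Note how annualizing makes the fast IT refresh cycle visible: the short IT lifetime inflates IT's annual share even when infrastructure capital is comparable up front.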
28
Efficiency opportunities
• Think “whole system redesign” (RMI)
• Align incentives to minimize True Cost of Ownership
• Low-hanging fruit (Uptime, Ecos, LBNL)
– Modify current infrastructure/operations/incentives
– Kill comatose servers
– Buy efficient power supplies
• A little more work
– Metrics for servers tied to purchases
– Metrics for infrastructure efficiency
– DC power or high efficiency AC power
– Virtualization & consolidation
29
Intel’s new Eco-Rack
• Idea proposed by JK to Lorie Wigle of Intel in early Dec. 2006
• Announced at Intel Developer Forum 9/18/07
• 16-18% savings compared to good current practice
• Normalized workloads
• Eco-Rack 1.5 and 2.0 now in development
30
Eco-Rack savings 16-18%
Data current as of September 18, 2007. Both Standard and Eco-Rack cases assume power save switch
(SpeedStep) is on. Contact: [email protected] or [email protected] with questions or comments.
31
Sources of Eco-Rack Savings
Data current as of September 18, 2007. Both Standard and Eco-Rack cases assume power save switch
(SpeedStep) is on. Contact: [email protected] or [email protected] with questions or comments.
32
Conclusions
• Total power
– for servers is about 1.2% of U.S. electricity use (including cooling and auxiliaries)
– for servers plus networking, storage, and cooling/auxiliary equipment is about 1.5% of U.S. electricity use
– roughly doubled from 2000 to 2005
• If the IDC installed base forecast to 2010 holds, server power use will rise another 40% to 76% from 2005 (growth rates sketched after this list)
• W/sf appears to be going up
• Volume servers driving growth
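A back-of-envelope check of the growth rates these conclusions imply (my arithmetic, not the study's):

```python
# Compound annual growth rates implied by the figures above.

def cagr(growth_factor: float, years: int) -> float:
    return growth_factor ** (1 / years) - 1

print(f"2000-2005 doubling: {cagr(2.0, 5):.1%}/year")   # ~14.9%/year
print(f"2005-2010 +40%:     {cagr(1.40, 5):.1%}/year")  # ~7.0%/year
print(f"2005-2010 +76%:     {cagr(1.76, 5):.1%}/year")  # ~12.0%/year
```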
33
Conclusions (continued)
• Perverse incentives abound
• Organizational changes are needed
– driven mainly by infrastructure costs going up as a fraction of total cost of data centers
• Consensus efficiency metrics are coming soon for IT and infrastructure
• The industry is focused on improving data center efficiency, and big changes are afoot (e.g. Eco-Rack)
34
The final word
• Institutional and personal change are at
least as important as technical change
for solving the climate problem
– even currently available technologies are
not now being adopted
– Information technology can enable these
changes to happen more rapidly than ever
before
35
Key web sites
• EPA on data centers: http://www.energystar.gov/datacenters
• LBNL on data centers: http://hightech.lbl.gov/datacenters.html
• The Green Grid: http://www.thegreengrid.org/
• The Uptime Institute: http://www.upsite.com/TUIpages/tuihome.html
36
Extra slides
37
Some important background:
What do we know about climate?
• “Unequivocal” that the earth’s climate
is warming
• More than 90% certainty that human
emissions of CO2 and other greenhouse
gases are the cause
(Findings from IPCC 2007 WGI, AR4)
38
Average global temperatures* and sea levels are up, snow cover is down
*and the models reproduce these historical temperature changes well.
39
Dramatic recent changes in CO2 and CH4 concentrations
Source of graphs: IPCC Working Group 1 Summary for Policymakers, Fourth Assessment Report, 2007.
“We know humans are responsible for the CO2 spike [since pre-industrial times] because fossil CO2 lacks carbon-14, and the drop in atmospheric C-14 from the fossil-CO2 additions is measurable.”
–John P. Holdren, Harvard University
40
We’re already seeing effects of humans on the earth’s climate
• Reduced Arctic and Antarctic ice cover
• Glacial melting
• Sea level rise
• Floods
• Wildfires
• Drought
• Extreme weather events
• Damage to ecosystems
41
Going, going, gone?
The difference between median minimum Arctic ice coverage and the extent on Sept. 16, 2007 is equal to the area of Alaska and Texas combined (2.61 M sq. km or 1 M sq. miles). http://nsidc.org/news/press/2007_seaiceminimum/20070810_index.html
Median 1979-2000: 6.74 M square km
September 21, 2005: 5.32 M square km
September 16, 2007: 4.13 M square km
42
IPCC 2007 scenarios to 2100
If we don’t alter course, we’ll end up where we’re headed.
Global average surface temperature is heading for a state outside the range experienced during the tenure of Homo sapiens on Earth (slide courtesy of John P. Holdren, Harvard University).
[Chart: projected warming under IPCC 2007 scenarios to 2100, including a constant “Year 2000 concentrations” case.]