

Development and performance evaluation of a methodology, based on distributed computing, for speeding EnergyPlus simulation

Vishal Garg a*, Kshitij Chandrasen a, Jyotirmay Mathur b, Surekha Tetali a, Akshey Jawa a

a Centre for IT in Building Science, International Institute of Information Technology, Hyderabad, India
b Centre for Energy and Environment, Malaviya National Institute of Technology, Jaipur, India

Vishal Garg

Head & Associate Professor,

Centre for IT in Building Science,

IIIT Hyderabad, India.

email: [email protected]

Kshitij Chandrasena

Student, B.Tech Final Year,

IIIT Hyderabad, India.

email: [email protected]

Jyotirmay Mathur

Co-coordinator, Centre for Energy and Environment,

Associate Professor, Mechanical Engineering Department

Malaviya National Institute of Technology

Jaipur, India

email: [email protected]

Surekha Tetali

Student, MS by Research,

Centre for IT in Building Science

IIIT Hyderabad, India.

email: [email protected]

Akshey Jawa

Student, MS by Research,

Centre for IT in Building Science

IIIT Hyderabad, India.

email: [email protected]

Corresponding Author: [email protected]

This paper presents an approach for speeding up EnergyPlus simulations. The computing run time of an energy simulation depends on several variables and is directly proportional to the simulation RunPeriod. In the proposed approach, data parallelization is achieved by breaking an annual simulation into several segments of smaller RunPeriod, each handled by a separate computer/processor. The speed gain achieved by running 12 one-month RunPeriod segments in parallel, as compared to a single simulation of twelve months, is between 3 and 6 times. Segmentation of the simulation results in minor deviations between the results obtained through segmented simulations and annual


simulations. Methods for reducing these deviations on an annual and monthly basis are presented in this paper using twelve benchmark models, each simulated for five cities. On an annual basis, a maximum deviation of 0.06% was observed in cooling, heating, and lighting consumption. In a month-to-month comparison between the segments and the annual simulation, the maximum deviation was 1.7% for heating and 0.8% for cooling.

Keywords: EnergyPlus, energy simulation, simulation run time, parallel simulation, simulation

speed up

1. Introduction

There has been an increased effort among architects, HVAC engineers, and designers to implement energy conservation features in buildings. This has resulted in increased use of energy simulation software in the design process. Energy simulation programs can help in achieving energy-efficient and cost-effective designs.

EnergyPlus [1] is a new-generation building energy simulation program based on DOE-2 [2] and BLAST [3], with numerous added capabilities. EnergyPlus includes many innovative simulation capabilities such as time steps of less than an hour, modular systems and plant integrated with heat balance-based zone simulation, multizone air flow, thermal comfort, water use, natural ventilation, and photovoltaic systems. Though EnergyPlus has these innovative capabilities, its major limitation is that it runs slower than DOE-2. According to a study conducted by Tianzhen H. et al. [4], at a 15-minute time step, EnergyPlus runs much slower than DOE-2.1E, by a factor of 105 for a large office building to 196 for a hospital building. At a 60-minute time step, EnergyPlus still runs slower than DOE-2.1E, by a factor of 25 for the large office building to 54 for the hospital building.

According to the EnergyPlus run time analysis report [5] prepared by the Simulation Research Group, EETD, LBNL, simulation settings that can have significant impacts on EnergyPlus run time include the length of the RunPeriod, Number_of_Timesteps_per_Hour for loads calculations, the heat balance solution algorithm, the solar distribution and reflection calculation algorithm, system convergence limits, the shadow calculation interval, and the length of the warm-up period. This analysis shows that the longer the RunPeriod, the longer the EnergyPlus run time.

Some efforts to use parallel computing to reduce the simulation time of a group of simulations have been reported. Zhang Y [6] developed a Java-based tool to run EnergyPlus on parallel machines, specifically for parametric analysis where multiple design alternatives have to be analyzed simultaneously. GenOpt [7] is an optimization program used to carry out parametric analysis using multiple computers/processors. Both these tools help in speeding up parametric simulations but do not address speeding up individual simulation runs.

In this paper, an approach that uses the data parallelization paradigm is proposed to increase the simulation speed of a single simulation run. In this approach, the annual simulation is segmented into multiple smaller simulations, each of which can run on a dedicated CPU in a computer cluster or on different cores of the same computer. This segmentation reduces the simulation run time and increases the speed of simulation. Segmentation of the simulation results in minor deviations between the results obtained through segmented simulations and annual simulations. To achieve accurate results, the effects of the number of warm-up days and shadow calculation days were analyzed to arrive at different alternatives. This method was tested on 13 models and 5 cities covering various climatic conditions. Speed gains of 3.2x to 5.8x were achieved by running 12 one-month RunPeriod segments in parallel as compared to annual simulations. On an annual basis, a maximum deviation of 0.06% was observed in the cooling, heating, and lighting consumption of buildings. In a month-to-month comparison between the segmented and annual simulations, this deviation in the worst-case scenario was 1.7% for heating and 0.8% for cooling energy consumption.

2. Approach


One parameter that significantly affects the run time of a single simulation run is the RunPeriod of that simulation. RunPeriod is the object in EnergyPlus that contains several fields, including the begin and end dates of the simulation. This information is assigned by the user in the EnergyPlus Input Data File (.idf). In the approach presented in this paper, the annual simulation was divided into twelve segments (splitting the single annual .idf into 12 monthly .idf files), each with a smaller (monthly) RunPeriod, that were then run in parallel. Each monthly simulation takes significantly less time than the annual simulation. These monthly simulations were run independently in parallel on several computers, and the results of all the parallel runs were then collated, resulting in an increase in simulation speed. This approach is explained in Figure 1. In this study, simulations were performed on a cluster of computers and the results of these segmented simulations were compared with the results of the respective months in the annual simulation. It was observed that segmentation of the simulation resulted in minor deviations between the results obtained through segmented simulations and annual simulations. The causes of these deviations and the alternatives to reduce them are discussed in the following sections.

Figure 1
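The segmentation described above can be sketched in a few lines of Python. This is an illustrative sketch rather than the authors' implementation: `monthly_run_periods` only computes the begin/end dates for each segment's RunPeriod, and `run_segment` is a hypothetical stand-in for writing the segment .idf and invoking the EnergyPlus command line.

```python
# Illustrative sketch: derive twelve monthly RunPeriod date ranges and
# dispatch them to parallel workers. run_segment is a hypothetical
# placeholder for writing a segment .idf and launching EnergyPlus
# (e.g. via subprocess); here it simply echoes its input.
import calendar
from concurrent.futures import ProcessPoolExecutor

def monthly_run_periods(year=2002):
    """(begin_month, begin_day, end_month, end_day) for each segment."""
    return [(m, 1, m, calendar.monthrange(year, m)[1]) for m in range(1, 13)]

def run_segment(period):
    # Placeholder: write the .idf with this RunPeriod and run EnergyPlus.
    return period

if __name__ == "__main__":
    with ProcessPoolExecutor(max_workers=12) as pool:
        results = list(pool.map(run_segment, monthly_run_periods()))
    print(len(results))  # one result set per month, to be collated
```

After all workers finish, the per-segment output files would be collated into a single annual report, as in Figure 1.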

2.1 Deviations Observed

When the annual simulation was segmented into twelve monthly simulations, it was observed that there were some deviations in the monthly cooling and heating energy consumption of the segmented simulations as compared to the corresponding months of the annual simulation. This was observed in all 65 cases (13 models x 5 cities) that were simulated (details are given in Section 4). Table 1 gives the percentage deviations in monthly heating and cooling for one of the 65 cases (the “ElemSchool” model for the Chicago climate).

Table 1

Besides the deviations observed in the monthly and annual values, large deviations were also observed in the hourly values of heating and cooling. Figure 2 shows the percentage deviations of the hourly values for cooling and heating energy consumption obtained in the segmented simulations from the corresponding hourly values of the annual simulation of the “ElemSchool” model using the New York weather file.

Figure 2a

Figure 2b

2.2 Causes of Deviations

The deviations discussed in the previous section were due to the following factors:

(1) Mismatch between the day of week for dates in the segmented simulation and the corresponding dates in the annual simulation: The field Day_of_Week_for_Start_Day can be set in the simulation model. The value given in this field can be used to override the day of the week indicated in the weather file. Valid days of the week (Sunday, Monday, Tuesday, Wednesday, Thursday, Friday, and Saturday) can be entered. When weekdays are used, each subsequent day in the period is incremented. If this field is filled with a valid day of the week, all the segmented simulations will take that day of the week as a static value for the Day_of_Week_for_Start_Day. This will result in a different day of week for the same date in the annual simulation and the segmented simulation. This difference can further lead to a difference between the number of total working days in the annual simulation and the sum of all working days in the segmented simulations, which will therefore affect the results. For example, if 1st of July is a Monday in the annual simulation and the field Day_of_Week_for_Start_Day is set to Sunday, then the start day for the seventh segment (July) will be Sunday and will not match the day of week in the annual simulation. Further, there will be four Sundays in the month of July in the annual simulation, whereas in the segmented simulation for July there will be five Sundays. A difference of even one working day can cause a significant difference in the energy consumption for that month.
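The July example can be verified with a short weekday count (a sketch; the modular arithmetic stands in for the calendar lookup EnergyPlus performs against the weather file):

```python
# Count Sundays in a 31-day July when the calendar says 1 July is a
# Monday (annual run) versus when the segment forces
# Day_of_Week_for_Start_Day to a static Sunday.
WEEK = ["Sunday", "Monday", "Tuesday", "Wednesday",
        "Thursday", "Friday", "Saturday"]

def count_sundays(start_day_name, n_days=31):
    start = WEEK.index(start_day_name)
    return sum(1 for d in range(n_days) if (start + d) % 7 == 0)

annual = count_sundays("Monday")    # calendar weekday from the weather file
segment = count_sundays("Sunday")   # forced static start day
print(annual, segment)  # 4 vs 5 Sundays -> one fewer working day in July
```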

(2) Inadequate warm-up if the first day of the segment is a holiday: EnergyPlus performs a warm-up on the first day of the simulation period in order to set the values of certain variables. Convergence of the simultaneous heat balance/HVAC solution is reached when the criterion for either the loads or the temperature is satisfied. In the case of an annual simulation, the simulation starts on 1st January, and the warm-up calculations are carried out for the first day, i.e., 1st January. However, for individual segments, the warm-up takes place on the first day of the segment, i.e., the first day of each month. If the first day of the month happens to be a holiday, the warming up of the model for that segment will be done according to non-operational conditions. This will result in deviated results for the next few working days of that segment and hence affect the total monthly consumption.

(3) Dates of shadow calculation in the annual simulation not matching the dates for segmented simulations: To determine the amount of solar radiation entering a building and to calculate the amount of cooling or heating required to maintain the set point temperature, shadow calculations (sun position, etc.) are performed by EnergyPlus. By default, EnergyPlus performs these calculations with a frequency of 20 days throughout the RunPeriod. The Calculation_Frequency field in the .idf specifies the number of days after which the next shadowing calculation period starts. The dates for which the shadowing calculation is performed depend on this frequency. There can be deviations in the results of simulations if the shadowing calculation dates in the segmented simulation do not align with those in the annual simulation. Therefore, it is important to align the shadowing calculation dates of the segmented simulations with the shadowing calculation dates of the annual simulation.

2.3 Alternatives Proposed to Decrease the Deviations

The deviations mentioned may be reduced by changing some of the simulation

settings and variables. The solutions proposed are:

(1) Selecting Day_of_Week_for_Start_Day from the weather file: If the Day_of_Week_for_Start_Day is set to UseWeatherFile, the day of week for all dates in the annual simulation will be in synchronization with the day of week for the corresponding dates in the segmented simulation, hence ensuring the same number of working days between the segmented and annual simulations. By default, EnergyPlus specifies a particular day (such as Monday) as the Day_of_Week_for_Start_Day, due to which the start day of each segmented simulation will be this specified day. However, this same date in the annual simulation may be some other day, causing a synchronization problem between the days and dates in the annual and segmented simulations. The UseWeatherFile option picks the calendar day for that particular date from the weather file.


(2) Ensuring the first day of the segment is a working day for adequate warming up: If the first day of the month happens to be a holiday, then some days from the previous month can be added to the segment to ensure an adequate number of working days between the first day of the simulation and the first day of the month. This increases the RunPeriod of the segment, and the first day of the segment will not be the 1st of the month. These extra days added to each segment are referred to as warmupPlus days in this paper.

(3) Synchronizing the shadow calculation dates of the segments with the annual shadow calculation dates: If the shadowing calculation dates of the annual simulation and those of each segmented simulation are kept in synchronization with each other, the effects of unaligned shadow calculations can be minimized. This is achieved by adding some more days from the previous month to the segment so that the start dates of the shadow calculation periods of the segmented simulation align with the annual shadowing dates. These extra days added before each segment are referred to as shadowSync days in this paper.
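Solutions (2) and (3) can be sketched as date arithmetic. This assumes, as described above, that the annual shadow-calculation periods begin on day-of-year 1, 1+freq, 1+2*freq, and so on; the function names are illustrative, not part of EnergyPlus.

```python
# Minimal sketch of the warmupPlus and shadowSync ideas. Assumption:
# annual shadow-calculation boundaries fall on day-of-year 1, 1+freq, ...
from datetime import date, timedelta

def warmup_plus_start(year, month, warmup_plus=7):
    """Segment begin date extended backwards by warmupPlus days.
    (January would need special handling: the extension crosses years.)"""
    return date(year, month, 1) - timedelta(days=warmup_plus)

def shadow_sync_days(year, month, freq=20, minimum=0):
    """Extra days to prepend so the segment start lands on an annual
    shadow-calculation boundary; `minimum` is 0 for W0Sync, 7 for W7Sync."""
    doy = date(year, month, 1).timetuple().tm_yday
    extra = (doy - 1) % freq            # days back to the nearest boundary
    while extra < minimum:
        extra += freq                   # step back a full period if needed
    return extra

print(warmup_plus_start(2002, 7))                     # 2002-06-24
print(shadow_sync_days(2002, 4, freq=15, minimum=0))  # 0: 1 April is a boundary
print(shadow_sync_days(2002, 4, freq=15, minimum=7))  # 15
```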

The following measures were taken to implement these solutions:

(1) The Day_of_Week_for_Start_Day element in the RunPeriod class was changed to UseWeatherFile. This ensured that the total number of working days for all the months in the segments was the same as the total number of working days in the annual simulation.

(2) To every segment, 7 warmupPlus days of simulation were added before the simulation days of the segment. This ensured adequate warm-up, especially if the 1st day of the month happened to be a holiday.


(3) Some more days were added at the beginning of each segment to ensure that the shadow calculation dates in the segment mapped onto those of the annual simulation.

Before arriving at this proposed solution, some more variants were analyzed. In addition to the 7 warmupPlus days, simulations were done with 0, 14, and 21 warmupPlus days. Two more variants with synchronized shadow calculation days were also analyzed. This resulted in six different alternatives based on the combination of warmupPlus days and shadowSync days. These six combinations were tested to find the alternative that results in the least deviation in cooling, heating, lighting, and equipment consumption when compared with the annual simulation.

The six alternatives and their names as used in the paper are:

(1) W0: 0 warmupPlus days

(2) W7: 7 warmupPlus days

(3) W14: 14 warmupPlus days

(4) W21: 21 warmupPlus days

(5) W0Sync: shadowSync days with a minimum allowed value of 0

(6) W7Sync: shadowSync days with a minimum allowed value of 7

For all six alternatives, Day_of_Week_for_Start_Day is set to UseWeatherFile.

3. Performance of the alternatives

As discussed in earlier sections, warm-up and synchronization of shadow calculation dates have a significant impact on the simulation run time and the accuracy of the results. Simulations were conducted to evaluate the performance of the six alternatives proposed in the previous section in terms of accuracy and speed-up.


The “ElemSchool” model was simulated for five cities (Chicago, San Francisco, Tampa, New York, and Houston) with all six alternatives. The effects of shadow sync and warm-up days on the accuracy of the results and the speed gain are discussed in Sections 3.1 and 3.2.

3.1 Accuracy of the alternatives

The six alternatives were simulated to analyze the deviations in the results of the segmented simulations in comparison with the corresponding results of the annual simulation. Large deviations in monthly consumption values were observed when the “ElemSchool” model was simulated with Chicago weather data. Table 2 provides the percentage deviations in cooling electricity consumption for various months between the segmented and annual simulations for all six alternatives. The maximum deviation in monthly cooling consumption was 0.098%, for the W21 alternative in the month of July. Table 3 shows the percentage deviations in heating energy consumption. The maximum deviation in monthly heating consumption was 1.87%, for the W0 and W0Sync alternatives in the month of April. In both tables, the minimum deviation values for each month are highlighted in bold.

Table 2

Table 3

Some of the observations from Tables 2 and 3 are:

- For cooling, W0Sync and W7Sync show the least deviations.
- For heating, W7Sync shows the least deviations.
- W14 shows better accuracy for both heating and cooling amongst W0, W7, W14, and W21.
- In the annual results, W7Sync shows significantly smaller deviations compared to the other methods.

These observations clearly indicate that the alternatives W0Sync and W7Sync are more accurate, by a significant margin, than the other alternatives. As observed in Table 2, the deviations in cooling consumption are higher in summer. Since the cooling values are of a lower order for a city like Chicago, a small deviation in cooling consumption gives a greater percentage difference. Another interesting observation is that W14 shows better results than W21, even though W21 has more days of simulation before the first day of the segmented simulation. The more accurate results of W14 are due to the shadowing frequency of 15 days in this model: the extra 14 days synchronize the shadowing periods better than the extra 21 days taken in W21.

Tables 2 and 3 show the comparison of the six alternatives based on their performance on a monthly and annual basis. W0Sync and W7Sync both give fairly accurate results for monthly and annual values. Deviations in the hourly values were then checked to compare the two alternatives. Large deviations in hourly consumption values were observed when the “ElemSchool” model was simulated with New York weather data.

Figures 3a and 3b show the percentage deviations in hourly cooling and heating consumption between the segmented and annual simulations for the W0Sync alternative. The maximum deviation in the hourly values was 54% for cooling and 18.1% for heating. It can be observed from the graphs that even though W0Sync is accurate on the monthly and annual values, there are large deviations in the hourly values.


Figure 3a

Figure 3b

Figures 4a and 4b show the percentage deviations in hourly cooling and heating consumption between the segmented and annual simulations when applying the W7Sync alternative. The maximum deviation in the hourly values was 0.01% for cooling and 0.04% for heating. Hence, W7Sync is not only accurate on the monthly and annual values; it is also very accurate on the hourly values.

Figure 4a

Figure 4b

The graphs clearly indicate that the deviations observed in the hourly data for W0Sync were reduced in the W7Sync alternative. The deviations in the graph also strengthen the proposition that the addition of warmupPlus days is important. April, for instance, shows heavy deviations in both cooling and heating. This can be attributed to the fact that 1st April is the start date of one of the shadow calculation periods. Hence, in the W0Sync setup there are 0 warmupPlus days, leading to deviations, while in the W7Sync setup there are 15 warmupPlus days (since the shadowing frequency is 15 days), which results in negligible errors. Since the shadowing frequency for this simulation set is 15 days, other months are able to align appropriately and show fewer deviations. However, if the frequency is some other number, for example 20, which causes less alignment, then these deviations will increase. The W7Sync algorithm, by contrast, will work for any value of the shadowing frequency and will provide accurate results.

3.2 Speed gain of the alternatives

To evaluate the performance of all the alternatives in terms of simulation run time, the “ElemSchool” model with the Chicago weather file was used and the time taken by each simulation was observed. Table 4 shows the simulation run time (in seconds) and speed gain of the monthly segments, the time taken by the annual simulation (in seconds), and the effective speed gain for the six alternatives. Speed gain is obtained by dividing the time of the annual simulation by the time taken by the segmented simulation. Variation can be observed in the simulation time of the segments for different months. This is due to differences in the number of shadow calculations and the time taken for convergence of the various HVAC calculations. It is observed that in colder months such as January and December, the time taken is greater due to an increased number of iterations in the HVAC calculations. The segment that takes the maximum time governs the effective speed gain for that alternative; hence, the minimum speed-up value obtained for each set of segments is highlighted in bold. It is clear from the table that the overall speed gain decreases with the number of warmupPlus days. W0 is fastest but not very accurate, and W7Sync is slower but more accurate. Due to this accuracy, W7Sync was selected for demonstrating the proposed approach of speeding up EnergyPlus using parallel computing.

Table 4
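The speed-gain bookkeeping described above can be written down directly. The times below are illustrative placeholders, not the measured values from Table 4:

```python
# The rule described above: the slowest segment (plus any fixed overhead)
# governs the effective parallel time, and the gain is the annual time
# divided by that parallel time. All times here are illustrative.
def effective_speed_gain(annual_time, segment_times, overhead=0.0):
    """Annual run time divided by the parallel (slowest-segment) time."""
    return annual_time / (max(segment_times) + overhead)

print(round(effective_speed_gain(1200.0, [400, 300, 280]), 2))               # 3.0
print(round(effective_speed_gain(1200.0, [400, 300, 280], overhead=15), 2))  # 2.89
```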

3.3 Performance Analysis of W7Sync

After selecting the W7Sync alternative for demonstrating the proposed approach, this alternative was further analysed to observe the effect of the following on the speed gain: Number_of_Timesteps_per_Hour, the number of processing units, and overheads with respect to the extra time taken in transferring files over the network and collating the output.

(1) Number_of_Timesteps_per_Hour: The ElemSchool model for the Chicago climate was simulated with varying time steps to study the impact of time steps on speed-up. It was observed that the simulation speed gain increased with the value of Number_of_Timesteps_per_Hour:

- Speed gain with Number_of_Timesteps_per_Hour = 1 is 2.73
- Speed gain with Number_of_Timesteps_per_Hour = 4 is 3.15
- Speed gain with Number_of_Timesteps_per_Hour = 20 is 4.12

(2) Number of processing units: To demonstrate the effect of the number of processing units on the speed gain, the ElemSchool model for Chicago was simulated on 2, 3, 4, 6, and 12 processing units. It was observed that the speed gain decreases with the number of processing units. The speed gain versus the number of processing units is shown in Table 5.

(3) Overheads with respect to extra time taken: The whole process has some overheads with respect to the extra time taken for pre-processing the .idf file, sending the .idf files to the processors, collecting the results from the processors, and collating the results into a single file. The overhead depends largely on the network speed, the size of the .idf file, and the size of the output file, which in turn depends on the number of variables and their reporting frequency. In the simulations performed, it was observed that the overhead was close to 5% of the time taken by the slowest segment. For example, for the ElemSchool model, the overall overhead was about 15 seconds and the time taken by the slowest segment was 422 seconds, so the total overhead was 4.02%.
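When there are fewer processing units than segments, several segments must share a unit and the effective parallel time grows, which is the trend Table 5 reports. A rough sketch of that effect, using illustrative runtimes and a greedy longest-first assignment (not necessarily the scheduling the authors used):

```python
# Greedy longest-processing-time-first assignment of segment runtimes to
# k workers; the makespan (most-loaded worker) bounds the parallel time.
import heapq

def makespan(times, workers):
    loads = [0.0] * workers          # current total runtime per worker
    heapq.heapify(loads)
    for t in sorted(times, reverse=True):
        heapq.heappush(loads, heapq.heappop(loads) + t)  # give to least-loaded
    return max(loads)

times = [400, 310, 300, 290, 280, 270, 265, 260, 255, 250, 245, 240]
for k in (12, 6, 4, 2):
    print(k, makespan(times, k))  # parallel time grows as k shrinks
```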

4. Simulation and results

In order to demonstrate the approach with the selected alternative W7Sync, simulations were performed using EnergyPlus V4. Thirteen different building models were simulated across five different cities. Details of the simulation models, weather data, and the results are provided in the following sections.

4.1 Building Prototypes

The U.S. Department of Energy (DOE), through three of its national laboratories, has developed a set of standard benchmark building models for new and existing buildings [8]. These models are used in this paper to test the proposed approach. The revised and latest models for EnergyPlus v5 are now referred to as 'commercial reference building models for new construction' [9]. The changes between the standard benchmark models used in this paper and the new reference building models are listed in the 'summary of changes from v1.2_4.0 to v1.3_5.0' document [10]. These models provide consistent, standardized models for the analysis of the results. To analyze the performance of the proposed alternative over different models and climates, the benchmark buildings that come with the EnergyPlus installation were used for the simulations. Thirteen different building models (twelve benchmark buildings for new construction and one example model from EnergyPlus) were used. The example file from the EnergyPlus installation is a 5-zone VAV model with daylight sensors and is used to check the concept and the accuracy of results for a building with daylight sensors. Some important characteristic features of these models are listed in Table 6.

Table 6

4.2 Climate Zones

All thirteen models were simulated for five different cities of the USA, which belong to four different climate zones, as shown in Table 7.

Table 7

4.3 Performance of the proposed alternative, W7Sync

To evaluate the performance of the proposed alternative, 65 cases (13 models x 5 cities) were simulated. Results show that W7Sync is reasonably accurate for all the cases, as observed in the earlier section. As a compact summary of the results for all 65 cases, the deviations in annual cooling, heating, equipment, and lighting energy consumption and the speed gains are listed in Table 8. Analysis was performed on all the cases, and the deviations in hourly and monthly consumption were noted to be very minor, within limits similar to those discussed in the previous sections.

Table 8

4.4 Results

From the simulations performed over the thirteen models and five cities, the speed-up achieved varied from 3.15x to 5.84x. The minimum speed gain, 3.15x, was observed for the ElemSchool model with the Chicago weather; the maximum, 5.84x, for the MidApt model with the Chicago weather. The average gain over all 65 cases was 4.77x. An interesting observation is that the speed gain for a model depends on the climate for which the simulation is run.
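The summary figures quoted above can be reproduced directly from the speed gains reported in Table 8, one value per model and city:

```python
from statistics import mean

# Speed gains from Table 8: for each model, the five cities in the order
# Chicago, San Francisco, Tampa, New York, Houston.
speed_gains = {
    "ElemSchool":     [3.15, 3.68, 3.76, 3.33, 3.50],
    "Fastfood":       [4.30, 4.86, 5.05, 4.54, 4.79],
    "HighSchool":     [4.92, 5.43, 5.46, 5.37, 5.40],
    "Hospital":       [5.61, 5.63, 5.70, 5.20, 5.44],
    "LargeHotel":     [4.84, 5.22, 5.77, 5.23, 5.58],
    "LargeOff":       [5.23, 5.47, 5.18, 5.31, 5.04],
    "MedOff":         [4.16, 4.40, 4.30, 4.31, 4.29],
    "MidApt":         [5.84, 5.42, 5.80, 4.92, 5.62],
    "Retail":         [4.36, 4.50, 4.44, 4.53, 4.34],
    "SitdownRestrau": [4.33, 4.84, 5.07, 4.46, 4.69],
    "SmallHotel":     [3.40, 3.51, 3.51, 3.42, 3.50],
    "SmallOffice":    [4.18, 4.58, 4.58, 4.53, 4.55],
    "5ZoneVAV":       [5.45, 5.47, 5.44, 5.61, 5.43],
}

# Flatten the 13 x 5 table into the 65 individual cases.
all_gains = [g for gains in speed_gains.values() for g in gains]
print(min(all_gains), max(all_gains), round(mean(all_gains), 2))
# 3.15 5.84 4.77
```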

The maximum deviation in cooling consumption was noted in the SmallOffice model, and in heating consumption in the MidApt model. The maximum and minimum annual deviations across all models are shown in Table 9.

Table 9

Table 10 shows the maximum percentage deviations in the monthly heating, cooling, and lighting consumption across all 65 simulated cases. The deviation in equipment electricity consumption was zero in every case.

Table 10

5. Conclusion

The approach of dividing a simulation into segments and running them in parallel decreases the simulation run time and is accurate enough to be used in practice for speeding up an EnergyPlus simulation. The proposed approach was applied to 13 models and five weather files to check the accuracy of the results against the corresponding annual simulations. On an annual basis, a maximum deviation of 0.06% was observed in cooling, heating, and lighting consumption. In a month-to-month comparison between the segmented and annual simulations, the maximum deviation was 1.7% for heating and 0.8% for cooling. The speed gain ranged between 3x and 6x. The proposed approach can be very useful for performing a large number of simulations during the conceptual design stage and for parametric analysis; for the final analysis, a single conventional simulation run can be performed to obtain precise results. Further work includes the development of a tool that applies the described algorithm on a cluster of computers or processors.
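The segmentation idea can be sketched in a few lines. The sketch below is illustrative only and is not the tool described in the paper: it assumes the modern EnergyPlus command-line interface (energyplus -w <epw> -d <outdir> <idf>) and hypothetical file names, and it simply appends a one-month RunPeriod object to a copy of the model, whereas a real implementation would replace the model's existing RunPeriod (and, for the W7/W7Sync alternatives, prepend warmup days).

```python
import calendar
import subprocess
from pathlib import Path

IDF = "building.idf"      # hypothetical model file
EPW = "chicago_tmy3.epw"  # hypothetical weather file

def write_segment_idf(month: int) -> Path:
    """Write a copy of the model whose RunPeriod covers a single month."""
    last_day = calendar.monthrange(2021, month)[1]  # any non-leap year
    run_period = (
        "RunPeriod,\n"
        f"  Segment{month:02d},  !- Name\n"
        f"  {month}, 1, ,        !- Begin Month, Day of Month, Year\n"
        f"  {month}, {last_day}, ;  !- End Month, Day of Month, Year\n"
    )
    segment = Path(f"segment_{month:02d}.idf")
    # Illustration only: append the segment RunPeriod to the model text.
    segment.write_text(Path(IDF).read_text() + run_period)
    return segment

def launch_segment(month: int):
    """Start one EnergyPlus process for the given month (non-blocking)."""
    idf = write_segment_idf(month)
    cmd = ["energyplus", "-w", EPW, "-d", f"out_{month:02d}", str(idf)]
    try:
        return subprocess.Popen(cmd)
    except FileNotFoundError:
        return None  # EnergyPlus not on PATH; the sketch still writes the IDFs

Path(IDF).write_text("Version, 9.4;\n")  # stand-in model for this sketch
processes = [launch_segment(m) for m in range(1, 13)]  # 12 parallel segments
for p in processes:
    if p is not None:
        p.wait()  # wall-clock time is governed by the slowest segment
```

Once all twelve segments finish, their monthly outputs are concatenated to form the annual result; the effective speed gain is then the annual run time divided by the run time of the slowest segment.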

References

[1] EnergyPlus Energy Simulation Software. Available from: www.energyplus.gov

[2] DOE-2. Available from: http://gundog.lbl.gov/dirsoft/d2whatis.html

[3] BLAST: Building Load Analysis and System Thermodynamics. Available from: http://www.cecer.army.mil/facts/sheets/cf-47.pdf

[4] Hong T, Buhl F, Haves P, Selkowitz S, Wetter M. 2008. Comparing computer run time of building simulation programs. Building Simulation, 1(3), 210-213.

[5] Hong T. 2009. EnergyPlus run time analysis. Lawrence Berkeley National Laboratory, LBNL Paper LBNL-1311E. Available from: http://escholarship.org/uc/item/36h4m5z0

[6] Zhang Y. 2009. "Parallel" EnergyPlus and the development of a parametric analysis tool. In: Eleventh International IBPSA Conference, 27-30 July 2009, Glasgow, Scotland: 1382-1388.

[7] GenOpt: Generic Optimization Program. Available from: http://gundog.lbl.gov/GO/index.html

[8] Torcellini P, et al. 2008. DOE commercial building benchmark models. In: ACEEE Summer Study on Energy Efficiency in Buildings, 17-22 August 2008, Pacific Grove, California.

[9] Commercial Reference Building Models for New Construction. Available from: http://www1.eere.energy.gov/buildings/commercial_initiative/new_construction.html [Accessed 4 October 2010]

[10] Summary of Changes from v1.2_4.0 to v1.3_5.0. Available from: http://apps1.eere.energy.gov/buildings/publications/pdfs/commercial_initiative/refbldgs_changes_v40tov50.pdf [Accessed 4 October 2010]

Page 20: Development and performance evaluation of a methodology ...web2py.iiit.ac.in/publications/default/download/... · Development and performance evaluation of a methodology, based on

Table 1: Percentage deviation in monthly cooling and heating consumption

between segmented and annual run

% Deviation

Months Cooling Heating

Jan 0 -0.0004

Feb 0 -0.1908

March 0.0004 0.0019

April 0 -1.8722

May -0.0071 0.5679

June -0.0017 0.2258

Jul 0.0977 0.0925

Aug 0.0089 -0.5372

Sep -0.0076 0.0948

Oct 0.0555 0.1534

Nov -0.0013 0.0856

Dec 0 -0.0118

Annual 0.0248 -0.1128
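The deviations in Table 1 (and in the tables that follow) are relative differences between the segmented and annual runs. A minimal sketch of the computation, assuming the sign convention (segmented - annual) / annual and using illustrative values that are not taken from the paper:

```python
def percent_deviation(segmented: float, annual: float) -> float:
    """Percentage deviation of the segmented result from the annual run."""
    if annual == 0:
        return 0.0  # months with no consumption in the annual run
    return (segmented - annual) / annual * 100.0

# Illustrative values (not from the paper): segmented run reads 0.5% high.
print(round(percent_deviation(100.5, 100.0), 4))  # 0.5
```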


Table 2: Deviations in cooling electricity consumption for the six alternatives simulated for the ElemSchool model and the Chicago climate. The maximum deviation was 0.098% for the W21 alternative in the month of July.

Alternatives

Months W0 W7 W14 W21 W0Sync W7Sync

Jan 0 0 0 0 0 0

Feb 0 0 0 0 0 0

March 0.0004 -0.0012 0.0004 -0.0012 0 0

April 0 0.1064 0 -0.0129 0 0

May -0.0071 -0.0794 -0.0025 -0.0362 -0.0071 -0.0025

June -0.0017 -0.0316 -0.0008 -0.0408 0 0

Jul 0.0977 0.097 0.097 0.0984 0.042 0.042

Aug 0.0089 0.0315 0.009 0.0315 -0.0008 -0.0008

Sep -0.0076 -0.0097 -0.008 -0.0087 -0.0068 -0.0068

Oct 0.0555 0.0156 0.056 0.0526 -0.0012 -0.0012

Nov -0.0013 -0.0178 -0.0178 -0.0006 0 0

Dec 0 0 0 0 0 0

Annual 0.0248 0.0188 0.0228 0.018 0.0082 0.0086


Table 3: Deviations in heating consumption for the six alternatives simulated for the ElemSchool model and the Chicago climate. The maximum deviation was 1.87% for the W0 and W0Sync alternatives in the month of April.

Alternatives

Month W0 W7 W14 W21 W0Sync W7Sync

Jan -0.0004 -0.0004 -0.0004 -0.0004 -0.0004 -0.0004

Feb -0.1908 -0.0436 0.0095 -0.0387 0.0038 -0.0024

March 0.0019 -0.0579 -0.0236 -0.0395 0 0

April -1.8722 -0.0638 0 -0.0793 -1.8722 0

May 0.5679 -0.0641 0.0002 -0.0288 0.5679 0.0002

June 0.2258 -0.078 0.0181 -0.1015 -0.0145 0

Jul 0.0925 0.0641 0.0666 0.0715 0.0413 0.0376

Aug -0.5372 -0.1311 -0.0869 -0.1311 -0.103 -0.1118

Sep 0.0948 -0.1657 -0.0318 -0.1198 -0.0037 -0.0037

Oct 0.1534 0.1344 0.1456 0.1803 0.0511 0.0511

Nov 0.0856 -0.0967 -0.0967 -0.0944 0.0172 0.0172

Dec -0.0118 0.0174 -0.0193 -0.0411 0 0

Annual -0.1128 -0.0235 -0.0114 -0.0314 -0.0843 0.0021


Table 4: Simulation run time (in seconds) and speed gain of the monthly segments, the time taken by the annual simulation (in seconds), and the effective speed gain for the six alternatives. Each cell lists the segment run time (s) followed by its speed gain.

Month W0 W7 W14 W21 W0Sync W7Sync

Jan 355 3.7 360 3.6 357 3.7 358 3.6 356 3.7 358 3.7

Feb 277 4.8 354 3.7 379 3.5 426 3.1 282 4.7 385 3.5

March 195 6.8 223 5.9 267 4.9 292 4.5 262 5.0 266 5.0

April 179 7.4 192 6.8 209 6.3 222 5.9 179 7.3 212 6.3

May 178 7.4 192 6.8 204 6.5 217 6.0 178 7.4 207 6.4

June 178 7.4 194 6.7 205 6.4 218 6.0 180 7.3 211 6.3

Jul 203 6.5 217 6.0 236 5.6 242 5.4 207 6.4 240 5.6

Aug 179 7.4 195 6.7 210 6.3 222 5.9 183 7.2 215 6.2

Sep 183 7.2 198 6.6 211 6.2 225 5.8 189 7.0 218 6.1

Oct 175 7.5 189 6.9 199 6.6 214 6.1 180 7.3 211 6.3

Nov 228 5.8 242 5.4 243 5.4 264 4.9 235 5.6 261 5.1

Dec 336 3.9 388 3.4 407 3.2 423 3.1 357 3.7 422 3.2

Annual 1316 1308 1314 1302 1316 1331

Effective

speed gain 3.7 3.4 3.2 3.1 3.7 3.2
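The effective speed gain in Table 4 is limited by the slowest monthly segment, since all twelve segments run in parallel: effective gain = annual run time / longest segment run time. A quick check against the W0 column:

```python
# W0 segment run times (s) from Table 4, January through December.
w0_segment_times = [355, 277, 195, 179, 178, 178, 203, 179, 183, 175, 228, 336]
w0_annual_time = 1316  # annual simulation run time (s) for W0

# With 12 parallel workers the wall-clock time equals the slowest segment.
effective_gain = w0_annual_time / max(w0_segment_times)
print(round(effective_gain, 1))  # 3.7, matching the W0 effective speed gain
```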


Table 5: Speed gain achieved for ElemSchool model simulated for Chicago

climate, with varying number of processing units

Number of processing units Speed Gain
12 3.2
6 2.6
4 2.3
3 2.1
2 1.8


Table 6: Characteristics of the thirteen building models used for testing the

proposed approach of speeding up EnergyPlus

Columns: Model Name; Floor Area (thousand m2); Number of Floors; then feature flags (x = feature present) for Zone Definition (Interzone Surfaces, People, Lights, Windows, Daylight, Zonal Equipment), HVAC System (Central Air Handling Equipment, System Equipment Autosize), and HVAC Plant (Coils, Pumps, Boilers, Chillers, Towers).

ElemSchool 6.871 1 x x x x x x x x
Fastfood 0.2323 1 x x x x x x x x
HighSchool 24 2 x x x x x x x x x x x
Hospital 18.697 6 x x x x x x x x x x x
LargeHotel 9.366 6 x x x x x x x x x x x
LargeOff 42.757 12 x x x x x x x x x x x
MedOff 4.952 3 x x x x x
MidApt 3.135 4 x x x x x x x x
Retail 3.882 1 x x x x
SitdownRestrau 0.511 1 x x x x
SmallHotel 1.958 2 x x x x
SmallOffice 0.511 1 x x x x
5ZoneVAV 0.8 1 x x x x x x x x x x x x


Table 7: Climate data used for running the simulation models

City Climate zone Climate type Weather file name
Chicago 5A Cool-Humid USA_IL_Chicago-OHare.Intl.AP.725300_TMY3.epw
San Francisco 3C Warm-Marine USA_CA_San.Francisco.Intl.AP.724940_TMY3.epw
Tampa 1A Very Hot-Humid USA_FL_Tampa.Intl.AP.722110_TMY3.epw
New York 5A Cool-Humid USA_NY_New.York-J.F.Kennedy.Intl.AP.744860_TMY3.epw
Houston 2A Hot-Humid USA_TX_Houston-D.W.Hooks.AP.722429_TMY3.epw


Table 8: Percentage annual deviations for Cooling, Heating, Equipment and

Lighting consumption and effective speed up for 13 models and 5 cities

Model Chicago San Francisco Tampa New York Houston

ElemSchool

Cooling 0.0086 0.0062 0.0039 0.0007 0.0052

Heating 0.0021 -0.0011 -0.0053 -0.0010 -0.0025

Equipment 0.0000 0.0000 0.0000 0.0000 0.0000

Lighting 0.0000 0.0000 0.0000 0.0000 0.0000

Speed Gain 3.15x 3.68x 3.76x 3.33x 3.5x

Fastfood

Cooling 0.0088 0.0451 0.0032 0.0116 0.0047

Heating 0.0002 -0.0004 -0.0021 0.0004 0.0017

Equipment 0.0000 0.0000 0.0000 0.0000 0.0000

Lighting 0.0000 0.0000 0.0000 0.0000 0.0000

Speed Gain 4.3x 4.86x 5.05x 4.54x 4.79x

HighSchool

Cooling 0.0020 0.0083 0.0056 0.0051 0.0104

Heating 0.0015 -0.0021 0.0036 0.0020 0.0038

Equipment 0.0000 0.0000 0.0000 0.0000 0.0000

Lighting 0.0000 0.0000 0.0000 0.0000 0.0000

Speed Gain 4.92x 5.43x 5.46x 5.37x 5.4x

Hospital

Cooling 0.0002 0.0006 0.0009 0.0005 0.0006

Heating 0.0006 -0.0012 -0.0019 -0.0015 -0.0021

Equipment 0.0000 0.0000 0.0000 0.0000 0.0000

Lighting 0.0000 0.0000 0.0000 0.0000 0.0000

Speed Gain 5.61x 5.63x 5.7x 5.2x 5.44x

LargeHotel

Cooling 0.0180 0.0295 0.0145 0.0223 0.0170

Heating 0.0004 -0.0002 -0.0020 0.0006 0.0009

Equipment 0.0000 0.0000 0.0000 0.0000 0.0000

Lighting 0.0000 0.0000 0.0000 0.0000 0.0000

Speed Gain 4.84x 5.22x 5.77x 5.23x 5.58x

LargeOff

Cooling 0.0245 0.0351 0.0119 0.0176 0.0029

Heating 0.0009 -0.0001 0.0108 0.0049 -0.0013

Equipment 0.0000 0.0000 0.0000 0.0000 0.0000

Lighting 0.0000 0.0000 0.0000 0.0000 0.0000

Speed Gain 5.23x 5.47x 5.18x 5.31x 5.04x

MedOff

Cooling 0.0233 0.0211 0.0112 0.0263 0.0172

Heating 0.0005 -0.0044 0.0021 0.0001 0.0018

Equipment 0.0000 0.0000 0.0000 0.0000 0.0000

Lighting 0.0000 0.0000 0.0000 0.0000 0.0000

Speed Gain 4.16x 4.4x 4.3x 4.31x 4.29x


MidApt

Cooling 0.0205 0.0302 0.0164 0.0231 0.0164

Heating 0.0256 0.0165 0.0185 0.0046 0.0051

Equipment 0.0000 0.0000 0.0000 0.0000 0.0000

Lighting 0.0000 0.0000 0.0000 0.0000 0.0000

Speed Gain 5.84x 5.42x 5.8x 4.92x 5.62x

Retail

Cooling 0.0078 0.0418 0.0059 0.0100 0.0057

Heating 0.0034 0.0007 0.0015 0.0011 -0.0012

Equipment 0.0000 0.0000 0.0000 0.0000 0.0000

Lighting 0.0000 0.0000 0.0000 0.0000 0.0000

Speed Gain 4.36x 4.5x 4.44x 4.53x 4.34x

SitdownRestrau

Cooling 0.0043 0.0348 0.0020 0.0062 0.0031

Heating 0.0001 -0.0001 -0.0016 0.0001 0.0006

Equipment 0.0000 0.0000 0.0000 0.0000 0.0000

Lighting 0.0000 0.0000 0.0000 0.0000 0.0000

Speed Gain 4.33x 4.84x 5.07x 4.46x 4.69x

SmallHotel

Cooling 0.0123 0.0153 0.0106 0.0141 0.0116

Heating 0.0002 -0.0014 -0.0016 -0.0001 0.0012

Equipment 0.0000 0.0000 0.0000 0.0000 0.0000

Lighting 0.0000 0.0000 0.0000 0.0000 0.0000

Speed Gain 3.4x 3.51x 3.51x 3.42x 3.5x

SmallOffice

Cooling 0.0491 0.1599 0.0277 0.0607 0.0334

Heating 0.0001 0.0003 -0.0007 0.0007 0.0011

Equipment 0.0000 0.0000 0.0000 0.0000 0.0000

Lighting 0.0000 0.0000 0.0000 0.0000 0.0000

Speed Gain 4.18x 4.58x 4.58x 4.53x 4.55x

5ZoneVAV

Cooling 0.0207 0.0437 0.0113 0.0207 -0.0337

Heating -0.0009 -0.0103 0.0025 0.0021 -0.0108

Equipment 0.0000 0.0000 0.0000 0.0000 0.0000

Lighting 0.0054 -0.0042 -0.0220 -0.0399 -0.0168

Speed Gain 5.45x 5.47x 5.44x 5.61x 5.43x


Table 9: Maximum and Minimum percentage deviations in annual heating,

cooling, lighting and equipment consumption

Heating Cooling Lighting Equipment

Minimum Deviation 0.0001 0.0002 0 0

Maximum Deviation 0.0256 0.0607 0 0


Table 10: Maximum percentage deviations in monthly heating, cooling and

lighting consumption

Cooling Heating Lighting

Maximum Value -0.8015 1.7407 -0.4307

Model Medium Office Small Office 5ZoneVAV

City New York New York Tampa


Figure 1: Flow diagram of the entire process

Figure 2a: Percentage deviation in hourly cooling electricity between the segmented and annual simulations for the ElemSchool model with the Chicago weather. The maximum hourly deviation is 100%.

Figure 2b: Percentage deviation in hourly heating consumption between the segmented and annual simulations for the ElemSchool model with the Chicago weather. The maximum deviation observed in an hour is as high as 1340%. (Instead of the peak, a general range in which the percentage deviations lie has been used to scale the plot.)

Figure 3a: Percentage deviation in hourly cooling electricity between the segmented and annual simulations for the ElemSchool model with the New York weather. The maximum hourly deviation is 54%.

Figure 3b: Percentage deviation in hourly heating consumption between the segmented and annual simulations for the ElemSchool model with the New York weather. The maximum hourly deviation is 18.1%.

Figure 4a: Percentage deviation in hourly cooling electricity between the segmented and annual simulations for the ElemSchool model with the New York weather. The maximum hourly deviation is 0.01%.

Figure 4b: Percentage deviation in hourly heating consumption between the segmented and annual simulations for the ElemSchool model with the New York weather. The maximum hourly deviation is 0.04%.