
MULTI-OBJECTIVE PARAMETRIC EXPERIMENTS DESIGN

AUTHORS
Victor Okhoya, Perkins+Will
Marcelo Bernal, Perkins+Will
Tyrone Marshall, Perkins+Will
John Haymaker, Perkins+Will

Abstract. Although the importance of multi-objective parametric analysis is recognized in research literature, in architectural practice it
is difficult to implement. Design teams struggle because architectural
decisions involve vast design spaces and multiple, sometimes
conflicting, criteria. They also need appropriate methods for
interpreting data and visualizing the geometry of the design. This
requires a systematic and multi-disciplinary approach to analysis and
optimization that enables teams to narrow the space of exploration
through statistical sampling, objective weights, and interactive
geometric and data visualizations so that designers can see the physical
impact of the optimization. This paper explores a Multi-Objective
Parametric Experiments Design (MOPED) workflow that seeks to
address the above challenges. The workflow combines parametric
analysis with stakeholder preferences, design of experiment methods,
and interactive visualizations. The paper describes, through a case study
of a relocatable classroom, how the method enables design teams to
make higher performing decisions with more confidence in less time.

1. Introduction
With respect to building performance analysis, typical architectural practice simulates a small number of design options and then uses human judgement to infer the best course of action. Parametric Analysis (PA), on the other hand,
varies all relevant input factors through significant ranges and thereby
develops large design spaces for exploration. Research shows that this can
lead to dramatically better building performance results (Clevenger &
Haymaker, 2012). However, PA can also be time and resource expensive, so
researchers have explored optimization methods that can search through large
parametric spaces in a fraction of the time.

For example, Naboni et al. (2013) compared traditional simulation to PA and optimization using a genetic algorithm. They designed and built a project prototype of a 14 m2 test lab made from cross laminated timber, located at the School of Architecture in Copenhagen. They simulated ten design options in EnergyPlus (https://energyplus.net/) during design and built the most energy efficient option. After construction they analyzed a calibrated model, which revealed a total energy consumption of 98.6 kWh/m2. A PA was then performed in EnergyPlus with 11 factors and 139,968 individual runs. It was executed on a 256 core cluster and took 30 hours to run. Using the results, modifications were identified that would bring the energy consumption down to 8.5 kWh/m2. The improvement is dramatic but the cost in computational resources is high.

It should be noted that in discussing multi-objective optimization it is important to distinguish between multi-disciplinary optimization and single discipline multi-objective optimization. Many authors referring to multi-objective optimization mean the single discipline optimization of thermal performance responses such as annual heating load, annual cooling load and HVAC performance (Chlela et al., 2009; Magnier & Haghighat, 2010). A multi-disciplinary optimization gives a more holistic understanding of the building performance factors in relation to overall human experience, economic, and environmental requirements, and is therefore more desirable. It requires a careful selection of input factors, a weighting of the responses used in the optimization value function, and coordination of simulation results from different software platforms. In this paper the multi-disciplinary study optimizes thermal performance, daylight factor, direct line of sight, view quality, life cycle cost and carbon dioxide emission.

The paper describes the use of design of experiments (DoE) in conjunction with objective weighting and data and geometry visualization, to understand how these methods can improve on the PA process. The DoE is seen as a filtering mechanism that allows for drastic reductions in the design space to be explored by estimating optimal values of the input factors. By combining the DoE method with a full factorial analysis of a reduced design space, the authors explore the extent to which it is possible to maintain the rigor of analytical search with the accuracy of parametric analysis using a practical amount of computational resources. In order to distinguish this effort from other similar research, five aspects of PA have been identified as important for meeting the requirements of rigor, accuracy and practicality:

Multi-disciplinary - Building performance analysis problems are complex and multi-faceted, and good solutions need to recognize the multi-dimensionality of the problem.

Prioritized - Not all responses are equally important in a multi-objective circumstance. The PA must prioritize what are considered more important design goals for a project over other less important criteria.

Visual Data - The output, particularly to an architectural audience, cannot simply be numbers and statistical metrics. Architects require a visual exploratory interface that permits them to interactively examine ranges of alternatives in the proposed solution.

Visual Geometry - The outputs must also retain a link to parametric design geometry. Architects must be able to see the impact of building performance decisions on the physical geometry of their building design.

Efficient - Finally, the process needs to be computationally feasible, which means, by today's computational standards, it does not extend much beyond a few thousand simulation runs.

Table 1 summarizes how the literature we review below addresses each of these five aspects, and compares them to the Multi-Objective Parametric Experiments Design (MOPED) workflow.

2. Literature Review

Dhariwal & Banerjee (2017) observe that multi-objective optimization methods rely on exhaustive search, which can be computationally intensive. They propose the use of surrogate models, specifically response surface models (a form of DoE), that use a limited number of simulation runs. In a case study of a three-story office building in New Delhi they found that the surrogate model approach reduced simulation time substantially at the cost of incurring up to 10% in prediction error. Ritter et al. (2015) also use response surface methods for optimizing a design space to support decision making. They note that current methods lack acceptable interfaces for designers to interact with. They propose generating a parametric geometric model in Autodesk Dynamo, defining input parameters to describe the design space, creating a DoE in Matlab (https://www.mathworks.com/products/matlab.html) to rapidly calculate the response values based on the input parameters, and then outputting the results into a parallel coordinates plot for interactive visualization.

Table 1. Comparison of literature review sources. Each source is assessed against the five aspects introduced above: Multi-Disciplinary, Weighted Responses, Interactive Visualization, Parametric Geometry, and Small Number of Runs. Sources compared: Dhariwal & Banerjee; Ritter et al.; Sadeghifam et al.; Qian & Lee; Pratt & Bosworth; Jabi; Chlela et al.; Magnier & Haghighat; Flager et al.; Iwaro et al.; Lin & Gerber; Khalafallah & El-Rayes; Shi & Yang; Granadeiro et al.; and the MOPED workflow.
Sadeghifam et al. (2015) observe that altering a combination of factors yielded a 36% reduction in annual energy consumption compared to altering a single factor. This points to the benefit of a PA approach. They developed a Revit Architecture (https://www.autodesk.com/products/revit-family/architecture) model, ran a baseline energy analysis based on the Revit model, identified significant factors and their ranges, and then used a DoE to select building envelope materials in order to optimize thermal performance. Jabi (2014) sought to better harmonize the outputs of parametric geometric modeling with the input requirements of building performance analysis by developing DSOS, a software framework. DSOS used Autodesk DesignScript (https://www.autodeskresearch.com/publications/designscript), OpenStudio (https://www.openstudio.net/) and EnergyPlus scripts and files to output results as color overlays on parametric geometry.

Qian and Lee (2014) sought to determine energy consumption in small commercial buildings using a mixed-level factorial design (another form of DoE). They used Trane Trace 700 (http://www.trane.com) for simulation and Minitab 17 (http://www.minitab.com/en-us/) for statistical analysis. They analyzed a small commercial building at Morgan State University and found a potential saving of 16.6% of total energy consumption. Pratt and Bosworth (2011) proposed combining parametric methods with high throughput energy analysis methods. They developed sustainParametrics and exportZones as Ruby plugins for SketchUp (http://www.sketchup.com/) to create parametric models. They simulated the models in EnergyPlus to produce building energy use metrics. A large study of 34,398 runs was simulated, and an interactive visualization interface, including parallel coordinate plots, was used to visualize the results.

Chlela et al. (2009) acknowledged that parametric studies can help designers choose optimal solutions but noted that such studies can be complicated and time consuming due to the large number of runs. They proposed that DoE can simplify parametric studies by significantly reducing the required number of experiments or simulations. In a case study a DoE reduced a 177,147-run three-level factorial design to between 200 and 377 runs. Magnier & Haghighat (2010) argued for the use of multi-objective optimization with DoE and artificial intelligence. They acknowledged that a shortcoming of genetic algorithms is the need for thousands of evaluations to reach optimal solutions. They proposed using response surface methods with genetic algorithms to reduce the computational time while maintaining good accuracy.

Flager et al. (2009) recognized that multi-disciplinary analysis has not been
fully realized in practice because current tools and processes do not support
the generation and evaluation of a large number of alternatives. They
observed that researchers in aerospace and automotive industries have
developed methods for multi-disciplinary design optimization. They
proposed to apply these methods to the parametric modeling of a single
classroom building case study. They used a multi-disciplinary process with
parallel coordinates plot as an interactive visualization, and genetic
algorithms and design of experiments to reduce the size of the design spaces.

Iwaro et al. (2014) described the importance of weighting and selection
criteria for the sustainable performance assessment of building envelopes.
They developed an integrated criteria weighting framework and used it to
evaluate the most sustainable performance design alternative for building
envelopes in the Caribbean.

Lin & Gerber (2014) argued that multi-objective optimization methods are an effective means to overcome the limitations of current performance based design processes. They proposed a multi-objective
design framework (EEPFD) that uses a genetic algorithm to optimize spatial
compliance, construction cost and energy performance. The use of a genetic
algorithm drastically reduces simulation cycle time. Khalafallah & El-Rayes
(2011) also addressed the question of multi-objective optimization with
specific reference to airport layouts. They used genetic algorithms to
optimize layouts for construction safety, construction related aviation safety
and airport security, and overall site layout costs. They also recognized the
need for an interactive visualization interface.

Shi & Yang (2013) understood that performance driven design takes a holistic view of buildings, ensuring ecological and environmental performance without overlooking design and aesthetics. However, conventional architectural design methodology faces several problems: analytical models are difficult to obtain, models need both geometric and simulative inputs, and the representation needs of design documentation are often at odds with the needs of performance analysis. They proposed Rhinoceros (https://www.rhino3d.com/) and Grasshopper (http://www.grasshopper3d.com/) integrated with Ecotect (http://ecotect.com/), Radiance (https://www.radiance-online.org/) and EnergyPlus as a platform for performance driven design that addresses these problems. Granadeiro et al. (2012) cautioned that attention to environmental aspects in architectural design can lead to neglect of other qualities such as aesthetics. Their research recognized two challenges: first, how to improve the design while respecting compositional principles, and second, the time consuming task of modeling design alternatives for energy simulation. They used shape grammars as a generative design tool that respects encoded design intent, which they integrated with EnergyPlus to perform energy simulation of each design iteration.

It is seen that while many authors acknowledge the importance of PA and multi-disciplinary analysis, they are aware that PA is computationally demanding. Many seek to use DoE, in isolation or in combination with other methods, to reduce the computational burden. This paper has similar motivations. Recognizing that the simulation design space must be reduced as a practical matter, it describes a framework for performing a weighted, multi-objective optimization which uses a DoE to reduce the design space. The framework incorporates parametric geometric modeling and interactive visualization interfaces. The research compares the quality of reduced samples of simulations specified from DoE optimization to samples specified by a designer's intuition and samples of randomly specified simulations. It is shown that the DoE based method significantly improves the mean and range of the value function of the reduced sample.

3. Background to the Design of Experiments


According to the US National Institute of Standards and Technology (NIST), design of experiments is a systematic, rigorous approach to engineering problem-solving that uses statistical methods to derive valid engineering conclusions under the constraint of minimal expenditure of engineering resources (Croarkin & Tobias, 2017). There are four general problem areas in which DoE is applied: comparative assessment of experimental outputs, screening for important factors, modeling solutions and optimization of the problem space. DoE is widely applied to optimization problems in several fields, including building performance analysis.

DoE uses statistical methods to discover the optimum value in a large problem space by methodically sampling the space and then interpolating between sampled values to obtain estimates of non-sampled values. This estimated problem space can then be optimized, to a high degree of accuracy, much faster than experimenting over the full problem space. The concept can be illustrated by a simple example using a least squares model for a two-factor experiment (Dunn, 2017). Suppose an experiment wishes to optimize an outcome O, and that there are two factors affecting this outcome, A and B. A ranges from A1 to A2 while B ranges from B1 to B2. This is a two-factor experiment with two levels, and an example table of outcomes (the numerical values are purely for illustration) is shown in Table 2.

From this sample of 4 experiments we can estimate any value of the outcome using the equation O = 67 + 10 xA + 4 xB. The equation is obtained by noting that the average of all outcomes is (52 + 74 + 62 + 80)/4 = 67, the average effect of factor A, going from 0 to +1 in coded units, is (18 + 22)/4 = 10, and the average effect of factor B, going from 0 to +1, is (6 + 10)/4 = 4. With this estimate we can use software to quickly calculate an optimum value of the outcome, and the corresponding factor levels, without performing any further experiments.

Table 2. Example of DoE outcome table.

Standard Order Run order A B Outcome (O)


1 2 - - 52
2 4 + - 74
3 1 - + 62
4 3 + + 80

The general equation for this experiment, including the interaction term, can be written:

O = b0 + bA xA + bB xB + bAB xA xB

where xA is the coded value for factor A, xB is the coded value for factor B, xA xB is the interaction term and the bi are coefficients to be calculated. We can describe the four experiments as follows:

O1 = b0 + bA(-1) + bB(-1) + bAB(-1)(-1)
O2 = b0 + bA(+1) + bB(-1) + bAB(+1)(-1)
O3 = b0 + bA(-1) + bB(+1) + bAB(-1)(+1)
O4 = b0 + bA(+1) + bB(+1) + bAB(+1)(+1)

This reduces to the matrix equation:

[O1]   [1  -1  -1  +1] [b0 ]
[O2] = [1  +1  -1  -1] [bA ]
[O3]   [1  -1  +1  -1] [bB ]
[O4]   [1  +1  +1  +1] [bAB]

or y = Xb. The least squares solution is b = (X^T X)^-1 X^T y, where (X^T X)^-1 is the inverse of the matrix X^T X and X^T is the transpose of X. This can be solved and optimized using statistical software. The example can be extended to experiments with several factors and several levels, the statistical calculations getting progressively more complex. Fortunately there are several commercial and open source tools capable of performing DoE calculations.
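As an illustration of the calculation just described (not part of the authors' toolchain), the least squares fit for the Table 2 data can be reproduced in a few lines of Python; NumPy is assumed here purely for demonstration.

```python
import numpy as np

# Design matrix for the 2x2 factorial in Table 2: columns are
# intercept, coded factor A, coded factor B, and the A*B interaction.
X = np.array([
    [1, -1, -1, +1],   # run 1: A low,  B low,  outcome 52
    [1, +1, -1, -1],   # run 2: A high, B low,  outcome 74
    [1, -1, +1, -1],   # run 3: A low,  B high, outcome 62
    [1, +1, +1, +1],   # run 4: A high, B high, outcome 80
])
y = np.array([52, 74, 62, 80])

# Least squares solution b = (X^T X)^-1 X^T y
b = np.linalg.solve(X.T @ X, X.T @ y)
print(b)  # [67. 10.  4. -1.] -> O ~ 67 + 10*xA + 4*xB - 1*xA*xB

# Predict the outcome at any coded setting, e.g. A = +1, B = +1
print(X[3] @ b)  # 80.0
```

The recovered coefficients match the averages computed above (intercept 67, main effects 10 and 4), plus a small interaction term of -1.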

4. Description of the MOPED Process

The MOPED process involves the following steps (Figure 1):

1. Develop design objectives


2. Establish input factors and ideal ranges
3. Use a DoE to reduce the design space
4. Run a full factorial PA on the reduced design space
5. Use a value function to optimize the responses
6. Use a parallel coordinates plot to visualize the results

The toolkit for this process includes a custom Grasshopper definition for receiving inputs, coordinating experimental runs with the simulation platforms and pushing output results to Flux.io (https://flux.io/) (Figure 2). Grasshopper performs parametric geometric modeling, with Rhinoceros used for geometric visualization; thermal and daylight analyses are run in EnergyPlus and Radiance using the Honeybee and Ladybug plugins; life cycle cost estimation uses RS Means (https://www.rsmeans.com/) building cost data; and carbon dioxide emission is based on the total thermal energy values. JMP 13.0 (https://www.jmp.com/en_us/home.html), a statistical package from SAS Institute (https://www.sas.com/en_ca/home.html), is used for DoE analysis, and a custom parallel coordinates plot interface, developed at Perkins+Will, is used for visualizing the high dimensional results data of the experiment.

5. Case Study - Sprout Space

Sprout Space (Figure 3) is a research initiative focused on developing a high performance modular and portable classroom (Perkins+Will, 2017). The prototype is a 1,000 square foot pre-engineered and pre-built design aimed, in part, at solving the problems of poor daylighting, views, and energy efficiency experienced in portable classrooms. It was used for exploring the MOPED workflow.

Figure 1. The MOPED process.

Figure 2. The toolkit for the MOPED process.

In addition to exploring MOPED, the paper presents a series of Sprout Space experiments to compare the quality of the design spaces generated by a number of exploration methodologies: random sampling (used as a baseline), a designer's intuition, and DoE based methods. The metrics for comparison were the means and ranges of the respective samples. The best performing approach would have a high mean and a range of values clustered around high values of the value function compared to the other approaches. The discussion also focuses on the time and effort needed to implement and interpret these design spaces.

This means that the Sprout Space experiments described here included a few more steps than the MOPED process shown in Figure 1. In particular, Sprout Space included a step for using random selection to reduce the design space, a step for using the designer's intuition to reduce the design space, and a final step to compare the outcomes of the three sampling approaches.

Figure 3. The Sprout Space case study.

1. Develop Design Objectives

Design objectives were developed by a team of stakeholders, designers and decision makers. They included goals, indicators or responses, metrics, and preferences or weights. For this paper the goals were identified as minimizing total thermal energy, maximizing daylight factor, maximizing direct line of sight, maximizing view quality, minimizing life cycle cost and minimizing carbon dioxide emissions. It was recognized that in reality not every goal is as important as every other; therefore weights were assigned to the goals as indicated in Table 3.

Table 3. Responses with weights.

Response Weight
Total Thermal Energy 25%

Daylight Factor 15%

Direct Line of Sight 10%

View Quality 10%

Life Cycle Cost 25%

Carbon Dioxide Emission 15%

2. Establish factors and ideal ranges

Based on the defined objectives, design variables relevant to the evaluation of these objectives were first identified intuitively by the research team. Ranges for each of the variables were then chosen, together with the step values needed for a rigorous exploration of the design space. Some of these variables were geometric, like the size of the overhang, while others were material properties, like the construction assembly. Some assumptions and constraints were involved in choosing these factors and their ranges. For example, assumptions included holding some factors constant, such as the roof construction, while constraints included limits on the building geometry dictated by site restrictions. For this paper there were 12 x 5 x 6 x 9 x 5 x 4 = 64,800 combinations of the factors (Table 4), which is computationally impractical on easily accessible resources.

3. Perform a reduced parametric analysis based on randomly selected input values

In order to reduce the number of simulations required, a reduced set of input values was randomly selected from the full design space. It was not expected that the random sampling approach would give the best performance, but it was seen as a useful baseline for comparison. In order to generate the random sample space, we generated the full space of 64,800 runs in JMP and then used the table subset feature to define a random sample of 1296 runs.

Table 4. Ideal ranges for input factors. Unless otherwise stated, numbers in parentheses are unit conversions to meters; numbers in square brackets are statistical codes.

Input | Range | Number of Options
Orientation | 0, 30, 60, 90, 120, 150, 180, 210, 240, 270, 300, 330 | 12
Window Width | 10 (3.0), 15 (4.6), 20 (6.0), 25 (7.6), 30 (9.1) | 5
Overhang Depth | 0 (0), 1 (0.3), 2 (0.6), 3 (0.9), 4 (1.2), 5 (1.5) | 6
Roof Angle | 1, 2, 3, 4, 5, 6, 7, 8, 9 | 9
Offset | 4 (1.2), 8 (2.4), 12 (3.7), 16 (4.9), 20 (6.0) | 5
Construction Assemblies | GFRC [1], SIP [2], CLT [3], PCP [4] | 4
Total | | 64,800
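As a rough illustration of this sampling step (the authors used JMP's table subset feature, not the code below), the full design space in Table 4 can be enumerated and randomly subsampled in Python; the factor levels are transcribed from Table 4 and the sample size of 1296 matches the study.

```python
import itertools
import random

# Factor levels transcribed from Table 4 (imperial values only).
factors = {
    "orientation": [0, 30, 60, 90, 120, 150, 180, 210, 240, 270, 300, 330],
    "window_width": [10, 15, 20, 25, 30],
    "overhang_depth": [0, 1, 2, 3, 4, 5],
    "roof_angle": [1, 2, 3, 4, 5, 6, 7, 8, 9],
    "offset": [4, 8, 12, 16, 20],
    "construction": ["GFRC", "SIP", "CLT", "PCP"],
}

# Full factorial design space: 12 * 5 * 6 * 9 * 5 * 4 = 64,800 combinations.
full_space = list(itertools.product(*factors.values()))
print(len(full_space))  # 64800

# Random subset of 1296 runs, analogous to JMP's table subset feature.
random.seed(0)  # fixed seed so the subset is reproducible
sample = random.sample(full_space, 1296)
print(len(sample))  # 1296
```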

4. Perform a reduced parametric analysis based on intuitively selected input values

Next, we asked a designer to examine the full sample space in Table 4 and intuitively select reduced ranges for each of the inputs that, taken together, they thought would yield high values of the value function (Table 5). This is not an easy task for a human designer because of the multi-objective nature of the problem, with some of the objectives conflicting and with potential interactions between inputs that are not easy to predict.

Table 5. Target ranges based on intuitive selection of inputs by a human designer.

Input | Target Range | Number of Options
Orientation | 0, 30, 60, 270, 300, 330 | 6
Window Width | 10 (3.0), 20 (6.0), 30 (9.1) | 3
Overhang Depth | 0 (0), 5 (1.5) | 2
Roof Angle | 1, 5, 9 | 3
Offset | 4 (1.2), 12 (3.7), 20 (6.0) | 3
Construction | GFRC [1], SIP [2], CLT [3], PCP [4] | 4
Total | | 1296

5. Perform a reduced parametric analysis based on using a DoE to filter the design space

Next, a DoE was set up in JMP software to help reduce the size of the design space. The DoE was set up with reasonable defaults to report estimated optimum values for each of the factors. A realistic, reduced design space was constructed centered around these estimates. The full factorial PA was run on this reduced space (Table 6). The DoE process involved the five steps described below.

First, input factors and step values were entered into a DoE Custom Design in JMP, and a response called Value Function was defined which held the sum of the weighted response factors. Second, reasonable interaction and second order terms were defined, and the DoE was set to a reasonable number of runs, 32 in this case. Third, JMP was used to design the experiment. Note that the Value Function is not filled out at this point; it receives the results from the 32 simulation runs performed in Grasshopper. Fourth, the 32 run DoE was simulated in Grasshopper, and a value function was created from the results and entered into JMP (see Step 7 for the value function discussion). This simulation took 20 minutes for the Sprout Space case study on a Lenovo Yoga laptop with an Intel Core i7-6600U (2.6 GHz) processor and 16 GB of memory. Finally, the DoE was run and prediction profiler plots obtained with the estimates of the optimum values of the factors (Figure 4). These estimates are used in constructing the full factorial design space of Table 6.

Table 6. Target ranges of input factors for the DoE based full factorial experiment.

Input | Target Range | Number of Options
Orientation | 0, 30, 60, 270, 300, 330 | 6
Window Width | 15 (4.5), 20 (6.1), 25 (7.6) | 3
Overhang Depth | 3 (0.9), 4 (1.2), 5 (1.5) | 3
Roof Angle | 1, 2, 8, 9 | 4
Offset | 12 (3.6), 16 (4.8), 20 (6.1) | 3
Construction | SIP, CLT | 2
Total | | 1296

Figure 4. Prediction profiler plots from JMP.
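JMP's custom design and prediction profiler are proprietary, but the surrogate-model idea behind this step can be sketched in Python: fit a second order response surface to the 32 simulated runs and report the factor settings that maximize the predicted value function. The sketch below is hypothetical: it uses randomly generated placeholder run data in place of the Grasshopper results and, for simplicity, handles only the numeric factors, leaving out the categorical construction assembly.

```python
import itertools
import numpy as np

# Hypothetical 32-run DoE results: each row is (orientation, window width,
# overhang depth, roof angle, offset) plus a simulated value-function score.
# In the actual workflow these would come from the Grasshopper simulations.
rng = np.random.default_rng(1)
X_runs = rng.uniform([0, 10, 0, 1, 4], [330, 30, 5, 9, 20], size=(32, 5))
y_runs = rng.uniform(30, 80, size=32)  # placeholder value-function scores

def quadratic_features(X):
    """Intercept, main effects, squared terms and two-way interactions."""
    n = X.shape[1]
    cols = [np.ones(len(X))]
    cols += [X[:, i] for i in range(n)]
    cols += [X[:, i] ** 2 for i in range(n)]
    cols += [X[:, i] * X[:, j] for i in range(n) for j in range(i + 1, n)]
    return np.column_stack(cols)

# Least squares fit of the response surface (the surrogate model).
beta, *_ = np.linalg.lstsq(quadratic_features(X_runs), y_runs, rcond=None)

# Evaluate the surrogate over a coarse grid of candidate settings and pick
# the best one, playing the role of the prediction profiler's estimated optimum.
grid = np.array(list(itertools.product(
    [0, 30, 60, 270, 300, 330],   # orientation
    [10, 15, 20, 25, 30],         # window width
    [0, 1, 2, 3, 4, 5],           # overhang depth
    [1, 3, 5, 7, 9],              # roof angle
    [4, 8, 12, 16, 20],           # offset
)))
pred = quadratic_features(grid) @ beta
print("estimated optimum factor levels:", grid[np.argmax(pred)])
```

The estimated optimum then anchors a reduced full factorial space such as Table 6, which is subsequently simulated in full.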

6. Run a full factorial PA on the reduced design spaces

We next used the random sample, the intuitively selected ranges and the DoE based estimates to create realistic ranges for a full factorial PA in the reduced space of promising designs with a computationally feasible number of runs, 1296 in this case. Each of these 1296-run full factorial analyses took 12 hours on the Lenovo laptop. It is estimated that 64,800 runs would have taken 640 hours, or almost a month.

7. Use a value function to optimize the responses.

After generating the PA results, we created a value function by normalizing the responses, weighting them according to the preferences indicated when developing the objectives, and inverting any antagonistic objectives. The responses Total Thermal Energy, Life Cycle Cost and Carbon Dioxide Emission all needed to be inverted. The value function is a summation of the normalized, inverted (where applicable), and weighted response factors. The optimization seeks to maximize the value function. The normalized value function is computed as follows:

TTE = Total Thermal Energy
DF = Daylight Factor
DLS = Direct Line of Sight
VQ = View Quality
LCC = Life Cycle Cost
CDE = Carbon Dioxide Emission
RV = Response Value
VF = Value Function

To normalize the response value for each response:

RV normalized = RVn = (RV - RVmin)/(RVmax - RVmin)
RV inverted = RVi = 1 - RVn
RV weighted = RVw = weight x RVi (or RVn), where the weight is based on Table 3

RVmin = minimum response value for the sample
RVmax = maximum response value for the sample

And then:

VF = TTEw + DFw + DLSw + VQw + LCCw + CDEw
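A minimal Python sketch of this scoring step is shown below. It assumes the simulation results have been collected into per-response lists and uses the weights from Table 3; the variable and function names are illustrative, not part of the authors' Grasshopper definition.

```python
# Weights from Table 3; responses marked invert=True are antagonistic
# (lower raw values are better), so their normalized values are flipped.
RESPONSES = {
    "total_thermal_energy": {"weight": 0.25, "invert": True},
    "daylight_factor":      {"weight": 0.15, "invert": False},
    "direct_line_of_sight": {"weight": 0.10, "invert": False},
    "view_quality":         {"weight": 0.10, "invert": False},
    "life_cycle_cost":      {"weight": 0.25, "invert": True},
    "co2_emission":         {"weight": 0.15, "invert": True},
}

def value_function(results):
    """results: dict mapping response name -> list of raw values, one per run.
    Returns the weighted value function VF for every run in the sample."""
    n_runs = len(next(iter(results.values())))
    vf = [0.0] * n_runs
    for name, spec in RESPONSES.items():
        values = results[name]
        rv_min, rv_max = min(values), max(values)  # in-sample normalization
        span = (rv_max - rv_min) or 1.0            # guard against a flat response
        for i, rv in enumerate(values):
            rvn = (rv - rv_min) / span                   # RVn
            rvi = 1.0 - rvn if spec["invert"] else rvn   # RVi where applicable
            vf[i] += spec["weight"] * rvi                # RVw summed into VF
    return vf
```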

8. Use a parallel coordinates plot to visualize data and Rhinoceros to visualize geometry

Finally, we visualized the responses and the value function in a parallel coordinates plot of the full factorial PA (Figure 5). This enables the architect to explore ranges of options and understand their impact on individual responses as well as on the value function. The optimized outputs can also be visualized in the context of the geometric model in Rhinoceros (Figure 6).

Figure 5. Parallel coordinates plot for interactively exploring analysis results.

6. Results

Three 1296-run PA simulations were performed in the Sprout Space experiment. First, a 1296-run PA simulation based on random selection was performed. Second, a 1296-run PA simulation based on the intuitive selection of inputs by a designer was performed (Table 5). Third, a 32-run simulation based on a DoE designed in JMP was run using the ideal ranges in Table 4, and then, based on its estimates, a full factorial 1296-run PA simulation was run on the DoE optimized ranges (Table 6).

Table 7. Comparison of Randomly Based runs to Intuitively Based runs and DoE Based runs (in value function units).

     | Randomly Based 1296-run PA | Intuitively Based 1296-run PA | DoE Based 1296-run PA
Min  | 31.24 | 26.64 | 46.45
Max  | 80.57 | 81.52 | 79.55
Mean | 56.62 | 55.55 | 62.74

A comparison of the range and the mean values of the value function from
the three simulations is shown in Table 7. A box plot comparison of the
value function range from the three samples is shown in Figure 7. It is seen
that the DoE reduces the range of the design spaces by eliminating low
quality options. It is also seen that the DoE improves the mean value of the
reduced design space.

Figure 6. Visualization of design geometry and performance analysis in Rhinoceros.

A single factor ANOVA across the three samples shows that the differences in the means are statistically significant (p < 0.001). Paired sample t-tests comparing the DoE based method with the randomly based and intuitively based methods respectively also show that the differences in means are statistically significant (p < 0.001).
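For readers who wish to reproduce these significance tests, the sketch below shows one way to run them with SciPy, assuming the three 1296-element value function arrays are available; the arrays generated here are placeholders, not the study's data, and SciPy is not the statistics package used by the authors.

```python
import numpy as np
from scipy import stats

# vf_random, vf_intuitive, vf_doe: value-function scores for the three
# 1296-run samples (placeholder data here; the real arrays come from
# the parametric analyses summarized in Table 7).
rng = np.random.default_rng(0)
vf_random = rng.normal(56.6, 8.0, 1296)
vf_intuitive = rng.normal(55.6, 9.0, 1296)
vf_doe = rng.normal(62.7, 5.0, 1296)

# Single-factor (one-way) ANOVA across the three samples.
f_stat, p_anova = stats.f_oneway(vf_random, vf_intuitive, vf_doe)

# Paired-sample t-tests of the DoE based sample against the other two,
# following the comparison described in the text.
t_rand, p_rand = stats.ttest_rel(vf_doe, vf_random)
t_int, p_int = stats.ttest_rel(vf_doe, vf_intuitive)

print(f"ANOVA p = {p_anova:.4f}, t-test vs random p = {p_rand:.4f}, "
      f"t-test vs intuitive p = {p_int:.4f}")
```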

To perform a comparison of the value functions obtained from the randomly selected, intuitively selected and DoE based simulations, a slightly different method of calculating the value function was required. This is because the method in Section 5 above provides for an in-sample normalization, comparing different responses within the same sample. To compare responses between samples, and ultimately to compare the value function between samples, a between-samples normalization is required. The modification to the definitions in Section 5 is as follows:

RVmin = minimum response value across all samples
RVmax = maximum response value across all samples

Figure 7. Comparison of Random Sample PA, Intuitive PA and DoE Based PA (in value
function units)

7. Conclusion

This paper describes the MOPED workflow for using DoE with PA. The benefit of PA for building performance simulations was indicated and the limitation of a high computational processing requirement was highlighted. A comparative literature review of sources that recognize the merits of both DoE and PA was presented; however, limitations with many of the approaches were observed. The use of the DoE for PA workflow on the Sprout Space case study was also described. The key benefit of the method is that it drastically reduces the time and resources needed for analysis. The method also allows for multi-disciplinary, multi-objective optimization, taking into account the weighting of responses in constructing a value function. A parallel coordinates plot tool for visualizing high dimensional data was introduced. The workflow remains integrated with the parametric geometric model, which can be visualized in Rhinoceros software. The results of comparing PA simulations based on random selection, intuitive selection and DoE optimization show the range and mean values of the value function to be significantly superior for the DoE based approach.

8. References

Chlela, F., Husaunndee, A., Inard, C., & Riederer, P. (2009). A New Methodology for the Design of Low Energy Buildings. Energy and Buildings, 41(9), 982-990.
Clevenger, C., & Haymaker, J. (2012). The Value of Design Strategies
Applied to Energy Efficiency. Smart and Sustainable Built
Environment, 1(3), 222-240.
Croarkin, C., & Tobias, P. (2017, March 7). Engineering Statistics
Handbook. Retrieved from NIST/SEMATECH e-Handbook of
Statistical Methods:
http://www.itl.nist.gov/div898/handbook/index.htm
Dhariwal, J., & Banerjee, R. (2017). An approach for building design
optimization using design of experiments. Building Simulation,
10(3), 323-336.
Dunn, K. (2017, March 7). Process Improvement Using Data. Retrieved
from https://learnche.org/pid/
Flager, F., Welle, B., Bansal, P., Soremekun, G., & Haymaker, J. (2009).
Multidisciplinary process integration and design optimization of a
classroom building. Journal of Information Technology in
Construction, 14(38), 595-612.
Granadeiro, V., Duarte, J., Correia, J., & Leal, V. (2012). Building envelope
shape design in early stages of the design process: Integrating
architectural design systems and energy simulation. Automation in
Construction, 196-209.
Iwaro, J., Abrahams, M., Rupert, W., & Ricardo, Z. (2014). An integrated
Criteria Weighting Framework for the sustainable performance
assessment and design of building envelope. Renewable and
Sustainable Energy Reviews, 417 - 434.
Jabi, W. (2014). Parametric spatial models for energy analysis in the early
design stages. Proceedings of the Symposium on Simulation for
Architecture & Urban Design (p. 16). Society for Computer
Simulation International.

Khalafallah, A., & El-Rayes, K. (2011). Automated multi-objective
optimization system for airport site layouts. Automation in
Construction, 313-320.
Lin, S.-H., & Gerber, J. (2014). Designing-in performance: A framework for
evolutionary energy performance feedback in early stage design.
Automation in Construction, 59-73.
Magnier, L., & Haghighat, F. (2010). Multiobjective optimization of
building design using TRNSYS simulations, genetic algorithms and
Artificial Neural Network. Building and Environment, 45(3), 739-
746.
Naboni, E., Maccarini, A., Korolija, I., & Zhang, Y. (2013). Comparison of
conventional, parametric and evolutionary optimization approaches
for the architectural design of nearly zero energy buildings. Building
Simulation.
Perkins+Will. (2017, May 25). Sprout Space. Retrieved from
http://www.sproutspace.com/
Pratt, K., & Bosworth, D. (2011). A Method for the Design and Analysis of Parametric Building Energy Models. 12th Conference of the International Building Performance Simulation Association (pp. 2499-2506). Sydney.
Qian, X., & Lee, S. (2014). The Design and Analysis of Energy Efficient Building Envelopes for the Commercial Buildings by Mixed-level Factorial Design and Statistical Methods. Middle Atlantic Section Proceedings. American Society for Engineering Education.
Ritter, F., Geyer, P., & Borrmann, A. (2015). Simulation-based decision-making in Early Design Stages. 32nd CIB W78 Conference (pp. 27-29). Eindhoven.
Sadeghifam, A., Zahraee, S., Meynagh, M., & Kiani, I. (2015). Combined
use of design of experiments and dynamic building simulation in
assessment of energy efficiency in tropical residential buildings.
Energy and Buildings, 86, 525-533.
Shi, X., & Yang, W. (2013). Performance-driven architectural design and
optimization technique from a perspective of architects. Automation
in Construction, 125-135.

