Research Papers

Maximizing Design Confidence in Sequential Simulation-Based Optimization

Author and Article Information
Jing Li

e-mail: li2@oakland.edu

Zissimos P. Mourelatos

e-mail: mourelat@oakland.edu
Mechanical Engineering Department,
Oakland University,
Rochester, MI 48309

Michael Kokkolaras

Department of Mechanical Engineering,
McGill University,
Montreal, Quebec H3A 0C3
e-mail: michael.kokkolaras@mcgill.ca

Panos Y. Papalambros

e-mail: pyp@umich.edu
Department of Mechanical Engineering,
University of Michigan,
Ann Arbor, MI 48109

David J. Gorsich

U.S. Army, TARDEC,
Warren, MI 48397
e-mail: david.j.gorsich.civ@mail.mil

The design space is defined by the bounds of the design optimization variables, while the feasible space is defined by the intersection of all design constraints.
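The distinction above can be illustrated with a minimal sketch; the bounds and the single constraint below are hypothetical examples, not values from the paper:

```python
import numpy as np

# Minimal illustration: the design space is the box defined by the variable
# bounds; the feasible space is its subset on which every design constraint
# g_i(d) <= 0 also holds.

def in_design_space(d, lower, upper):
    """True if design d lies within the variable bounds (the design space)."""
    d = np.asarray(d, dtype=float)
    return bool(np.all(d >= lower) and np.all(d <= upper))

def in_feasible_space(d, lower, upper, constraints):
    """True if d is in the design space and satisfies all g_i(d) <= 0."""
    return in_design_space(d, lower, upper) and all(g(d) <= 0.0 for g in constraints)

# Hypothetical 2-D example: bounds [0, 10]^2, one constraint d1 + d2 <= 12.
lower, upper = np.zeros(2), np.full(2, 10.0)
g = [lambda d: d[0] + d[1] - 12.0]
print(in_feasible_space([5.0, 5.0], lower, upper, g))  # inside both spaces
print(in_feasible_space([8.0, 8.0], lower, upper, g))  # in design space only
```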

1Corresponding author.

Contributed by the Design Automation Committee of ASME for publication in the JOURNAL OF MECHANICAL DESIGN. Manuscript received June 24, 2012; final manuscript received March 12, 2013; published online June 10, 2013. Assoc. Editor: Timothy W. Simpson.

J. Mech. Des. 135(8), 081004 (Jun 10, 2013) (8 pages), Paper No. MD-11-1285; doi: 10.1115/1.4024470. History: Received June 24, 2012; Revised March 12, 2013

Computational simulation models support a rapid design process. Given uncertainty in model approximations and operating conditions, designers must have confidence that the designs obtained using simulations will perform as expected. The traditional approach to addressing this need consists of model validation efforts conducted predominantly prior to the optimization process. We argue that model validation is too daunting a task to be conducted with meaningful success for design optimization problems with high-dimensional design and parameter spaces. In contrast, we propose a methodology for maximizing confidence in designs generated during the simulation-based optimization process. Specifically, we adopt a trust-region-like sequential optimization process and utilize a Bayesian hypothesis testing technique to quantify model confidence, which we maximize by calibrating the simulation model within local domains if and when necessary. This ensures that the design iterates generated during the sequential optimization process are associated with maximized confidence in the utilized simulation model. The proposed methodology is illustrated using a cantilever beam design subject to vibration.
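The sequential process described in the abstract can be sketched as a loop over local (trust-region-like) domains: assess model confidence, calibrate the model locally when confidence is low, then optimize within the domain. The toy problem below is entirely illustrative; the "test" response, the biased model, and the exponential agreement metric are stand-ins for the paper's physical experiments and Bayesian hypothesis-testing formulation, not the authors' actual method.

```python
import numpy as np

# Toy, self-contained sketch of a confidence-driven sequential optimization.
# All functions here are hypothetical stand-ins for the paper's components.

def f_test(d):                        # "physical test" response (ground truth)
    return (d - 3.0) ** 2 + 1.0

def f_model(d, theta):                # simulation model with parameter theta
    return (d - theta) ** 2 + 1.0

def confidence(d, theta):
    # Stand-in for a Bayesian confidence metric: near 1 when model and
    # test agree at the current design, near 0 when they disagree.
    return float(np.exp(-abs(f_model(d, theta) - f_test(d))))

def calibrate(d, radius, theta):
    # Fit theta to test data sampled inside the local domain (grid search).
    pts = np.linspace(d - radius, d + radius, 11)
    thetas = np.linspace(theta - 5.0, theta + 5.0, 201)
    errs = [np.sum((f_model(pts, t) - f_test(pts)) ** 2) for t in thetas]
    return float(thetas[int(np.argmin(errs))])

def local_optimize(d, radius, theta):
    # Minimize the model response within the local domain (grid search).
    pts = np.linspace(d - radius, d + radius, 201)
    return float(pts[int(np.argmin(f_model(pts, theta)))])

def sequential_optimize(d0, radius=1.0, conf_target=0.9, max_iter=50):
    d, theta = float(d0), 1.0         # start with a deliberately biased model
    for _ in range(max_iter):
        if confidence(d, theta) < conf_target:
            theta = calibrate(d, radius, theta)   # calibrate only when needed
        d_new = local_optimize(d, radius, theta)
        if abs(d_new - d) < 1e-9:     # converged: local step no longer moves
            break
        d = d_new
    return d, theta

d_opt, theta_opt = sequential_optimize(0.0)
print(round(d_opt, 3), round(theta_opt, 3))
```

In this toy run the loop calibrates the model at the first iterate (where model and test disagree) and then marches the design toward the true optimum at d = 3, mirroring the idea that every design iterate is produced by a model whose local confidence has been checked and, if necessary, restored.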

Copyright © 2013 by ASME




Fig. 1: Notation and definition of local domains

Fig. 2: Cantilever beam of rectangular cross-section with tip point load f(t)

Fig. 3: Cantilever beam of rectangular cross-section for test and model (CAE) cases; the test uses a reduced cross-section close to the fixed end and the model assumes a pinned left end with a rotational spring constant kt

Fig. 4: Confidence value as a function of design variables d1 and d2

Fig. 5: Comparison of tip displacement between test and model at the initial design, before calibration

Fig. 6: Comparison of tip displacement between test and model at the initial design, after calibration

Fig. 7: Optimization process in the design space for noncalibrated and calibrated models

Fig. 8: Design optimization history for noncalibrated and calibrated models


