Research Papers: Design Automation

Statistical Surrogate Formulations for Simulation-Based Design Optimization

Author and Article Information
Bastien Talgorn

GERAD,
Montréal, Québec H3T 2A7, Canada
Department of Mechanical Engineering,
McGill University,
Montréal, Québec H3A 0C3, Canada

Sébastien Le Digabel

GERAD,
Montréal, Québec H3T 2A7, Canada
Département de Mathématiques
et Génie Industriel,
École Polytechnique de Montréal,
Montréal, Québec H3C 3A7, Canada

Michael Kokkolaras

GERAD,
Montréal, Québec H3T 2A7, Canada
Department of Mechanical Engineering,
McGill University,
Montréal, Québec H3A 0C3, Canada

There may be situations where the properties of the objective function or some of the constraints do not require the construction and use of surrogate models, e.g., if one of these functions is smooth and inexpensive and has an analytical expression.

Contributed by the Design Automation Committee of ASME for publication in the JOURNAL OF MECHANICAL DESIGN. Manuscript received February 18, 2014; final manuscript received September 9, 2014; published online December 15, 2014. Assoc. Editor: Gary Wang.

J. Mech. Des. 137(2), 021405 (Feb 01, 2015) (18 pages) Paper No: MD-14-1128; doi: 10.1115/1.4028756 History: Received February 18, 2014; Revised September 09, 2014; Online December 15, 2014

Typical challenges of simulation-based design optimization include unavailable gradients and unreliable approximations thereof, expensive function evaluations, numerical noise, multiple local optima, and the failure of the analysis to return a value to the optimizer. One possible remedy for alleviating these issues is to use surrogate models in lieu of the computational models or simulations, together with derivative-free optimization algorithms. In this work, we use the R dynaTree package to build statistical surrogates of the blackboxes, and the mesh adaptive direct search (MADS) algorithm for derivative-free optimization. We present different formulations for the surrogate problem (SP) considered at each search step of MADS within a surrogate management framework. The proposed formulations are tested on 20 analytical benchmark problems and two simulation-based multidisciplinary design optimization (MDO) problems. Numerical results confirm that the use of statistical surrogates in MADS improves the efficiency of the optimization algorithm.
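The surrogate management framework described in the abstract can be sketched in a few lines of Python. This is a simplified illustration, not the paper's implementation: the dynaTree statistical surrogate is replaced by a placeholder inverse-distance-weighted predictor, the SP formulations are replaced by a naive "pick the candidate with the best surrogate value" search step, and the objective is a hypothetical cheap stand-in for an expensive blackbox. Only the overall loop structure (search step on the surrogate, poll step on the mesh, mesh size adapted on success or failure) mirrors MADS.

```python
import random

# Hypothetical stand-in for an expensive blackbox simulation.
def objective(x):
    return (x[0] - 1.0) ** 2 + (x[1] + 0.5) ** 2

def surrogate(cache, x, eps=1e-12):
    # Placeholder surrogate: inverse-distance-weighted prediction from
    # all cached (point, value) pairs. The paper uses dynaTree
    # regression here instead.
    num = den = 0.0
    for px, fy in cache:
        d2 = sum((a - b) ** 2 for a, b in zip(px, x)) + eps
        w = 1.0 / d2
        num += w * fy
        den += w
    return num / den

def surrogate_managed_search(x0, budget=200, seed=0):
    # Simplified MADS-like loop: each iteration runs a SEARCH step
    # (rank a few random candidates by surrogate value and evaluate the
    # most promising one on the true blackbox) followed, on failure, by
    # a POLL step over coordinate directions. The mesh size delta
    # shrinks on failure and grows (capped) on success.
    rng = random.Random(seed)
    x = list(x0)
    fx = objective(x)
    cache = [(tuple(x), fx)]
    evals, delta = 1, 1.0
    while evals < budget and delta > 1e-6:
        candidates = [[xi + delta * rng.uniform(-2.0, 2.0) for xi in x]
                      for _ in range(20)]
        search_pt = min(candidates, key=lambda c: surrogate(cache, c))
        trial_points = [search_pt]
        for i in range(len(x)):           # poll: +/- each coordinate
            for s in (1.0, -1.0):
                p = list(x)
                p[i] += s * delta
                trial_points.append(p)
        improved = False
        for p in trial_points:
            if evals >= budget:
                break
            fp = objective(p)
            evals += 1
            cache.append((tuple(p), fp))
            if fp < fx:
                x, fx, improved = list(p), fp, True
                break                     # opportunistic: stop at first success
        delta = min(2.0 * delta, 1.0) if improved else 0.5 * delta
    return x, fx
```

Every true evaluation feeds the surrogate's cache, so the search step becomes better informed as the run progresses; this cache-and-refit pattern is the essence of surrogate management, whatever model family is plugged in.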

Copyright © 2015 by ASME


Figures

Fig. 1: Optimization algorithm

Fig. 2: dynaTree regression on 24 data points in R

Fig. 3: Probability of feasibility and uncertainty in feasibility. μ(x) is maximal for x = 4, where ĉ(x) = 0. In the neighborhood of x = 7, despite the sharp variation in c, the feasibility is predictable, so μ(x) is small.

Fig. 4: Data and performance profiles for the set of analytical problems. MADS and Quad are used as a reference; ((SP1-Fσ), λ = 1.0) and ((SP3-EIσ), λ = 0.01) are the formulations with the best and worst mean deviation, respectively. (a) Data profile, τ = 10−1; (b) data profile, τ = 10−3; (c) data profile, τ = 10−7; (d) performance profile after 1000np evaluations.

Fig. 5: Data and performance profiles for the simplified wing MDO problem. MADS and Quad are used as a reference; ((SP3-EIσ), λ = 0.01) and ((SP3-EIσ), λ = 1.0) are the formulations with the best and worst mean deviation, respectively. (a) Data profile, τ = 10−1; (b) data profile, τ = 10−3; (c) data profile, τ = 10−7; (d) performance profile after 7,000 evaluations.

Fig. 6: Data and performance profiles for the aircraft-range MDO problem. MADS and Quad are used as a reference; ((SP5-EFσ), λ = 0.1) and ((SP5-EFIσ), λ = 1.0) are the formulations with the best and worst mean deviation, respectively. (a) Data profile, τ = 10−1; (b) data profile, τ = 10−3; (c) data profile, τ = 10−7; (d) performance profile after 10,000 evaluations.

Fig. 7: Comparison of different surrogate modeling methods using four test functions (left column: models; right column: absolute errors)
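The quantities described in the Fig. 3 caption can be illustrated with a short sketch. Assuming the constraint surrogate provides a Gaussian predictive distribution with mean ĉ(x) and standard deviation s(x) (the paper obtains its predictive distribution from dynaTree, and its exact definitions of these quantities may differ), the probability of feasibility is P[c(x) ≤ 0] = Φ(−ĉ(x)/s(x)), and one natural feasibility-uncertainty measure, μ = P(1 − P), is maximal exactly where ĉ(x) = 0, matching the caption's description.

```python
import math

def prob_feasibility(c_hat, s):
    # P[c(x) <= 0] under an assumed Gaussian predictive distribution
    # with mean c_hat(x) and standard deviation s(x) > 0:
    # Phi(-c_hat / s), written via erf.
    return 0.5 * (1.0 - math.erf(c_hat / (s * math.sqrt(2.0))))

def uncertainty_mu(c_hat, s):
    # One common uncertainty measure: mu = P * (1 - P).
    # It peaks at 0.25 when P = 0.5, i.e., when c_hat(x) = 0,
    # and is small wherever feasibility is confidently predicted.
    p = prob_feasibility(c_hat, s)
    return p * (1.0 - p)
```

Near a sharp but well-resolved constraint boundary (the x = 7 region in Fig. 3), s(x) is small relative to |ĉ(x)|, so P is close to 0 or 1 and μ stays small, even though c itself varies rapidly.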

