Conn, A. R., Scheinberg, K., and Vicente, L. N., 2009, *Introduction to Derivative-Free Optimization*, MOS/SIAM Series on Optimization, SIAM, Philadelphia, PA.

Audet, C., and Dennis, J. E., Jr., 2003, “Analysis of Generalized Pattern Searches,” SIAM J. Optim., 13(3), pp. 889–903.

Torczon, V., 1997, “On the Convergence of Pattern Search Algorithms,” SIAM J. Optim., 7(1), pp. 1–25.

Audet, C., and Dennis, J. E., Jr., 2006, “Mesh Adaptive Direct Search Algorithms for Constrained Optimization,” SIAM J. Optim., 17(1), pp. 188–217.

Cohn, D. A., 1996, “Neural Network Exploration Using Optimal Experimental Design,” Adv. Neural Inf. Process. Syst., 6(9), pp. 679–686.

Jones, D. R., 2001, “A Taxonomy of Global Optimization Methods Based on Response Surfaces,” J. Global Optim., 21(4), pp. 345–383.

Jones, D. R., Schonlau, M., and Welch, W. J., 1998, “Efficient Global Optimization of Expensive Black Box Functions,” J. Global Optim., 13(4), pp. 455–492.

Schonlau, M., Jones, D. R., and Welch, W. J., 1998, “Global Versus Local Search in Constrained Optimization of Computer Models,” New Developments and Applications in Experimental Design, Number 34 in IMS Lecture Notes—Monograph Series, Institute of Mathematical Statistics, Beachwood, OH, pp. 11–25.

Williams, B. J., Santner, T. J., and Notz, W. I., 2000, “Sequential Design of Computer Experiments to Minimize Integrated Response Functions,” Stat. Sinica, 10(4), pp. 1133–1152.

Queipo, N., Haftka, R., Shyy, W., Goel, T., Vaidyanathan, R., and Tucker, P. K., 2005, “Surrogate-Based Analysis and Optimization,” Prog. Aerosp. Sci., 41(1), pp. 1–28.

Simpson, T. W., Korte, J. J., Mauery, T. M., and Mistree, F., 2001, “Kriging Models for Global Approximation in Simulation-Based Multidisciplinary Design Optimization,” AIAA J., 39(12), pp. 2233–2241.

Bandler, J. W., Cheng, Q. S., Dakroury, S. A., Mohamed, A. S., Bakr, M. H., Madsen, K., and Sondergaard, J., 2004, “Space Mapping: The State of the Art,” IEEE Trans. Microwave Theory Tech., 52(1), pp. 337–361.

Forrester, A. I. J., and Keane, A. J., 2009, “Recent Advances in Surrogate-Based Optimization,” Prog. Aerosp. Sci., 45(1–3), pp. 50–79.

Booker, A. J., Dennis, J. E., Jr., Frank, P. D., Serafini, D. B., Torczon, V., and Trosset, M. W., 1999, “A Rigorous Framework for Optimization of Expensive Functions by Surrogates,” Struct. Multidiscip. Optim., 17(1), pp. 1–13.

Serafini, D. B., 1998, “A Framework for Managing Models in Nonlinear Optimization of Computationally Expensive Functions,” Ph.D. thesis, Department of Computational and Applied Mathematics, Rice University, Houston, TX.

Conn, A. R., and Le Digabel, S., 2013, “Use of Quadratic Models With Mesh-Adaptive Direct Search for Constrained Black Box Optimization,” Optim. Methods Software, 28(1), pp. 139–158.

Gramacy, R. B., and Le Digabel, S., 2011, “The Mesh Adaptive Direct Search Algorithm With Treed Gaussian Process Surrogates,” Les Cahiers du GERAD, Technical Report No. G-2011-37. To appear in the Pacific Journal of Optimization.

Taddy, M. A., Gramacy, R. B., and Polson, N. G., 2011, “Dynamic Trees for Learning and Design,” J. Am. Stat. Assoc., 106(493), pp. 109–123.

Clarke, F. H., 1983, *Optimization and Nonsmooth Analysis*, Wiley, New York (reissued in 1990 by SIAM Publications, Philadelphia, PA, as Vol. 5 in the series Classics in Applied Mathematics).

Le Digabel, S., 2011, “Algorithm 909: NOMAD: Nonlinear Optimization With the MADS Algorithm,” ACM Trans. Math. Software, 37(4), pp. 1–44.

Goldberg, D. E., 1989, *Genetic Algorithms in Search, Optimization and Machine Learning*, Addison-Wesley Longman, Boston, MA.

Audet, C., Béchard, V., and Le Digabel, S., 2008, “Nonsmooth Optimization Through Mesh Adaptive Direct Search and Variable Neighborhood Search,” J. Global Optim., 41(2), pp. 299–318.

Vaz, A. I. F., and Vicente, L. N., 2007, “A Particle Swarm Pattern Search Method for Bound Constrained Global Optimization,” J. Global Optim., 39(2), pp. 197–219.

McKay, M. D., Beckman, R. J., and Conover, W. J., 1979, “A Comparison of Three Methods for Selecting Values of Input Variables in the Analysis of Output From a Computer Code,” Technometrics, 21(2), pp. 239–245.

Custódio, A. L., Rocha, H., and Vicente, L. N., 2010, “Incorporating Minimum Frobenius Norm Models in Direct Search,” Comput. Optim. Appl., 46(2), pp. 265–278.

Fletcher, R., and Leyffer, S., 2002, “Nonlinear Programming Without a Penalty Function,” Math. Program. Ser. A, 91(2), pp. 239–269.

Gramacy, R. B., Taddy, M. A., and Wild, S. M., 2013, “Variable Selection and Sensitivity Analysis Using Dynamic Trees, With an Application to Computer Code Performance Tuning,” Ann. Appl. Stat., 7(1), pp. 51–80.

Chipman, H. A., George, E. I., and McCulloch, R. E., 1998, “Bayesian CART Model Search (With Discussion),” J. Am. Stat. Assoc., 93(443), pp. 935–960.

Chipman, H. A., George, E. I., and McCulloch, R. E., 2002, “Bayesian Treed Models,” Mach. Learn., 48(1–3), pp. 299–320.

Carvalho, C. M., Johannes, M., Lopes, H. F., and Polson, N. G., 2010, “Particle Learning and Smoothing,” Stat. Sci., 25(1), pp. 88–106.

Carvalho, C. M., Lopes, H. F., Polson, N. G., and Taddy, M. A., 2010, “Particle Learning for General Mixtures,” Bayesian Anal., 5(4), pp. 709–740.

Cortes, C., and Vapnik, V., 1995, “Support-Vector Networks,” Mach. Learn., 20(3), pp. 273–297.

Liem, R. P., 2007, “Surrogate Modeling for Large-Scale Black-Box Systems,” Master’s thesis, Computation for Design and Optimization Program, Massachusetts Institute of Technology, Cambridge, MA. Available at: http://hdl.handle.net/1721.1/41559

Bai, Z., 2002, “Krylov Subspace Techniques for Reduced-Order Modeling of Large-Scale Dynamical Systems,” Appl. Numer. Math., 43(1–2), pp. 9–44.

Willcox, K., and Peraire, J., 2002, “Balanced Model Reduction Via the Proper Orthogonal Decomposition,” AIAA J., 40(11), pp. 2323–2330.

Abramson, M. A., Audet, C., Couture, G., Dennis, J. E., Jr., Le Digabel, S., and Tribes, C., “The NOMAD Project,” http://www.gerad.ca/nomad

Audet, C., Dennis, J. E., Jr., and Le Digabel, S., 2008, “Parallel Space Decomposition of the Mesh Adaptive Direct Search Algorithm,” SIAM J. Optim., 19(3), pp. 1150–1170.

Lukšan, L., and Vlček, J., 2000, “Test Problems for Nonsmooth Unconstrained and Linearly Constrained Optimization,” Institute of Computer Science, Academy of Sciences of the Czech Republic, Prague, Technical Report No. V-798.

Audet, C., and Dennis, J. E., Jr., 2009, “A Progressive Barrier for Derivative-Free Nonlinear Programming,” SIAM J. Optim., 20(1), pp. 445–472.

Hock, W., and Schittkowski, K., 1981, “Test Examples for Nonlinear Programming Codes,” *Lecture Notes in Economics and Mathematical Systems*, Vol. 187, Springer-Verlag, Berlin, Germany.

Tribes, C., Dubé, J.-F., and Trépanier, J.-Y., 2005, “Decomposition of Multidisciplinary Optimization Problems: Formulations and Application to a Simplified Wing Design,” Eng. Optim., 37(8), pp. 775–796.

Kodiyalam, S., 2001, “Multidisciplinary Aerospace Systems Optimization,” Lockheed Martin Space Systems Company, Computational AeroSciences Project, Sunnyvale, CA, Technical Report No. NASA/CR-2001-211053.

AIAA/UTC/Pratt & Whitney, 1995/1996, Undergraduate Individual Aircraft Design Competition.

Moré, J. J., and Wild, S. M., 2009, “Benchmarking Derivative-Free Optimization Algorithms,” SIAM J. Optim., 20(1), pp. 172–191.

Krige, D. G., 1951, “A Statistical Approach to Some Mine Valuations and Allied Problems at the Witwatersrand,” Master’s thesis, University of the Witwatersrand, Johannesburg, South Africa.

Rasmussen, C. E., and Williams, C. K. I., 2006, *Gaussian Processes for Machine Learning*, MIT Press, Cambridge, MA.

Gramacy, R. B., and Lee, H. K. H., 2008, “Bayesian Treed Gaussian Process Models With an Application to Computer Modeling,” J. Am. Stat. Assoc., 103(483), pp. 1119–1130.
