Research Papers: Design Automation

Bayesian Optimal Design of Experiments for Inferring the Statistical Expectation of Expensive Black-Box Functions

Author and Article Information
Piyush Pandita

School of Mechanical Engineering,
Purdue University,
West Lafayette, IN 47907
e-mail: ppandit@purdue.edu

Ilias Bilionis

School of Mechanical Engineering,
Purdue University,
West Lafayette, IN 47907
e-mail: ibilion@purdue.edu

Jitesh Panchal

School of Mechanical Engineering,
Purdue University,
West Lafayette, IN 47907
e-mail: panchal@purdue.edu

Corresponding author.

Contributed by the Design Automation Committee of ASME for publication in the Journal of Mechanical Design. Manuscript received July 26, 2018; final manuscript received May 28, 2019; published online July 10, 2019. Assoc. Editor: James T. Allison.

J. Mech. Des. 141(10), 101404 (Jul 10, 2019) (11 pages); Paper No: MD-18-1592; doi: 10.1115/1.4043930. History: Received July 26, 2018; Accepted May 29, 2019

Bayesian optimal design of experiments (BODE) has been successful in acquiring information about a quantity of interest (QoI) that depends on a black-box function. BODE is characterized by sequentially querying the function at specific designs selected by an infill-sampling criterion. However, most current BODE methods operate in specific contexts, such as optimization or learning a universal representation of the black-box function. The objective of this paper is to design a BODE for estimating the statistical expectation of a physical response surface. This QoI is omnipresent in uncertainty propagation and design-under-uncertainty problems. Our hypothesis is that an optimal BODE should maximize the expected information gain in the QoI. We represent the information gain from a hypothetical experiment as the Kullback–Leibler (KL) divergence between the prior and the posterior probability distributions of the QoI. The prior distribution of the QoI is conditioned on the observed data, and the posterior distribution of the QoI is conditioned on the observed data and a hypothetical experiment. The main contribution of this paper is the derivation of a semi-analytic mathematical formula for the expected information gain about the statistical expectation of a physical response. The developed BODE is validated on synthetic functions with varying numbers of input dimensions. We demonstrate the performance of the methodology on a steel wire manufacturing problem.
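
The selection rule described in the abstract can be written compactly in generic notation (the notation is ours, not reproduced from the paper): with Q the QoI, D_n the data observed so far, and a hypothetical experiment at design x with outcome y,

G_n(\tilde{x}, \tilde{y}) = \mathrm{KL}\left[ p(Q \mid \mathcal{D}_n, \tilde{x}, \tilde{y}) \,\big\|\, p(Q \mid \mathcal{D}_n) \right],
\qquad
x_{n+1} = \arg\max_{\tilde{x}} \; \mathbb{E}_{\tilde{y} \mid \mathcal{D}_n, \tilde{x}} \left[ G_n(\tilde{x}, \tilde{y}) \right].

The paper's contribution is a semi-analytic expression for this expected KL divergence (EKLD) when Q is the statistical expectation of the response under a Gaussian process surrogate. The sketch below does not reproduce that derivation; it is a minimal brute-force Monte Carlo illustration of the same sequential loop, written in Python with scikit-learn. The toy objective f, the helper names (qoi_posterior, kl_gauss, ekld), and all numerical settings are assumptions made for this demo, not the authors' implementation.

# Minimal, illustrative sketch (NOT the paper's semi-analytic formula):
# candidate designs are scored by a brute-force Monte Carlo estimate of the
# expected KL divergence (EKLD) of the QoI Q = E_x[f(X)] under a
# Gaussian-process surrogate.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel


def f(x):
    """Hypothetical expensive black-box function (1D toy)."""
    return np.sin(8.0 * x) + 0.2 * x


def qoi_posterior(gp, x_mc):
    """Gaussian state of knowledge about Q = E[f(X)] implied by the GP:
    the average of a GP over fixed nodes is itself Gaussian."""
    mu, cov = gp.predict(x_mc, return_cov=True)
    return mu.mean(), max(cov.sum() / x_mc.shape[0] ** 2, 1e-12)


def kl_gauss(m1, v1, m2, v2):
    """KL( N(m1, v1) || N(m2, v2) ) for univariate Gaussians."""
    return 0.5 * (np.log(v2 / v1) + (v1 + (m1 - m2) ** 2) / v2 - 1.0)


def ekld(gp, X, y, x_cand, x_mc, rng, n_samples=20):
    """Average information gain about Q over hypothetical outcomes drawn
    from the current GP predictive at the candidate design x_cand."""
    m0, v0 = qoi_posterior(gp, x_mc)                 # "prior" of the QoI
    mu_c, sd_c = gp.predict(x_cand.reshape(1, -1), return_std=True)
    gain = 0.0
    for _ in range(n_samples):
        y_hyp = rng.normal(mu_c[0], sd_c[0])         # hypothetical experiment
        gp_hyp = GaussianProcessRegressor(kernel=gp.kernel_, optimizer=None)
        gp_hyp.fit(np.vstack([X, x_cand]), np.append(y, y_hyp))
        m1, v1 = qoi_posterior(gp_hyp, x_mc)         # "posterior" of the QoI
        gain += kl_gauss(m1, v1, m0, v0)
    return gain / n_samples


rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(3, 1))               # n_i = 3 initial designs
y = f(X).ravel()
x_mc = np.linspace(0.0, 1.0, 200).reshape(-1, 1)     # nodes for E[f(X)]

kernel = RBF(length_scale=0.1) + WhiteKernel(noise_level=1e-4)
for _ in range(10):                                  # sequential BODE loop
    gp = GaussianProcessRegressor(kernel=kernel).fit(X, y)
    cand = rng.uniform(0.0, 1.0, size=(30, 1))       # random candidate designs
    scores = [ekld(gp, X, y, xc, x_mc, rng) for xc in cand]
    x_next = cand[int(np.argmax(scores))]
    X, y = np.vstack([X, x_next]), np.append(y, f(x_next))

gp = GaussianProcessRegressor(kernel=kernel).fit(X, y)
print("Estimated E[f(X)]:", qoi_posterior(gp, x_mc)[0])
print("True E[f(X)] on [0, 1]:", (1.0 - np.cos(8.0)) / 8.0 + 0.1)

Each iteration refits the surrogate, scores a handful of random candidate designs by the average KL divergence between the hypothetical posterior and the current prior of E[f(X)], and then queries the expensive function at the highest-scoring candidate; the semi-analytic formula derived in the paper plays the role of the inner Monte Carlo loop here.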

Copyright © 2019 by ASME

Figures

Fig. 1: One-dimensional synthetic problem (ni = 3). (a) and (b) The state of the function at the start (1st iteration) and the end (15th iteration) of the algorithm. (c) The convergence to the true expectation of the function and the reduction in uncertainty about the QoI by the end of the algorithm.

Fig. 2: One-dimensional synthetic example (ni = 3). (a) and (b) The state of the function at the start (1st iteration) and the end (25th iteration) of the algorithm. (c) The convergence to the true expectation of the function and the reduction in uncertainty about the QoI by the end of the algorithm.

Fig. 3: One-dimensional synthetic examples. (a) and (b) The predictive mean of the expected KL divergence (EKLD) for synthetic problem no. 1 (ni = 3) and synthetic problem no. 2 (ni = 4), respectively.

Fig. 4: Three-dimensional synthetic example (ni = 2). (a) The decay of the EKLD from the 1st iteration to the end of the 30th iteration of the algorithm. (b) The convergence to the true value of the QoI.

Fig. 5: Five-dimensional synthetic example (ni = 20). (a) The decay of the EKLD from the 1st iteration to the end of the 45th iteration of the algorithm. (b) Convergence to the true value of the QoI.

Fig. 6: Wire drawing problem (ni = 20) after 75 iterations.

Fig. 7: (a)–(c) The comparison of the EKLD to uncertainty sampling for synthetic problem nos. 1, 2, and 3, respectively.

Fig. 8: (a) and (b) The comparison of the EKLD to uncertainty sampling for synthetic problem no. 4 and the wire drawing problem, respectively.
