Guest Editorial

J. Mech. Des. 2016;138(11):110301-110301-3. doi:10.1115/1.4034536.

Uncertainty quantification and propagation using probabilistic and nonprobabilistic methods are essential in many engineering and nonengineering disciplines. In mechanical design, there is an ever-increasing need to design systems that account for uncertainty and variability using simulation models. The past decade has seen significant growth in uncertainty quantification, propagation, and design. The “Simulation-Based Design Under Uncertainty” special session of the ASME Design Automation Conference (DAC) has attracted many papers every year for more than twelve years. Design under uncertainty has implications for decision-making as well as the reliability, quality, safety, and risk tolerance of many products. This special issue covers various related topics under the general umbrella of simulation-based design under uncertainty, including methods, models, and case studies.


Research Papers: Design Automation

J. Mech. Des. 2016;138(11):111401-111401-11. doi:10.1115/1.4034106.

The design of complex systems often requires reliability assessments involving a large number of uncertainties and the estimation of low failure probabilities (on the order of 10⁻⁴). Estimating such rare event probabilities with crude Monte Carlo (CMC) is computationally intractable. Specific numerical methods, such as importance sampling and subset simulation, have been developed to reduce the computational cost and the variance of the estimate. However, these methods assume that the uncertainties are defined within the probability formalism. For epistemic uncertainties, the interval formalism is particularly well suited when only their domain of definition is known. In this paper, a method is derived to assess the reliability of a system with uncertainties described by both probability and interval frameworks. It allows one to determine the bounds of the failure probability and involves a sequential approach using subset simulation, kriging, and an optimization process. To reduce the simulation cost, a refinement strategy for the surrogate model is proposed that takes into account the presence of both aleatory and epistemic uncertainties. The method is compared to existing approaches on an analytical example as well as on a launch vehicle fallout zone estimation problem.
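
As a point of reference for why variance reduction is needed here, below is a minimal crude Monte Carlo sketch (with a hypothetical limit state, not the paper's launch vehicle model): at a failure probability near 10⁻⁴, the CMC estimator's coefficient of variation scales as sqrt((1 − P_f)/(n·P_f)), so millions of samples buy only modest accuracy.

```python
# Minimal sketch, assuming a toy limit state g(x) = 5 - x1 - x2 with
# standard normal inputs; failure is defined as g(x) <= 0.
import numpy as np

rng = np.random.default_rng(0)

def limit_state(x):
    # Hypothetical limit state, not the paper's model.
    return 5.0 - x.sum(axis=1)

n = 2_000_000
x = rng.standard_normal((n, 2))
p_hat = (limit_state(x) <= 0.0).mean()
cov = np.sqrt((1.0 - p_hat) / (n * p_hat))  # relative error of the CMC estimator
print(f"P_f ~ {p_hat:.2e}, coefficient of variation ~ {cov:.1%}")
```

Even with two million samples, the relative error sits around a few percent; subset simulation and surrogate refinement exist precisely to avoid this cost.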

J. Mech. Des. 2016;138(11):111402-111402-12. doi:10.1115/1.4034089.

Validating dynamic engineering models by assessing the agreement between simulation results and experimental observations is critically important in practical applications. Though significant progress has been made, existing metrics lack the capability of managing uncertainty in both simulations and experiments. In addition, it is challenging to validate a dynamic model aggregately over both the time domain and a model input space with data at multiple validation sites. To overcome these difficulties, this paper presents an area-based metric to systematically handle uncertainty and validate computational models for dynamic systems over an input space by simultaneously integrating the information from multiple validation sites. To manage the complexity associated with a high-dimensional data space, eigenanalysis is performed on the time series data from simulations at each validation site to extract the important features. A truncated Karhunen–Loève (KL) expansion is then constructed to represent the responses of dynamic systems, resulting in a set of uncorrelated random coefficients with unit variance. With the development of a hierarchical data-fusion strategy, the probability integral transform (PIT) is then employed to pool all the resulting random coefficients from multiple validation sites across the input space into a single aggregated metric. The dynamic model is thus validated by calculating the cumulative area difference between the cumulative distribution functions. The proposed model validation metric for dynamic systems is illustrated with a mathematical example, a supported beam problem with stochastic loads, and real data from a vehicle occupant-restraint system.
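
A minimal sketch of the main ingredients follows (synthetic data, a single validation site, and a simplification of the paper's hierarchical fusion strategy): a truncated KL basis is built from simulated time histories, experimental histories are projected onto it, the PIT is applied using the simulation-based marginals, and the model is scored by the area between the empirical CDF of the pooled PIT values and the uniform CDF (zero area would indicate perfect agreement).

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 200)
sim = np.sin(2 * np.pi * t) + 0.10 * rng.standard_normal((500, t.size))  # model runs
exp = np.sin(2 * np.pi * t) + 0.12 * rng.standard_normal((30, t.size))   # "tests"

mean = sim.mean(axis=0)
vals, vecs = np.linalg.eigh(np.cov(sim - mean, rowvar=False))
idx = np.argsort(vals)[::-1][:5]                    # keep 5 dominant KL modes
phi, lam = vecs[:, idx], vals[idx]

coef_sim = (sim - mean) @ phi / np.sqrt(lam)        # unit-variance KL coefficients
coef_exp = (exp - mean) @ phi / np.sqrt(lam)

# PIT: rank each experimental coefficient within the simulated population,
# then pool all modes into one sample that is Uniform(0, 1) for a valid model.
u = np.sort(np.array([(coef_sim[:, j] <= c).mean()
                      for j in range(phi.shape[1]) for c in coef_exp[:, j]]))
ecdf = np.arange(1, u.size + 1) / u.size
area = np.sum(np.abs(ecdf - u)[:-1] * np.diff(u))   # area metric vs. Uniform(0,1)
print(f"area validation metric = {area:.3f}")
```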

J. Mech. Des. 2016;138(11):111403-111403-10. doi:10.1115/1.4034108.

Binary-state and component-independence assumptions can lead to doubtful and misleading redundancy allocation schemes that may not satisfy the reliability requirements of real engineering applications. Most published works propose methods to remove the first assumption by studying degradation cases, in which a component's multiple states range from the best state through degraded states to the completely failed state. Fewer works focus on removing the second assumption, and those discuss only dependent failures, which are a special case of component dependency. This work uses a semi-Markov process to describe a two-component system for redundancy allocation. Here, the multiple states of a component are represented by multiple output levels, which go beyond degradation, and the component dependency is not limited to failure dependency. Load sharing is also addressed. The optimal redundancy allocation scheme is obtained by solving the corresponding redundancy allocation optimization problem with the reliability measure, the system availability, obtained through the semi-Markov process model as a constraint. Two case studies are presented, demonstrating the applicability of the proposed method.
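
A full semi-Markov model allows general (non-exponential) sojourn-time distributions; as a minimal sketch of the availability constraint, the exponential special case below reduces to a continuous-time Markov chain for a two-component load-sharing pair, with hypothetical rates.

```python
# Sketch only: states are 0 = both up, 1 = one up (the survivor carries extra
# load, so it fails faster), 2 = both down. All rates are made up.
import numpy as np

lam, lam_shared, mu = 0.01, 0.025, 0.5   # failure, load-shared failure, repair
Q = np.array([
    [-2 * lam,         2 * lam,           0.0       ],
    [ mu,      -(mu + lam_shared),        lam_shared],
    [ 0.0,             mu,               -mu        ],
])  # CTMC generator matrix: each row sums to zero

# Solve pi @ Q = 0 with sum(pi) = 1 for the stationary distribution.
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
availability = pi[0] + pi[1]             # up if at least one component works
print(f"steady-state availability = {availability:.4f}")
```

In the allocation problem, an availability computed this way would enter the optimization as the constraint, with the redundancy level as the decision variable.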

J. Mech. Des. 2016;138(11):111404-111404-3. doi:10.1115/1.4034109.

Significant efforts have recently been devoted to the qualitative and quantitative evaluation of resilience in engineering systems. Current resilience evaluation methods, however, have mainly focused on business supply chains and civil infrastructure and need to be extended for application in engineering design. To bridge this gap, a new resilience metric is proposed in this paper for the design of mechanical systems, developed by investigating the effects of recovery activity and system failure paths on system resilience. The defined resilience metric is connected to design through time-dependent system reliability analysis. This connection enables a system to be designed for a specific resilience target at the design stage. Since computationally expensive computer simulations are usually used in design, a surrogate modeling method is developed to efficiently perform time-dependent system reliability analysis. Based on this analysis, dominant system failure paths are enumerated and the system resilience is then estimated. The connection between the proposed resilience assessment method and design is explored through sensitivity analysis and a component importance measure (CIM). Two numerical examples illustrate the effectiveness of the proposed resilience assessment method.
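
As a sketch of how such a metric can couple to time-dependent reliability, the snippet below uses one common formulation from the resilience literature (not necessarily the paper's exact metric): resilience ≈ reliability + restoration probability × (1 − reliability), with R(t) estimated by Monte Carlo over a degrading limit state.

```python
# Minimal sketch with hypothetical numbers: monotone strength degradation,
# so pointwise survival at t equals first-passage survival up to t.
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0.0, 10.0, 101)
strength = rng.normal(8.0, 0.5, 10_000)          # random initial strength
g = strength[:, None] - 0.3 * t - 5.0            # degrading limit state vs. load
R_t = (g > 0.0).mean(axis=0)                     # time-dependent reliability R(t)
rho = 0.6                                        # assumed P(recovery | failure)
resilience = R_t + rho * (1.0 - R_t)
print(f"R(10) = {R_t[-1]:.3f}, resilience(10) = {resilience[-1]:.3f}")
```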

J. Mech. Des. 2016;138(11):111405-111405-11. doi:10.1115/1.4034222.

Uncertainty is unavoidable in engineering design and may result in variations in the objective functions and/or constraints. The former may degrade the designed performance, while the latter can even change the feasibility of the obtained optimal solutions. Taking uncertainty into consideration, robust optimization (RO) algorithms aim to find optimal solutions that are also insensitive to uncertainty. Uncertainty may include variation in parameters and/or design variables, inaccuracy in the simulation models used in design problems, and other possible errors. Most existing RO algorithms consider only uncertainty in parameters and overlook uncertainty in simulation models, assuming that the simulation model used can always provide outputs identical to those of the real physical system. In this paper, we propose a new RO framework using Gaussian processes that considers not only parameter uncertainty but also uncertainty in simulation models. Considering model uncertainty in RO reduces the risk that the obtained robust optimal designs become infeasible even when parameter uncertainty has been accounted for. Two test examples with different degrees of complexity demonstrate the applicability and effectiveness of the proposed algorithm.
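
A minimal sketch of the idea follows (toy function and tuning constants of my own, not the paper's algorithm): the robust objective charges for parameter uncertainty by Monte Carlo over a perturbed design variable, and for model uncertainty through the Gaussian process surrogate's own predictive standard deviation.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(3)
f = lambda x: np.sin(3 * x) + 0.5 * x               # expensive simulator stand-in

X = rng.uniform(0, 3, 12).reshape(-1, 1)            # a few training runs
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5),
                              alpha=1e-6).fit(X, f(X).ravel())

def robust_objective(x, sigma_p=0.1, k=2.0, n_mc=200):
    # Propagate parameter uncertainty x + eps through the surrogate, then add
    # a penalty for the surrogate's own (model) predictive uncertainty.
    xs = (x + sigma_p * rng.standard_normal(n_mc)).reshape(-1, 1)
    mu, sd = gp.predict(xs, return_std=True)
    return mu.mean() + k * (mu.std() + sd.mean())

grid = np.linspace(0, 3, 301)
x_star = grid[np.argmin([robust_objective(x) for x in grid])]
print(f"robust optimum near x = {x_star:.2f}")
```

Dropping the `sd.mean()` term recovers a parameter-uncertainty-only RO, which is the baseline the paper argues is risky.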

J. Mech. Des. 2016;138(11):111406-111406-10. doi:10.1115/1.4034113.

A real challenge in obtaining a robust solution to multidisciplinary design optimization (MDO) problems is the propagation of uncertainty from one discipline to another. Most existing methods either treat an MDO problem deterministically or find a solution that is robust only for a single-disciplinary optimization problem. The few methods that do solve MDO problems under uncertainty are usually computationally expensive. This paper proposes a robust sequential MDO (RS-MDO) approach based on a sequential MDO (S-MDO) framework. First, a robust solution is obtained by giving each discipline full autonomy to perform optimization without considering other disciplines. A tolerance range is specified for each coupling variable to handle uncertainty propagation in the coupled system. The obtained robust extreme points of the global and coupling variables are then dispatched to the subsystems, which perform robust optimization (RO) sequentially. Additional constraints are added in each subsystem to maintain consistency and to guarantee a robust solution. To find a solution under such strict constraints, a genetic algorithm (GA) is used as the solver in each optimization stage. The proposed RS-MDO can save a significant amount of computational effort through its sequential optimization procedure, and since all iterations in the sequential optimization stage can be processed in parallel, the approach can be even more time-saving. Numerical and engineering examples demonstrate the applicability and effectiveness of the proposed approach.
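
A minimal sketch of the sequential idea follows (a grid search stands in for the paper's GA, and the two disciplines are invented): discipline 1 optimizes autonomously, a tolerance band on its coupling output absorbs uncertainty propagation, and the band's extreme points are then dispatched to discipline 2 as robust consistency constraints.

```python
import numpy as np

x_grid = np.linspace(0, 2, 201)

# Stage 1: discipline 1 optimizes in full autonomy; its output y couples
# into discipline 2. A tolerance band on y covers uncertainty propagation.
f1 = lambda x1: (x1 - 1.2)**2                # toy discipline-1 objective
y1 = lambda x1: 0.5 * x1                     # coupling output of discipline 1
x1_star = x_grid[np.argmin(f1(x_grid))]
tol = 0.05
y_band = (y1(x1_star) - tol, y1(x1_star) + tol)

# Stage 2: discipline 2 optimizes sequentially, required to stay feasible at
# BOTH extreme points of the coupling band (hypothetical constraint level).
f2 = lambda x2, y: (x2 - y)**2 + 0.1 * x2
feasible = [x2 for x2 in x_grid if all(f2(x2, y) <= 0.5 for y in y_band)]
x2_star = min(feasible, key=lambda x2: max(f2(x2, y) for y in y_band))
print(f"x1 = {x1_star:.2f}, coupling band = {y_band}, x2 = {x2_star:.2f}")
```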

J. Mech. Des. 2016;138(11):111407-111407-12. doi:10.1115/1.4034112.

The robustness of a design has a major influence on how much the product's performance will vary and is of great concern to design, quality, and production engineers. While variability is always central to the definition of robustness, the concept contains ambiguity, and although subtle, this ambiguity can have significant influence on the strategies used to combat variability, the way it is quantified, and, ultimately, the quality of the final design. In this contribution, the literature on robustness metrics was systematically reviewed. From the 108 relevant publications found, 38 metrics were determined to be conceptually different from one another. The metrics were classified by their meaning and interpretation based on the type of information necessary to calculate them. Four classes were identified: (1) sensitivity robustness metrics; (2) size-of-feasible-design-space robustness metrics; (3) functional expectancy and dispersion robustness metrics; and (4) probability-of-compliance robustness metrics. The goal was to give a comprehensive overview of robustness metrics and to guide scholars and practitioners in understanding the different types of robustness metrics and removing the ambiguities of the term robustness. By applying an exemplar metric from each class to a case study, the differences between the classes were further highlighted. These classes form the basis for the definition of four specific subdefinitions of robustness, namely the “robust concept,” “robust design,” “robust function,” and “robust product.”
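
To make the four classes concrete, here is a minimal sketch computing one exemplar metric per class for a toy performance function and invented specification limits (these are illustrative choices, not the metrics selected in the paper's case study).

```python
import numpy as np

rng = np.random.default_rng(4)
f = lambda x: x**2 + 2 * x                       # toy performance function
x0, sigma, spec = 1.0, 0.1, (2.5, 3.5)           # design, noise std, spec limits

# (1) Sensitivity: local gradient magnitude via central finite difference.
h = 1e-6
sensitivity = abs(f(x0 + h) - f(x0 - h)) / (2 * h)

# (2) Size of feasible design space: widest symmetric tolerance on x keeping
#     y within spec (endpoint check suffices since f is monotone near x0).
tols = np.linspace(0, 0.5, 501)
ok = []
for d in tols:
    ends = f(np.array([x0 - d, x0 + d]))
    ok.append(np.all((ends >= spec[0]) & (ends <= spec[1])))
tol_max = tols[np.array(ok)].max()

# (3) Functional expectancy and dispersion: mean and std of y under noise.
y_mc = f(x0 + sigma * rng.standard_normal(100_000))
mu_y, sd_y = y_mc.mean(), y_mc.std()

# (4) Probability of compliance: P(y within spec).
p_comply = np.mean((y_mc >= spec[0]) & (y_mc <= spec[1]))

print(f"(1) |dy/dx| = {sensitivity:.2f}")
print(f"(2) max symmetric tolerance = {tol_max:.3f}")
print(f"(3) mean = {mu_y:.3f}, dispersion = {sd_y:.3f}")
print(f"(4) probability of compliance = {p_comply:.3f}")
```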

J. Mech. Des. 2016;138(11):111408-111408-13. doi:10.1115/1.4034223.

A resilient system is a system that possesses the ability to survive and recover from damage caused by disruptive events or mishaps. The concept that incorporates resiliency into engineering practice is known as engineering resilience. To date, engineering resilience is still predominantly application-oriented. Despite the increasing use of the engineering resilience concept, the diversity of its applications across engineering sectors complicates universal agreement on its quantification and associated measurement techniques. There is a pressing need for a generally applicable engineering resilience analysis framework that standardizes the modeling, assessment, and improvement of engineering resilience for the broader engineering discipline. This paper provides a literature survey of engineering resilience from the design perspective, with a focus on engineering resilience metrics and their design implications. The currently available engineering resilience quantification metrics are reviewed and summarized, the design implications for the development of resilient-engineered systems are discussed, and the challenges of incorporating resilience into engineering design processes are evaluated. The presented study is intended to serve as a building block toward a generally applicable engineering resilience analysis framework that can be readily used for system design.

J. Mech. Des. 2016;138(11):111409-111409-13. doi:10.1115/1.4034347.

Early in the design process, there is often mixed epistemic model uncertainty and aleatory parameter uncertainty. Later in the design process, the results of high-fidelity simulations or experiments will reduce epistemic model uncertainty and may trigger a redesign process. Redesign is undesirable because it is associated with costs and delays; however, it is also an opportunity to correct a dangerous design or possibly improve design performance. In this study, we propose a margin-based design/redesign method where the design is optimized deterministically, but the margins are selected probabilistically. The final design is an epistemic random variable (i.e., it is unknown at the initial design stage) and the margins are optimized to control the epistemic uncertainty in the final design, design performance, and probability of failure. The method allows for the tradeoff between expected final design performance and probability of redesign while ensuring reliability with respect to mixed uncertainties. The method is demonstrated on a simple bar problem and then on an engine design problem. The examples are used to investigate the dilemma of whether to start with a higher margin and redesign if the test later in the design process reveals the design to be too conservative, or to start with a lower margin and redesign if the test reveals the design to be unsafe. In the examples in this study, it is found that this decision is related to the variance of the uncertainty in the high-fidelity model relative to the variance of the uncertainty in the low-fidelity model.
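
A minimal sketch of that dilemma follows (all numbers are hypothetical, and the redesign rule and resizing policy are my own simplifications, not the paper's optimization): a design is sized deterministically with margin m against a low-fidelity load, and a later high-fidelity test reveals the epistemic error, triggering redesign when the revealed safety factor falls outside given bounds.

```python
import numpy as np

rng = np.random.default_rng(5)
load_lf = 100.0                                   # low-fidelity load estimate
err = rng.normal(0.0, 10.0, 50_000)               # epistemic model error (unknown now)
load_hf = load_lf + err                           # what the test will reveal

def evaluate(margin, sf_lo=1.0, sf_hi=1.4):
    capacity = load_lf * (1.0 + margin)
    sf = capacity / load_hf                       # safety factor revealed by the test
    redesign = (sf < sf_lo) | (sf > sf_hi)        # unsafe or over-conservative
    # Simplistic policy: a redesign resizes to the mid-range safety factor.
    final = np.where(redesign, load_hf * (sf_lo + sf_hi) / 2.0, capacity)
    return redesign.mean(), final.mean()          # P(redesign), expected capacity

for m in (0.10, 0.20, 0.30):
    p_re, cap = evaluate(m)
    print(f"margin {m:.2f}: P(redesign) = {p_re:.2f}, expected capacity = {cap:.1f}")
```

Sweeping the margin exposes exactly the trade-off the paper optimizes: low margins risk an unsafe-redesign outcome, high margins risk a conservative-redesign outcome, and the preferred start depends on the relative variances of the two fidelity levels.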

J. Mech. Des. 2016;138(11):111410-111410-12. doi:10.1115/1.4034224.

Sensitivity analysis plays a critical role in quantifying uncertainty in the design of engineering systems. A variance-based global sensitivity analysis is often used to rank the importance of input factors, based on their contribution to the variance of the output quantity of interest. However, this analysis assumes that all input variability can be reduced to zero, which is typically not the case in a design setting. Distributional sensitivity analysis (DSA) instead treats the uncertainty reduction in the inputs as a random variable, and defines a variance-based sensitivity index function that characterizes the relative contribution to the output variance as a function of the amount of uncertainty reduction. This paper develops a computationally efficient implementation for the DSA formulation and extends it to include distributions commonly used in engineering design under uncertainty. Application of the DSA method to the conceptual design of a commercial jetliner demonstrates how the sensitivity analysis provides valuable information to designers and decision-makers on where and how to target uncertainty reduction efforts.
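
The following minimal sketch (a toy model, not the paper's efficient implementation) conveys the distributional idea: shrink each input's standard deviation by a factor alpha and record the resulting fractional reduction in output variance, yielding a sensitivity index as a function of the amount of uncertainty reduction rather than a single number.

```python
import numpy as np

rng = np.random.default_rng(6)
f = lambda x1, x2, x3: x1 + 2 * x2 + x1 * x3        # toy model
sig = np.array([1.0, 0.5, 1.0])                     # nominal input std devs
n = 200_000

base = f(*(sig[:, None] * rng.standard_normal((3, n)))).var()

for i in range(3):
    for alpha in (1.0, 0.5, 0.0):                   # alpha = 1.0: no reduction
        s = sig.copy()
        s[i] *= alpha                               # partial uncertainty reduction
        v = f(*(s[:, None] * rng.standard_normal((3, n)))).var()
        print(f"input {i+1}, alpha = {alpha}: variance reduction = {1 - v/base:.2f}")
```

The alpha = 0 column recovers the classical variance-based ranking; the intermediate columns are what distinguish DSA in a design setting where variability cannot be driven to zero.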

J. Mech. Des. 2016;138(11):111411-111411-11. doi:10.1115/1.4034110.

Multidisciplinary analysis (MDA) is nowadays a powerful tool for the analysis and optimization of complex systems. The present study addresses the case in which MDA involves feedback loops between disciplines (i.e., the output of a discipline is the input of another and vice versa). When the models for each discipline involve non-negligible modeling uncertainties, it is important to be able to efficiently propagate these uncertainties to the outputs of the MDA. The present study introduces a polynomial chaos expansion (PCE)-based approach to propagate modeling uncertainties in MDA. It is assumed that the response of each disciplinary solver is affected by an uncertainty modeled by a random field over the space of design and coupling variables. A semi-intrusive PCE formulation of the problem is proposed to solve the corresponding nonlinear stochastic system. Application of the proposed method emphasizes an important particular case in which each disciplinary solver is replaced by a surrogate model (e.g., kriging). Three application problems are treated, showing that the proposed approach approximates arbitrary (non-Gaussian) distributions very well at significantly reduced computational cost.
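
As a much-simplified illustration (a non-intrusive regression PCE on a toy two-discipline fixed point, not the paper's semi-intrusive scheme or random-field model), the sketch below expands the converged coupling output on probabilists' Hermite polynomials in a Gaussian model-error variable xi and reads off the mean and variance from the coefficients.

```python
import numpy as np
from math import factorial
from numpy.polynomial import hermite_e as H

rng = np.random.default_rng(7)

def mda(xi, n_iter=50):
    # Toy coupled analysis: y1 depends on y2 and vice versa; xi perturbs
    # discipline 1's response (modeling uncertainty). Contractive, so the
    # fixed-point iteration converges.
    y1, y2 = 0.0, 0.0
    for _ in range(n_iter):
        y1 = 1.0 + 0.3 * y2 + 0.2 * xi
        y2 = 2.0 - 0.4 * y1
    return y1

xi = rng.standard_normal(2000)
y = np.array([mda(x) for x in xi])

deg = 4
Psi = np.stack([H.hermeval(xi, np.eye(deg + 1)[k])   # He_k(xi) basis columns
                for k in range(deg + 1)], axis=1)
coeffs, *_ = np.linalg.lstsq(Psi, y, rcond=None)     # PCE coefficients

norms = np.array([factorial(k) for k in range(deg + 1)])  # E[He_k^2] = k!
print("PCE mean:", coeffs[0], " PCE variance:", (coeffs[1:]**2 * norms[1:]).sum())
```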

J. Mech. Des. 2016;138(11):111412-111412-8. doi:10.1115/1.4034104.

Design optimization under uncertainty is notoriously difficult when the objective function is expensive to evaluate. State-of-the-art techniques, e.g., stochastic optimization or sample average approximation, fail to learn exploitable patterns from collected data and require many objective function evaluations. There is a need for techniques that alleviate the high cost of information acquisition and select sequential simulations optimally. In the field of deterministic single-objective unconstrained global optimization, the Bayesian global optimization (BGO) approach has been relatively successful in addressing the information acquisition problem. BGO builds a probabilistic surrogate of the expensive objective function and uses it to define an information acquisition function (IAF) that quantifies the merit of making new objective evaluations. In this work, we reformulate the expected improvement (EI) IAF to filter out parametric and measurement uncertainties. We bypass the curse of dimensionality, since the method does not require learning the response surface as a function of the stochastic parameters, and we employ a fully Bayesian interpretation of Gaussian processes (GPs) by constructing a particle approximation of the posterior of their hyperparameters using adaptive Markov chain Monte Carlo (MCMC) to increase the method's robustness. Our approach also quantifies the epistemic uncertainty in the location of the optimum and the optimal value as induced by the limited number of objective evaluations used to obtain it. We verify and validate our approach by solving two synthetic optimization problems under uncertainty, and we demonstrate it by solving the oil-well placement problem (OWPP) with uncertainties in the permeability field and the oil price time series.
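
For orientation, here is a minimal sketch of the plain deterministic EI loop that the paper builds on and reformulates (no uncertainty filtering or hyperparameter MCMC here; the objective is a toy stand-in).

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(8)
f = lambda x: (x - 0.6)**2 + 0.1 * np.sin(20 * x)    # expensive objective stand-in

X = rng.uniform(0, 1, 5).reshape(-1, 1)              # initial design of experiments
y = f(X).ravel()
grid = np.linspace(0, 1, 500).reshape(-1, 1)

for _ in range(15):                                   # sequential acquisition loop
    gp = GaussianProcessRegressor(kernel=RBF(0.2), alpha=1e-8,
                                  normalize_y=True).fit(X, y)
    mu, sd = gp.predict(grid, return_std=True)
    best = y.min()
    z = (best - mu) / np.maximum(sd, 1e-12)
    ei = (best - mu) * norm.cdf(z) + sd * norm.pdf(z)  # expected improvement
    x_new = grid[np.argmax(ei)]                        # most informative next run
    X = np.vstack([X, x_new.reshape(1, -1)])
    y = np.append(y, f(x_new[0]))

print(f"best design x = {X[np.argmin(y)][0]:.3f}, f = {y.min():.4f}")
```

The paper's contribution replaces this fixed-hyperparameter, noise-free EI with one that integrates out parametric and measurement uncertainty and averages over GP hyperparameter posteriors via MCMC particles.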


Technical Brief

J. Mech. Des. 2016;138(11):114501-114501-5. doi:10.1115/1.4034346.

Uncertainties, inevitable in nature, can be classified as probability-based or interval-based in terms of their representations. Corresponding optimization strategies have been proposed to deal with each type. It is likely, however, that both types of uncertainty occur in a single problem, and it is therefore inappropriate to treat all uncertainties in the same way. A novel formulation for reliability-based design optimization (RBDO) under mixed probability and interval uncertainties is proposed in this paper, in which the variation of the objective is also taken into account. Furthermore, it is proposed to efficiently solve for the worst-case parameter resulting from the interval uncertainty by utilizing the Utopian solution presented in a single-loop robust optimization (RO) approach, where the inner optimization can be solved by matrix operations. The remaining problem can be solved by any existing RBDO method. This work applies the performance measure approach to search for the most probable failure point (MPFP) and sequential quadratic programming (SQP) to solve the entire problem. An engineering example is given to demonstrate the applicability of the proposed approach and to illustrate the necessity of considering objective robustness under certain circumstances.
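
The double-loop structure being avoided can be seen in this minimal sketch (a brute-force scan stands in for the paper's matrix-operation inner solve, and the limit state is invented): for each candidate design, the interval variable is set to its worst case before the failure probability over the random input is checked.

```python
import numpy as np

rng = np.random.default_rng(9)
x = rng.normal(0.0, 1.0, 200_000)                # aleatory (probabilistic) input
theta_grid = np.linspace(-0.5, 0.5, 101)         # epistemic interval variable

def worst_case_pf(d):
    # g = d - x - theta; failure when g <= 0. Scan the interval for the
    # theta maximizing the failure probability (the worst case).
    return max(np.mean(x + theta >= d) for theta in theta_grid)

target = 1e-2                                    # failure probability requirement
for d in np.arange(2.0, 4.0, 0.05):              # crude outer search on design d
    if worst_case_pf(d) <= target:
        print(f"smallest feasible design: d = {d:.2f}")
        break
```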

J. Mech. Des. 2016;138(11):114502-114502-4. doi:10.1115/1.4034103.

As additive manufacturing (AM) matures, models are beginning to play a more prominent role in design and process planning. A limitation frequently encountered in AM models is the lack of any indication of their precision and accuracy. Often overlooked, model uncertainty is required for the validation of AM models, the qualification of AM-produced parts, and uncertainty management. This paper presents a discussion of the origin and propagation of uncertainty in laser powder bed fusion (L-PBF) models. Four sources of uncertainty are identified: modeling assumptions, unknown simulation parameters, numerical approximations, and measurement error in calibration data. Techniques to quantify the uncertainty from each source are presented briefly, along with estimation algorithms that reduce prediction uncertainty by incorporating online measurements. The methods are illustrated with a case study based on a thermal model designed for melt pool width predictions. Model uncertainty is quantified for single-track experiments, and the effect of online estimation in overhanging structures is studied via simulation.
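
A minimal sketch of how online measurements can shrink prediction uncertainty follows (a scalar Kalman-style update with hypothetical numbers, not the paper's thermal model or estimator): each in-process melt-pool-width measurement is fused with the model prediction, tightening the posterior standard deviation track by track.

```python
import numpy as np

rng = np.random.default_rng(10)
w_true = 120.0                                    # true melt pool width (um)
w_hat, var_hat = 110.0, 15.0**2                   # biased model prior (mean, var)
var_meas = 5.0**2                                 # sensor noise variance

for k in range(5):
    z = w_true + rng.normal(0.0, 5.0)             # online width measurement
    gain = var_hat / (var_hat + var_meas)         # Kalman gain
    w_hat += gain * (z - w_hat)                   # fuse model with measurement
    var_hat *= (1.0 - gain)                       # posterior variance shrinks
    print(f"track {k}: w_hat = {w_hat:.1f} um, std = {np.sqrt(var_hat):.1f}")
```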

