J. Mech. Des. 2012;134(10):100201-100201-1. doi:10.1115/1.4007615.

I am delighted to present to our JMD community this special issue on design under uncertainty. JMD has been a leading venue for publishing research in this area over many years. Following some earlier pioneering work, the area has blossomed as we recognize and try to address the frequent critique from our industry colleagues that our “nominal” solutions to design problems can never be realized in practice.


Guest Editorial

J. Mech. Des. 2012;134(10):100301-100301-2. doi:10.1115/1.4007508.

Uncertainty is ubiquitous in engineering design. The past decade has seen a significant growth of research developments in design under uncertainty, and a wide range of applications from designing simple product components to designing complex and emerging engineered systems. While methods like “robust design” and “reliability-based design optimization” have become mature and widely adopted in computational design software, it has become evident that these techniques are mostly limited to handling parametric uncertainty. New methods and strategies for uncertainty characterization, problem formulation, preference elicitation, and risk mitigation are needed for managing many other sources of uncertainty in design such as those associated with modeling and prediction, the design process itself, the product use environment, emergent system behavior, and the changing market.


Special section: New Problem Formulations for Design Under Uncertainty

J. Mech. Des. 2012;134(10):100901-100901-12. doi:10.1115/1.4007533.

In new product design, risk averse firms must consider downside risk in addition to expected profitability, since some designs are associated with greater market uncertainty than others. We propose an approach to robust optimal product design for profit maximization by introducing an α-profit metric to manage expected profitability vs. downside risk due to uncertainty in market share predictions. Our goal is to maximize profit at a firm-specified level of risk tolerance. Specifically, we find the design that maximizes the α-profit: the value that the firm has a (1 − α) chance of exceeding, given the distribution of possible outcomes. The parameter α ∈ (0,1) is set by the firm to reflect sensitivity to downside risk (or upside gain), and parametric study of α reveals the sensitivity of optimal design choices to firm risk preference. We account here only for uncertainty of choice model parameter estimates due to finite data sampling when the choice model is assumed to be correctly specified (no misspecification error). We apply the delta method to estimate the mapping from uncertainty in discrete choice model parameters to uncertainty of profit outcomes and identify the estimated α-profit as a closed-form function of decision variables for the multinomial logit model. An example demonstrates implementation of the method to find the optimal design characteristics of a dial-readout scale using conjoint data.
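
As a rough illustration of the α-profit idea (not the paper's full delta-method and multinomial-logit machinery), the sketch below computes the profit level a firm has a (1 − α) chance of exceeding, assuming the uncertainty analysis has already produced a normal approximation of the profit distribution. The mean, standard deviation, and α values are hypothetical.

```python
from statistics import NormalDist

def alpha_profit(mu, sigma, alpha):
    """alpha-profit: the profit level the firm has a (1 - alpha) chance
    of exceeding, assuming profit ~ Normal(mu, sigma)."""
    return NormalDist(mu, sigma).inv_cdf(alpha)

# hypothetical numbers: expected profit 10, delta-method std. dev. 2
q_risk_averse = alpha_profit(mu=10.0, sigma=2.0, alpha=0.05)
q_neutral = alpha_profit(mu=10.0, sigma=2.0, alpha=0.5)  # reduces to the mean
```

Sweeping α, as the abstract describes, traces how the optimal design shifts with the firm's risk tolerance: a small α plans against a pessimistic profit quantile, while α = 0.5 recovers expected-value design.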

J. Mech. Des. 2012;134(10):100902-100902-6. doi:10.1115/1.4007394.

The management of end-of-life electronic waste (e-waste) attracts significant attention due to environmental concerns, legislative requirements, consumer interest in green products, and the market image of manufacturers. However, managing e-waste is complicated by several factors, including the high degree of uncertainty of quantity, timing of arrival, and quality of the returned products. This variability in the stream of returned end-of-life (EOL) products makes it difficult to plan for remanufacturing facility materials, equipment, and human resource requirements. The aim of this research is to tackle the uncertainty associated with the quantity of received used products. A stochastic programming model for waste stream acquisition systems (as opposed to market-driven systems) is introduced. The model considers the quantity of returned product as an uncertain parameter and determines to what extent the product should be disassembled and what the best EOL option is for each subassembly. The stochastic model is formulated as a chance-constrained program and is then converted to a mixed-integer linear program. An example is provided to illustrate the application of the model for an uncertain stream of PCs (minus monitor and keyboard) received in a PC refurbishing company. The remanufacturer must then decide which proportion of disassembled modules should be processed given specific remanufacturing options.
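
The chance-constrained-to-deterministic conversion at the heart of such a model can be sketched for a single capacity constraint, assuming the uncertain returned quantity is normally distributed. The distribution, service level, and numbers below are illustrative assumptions, not the paper's full stochastic program.

```python
from statistics import NormalDist

def deterministic_capacity(mean_returns, std_returns, beta):
    """Deterministic equivalent of the chance constraint
    P(capacity >= returned quantity) >= beta, when the returned
    quantity is modeled as Normal(mean_returns, std_returns)."""
    z = NormalDist().inv_cdf(beta)
    return mean_returns + z * std_returns

# hypothetical return stream: mean 100 units, std. dev. 20, 95% service level
cap = deterministic_capacity(mean_returns=100.0, std_returns=20.0, beta=0.95)
```

Constraints of this linear deterministic form are what make the resulting model solvable as a mixed-integer linear program.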

J. Mech. Des. 2012;134(10):100903-100903-15. doi:10.1115/1.4007574.

This article presents an integrated multistate method for the early-phase design of inherently robust systems; namely, those capable, as a prima facie quality, of maintaining adequate performance in the face of probabilistic system events or failures. The methodology merges integrated multidisciplinary analysis techniques for system design with behavioral-Markov analysis methods used to define probabilistic metrics such as reliability and availability. The result is a multistate approach that concurrently manipulates design variables and component failure rates to better identify key features for an inherently robust system. This methodology is demonstrated on the design of a long-endurance unmanned aerial vehicle for a three-month ice surveillance mission over Antarctica. The vehicle is designed using the multistate methodology and then compared to a baseline design created for the best performance under nominal conditions. Results demonstrated an improvement of 10–11% in system availability over this period with minimal impacts on cost or performance.
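
The behavioral-Markov side of such an analysis can be illustrated with the classic two-state (up/down) availability model, where a failure rate and a repair rate determine availability over a mission; the rates below are hypothetical, and the paper's methodology couples many such states with the design variables.

```python
import math

def point_availability(failure_rate, repair_rate, t):
    """Availability A(t) of a two-state (up/down) Markov model that
    starts in the up state, with constant failure and repair rates."""
    s = failure_rate + repair_rate
    return repair_rate / s + (failure_rate / s) * math.exp(-s * t)

def steady_state_availability(failure_rate, repair_rate):
    """Long-run availability: A = mu / (lambda + mu)."""
    return repair_rate / (failure_rate + repair_rate)
```

Manipulating component failure rates alongside design variables, as the multistate method does, amounts to treating `failure_rate` itself as a design decision with cost and performance consequences.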

J. Mech. Des. 2012;134(10):100904-100904-11. doi:10.1115/1.4007396.

The potential for engineering technology to evolve over time can be a critical consideration in design decisions that involve long-term commitments. Investments in manufacturing equipment, contractual relationships, and other factors can make it difficult for engineering firms to backtrack once they have chosen one technology over others. Although engineering technologies tend to improve in performance over time, competing technologies can evolve at different rates and details about how a technology might evolve are generally uncertain. In this article we present a general framework for modeling and making decisions about evolving technologies under uncertainty. In this research, the evolution of technology performance is modeled as an S-curve; the performance evolves slowly at first, quickly during heavy research and development effort, and slowly again as the performance approaches its limits. We extend the existing single-attribute S-curve model to the case of technologies with multiple performance attributes. By combining an S-curve evolutionary model for each attribute with a Pareto frontier representation of the optimal implementations of a technology at a particular point in time, we can project how the Pareto frontier will move over time as a technology evolves. Designer uncertainty about the precise shape of the S-curve model is considered through a Monte Carlo simulation of the evolutionary process. To demonstrate how designers can apply the framework, we consider the scenario of a green power generation company deciding between competing wind turbine technologies. Wind turbines, like many other technologies, are currently evolving as research and development efforts improve performance. The engineering example demonstrates how the multi-attribute technology evolution modeling technique provides designers with greater insight into critical uncertainties present in long-term decision problems.
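
A minimal sketch of the S-curve-with-uncertainty idea: a logistic curve stands in for the single-attribute technology performance model, and Monte Carlo sampling of its parameters represents designer uncertainty about the curve's precise shape. The parameter ranges are arbitrary illustrative assumptions, and the multi-attribute Pareto-frontier projection of the paper is not reproduced here.

```python
import math
import random

def s_curve(t, limit, rate, t_mid):
    """Logistic S-curve: slow early growth, rapid improvement during
    heavy R&D, and saturation as performance approaches `limit`."""
    return limit / (1.0 + math.exp(-rate * (t - t_mid)))

def monte_carlo_performance(t, n=10_000, seed=0):
    """Sample uncertain S-curve parameters (ranges are illustrative
    assumptions) to get a distribution of performance at time t."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        limit = rng.uniform(8.0, 12.0)   # uncertain performance ceiling
        rate = rng.uniform(0.3, 0.7)     # uncertain improvement rate
        t_mid = rng.uniform(8.0, 12.0)   # uncertain inflection time
        samples.append(s_curve(t, limit, rate, t_mid))
    return samples

samples = monte_carlo_performance(t=10.0)
```

The spread of `samples` at a given decision horizon is what lets a designer compare competing technologies not just by expected performance but by how uncertain that performance is.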


Special section: Strategies for Design Under Uncertainty

J. Mech. Des. 2012;134(10):100905-100905-14. doi:10.1115/1.4007397.

Engineering change (EC) is a source of uncertainty. While the number of changes to a design can be optimized, their existence cannot be eliminated. Each change is accompanied by intended and unintended impacts, both of which might propagate and cause further knock-on changes. Such change propagation causes uncertainty in design time, cost, and quality and thus needs to be predicted and controlled. Current engineering change propagation models map the product connectivity into a single-domain network and model change propagation as spread within this network. Those models omit most dependencies from other domains and suffer from “hidden dependencies”. This paper proposes the function-behavior-structure (FBS) linkage model, a multidomain model which combines concepts of the function-behavior-structure model of Gero and colleagues with the change prediction method (CPM) of Clarkson and colleagues. The FBS linkage model is represented as a network and a corresponding multidomain matrix of structural, behavioral, and functional elements and their links. Change propagation is described as spread in that network using principles of graph theory. The model is applied to a diesel engine. The results show that the FBS linkage model is promising and improves current methods in several ways: the model (1) accounts explicitly for all possible dependencies between product elements, (2) allows capturing and modeling of all relevant change requests, (3) improves the understanding of why and how changes propagate, (4) is scalable to different levels of decomposition, and (5) is flexible enough to present results at different levels of abstraction. All these features of the FBS linkage model can help control and counteract change propagation and reduce uncertainty and risk in design.
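
The graph-theoretic propagation step can be sketched in the spirit of the CPM: direct change likelihoods sit on the edges of a dependency network, and the combined likelihood that a change reaches another element accumulates over simple paths. Treating the paths as independent, and the toy three-element network itself, are simplifying assumptions for illustration.

```python
def simple_paths(graph, src, dst, path=None):
    """Enumerate simple paths in a directed dependency network given
    as {node: {neighbor: direct change likelihood}}."""
    path = path or [src]
    if src == dst:
        yield path
        return
    for nxt in graph.get(src, {}):
        if nxt not in path:
            yield from simple_paths(graph, nxt, dst, path + [nxt])

def combined_likelihood(graph, src, dst):
    """CPM-style combined likelihood that a change in `src` reaches
    `dst`, treating each simple path as an independent route."""
    no_propagation = 1.0
    for path in simple_paths(graph, src, dst):
        p = 1.0
        for a, b in zip(path, path[1:]):
            p *= graph[a][b]
        no_propagation *= 1.0 - p
    return 1.0 - no_propagation

# toy network: function F drives behavior B, and both drive structure S
network = {"F": {"B": 0.5, "S": 0.5}, "B": {"S": 0.5}}
risk = combined_likelihood(network, "F", "S")
```

Extending the node set to structural, behavioral, and functional elements at once is what turns this single-domain spread into the multidomain FBS linkage picture.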

J. Mech. Des. 2012;134(10):100906-100906-10. doi:10.1115/1.4007587.

System complexity is considered a key reason why current system design practices at times fail to recognize performance, cost, and schedule risks as they emerge. We present here a definition of system complexity and a quantitative metric for measuring that complexity based on information theory. We also derive sensitivity indices that indicate the fraction of complexity that can be reduced if certain factors of a system become better known. This information can be used as part of a resource allocation procedure aimed at reducing system complexity. Our methods incorporate Gaussian process emulators of expensive computer simulation models and account for both model inadequacy and code uncertainty. We demonstrate our methodology on a candidate design of an infantry fighting vehicle.
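
An information-theoretic complexity metric and its sensitivity indices can be illustrated on a toy discrete system: the entropy of the output distribution stands in for complexity, and the fraction of entropy removed by learning a factor stands in for that factor's sensitivity index. This is an illustrative analogue, not the paper's Gaussian-process formulation.

```python
import math
from collections import Counter

def entropy_bits(samples):
    """Shannon entropy (bits) of an empirical distribution."""
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in Counter(samples).values())

def sensitivity_index(outputs, factor):
    """Fraction of output entropy removed by learning `factor`:
    a normalized mutual information, standing in for the paper's
    complexity-reduction sensitivity indices."""
    h = entropy_bits(outputs)
    pairs = list(zip(factor, outputs))
    h_cond = 0.0
    for level in set(factor):
        group = [y for f, y in pairs if f == level]
        h_cond += len(group) / len(outputs) * entropy_bits(group)
    return (h - h_cond) / h

# toy system: the output is fully determined by factor a, not by factor b
a = [0, 0, 1, 1]
b = [0, 1, 0, 1]
y = [0, 0, 1, 1]
s_a = sensitivity_index(y, a)  # learning a removes all output entropy
s_b = sensitivity_index(y, b)  # learning b removes none
```

A resource-allocation procedure would then direct effort toward the factors with the largest indices, since learning them buys the biggest complexity reduction.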

J. Mech. Des. 2012;134(10):100907-100907-14. doi:10.1115/1.4007448.

The dynamic nature of today’s technology market requires new value-characteristic modeling methods; mainstream methods have limitations due to unrealistic assumptions, such as static customer preferences and the absence of multicollinearity among product attributes. In particular, products with longer cycle times can suffer because a static model ignores changes in the market during the concept-to-customer lead time. This study proposes a dynamic, partial least squares path model for customer-driven product design and development, reducing model uncertainty by formulating preference models that reflect market dynamics. The proposed dynamic model adopts partial least squares regression to handle the limited observations plagued by multicollinearity among product attributes. The main advantage of the proposed model is its ability to evaluate design alternatives during the front-end concept screening phase, using the overall product-value metric of customer-revealed value. A case study analyzing US car market data for sedans from 1990 to 2010 showed the potential of the proposed method to be effective, with a mean absolute percentage error of 3.40%.


Special section: Methods for Uncertainty Characterization in Existing Models Through Uncertainty Quantification or Calibration

J. Mech. Des. 2012;134(10):100908-100908-12. doi:10.1115/1.4007390.

To use predictive models in engineering design of physical systems, one should first quantify the model uncertainty via model updating techniques employing both simulation and experimental data. While calibration is often used to tune unknown calibration parameters of a computer model, the addition of a discrepancy function has been used to capture model discrepancy due to underlying missing physics, numerical approximations, and other inaccuracies of the computer model that would exist even if all calibration parameters are known. One of the main challenges in model updating is the difficulty in distinguishing between the effects of calibration parameters versus model discrepancy. We illustrate this identifiability problem with several examples, explain the mechanisms behind it, and attempt to shed light on when a system may or may not be identifiable. In some instances, identifiability is achievable under mild assumptions, whereas in other instances, it is virtually impossible. In a companion paper, we demonstrate that using multiple responses, each of which depends on a common set of calibration parameters, can substantially enhance identifiability.
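
The identifiability problem can be seen in miniature: with a single response, very different combinations of calibration parameter and discrepancy function can reproduce the experimental data exactly, so the data alone cannot separate them. The toy simulator and data below are hypothetical.

```python
def simulator(x, theta):
    """Toy computer model with one calibration parameter: y_sim = theta * x."""
    return theta * x

xs = [0.0, 0.5, 1.0, 1.5]
experiments = [2.0 * x for x in xs]  # the 'true' process happens to be y = 2x

# two different (theta, discrepancy) pairs reproduce the data exactly,
# so this single response cannot identify theta
pred_a = [simulator(x, 2.0) + 0.0 * x for x in xs]  # theta = 2.0, delta(x) = 0
pred_b = [simulator(x, 0.5) + 1.5 * x for x in xs]  # theta = 0.5, delta(x) = 1.5x
```

Any prior-free fitting procedure would rate both pairs equally well, which is exactly the confounding between calibration parameters and model discrepancy that the paper analyzes.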

J. Mech. Des. 2012;134(10):100909-100909-9. doi:10.1115/1.4007573.

In physics-based engineering modeling, the two primary sources of model uncertainty, which account for the differences between computer models and physical experiments, are parameter uncertainty and model discrepancy. Distinguishing the effects of the two sources of uncertainty can be challenging. For situations in which identifiability cannot be achieved using only a single response, we propose to improve identifiability by using multiple responses that share a mutual dependence on a common set of calibration parameters. To that end, we extend the single response modular Bayesian approach for calculating posterior distributions of the calibration parameters and the discrepancy function to multiple responses. Using an engineering example, we demonstrate that including multiple responses can improve identifiability (as measured by posterior standard deviations) by an amount that ranges from minimal to substantial, depending on the characteristics of the specific responses that are combined.

J. Mech. Des. 2012;134(10):100910-100910-8. doi:10.1115/1.4007572.

The design optimization process relies often on computational models for analysis or simulation. These models must be validated to quantify the expected accuracy of the obtained design solutions. It can be argued that validation of computational models in the entire design space is neither affordable nor required. In previous work, motivated by the fact that most numerical optimization algorithms generate a sequence of candidate designs, we proposed a new paradigm where design optimization and calibration-based model validation are performed concurrently in a sequence of variable-size local domains that are relatively small compared to the entire design space. A key element of this approach is how to account for variability in test data and model predictions in order to determine the size of the local domains at each stage of the sequential design optimization process. In this article, we discuss two alternative techniques for accomplishing this: parametric and nonparametric bootstrapping. The parametric bootstrapping assumes a Gaussian distribution for the error between test and model data and uses maximum likelihood estimation to calibrate the prediction model. The nonparametric bootstrapping does not rely on the Gaussian assumption, therefore providing a more general way to size the local domains for applications where distributional assumptions are difficult to verify, or not met at all. If distribution assumptions are met, parametric methods are preferable over nonparametric methods. We use a validation literature benchmark problem to demonstrate the application of the two techniques. Which technique to use depends on whether the Gaussian distribution assumption is appropriate based on available information.
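
The two bootstrapping variants can be sketched on the error between test and model data; here each produces a bootstrap distribution of the mean error, from which a measure of variability for sizing a local domain could be derived. The error data are invented, and the maximum-likelihood calibration step described in the abstract is omitted.

```python
import random
import statistics

def parametric_bootstrap(errors, n_boot=1000, seed=0):
    """Parametric bootstrap: fit Normal(mean, stdev) to the observed
    test-vs-model errors, then resample from the fitted distribution."""
    rng = random.Random(seed)
    mu, sd = statistics.mean(errors), statistics.stdev(errors)
    return [statistics.mean([rng.gauss(mu, sd) for _ in errors])
            for _ in range(n_boot)]

def nonparametric_bootstrap(errors, n_boot=1000, seed=0):
    """Nonparametric bootstrap: resample the observed errors with
    replacement; no distributional assumption."""
    rng = random.Random(seed)
    return [statistics.mean(rng.choices(errors, k=len(errors)))
            for _ in range(n_boot)]

errors = [0.8, -0.3, 1.1, 0.2, -0.6, 0.9, 0.4, -0.1]  # illustrative data
param_means = parametric_bootstrap(errors)
nonparam_means = nonparametric_bootstrap(errors)
```

When the Gaussian assumption holds, the two distributions agree and the parametric version is more efficient; when it does not, only the nonparametric version remains trustworthy, which mirrors the paper's guidance on choosing between them.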


Special section: Methods for Uncertainty Computing, Either Uncertainty Propagation or Optimization Under Uncertainty

J. Mech. Des. 2012;134(10):100911-100911-9. doi:10.1115/1.4007389.

System models help designers predict actual system output. Generally, variation in system inputs creates variation in system outputs. Designers often propagate variance through a system model by taking a derivative-based weighted sum of each input’s variance. This method is based on a Taylor-series expansion. Having an output mean and variance, designers typically assume the outputs are Gaussian. This paper demonstrates that outputs are rarely Gaussian for nonlinear functions, even with Gaussian inputs. This paper also presents a solution for system designers to more meaningfully describe the system output distribution. This solution consists of using equations derived from a second-order Taylor series that propagate skewness and kurtosis through a system model. If a second-order Taylor series is used to propagate variance, these higher-order statistics can also be propagated with minimal additional computational cost. These higher-order statistics allow the system designer to more accurately describe the distribution of possible outputs. The benefits of including higher-order statistics in error propagation are clearly illustrated in the example of a flat-rolling metalworking process used to manufacture metal plates.
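
The core point, that Gaussian inputs do not give Gaussian outputs under a nonlinear function, is easy to demonstrate. The sketch below uses one common second-order Taylor form for the propagated mean and variance of a scalar function of a single Gaussian input (the paper's treatment, including skewness and kurtosis propagation, is more general) and checks the output's skewness by Monte Carlo.

```python
import random
import statistics

def second_order_moments(f, df, d2f, mu, sigma):
    """Second-order Taylor propagation of mean and variance of
    y = f(x) for Gaussian input x ~ Normal(mu, sigma)."""
    mean = f(mu) + 0.5 * d2f(mu) * sigma ** 2
    var = df(mu) ** 2 * sigma ** 2 + 0.5 * d2f(mu) ** 2 * sigma ** 4
    return mean, var

# y = x**2 with x ~ N(0, 1): propagated mean 1 and variance 2 ...
mean, var = second_order_moments(lambda x: x * x, lambda x: 2 * x,
                                 lambda x: 2.0, 0.0, 1.0)

# ... but the output is strongly right-skewed, i.e. far from Gaussian
rng = random.Random(0)
ys = [rng.gauss(0.0, 1.0) ** 2 for _ in range(20_000)]
m = statistics.mean(ys)
m2 = statistics.mean([(y - m) ** 2 for y in ys])
m3 = statistics.mean([(y - m) ** 3 for y in ys])
skewness = m3 / m2 ** 1.5
```

A designer who reports only `mean` and `var` and assumes normality would badly misjudge the tails here, which is exactly why propagating the higher-order statistics matters.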

J. Mech. Des. 2012;134(10):100912-100912-10. doi:10.1115/1.4007391.

This paper proposes a novel second-order reliability method (SORM) using noncentral or general chi-squared distribution to improve the accuracy of reliability analysis in existing SORM. Conventional SORM contains three types of errors: (1) error due to approximating a general nonlinear limit state function by a quadratic function at most probable point in standard normal U-space, (2) error due to approximating the quadratic function in U-space by a parabolic surface, and (3) error due to calculation of the probability of failure after making the previous two approximations. The proposed method contains the first type of error only, which is essential to SORM and thus cannot be improved. However, the proposed method avoids the other two types of errors by describing the quadratic failure surface with the linear combination of noncentral chi-square variables and using the linear combination for the probability of failure estimation. Two approaches for the proposed SORM are suggested in the paper. The first approach directly calculates the probability of failure using numerical integration of the joint probability density function over the linear failure surface, and the second approach uses the cumulative distribution function of the linear failure surface for the calculation of the probability of failure. The proposed method is compared with first-order reliability method, conventional SORM, and Monte Carlo simulation results in terms of accuracy. Since it contains fewer approximations, the proposed method shows more accurate reliability analysis results than existing SORM without sacrificing efficiency.
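
The second and third error types are easy to see numerically: the first-order estimate Φ(−β) is exact for a linear limit state in standard normal space but biased once the failure surface is curved. The sketch below compares Φ(−β) against crude Monte Carlo for a hypothetical curved limit state; it illustrates the error source, not the proposed chi-squared method itself.

```python
import random
from statistics import NormalDist

def form_pf(beta):
    """First-order estimate P_f ~ Phi(-beta): exact only when the
    limit state is linear in standard normal space."""
    return NormalDist().cdf(-beta)

def mc_pf(g, n=100_000, seed=0):
    """Crude Monte Carlo estimate of P[g(U1, U2) <= 0] for
    independent standard normal U1, U2."""
    rng = random.Random(seed)
    fails = sum(1 for _ in range(n)
                if g(rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)) <= 0.0)
    return fails / n

beta = 2.0

def g_curved(u1, u2):
    # hypothetical failure surface that curves away from the origin,
    # so the true failure probability is below Phi(-beta)
    return beta - u1 + 0.3 * u2 ** 2

pf_form = form_pf(beta)   # ignores the curvature
pf_mc = mc_pf(g_curved)   # reference estimate
```

Methods like the proposed SORM aim to close this gap by representing the quadratic surface exactly, through a linear combination of noncentral chi-square variables, rather than by a half-space or parabolic surrogate.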

J. Mech. Des. 2012;134(10):100913-100913-13. doi:10.1115/1.4007392.

Uncertainty plays a critical role in engineering design as even a small amount of uncertainty could make an optimal design solution infeasible. The goal of robust optimization is to find a solution that is both optimal and insensitive to uncertainty that may exist in parameters and design variables. In this paper, a novel approach, sequential quadratic programming for robust optimization (SQP-RO), is proposed to solve single-objective continuous nonlinear optimization problems with interval uncertainty in parameters and design variables. This new SQP-RO is developed based on a classic SQP procedure with additional calculations for constraints on objective robustness, feasibility robustness, or both. The obtained solution is locally optimal and robust. Eight numerical and engineering examples with different levels of complexity are utilized to demonstrate the applicability and efficiency of the proposed SQP-RO with the comparison to its deterministic SQP counterpart and RO approaches using genetic algorithms. The objective and/or feasibility robustness are verified via Monte Carlo simulations.
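
The feasibility-robustness idea under interval uncertainty reduces to a worst-case check on each constraint. A minimal sketch, assuming the constraint is monotone in each uncertain parameter so the interval vertices suffice (the full SQP-RO embeds such robustness conditions inside an SQP iteration):

```python
from itertools import product

def worst_case(g, x, intervals):
    """Worst-case (max) value of constraint g(x, p) over interval
    uncertainty in p, checked at the interval vertices -- exact when
    g is monotone in each uncertain parameter (assumed here)."""
    return max(g(x, p) for p in product(*intervals))

def is_robust_feasible(g, x, intervals):
    """Feasibility robustness: g(x, p) <= 0 must hold for every
    realization of p in its intervals."""
    return worst_case(g, x, intervals) <= 0.0

# hypothetical constraint g(x, p) = p0*x + p1 - 5, p0 in [1, 2], p1 in [0, 1]
g = lambda x, p: p[0] * x + p[1] - 5.0
uncertainty = [(1.0, 2.0), (0.0, 1.0)]
```

A design that passes `is_robust_feasible` stays feasible for every realization of the uncertain parameters, which is the feasibility-robustness property the Monte Carlo verification in the paper checks.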

J. Mech. Des. 2012;134(10):100914-100914-8. doi:10.1115/1.4007393.

Evidence theory is one of the approaches designed specifically for dealing with epistemic uncertainty. This type of uncertainty modeling is often useful at preliminary design stages where the uncertainty related to lack of knowledge is the highest. While multiple approaches for propagating epistemic uncertainty through one-dimensional functions have been proposed, propagation through functions having a multidimensional output that need to be considered at once has received less attention. Such propagation is particularly important when the multiple function outputs are not independent, which frequently occurs in real world problems. The present paper proposes an approach for calculating belief and plausibility measures by uncertainty propagation through functions with multidimensional, nonindependent output by formulating the problem as one-dimensional optimization problems in spite of the multidimensionality of the output. A general formulation is first presented, followed by two special cases where the multidimensional function is convex and where it is linear over each focal element. An analytical example first illustrates the importance of considering all the function outputs at once when these are not independent. An application example on the preliminary design of a propeller aircraft then illustrates the proposed algorithm for a convex function. An approximate solution, found to be almost identical to the exact solution, is also obtained for this problem by linearizing the previous convex function over each focal element.
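
The belief/plausibility bookkeeping over focal elements can be sketched for the simple case where the output range over each focal element comes from its vertices (exact only for functions monotone in each input; the paper handles the harder convex and general cases via one-dimensional optimization). The model and body of evidence below are hypothetical.

```python
from itertools import product

def output_range(f, box):
    """Min/max of f over a focal element (a box), taken over its
    vertices -- exact only when f is monotone in each input."""
    values = [f(*v) for v in product(*box)]
    return min(values), max(values)

def belief_plausibility(f, focal_elements, threshold):
    """Belief and plausibility of the event {f <= threshold} from
    interval-valued focal elements given as [(box, mass), ...]."""
    bel = pl = 0.0
    for box, mass in focal_elements:
        lo, hi = output_range(f, box)
        if hi <= threshold:   # focal element lies entirely inside the event
            bel += mass
        if lo <= threshold:   # focal element intersects the event
            pl += mass
    return bel, pl

# hypothetical two-input model and body of evidence
f = lambda u, v: u + v
focal = [(((0.0, 1.0), (0.0, 1.0)), 0.6),
         (((1.0, 2.0), (1.0, 2.0)), 0.4)]
bel, pl = belief_plausibility(f, focal, threshold=2.0)
```

Belief and plausibility bracket the unknown probability of the event; the gap between them reflects the epistemic uncertainty carried by the evidence structure.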

