Research Papers: Design Theory and Methodology

Applied Tests of Design Skills—Part III: Abstract Reasoning

Author and Article Information
Maryam Khorshidi, Jami J. Shah

Mechanical & Aerospace Engineering,
Arizona State University,
Tempe, AZ 85287

Jay Woodward

Department of Educational Psychology,
Texas A&M University,
College Station, TX 77843

In most applications of deductive or inductive reasoning in logic or mathematics, numeric data are required; in early design, however, numeric data are usually available for only a few variables. Because their application in early design takes a largely qualitative form, this paper uses the terms qualitative deductive reasoning and qualitative inductive reasoning in place of the conventional deductive and inductive reasoning.

The first domain is generally known as the source and the second one as the target.

Test problems were designed so that no irrelevant domain knowledge is required. The alpha tests and the protocol study were then used to re-examine the problems for any unpredicted correlations with irrelevant domain knowledge.

This criterion serves as a measure of the quality of an analogy. An example may clarify it: ravens are very advanced creatures in using tools (e.g., using stones to break nuts). In that sense, an analogy between ravens' tool-using skills and early humans' is more valid than an analogy between humans' tool-using skills and apes', even though apes and human beings are very similar in appearance, body structure, etc. (underlying structure vs. superficial form and appearance).

Semantic distance measures how close or far the source and target of an analogy are. Various methods have been proposed for quantifying it; an example can be found in Ref. [36].

Semantic distance can be considered a measure of the novelty of an analogy: the farther apart the two domains are, the harder it is to come up with the analogy. For instance, in the analogy proposed above, apes and humans (but not ravens) belong to the same class of animals (mammals); even though it is easier to think of apes in an analogy with humans than ravens, that analogy does not transfer the point about advanced tool use and is therefore not valid.
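Semantic distance can be operationalized in many ways; Ref. [36] gives one method. The sketch below is a hypothetical illustration only (not the method of Ref. [36]): it assumes each concept is encoded as a hand-picked feature vector and measures distance as cosine distance, reproducing the raven/ape example above.

```python
from math import sqrt

def cosine_distance(u, v):
    """1 - cosine similarity; 0 means identical direction, 1 means orthogonal."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    return 1.0 - dot / norm

# Hypothetical feature vectors (mammal, bird, uses_tools, social) -- toy values,
# chosen only to illustrate the idea, not taken from the paper.
human = [1, 0, 1, 1]
ape   = [1, 0, 1, 1]
raven = [0, 1, 1, 1]

d_ape = cosine_distance(human, ape)      # small distance: superficially similar
d_raven = cosine_distance(human, raven)  # larger distance: more novel analogy
```

Under this toy encoding the raven-human analogy spans a larger semantic distance than the ape-human one, which is the sense in which it is the more novel analogy, while validity still depends on whether the structural point (tool use) transfers.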

As defined by the National Association of the Directors of Educational Research (T. B. Rogers, 1995, p. 25), test validity is how well a test measures what it purports to measure [60].

Reliability of a test, on the other hand, indicates the trustworthiness of its results and is based on the consistency of the test and the precision of the measurement performed by its problems.

Part I of this paper was published in J. Mech. Des., 134(2), 021005 (Feb. 03, 2012).

Part II of this paper was published in J. Mech. Des., 135(7), 071004 (May 24, 2013).

Contributed by the Design Theory and Methodology Committee of ASME for publication in the JOURNAL OF MECHANICAL DESIGN. Manuscript received October 23, 2013; final manuscript received June 24, 2014; published online July 31, 2014. Assoc. Editor: Janis Terpenny.

J. Mech. Des., 136(10), 101101 (Jul. 31, 2014) (11 pages) Paper No: MD-13-1479; doi: 10.1115/1.4027986 History: Received October 23, 2013; Revised June 24, 2014

Past studies have identified the following cognitive skills relevant to conceptual design: divergent thinking, spatial reasoning, visual thinking, abstract reasoning, and problem formulation (PF). Standardized tests are being developed to assess these skills. The tests of divergent thinking and visual thinking are fully developed and validated; this paper focuses on the development of a test of abstract reasoning in the context of engineering design. Like the two previous papers, this paper reports on the theoretical and empirical basis for skill identification and test development. Cognitive studies of human problem solving and design thinking revealed four indicators of abstract reasoning: qualitative deductive reasoning (DR), qualitative inductive reasoning (IR), analogical reasoning (AnR), and abductive reasoning (AbR). Each of these is characterized in terms of measurable indicators. The paper presents test construction procedures, trial runs, data collection, norming studies, and test refinement. Initial versions of the test were given to approximately 250 subjects to determine the clarity of the test problems, the appropriate time allocation, and the difficulty level. A protocol study was also conducted to assess test content validity. The beta version was given to approximately 100 students, and the data collected were used for norming studies and test validation. Analysis of the test results suggested high internal consistency; factor analysis revealed four eigenvalues above 1.0, indicating that the test assesses four different subskills (as initially proposed by the four indicators). The composite Cronbach's alpha for all of the factors together was found to be 0.579. Future research will be conducted on criterion validity.
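The internal-consistency statistic cited above is Cronbach's alpha, computed from the per-item score variances and the variance of subjects' total scores. A minimal sketch of the standard formula follows, using toy scores (not the study's data) and population variances:

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for a list of subjects' item-score lists.

    scores[s][i] is subject s's score on item i. Standard formula:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores),
    where k is the number of items.
    """
    k = len(scores[0])  # number of items

    def variance(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [variance([s[i] for s in scores]) for i in range(k)]
    total_var = variance([sum(s) for s in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Toy data: 4 subjects x 3 items (hypothetical scores, not from the paper)
data = [
    [3, 3, 3],
    [2, 2, 1],
    [4, 4, 5],
    [1, 2, 1],
]
alpha = cronbach_alpha(data)
```

Values near 1.0 indicate that the items vary together (high internal consistency); the paper's composite value of 0.579 is moderate, which is consistent with the test measuring four distinct subskills rather than one.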

Copyright © 2014 by ASME

References

Wilde, G., 1983, “The Skills and Practices of Engineering Designers Now and in the Future,” Des. Stud., 4(1), pp. 21–34. [CrossRef]
Newstetter, W., and Khan, S., 1997, “A Developmental Approach to Assessing Design Skills and Knowledge,” 27th FIE Conference, Pittsburgh, PA, Nov. 5–8, Vol. 2.
Lewis, W., and Bonollo, E., 2002, “An Analysis of Professional Skills in Design: Implications for Education and Research,” Des. Stud., 23(4), pp. 385–406. [CrossRef]
Dym, C., Agogino, A., Eris, O., Frey, D., and Leifer, L., 2005, “Engineering Design Thinking, Teaching and Learning,” J. Eng. Educ., 94(1), pp. 103–120. [CrossRef]
Shah, J., 2005, “Identification, Measurement, and Development of Design Skills in Engineering Education,” International Conference on Engineering Design (ICED 05), Melbourne, Australia, Aug. 15–18.
Shah, J., Smith, S., and Woodward, J., 2009, “Development of Standardized Tests for Design Skills,” International Conference on Engineering Design (ICED 09), Stanford, CA, Aug. 24–27.
Shah, J., Millsap, R., Woodward, J., and Smith, S., 2012, “Applied Tests of Design Skills—Part 1: Divergent Thinking,” ASME J. Mech. Des., 134(2), p. 021005. [CrossRef]
Shah, J., Woodward, J., and Smith, S., 2013, “Applied Tests of Design Skills—Part II: Visual Thinking,” ASME J. Mech. Des., 135(7), p. 071004. [CrossRef]
Shah, J., Millsap, R., Woodward, J., and Smith, S., 2010, “Applied Tests of Design Skills: Divergent Thinking Data Analysis and Reliability Studies,” 2010 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, Quebec, Canada, Aug. 15–18.
Shah, J., Woodward, J., and Smith, S., 2011, “Applied Tests of Engineering Design Skills: Visual Thinking Characterization, Test Development and Validation,” Proceedings of the 18th International Conference on Engineering Design (ICED 11), Copenhagen, Denmark, Aug. 15–19.
Langer, S., 1953, Feeling and Form: A Theory of Art Developed From Philosophy in a New Key, Charles Scribner's Sons, New York, p. 90.
Khorshidi, M., Woodward, J., and Shah, J., 2012, “Towards a Comprehensive Test of Qualitative Reasoning Skill in Design,” 2012 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, Chicago, IL, Aug. 12–15.
Goel, A. K., 1997, “Design, Analogy, and Creativity,” IEEE Expert, 12(3), pp. 62–70. [CrossRef]
Schunn, C., and Klahr, D., 1995, “A 4-Space Model of Scientific Discovery,” Proceedings of the 17th Annual Conference of the Cognitive Science Society, Erlbaum, Hillsdale, NJ.
Chiu, M., 2003, “Design Moves in Situated Design With Case-Based Reasoning,” Des. Stud., 24(1), pp. 1–25. [CrossRef]
Jonassen, D., 2000, “Toward a Design Theory of Problem Solving,” Educ. Technol. Res. Dev., 48(4), pp. 63–85. [CrossRef]
Campbell, J., and Wolstencroft, J., 1990, “Structure and Significance of Analogical Reasoning,” Artif. Intell. Med., 2(2), pp. 103–118. [CrossRef]
Roozenburg, N., and Eekels, N., 1996, Product Design: Fundamentals and Methods, Wiley, Chichester, UK.
Torrens, D., and Thompson, V. A., 1999, “Individual Differences and the Belief Bias Effect: Mental Models, Logical Necessity, and Abstract Reasoning,” Thinking Reasoning, 5, pp. 1–28. [CrossRef]
Markovits, H., Doyon, C., and Simoneau, M., 2002, “Individual Differences in Working Memory and Conditional Reasoning With Concrete and Abstract Content,” Thinking Reasoning, 8(1), pp. 97–107. [CrossRef]
Staub, F., and Stern, E., 1997, “Abstract Reasoning With Mathematical Constructs,” Int. J. Educ. Res., 27(1), pp. 63–75. [CrossRef]
Johnson-Laird, P., 1999, “Deductive Reasoning,” Annu. Rev. Psychol., 50, pp. 109–135. [CrossRef] [PubMed]
Oaksford, M., and Hahn, U., 2007, “Induction, Deduction, and Argument Strength in Human Reasoning and Argumentation,” Inductive Reasoning: Experimental, Developmental, and Computational Approaches, A. Feeney and E. Heit, eds., Cambridge University Press, Cambridge, UK, pp. 269–327.
Holland, J., Holyoak, K., Nisbett, R., and Thagard, P., 1986, Induction: Process of Inference, Learning, and Discovery, MIT, Cambridge, MA.
Klauer, K., 1989, “Teaching for Analogical Transfer as a Means of Improving Problem-Solving, Thinking and Learning,” Instructional Sci., 18, pp. 179–192. [CrossRef]
de Koning, E., Sijtsma, K., and Hamers, J., 2003, “Construction and Validation of a Test for Inductive Reasoning,” Eur. J. Psychol. Assess., 19(1), pp. 24–39. [CrossRef]
Throne, S., 2000, “Data Analysis in Qualitative Research,” Evidence-Based Nurs., 3(3), pp. 68–70. [CrossRef]
Arthur, W. B., 1994, “Inductive Reasoning and Bounded Rationality,” Am. Econ. Rev., 84(2), pp. 406–411. [CrossRef]
Nisbett, R., Krantz, D., Jepson, C., and Kunda, Z., 1983, “The Use of Statistical Heuristics in Everyday Inductive Reasoning,” Psychol. Rev., 90(4), pp. 339–363. [CrossRef]
Klix, F., 2001, “Problem Solving: Deduction, Induction, and Analogical Reasoning,” Int. Encycl. Social Behav. Sci., pp. 12123–12130. [CrossRef]
Kedar-Cabelli, S. T., 1985, “Analogy-from a Unified Perspective,” Department of Computer Science Laboratory for Computer Science Research, New Brunswick, NJ, Technical Report No. ML-TR-3.
Findler, N., 1981, “Analogical Reasoning in Design Process,” Des. Stud., 2(1), pp. 45–51. [CrossRef]
Fu, K., Chan, J., Cagan, J., Kotovsky, K., Schunn, C., and Wood, K., 2012, “The Meaning of Near and Far: The Impact of Structuring Design Database and the Effect of Distance of Analogy on Design Output,” ASME J. Mech. Des., 135(2), p. 021007. [CrossRef]
Casakin, H., and Goldschmidt, G., 1999, “Expertise and the Use of Visual Analogy: Implications for Design Education,” Des. Stud., 20(2), pp. 153–175. [CrossRef]
Visser, W., 1996, “Two Functions of Analogical Reasoning in Design: A Cognitive-Psychology Approach,” Des. Stud., 17(4), pp. 417–434. [CrossRef]
Peirce, C., 1931–1958, Collected Papers of Charles Sanders Peirce, C. Hartshorne and P. Weiss, eds., Harvard University Press, Cambridge, MA.
Charniak, E., and McDermott, P., 1985, Introduction to Artificial Intelligence, Addison Wesley, Menlo Park, CA.
Lawson, A., 2000, “Reasoning: the Generality of Hypothetico-Deductive Reasoning: Making Scientific Thinking Explicit,” Am. Biol. Teach., 62(7), pp. 482–495. [CrossRef]
Aliseda, A., 2003, “Mathematical Reasoning vs. Abductive Reasoning: A Structural Approach,” Synthese, 134(1–2), pp. 25–44. [CrossRef]
Ng, H., and Mooney, R., 1990, “On the Role of Coherence in Abductive Explanation,” Proceedings of the 8th National Conference on Artificial Intelligence, Boston, MA, Jul. 29–Aug. 3, pp. 337–342.
Stern, W., 1912, Psychologische Methoden der Intelligenz-Prüfung, T.Barth, ed., Barth, Leipzig, Germany.
Raven, J., Raven, J., and Court, J., 2003, Manual for Raven's Progressive Matrices and Vocabulary Scales, Harcourt Assessment, San Antonio, TX.
Watson, G., and Glaser, E. M., 1980, Critical Thinking Appraisal, Psych Corp, Cleveland, OH.
Hicks, R., and Southey, G., 1990, “The Watson-Glaser Critical Thinking Appraisal and Performance of Business Management Students,” Psychcol. Test Bull., 3, pp. 74–81.
Torrance, E., 1966, Torrance Tests of Creative Thinking, Personnel Press, Princeton, NJ.
Kendal, I., and Izard, J., 1988, “ACER Mechanical Reasoning Test (Revised),” Psych. Bull., 1.
Arasteh, A., and Arasteh, J., 1976, Creativity in Human Development: An Interpretive and Annotated Bibliography, Schenkman Pub. Co., Cambridge, MA and New York.
Murray, R. E., 1979, “Variability in MAT Results Within the Field of Education,” Psychol. Rep., 45(2), pp. 665–666. [CrossRef]
Lawson, A., 1978, “Development and Validation of the Classroom Test of Formal Reasoning,” J. Res. Sci. Teach., 15(1), pp. 11–24. [CrossRef]
Klauer, K., and Phye, G., 2008, “Inductive Reasoning: A Training Approach,” Rev. Educ. Res., 78(1), pp. 85–123. [CrossRef]
Holyoak, K., and Koh, K., 1987, “Surface and Structural Similarity in Analogical Transfer,” Memory Cognit., 15(4), pp. 332–340. [CrossRef]
Keane, M., 1987, “On Retrieving Analogues When Solving Problems,” Q. J. Exp. Psychol. A, 39(1), pp. 29–41. [CrossRef]
Gick, M., and Holyoak, K., 1980, “Analogical Problem Solving,” Cognit. Psychol., 12(3), pp. 306–355. [CrossRef]
Bearman, C., Ball, L., and Ormerod, T., 2002, “An Exploration of Real-World Analogical Problem Solving in Novices,” Proceedings of the 24th Conference of the Cognitive Science Society, Fairfax, VA, Aug. 7–10.
Urbina, S., 2004, Essentials of Psychological Testing, Wiley, Hoboken, NJ.
Kaplan, R., and Saccuzzo, D., 2012, Psychological Testing, Principles, Applications, and Issues, 8th ed., Jon-David Hague Publishing, Independence, KY.
Anastasi, A., and Urbina, S., 1997, Psychological Testing, 7th ed., Prentice Hall, Upper Saddle River, NJ, p. 113.
Khorshidi, M., Shah, J., and Woodward, J., 2013, “Rethinking the Comprehensive Test on Qualitative Reasoning for Designers,” ASME Paper No. DETC2013-12403. [CrossRef]
Horn, J. L., and Cattell, R. B., 1966, “Refinement and Test of the Theory of Fluid and Crystallized General Intelligences,” J. Educ. Psychol., 57(5), pp. 253–270. [CrossRef] [PubMed]
Wilde, D. J., 2008, Teamology: The Construction and Organization of Effective Teams, Springer, Berlin, Heidelberg.
