Research Papers

Applied Tests of Design Skills—Part II: Visual Thinking

[+] Author and Article Information
Jami J. Shah

Mechanical and Aerospace Engineering,
Arizona State University,
Tempe, AZ 85258

Jay Woodward

Department of Educational Psychology,
Texas A&M University,
College Station, TX 77843

Steven M. Smith

Department of Psychology,
Texas A&M University,
College Station, TX 77843

Contributed by the Design Education Committee of ASME for publication in the Journal of Mechanical Design. Manuscript received May 9, 2012; final manuscript received April 10, 2013; published online xx xx, xxxx. Assoc. Editor: Janis Terpenny.

J. Mech. Des 135(7), 071004 (May 24, 2013) (11 pages) Paper No: MD-12-1246; doi: 10.1115/1.4024228 History: Received May 09, 2012; Revised April 10, 2013

A number of cognitive skills relevant to conceptual design have been previously identified: divergent thinking, visual thinking, spatial reasoning, qualitative reasoning, and problem formulation. A battery of standardized tests has been developed for each of these skills. This is the second paper in a series on testing individual skill-level differences in engineers and engineering students. In the first paper, we reported on the theoretical and empirical basis for the divergent thinking test, as well as on test formulation, data collection, norming studies, and statistical validation of that test. This paper focuses similarly on efforts related to visual thinking and spatial reasoning in an engineering context. We have decomposed visual thinking into six categories: visual comprehension (including perceptual speed), visual memory (the visual memory system), visual synthesis, mental image manipulation/transformation, spatial reasoning, and graphical expression/elaboration. We discuss the theoretical basis of a comprehensive test for engineers, test composition, trial runs, and computation of reliability measures. The alpha version was given to a small set of subjects to determine the clarity of the questions and to gauge difficulty level. The beta version was used for norming and test validation with over 500 samples that included engineering students and a smaller number of practicing engineers. Construct validity was achieved by basing the construction of our instrument on other well-known measures of visual thinking, while content validity was assured by thoroughly sampling the domain of visual thinking and including a variety of items both pertinent and specific to the engineering design process. The factor analysis suggests two eigenvalues above 1.0, an indication that the instrument is stable and accurate.
We emphasize that performance on these tests depends not only on native ability but also on education and experience; design skills are teachable and learnable.
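The factor-analysis claim above rests on the common "eigenvalue greater than one" (Kaiser) criterion: factors are retained when the corresponding eigenvalues of the item correlation matrix exceed 1.0. A minimal sketch of that computation is shown below, using NumPy and purely synthetic scores — the data, item loadings, and sample size are illustrative assumptions, not the study's data.

```python
import numpy as np

# Kaiser criterion sketch: count eigenvalues of the item correlation
# matrix that exceed 1.0. All data here are synthetic (hypothetical
# loadings and subjects), NOT the scores collected in the study.
rng = np.random.default_rng(0)
latent = rng.normal(size=(100, 2))          # two hypothetical abilities
loadings = np.array([[0.90, 0.00],          # items 1-2 load on ability 1
                     [0.85, 0.00],
                     [0.00, 0.90],          # items 3-4 load on ability 2
                     [0.00, 0.85]])
scores = latent @ loadings.T + 0.3 * rng.normal(size=(100, 4))

corr = np.corrcoef(scores, rowvar=False)    # item correlation matrix
eigenvalues = np.sort(np.linalg.eigvalsh(corr))[::-1]
n_factors = int(np.sum(eigenvalues > 1.0))
print(n_factors)                            # two factors retained here
```

With two well-separated synthetic abilities, exactly two eigenvalues exceed 1.0, mirroring the pattern the abstract reports for the visual thinking instrument.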

Copyright © 2013 by ASME

Figures

Fig. 1

Illustrative examples for M4 and M5

Fig. 2

Example scoring basis and weights (Module M2)

Fig. 3

Example responses and VC scoring for Module 9

Fig. 4

Score ranges and norms
