Assessing Student Team Performance in Industry Sponsored Design Projects

Author and Article Information
M. Keefe

Department of Mechanical Engineering, University of Delaware, Newark, DE 19716; keefe@udel.edu

J. Glancey

Department of Mechanical Engineering, University of Delaware, Newark, DE 19716; jglancey@udel.edu

N. Cloud

Department of Mechanical Engineering, University of Delaware, Newark, DE 19716; cloudn@udel.edu

J. Mech. Des. 129(7), 692-700 (Feb 05, 2007) (9 pages) doi:10.1115/1.2722791 History: Received November 14, 2006; Revised February 05, 2007

Although cooperative learning in a team setting is a common approach for integrating problem-based learning into undergraduate science and engineering, standard assessment tools do not exist to evaluate learning outcomes. As a result, novel techniques need to be developed to assess learning in team-based design projects. This paper describes the experiences and lessons learned in assessing student performance in team-based project courses culminating in a senior capstone experience that integrates industry-sponsored design projects. A set of rubrics linked to the instructional objectives was developed to define and communicate expectations during each of three project phases. Rubrics for each phase incorporate three fundamental areas of team performance assessment: (i) synthesis of a valid concept; (ii) management of resources; and (iii) interpersonal interaction and communication. At the end of each phase, both the faculty and the industry sponsor use the same rubric to assess student team performance. An analysis of variance (ANOVA) of the assessment data collected over the last 5 years indicated that student performance, measured by faculty grades and industry sponsor evaluations, was not significantly affected by the faculty advisor, project type, or sponsoring company size. These results are attributed primarily to the faculty focusing more on assessing student performance in executing the design process and less on the actual project results. The analysis also revealed that faculty assessments of student performance did not correlate well with industry sponsor assessments. To address this, a revised set of evaluation rubrics was developed and is currently being used to better articulate expectations from both faculty and industrial sponsor perspectives.
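The faculty-versus-sponsor comparison in the abstract rests on a per-project correlation of team grades against sponsor ratings (the paper reports r = 0.21 over 49 projects). As a minimal sketch of that kind of check, the snippet below computes a Pearson correlation coefficient; the scores are invented placeholders for illustration, not data from the study.

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-team scores (NOT from the paper): faculty grades on a
# 0-100 scale, sponsor ratings on a 1-5 scale.
faculty_grades = [88, 92, 75, 81, 95, 70, 84]
sponsor_ratings = [4.1, 3.5, 4.0, 3.2, 4.4, 3.8, 3.0]

print(round(pearson_r(faculty_grades, sponsor_ratings), 2))
```

A value near zero, as in the study's r = 0.21, suggests the two assessor groups are rewarding different aspects of team performance, which is what motivated the revised rubrics.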

Copyright © 2007 by American Society of Mechanical Engineers
Topics: Design, Teams, Students



Figure 1

Senior design process and course structure

Figure 2

End of project sponsor feedback form

Figure 3

Phase 1 assessment rubric used by faculty and the industrial sponsor

Figure 4

Phase 2 assessment rubric used by faculty and the industrial sponsor

Figure 5

Phase 3 assessment rubric used by faculty and the industrial sponsor

Figure 6

Mapping of phase requirements to instructional objectives

Figure 7

Team grades versus sponsor rating for the 49 projects used in this study. The correlation coefficient is 0.21.

Figure 8

Revised rubric initiated in 2005 for evaluating team performance at the end of Phase 3. The text in italics attempts to articulate and clarify the industrial sponsors’ (i.e., customer) expectations.

Figure 9

Mapping of instructional objectives to ABET outcomes



