Research Papers

Multi-Objective Ease-Off Optimization of Hypoid Gears for Their Efficiency, Noise, and Durability Performances

Author and Article Information
Alessio Artoni¹

Dipartimento di Ingegneria Meccanica, Nucleare e della Produzione, University of Pisa, Largo Lucio Lazzarino 2, 56122 Pisa, Italy; alessio.artoni@ing.unipi.it

Marco Gabiccini

Gear and Power Transmission Research Laboratory, The Ohio State University, 201 West 19th Avenue, Columbus, OH 43210; m.gabiccini@ing.unipi.it

Massimo Guiggiani

Gear and Power Transmission Research Laboratory, The Ohio State University, 201 West 19th Avenue, Columbus, OH 43210; guiggiani@ing.unipi.it

Ahmet Kahraman

Gear and Power Transmission Research Laboratory, The Ohio State University, 201 West 19th Avenue, Columbus, OH 43210; kahraman.1@osu.edu

¹Corresponding author.

J. Mech. Des. 133(12), 121007 (Dec 09, 2011) (9 pages); doi:10.1115/1.4005234. History: Received June 04, 2011; Revised September 29, 2011; Published December 09, 2011; Online December 09, 2011.

Abstract

Microgeometry optimization has become an important phase of gear design that can remarkably enhance gear performance. For spiral bevel and hypoid gears, microgeometry is typically represented by ease-off topography. The optimal ease-off shape can be defined as the outcome of a process in which generally conflicting objective functions are simultaneously minimized (or maximized) in the presence of constraints, which naturally lends itself to being framed as a multi-objective optimization problem. This paper proposes a general algorithmic framework for ease-off multi-objective optimization, with special attention given to computational efficiency, and details its implementation in full. A simulation model for loaded tooth contact analysis is assumed to be available. The proposed method is demonstrated on a face-hobbed hypoid gear set. Three objectives are defined: maximization of gear mesh mechanical efficiency, minimization of loaded transmission error, and minimization of maximum contact pressure. Bound constraints are imposed on the design variables, as well as a nonlinear constraint aimed at keeping the loaded contact pattern inside a predefined allowable contact region. The results show that the proposed method can obtain optimal ease-off topographies that significantly improve the performance of the basic design. The method is also general enough to handle geometry optimization of any gear type.
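
The abstract describes the optimization setup only at a high level. The sketch below is an illustrative reconstruction, not the authors' code: it scalarizes the three objectives (negated mesh efficiency, loaded transmission error, maximum contact pressure) with a reference-point-type achievement function, adds a penalty for the allowable-contact-region constraint, and minimizes the result over bound-constrained ease-off design variables with SciPy's DIRECT implementation. The run_ltca function, the reference point, the weights, and the bounds are all hypothetical placeholders.

```python
# Illustrative sketch only (not the authors' implementation): scalarize the three
# ease-off objectives with a reference-point (achievement function) approach and
# search the bound-constrained design space with SciPy's DIRECT implementation.
# Requires SciPy >= 1.9 for scipy.optimize.direct.
import numpy as np
from scipy.optimize import Bounds, direct


def run_ltca(x):
    """Hypothetical stand-in for the loaded tooth contact analysis (LTCA) model
    assumed to be available in the paper. It maps ease-off design variables to
    (mesh efficiency, loaded transmission error, max contact pressure,
    contact-pattern violation). Replace with a real simulation model."""
    x = np.asarray(x)
    efficiency = 0.985 - 1e-3 * np.sum(x**2)               # fraction, to be maximized
    loaded_te = 5.0 + 2.0 * abs(np.sum(x))                 # e.g. microrad, to be minimized
    max_pressure = 1500.0 + 100.0 * np.linalg.norm(x)      # e.g. MPa, to be minimized
    pattern_violation = max(0.0, np.sum(np.abs(x)) - 1.0)  # > 0 if the pattern leaves the region
    return efficiency, loaded_te, max_pressure, pattern_violation


# Illustrative reference point (aspiration levels) and scaling weights; in practice
# these would come from the basic design and the designer's preferences.
Z_REF = np.array([-0.99, 3.0, 1200.0])   # efficiency is negated so all terms are minimized
WEIGHTS = np.array([100.0, 0.2, 1e-3])
RHO = 1e-4                               # small augmentation coefficient
PENALTY = 1e3                            # weight of the allowable-contact-region penalty


def achievement(x):
    """Augmented (Wierzbicki-type) achievement scalarizing function plus a
    penalty enforcing the allowable-contact-region constraint."""
    eff, te, pmax, viol = run_ltca(x)
    f = np.array([-eff, te, pmax])       # cast all three objectives as minimizations
    d = WEIGHTS * (f - Z_REF)
    return np.max(d) + RHO * np.sum(d) + PENALTY * viol


# Bound constraints on (here) five ease-off design variables.
bounds = Bounds(lb=-0.05 * np.ones(5), ub=0.05 * np.ones(5))

result = direct(achievement, bounds, maxfun=2000)
print("best ease-off coefficients:", result.x)
print("scalarized objective value:", result.fun)
```

A practical appeal of the reference-point formulation illustrated in Figures 2 and 4 is that the designer can steer the search toward different Pareto-optimal trade-offs simply by moving the reference point, without re-tuning a weighted sum of the objectives.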

Copyright © 2011 by American Society of Mechanical Engineers


Figures

Figure 1: Example of a multi-objective optimization problem and its Pareto-optimal set

Figure 2: Achievement function approach on a nonconvex problem with (a) an infeasible reference point and (b) a feasible one (a common form of the achievement scalarizing function is sketched after this list)

Figure 3: Graphical representation of three iterations of DIRECT, adapted from Ref. [23]

Figure 4: Two iterations of the reference point method on a problem with two objectives

Figure 5: Example of (a) a standard ease-off representation and (b) its corresponding contour plot (drawn on the gear-based PCA)

Figure 6: Pseudocode of the proposed algorithm for ease-off MOO

Figure 7: Basic design: (a) loaded contact pattern and (b) ease-off surface

Figure 8: Results of Test 1 (five design variables)

Figure 9: Results of Test 2 (nine design variables)

Figure 10: Results of Test 3 (fourteen design variables)
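
As a reader's aid (not reproduced from the paper), the achievement function referenced in the captions of Figures 2 and 4 is typically the Wierzbicki-type achievement scalarizing function; assuming all objectives f_i are posed as minimizations, a common augmented form is

```latex
\min_{\mathbf{x}\in S}\;\left[\max_{i=1,\dots,k} w_i\bigl(f_i(\mathbf{x})-\bar{z}_i\bigr)
\;+\;\rho\sum_{i=1}^{k} w_i\bigl(f_i(\mathbf{x})-\bar{z}_i\bigr)\right]
```

where \bar{z} is the designer-supplied reference (aspiration) point, the w_i > 0 are scaling weights, and \rho is a small positive augmentation coefficient. Minimizing this function projects the reference point onto the Pareto-optimal set, whether the reference point is infeasible or feasible, consistent with the behavior the captions of Figures 2 and 4 describe.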
