The Microplastic Strain Energy Criterion Applied to Fatigue

[+] Author and Article Information
A. Esin

Department of Mechanical Engineering, University College, London, England

J. Basic Eng 90(1), 28-36 (Mar 01, 1968) (9 pages) doi:10.1115/1.3605061 History: Received August 14, 1967; Online November 03, 2011


This paper describes research carried out to develop an analytical approach to fatigue failure in metals for lives in excess of 10^5 cycles, enabling the prediction of fatigue data to be carried out by computer. In the region of interest, where the metal is nominally elastic, a localized form of plastic flow was found and termed "microplasticity." As microplastic flow is a random and microstructure-sensitive property, it was studied by means of a statistical approach. The parameters needed to define the statistical functions were obtained from tensile tests and by measuring the changes in the a-c resistance under strain, which made it possible to differentiate between elastic and plastic strain and hence detect microplasticity. Using the experimentally defined parameters, a mathematical model for generating microplastic hysteresis loops was constructed. Fatigue damage was related to the plastic hysteresis energy dissipated per cycle, and final fracture was assumed to occur when the accumulated plastic strain energy reached the area under the true stress-true strain diagram. By applying the microplastic strain energy criterion, the fatigue lives of six different types of steel were successfully determined using a computer.
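The energy criterion described above can be illustrated with a minimal numerical sketch. All numerical values, the power-law hardening curve, and the assumed constant hysteresis energy per cycle are illustrative placeholders, not the paper's calibrated statistical model, which generated the hysteresis loops from measured microplastic parameters.

```python
import numpy as np

def toughness(true_stress, true_strain):
    """Trapezoidal area under the true stress-true strain curve (J/m^3),
    taken here as the critical accumulated plastic strain energy."""
    return 0.5 * np.sum((true_stress[1:] + true_stress[:-1])
                        * np.diff(true_strain))

def cycles_to_failure(delta_w_per_cycle, critical_energy):
    """Fracture is assumed when accumulated hysteresis energy reaches
    the critical value: N_f * dW = U_c, so N_f = U_c / dW."""
    return critical_energy / delta_w_per_cycle

# Illustrative true stress-strain data for a hypothetical steel
strain = np.linspace(0.0, 0.5, 200)
stress = 800e6 * strain**0.2        # assumed power-law hardening (Pa)
u_c = toughness(stress, strain)     # critical strain energy (J/m^3)

dw = 2.0e3   # assumed microplastic hysteresis energy per cycle (J/m^3)
n_f = cycles_to_failure(dw, u_c)
print(f"Predicted fatigue life: {n_f:.3g} cycles")
```

With these illustrative numbers the predicted life falls above 10^5 cycles, the high-cycle regime the paper targets; in the paper itself the per-cycle energy comes from the statistically generated microplastic hysteresis loops rather than a constant.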

Copyright © 1968 by ASME