RESEARCH PAPERS

An Analysis of the Time and Temperature Dependence of the Upper Yield Point in Iron

Author and Article Information
P. E. Bennett, G. M. Sinclair

Department of Theoretical and Applied Mechanics, University of Illinois, Urbana, Ill.

J. Basic Eng. 83(4), 557-564 (Dec 01, 1961) (8 pages); doi:10.1115/1.3662267. History: Received December 13, 1960; Online November 04, 2011.

Abstract

The influence of temperature and strain rate on the upper yield point of ingot iron was studied. Torsion tests were conducted using strain rates of 12.5/sec, 0.25/sec, and 0.0001/sec over the temperature range 77 to 525 deg K. The upper yield point showed a rapid increase as the temperature was lowered. An increase in the strain rate also caused an increase in the yield point. An apparent activation energy can be associated with the strain rate and temperature dependence of the yield point. This energy is influenced by stress level, and it appears from the present study that the relationship can be described by an equation of the form

ΔH = ΔH₀ (τ_b − τ)/τ_b.
If this relationship is substituted for ΔH in a modification of the Boltzmann relation, the following result is obtained:
log(γ̇/γ̇₁) = [M ΔH₀/(R T₁)] [(τ_b − τ₁)/τ_b] [1 − (T₁/T)(τ_b − τ)/(τ_b − τ₁)].
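How the second expression follows from the first is not spelled out in the abstract. A minimal sketch, assuming the "modification of the Boltzmann relation" is the usual Arrhenius rate form γ̇ = A exp(−ΔH/RT) with the pre-exponential A common to both test conditions, that M = log₁₀ e, and that the subscript 1 marks a reference condition (all three readings are assumptions here, not statements from the paper):

    % Assumed rate equation, written in log10 form with M = \log_{10} e:
    \log\dot{\gamma} = \log A - \frac{M\,\Delta H}{R T}
    % Subtracting the reference condition (\dot{\gamma}_1, T_1, \tau_1) eliminates A:
    \log\frac{\dot{\gamma}}{\dot{\gamma}_1} = \frac{M}{R}\left(\frac{\Delta H_1}{T_1} - \frac{\Delta H}{T}\right)
    % Substituting \Delta H = \Delta H_0(\tau_b - \tau)/\tau_b at each condition and
    % factoring out the reference term recovers the displayed equation:
    \log\frac{\dot{\gamma}}{\dot{\gamma}_1} = \frac{M\,\Delta H_0}{R T_1}\,\frac{\tau_b - \tau_1}{\tau_b}\left[1 - \frac{T_1}{T}\,\frac{\tau_b - \tau}{\tau_b - \tau_1}\right]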
This equation describes the experimental data to within ±3000 psi. The results of this investigation, compared with tensile test data from other investigators, confirm that the state of stress is an important factor in determining whether a material will behave in a ductile or brittle fashion.
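Because τ enters the final equation only linearly, the relation inverts in closed form, which makes it straightforward to tabulate predicted upper yield stresses against temperature and strain rate. The Python sketch below illustrates this; the values chosen for ΔH₀, τ_b, and the reference condition are hypothetical placeholders, not the constants fitted in the paper.

    import math

    # All numeric parameters below are hypothetical, for illustration only;
    # they are NOT the constants fitted by Bennett and Sinclair.
    M = math.log10(math.e)   # converts the natural-log rate form to log10
    R = 1.987                # gas constant, cal/(mol K)
    DH0 = 20000.0            # assumed zero-stress activation energy, cal/mol
    TAU_B = 60000.0          # assumed limiting shear stress tau_b, psi

    def upper_yield_stress(gdot, T, gdot1, T1, tau1):
        """Solve the rate-temperature equation for the shear yield stress tau.

        log10(gdot/gdot1) = (M*DH0/(R*T1)) * ((TAU_B - tau1)/TAU_B)
                            * (1 - (T1/T) * (TAU_B - tau)/(TAU_B - tau1))
        is linear in tau, so the inversion is closed form.
        """
        L = math.log10(gdot / gdot1)
        C = (M * DH0 / (R * T1)) * ((TAU_B - tau1) / TAU_B)
        return TAU_B - (TAU_B - tau1) * (1.0 - L / C) * (T / T1)

    # Reference condition (hypothetical): slowest rate at room temperature.
    # A faster rate at lower temperature should predict a higher yield stress.
    tau = upper_yield_stress(gdot=12.5, T=200.0, gdot1=1.0e-4, T1=300.0, tau1=15000.0)
    print(f"predicted upper yield stress: {tau:.0f} psi")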

Copyright © 1961 by ASME