Experimental Determination of Precision Requirements for Back-Propagation Training of Artificial Neural Networks

Title: Experimental Determination of Precision Requirements for Back-Propagation Training of Artificial Neural Networks
Publication Type: Technical Report
Year of Publication: 1991
Authors: Asanović, K., & Morgan, N.
Other Numbers: 666
Abstract

The impact of reduced weight and output precision on the back-propagation training algorithm is experimentally determined for a feed-forward multi-layer perceptron. In contrast with previous such studies, the network is large with over 20,000 weights, and is trained with a large, real-world data set of over 130,000 patterns to perform a difficult task, that of phoneme classification for a continuous speech recognition system. The results indicate that 16b weight values are sufficient to achieve training and classification results comparable to 32b floating point, provided that weight and bias values are scaled separately, and that rounding rather than truncation is employed to reduce the precision of intermediary values. Output precision can be reduced to 8 bits without significant effects on performance.
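The rounding-versus-truncation distinction highlighted in the abstract can be illustrated with a minimal fixed-point quantization sketch. The bit widths mirror the report's 16-bit setting, but the fractional-bit split, function name, and parameters below are illustrative assumptions, not details taken from the report itself:

```python
import numpy as np

def quantize(x, bits=16, frac_bits=12, mode="round"):
    """Quantize x to a signed fixed-point format with `bits` total bits
    and `frac_bits` fractional bits.  The frac_bits split is an
    illustrative assumption, not a value from the report."""
    scale = 2 ** frac_bits
    lo, hi = -(2 ** (bits - 1)), 2 ** (bits - 1) - 1
    scaled = x * scale
    if mode == "round":
        q = np.round(scaled)   # round to nearest representable step
    else:
        q = np.trunc(scaled)   # truncate toward zero, discarding low bits
    return np.clip(q, lo, hi) / scale

w = np.array([0.000123, -0.4567, 1.2345])
print(quantize(w, mode="round"))
print(quantize(w, mode="trunc"))
```

Truncation always biases magnitudes toward zero, and over many weight updates that systematic bias accumulates; round-to-nearest keeps the quantization error zero-mean, which is consistent with the report's finding that rounding is needed for 16-bit training to match 32-bit floating point.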

URL: http://www.icsi.berkeley.edu/ftp/global/pub/techreports/1991/tr-91-036.pdf
Bibliographic Notes

ICSI Technical Report TR-91-036

Abbreviated Authors

K. Asanovic and N. Morgan

ICSI Publication Type

Technical Report