Publication Details
Title: Experimental Determination of Precision Requirements for Back-Propagation Training of Artificial Neural Networks
Authors: K. Asanovic and N. Morgan
Group: ICSI Technical Reports
Date: June 1991
PDF: ftp://ftp.icsi.berkeley.edu/pub/techreports/1991/tr-91-036.pdf
Overview:
The impact of reduced weight and output precision on the back-propagation training algorithm is experimentally determined for a feed-forward multi-layer perceptron. In contrast with previous such studies, the network is large, with over 20,000 weights, and is trained on a large, real-world data set of over 130,000 patterns to perform a difficult task: phoneme classification for a continuous speech recognition system. The results indicate that 16-bit weight values are sufficient to achieve training and classification results comparable to 32-bit floating point, provided that weight and bias values are scaled separately, and that rounding rather than truncation is employed to reduce the precision of intermediate values. Output precision can be reduced to 8 bits without significant effects on performance.
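As a rough illustration of the rounding-versus-truncation point in the overview, the sketch below quantizes weights and biases onto separate 16-bit fixed-point grids using NumPy. The specific Q-format splits, dynamic ranges, and the `quantize` helper are assumptions for illustration only, not details taken from the report.

```python
import numpy as np

def quantize(x, frac_bits, mode="round"):
    """Snap x onto a fixed-point grid with 2**-frac_bits spacing.

    mode="round" uses round-to-nearest; mode="trunc" truncates toward
    negative infinity. These are the two precision-reduction schemes
    the report compares.
    """
    scale = float(1 << frac_bits)
    y = x * scale
    y = np.round(y) if mode == "round" else np.floor(y)
    return y / scale

rng = np.random.default_rng(0)

# Hypothetical dynamic ranges: weights tend to stay small while biases
# can grow larger, which is why scaling the two groups separately helps.
weights = rng.normal(scale=0.25, size=20_000)
biases = rng.normal(scale=2.0, size=500)

# Illustrative 16-bit splits (sign + integer + fraction bits); the exact
# formats are an assumption, not taken from the report.
w16 = quantize(weights, frac_bits=14)  # e.g. Q1.14: range ~ [-2, 2)
b16 = quantize(biases, frac_bits=12)   # e.g. Q3.12: range ~ [-8, 8)

# Truncation injects a systematic negative bias of about 2**-(frac_bits+1)
# per operation, which accumulates over many training updates; rounding
# keeps the quantization error zero-mean.
err_round = (quantize(weights, 14, "round") - weights).mean()
err_trunc = (quantize(weights, 14, "trunc") - weights).mean()
print(f"mean error, rounding:   {err_round:+.2e}")
print(f"mean error, truncation: {err_trunc:+.2e}")
```

Running the sketch shows a near-zero mean error for rounding but a persistent negative mean error for truncation, which is consistent with the report's finding that rounding is needed for reduced-precision training to track 32-bit results.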
Bibliographic Information:
ICSI Technical Report TR-91-036
Bibliographic Reference:
K. Asanovic and N. Morgan. Experimental Determination of Precision Requirements for Back-Propagation Training of Artificial Neural Networks. ICSI Technical Report TR-91-036, June 1991.
