Second Order Backpropagation - Efficient Computation of the Hessian Matrix for Neural Networks

Publication Type: Technical Report
Year of Publication: 1993
Authors: Rojas, R.
Other Numbers: 845
Abstract

Traditional learning methods for neural networks use some kind of gradient descent to determine the network's weights for a given task. Some second order learning algorithms work with a quadratic approximation of the error function, determined from the calculation of the Hessian matrix, and achieve improved convergence rates in many cases. In this paper we introduce second order backpropagation, a method to efficiently calculate the Hessian of a linear network of one-dimensional functions. This technique can be used to obtain explicit symbolic expressions or numerical approximations of the Hessian, and could be used on parallel computers to improve second order learning algorithms for neural networks. It may also be of interest for computer algebra systems. [A newer version of this method appears in the book: Raul Rojas, Neural Networks, Springer-Verlag, 1996, available in English and German.]
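
The central object of the report is the matrix of second derivatives of the network error with respect to the weights. As a minimal numerical sketch (not the report's graph-based algorithm), the following Python/JAX snippet materializes this Hessian for a toy network with one sigmoid hidden layer; the network shape, weights, and training data are illustrative assumptions, not taken from the report.

import jax
import jax.numpy as jnp

# Illustrative sketch only: the report derives the Hessian through a
# graph-based second order backpropagation rule; here JAX's automatic
# differentiation stands in to materialize the same matrix for a toy
# network (3 inputs -> 2 sigmoid hidden units -> 1 linear output).

def error(w, x, t):
    w1 = w[:6].reshape(2, 3)          # input-to-hidden weights
    w2 = w[6:]                        # hidden-to-output weights
    h = jax.nn.sigmoid(w1 @ x)        # one-dimensional squashing functions
    y = w2 @ h                        # linear output unit
    return 0.5 * (y - t) ** 2         # squared error for one training pair

# Hypothetical weights and data, chosen only for the demonstration.
w = jnp.array([0.1, -0.2, 0.3, 0.05, 0.4, -0.1, 0.2, -0.3])
x = jnp.array([1.0, 0.5, -1.0])
t = 1.0

H = jax.hessian(error)(w, x, t)       # 8 x 8 matrix of second derivatives
print(H.shape)                        # (8, 8)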

URL: http://www.icsi.berkeley.edu/ftp/global/pub/techreports/1993/tr-93-057.pdf
Bibliographic Notes

ICSI Technical Report TR-93-057

Abbreviated Authors

R. Rojas

ICSI Publication Type

Technical Report