Overtraining, Regularization and Searching for Minimum in Neural Networks
1992 (English). In: 4th IFAC Symposium on Adaptive Systems in Control and Signal Processing, 1992, 669-674 p. Conference paper (Refereed)
Neural network models for dynamical systems have recently been the subject of considerable interest. They are often characterized by the fact that they use a fairly large number of parameters. Here we address the question of why this can be done without the usual penalty in terms of a large variance error. We show that regularization is a key explanation, and that terminating a gradient search ("backpropagation") before the true criterion minimum is reached is a way of achieving regularization. This, among other things, also explains the concept of "overtraining" in neural nets.
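The paper's central claim, that stopping the gradient search early acts as a regularizer, can be illustrated with a small experiment. Below is a minimal sketch (not the authors' code; the network size, data, and learning rate are illustrative assumptions) that trains an overparameterized one-hidden-layer network on noisy samples of a known function while tracking both the training criterion and a validation error.

```python
# Minimal sketch of early stopping as regularization (illustrative, not the
# authors' code). All sizes, rates, and data choices below are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples of a simple target; the network has many more parameters
# than there are training points, the situation the paper considers.
def target(x):
    return np.sin(2 * np.pi * x)

x_train = rng.uniform(-1, 1, (40, 1))
y_train = target(x_train) + 0.3 * rng.standard_normal((40, 1))
x_val = rng.uniform(-1, 1, (200, 1))
y_val = target(x_val)  # noise-free validation target

n_hidden = 50  # overparameterized: ~150 weights for 40 data points
W1 = 0.5 * rng.standard_normal((1, n_hidden))
b1 = np.zeros(n_hidden)
W2 = 0.5 * rng.standard_normal((n_hidden, 1))
b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

lr = 0.05
best_val, best_iter = np.inf, 0
for it in range(20001):
    h, y_hat = forward(x_train)
    err = y_hat - y_train
    # Backpropagation through the two-layer network.
    gW2 = h.T @ err / len(x_train)
    gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h**2)   # tanh derivative
    gW1 = x_train.T @ dh / len(x_train)
    gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

    if it % 500 == 0:
        train_mse = float((err**2).mean())
        val_mse = float(((forward(x_val)[1] - y_val) ** 2).mean())
        if val_mse < best_val:
            best_val, best_iter = val_mse, it
        print(f"iter {it:6d}  train {train_mse:.4f}  val {val_mse:.4f}")

# The training error keeps decreasing, while the validation error typically
# reaches a minimum and then rises again ("overtraining"); the iterate at
# best_iter behaves like a regularized estimate.
print(f"best validation error {best_val:.4f} at iteration {best_iter}")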
Place, publisher, year, edition, pages
1992. 669-674 p.
Keywords
Neural network models, Dynamical systems, Parameters, Variance error, Regularization
Identifiers
URN: urn:nbn:se:liu:diva-91677
ISBN: 978-0080425962
OAI: oai:DiVA.org:liu-91677
DiVA: diva2:618458
Conference
4th IFAC Symposium on Adaptive Systems in Control and Signal Processing, Grenoble, France, July 1992