Overtraining, Regularization, and Searching for Minimum with Application to Neural Networks
Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
1994 (English). Report (Other academic)
Abstract [en]

In this paper we discuss the role of criterion minimization as a means for parameter estimation. Most traditional methods, such as maximum likelihood and prediction error identification, are based on these principles. However, somewhat surprisingly, it turns out that it is not always "optimal" to try to find the absolute minimum point of the criterion. The reason is that "stopped minimization" (where the iterations have been terminated before the absolute minimum has been reached) has more or less the same properties as using regularization (adding a parametric penalty term). Regularization is known to have beneficial effects on the variance of the parameter estimates, and it reduces the "variance contribution" of the misfit. This also explains the concept of "overtraining" in neural nets. How does one then know when to terminate the iterations? A useful rule would be to stop the iterations when the criterion function applied to a validation data set no longer decreases. However, we show in this paper that applying this technique extensively may result in an unregularized estimate for the total data set: estimation + validation data.
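
The connection between early stopping and regularization described in the abstract can be illustrated with a small numerical experiment. The sketch below is not part of the report; all data, variable names, step sizes, and the penalty weight are illustrative assumptions. It fits a quadratic prediction-error criterion by gradient descent, terminating when the criterion on a validation set stops decreasing ("stopped minimization"), and compares the result with an explicitly regularized (ridge) estimate computed in closed form.

# Minimal sketch, assuming synthetic linear-regression data; not the report's method.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = X theta_true + noise
n, d = 200, 30
X = rng.standard_normal((n, d))
theta_true = rng.standard_normal(d)
y = X @ theta_true + 0.5 * rng.standard_normal(n)

# Split into estimation and validation sets
X_est, y_est = X[:120], y[:120]
X_val, y_val = X[120:], y[120:]

def criterion(theta, X, y):
    # Quadratic prediction-error criterion
    r = y - X @ theta
    return 0.5 * np.mean(r ** 2)

# Stopped minimization: gradient descent on the estimation criterion,
# terminated when the validation criterion no longer decreases.
theta = np.zeros(d)
step = 0.01                      # illustrative step size
best_val = np.inf
for k in range(10_000):
    grad = -(X_est.T @ (y_est - X_est @ theta)) / len(y_est)
    theta -= step * grad
    val = criterion(theta, X_val, y_val)
    if val >= best_val:          # validation loss stopped improving
        break
    best_val = val
theta_stopped = theta

# Explicit regularization: ridge estimate on the estimation data,
# minimizing the criterion plus (delta/2)*||theta||^2, in closed form.
delta = 0.1                      # illustrative penalty weight
theta_ridge = np.linalg.solve(
    X_est.T @ X_est / len(y_est) + delta * np.eye(d),
    X_est.T @ y_est / len(y_est),
)

print("validation loss, stopped estimate:", criterion(theta_stopped, X_val, y_val))
print("validation loss, ridge estimate:  ", criterion(theta_ridge, X_val, y_val))

In line with the abstract's caveat, if the stopping point is tuned repeatedly against the same validation set, the validation data effectively become part of the estimation data and the regularizing effect is lost.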

Place, publisher, year, edition, pages
Linköping: Linköping University, 1994. 18 p.
Series
LiTH-ISY-R, ISSN 1400-3902 ; 1567
Keyword [en]
Parameter estimation, Neural nets
Keyword [sv]
Cybernetik Informationsteori (Cybernetics, information theory)
National Category
Control Engineering
Identifiers
URN: urn:nbn:se:liu:diva-55162
ISRN: LiTH-ISY-R, 1567
OAI: oai:DiVA.org:liu-55162
DiVA: diva2:315818
Available from: 2010-04-29. Created: 2010-04-29. Last updated: 2014-10-07. Bibliographically approved.

Open Access in DiVA

fulltext (208 kB), 112 downloads
File name: FULLTEXT02.pdf
File size: 208 kB
Checksum (SHA-512): 9601926fbff041aef519a3535f99575bbef42bf0621ce1cc9850b7272631e605774619a026251296beff624f64f6e2699f953c641d1d159190e70b60314da948
Type: fulltext
Mimetype: application/pdf

fulltext (280 kB), 107 downloads
File name: FULLTEXT01.ps
File size: 280 kB
Checksum (SHA-512): ad1baacc8213c64ffaa8c076d894345e93faa490714ef80a5d261537609789e05dbcde35d5cb62c2c3ff3f34fc0be07469f1f45cd3ffd68b4c6c01e1ac2a7012
Type: fulltext
Mimetype: application/postscript

Authority records BETA

Ljung, Lennart
