A smoothed monotonic regression via L2 regularization
Sysoev, Oleg: Linköping University, Department of Computer and Information Science, The Division of Statistics and Machine Learning; Linköping University, Faculty of Arts and Sciences. ORCID iD: 0000-0002-3092-4162
Burdakov, Oleg: Linköping University, Department of Mathematics, Optimization; Linköping University, Faculty of Science & Engineering. ORCID iD: 0000-0003-1836-4200
2019 (English). In: Knowledge and Information Systems, ISSN 0219-1377, E-ISSN 0219-3116, Vol. 59, no 1, p. 197-218. Article in journal (Refereed). Published.
Abstract [en]

Monotonic regression is a standard method for extracting a monotone function from non-monotonic data, and it is used in many applications. A known drawback of this method, however, is that its fitted response is a piecewise constant function, while practical response functions are often required to be continuous. The method proposed in this paper achieves both monotonicity and smoothness of the regression by introducing an L2 regularization term. To achieve low computational complexity while providing high predictive power, we introduce a probabilistically motivated approach for selecting the regularization parameters. In addition, we present a technique for correcting inconsistencies on the boundary. We show that the complexity of the proposed method is O(n²). Our simulations demonstrate that when the data are large and the expected response is a complicated function (which is typical in machine learning applications), or when there is a change point in the response, the proposed method has a higher predictive power than many of the existing methods.
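
The smoothing idea described in the abstract can be illustrated with a small numerical sketch. The code below is not the paper's algorithm; it is a minimal toy version, assuming the underlying model min ||y - f||² + lam · Σᵢ (f[i+1] - f[i])² subject to f nondecreasing. Reparameterizing f through nonnegative increments turns this into a non-negative least-squares problem that scipy.optimize.nnls solves directly, at a higher cost than the paper's O(n²) method and without its probabilistic selection of regularization parameters or its boundary correction. The function name smoothed_monotone_fit and the parameter lam are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.optimize import nnls

def smoothed_monotone_fit(y, lam=1.0):
    """Fit a nondecreasing sequence f to y by solving
        min ||y - f||^2 + lam * sum_i (f[i+1] - f[i])^2
        s.t. f[0] <= f[1] <= ... <= f[n-1],
    via the reparameterization f[i] = (c_plus - c_minus) + d[1] + ... + d[i],
    d >= 0, which makes the problem a non-negative least-squares problem.
    """
    y = np.asarray(y, dtype=float)
    n = y.size
    # Design matrix: columns are [c_plus, c_minus, d_1, ..., d_{n-1}].
    A = np.zeros((n, n + 1))
    A[:, 0] = 1.0    # c_plus: positive part of the free base level
    A[:, 1] = -1.0   # c_minus: negative part of the free base level
    for k in range(1, n):
        A[k:, k + 1] = 1.0   # increment d_k raises f[k], f[k+1], ...
    # Ridge rows: sqrt(lam) * d_k, an L2 penalty on the jumps
    # f[k] - f[k-1], which spreads the increase out smoothly.
    P = np.zeros((n - 1, n + 1))
    P[:, 2:] = np.sqrt(lam) * np.eye(n - 1)
    M = np.vstack([A, P])
    b = np.concatenate([y, np.zeros(n - 1)])
    z, _ = nnls(M, b)        # enforces z >= 0, hence monotonicity
    return A @ z

# Demo on noisy, non-monotonic data.
rng = np.random.default_rng(0)
y = np.sin(2.0 * np.linspace(0.0, 1.0, 60)) + 0.15 * rng.standard_normal(60)
f = smoothed_monotone_fit(y, lam=5.0)
assert np.all(np.diff(f) >= -1e-9)   # nondecreasing up to rounding
```

Larger values of lam spread the total increase over more points, replacing the piecewise-constant steps of plain monotonic regression with a smoother profile; lam = 0 recovers an ordinary monotone least-squares fit.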

Place, publisher, year, edition, pages
Springer, 2019. Vol. 59, no 1, p. 197-218
Keywords [en]
Monotonic regression, Kernel smoothing, Penalized regression, Probabilistic learning, Constrained optimization
National Category
Probability Theory and Statistics; Computational Mathematics
Identifiers
URN: urn:nbn:se:liu:diva-147628
DOI: 10.1007/s10115-018-1201-2
ISI: 000461390300008
OAI: oai:DiVA.org:liu-147628
DiVA, id: diva2:1202548
Available from: 2018-04-27. Created: 2018-04-27. Last updated: 2019-04-03. Bibliographically approved.

Open Access in DiVA

fulltext (1211 kB), 100 downloads
File information
File name: FULLTEXT01.pdf
File size: 1211 kB
Checksum (SHA-512): 7ca847689e540996fab77d5a65a33e42535881dc1d2a97ad35bf8e2bb35750abd7a78b5c88e35eb9bf3a84078ace017ecdeec5c0d4625c887486c23fae2f155e
Type: fulltext
Mimetype: application/pdf

Other links

Publisher's full text

Authority records BETA

Sysoev, Oleg; Burdakov, Oleg
