New optimization algorithms for large-scale isotonic regression in L2-norm
Burdakov, Oleg — Linköping University, The Institute of Technology; Department of Mathematics, Optimization. ORCID iD: 0000-0003-1836-4200
Grimvall, Anders — Linköping University, Faculty of Arts and Sciences; Department of Computer and Information Science, Statistics
Sysoev, Oleg — Linköping University, Faculty of Arts and Sciences; Department of Computer and Information Science, Statistics
2007 (English). In: EUROPT-OMS Conference on Optimization, 2007, University of Hradec Kralove, Czech Republic: Guadeamus, 2007, p. 44-44. Conference paper, Published paper (Other academic)
Abstract [en]

The isotonic regression problem (IR) has numerous important applications in statistics, operations research, biology, image and signal processing, and other areas. IR in the L2-norm is a minimization problem in which the objective function is the squared Euclidean distance from a given point to a convex set defined by monotonicity constraints of the form: the i-th component of the decision vector is less than or equal to its j-th component. Unfortunately, conventional optimization methods are unable to solve IR problems originating from large data sets. The existing IR algorithms, such as the minimum lower sets algorithm by Brunk, the min-max algorithm by Lee, the network flow algorithm by Maxwell & Muckstadt, and the IBCR algorithm by Block et al., are able to find an exact solution to the IR problem for at most a few thousand variables. The IBCR algorithm, which has proved to be the most efficient of them, is not robust enough. An alternative approach is to solve the IR problem approximately. Following this approach, Burdakov et al. developed an algorithm, called GPAV, whose block refinement extension, GPAVR, is able to solve IR problems with very high accuracy in a far shorter time than the exact algorithms. Moreover, GPAVR is a very robust algorithm, and it allows us to solve IR problems with over a hundred thousand variables. In this talk, we introduce new exact IR algorithms, which can be viewed as active set methods. They use the approximate solution produced by the GPAVR algorithm as a starting point. We present results of our numerical experiments demonstrating the high efficiency of the new algorithms, especially for very large-scale problems, and their robustness. They are able to solve problems which all existing exact IR algorithms fail to solve.
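To illustrate the problem described above, the sketch below treats the simplest special case, in which the monotonicity constraints form a chain (x_1 <= x_2 <= ... <= x_n): minimize the squared Euclidean distance ||x - y||^2 from a given vector y to the set of non-decreasing vectors. It is a minimal Python implementation of the classical pool-adjacent-violators (PAV) scheme for this chain-ordered, unweighted case; it is not the GPAV, GPAVR, or the new active-set algorithms of the talk, and the function name isotonic_l2 is an illustrative assumption.

    def isotonic_l2(y):
        # Classical pool-adjacent-violators (PAV) sketch for L2 isotonic
        # regression under a chain order x_1 <= x_2 <= ... <= x_n
        # (illustrative only; not the GPAV/GPAVR or active-set methods of the talk).
        blocks = []  # each block is (sum of pooled values, number of pooled points)
        for v in y:
            total, count = float(v), 1
            # Pool with the preceding block while its mean violates monotonicity.
            while blocks and blocks[-1][0] / blocks[-1][1] > total / count:
                prev_total, prev_count = blocks.pop()
                total += prev_total
                count += prev_count
            blocks.append((total, count))
        # Expand block means back into the fitted vector.
        fit = []
        for total, count in blocks:
            fit.extend([total / count] * count)
        return fit

    # Example: isotonic_l2([3, 1, 2, 5, 4]) returns [2.0, 2.0, 2.0, 4.5, 4.5].

For a general partial order (arbitrary constraints x_i <= x_j), this simple pooling rule no longer suffices, which is where the GPAV/GPAVR approximation and the exact active-set methods described in the abstract come in.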

Place, publisher, year, edition, pages
University of Hradec Kralove, Czech Republic: Guadeamus, 2007. p. 44-44
Keywords [en]
quadratic programming, isotonic regression, large-scale optimization, active-set methods
National Category
Mathematics
Identifiers
URN: urn:nbn:se:liu:diva-40342
Local ID: 53033
OAI: oai:DiVA.org:liu-40342
DiVA id: diva2:261191
Available from: 2009-10-10. Created: 2009-10-10. Last updated: 2015-06-02.

Open Access in DiVA

No full text in DiVA

Authority records

Burdakov, Oleg; Grimvall, Anders; Sysoev, Oleg
