New optimization methods for isotonic regression in L1 norm
Linköping University, Faculty of Arts and Sciences. Linköping University, Department of Computer and Information Science, Statistics.
Linköping University, The Institute of Technology. Linköping University, Department of Mathematics, Optimization. ORCID iD: 0000-0003-1836-4200
Linköping University, Faculty of Arts and Sciences. Linköping University, Department of Computer and Information Science, Statistics.
2007 (English). In: EUROPT-OMS Conference on Optimization, 2007, University of Hradec Kralove, Czech Republic: Guadeamus, 2007, p. 133. Conference paper, Published paper (Other academic)
Abstract [en]

The isotonic regression problem (IR) has numerous important applications in statistics, operations research, biology, image and signal processing, and other areas. IR is a minimization problem whose objective function is the distance from a given point to a convex set defined by monotonicity constraints of the form: the i-th component of the decision vector is less than or equal to its j-th component. The distance in IR is usually measured in an Lp norm, of which the norms L1 and L2 are of the highest practical interest. Conventional optimization methods are unable to solve large-scale IR problems originating from large data sets. Historically, the major efforts were focused on the IR problem in the L2 norm. Exact algorithms were developed, such as the minimum lower sets algorithm by Brunk, the min-max algorithm by Lee, the network flow algorithm by Maxwell & Muchstadt, and the IBCR algorithm by Block et al. Among them, the IBCR algorithm has proved to be the most numerically efficient, but it is not robust enough. An alternative approach is to solve the IR problem approximately. Following this approach, Burdakov et al. developed the GPAV algorithm, whose block refinement extension, GPAVR, is able to solve the IR problem with high accuracy in a far shorter time than the exact algorithms. Apart from this, GPAVR is a very robust algorithm. Unfortunately, for the L1 norm there are no algorithms as efficient as those for the L2 norm. In our talk, we introduce new algorithms, GPAVR1 and IBCR1, which extend the algorithms GPAV and IBCR to the L1 norm. We also present results of numerical experiments that demonstrate the high efficiency of the new algorithms, especially for very large-scale problems.
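The problem described in the abstract can be stated compactly as follows (a sketch based only on the abstract; the exact order relation and any observation weights used by the authors are not specified here):

\min_{x \in \mathbb{R}^n} \; \sum_{i=1}^{n} \lvert x_i - y_i \rvert^p
\quad \text{subject to} \quad x_i \le x_j \quad \text{for all } (i, j) \in E,

where y is the given data vector, E is the set of index pairs encoding the monotonicity constraints, p = 1 gives the L1 problem addressed by GPAVR1 and IBCR1, and p = 2 gives the classical L2 problem.

For small instances, the L1 case on a simple chain order x_1 <= x_2 <= ... <= x_n can be solved directly as a linear program. The Python sketch below is purely illustrative: it is not the GPAVR1 or IBCR1 algorithm, it does not scale to the large problems discussed in the abstract, and the function name and the chain order are assumptions made for the example.

# Illustrative only: L1 isotonic regression on a chain order, posed as a
# linear program with auxiliary variables t_i >= |x_i - y_i|.
# Not the authors' GPAVR1/IBCR1 algorithms; unsuitable for large-scale data.
import numpy as np
from scipy.optimize import linprog

def l1_isotonic_chain(y):
    n = len(y)
    c = np.concatenate([np.zeros(n), np.ones(n)])  # minimize sum of t_i

    rows, rhs = [], []
    # Linearize |x_i - y_i|:  x_i - t_i <= y_i  and  -x_i - t_i <= -y_i
    for i in range(n):
        r = np.zeros(2 * n); r[i] = 1.0; r[n + i] = -1.0
        rows.append(r); rhs.append(y[i])
        r = np.zeros(2 * n); r[i] = -1.0; r[n + i] = -1.0
        rows.append(r); rhs.append(-y[i])
    # Monotonicity on the chain:  x_i - x_{i+1} <= 0
    for i in range(n - 1):
        r = np.zeros(2 * n); r[i] = 1.0; r[i + 1] = -1.0
        rows.append(r); rhs.append(0.0)

    bounds = [(None, None)] * n + [(0, None)] * n  # x free, t >= 0
    res = linprog(c, A_ub=np.vstack(rows), b_ub=np.array(rhs),
                  bounds=bounds, method="highs")
    return res.x[:n]

print(l1_isotonic_chain([3.0, 1.0, 2.0, 5.0, 4.0]))

For this data the minimum total absolute deviation is 3; the solver returns one of several optimal non-decreasing fits.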

Place, publisher, year, edition, pages
University of Hradec Kralove, Czech Republic: Guadeamus, 2007. p. 133
Keyword [en]
quadratic programming, linear programming, isotonic regression, large-scale optimization
National Category
Computer Science
Identifiers
URN: urn:nbn:se:liu:diva-40343
Local ID: 53034
OAI: oai:DiVA.org:liu-40343
DiVA, id: diva2:261192
Available from: 2009-10-10 Created: 2009-10-10 Last updated: 2015-06-02

Open Access in DiVA

No full text

Authority records

Sysoev, Oleg; Burdakov, Oleg; Grimvall, Anders
