New optimization algorithms for large-scale isotonic regression in L2-norm
Burdakov, Oleg: Linköping University, The Institute of Technology; Linköping University, Department of Mathematics, Optimization. ORCID iD: 0000-0003-1836-4200
Grimvall, Anders: Linköping University, Faculty of Arts and Sciences; Linköping University, Department of Computer and Information Science, Statistics.
Sysoev, Oleg: Linköping University, Faculty of Arts and Sciences; Linköping University, Department of Computer and Information Science, Statistics.
2007 (English). In: EUROPT-OMS Conference on Optimization, 2007, University of Hradec Kralove, Czech Republic: Guadeamus, 2007, p. 44. Conference paper, Published paper (Other academic)
Abstract [en]

The isotonic regression (IR) problem has numerous important applications in statistics, operations research, biology, image and signal processing, and other areas. IR in the L2-norm is a minimization problem in which the objective function is the squared Euclidean distance from a given point to a convex set defined by monotonicity constraints of the form: the i-th component of the decision vector is less than or equal to its j-th component. Unfortunately, conventional optimization methods are unable to solve IR problems originating from large data sets. The existing IR algorithms, such as the minimum lower sets algorithm by Brunk, the min-max algorithm by Lee, the network flow algorithm by Maxwell & Muchstadt, and the IBCR algorithm by Block et al., are able to find an exact solution to the IR problem for at most a few thousand variables. The IBCR algorithm, which proved to be the most efficient of them, is not robust enough. An alternative approach is to solve the IR problem approximately. Following this approach, Burdakov et al. developed an algorithm, called GPAV, whose block refinement extension, GPAVR, is able to solve IR problems with very high accuracy in far shorter time than the exact algorithms. Moreover, GPAVR is a very robust algorithm that allows us to solve IR problems with over a hundred thousand variables. In this talk, we introduce new exact IR algorithms, which can be viewed as active-set methods. They use the approximate solution produced by the GPAVR algorithm as a starting point. We present results of our numerical experiments demonstrating the high efficiency of the new algorithms, especially for very large-scale problems, and their robustness. They are able to solve problems which all existing exact IR algorithms fail to solve.
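For readers less familiar with the problem class, the L2-norm IR problem can be written as: minimize ||x - y||^2 subject to x_i <= x_j for every pair (i, j) in the given monotonicity order. As an illustrative sketch only (the Python function below and its name are ours, not the authors'), the code solves the simplest special case, a complete chain x_1 <= x_2 <= ... <= x_n, with the classical pool-adjacent-violators (PAV) algorithm; it is not the GPAVR algorithm or the new active-set methods presented in the talk, which handle general partial orders and far larger problems.

def isotonic_l2_chain(y):
    # Minimize sum_i (x[i] - y[i])**2 subject to x[0] <= x[1] <= ... <= x[-1]
    # with the pool-adjacent-violators (PAV) algorithm. Each block is stored
    # as [sum_of_values, number_of_points]; its fitted value is the block mean.
    blocks = []
    for v in y:
        blocks.append([float(v), 1])
        # Pool backwards while adjacent block means violate monotonicity.
        while len(blocks) > 1 and blocks[-2][0] / blocks[-2][1] > blocks[-1][0] / blocks[-1][1]:
            s, c = blocks.pop()
            blocks[-1][0] += s
            blocks[-1][1] += c
    # Expand block means back to a full-length solution vector.
    x = []
    for s, c in blocks:
        x.extend([s / c] * c)
    return x

For example, isotonic_l2_chain([3, 1, 2, 5, 4]) returns [2.0, 2.0, 2.0, 4.5, 4.5]: the violating pair (3, 1) is pooled to its mean 2, and (5, 4) to 4.5.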

Place, publisher, year, edition, pages
University of Hradec Kralove, Czech Republic: Guadeamus, 2007. p. 44
Keywords [en]
quadratic programming, isotonic regression, large-scale optimization, active-set methods
Identifiers
URN: urn:nbn:se:liu:diva-40342; Local ID: 53033; OAI: oai:DiVA.org:liu-40342; DiVA id: diva2:261191
Available from: 2009-10-10 Created: 2009-10-10 Last updated: 2015-06-02

Open Access in DiVA

No full text available in DiVA
