New optimization algorithms for large-scale isotonic regression in L2-norm
Burdakov, Oleg
Linköpings universitet, Tekniska högskolan. Linköpings universitet, Matematiska institutionen, Optimeringslära. ORCID iD: 0000-0003-1836-4200
Grimvall, Anders
Linköpings universitet, Filosofiska fakulteten. Linköpings universitet, Institutionen för datavetenskap, Statistik.
Sysoev, Oleg
Linköpings universitet, Filosofiska fakulteten. Linköpings universitet, Institutionen för datavetenskap, Statistik.
2007 (English) In: EUROPT-OMS Conference on Optimization, 2007, University of Hradec Kralove, Czech Republic: Gaudeamus, 2007, pp. 44-44. Conference paper, Published paper (Other academic)
Abstract [en]

The isotonic regression (IR) problem has numerous important applications in statistics, operations research, biology, image and signal processing, and other areas. IR in the L2-norm is a minimization problem in which the objective function is the squared Euclidean distance from a given point to a convex set defined by monotonicity constraints of the form: the i-th component of the decision vector is less than or equal to its j-th component. Unfortunately, conventional optimization methods are unable to solve IR problems originating from large data sets. The existing IR algorithms, such as the minimum lower sets algorithm by Brunk, the min-max algorithm by Lee, the network flow algorithm by Maxwell and Muckstadt, and the IBCR algorithm by Block et al., are able to find an exact solution to the IR problem for at most a few thousand variables. The IBCR algorithm, which proved to be the most efficient of them, is not robust enough. An alternative approach is to solve the IR problem approximately. Following this approach, Burdakov et al. developed an algorithm, called GPAV, whose block refinement extension, GPAVR, is able to solve IR problems with very high accuracy in a far shorter time than the exact algorithms. Moreover, GPAVR is a very robust algorithm, and it allows us to solve IR problems with over a hundred thousand variables. In this talk, we introduce new exact IR algorithms, which can be viewed as active-set methods. They use the approximate solution produced by the GPAVR algorithm as a starting point. We present results of our numerical experiments demonstrating the high efficiency of the new algorithms, especially for very large-scale problems, and their robustness. They are able to solve problems that all existing exact IR algorithms fail to solve.
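For illustration, the L2-norm IR problem described in the abstract can be stated as the quadratic program below; the symbols a (the given data vector) and E (the set of index pairs encoding the monotonicity constraints) are notation introduced here, not taken from the paper:

\min_{x \in \mathbb{R}^n} \; \| x - a \|_2^2 \quad \text{subject to} \quad x_i \le x_j \quad \text{for all } (i, j) \in E.

As a minimal sketch of the problem itself (not of GPAV, GPAVR, or the new exact active-set algorithms presented in the talk), the following Python function solves the special case in which E is a chain, i.e. x_1 <= x_2 <= ... <= x_n, using the classical pool-adjacent-violators scheme:

def isotonic_regression_chain(a):
    """Least-squares fit of a non-decreasing vector to the data a
    (unweighted L2-norm, total-order constraints only)."""
    # Each block stores [sum of pooled values, number of pooled points];
    # adjacent blocks are merged whenever their means violate the ordering.
    blocks = []
    for v in a:
        blocks.append([float(v), 1])
        while len(blocks) > 1 and \
                blocks[-2][0] / blocks[-2][1] > blocks[-1][0] / blocks[-1][1]:
            total, count = blocks.pop()
            blocks[-1][0] += total
            blocks[-1][1] += count
    # Expand the block means back into a full solution vector.
    x = []
    for total, count in blocks:
        x.extend([total / count] * count)
    return x

# Example: the violating pair (5, 3) is pooled to its mean 4.
print(isotonic_regression_chain([1, 5, 3, 7]))  # [1.0, 4.0, 4.0, 7.0]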

Place, publisher, year, edition, pages
University of Hradec Kralove, Czech Republic: Gaudeamus, 2007. pp. 44-44
Keywords [en]
quadratic programming, isotonic regression, large-scale optimization, active-set methods
National subject category
Mathematics
Identifiers
URN: urn:nbn:se:liu:diva-40342
Local ID: 53033
OAI: oai:DiVA.org:liu-40342
DiVA, id: diva2:261191
Available from: 2009-10-10 Created: 2009-10-10 Last updated: 2015-06-02

Open Access in DiVA

No full text available in DiVA
