liu.se: Search publications in DiVA
1 - 24 of 24
  • 1.
    Sysoev, Oleg
    et al.
    Linköpings universitet, Institutionen för datavetenskap, Statistik och maskininlärning. Linköpings universitet, Filosofiska fakulteten.
    Burdakov, Oleg
    Linköpings universitet, Matematiska institutionen, Optimeringslära. Linköpings universitet, Tekniska fakulteten.
    A smoothed monotonic regression via L2 regularization (2019). In: Knowledge and Information Systems, ISSN 0219-1377, E-ISSN 0219-3116, Vol. 59, no. 1, pp. 197-218. Article in journal (Refereed)
    Abstract [en]

    Monotonic regression is a standard method for extracting a monotone function from non-monotonic data, and it is used in many applications. However, a known drawback of this method is that its fitted response is a piecewise constant function, while practical response functions are often required to be continuous. The method proposed in this paper achieves monotonicity and smoothness of the regression by introducing an L2 regularization term. To achieve low computational complexity and, at the same time, high predictive power, we introduce a probabilistically motivated approach for selecting the regularization parameters. In addition, we present a technique for correcting inconsistencies on the boundary. We show that the complexity of the proposed method is O(n²). Our simulations demonstrate that when the data are large and the expected response is a complicated function (which is typical in machine learning applications), or when there is a change point in the response, the proposed method has higher predictive power than many of the existing methods.
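
    The smoothed regression described above is a convex quadratic program, so it is easy to prototype with a generic solver. Below is a minimal sketch, assuming unit-spaced observations and a single hand-picked regularization parameter lam; it uses the cvxpy library rather than the paper's O(n²) algorithm, and omits the probabilistic parameter selection and boundary correction:

    ```python
    import numpy as np
    import cvxpy as cp

    def smoothed_monotonic_regression(y, lam=1.0):
        """Fit a smooth, non-decreasing response to y by solving
        min ||x - y||^2 + lam * sum((x[i+1] - x[i])^2)  s.t.  x[i+1] >= x[i]."""
        x = cp.Variable(len(y))
        objective = cp.sum_squares(x - y) + lam * cp.sum_squares(cp.diff(x))
        cp.Problem(cp.Minimize(objective), [cp.diff(x) >= 0]).solve()
        return x.value

    # Noisy monotone data: unlike plain isotonic regression, the fit is
    # monotone *and* smooth rather than piecewise constant.
    rng = np.random.default_rng(0)
    y = np.linspace(0, 1, 200) ** 2 + 0.1 * rng.standard_normal(200)
    fit = smoothed_monotonic_regression(y, lam=5.0)
    ```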

  • 2.
    Svahn, Caroline
    et al.
    Linköpings universitet, Institutionen för datavetenskap, Statistik och maskininlärning. Linköpings universitet, Filosofiska fakulteten. Ericsson AB, Sweden.
    Sysoev, Oleg
    Linköpings universitet, Institutionen för datavetenskap, Statistik och maskininlärning. Linköpings universitet, Filosofiska fakulteten.
    Cirkic, Mirsad
    Ericsson AB, Sweden.
    Gustafsson, Fredrik
    Ericsson AB, Sweden.
    Berglund, Joel
    Ericsson AB, Sweden.
    Inter-Frequency Radio Signal Quality Prediction for Handover, Evaluated in 3GPP LTE (2019). Conference paper (Refereed)
    Abstract [en]

    Radio resource management in cellular networks is typically based on device measurements reported to the serving base station. Frequent measuring of signal quality on available frequencies would allow for highly reliable networks and optimal connection at

  • 3.
    Sysoev, Oleg
    et al.
    Linköpings universitet, Institutionen för datavetenskap, Statistik och maskininlärning. Linköpings universitet, Filosofiska fakulteten.
    Bartoszek, Krzysztof
    Linköpings universitet, Institutionen för datavetenskap, Statistik och maskininlärning. Linköpings universitet, Filosofiska fakulteten.
    Ekström, Eva-Charlotte
    Uppsala University, Akademiska Sjukhuset, Uppsala, Sweden.
    Ekström Selling, Katarina
    Uppsala University, Akademiska Sjukhuset, Uppsala, Sweden.
    PSICA: Decision trees for probabilistic subgroup identification with categorical treatments (2019). In: Statistics in Medicine, ISSN 0277-6715, E-ISSN 1097-0258, Vol. 38, no. 22, pp. 4436-4452. Article in journal (Refereed)
    Abstract [en]

    Personalized medicine aims at identifying best treatments for a patient with given characteristics. It has been shown in the literature that these methods can lead to great improvements in medicine compared to traditional methods prescribing the same treatment to all patients. Subgroup identification is a branch of personalized medicine, which aims at finding subgroups of the patients with similar characteristics for which some of the investigated treatments have a better effect than the other treatments. A number of approaches based on decision trees have been proposed to identify such subgroups, but most of them focus on two‐arm trials (control/treatment) while a few methods consider quantitative treatments (defined by the dose). However, no subgroup identification method exists that can predict the best treatments in a scenario with a categorical set of treatments. We propose a novel method for subgroup identification in categorical treatment scenarios. This method outputs a decision tree showing the probabilities of a given treatment being the best for a given group of patients as well as labels showing the possible best treatments. The method is implemented in an R package psica available on CRAN. In addition to a simulation study, we present an analysis of a community‐based nutrition intervention trial that justifies the validity of our method.
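
    The method's headline output, the probability that each treatment is best within a subgroup, can be illustrated with a toy bootstrap over within-subgroup treatment means. This Python sketch shows only that ingredient, not the psica package's tree-growing procedure; the function name and interface are illustrative:

    ```python
    import numpy as np

    def prob_best_treatment(outcome, treatment, n_boot=2000, seed=0):
        """Bootstrap estimate of P(treatment t has the highest mean outcome)
        among the patients of one subgroup (e.g. one leaf of a tree)."""
        rng = np.random.default_rng(seed)
        outcome, treatment = np.asarray(outcome), np.asarray(treatment)
        labels = list(np.unique(treatment))
        wins = {t: 0 for t in labels}
        n = len(outcome)
        for _ in range(n_boot):
            idx = rng.integers(0, n, n)                  # resample patients
            o, t = outcome[idx], treatment[idx]
            means = {lab: o[t == lab].mean() for lab in labels if np.any(t == lab)}
            wins[max(means, key=means.get)] += 1
        return {t: wins[t] / n_boot for t in labels}
    ```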

  • 4.
    Svefors, Pernilla
    et al.
    Uppsala Universitet, Uppsala, Sweden; Center for Epidemiology and Community Medicine, Stockholm, Sweden.
    Sysoev, Oleg
    Linköpings universitet, Institutionen för datavetenskap, Statistik och maskininlärning. Linköpings universitet, Filosofiska fakulteten.
    Ekstrom, Eva-Charlotte
    Uppsala Universitet, Uppsala, Sweden.
    Persson, Lars Ake
    London School of Hygiene and Tropical Medicine, London, UK.
    Arifeen, Shams E
    International Centre for Diarrhoeal Disease Research, Dhaka, Bangladesh.
    Naved, Ruchira T
    International Centre for Diarrhoeal Disease Research, Dhaka, Bangladesh.
    Rahman, Anisur
    International Centre for Diarrhoeal Disease Research, Dhaka, Bangladesh.
    Khan, Ashraful Islam
    International Centre for Diarrhoeal Disease Research, Dhaka, Bangladesh.
    Selling, Katarina
    Uppsala Universitet, Uppsala, Sweden.
    Relative importance of prenatal and postnatal determinants of stunting: data mining approaches to the MINIMat cohort, Bangladesh (2019). In: BMJ Open, ISSN 2044-6055, E-ISSN 2044-6055, Vol. 9, no. 8. Article in journal (Refereed)
    Abstract [en]

    Introduction WHO has set a goal to reduce the prevalence of stunted child growth by 40% by the year 2025. To reach this goal, it is imperative to establish the relative importance of risk factors for stunting to deliver appropriate interventions. Currently, most interventions take place in late infancy and early childhood. This study aimed to identify the most critical prenatal and postnatal determinants of linear growth 0–24 months and the risk factors for stunting at 2 years, and to identify subgroups with different growth trajectories and levels of stunting at 2 years.

    Methods Conditional inference tree-based methods were applied to the extensive Maternal and Infant Nutrition Interventions in Matlab trial database with 309 variables of 2723 children, their parents and living conditions, including socioeconomic, nutritional and other biological characteristics of the parents; maternal exposure to violence; household food security; breast and complementary feeding; and measurements of morbidity of the mothers during pregnancy and repeatedly of their children up to 24 months of age. Child anthropometry was measured monthly from birth to 12 months, thereafter quarterly to 24 months.

    Results Birth length and weight were the most critical factors for linear growth 0–24 months and stunting at 2 years, followed by maternal anthropometry and parental education. Conditions after birth, such as feeding practices and morbidity, were less strongly associated with linear growth trajectories and stunting at 2 years.

    Conclusion The results of this study emphasise the benefit of interventions before conception and during pregnancy to reach a substantial reduction in stunting.

  • 5.
    Burdakov, Oleg
    et al.
    Linköpings universitet, Matematiska institutionen, Optimeringslära. Linköpings universitet, Tekniska fakulteten.
    Sysoev, Oleg
    Linköpings universitet, Institutionen för datavetenskap, Statistik. Linköpings universitet, Filosofiska fakulteten.
    A Dual Active-Set Algorithm for Regularized Monotonic Regression (2017). In: Journal of Optimization Theory and Applications, ISSN 0022-3239, E-ISSN 1573-2878, Vol. 172, no. 3, pp. 929-949. Article in journal (Refereed)
    Abstract [en]

    Monotonic (isotonic) regression is a powerful tool used for solving a wide range of important applied problems. One of its features, which poses a limitation on its use in some areas, is that it produces a piecewise constant fitted response. For smoothing the fitted response, we introduce a regularization term in the monotonic regression, formulated as a least distance problem with monotonicity constraints. The resulting smoothed monotonic regression is a convex quadratic optimization problem. We focus on the case where the set of observations is completely (linearly) ordered. Our smoothed pool-adjacent-violators algorithm is designed for solving the regularized problem. It belongs to the class of dual active-set algorithms. We prove that it converges to the optimal solution in a finite number of iterations that does not exceed the problem size. One of its advantages is that the active set is progressively enlarged by including one or, typically, more constraints per iteration. This resulted in solving large-scale test problems in a few iterations, whereas the size of those problems was prohibitively large for conventional quadratic optimization solvers. Although the complexity of our algorithm grows quadratically with the problem size, we found its running time to grow almost linearly in our computational experiments.
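
    The algorithm above generalizes the classic pool-adjacent-violators scheme. For reference, a plain unregularized PAV for completely ordered data, which produces exactly the piecewise constant fit the paper sets out to smooth, can be sketched as follows:

    ```python
    def pav(y):
        """Pool-Adjacent-Violators: isotonic (non-decreasing) least-squares fit
        to a completely ordered sequence y. Returns the fitted values."""
        # Each block holds (sum, count); violating adjacent blocks are merged.
        blocks = []
        for v in y:
            blocks.append([v, 1])
            # Merge while the last block's mean is below the previous one's.
            while len(blocks) > 1 and \
                    blocks[-2][0] / blocks[-2][1] > blocks[-1][0] / blocks[-1][1]:
                total, count = blocks.pop()
                blocks[-1][0] += total
                blocks[-1][1] += count
        fit = []
        for total, count in blocks:
            fit.extend([total / count] * count)
        return fit

    print(pav([1, 3, 2, 2, 5]))  # [1.0, 2.33..., 2.33..., 2.33..., 5.0]
    ```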

  • 6.
    Burdakov, Oleg
    et al.
    Linköpings universitet, Matematiska institutionen, Optimeringslära. Linköpings universitet, Tekniska fakulteten.
    Sysoev, Oleg
    Linköpings universitet, Institutionen för datavetenskap, Statistik och maskininlärning. Linköpings universitet, Filosofiska fakulteten.
    A Dual Active-Set Algorithm for Regularized Slope-Constrained Monotonic Regression (2017). In: Iranian Journal of Operations Research, ISSN 2008-1189, Vol. 8, no. 2, pp. 40-47. Article in journal (Refereed)
    Abstract [en]

    In many problems, it is necessary to take monotonic relations into account. Monotonic (isotonic) Regression (MR) is often involved in solving such problems. The MR solutions are of a step-shaped form with a typical sharp change of values between adjacent steps. This, in some applications, is regarded as a disadvantage. We recently introduced a Smoothed MR (SMR) problem, which is obtained from the MR by adding a regularization penalty term. The SMR is aimed at smoothing the aforementioned sharp change. Moreover, its solution has a far less pronounced step-structure, if any. The purpose of this paper is to further improve the SMR solution by getting rid of such a structure. This is achieved by introducing a lower bound on the slope in the SMR. We call the resulting problem the Smoothed Slope-Constrained MR (SSCMR) problem. It is shown here how to reduce it to the SMR, which is a convex quadratic optimization problem. The Smoothed Pool-Adjacent-Violators (SPAV) algorithm developed in our recent publications for solving the SMR problem is adapted here to solving the SSCMR problem. This algorithm belongs to the class of dual active-set algorithms. Although the complexity of the SPAV algorithm is O(n²), its running time grew almost linearly with n in our computational experiments. We present numerical results which illustrate the predictive performance of our approach. They also show that the SSCMR solution is free of the undesirable features of the MR and SMR solutions.
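
    The reduction from SSCMR to SMR can be seen with a change of variables. A sketch under the assumption of observations at ordered positions t_1 < ... < t_n and a lower slope bound c (the paper's exact construction may differ):

    ```latex
    % Shifting by c t_i turns slope constraints into plain monotonicity:
    \[
      \frac{x_{i+1} - x_i}{t_{i+1} - t_i} \ \ge\ c
      \quad\Longleftrightarrow\quad
      z_{i+1} \ \ge\ z_i ,
      \qquad z_i = x_i - c\,t_i .
    \]
    % So one can solve the SMR in z with shifted data y_i - c t_i and
    % recover x_i = z_i + c t_i.
    ```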

  • 7.
    Sysoev, Oleg
    et al.
    Linköpings universitet, Institutionen för datavetenskap, Statistik. Linköpings universitet, Filosofiska fakulteten.
    Burdakov, Oleg
    Linköpings universitet, Matematiska institutionen, Optimeringslära. Linköpings universitet, Tekniska fakulteten.
    A Smoothed Monotonic Regression via L2 Regularization (2016). Report (Other academic)
    Abstract [en]

    Monotonic Regression (MR) is a standard method for extracting a monotone function from non-monotonic data, and it is used in many applications. However, a known drawback of this method is that its fitted response is a piecewise constant function, while practical response functions are often required to be continuous. The method proposed in this paper achieves monotonicity and smoothness of the regression by introducing an L2 regularization term, and it is shown that the complexity of this method is O(n²). In addition, our simulations demonstrate that the proposed method normally has higher predictive power than some commonly used alternative methods, such as monotonic kernel smoothers. In contrast to these methods, our approach is probabilistically motivated and has connections to Bayesian modeling.

  • 8.
    Kalish, Michael L.
    et al.
    Syracuse University, USA.
    Dunn, John C.
    University of Adelaide, Australia.
    Burdakov, Oleg P.
    Linköpings universitet, Matematiska institutionen, Optimeringslära. Linköpings universitet, Tekniska fakulteten.
    Sysoev, Oleg
    Linköpings universitet, Institutionen för datavetenskap, Statistik. Linköpings universitet, Filosofiska fakulteten.
    A statistical test of the equality of latent orders (2016). In: Journal of mathematical psychology (Print), ISSN 0022-2496, E-ISSN 1096-0880, Vol. 70, pp. 1-11, article id YJMPS2051. Article in journal (Refereed)
    Abstract [en]

    It is sometimes the case that a theory proposes that the population means on two variables should have the same rank order across a set of experimental conditions. This paper presents a test of this hypothesis. The test statistic is based on the coupled monotonic regression algorithm developed by the authors. The significance of the test statistic is determined by comparison to an empirical distribution specific to each case, obtained via non-parametric or semi-parametric bootstrap. We present an analysis of the power and Type I error control of the test based on numerical simulation. Partial order constraints placed on the variables may sometimes be theoretically justified. These constraints are easily incorporated into the computation of the test statistic and are shown to have substantial effects on power. The test can be applied to any form of data, as long as an appropriate statistical model can be specified.
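
    The testing recipe above, a statistic referred to its empirical bootstrap distribution, has a generic skeleton. In this sketch, statistic stands in for the authors' coupled-monotonic-regression statistic and resample_null for a resampling scheme consistent with the null hypothesis of equal latent orders; both are hypothetical placeholders:

    ```python
    import numpy as np

    def bootstrap_p_value(data, statistic, resample_null, n_boot=5000, seed=0):
        """Generic non-parametric bootstrap test: refer the observed statistic
        to its empirical distribution under resamples drawn from the null."""
        rng = np.random.default_rng(seed)
        observed = statistic(data)
        null_stats = np.array([statistic(resample_null(data, rng))
                               for _ in range(n_boot)])
        # Proportion of null statistics at least as extreme as the observed one.
        return (1 + np.sum(null_stats >= observed)) / (1 + n_boot)
    ```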

  • 9.
    Sysoev, Oleg
    et al.
    Linköpings universitet, Institutionen för datavetenskap, Statistik. Linköpings universitet, Tekniska fakulteten.
    Grimvall, Anders
    Linköpings universitet, Institutionen för datavetenskap, Statistik. Linköpings universitet, Tekniska fakulteten.
    Burdakov, Oleg
    Linköpings universitet, Matematiska institutionen, Optimeringslära. Linköpings universitet, Tekniska fakulteten.
    Bootstrap confidence intervals for large-scale multivariate monotonic regression problems (2016). In: Communications in statistics. Simulation and computation, ISSN 0361-0918, E-ISSN 1532-4141, Vol. 45, no. 3, pp. 1025-1040. Article in journal (Refereed)
    Abstract [en]

    Recently, the methods used to estimate monotonic regression (MR) models have been substantially improved, and some algorithms can now produce high-accuracy monotonic fits to multivariate datasets containing over a million observations. Nevertheless, the computational burden can be prohibitively large for resampling techniques in which numerous datasets are processed independently of each other. Here, we present efficient algorithms for estimation of confidence limits in large-scale settings that take into account the similarity of the bootstrap or jackknifed datasets to which MR models are fitted. In addition, we introduce modifications that substantially improve the accuracy of MR solutions for binary response variables. The performance of our algorithms is illustrated using data on death in coronary heart disease for a large population. This example also illustrates that MR can be a valuable complement to logistic regression.
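
    A baseline version of such intervals, without the paper's shared-computation speedups for large-scale settings, is a pointwise percentile bootstrap around an off-the-shelf one-dimensional isotonic fit, e.g. with scikit-learn:

    ```python
    import numpy as np
    from sklearn.isotonic import IsotonicRegression

    def bootstrap_ci(x, y, x_eval, n_boot=500, alpha=0.05, seed=0):
        """Pointwise percentile-bootstrap confidence limits for a monotonic fit."""
        rng = np.random.default_rng(seed)
        x, y, x_eval = map(np.asarray, (x, y, x_eval))
        preds = np.empty((n_boot, len(x_eval)))
        for b in range(n_boot):
            idx = rng.integers(0, len(x), len(x))       # resample (x, y) pairs
            iso = IsotonicRegression(out_of_bounds="clip").fit(x[idx], y[idx])
            preds[b] = iso.predict(x_eval)
        return (np.quantile(preds, alpha / 2, axis=0),
                np.quantile(preds, 1 - alpha / 2, axis=0))
    ```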

  • 10.
    Burdakov, Oleg
    et al.
    Linköpings universitet, Matematiska institutionen, Optimeringslära. Linköpings universitet, Tekniska fakulteten.
    Sysoev, Oleg
    Linköpings universitet, Institutionen för datavetenskap, Statistik. Linköpings universitet, Filosofiska fakulteten.
    Regularized monotonic regression (2016). Report (Other academic)
    Abstract [en]

    Monotonic (isotonic) Regression (MR) is a powerful tool used for solving a wide range of important applied problems. One of its features, which poses a limitation on its use in some areas, is that it produces a piecewise constant fitted response. For smoothing the fitted response, we introduce a regularization term in the MR, formulated as a least distance problem with monotonicity constraints. The resulting Smoothed Monotonic Regression (SMR) is a convex quadratic optimization problem. We focus on the SMR where the set of observations is completely (linearly) ordered. Our Smoothed Pool-Adjacent-Violators (SPAV) algorithm is designed for solving the SMR. It belongs to the class of dual active-set algorithms. We proved its finite convergence to the optimal solution in, at most, n iterations, where n is the problem size. One of its advantages is that the active set is progressively enlarged by including one or, typically, more constraints per iteration. This resulted in solving large-scale SMR test problems in a few iterations, whereas the size of those problems was prohibitively large for conventional quadratic optimization solvers. Although the complexity of the SPAV algorithm is O(n²), its running time grew in our computational experiments in proportion to n^1.16.

  • 11.
    Sysoev, Oleg
    et al.
    Linköpings universitet, Tekniska högskolan. Linköpings universitet, Institutionen för datavetenskap, Statistik.
    Grimvall, Anders
    Linköpings universitet, Tekniska högskolan. Linköpings universitet, Institutionen för datavetenskap, Statistik.
    Burdakov, Oleg
    Linköpings universitet, Tekniska högskolan. Linköpings universitet, Matematiska institutionen, Optimeringslära.
    Bootstrap estimation of the variance of the error term in monotonic regression models (2013). In: Journal of Statistical Computation and Simulation, ISSN 0094-9655, E-ISSN 1563-5163, Vol. 83, no. 4, pp. 625-638. Article in journal (Refereed)
    Abstract [en]

    The variance of the error term in ordinary regression models and linear smoothers is usually estimated by adjusting the average squared residual for the trace of the smoothing matrix (the degrees of freedom of the predicted response). However, other types of variance estimators are needed when using monotonic regression (MR) models, which are particularly suitable for estimating response functions with pronounced thresholds. Here, we propose a simple bootstrap estimator to compensate for the over-fitting that occurs when MR models are estimated from empirical data. Furthermore, we show that, in the case of one or two predictors, the performance of this estimator can be enhanced by introducing adjustment factors that take into account the slope of the response function and characteristics of the distribution of the explanatory variables. Extensive simulations show that our estimators perform satisfactorily for a great variety of monotonic functions and error distributions.
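
    A rough way to see the bootstrap idea, fitting the model to a resample and measuring residuals on data the fit has not seen, is sketched below. This is a simplified out-of-bag stand-in, not the paper's estimator with its slope- and distribution-based adjustment factors:

    ```python
    import numpy as np
    from sklearn.isotonic import IsotonicRegression

    def oob_error_variance(x, y, n_boot=200, seed=0):
        """Crude variance estimate for the error term: fit the monotonic model
        to bootstrap resamples and average squared residuals over the
        observations left out of each resample (out-of-bag)."""
        rng = np.random.default_rng(seed)
        x, y = np.asarray(x), np.asarray(y)
        n = len(x)
        total, count = 0.0, 0
        for _ in range(n_boot):
            idx = rng.integers(0, n, n)
            oob = np.setdiff1d(np.arange(n), idx)       # left-out points
            if oob.size == 0:
                continue
            iso = IsotonicRegression(out_of_bounds="clip").fit(x[idx], y[idx])
            res = y[oob] - iso.predict(x[oob])
            total += np.sum(res ** 2)
            count += oob.size
        return total / count
    ```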

  • 12.
    Sysoev, Oleg
    et al.
    Linköpings universitet, Institutionen för datavetenskap. Linköpings universitet, Tekniska högskolan.
    Burdakov, Oleg
    Linköpings universitet, Tekniska högskolan. Linköpings universitet, Matematiska institutionen, Optimeringslära.
    Grimvall, Anders
    Linköpings universitet, Institutionen för datavetenskap. Linköpings universitet, Tekniska högskolan.
    A segmentation-based algorithm for large-scale partially ordered monotonic regression (2011). In: Computational Statistics & Data Analysis, ISSN 0167-9473, E-ISSN 1872-7352, Vol. 55, no. 8, pp. 2463-2476. Article in journal (Refereed)
    Abstract [en]

    Monotonic regression (MR) is an efficient tool for estimating functions that are monotonic with respect to input variables. A fast and highly accurate approximate algorithm called the GPAV was recently developed for efficiently solving large-scale multivariate MR problems. When such problems are too large, the GPAV becomes too demanding in terms of computational time and memory. An approach that extends the application area of the GPAV to encompass much larger MR problems is presented. It is based on segmentation of a large-scale MR problem into a set of moderate-scale MR problems, each solved by the GPAV. The major contribution is the development of a computationally efficient strategy that produces a monotonic response using the local solutions. A theoretically motivated trend-following technique is introduced to ensure higher accuracy of the solution. The presented results of extensive simulations on very large data sets demonstrate the high efficiency of the new algorithm.
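
    In one dimension the segmentation strategy can be sketched compactly: solve each segment separately, then repair monotonicity across segment boundaries. This toy illustration uses scikit-learn's isotonic solver in place of the GPAV and ignores the partially ordered case:

    ```python
    import numpy as np
    from sklearn.isotonic import IsotonicRegression

    def segmented_isotonic(x, y, n_segments=10):
        """Fit isotonic regression segment by segment, then repair violations
        between segments with one more isotonic pass over the local fits."""
        order = np.argsort(x)
        x, y = np.asarray(x)[order], np.asarray(y)[order]
        pieces = []
        for seg in np.array_split(np.arange(len(x)), n_segments):
            iso = IsotonicRegression().fit(x[seg], y[seg])
            pieces.append(iso.predict(x[seg]))
        local = np.concatenate(pieces)
        # Each piece is monotone internally but may drop across a boundary;
        # an isotonic fit of the stitched values restores global monotonicity.
        return IsotonicRegression().fit(x, local).predict(x)
    ```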

  • 13.
    Sysoev, Oleg
    Linköpings universitet, Institutionen för datavetenskap. Linköpings universitet, Tekniska högskolan.
    Monotonic regression for large multivariate datasets (2010). Doctoral thesis, comprehensive summary (Other academic)
    Abstract [sv]

    Monotonic regression is a non-parametric statistical method developed specifically for applications in which the expected value of a response variable increases or decreases with one or more explanatory variables. Such applications exist in business economics, physics, biology, medicine, signal processing and other areas. Since many collected data sets can contain a very large number of multivariate observations, there is a strong need for efficient numerical algorithms. Here we present new methods that make it possible to fit monotone functions to more than 100,000 data points. Through simulation, we show that our algorithms have high accuracy and bring substantial improvements in computation time and memory requirements. In particular, we show how segmentation of a large-scale problem can greatly improve existing algorithms. In addition, we show how the uncertainty of a monotonic regression model can be estimated. One of the methods we have developed can be used to estimate the variance of the random components that may be present in the observed response variable. Other methods, based on so-called resampling, can provide confidence intervals for the expected response for given values of a number of predictors.

    List of papers
    1. An O(n²) algorithm for isotonic regression problems
    2006 (English). In: Large-Scale Nonlinear Optimization / [ed] G. Di Pillo and M. Roma, Springer-Verlag, 2006, pp. 25-33. Chapter in book, part of anthology (Refereed)
    Abstract [en]

    Large-Scale Nonlinear Optimization reviews and discusses recent advances in the development of methods and algorithms for nonlinear optimization and its applications, focusing on the large-dimensional case, the current forefront of much research.

    The chapters of the book, authored by some of the most active and well-known researchers in nonlinear optimization, give an updated overview of the field from different and complementary standpoints, including theoretical analysis, algorithmic development, implementation issues and applications.

    Place, publisher, year, edition, pages
    Springer-Verlag, 2006
    Series
    Nonconvex Optimization and Its Applications ; 83
    Keywords
    Quadratic programming, large scale optimization, least distance problem, isotonic regression, pool-adjacent-violators algorithm
    National subject category
    Computational Mathematics
    Identifiers
    urn:nbn:se:liu:diva-60581 (URN); 978-0-387-30063-4 (ISBN); 0-387-30065-1 (ISBN)
    Available from: 2010-10-20. Created: 2010-10-20. Last updated: 2015-06-02. Bibliographically reviewed
    2. Data preordering in generalized PAV algorithm for monotonic regression
    2006 (English). In: Journal of Computational Mathematics, ISSN 0254-9409, E-ISSN 1991-7139, Vol. 24, no. 6, pp. 771-790. Article in journal (Refereed). Published
    Abstract [en]

    Monotonic regression (MR) is a least distance problem with monotonicity constraints induced by a partially ordered data set of observations. In our recent publication [In Ser. Nonconvex Optimization and Its Applications, Springer-Verlag, (2006) 83, pp. 25-33], the Pool-Adjacent-Violators algorithm (PAV) was generalized from completely to partially ordered data sets (posets). The new algorithm, called GPAV, is characterized by very low computational complexity, which is of second order in the number of observations. It treats the observations in a consecutive order, and it can follow any arbitrarily chosen topological order of the poset of observations. The GPAV algorithm produces a sufficiently accurate solution to the MR problem, but the accuracy depends on the chosen topological order. Here we prove that there exists a topological order for which the resulting GPAV solution is optimal. Furthermore, we present results of extensive numerical experiments, from which we draw conclusions about the most and the least preferable topological orders.

    National subject category
    Mathematics
    Identifiers
    urn:nbn:se:liu:diva-36278 (URN); 30826 (Local ID); 30826 (Archive number); 30826 (OAI)
    Available from: 2009-10-10. Created: 2009-10-10. Last updated: 2017-12-13
    3. Generalized PAV algorithm with block refinement for partially ordered monotonic regression
    2009 (English). In: Proceedings of the Workshop on Learning Monotone Models from Data / [ed] A. Feelders and R. Potharst, 2009, pp. 23-37. Conference paper, published paper (Other academic)
    Abstract [en]

    In this paper, the monotonic regression problem (MR) is considered. We have recently generalized for MR the well-known Pool-Adjacent-Violators algorithm (PAV) from the case of completely to partially ordered data sets. The new algorithm, called GPAV, combines both high accuracy and low computational complexity, which grows quadratically with the problem size. The actual growth observed in practice is typically far lower than quadratic. The fitted values of the exact MR solution compose blocks of equal values. Its GPAV approximation also has a block structure. We present here a technique for refining blocks produced by the GPAV algorithm to make the new blocks closer to those in the exact solution. This substantially improves the accuracy of the GPAV solution and does not deteriorate its computational complexity. The computational time for the new technique is approximately triple the time of running the GPAV algorithm. Its efficiency is demonstrated by results of our numerical experiments.

    Keywords
    Monotonic regression, Partially ordered data set, Pool-adjacent-violators algorithm, Quadratic programming, Large scale optimization, Least distance problem.
    National subject category
    Computational Mathematics
    Identifiers
    urn:nbn:se:liu:diva-52535 (URN)
    Conference
    the European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, Bled, Slovenia, September 7-11, 2009
    Available from: 2010-01-02. Created: 2010-01-02. Last updated: 2017-12-06
    4. A segmentation-based algorithm for large-scale partially ordered monotonic regression
    2011 (English). In: Computational Statistics & Data Analysis, ISSN 0167-9473, E-ISSN 1872-7352, Vol. 55, no. 8, pp. 2463-2476. Article in journal (Refereed). Published
    Abstract [en]

    Monotonic regression (MR) is an efficient tool for estimating functions that are monotonic with respect to input variables. A fast and highly accurate approximate algorithm called the GPAV was recently developed for efficiently solving large-scale multivariate MR problems. When such problems are too large, the GPAV becomes too demanding in terms of computational time and memory. An approach that extends the application area of the GPAV to encompass much larger MR problems is presented. It is based on segmentation of a large-scale MR problem into a set of moderate-scale MR problems, each solved by the GPAV. The major contribution is the development of a computationally efficient strategy that produces a monotonic response using the local solutions. A theoretically motivated trend-following technique is introduced to ensure higher accuracy of the solution. The presented results of extensive simulations on very large data sets demonstrate the high efficiency of the new algorithm.

    Place, publisher, year, edition, pages
    Elsevier Science B.V., Amsterdam, 2011
    Keywords
    Quadratic programming, Large-scale optimization, Least distance problem, Monotonic regression, Partially ordered data set, Pool-adjacent-violators algorithm
    National subject category
    Social Sciences
    Identifiers
    urn:nbn:se:liu:diva-69182 (URN); 10.1016/j.csda.2011.03.001 (DOI); 000291181000002 ()
    Available from: 2011-06-17. Created: 2011-06-17. Last updated: 2017-12-11
    5. Bootstrap estimation of the variance of the error term in monotonic regression models
    2013 (English). In: Journal of Statistical Computation and Simulation, ISSN 0094-9655, E-ISSN 1563-5163, Vol. 83, no. 4, pp. 625-638. Article in journal (Refereed). Published
    Abstract [en]

    The variance of the error term in ordinary regression models and linear smoothers is usually estimated by adjusting the average squared residual for the trace of the smoothing matrix (the degrees of freedom of the predicted response). However, other types of variance estimators are needed when using monotonic regression (MR) models, which are particularly suitable for estimating response functions with pronounced thresholds. Here, we propose a simple bootstrap estimator to compensate for the over-fitting that occurs when MR models are estimated from empirical data. Furthermore, we show that, in the case of one or two predictors, the performance of this estimator can be enhanced by introducing adjustment factors that take into account the slope of the response function and characteristics of the distribution of the explanatory variables. Extensive simulations show that our estimators perform satisfactorily for a great variety of monotonic functions and error distributions.

    Place, publisher, year, edition, pages
    Taylor & Francis Group, 2013
    Keywords
    uncertainty estimation; bootstrap; monotonic regression; pool-adjacent-violators algorithm
    National subject category
    Probability Theory and Statistics
    Identifiers
    urn:nbn:se:liu:diva-78858 (URN); 10.1080/00949655.2011.631138 (DOI); 000317276900003 ()
    Available from: 2012-06-21. Created: 2012-06-21. Last updated: 2017-12-07
    6. Bootstrap confidence intervals for large-scale multivariate monotonic regression problems
    2016 (English). In: Communications in statistics. Simulation and computation, ISSN 0361-0918, E-ISSN 1532-4141, Vol. 45, no. 3, pp. 1025-1040. Article in journal (Refereed). Published
    Abstract [en]

    Recently, the methods used to estimate monotonic regression (MR) models have been substantially improved, and some algorithms can now produce high-accuracy monotonic fits to multivariate datasets containing over a million observations. Nevertheless, the computational burden can be prohibitively large for resampling techniques in which numerous datasets are processed independently of each other. Here, we present efficient algorithms for estimation of confidence limits in large-scale settings that take into account the similarity of the bootstrap or jackknifed datasets to which MR models are fitted. In addition, we introduce modifications that substantially improve the accuracy of MR solutions for binary response variables. The performance of our algorithms is illustrated using data on death in coronary heart disease for a large population. This example also illustrates that MR can be a valuable complement to logistic regression.

    Place, publisher, year, edition, pages
    Taylor & Francis, 2016
    Keywords
    Big data, Bootstrap, Confidence intervals, Monotonic regression, Pool-adjacent-violators algorithm
    National subject category
    Probability Theory and Statistics; Computational Mathematics
    Identifiers
    urn:nbn:se:liu:diva-85169 (URN); 10.1080/03610918.2014.911899 (DOI); 000372527900014 ()
    Note

    At the time of the doctoral defence, the publication existed as a manuscript.

    Available from: 2012-11-08. Created: 2012-11-08. Last updated: 2017-12-13
  • 14.
    Burdakov, Oleg
    et al.
    Linköpings universitet, Matematiska institutionen, Optimeringslära. Linköpings universitet, Tekniska högskolan.
    Grimvall, Anders
    Linköpings universitet, Institutionen för datavetenskap, Statistik. Linköpings universitet, Filosofiska fakulteten.
    Sysoev, Oleg
    Linköpings universitet, Institutionen för datavetenskap, Statistik. Linköpings universitet, Filosofiska fakulteten.
    Generalized PAV algorithm with block refinement for partially ordered monotonic regression (2009). In: Proceedings of the Workshop on Learning Monotone Models from Data / [ed] A. Feelders and R. Potharst, 2009, pp. 23-37. Conference paper (Other academic)
    Abstract [en]

    In this paper, the monotonic regression problem (MR) is considered. We have recently generalized for MR the well-known Pool-Adjacent-Violators algorithm (PAV) from the case of completely to partially ordered data sets. The new algorithm, called GPAV, combines both high accuracy and low computational complexity, which grows quadratically with the problem size. The actual growth observed in practice is typically far lower than quadratic. The fitted values of the exact MR solution compose blocks of equal values. Its GPAV approximation also has a block structure. We present here a technique for refining blocks produced by the GPAV algorithm to make the new blocks closer to those in the exact solution. This substantially improves the accuracy of the GPAV solution and does not deteriorate its computational complexity. The computational time for the new technique is approximately triple the time of running the GPAV algorithm. Its efficiency is demonstrated by results of our numerical experiments.

  • 15.
    Burdakov, Oleg
    et al.
    Linköpings universitet, Tekniska högskolan. Linköpings universitet, Matematiska institutionen, Optimeringslära.
    Grimvall, Anders
    Linköpings universitet, Filosofiska fakulteten. Linköpings universitet, Institutionen för datavetenskap, Statistik.
    Sysoev, Oleg
    Linköpings universitet, Filosofiska fakulteten. Linköpings universitet, Institutionen för datavetenskap, Statistik.
    Kapyrin, Ivan
    Institute of Numerical Mathematics Russian Academy of Sciences, Moscow, Russia.
    Vassilevski, Yuri
    Institute of Numerical Mathematics Russian Academy of Sciences, Moscow, Russia.
    Monotonic data fitting and interpolation with application to postprocessing of FE solutions (2007). In: CERFACS 20th Anniversary Conference on High-performance Computing, 2007, pp. 11-12. Conference paper (Other academic)
    Abstract [en]

    In this talk we consider the isotonic regression (IR) problem, which can be formulated as follows: given a vector $\bar{x} \in R^n$, find $x_* \in R^n$ which solves the problem
    $$\min \|x-\bar{x}\|^2 \quad \text{s.t.} \quad Mx \ge 0. \qquad (1)$$
    The set of constraints $Mx \ge 0$ represents here the monotonicity relations of the form $x_i \le x_j$ for a given set of pairs of the components of $x$. The corresponding row of the matrix $M$ is composed mainly of zeros, but its $i$th and $j$th elements are equal to $-1$ and $+1$, respectively. The most challenging applications of (1) are characterized by very large values of $n$. We introduce new IR algorithms. Our numerical experiments demonstrate the high efficiency of our algorithms, especially for very large-scale problems, and their robustness. They are able to solve some problems which all existing IR algorithms fail to solve. We also outline our new algorithms for monotonicity-preserving interpolation of scattered multivariate data. In this talk we focus on the application of our IR algorithms to postprocessing of FE solutions. Non-monotonicity of the numerical solution is a typical drawback of the conventional methods of approximation, such as finite elements (FE), finite volumes, and mixed finite elements. The problem of monotonicity is particularly important in cases of highly anisotropic diffusion tensors or distorted unstructured meshes. For instance, in nuclear waste transport simulation, the non-monotonicity results in the presence of negative concentrations, which may lead to unacceptable concentrations and failure of the chemistry calculations. Another drawback of the conventional methods is a possible violation of the discrete maximum principle, which establishes lower and upper bounds for the solution. We suggest here a least-change correction to the available FE solution $\bar{x} \in R^n$. This postprocessing procedure is aimed at recovering the monotonicity and some other important properties that may not be exhibited by $\bar{x}$. The mathematical formulation of the postprocessing problem is reduced to the following convex quadratic programming problem
    $$\min \|x-\bar{x}\|^2 \quad \text{s.t.} \quad Mx \ge 0, \quad l \le x \le u, \quad e^Tx = m, \qquad (2)$$
    where $e=(1,1,\ldots,1)^T \in R^n$. The set of constraints $Mx \ge 0$ represents here the monotonicity relations between some of the adjacent mesh cells. The constraints $l \le x \le u$ originate from the discrete maximum principle. The last constraint formulates the conservativity requirement. The postprocessing based on (2) is typically a large-scale problem. We introduce here algorithms for solving this problem. They are based on the observation that, in the presence of the monotonicity constraints only, problem (2) is the classical monotonic regression problem, which can be solved efficiently by some of the available monotonic regression algorithms. This solution is then used for producing the optimal solution to problem (2) in the presence of all the constraints. We present results of numerical experiments to illustrate the efficiency of our algorithms.
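
    Problem (2) is a standard convex quadratic program, so the postprocessing step is easy to prototype with a generic solver before turning to the specialized large-scale algorithms the talk describes. A minimal sketch, assuming the monotonicity pairs, bounds and total mass m are given:

    ```python
    import cvxpy as cp

    def postprocess_fe_solution(x_bar, pairs, l, u, m):
        """Least-change correction of an FE solution x_bar: enforce the
        monotonicity relations x[i] <= x[j] for (i, j) in pairs, the bounds
        l <= x <= u from the discrete maximum principle, and conservativity."""
        x = cp.Variable(len(x_bar))
        constraints = [x[i] <= x[j] for i, j in pairs]
        constraints += [x >= l, x <= u, cp.sum(x) == m]
        cp.Problem(cp.Minimize(cp.sum_squares(x - x_bar)), constraints).solve()
        return x.value
    ```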

  • 16.
    Burdakov, Oleg
    et al.
    Linköpings universitet, Tekniska högskolan. Linköpings universitet, Matematiska institutionen, Optimeringslära.
    Grimvall, Anders
    Linköpings universitet, Filosofiska fakulteten. Linköpings universitet, Institutionen för datavetenskap, Statistik.
    Sysoev, Oleg
    Linköpings universitet, Filosofiska fakulteten. Linköpings universitet, Institutionen för datavetenskap, Statistik.
    New optimization algorithms for large-scale isotonic regression in L2-norm (2007). In: EUROPT-OMS Conference on Optimization, 2007, University of Hradec Kralove, Czech Republic: Guadeamus, 2007, p. 44. Conference paper (Other academic)
    Abstract [en]

    Isotonic regression (IR) has numerous important applications in statistics, operations research, biology, image and signal processing and other areas. IR in the L2-norm is a minimization problem in which the objective function is the squared Euclidean distance from a given point to a convex set defined by monotonicity constraints of the form: the i-th component of the decision vector is less than or equal to its j-th component. Unfortunately, conventional optimization methods are unable to solve IR problems originating from large data sets. The existing IR algorithms, such as the minimum lower sets algorithm by Brunk, the min-max algorithm by Lee, the network flow algorithm by Maxwell & Muchstadt and the IBCR algorithm by Block et al., are able to find an exact solution to the IR problem for at most a few thousand variables. The IBCR algorithm, which proved to be the most efficient of them, is not robust enough. An alternative approach is to solve the IR problem approximately. Following this approach, Burdakov et al. developed an algorithm, called GPAV, whose block refinement extension, GPAVR, is able to solve IR problems with very high accuracy in a far shorter time than the exact algorithms. Apart from this, GPAVR is a very robust algorithm, and it allows us to solve IR problems with over a hundred thousand variables. In this talk, we introduce new exact IR algorithms, which can be viewed as active set methods. They use the approximate solution produced by the GPAVR algorithm as a starting point. We present results of our numerical experiments demonstrating the high efficiency of the new algorithms, especially for very large-scale problems, and their robustness. They are able to solve the problems which all existing exact IR algorithms fail to solve.

  • 17.
    Sysoev, Oleg
    et al.
    Linköpings universitet, Filosofiska fakulteten. Linköpings universitet, Institutionen för datavetenskap, Statistik.
    Burdakov, Oleg
    Linköpings universitet, Tekniska högskolan. Linköpings universitet, Matematiska institutionen, Optimeringslära.
    Grimvall, Anders
    Linköpings universitet, Filosofiska fakulteten. Linköpings universitet, Institutionen för datavetenskap, Statistik.
    New optimization methods for isotonic regression in L1 norm (2007). In: EUROPT-OMS Conference on Optimization, 2007, University of Hradec Kralove, Czech Republic: Guadeamus, 2007, p. 133. Conference paper (Other academic)
    Abstract [en]

    Isotonic regression (IR) has numerous important applications in statistics, operations research, biology, image and signal processing and other areas. IR is a minimization problem with the objective function defined by the distance from a given point to a convex set defined by monotonicity constraints of the form: the i-th component of the decision vector is less than or equal to its j-th component. The distance in IR is usually associated with the Lp norm, whereas the norms L1 and L2 are of the highest practical interest. Conventional optimization methods are unable to solve large-scale IR problems originating from large data sets. Historically, the major efforts were focused on the IR problem in the L2 norm. Exact algorithms such as the minimum lower sets algorithm by Brunk, the min-max algorithm by Lee, the network flow algorithm by Maxwell & Muchstadt and the IBCR algorithm by Block et al. were developed. Among them, the IBCR algorithm has proved to be the most numerically efficient, but it is not robust enough. An alternative approach is to solve the IR problem approximately. Following this approach, Burdakov et al. developed the GPAV algorithm, whose block refinement extension, GPAVR, is able to solve the IR problem with high accuracy in a far shorter time than the exact algorithms. Apart from this, GPAVR is a very robust algorithm. Unfortunately, for the L1 norm there are no algorithms as efficient as those for the L2 norm. In our talk, we introduce new algorithms, GPAVR1 and IBCR1. They are extensions of the algorithms GPAV and IBCR to the L1 norm. We also present results of numerical experiments, which demonstrate the high efficiency of the new algorithms, especially for very large-scale problems.
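
    Small instances of the L1 problem can likewise be prototyped with a generic convex solver before reaching for the specialized GPAVR1 and IBCR1 algorithms the talk introduces. A sketch for completely ordered data, using the cvxpy library:

    ```python
    import cvxpy as cp

    def isotonic_l1(y):
        """Isotonic regression in the L1 norm for completely ordered data:
        minimize sum |x_i - y_i| subject to x non-decreasing. Pooled values
        are medians rather than the means of the L2 case."""
        x = cp.Variable(len(y))
        cp.Problem(cp.Minimize(cp.norm1(x - y)), [cp.diff(x) >= 0]).solve()
        return x.value
    ```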

  • 18.
    Burdakov, Oleg
    et al.
    Linköpings universitet, Tekniska högskolan. Linköpings universitet, Matematiska institutionen, Optimeringslära.
    Sysoev, Oleg
    Linköpings universitet, Filosofiska fakulteten. Linköpings universitet, Matematiska institutionen, Statistik.
    Grimvall, Anders
    Linköpings universitet, Filosofiska fakulteten. Linköpings universitet, Matematiska institutionen, Statistik.
    Hussian, Mohamed
    Linköpings universitet, Filosofiska fakulteten. Linköpings universitet, Matematiska institutionen, Statistik.
    An O(n²) algorithm for isotonic regression (2006). In: Large-Scale Nonlinear Optimization / [ed] Pillo, Gianni; Roma, Massimo, New York: Springer Science+Business Media B.V., 2006, pp. 25-33. Conference paper (Other academic)
    Abstract [en]

    We consider the problem of minimizing the distance from a given n-dimensional vector to a set defined by constraints of the form x_i ≤ x_j. Such constraints induce a partial order of the components x_i, which can be illustrated by an acyclic directed graph. This problem is also known as the isotonic regression (IR) problem. IR has important applications in statistics, operations research and signal processing, with most of them characterized by a very large value of n. For such large-scale problems, it is of great practical importance to develop algorithms whose complexity does not rise with n too rapidly. The existing optimization-based algorithms and statistical IR algorithms have either too high computational complexity or too low accuracy of the approximation to the optimal solution they generate. We introduce a new IR algorithm, which can be viewed as a generalization of the Pool-Adjacent-Violators (PAV) algorithm from completely to partially ordered data. Our algorithm combines both low computational complexity O(n²) and high accuracy. This allows us to obtain sufficiently accurate solutions to IR problems with thousands of observations.

  • 19.
    Burdakov, Oleg
    et al.
    Linköpings universitet, Matematiska institutionen, Optimeringslära. Linköpings universitet, Tekniska högskolan.
    Sysoev, Oleg
    Linköpings universitet, Institutionen för datavetenskap, Statistik. Linköpings universitet, Filosofiska fakulteten.
    Grimvall, Anders
    Linköpings universitet, Institutionen för datavetenskap, Statistik. Linköpings universitet, Filosofiska fakulteten.
    Hussian, Mohammed
    Linköpings universitet, Matematiska institutionen, Statistik. Linköpings universitet, Filosofiska fakulteten.
    An O(n²) algorithm for isotonic regression problems (2006). In: Large-Scale Nonlinear Optimization / [ed] G. Di Pillo and M. Roma, Springer-Verlag, 2006, pp. 25-33. Chapter in book, part of anthology (Refereed)
    Abstract [en]

    Large-Scale Nonlinear Optimization reviews and discusses recent advances in the development of methods and algorithms for nonlinear optimization and its applications, focusing on the large-dimensional case, the current forefront of much research.

    The chapters of the book, authored by some of the most active and well-known researchers in nonlinear optimization, give an updated overview of the field from different and complementary standpoints, including theoretical analysis, algorithmic development, implementation issues and applications.

  • 20.
    Burdakov, Oleg
    et al.
    Linköpings universitet, Tekniska högskolan. Linköpings universitet, Matematiska institutionen, Optimeringslära.
    Grimvall, Anders
    Linköpings universitet, Filosofiska fakulteten. Linköpings universitet, Matematiska institutionen, Statistik.
    Sysoev, Oleg
    Linköpings universitet, Matematiska institutionen.
    Data preordering in generalized PAV algorithm for monotonic regression (2006). Report (Other academic)
  • 21.
    Burdakov, Oleg
    et al.
    Linköpings universitet, Matematiska institutionen. Linköpings universitet, Tekniska högskolan.
    Grimvall, Anders
    Linköpings universitet, Tekniska högskolan. Linköpings universitet, Matematiska institutionen, Optimeringslära.
    Sysoev, Oleg
    Linköpings universitet, Matematiska institutionen. Linköpings universitet, Tekniska högskolan.
    Data preordering in generalized PAV algorithm for monotonic regression (2006). In: Journal of Computational Mathematics, ISSN 0254-9409, E-ISSN 1991-7139, Vol. 24, no. 6, pp. 771-790. Article in journal (Refereed)
    Abstract [en]

    Monotonic regression (MR) is a least distance problem with monotonicity constraints induced by a partially ordered data set of observations. In our recent publication [In Ser. Nonconvex Optimization and Its Applications, Springer-Verlag, (2006) 83, pp. 25-33], the Pool-Adjacent-Violators algorithm (PAV) was generalized from completely to partially ordered data sets (posets). The new algorithm, called GPAV, is characterized by very low computational complexity, which is of second order in the number of observations. It treats the observations in a consecutive order, and it can follow any arbitrarily chosen topological order of the poset of observations. The GPAV algorithm produces a sufficiently accurate solution to the MR problem, but the accuracy depends on the chosen topological order. Here we prove that there exists a topological order for which the resulting GPAV solution is optimal. Furthermore, we present results of extensive numerical experiments, from which we draw conclusions about the most and the least preferable topological orders.

  • 22.
    Burdakov, Oleg
    et al.
    Linköpings universitet, Matematiska institutionen, Optimeringslära. Linköpings universitet, Tekniska högskolan.
    Grimvall, Anders
    Linköpings universitet, Institutionen för datavetenskap, Statistik. Linköpings universitet, Filosofiska fakulteten.
    Hussian, Mohamed
    Linköpings universitet, Institutionen för datavetenskap, Statistik. Linköpings universitet, Filosofiska fakulteten.
    Sysoev, Oleg
    Linköpings universitet, Institutionen för datavetenskap, Statistik. Linköpings universitet, Filosofiska fakulteten.
    Hasse diagrams and the generalized PAV-algorithm for monotonic regression in several explanatory variables (2005). In: Computational Statistics and Data Analysis, ISSN 0167-9473. Article in journal (Refereed)
    Abstract [en]

    Monotonic regression is a nonparametric method for estimation of models in which the expected value of a response variable y increases or decreases in all coordinates of a vector of explanatory variables x = (x_1, …, x_p). Here, we examine statistical and computational aspects of our recently proposed generalization of the pool-adjacent-violators (PAV) algorithm from one to several explanatory variables. In particular, we show how the goodness-of-fit and accuracy of obtained solutions can be enhanced by presorting observed data with respect to their level in a Hasse diagram of the partial order of the observed x-vectors, and we also demonstrate how these calculations can be carried out to save computer memory and computational time. Monte Carlo simulations illustrate how rapidly the mean square difference between fitted and expected response values tends to zero, and how quickly the mean square residual approaches the true variance of the random error, as the number of observations increases up to 10⁴.
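
    The presorting step described above amounts to ordering observations by their level in the partial order of the x-vectors before the PAV-type pass. A small sketch of that preprocessing step alone, with levels computed as longest-chain depth (the generalized PAV pass itself is not shown):

    ```python
    import numpy as np

    def hasse_levels(X):
        """Level of each observation in the componentwise partial order
        x <= x': level = length of the longest chain of strictly dominated
        predecessors. Sorting by level yields a topological order. O(n^2 p)."""
        X = np.asarray(X)
        level = np.zeros(len(X), dtype=int)
        # Processing by increasing coordinate sum guarantees that every
        # predecessor of a point is processed before the point itself.
        for i in np.argsort(X.sum(axis=1)):
            preds = np.all(X <= X[i], axis=1) & np.any(X < X[i], axis=1)
            if preds.any():
                level[i] = level[preds].max() + 1
        return level

    X = np.array([[0, 0], [1, 0], [0, 1], [1, 1], [2, 1]])
    print(hasse_levels(X))  # -> [0 1 1 2 3]
    ```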

  • 23.
    Hussian, Mohamed
    et al.
    Linköpings universitet, Institutionen för datavetenskap, Statistik. Linköpings universitet, Filosofiska fakulteten.
    Grimvall, Anders
    Linköpings universitet, Institutionen för datavetenskap, Statistik. Linköpings universitet, Filosofiska fakulteten.
    Burdakov, Oleg
    Linköpings universitet, Matematiska institutionen, Optimeringslära. Linköpings universitet, Tekniska högskolan.
    Sysoev, Oleg
    Linköpings universitet, Institutionen för datavetenskap, Statistik. Linköpings universitet, Filosofiska fakulteten.
    Monotonic regression for the detection of temporal trends in environmental quality data (2005). In: Match, ISSN 0340-6253, Vol. 54, no. 3, pp. 535-550. Article in journal (Refereed)
  • 24.
    Burdakov, Oleg
    et al.
    Linköpings universitet, Tekniska högskolan. Linköpings universitet, Matematiska institutionen, Optimeringslära.
    Sysoev, Oleg
    Linköpings universitet, Institutionen för datavetenskap, Statistik.
    Grimvall, Anders
    Linköpings universitet, Filosofiska fakulteten. Linköpings universitet, Matematiska institutionen, Statistik.
    Hussian, Mohamed
    Linköpings universitet, Filosofiska fakulteten. Linköpings universitet, Matematiska institutionen, Statistik.
    An algorithm for isotonic regression problems (2004). In: European Congress on Computational Methods in Applied Sciences and Engineering ECCOMAS / [ed] P. Neittaanmäki, T. Rossi, K. Majava and O. Pironneau, Jyväskylä: University of Jyväskylä, 2004, pp. 1-9. Conference paper (Refereed)
    Abstract [en]

    We consider the problem of minimizing the distance from a given n-dimensional vector to a set defined by constraints of the form x_i ≤ x_j. Such constraints induce a partial order of the components x_i, which can be illustrated by an acyclic directed graph. This problem is known as the isotonic regression (IR) problem. It has important applications in statistics, operations research and signal processing. Most applied IR problems are characterized by a very large value of n. For such large-scale problems, it is of great practical importance to develop algorithms whose complexity does not rise with n too rapidly. The existing optimization-based algorithms and statistical IR algorithms have either too high computational complexity or too low accuracy of the approximation to the optimal solution they generate. We introduce a new IR algorithm, which can be viewed as a generalization of the Pool-Adjacent-Violators (PAV) algorithm from completely to partially ordered data. Our algorithm combines both low computational complexity O(n²) and high accuracy. This allows us to obtain sufficiently accurate solutions to IR problems with thousands of observations.
