Subspace Selection Techniques for Classification Problems
2002 (English) Licentiate thesis, monograph (Other academic)
The main topic of this thesis is linear subspaces for regression - how to find the subspaces and how to evaluate them. The motivation for doing regression in a subspace is numerical as well as computational - numerical in the sense that the subspace can filter out the relevant components or features of the problem, computational in the sense that this filtering can be done quickly, after which nonlinear prediction, by artificial neural networks for instance, can be conducted in a lower dimensionality.
The theory is developed in a versatile regression framework into which both discrete (classification) and continuous (quantification) regression fit. The goal is to find good predictors of future outcomes from past observations. The foundation is the assumption that observations (or measurements) are drawn from a probability distribution, and from there the theory is developed towards practical results and algorithms. The emphasis, however, is on classification problems.
The applications are many, ranging from identification of dynamical systems to data mining and compression. Particular interest is given to the processing of sensor data - how to learn something from calibration measurements that can in turn be used to learn something about future unknown samples. In focus are the electronic nose (smell sensor) and the electronic tongue (taste sensor).
Three new algorithms are introduced and described. The Asymmetric Class Projection is a computationally efficient method for finding subspaces for classification between two classes with a small difference in mean and a large difference in covariance. The Optimal Discriminative Projection (ODP) is an algorithm that uses a particular composition of Givens rotations to parameterize all subspaces; the subspaces are optimized for classification. The Clustered Regression Analysis uses the ODP subspace for conditional expectation prediction.
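The thesis itself does not give code, but the idea of parameterizing subspaces by a composition of Givens rotations can be illustrated with a minimal sketch. The plane ordering used below is one generic convention, not necessarily the particular composition used in the ODP algorithm; the function names are hypothetical.

```python
import numpy as np

def givens(n, i, j, theta):
    """Return the n x n Givens rotation acting in the (i, j) coordinate plane."""
    G = np.eye(n)
    c, s = np.cos(theta), np.sin(theta)
    G[i, i] = c
    G[j, j] = c
    G[i, j] = -s
    G[j, i] = s
    return G

def subspace_from_angles(n, k, angles):
    """Build an orthonormal basis of a k-dimensional subspace of R^n by
    applying a sequence of Givens rotations to the first k coordinate axes.
    Optimizing over the angles then sweeps over candidate subspaces, which
    is the kind of parameterization the ODP relies on (the exact plane
    sequence here is an assumption)."""
    Q = np.eye(n)
    idx = 0
    for j in range(k):
        for i in range(j + 1, n):
            Q = Q @ givens(n, j, i, angles[idx])
            idx += 1
    # The first k columns span the rotated subspace.
    return Q[:, :k]
```

Because each Givens factor is orthogonal, the returned columns are always orthonormal, so a classification criterion can be optimized freely over the angles without any orthogonality constraints.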
Place, publisher, year, edition, pages
Linköping: Linköping University, 2002. 92 p.
Linköping Studies in Science and Technology. Thesis, ISSN 0280-7971 ; 995
Linear subspaces for regression, Asymmetric classification, Subspace parameterization, Clustered regression analysis
Identifiers
URN: urn:nbn:se:liu:diva-98167
Local ID: LiU-TEK-LIC-2002:68
ISBN: 91-7373-575-2
OAI: oai:DiVA.org:liu-98167
DiVA: diva2:652348
Ljung, Lennart, Professor
Funder: Swedish Research Council