liu.se: Search for publications in DiVA
1 - 7 of 7
  • 1.
    Andersson, Mats
    et al.
    Linköping University, Department of Biomedical Engineering, Medical Informatics. Linköping University, The Institute of Technology.
    Burdakov, Oleg
    Linköping University, Department of Mathematics, Optimization. Linköping University, The Institute of Technology.
    Knutsson, Hans
    Linköping University, Department of Biomedical Engineering, Medical Informatics. Linköping University, The Institute of Technology.
    Zikrin, Spartak
    Linköping University, Department of Mathematics, Mathematics and Applied Mathematics. Linköping University, The Institute of Technology.
    Global search strategies for solving multilinear least-squares problems (2012). In: Sultan Qaboos University Journal for Science, ISSN 1027-524X, Vol. 17, no 1, p. 12-21. Article in journal (Refereed)
    Abstract [en]

    The multilinear least-squares (MLLS) problem is an extension of the linear least-squares problem. The difference is that a multilinear operator is used in place of a matrix-vector product. The MLLS is typically a large-scale problem characterized by a large number of local minimizers. It originates, for instance, from the design of filter networks. We present a global search strategy that allows for moving from one local minimizer to a better one. The efficiency of this strategy is illustrated by results of numerical experiments performed for some problems related to the design of filter networks.

  • 2.
    Andersson, Mats
    et al.
    Linköping University, Department of Biomedical Engineering, Medical Informatics. Linköping University, The Institute of Technology.
    Burdakov, Oleg
    Linköping University, Department of Mathematics, Optimization. Linköping University, The Institute of Technology.
    Knutsson, Hans
    Linköping University, Department of Biomedical Engineering, Medical Informatics. Linköping University, The Institute of Technology.
    Zikrin, Spartak
    Linköping University, Department of Mathematics. Linköping University, The Institute of Technology.
    Global Search Strategies for Solving Multilinear Least-squares Problems (2011). Report (Other academic)
    Abstract [en]

    The multilinear least-squares (MLLS) problem is an extension of the linear least-squares problem. The difference is that a multilinear operator is used in place of a matrix-vector product. The MLLS is typically a large-scale problem characterized by a large number of local minimizers. It originates, for instance, from the design of filter networks. We present a global search strategy that allows for moving from one local minimizer to a better one. The efficiency of this strategy is illustrated by results of numerical experiments performed for some problems related to the design of filter networks.
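    For orientation, the MLLS problem described in the two entries above can be stated generically as follows; the notation is assumed here for illustration and is not taken from the papers. The operator A is linear in each block of variables when the remaining blocks are held fixed:

        % Generic multilinear least-squares (MLLS) problem (notation assumed for illustration).
        % A is linear in each argument x^i when the other arguments are held fixed.
        \min_{x^1,\dots,x^L} \; \tfrac12 \, \bigl\| b - A(x^1, x^2, \dots, x^L) \bigr\|_2^2,
        \qquad
        A(\dots, \alpha u + \beta v, \dots) = \alpha\, A(\dots, u, \dots) + \beta\, A(\dots, v, \dots).

    Fixing all blocks but one reduces the objective to an ordinary linear least-squares problem in the remaining block, so local (alternating) methods are natural here; the many local minimizers such methods can converge to are what the global search strategy of these papers is designed to escape.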

  • 3.
    Andersson, Mats
    et al.
    Linköping University, Department of Biomedical Engineering. Linköping University, The Institute of Technology.
    Burdakov, Oleg
    Linköping University, Department of Mathematics, Optimization. Linköping University, The Institute of Technology.
    Knutsson, Hans
    Linköping University, Department of Biomedical Engineering, Medical Informatics. Linköping University, The Institute of Technology.
    Zikrin, Spartak
    Linköping University, Department of Mathematics. Linköping University, The Institute of Technology.
    Sparsity Optimization in Design of Multidimensional Filter Networks (2013). Report (Other academic)
    Abstract [en]

    Filter networks are a powerful tool for reducing the image processing time while maintaining reasonably high image quality. They are composed of sparse sub-filters whose low sparsity level ensures fast image processing. The filter network design is related to solving a sparse optimization problem where a cardinality constraint bounds above the sparsity level. In the case of sequentially connected sub-filters, which is the simplest network structure of those considered in this paper, a cardinality-constrained multilinear least-squares (MLLS) problem is to be solved. Even when the cardinality constraint is disregarded, the MLLS is typically a large-scale problem characterized by a large number of local minimizers. Each of the local minimizers is singular and non-isolated. The cardinality constraint makes the problem even more difficult to solve. An approach for approximately solving the cardinality-constrained MLLS problem is presented. It is then applied to solving a bi-criteria optimization problem in which both the time and quality of image processing are optimized. The developed approach is extended to designing filter networks of a more general structure. Its efficiency is demonstrated by designing certain 2D and 3D filter networks. It is also compared with the existing approaches.

  • 4.
    Andersson, Mats
    et al.
    Linköping University, Department of Biomedical Engineering, Medical Informatics. Linköping University, The Institute of Technology. Linköping University, Center for Medical Image Science and Visualization (CMIV).
    Burdakov, Oleg
    Linköping University, Department of Mathematics, Optimization. Linköping University, The Institute of Technology.
    Knutsson, Hans
    Linköping University, Department of Biomedical Engineering, Medical Informatics. Linköping University, The Institute of Technology.
    Zikrin, Spartak
    Linköping University, Department of Mathematics, Optimization. Linköping University, The Institute of Technology.
    Sparsity Optimization in Design of Multidimensional Filter Networks (2015). In: Optimization and Engineering, ISSN 1389-4420, E-ISSN 1573-2924, Vol. 16, no 2, p. 259-277. Article in journal (Refereed)
    Abstract [en]

    Filter networks are a powerful tool for reducing the image processing time while maintaining high image quality. They are composed of sparse sub-filters whose high sparsity ensures fast image processing. The filter network design is related to solving a sparse optimization problem where a cardinality constraint bounds above the sparsity level. In the case of sequentially connected sub-filters, which is the simplest network structure of those considered in this paper, a cardinality-constrained multilinear least-squares (MLLS) problem is to be solved. Even when disregarding the cardinality constraint, the MLLS is typically a large-scale problem characterized by a large number of local minimizers, each of which is singular and non-isolated. The cardinality constraint makes the problem even more difficult to solve.

    An approach for approximately solving the cardinality-constrained MLLS problem is presented. It is then applied to solving a bi-criteria optimization problem in which both the time and quality of image processing are optimized. The developed approach is extended to designing filter networks of a more general structure. Its efficiency is demonstrated by designing certain 2D and 3D filter networks. It is also compared with the existing approaches.
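    To make the cardinality-constrained MLLS setting of the two entries above concrete, the sketch below works through the smallest multilinear case: a toy bilinear model fitted by alternating least squares, with hard thresholding used to keep at most k nonzeros per coefficient block. This is an illustrative sketch under assumed notation, not the approach developed in the papers; the function names and the thresholding heuristic are the editor's assumptions.

        # Toy bilinear least-squares model  b ~ (A1 @ x1) * (A2 @ x2)  (elementwise),
        # i.e. the smallest multilinear case, with a hard cardinality bound k on each
        # coefficient block. Illustrative only; NOT the authors' algorithm.
        import numpy as np


        def hard_threshold(x, k):
            """Keep the k largest-magnitude entries of x; zero out the rest."""
            out = np.zeros_like(x)
            keep = np.argsort(np.abs(x))[-k:]
            out[keep] = x[keep]
            return out


        def als_cardinality(A1, A2, b, k, iters=50, seed=0):
            """Alternate between the two blocks; each half-step is an ordinary
            linear least-squares problem because the model is linear in one
            block when the other block is held fixed."""
            rng = np.random.default_rng(seed)
            x1 = rng.standard_normal(A1.shape[1])
            x2 = rng.standard_normal(A2.shape[1])
            for _ in range(iters):
                # Fix x2: rows of A1 are scaled by (A2 @ x2); solve for x1, then threshold.
                M1 = (A2 @ x2)[:, None] * A1
                x1 = hard_threshold(np.linalg.lstsq(M1, b, rcond=None)[0], k)
                # Fix x1: rows of A2 are scaled by (A1 @ x1); solve for x2, then threshold.
                M2 = (A1 @ x1)[:, None] * A2
                x2 = hard_threshold(np.linalg.lstsq(M2, b, rcond=None)[0], k)
            r = b - (A1 @ x1) * (A2 @ x2)
            return x1, x2, 0.5 * float(r @ r)

    Different random starts typically end in different local minimizers, which illustrates the difficulty the papers address; the bi-criteria trade-off between processing time and quality roughly corresponds to varying the cardinality bound k.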

  • 5.
    Burdakov, Oleg
    et al.
    Linköping University, Department of Mathematics, Optimization. Linköping University, The Institute of Technology.
    Gong, Lujin
    Samsung Advanced Institute of Technology, China Lab, Beijing, China.
    Yuan, Ya-Xiang
    State Key Laboratory of Scientific and Engineering Computing, Institute of Computational Mathematics and Scientific/Engineering Computing, AMSS, CAS, Beijing, China.
    Zikrin, Spartak
    Linköping University, Department of Mathematics, Optimization. Linköping University, The Institute of Technology.
    On Efficiently Combining Limited Memory and Trust-Region Techniques (2013). Report (Other academic)
    Abstract [en]

    Limited memory quasi-Newton methods and trust-region methods represent two efficient approaches used for solving unconstrained optimization problems. A straightforward combination of them deteriorates the efficiency of the former approach, especially in the case of large-scale problems. For this reason, the limited memory methods are usually combined with a line search. We show how to efficiently combine limited memory and trust-region techniques. One of our approaches is based on the eigenvalue decomposition of the limited memory quasi-Newton approximation of the Hessian matrix. The decomposition allows for finding a nearly-exact solution to the trust-region subproblem defined by the Euclidean norm with an insignificant computational overhead compared with the cost of computing the quasi-Newton direction in line-search limited memory methods. The other approach is based on two new eigenvalue-based norms. The advantage of the new norms is that the trust-region subproblem is separable and each of the smaller subproblems is easy to solve. We show that our eigenvalue-based limited-memory trust-region methods are globally convergent. Moreover, we propose improved versions of the existing limited-memory trust-region algorithms. The presented results of numerical experiments demonstrate the efficiency of our approach which is competitive with line-search versions of the L-BFGS method.

  • 6.
    Burdakov, Oleg
    et al.
    Linköping University, Department of Mathematics, Optimization. Linköping University, Faculty of Science & Engineering.
    Gong, Lujin
    Tencent, Beijing, China.
    Zikrin, Spartak
    Linköping University, Department of Mathematics, Optimization. Linköping University, Faculty of Science & Engineering.
    Yuan, Ya-xiang
    State Key Laboratory of Scientific and Engineering Computing, Institute of Computational Mathematics and Scientific/Engineering Computing, AMSS, CAS, Beijing, China.
    On Efficiently Combining Limited-Memory and Trust-Region Techniques (2017). In: Mathematical Programming Computation, ISSN 1867-2949, E-ISSN 1867-2957, Vol. 9, no 1, p. 101-134. Article in journal (Refereed)
    Abstract [en]

    Limited-memory quasi-Newton methods and trust-region methods represent two efficient approaches used for solving unconstrained optimization problems. A straightforward combination of them deteriorates the efficiency of the former approach, especially in the case of large-scale problems. For this reason, the limited-memory methods are usually combined with a line search. We show how to efficiently combine limited-memory and trust-region techniques. One of our approaches is based on the eigenvalue decomposition of the limited-memory quasi-Newton approximation of the Hessian matrix. The decomposition allows for finding a nearly-exact solution to the trust-region subproblem defined by the Euclidean norm with an insignificant computational overhead as compared with the cost of computing the quasi-Newton direction in line-search limited-memory methods. The other approach is based on two new eigenvalue-based norms. The advantage of the new norms is that the trust-region subproblem is separable and each of the smaller subproblems is easy to solve. We show that our eigenvalue-based limited-memory trust-region methods are globally convergent. Moreover, we propose improved versions of the existing limited-memory trust-region algorithms. The presented results of numerical experiments demonstrate the efficiency of our approach which is competitive with line-search versions of the L-BFGS method.
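    As background for the two entries above: once an eigenvalue decomposition B = V diag(lambda) V^T of the quasi-Newton Hessian approximation is available, the Euclidean-norm trust-region subproblem decouples in the eigenbasis and reduces to a one-dimensional search for a shift sigma. The sketch below illustrates only that reduction; it is not the authors' algorithm, it ignores the so-called hard case, and it uses a dense V, whereas the papers obtain the eigenvalue information from the limited-memory representation with insignificant overhead.

        # Sketch: with B = V @ diag(lam) @ V.T, the trust-region subproblem
        #     min_s  g.T @ s + 0.5 * s.T @ B @ s   s.t.  ||s||_2 <= delta
        # decouples in the eigenbasis; the shift sigma is found by a 1-D search.
        # Illustrative only (hard case ignored); NOT the authors' algorithm.
        import numpy as np


        def trust_region_step(g, lam, V, delta, tol=1e-10):
            gt = V.T @ g                     # gradient in the eigenbasis

            def step(sigma):
                return -gt / (lam + sigma)   # minimizer of the shifted quadratic model

            # Interior solution: B positive definite and the unconstrained step fits.
            if lam.min() > 0 and np.linalg.norm(step(0.0)) <= delta:
                return V @ step(0.0)

            # Boundary solution: find sigma > max(0, -lam_min) with ||step(sigma)|| ~= delta.
            lo = max(0.0, -lam.min()) + tol
            hi = lo + 1.0
            while np.linalg.norm(step(hi)) > delta:   # bracket the root
                hi *= 2.0
            while hi - lo > tol:                      # bisection on the step norm
                mid = 0.5 * (lo + hi)
                if np.linalg.norm(step(mid)) > delta:
                    lo = mid
                else:
                    hi = mid
            return V @ step(hi)

    The eigenvalue-based norms mentioned in the abstracts go further: with those norms the trust-region subproblem becomes separable and each smaller subproblem is easy to solve, as stated above; that construction is not reproduced here.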

  • 7.
    Zikrin, Spartak
    Linköping University, Department of Mathematics, Optimization. Linköping University, The Institute of Technology.
    Large-Scale Optimization Methods with Application to Design of Filter Networks (2014). Doctoral thesis, comprehensive summary (Other academic)
    Abstract [en]

    Nowadays, large-scale optimization problems are among the most challenging. Any progress in developing methods for large-scale optimization results in solving important applied problems more effectively. Limited memory methods and trust-region methods represent two efficient approaches used for solving unconstrained optimization problems. A straightforward combination of them deteriorates the efficiency of the former approach, especially in the case of large-scale problems. For this reason, the limited memory methods are usually combined with a line search. We develop new limited memory trust-region algorithms for large-scale unconstrained optimization. They are competitive with the traditional limited memory line-search algorithms.

    In this thesis, we consider applied optimization problems originating from the design of filter networks. Filter networks represent an efficient tool in medical image processing. The approach is based on replacing a set of dense multidimensional filters by a network of smaller sparse filters called sub-filters. This allows for improving image processing time, while maintaining image quality and the robustness of image processing.

    Design of filter networks is a nontrivial procedure that involves three steps: 1) choosing the network structure, 2) choosing the sparsity pattern of each sub-filter and 3) optimizing the nonzero coefficient values. So far, steps 1 and 2 have mainly been based on the individual expertise of network designers and their intuition. Given a sparsity pattern, the choice of the coefficients at step 3 is related to solving a weighted nonlinear least-squares problem. Even in the case of sequentially connected filters, the resulting problem is of a multilinear least-squares (MLLS) type, which is a non-convex large-scale optimization problem. This is a very difficult global optimization problem that may have a large number of local minima, and each of them is singular and non-isolated. It is characterized by a large number of decision variables, especially for 3D and 4D filters.

    We develop an effective global optimization approach to solving the MLLS problem that significantly reduces the computational time. Furthermore, we develop efficient methods for optimizing the sparsity of individual sub-filters in filter networks of a more general structure. This approach offers practitioners a means of finding a proper trade-off between the image processing quality and time. It also allows for improving the network structure, which automates some stages of designing filter networks.
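    To connect step 3 to the MLLS problem discussed in the entries above, consider the simplest case of L sequentially connected sub-filters with coefficient vectors f^1, ..., f^L: the composite filter is their convolution, so a weighted least-squares fit of a desired response d is multilinear in the sub-filter coefficients. The notation below is assumed for illustration and is not taken from the thesis:

        % Sequentially connected sub-filters; * denotes convolution, W is a weighting
        % matrix, and card(.) counts nonzero coefficients (sparsity optimization).
        \min_{f^1,\dots,f^L} \; \bigl\| d - f^1 * f^2 * \cdots * f^L \bigr\|_W^2,
        \qquad \|r\|_W^2 = r^{\mathsf{T}} W r,
        \qquad \operatorname{card}(f^i) \le k_i, \quad i = 1,\dots,L.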

    List of papers
    1. On Efficiently Combining Limited Memory and Trust-Region Techniques
    2013 (English). Report (Other academic)
    Abstract [en]

    Limited memory quasi-Newton methods and trust-region methods represent two efficient approaches used for solving unconstrained optimization problems. A straightforward combination of them deteriorates the efficiency of the former approach, especially in the case of large-scale problems. For this reason, the limited memory methods are usually combined with a line search. We show how to efficiently combine limited memory and trust-region techniques. One of our approaches is based on the eigenvalue decomposition of the limited memory quasi-Newton approximation of the Hessian matrix. The decomposition allows for finding a nearly-exact solution to the trust-region subproblem defined by the Euclidean norm with an insignificant computational overhead compared with the cost of computing the quasi-Newton direction in line-search limited memory methods. The other approach is based on two new eigenvalue-based norms. The advantage of the new norms is that the trust-region subproblem is separable and each of the smaller subproblems is easy to solve. We show that our eigenvalue-based limited-memory trust-region methods are globally convergent. Moreover, we propose improved versions of the existing limited-memory trust-region algorithms. The presented results of numerical experiments demonstrate the efficiency of our approach which is competitive with line-search versions of the L-BFGS method.

    Place, publisher, year, edition, pages
    Linköping: Linköping University Electronic Press, 2013. p. 33
    Series
    LiTH-MAT-R, ISSN 0348-2960 ; 2013:13
    Keywords
    Unconstrained Optimization; Large-scale Problems; Limited Memory Methods;
    National Category
    Computational Mathematics
    Identifiers
    urn:nbn:se:liu:diva-102005 (URN). LiTH-MAT-R--2013/13--SE (ISRN)
    Available from: 2013-11-26. Created: 2013-11-26. Last updated: 2016-01-12. Bibliographically approved
    2. Global search strategies for solving multilinear least-squares problems
    2012 (English). In: Sultan Qaboos University Journal for Science, ISSN 1027-524X, Vol. 17, no 1, p. 12-21. Article in journal (Refereed). Published
    Abstract [en]

    The multilinear least-squares (MLLS) problem is an extension of the linear least-squares problem. The difference is that a multilinear operator is used in place of a matrix-vector product. The MLLS is typically a large-scale problem characterized by a large number of local minimizers. It originates, for instance, from the design of filter networks. We present a global search strategy that allows for moving from one local minimizer to a better one. The efficiency of this strategy is illustrated by results of numerical experiments performed for some problems related to the design of filter networks.

    Place, publisher, year, edition, pages
    Sultan Qaboos University, 2012
    Keywords
    Global optimization; Global search strategies; Multilinear least-squares; Filter
    National Category
    Computational Mathematics; Medical Image Processing
    Identifiers
    urn:nbn:se:liu:diva-78918 (URN)
    Available from: 2012-08-28. Created: 2012-06-25. Last updated: 2015-09-03. Bibliographically approved
    3. Sparsity Optimization in Design of Multidimensional Filter Networks
    2013 (English). Report (Other academic)
    Abstract [en]

    Filter networks are a powerful tool for reducing the image processing time while maintaining reasonably high image quality. They are composed of sparse sub-filters whose low sparsity level ensures fast image processing. The filter network design is related to solving a sparse optimization problem where a cardinality constraint bounds above the sparsity level. In the case of sequentially connected sub-filters, which is the simplest network structure of those considered in this paper, a cardinality-constrained multilinear least-squares (MLLS) problem is to be solved. Even when the cardinality constraint is disregarded, the MLLS is typically a large-scale problem characterized by a large number of local minimizers. Each of the local minimizers is singular and non-isolated. The cardinality constraint makes the problem even more difficult to solve. An approach for approximately solving the cardinality-constrained MLLS problem is presented. It is then applied to solving a bi-criteria optimization problem in which both the time and quality of image processing are optimized. The developed approach is extended to designing filter networks of a more general structure. Its efficiency is demonstrated by designing certain 2D and 3D filter networks. It is also compared with the existing approaches.

    Place, publisher, year, edition, pages
    Linköping: Linköping University Electronic Press, 2013. p. 21
    Series
    LiTH-MAT-R, ISSN 0348-2960 ; 2013:16
    Keywords
    Sparse optimization; Cardinality Constraint; Multicriteria Optimization; Multilinear Least-Squares Problem; Filter networks; Medical imaging
    National Category
    Computational Mathematics; Medical Image Processing; Signal Processing
    Identifiers
    urn:nbn:se:liu:diva-103915 (URN). LiTH-MAT-R-2013/16-SE (ISRN)
    Available from: 2014-02-03. Created: 2014-02-03. Last updated: 2016-11-24. Bibliographically approved