Algorithms for Costly Global Optimization
School of Education, Culture and Communication, Mälardalen University, Västerås, Sweden. ORCID iD: 0000-0002-9881-4170
2009 (English) Licentiate thesis, comprehensive summary (Other academic)
Abstract [en]

There exist many applications with so-called costly problems, which means that the objective function to be maximized or minimized cannot be described using standard functions and expressions. Instead, these objective functions are treated as a "black box" into which parameter values are sent and from which a function value is returned. In particular, this implies that no derivative information is available.

The reason for describing these problems as expensive is that it may take a long time to calculate a single function value. The black box could, for example, solve a large system of differential equations or carry out a heavy simulation, which can take anywhere from several minutes to several hours.

These very special conditions therefore require customized algorithms. Common optimization algorithms rely on calculating function values every now and then, which can usually be done instantly. But with an expensive problem, it may take several hours to compute a single function value. Our main objective is therefore to create algorithms that exploit all available information to the limit before a new function value is calculated. In other words, we want to find the optimal solution using as few function evaluations as possible.

A good example of a real-life application comes from the automotive industry, where the development of new engines utilizes advanced models governed by a dozen key parameters. The goal is to optimize the model by changing the parameters in such a way that the engine becomes as energy efficient as possible while still meeting all sorts of demands on strength and external constraints.
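The surrogate-model strategy sketched in this abstract, evaluating a cheap approximating model exhaustively between each costly evaluation, can be illustrated as follows. This is a generic sketch, not one of the thesis's actual algorithms; `costly_f` is a cheap stand-in for what would in practice be an hours-long simulation, and pure surrogate exploitation is used for brevity (real solvers also balance in exploration).

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Stand-in for a costly black-box objective (in a real application
# each call might run a simulation taking minutes to hours).
def costly_f(x):
    return np.sum((x - 0.3) ** 2, axis=-1)

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(8, 2))       # initial experimental design
y = costly_f(X)                              # the only expensive calls

for _ in range(10):                          # budget: 10 extra evaluations
    surrogate = RBFInterpolator(X, y)        # cheap model of the costly function
    cand = rng.uniform(0.0, 1.0, size=(2000, 2))
    x_new = cand[np.argmin(surrogate(cand))] # search the surrogate, not f
    X = np.vstack([X, x_new])
    y = np.append(y, costly_f(x_new[None, :]))

print(X[np.argmin(y)], y.min())              # best point found near (0.3, 0.3)
```

Note that all 2000 candidate evaluations per iteration hit only the surrogate; the expensive function is called exactly once per iteration.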

Place, publisher, year, edition, pages
Västerås: Mälardalen University Press, 2009. 107 p.
Series
Mälardalen University Press Licentiate Thesis, ISSN 1651-9256 ; 105
National Category
Computational Mathematics
Identifiers
URN: urn:nbn:se:liu:diva-114511
ISBN: 978-91-86135-29-4 (print)
OAI: oai:DiVA.org:liu-114511
DiVA: diva2:790556
Presentation
2009-09-03, Gamma, Hus U, Högskoleplan 1, Mälardalens högskola, Västerås, 13:15 (English)
Opponent
Supervisors
Available from: 2015-02-26 Created: 2015-02-25 Last updated: 2015-02-26 Bibliographically approved
List of papers
1. An adaptive radial basis algorithm (ARBF) for expensive black-box mixed-integer constrained global optimization
2008 (English) In: Optimization and Engineering, ISSN 1389-4420, E-ISSN 1573-2924, Vol. 9, no. 3, pp. 311–339. Article in journal (Refereed) Published
Abstract [en]

Response surface methods based on kriging and radial basis function (RBF) interpolation have been successfully applied to solve expensive, i.e. computationally costly, global black-box nonconvex optimization problems. In this paper we describe extensions of these methods to handle linear, nonlinear, and integer constraints. In particular, algorithms for standard RBF and the new adaptive RBF (ARBF) are described. Note, however, that while the objective function may be expensive, we assume that any nonlinear constraints are either inexpensive or are incorporated into the objective function via penalty terms. Test results are presented on standard test problems, both nonconvex problems with linear and nonlinear constraints, and mixed-integer nonlinear problems (MINLP). Solvers in the TOMLAB Optimization Environment (http://tomopt.com/tomlab/) have been compared, specifically the three deterministic derivative-free solvers rbfSolve, ARBFMIP and EGO with three derivative-based mixed-integer nonlinear solvers, OQNLP, MINLPBB and MISQP, as well as the GENO solver implementing a stochastic genetic algorithm. Results show that the deterministic derivative-free methods compare well with the derivative-based ones, but the stochastic genetic algorithm solver is several orders of magnitude too slow for practical use. When the objective function for the test problems is costly to evaluate, the performance of the ARBF algorithm proves to be superior.
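The penalty-term treatment of inexpensive nonlinear constraints mentioned in the abstract can be sketched as follows. The functions `f` and `g` and the penalty weight `mu` are illustrative choices, not taken from the paper; the point is only that a constraint violation is folded into the (costly) objective as an extra cost.

```python
import numpy as np

def f(x):
    # Stand-in for the costly objective function.
    return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

def g(x):
    # Inexpensive nonlinear constraint, feasible when g(x) <= 0.
    return x[0] + x[1] - 2.0

def penalized(x, mu=100.0):
    # Quadratic penalty: feasible points are unchanged, violations cost mu*g^2.
    return f(x) + mu * max(0.0, g(x)) ** 2

print(penalized(np.array([1.0, 2.0])))  # infeasible: f = 0 plus penalty 100.0
```

A surrogate-based solver can then be run on `penalized` as an unconstrained problem, at the price of tuning `mu`.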

Place, publisher, year, edition, pages
Springer, 2008
Keyword
Global optimization, radial basis functions, response surface model, surrogate model, expensive function, CPU-intensive, optimization software, splines, mixed-integer nonlinear programming, nonconvex, derivative-free, black-box, linear constraints, nonlinear constraints
National Category
Computational Mathematics
Identifiers
urn:nbn:se:liu:diva-77076 (URN)
10.1007/s11081-008-9037-3 (DOI)
000259577400002 ()
Available from: 2012-05-04 Created: 2012-05-04 Last updated: 2017-12-07 Bibliographically approved
2. The Influence of Experimental Designs on the performance of surrogate model based costly global optimization solvers
2009 (English) In: Studies in Informatics and Control, ISSN 1220-1766, E-ISSN 1841-429X, Vol. 18, no. 1, pp. 87–95. Article in journal (Refereed) Published
Abstract [en]

When dealing with costly objective functions in optimization, one good alternative is to use a surrogate model approach. A common feature of all such methods is the need for an initial set of points, or "experimental design", in order to start the algorithm. Since the behavior of the algorithms often depends heavily on this set, the question is how to choose a good experimental design. We investigate this by solving a number of problems using different designs, and compare the outcomes with respect to function evaluations and a root mean square error test of the true function versus the surrogate model produced. Each combination of problem and design is solved by three different solvers available in the TOMLAB optimization environment. Results indicate that two of the designs are superior.

Place, publisher, year, edition, pages
National Institute for R&D in Informatics (ICI), 2009
Keyword
Black-box, Surrogate model, Costly functions, Latin Hypercube Designs, Experimental Design
National Category
Computational Mathematics
Identifiers
urn:nbn:se:liu:diva-77077 (URN)
000269029600010 ()
Available from: 2012-05-04 Created: 2012-05-04 Last updated: 2017-12-07 Bibliographically approved
3. Implementation of a One-Stage Efficient Global Optimization (EGO) Algorithm
2009 (English) Report (Other academic)
Abstract [en]

Almost every Costly Global Optimization (CGO) solver utilizes a surrogate model, or response surface, to approximate the true (costly) function. The EGO algorithm introduced by Jones et al. utilizes the DACE framework to build an approximating surrogate model. By optimizing a less costly utility function, the algorithm determines a new point where the original objective function is evaluated. This is repeated until some convergence criterion is fulfilled. The original EGO algorithm finds the new point to sample in a two-stage process. In its first stage, the estimates of the interpolation parameters are optimized with respect to the already sampled points. In the second stage, these estimated values are considered true in order to optimize the location of the new point. Treating the estimated values as correct introduces a source of error. Instead, in the one-stage EGO algorithm, both the parameters and the location of a new point are optimized at the same time, removing this source of error. The new subproblem becomes more difficult, but eliminates the need to solve two subproblems. Difficulties in implementing a fast and robust one-stage EGO algorithm in TOMLAB are discussed, especially the solution of the new subproblem.
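The "less costly utility function" optimized in EGO-type algorithms is typically the expected improvement. The sketch below shows the standard textbook formula under the minimization convention, not the report's exact implementation: given the surrogate's predictive mean and standard deviation at a candidate point, it measures how much improvement over the best sampled value can be expected.

```python
from scipy.stats import norm

def expected_improvement(mu, sigma, f_best):
    """Expected improvement at a point where the surrogate predicts mean mu
    and standard deviation sigma, given the best sampled value f_best
    (minimization convention)."""
    if sigma <= 0.0:
        # No predictive uncertainty: the improvement is deterministic.
        return max(f_best - mu, 0.0)
    z = (f_best - mu) / sigma
    # Exploitation term (predicted gain) plus exploration term (uncertainty).
    return (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

# A point predicted well below f_best with small uncertainty scores
# close to its plain predicted improvement of 0.5.
print(expected_improvement(mu=0.5, sigma=0.2, f_best=1.0))
```

Because the formula rewards both low predicted values and high uncertainty, maximizing it trades off exploitation against exploration without ever calling the costly function.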

Publisher
26 p.
Series
Research Report 2009, School of Education, Culture and Communication, Division of Applied Mathematics, Mälardalen University, ISSN 1404-4978 ; 2009:2
Keyword
Global Optimization, Costly, EGO
National Category
Computational Mathematics
Identifiers
urn:nbn:se:liu:diva-77075 (URN)
Available from: 2012-05-04 Created: 2012-05-04 Last updated: 2015-02-26 Bibliographically approved

Open Access in DiVA

No full text

Authority records

Quttineh, Nils-Hassan
