Speeding Up Logic-Based Benders Decomposition by Strengthening Cuts with Graph Neural Networks
Institute of Logic and Computation, TU Wien.
Linköping University, Department of Mathematics, Applied Mathematics. Linköping University, Faculty of Science & Engineering. Saab AB, Linköping, Sweden. ORCID iD: 0000-0002-9498-1924
Institute of Logic and Computation, TU Wien.
Linköping University, Department of Mathematics, Applied Mathematics. Linköping University, Faculty of Science & Engineering. ORCID iD: 0000-0002-2081-2888
2024 (English). In: Machine Learning, Optimization, and Data Science / [ed] Giuseppe Nicosia, Varun Ojha, Emanuele La Malfa, Gabriele La Malfa, Panos M. Pardalos, Renato Umeton, Cham: Springer, 2024, p. 24-38. Conference paper, Published paper (Refereed)
Abstract [en]

Logic-based Benders decomposition is a technique for solving optimization problems to optimality. It works by splitting the problem into a master problem, which neglects some aspects of the problem, and a subproblem, which is used to iteratively produce cuts for the master problem to account for those aspects. It is critical for computational performance that these cuts are strengthened, but strengthening cuts comes at the cost of solving additional subproblems. In this work, we apply a graph neural network in an autoregressive fashion to approximate the compilation of an irreducible cut, which then requires only a few postprocessing steps to ensure its validity. We test the approach on a job scheduling problem with a single machine and multiple time windows per job and compare it to approaches from the literature. The results show that our approach considerably reduces the number of subproblems that need to be solved and hence the total computational effort.
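
The abstract refers to strengthening Benders cuts towards irreducible ones, where each strengthening attempt costs an extra subproblem solve. As a rough illustration only, the Python sketch below shows a standard greedy deletion-style strengthening loop; the toy capacity "subproblem", the job data, and the hand-written keep_score function (standing in for what a trained graph neural network would predict) are all hypothetical assumptions for this sketch and are not taken from the paper.

# Minimal sketch of cut strengthening as used in logic-based Benders decomposition.
# The toy subproblem (jobs that jointly exceed a machine capacity) and the scoring
# function standing in for a trained graph neural network are illustrative only.
from typing import Callable, Dict, Set


def strengthen_cut(cut: Set[str],
                   is_infeasible: Callable[[Set[str]], bool],
                   keep_score: Callable[[str], float]) -> Set[str]:
    """Greedily shrink an infeasible set of master decisions towards an
    irreducible cut. Elements with the lowest keep_score are tried for removal
    first; every attempted removal costs one extra subproblem solve, which is
    the cost a learned model predicting keep_score would aim to reduce."""
    current = set(cut)
    for element in sorted(cut, key=keep_score):
        trial = current - {element}
        if trial and is_infeasible(trial):  # removal keeps the cut valid
            current = trial
    return current


if __name__ == "__main__":
    durations: Dict[str, int] = {"j1": 4, "j2": 3, "j3": 6, "j4": 2}  # hypothetical data
    capacity = 8

    def is_infeasible(jobs: Set[str]) -> bool:
        # Toy "subproblem": the selected jobs cannot all fit on the machine.
        return sum(durations[j] for j in jobs) > capacity

    full_cut = set(durations)  # all four jobs together are infeasible
    irreducible = strengthen_cut(full_cut, is_infeasible,
                                 keep_score=lambda j: durations[j])
    print("strengthened cut:", sorted(irreducible))  # a smaller infeasible subset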

Place, publisher, year, edition, pages
Cham: Springer, 2024. p. 24-38
Series
Lecture Notes in Computer Science, ISSN 0302-9743, E-ISSN 1611-3349 ; 14505
Keywords [en]
Logic-based Benders Decomposition; Cut Strengthening; Graph Neural Networks; Job Scheduling
National Category
Computational Mathematics
Identifiers
URN: urn:nbn:se:liu:diva-200891
DOI: 10.1007/978-3-031-53969-5_3
ISI: 001217088300003
ISBN: 9783031539688 (print)
ISBN: 9783031539695 (electronic)
OAI: oai:DiVA.org:liu-200891
DiVA, id: diva2:1838192
Conference
9th International Conference, LOD 2023, Grasmere, UK, September 22–26, 2023
Note

Funding agencies: Honda Research Institute Europe

Available from: 2024-02-15. Created: 2024-02-15. Last updated: 2024-12-20. Bibliographically approved.

Open Access in DiVA

The full text will be freely available from 2025-02-16 14:08

Other links

Publisher's full text

Authority records

Karlsson, Emil; Rönnberg, Elina; Lindsten, Fredrik
