Adaptive Off-Line Tuning for Optimized Composition of Components for Heterogeneous Many-Core Systems
Li, Lu. Linköping University, Department of Computer and Information Science, Software and Systems. Linköping University, The Institute of Technology. (PELAB)
Dastgeer, Usman. Linköping University, Department of Computer and Information Science, Software and Systems. Linköping University, The Institute of Technology. (PELAB)
Kessler, Christoph. Linköping University, Department of Computer and Information Science, Software and Systems. Linköping University, The Institute of Technology. (PELAB) ORCID iD: 0000-0001-5241-0026
2013 (English). In: High Performance Computing for Computational Science - VECPAR 2012, Springer, 2013, pp. 329-345. Conference paper, published paper (refereed).
Abstract [en]

In recent years, heterogeneous multi-core systems have received much attention. However, performance optimization on these platforms remains a major challenge. Optimizations performed by compilers are often limited by the lack of dynamic information about the run-time environment, which frequently makes applications not performance portable. One current approach is to provide multiple implementations of the same interface that can be used interchangeably depending on the call context, and to expose the composition choices to a compiler, a deployment-time composition tool, and/or a run-time system. Off-line machine-learning techniques make it possible to improve the precision and reduce the run-time overhead of run-time composition, and thereby to improve performance portability. In this work we extend the run-time composition mechanism in the PEPPHER composition tool with off-line composition, and we present an adaptive machine-learning algorithm for generating compact and efficient dispatch data structures with low training time. As the dispatch data structure we propose an adaptive decision tree, together with an adaptive training algorithm that allows controlling the trade-off between training time, dispatch precision and run-time dispatch overhead.

We have evaluated our optimization strategy on simple kernels (matrix multiplication and sorting) as well as on applications from the Rodinia benchmark suite, on two GPU-based heterogeneous systems. On average, the precision of the composition choices reaches 83.6 percent, with approximately 34 minutes of off-line training time.
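The dispatch idea described in the abstract can be illustrated with a small, hypothetical sketch: off-line training runs determine problem-size thresholds at which the fastest implementation variant changes, and the run-time dispatcher then reduces to a cheap lookup. The variant names, the training data, and the halfway-split rule below are invented for illustration and do not reflect the PEPPHER composition tool's actual interfaces or training algorithm.

```python
import bisect

# Hypothetical off-line measurements: (problem size, fastest variant),
# e.g. obtained by timing each variant at sampled sizes during training.
training = [
    (64,   "cpu_seq"),     # small problems: sequential CPU wins
    (256,  "cpu_openmp"),  # medium problems: multicore CPU
    (4096, "cuda_gpu"),    # large problems: GPU pays off despite transfer cost
]

# "Train": derive size thresholds wherever the best variant changes.
thresholds, variants = [], [training[0][1]]
for (n_prev, v_prev), (n_next, v_next) in zip(training, training[1:]):
    if v_next != v_prev:
        thresholds.append((n_prev + n_next) // 2)  # split halfway (one simple rule)
        variants.append(v_next)

def dispatch(n: int) -> str:
    """Run-time lookup: O(log k) binary search over k learned thresholds."""
    return variants[bisect.bisect_right(thresholds, n)]
```

A real implementation would measure execution times per variant, build a multi-dimensional decision tree over the full call context (operand sizes, data placement, and so on), and, as the abstract suggests, adaptively refine only those regions of the context space where the best variant is still uncertain, to keep training time low.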

Place, publisher, year, edition, pages
Springer, 2013, pp. 329-345.
Series
Lecture Notes in Computer Science, ISSN 0302-9743 (print), 1611-3349 (online) ; 7851
Keyword [en]
parallel programming, parallel computing, automated performance tuning, machine learning, adaptive sampling, GPU, multicore processor, software composition, program optimization, autotuning
National Category
Computer Science
Identifiers
URN: urn:nbn:se:liu:diva-93471
DOI: 10.1007/978-3-642-38718-0_32
ISI: 000342997100032
ISBN: 978-3-642-38717-3 (print)
ISBN: 978-3-642-38718-0 (print)
OAI: oai:DiVA.org:liu-93471
DiVA: diva2:625081
Conference
10th International Conference on High Performance Computing for Computational Science, VECPAR 2012; Kobe; Japan
Projects
EU FP7 PEPPHER (2010-2012), #248481, www.peppher.eu
SeRC - OpCoReS
Funder
EU, FP7, Seventh Framework Programme, 248481
Swedish e-Science Research Center, OpCoReS
Available from: 2013-06-04 Created: 2013-06-04 Last updated: 2017-02-27

Open Access in DiVA

No full text

Other links

Publisher's full text: SpringerLink

Authority records BETA

Li, Lu; Dastgeer, Usman; Kessler, Christoph
