2024 (English) In: Proceedings of the 41st International Conference on Machine Learning, Vienna, Austria, PMLR 235, 2024 / [ed] Neil Lawrence, PMLR, 2024, Vol. 235, p. 52903-52914. Conference paper, Published paper (Refereed)
Abstract [en]
This paper addresses the training of Neural Ordinary Differential Equations (neural ODEs), and in particular explores the interplay between numerical integration techniques, stability regions, step size, and initialization techniques. It is shown how the choice of integration technique implicitly regularizes the learned model, and how the solver's corresponding stability region affects training and prediction performance. From this analysis, a stability-informed parameter initialization technique is introduced. The effectiveness of the initialization method is demonstrated across several learning benchmarks and industrial applications.
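A minimal sketch of the setting the abstract describes, assuming a PyTorch-style formulation: a small MLP defines the vector field of a neural ODE, and a hand-written fixed-step solver (explicit Euler or classical RK4) unrolls the dynamics during training, so the solver choice and step size determine the stability region the learned dynamics must respect. All names and hyperparameters below are illustrative assumptions, not the paper's implementation, and the stability-informed initialization itself is not reproduced here since the abstract does not specify it.

# Minimal neural-ODE sketch (illustrative only, not the authors' code).
import torch
import torch.nn as nn

class ODEFunc(nn.Module):
    """Learnable vector field f_theta(t, x) given by a small MLP."""
    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, dim))

    def forward(self, t: float, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)  # autonomous dynamics; t kept for generality

def integrate(f: nn.Module, x0: torch.Tensor, t0: float, t1: float,
              h: float, method: str = "rk4") -> torch.Tensor:
    """Fixed-step integration of dx/dt = f(t, x) from t0 to t1 with step h."""
    x, t = x0, t0
    while t < t1 - 1e-12:
        h_eff = min(h, t1 - t)
        if method == "euler":      # explicit Euler: small stability region
            x = x + h_eff * f(t, x)
        elif method == "rk4":      # classical RK4: larger stability region
            k1 = f(t, x)
            k2 = f(t + h_eff / 2, x + h_eff / 2 * k1)
            k3 = f(t + h_eff / 2, x + h_eff / 2 * k2)
            k4 = f(t + h_eff, x + h_eff * k3)
            x = x + h_eff / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        else:
            raise ValueError(method)
        t += h_eff
    return x

# Training loop sketch on dummy data: gradients flow through the unrolled
# solver steps, so the solver and step size implicitly regularize the dynamics.
f = ODEFunc(dim=2)
opt = torch.optim.Adam(f.parameters(), lr=1e-3)
x0 = torch.randn(32, 2)        # batch of initial states (dummy data)
x_target = torch.randn(32, 2)  # dummy targets for illustration
for _ in range(10):
    pred = integrate(f, x0, t0=0.0, t1=1.0, h=0.1, method="rk4")
    loss = ((pred - x_target) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()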
Place, publisher, year, edition, pages
PMLR, 2024
Series
Proceedings of Machine Learning Research, ISSN 2640-3498
National Category
Computational Mathematics
Identifiers
urn:nbn:se:liu:diva-210226 (URN)
Conference
International Conference on Machine Learning, 21-27 July 2024, Vienna, Austria
Note
Funding: This research was supported by the Strategic Research Area at Linköping-Lund in Information Technology (ELLIIT) and the Wallenberg AI, Autonomous Systems and Software Program (WASP) funded by the Knut and Alice Wallenberg Foundation. Computations were enabled by the Berzelius resource provided by the Knut and Alice Wallenberg Foundation at the National Supercomputer Centre. The authors would like to thank the reviewers for their insightful comments and suggestions, which have significantly improved the manuscript.
2024-12-03 Bibliographically approved