2023 (English). In: Thirty-Ninth Conference on Uncertainty in Artificial Intelligence (PMLR vol. 216), JMLR (Journal of Machine Learning Research), 2023, pp. 691-700. Conference paper, published paper (refereed).
Abstract [en]
We introduce the Kernel Calibration Conditional Stein Discrepancy test (KCCSD test), a non-parametric, kernel-based test for assessing the calibration of probabilistic models with well-defined scores. In contrast to previous methods, our test avoids the need for possibly expensive expectation approximations while providing control over its type-I error. We achieve these improvements by using a new family of kernels for score-based probabilities that can be estimated without samples from the probability density, and by using a conditional goodness-of-fit criterion for the KCCSD test's U-statistic. The tractability of the KCCSD test extends the reach of calibration measures to promising new use cases, such as regularization during model training. We demonstrate the properties of our test on various synthetic settings.
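To illustrate the kind of quantity the abstract refers to, the following is a minimal sketch (not the paper's implementation) of a U-statistic estimate of a kernel Stein discrepancy for a score-based model in one dimension, assuming a Gaussian RBF kernel and a hand-picked bandwidth. The KCCSD test itself uses a conditional goodness-of-fit criterion and a specialized kernel family; this sketch only shows the unconditional building block.

```python
import numpy as np

def ksd_u_statistic(x, score, h=1.0):
    """U-statistic estimate of the kernel Stein discrepancy (1-D, RBF kernel).

    x:     samples, shape (n,)
    score: score function s(x) = d/dx log p(x) of the candidate model
    h:     RBF bandwidth (illustrative fixed choice, not tuned)
    """
    n = len(x)
    d = x[:, None] - x[None, :]               # pairwise differences x_i - x_j
    k = np.exp(-d**2 / (2 * h**2))            # RBF kernel matrix
    dkx = -d / h**2 * k                       # dk/dx_i
    dky = d / h**2 * k                        # dk/dx_j
    dkxy = (1.0 / h**2 - d**2 / h**4) * k     # d^2 k / dx_i dx_j
    s = score(x)
    # Stein kernel: u_p(x,y) = s(x)s(y)k + s(x) dk/dy + s(y) dk/dx + d^2k/dxdy
    up = s[:, None] * s[None, :] * k + s[:, None] * dky + s[None, :] * dkx + dkxy
    np.fill_diagonal(up, 0.0)                 # U-statistic: drop diagonal terms
    return up.sum() / (n * (n - 1))

# Model: standard normal, so s(x) = -x. Samples from the model give a value
# near zero; mis-specified samples give a clearly positive discrepancy.
rng = np.random.default_rng(0)
good = ksd_u_statistic(rng.normal(0, 1, 500), lambda x: -x)
bad = ksd_u_statistic(rng.normal(2, 1, 500), lambda x: -x)
```

Note that, as the abstract emphasizes, this statistic needs only the model's score function, never normalized densities or samples from the model, which is what makes score-based kernels attractive for calibration testing.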
Place, publisher, year, edition, pages
JMLR (Journal of Machine Learning Research), 2023
National Category
Probability Theory and Statistics; Computer Sciences
Identifiers
urn:nbn:se:liu:diva-204029 (URN)
001222701100065 (ISI)
Conference
39th Conference on Uncertainty in Artificial Intelligence (UAI), Pittsburgh, PA, July 31 to August 4, 2023.
Note
Funding Agencies|Centre for Interdisciplinary Mathematics (CIM) at Uppsala University, Sweden; Swedish Research Council [621-2016-06079]; Kjell och Marta Beijer Foundation; Gatsby Charitable Foundation
Available from: 2024-06-01. Created: 2024-06-01. Last updated: 2024-09-06. Bibliographically approved.