This paper studies the asymptotic properties of hyperparameter estimators, including leave-k-out cross validation (LKOCV) and r-fold cross validation (RFCV), and discloses their relation to Stein's unbiased risk estimator (SURE) as well as the mean squared error (MSE). It is shown that, as the number of data goes to infinity, the LKOCV shares the same asymptotically optimal hyperparameter, which minimizes the MSE, as the SURE does, provided that the input is bounded and the ratio between the size of the training data and that of the whole data set tends to zero. We illustrate the efficacy of the theoretical result by Monte Carlo simulations.
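As a rough illustration of the quantities compared in the abstract, the following minimal Python sketch selects a ridge-regularization hyperparameter by LKOCV, by SURE, and by (a single-realization proxy of) the MSE in a simulated linear-regression setting. It is not the paper's code or notation: the model, the function names (lkocv_score, sure_score, mse_proxy), and the use of random leave-k-out splits are illustrative assumptions only.

```python
# Hypothetical sketch: compare the hyperparameter chosen by leave-k-out cross
# validation (LKOCV) and by SURE with the minimizer of a (single-realization)
# MSE proxy for ridge regression. Not the paper's experiment or notation.
import numpy as np

rng = np.random.default_rng(0)
n, p, sigma = 200, 10, 1.0          # assumed sample size, dimension, noise level
theta0 = rng.normal(size=p)         # true parameter (known only in simulation)
X = rng.normal(size=(n, p))
y = X @ theta0 + sigma * rng.normal(size=n)

def ridge(Xtr, ytr, lam):
    """Ridge estimate (X'X + lam*I)^{-1} X'y."""
    d = Xtr.shape[1]
    return np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(d), Xtr.T @ ytr)

def lkocv_score(lam, k=20, splits=50):
    """Average validation error over random leave-k-out splits."""
    errs = []
    for _ in range(splits):
        idx = rng.permutation(n)
        val, tr = idx[:k], idx[k:]
        th = ridge(X[tr], y[tr], lam)
        errs.append(np.mean((y[val] - X[val] @ th) ** 2))
    return np.mean(errs)

def sure_score(lam):
    """Stein's unbiased risk estimate of the in-sample prediction error."""
    H = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)
    resid = y - H @ y
    return resid @ resid + 2 * sigma**2 * np.trace(H) - n * sigma**2

def mse_proxy(lam):
    """Squared estimation error for this realization (stands in for the MSE,
    which would require averaging over many noise realizations)."""
    th = ridge(X, y, lam)
    return np.sum((th - theta0) ** 2)

lams = np.logspace(-2, 3, 60)
for name, score in [("LKOCV", lkocv_score), ("SURE", sure_score), ("MSE", mse_proxy)]:
    best = lams[np.argmin([score(l) for l in lams])]
    print(f"{name:5s} best lambda: {best:.3g}")
```

In this toy setting the three criteria tend to pick similar hyperparameters as n grows, which is the finite-sample flavor of the asymptotic equivalence stated above; the paper's result concerns the limit where the number of data goes to infinity under the stated conditions.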
Funding: National Natural Science Foundation of China [61603379, 61773329]; National Key Basic Research Program of China (973 Program) [2014CB845301]; President Fund of Academy of Mathematics and Systems Science, CAS [2015-hwyxqnrc-mbq]; Shenzhen Science and Technology Innovation Council [Ji-20170189, Ji-20160207]; Chinese University of Hong Kong, Shenzhen [2014.0003.23]; Swedish Research Council [2014-5894]; Thousand Youth Talents Plan of the central government of China; [PF. 01.000249]