
A Scalable Frequentist Model Averaging Method

journal contribution
posted on 2022-08-23, 13:20 authored by Rong Zhu, Haiying Wang, Xinyu Zhang, Hua Liang

Frequentist model averaging is an effective technique for handling model uncertainty. However, calculating the weights for averaging is extremely difficult, if not impossible, even when the dimension of the predictor vector, p, is moderate, because there may be 2^p candidate models. The exponential size of the candidate model set makes it difficult to estimate all candidate models and introduces additional numerical errors when calculating the weights. This article proposes a scalable frequentist model averaging method, which is statistically and computationally efficient, to overcome this problem by transforming the original model using the singular value decomposition. The method enables us to find the optimal weights by considering at most p candidate models. We prove that the minimum loss of the scalable model averaging estimator is asymptotically equal to that of the traditional model averaging estimator. We apply the Mallows and jackknife criteria to the scalable model averaging estimator and prove that the resulting estimators are asymptotically optimal. We further extend the method to the high-dimensional case (i.e., p ≫ n). Numerical studies illustrate the superiority of the proposed method in terms of both statistical efficiency and computational cost.
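The core idea above — an orthogonal transformation reducing 2^p candidate models to p nested ones, with weights chosen by a Mallows-type criterion — can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the toy data, variable names, and the use of a generic simplex-constrained optimizer are all assumptions made for the example.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy data (sizes and coefficients are illustrative assumptions).
n, p = 100, 10
X = rng.normal(size=(n, p))
beta = np.linspace(1.0, 0.1, p)
y = X @ beta + rng.normal(size=n)

# SVD transform: X = U diag(s) Vt. In the rotated coordinates the
# candidate set collapses to p nested regressions on the orthonormal
# columns of U, instead of 2^p subset models.
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Fit the p nested candidate models: model k uses the first k components.
# Orthonormality makes each fit a simple projection.
coefs = U.T @ y                      # least-squares coefficients in U-space
fits = np.cumsum(U * coefs, axis=1)  # fits[:, k-1] = fitted values of model k

# Error variance estimated from the largest candidate model.
sigma2 = np.sum((y - fits[:, -1]) ** 2) / (n - p)

def mallows(w):
    """Mallows-type criterion: squared error of averaged fit plus a
    penalty on the weighted effective model size."""
    avg_fit = fits @ w
    k_eff = w @ np.arange(1, p + 1)
    return np.sum((y - avg_fit) ** 2) + 2.0 * sigma2 * k_eff

# Minimize over the weight simplex (non-negative weights summing to one).
cons = {"type": "eq", "fun": lambda w: np.sum(w) - 1.0}
res = minimize(mallows, np.full(p, 1.0 / p),
               bounds=[(0.0, 1.0)] * p, constraints=cons)
weights = res.x
averaged_fit = fits @ weights
```

Because each nested fit is a cumulative projection, all p candidate models cost a single SVD plus O(np) arithmetic, which is the source of the computational savings the abstract describes.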


Zhu’s work was partially supported by NNSF of China grants 11301514 and 71532013, by the Shanghai Municipal Science and Technology Major Project (No. 2018SHZDZX01), and by the 111 Project (No. B18015). Zhang’s research was supported by NNSF of China grants 71925007, 72091212, and 12288201, and by the CAS Project for Young Scientists in Basic Research (YSBR-008). Wang was partially supported by NSF grant 2105571.