
An Outer-Product-of-Gradient Approach to Dimension Reduction and its Application to Classification in High Dimensional Space

Journal contribution posted on 2022-01-13, authored by Zhibo Cai, Yingcun Xia, and Weiqiang Hang.

Sufficient dimension reduction (SDR) has progressed steadily. However, its ability to improve general function estimation or classification has not been well established, especially for high-dimensional data. In this article, we first devise a local linear smoother for high-dimensional nonparametric regression and then utilise it in the outer-product-of-gradient (OPG) approach of SDR. We call the method high-dimensional OPG (HOPG). To apply SDR to classification in high-dimensional data, we propose an ensemble classifier that aggregates the results of classifiers built on subspaces obtained by applying random projection and HOPG consecutively to the data. Asymptotic results for both HOPG and the classifier are established. Superior performance over existing methods is demonstrated in simulations and real data analyses. Supplementary materials for this article are available online.
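For readers unfamiliar with the OPG idea that HOPG builds on, the following is a minimal sketch: local-linear regression gives a gradient estimate at each sample point, and the leading eigenvectors of the averaged outer product of those gradients span the estimated reduction subspace. The Gaussian kernel, rule-of-thumb bandwidth, and plain weighted least squares below are illustrative assumptions, not the authors' HOPG smoother, whose high-dimensional construction is described in the article.

```python
# Sketch of the classical OPG step (not the HOPG estimator of the paper).
import numpy as np

def opg_directions(X, y, d, h=None):
    """Estimate a d-dimensional reduction subspace from outer products
    of local-linear gradient estimates (classical OPG sketch)."""
    n, p = X.shape
    if h is None:
        h = 2.0 * n ** (-1.0 / (p + 6))  # rough rule-of-thumb bandwidth (assumption)
    M = np.zeros((p, p))
    for i in range(n):
        diff = X - X[i]                                          # design centered at X[i]
        w = np.exp(-np.sum(diff ** 2, axis=1) / (2 * h ** 2))    # Gaussian kernel weights
        Z = np.hstack([np.ones((n, 1)), diff])                   # intercept + linear term
        sw = np.sqrt(w)
        beta, *_ = np.linalg.lstsq(Z * sw[:, None], y * sw, rcond=None)
        grad = beta[1:]                                          # local gradient estimate
        M += np.outer(grad, grad)                                # accumulate outer products
    # leading eigenvectors of the averaged outer product span the estimate
    eigvals, eigvecs = np.linalg.eigh(M / n)
    return eigvecs[:, ::-1][:, :d]

# Toy usage: the response depends on X only through a single direction.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = np.sin(X @ np.ones(10) / np.sqrt(10)) + 0.1 * rng.normal(size=200)
B = opg_directions(X, y, d=1)   # estimated direction, up to sign
```

In the ensemble classifier proposed in the article, steps of this kind are applied after random projections of the data, and the resulting low-dimensional classifiers are aggregated.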

Funding

We are grateful to the Associate Editor and three reviewers for their insightful comments and suggestions, which greatly helped us improve an earlier version of this manuscript. ZC was supported by AcRF grant R-155-000-220-114 of the National University of Singapore. YX was partially supported by the National Natural Science Foundation of China (Nos. 72033002 and 11931014) and by AcRF grant R-155-000-220-114 of the National University of Singapore.

History

Version 2: 2022-01-13
Version 1: 2021-11-09