A Computational Framework for Multivariate Convex Regression and Its Variants

Rahul Mazumder, Arkopal Choudhury, Garud Iyengar, Bodhisattva Sen

We study the nonparametric least squares estimator (LSE) of a multivariate convex regression function. The LSE, given as the solution to a quadratic program with O(n²) linear constraints (n being the sample size), is difficult to compute for large problems. Exploiting problem-specific structure, we propose a scalable algorithmic framework based on the augmented Lagrangian method to compute the LSE. We develop a novel approach to obtain smooth convex approximations to the fitted (piecewise affine) convex LSE and provide formal bounds on the quality of approximation. When the number of samples is not too large compared to the dimension of the predictor, we propose a regularization scheme, Lipschitz convex regression, where we constrain the norm of the subgradients, and study the rates of convergence of the obtained LSE. Our algorithmic framework is simple and flexible and can be easily adapted to handle variants: estimation of a nondecreasing/nonincreasing convex/concave (with or without a Lipschitz bound) function. We perform numerical studies illustrating the scalability of the proposed algorithm; on some instances our proposal leads to more than a 10,000-fold improvement in runtime when compared to off-the-shelf interior point solvers for problems with n = 500.
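The quadratic program behind the LSE is easy to state, which makes the scaling difficulty concrete. The sketch below is a minimal illustration in Python using cvxpy, an off-the-shelf modeling layer of the kind the paper's algorithm is designed to outperform, not the paper's own solver; the toy data, the Lipschitz bound L, and the helper phi_hat are hypothetical, and the log-sum-exp smoothing shown is one standard smoothing device rather than the paper's approximation scheme.

```python
import numpy as np
import cvxpy as cp

# Toy data (hypothetical): a convex truth plus Gaussian noise.
rng = np.random.default_rng(0)
n, d = 50, 2
X = rng.normal(size=(n, d))
y = np.sum(X**2, axis=1) + 0.1 * rng.normal(size=n)

theta = cp.Variable(n)      # fitted values theta_i ~ phi(x_i)
xi = cp.Variable((n, d))    # a subgradient of phi at each x_i

# Convexity constraints: theta_j >= theta_i + <xi_i, x_j - x_i> for all i != j.
# This is the O(n^2) constraint set that makes the QP hard to scale.
constraints = [
    theta[j] >= theta[i] + (X[j] - X[i]) @ xi[i]
    for i in range(n) for j in range(n) if i != j
]

# Lipschitz convex regression variant: bound the subgradient norms
# (the value of L here is arbitrary, chosen only for illustration).
L = 10.0
constraints += [cp.norm(xi[i], 2) <= L for i in range(n)]

prob = cp.Problem(cp.Minimize(cp.sum_squares(y - theta)), constraints)
prob.solve()

# The fitted function extends to all x as the max of n supporting affine
# pieces; a log-sum-exp surrogate at temperature tau > 0 is smooth, convex,
# and lies within tau * log(n) above the piecewise affine fit.
def phi_hat(x, tau=0.0):
    affine = theta.value + xi.value @ x - np.sum(xi.value * X, axis=1)
    if tau == 0.0:
        return affine.max()
    return tau * np.logaddexp.reduce(affine / tau)
```

Even at this modest n, the constraint list already has roughly n² entries, which is exactly the structure the paper's augmented Lagrangian framework is built to exploit.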

Funding

Rahul Mazumder thanks ONR for grant N00014-15-1-2342 and an interface grant from the Betty-Moore Sloan Foundation. Garud Iyengar thanks NSF grants DMS-1016571 and CMMI-1235023, and ONR grant N000140310514. Bodhisattva Sen thanks NSF CAREER grant DMS-1150435.
