An Ordered Lasso and Sparse Time-lagged Regression

Robert Tibshirani and Xiaotong Suo (2015-08-07)
We consider regression scenarios where it is natural to impose an order constraint on the coefficients. We propose an order-constrained version of ℓ1-regularized regression (Lasso) for this problem, and show how to solve it efficiently using the well-known Pool Adjacent Violators Algorithm as its proximal operator. The main application of this idea is to time-lagged regression, where we predict an outcome at time t from features at the previous K time points. In this setting it is natural to assume that the coefficients decay as we move farther away from t, and hence the order constraint is reasonable. Potential application areas include financial time series and prediction of dynamic patient outcomes based on clinical measurements. We illustrate this idea on real and simulated data.
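
The abstract says the order-constrained problem can be solved efficiently with the Pool Adjacent Violators Algorithm (PAVA) acting as the proximal operator. The following is a minimal proximal-gradient sketch of that idea, not the authors' reference implementation: the split of the coefficients into nonincreasing, nonnegative positive and negative parts, the fixed step size, the iteration count, and the function name ordered_lasso are illustrative assumptions, and scikit-learn's isotonic_regression is used as an off-the-shelf PAVA.

```python
# Sketch: proximal gradient for an ordered-lasso-style objective, with the
# proximal step computed by isotonic regression (PAVA). Assumptions (not from
# the abstract): beta = beta_plus - beta_minus split, step size, iteration count.
import numpy as np
from sklearn.isotonic import isotonic_regression

def ordered_lasso(X, y, lam, n_iter=500):
    n, p = X.shape
    bp = np.zeros(p)  # beta_plus:  nonincreasing, >= 0
    bm = np.zeros(p)  # beta_minus: nonincreasing, >= 0
    # Conservative fixed step: 1 / Lipschitz constant of the joint gradient.
    step = 1.0 / (2.0 * np.linalg.norm(X, 2) ** 2)
    for _ in range(n_iter):
        grad = X.T @ (X @ (bp - bm) - y)  # gradient of the squared-error loss
        # Proximal step: shift by lam*step (the l1 part), then project onto the
        # monotone nonincreasing, nonnegative cone via PAVA (isotonic regression).
        bp = isotonic_regression(bp - step * grad - lam * step,
                                 y_min=0.0, increasing=False)
        bm = isotonic_regression(bm + step * grad - lam * step,
                                 y_min=0.0, increasing=False)
    return bp - bm
```

For the time-lagged use case, the columns of X would hold the lagged features ordered from lag 1 to lag K, so the monotone constraint expresses the assumed decay of the coefficients as the lag grows.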