Lookup NU author(s): Dr Wanqing Zhao (ORCiD)
Full text for this publication is not currently held within this repository. Alternative links are provided below where available.
Abstract: © 2017 The Author(s). Published by the Royal Society. All rights reserved. Least angle regression, as a promising model selection method, differentiates itself from conventional stepwise and stagewise methods in that it is neither too greedy nor too slow. It is closely related to L1-norm optimization, which achieves low prediction variance by sacrificing some model bias in order to enhance generalization capability. In this paper, we propose an efficient least angle regression algorithm for model selection over a large class of linear-in-the-parameters models, with the purpose of accelerating the model selection process. The entire algorithm works in a recursive manner, where the correlations between model terms and residuals, the evolving directions and other pertinent variables are derived explicitly and updated successively at every subset selection step. The model coefficients are computed only when the algorithm finishes, so direct matrix inversions are avoided. A detailed computational complexity analysis indicates that the proposed algorithm possesses significant computational efficiency compared with the original approach, in which the well-known efficient Cholesky decomposition is used to solve least angle regression. Three artificial and real-world examples are employed to demonstrate the effectiveness, efficiency and numerical stability of the proposed algorithm.
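For readers unfamiliar with least angle regression, the sketch below shows the standard LARS subset-selection procedure applied to a linear-in-the-parameters model, using scikit-learn's off-the-shelf solver on synthetic data. It is only an illustrative assumption of how the general technique behaves; it is not the recursive, inversion-free algorithm proposed in this paper, and all variable names and data are invented for illustration.

```python
# Minimal sketch of least angle regression (LARS) for subset selection,
# using scikit-learn's standard solver. NOT the paper's recursive algorithm.
import numpy as np
from sklearn.linear_model import Lars

rng = np.random.default_rng(0)

# Candidate regressor matrix for a linear-in-the-parameters model:
# each column is a model term (e.g. a lagged signal or basis function).
n_samples, n_terms = 200, 30            # illustrative sizes, not from the paper
X = rng.standard_normal((n_samples, n_terms))
true_coefs = np.zeros(n_terms)
true_coefs[[2, 7, 15]] = [1.5, -2.0, 0.8]   # only a few terms are truly active
y = X @ true_coefs + 0.05 * rng.standard_normal(n_samples)

# LARS adds one term per step, moving along the equiangular direction of the
# currently most-correlated terms; stop after a chosen number of terms.
model = Lars(n_nonzero_coefs=3)
model.fit(X, y)

selected = np.flatnonzero(model.coef_)
print("selected terms:", selected)
print("estimated coefficients:", model.coef_[selected])
```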
Author(s): Zhao W, Beach TH, Rezgui Y
Publication type: Article
Publication status: Published
Journal: Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences
Year: 2017
Volume: 473
Issue: 2198
Print publication date: 28/02/2017
Online publication date: 01/02/2017
Acceptance date: 03/01/2017
ISSN (print): 1364-5021
ISSN (electronic): 1471-2946
Publisher: Royal Society Publishing
URL: https://doi.org/10.1098/rspa.2016.0775
DOI: 10.1098/rspa.2016.0775