Type
Article
Keywords
regression, active set, support vector
Abstract
We present ASVR, a new active set strategy to solve a straightforward reformulation of the standard support vector regression problem. This new algorithm is based on the successful ASVM algorithm for classification problems, and consists of solving a finite number of linear equations with a typically large dimensionality equal to the number of points to be approximated. However, by making use of the Sherman-Morrison-Woodbury formula, a much smaller matrix of the order of the original input space is inverted at each step. The algorithm requires no specialized quadratic or linear programming code, but merely a linear equation solver which is publicly available. ASVR is extremely fast, produces comparable generalization error to other popular algorithms, and is available on the web for download.
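The abstract's key computational trick is the Sherman-Morrison-Woodbury identity: a linear system whose dimension equals the number of data points is solved by inverting only a small matrix of the order of the input-space dimension. A minimal sketch of that idea, assuming a system of the form (I/nu + H H^T) x = b with H an m-by-k data matrix and k much smaller than m (the symbols H and nu follow common SVM notation and are assumptions, not taken verbatim from the paper):

```python
import numpy as np

def smw_solve(H, nu, b):
    """Solve (I/nu + H H^T) x = b via Sherman-Morrison-Woodbury.

    H  : (m, k) matrix with k << m
    nu : positive scalar
    b  : (m,) right-hand side

    SMW gives (I/nu + H H^T)^{-1} = nu * (I - H (I/nu + H^T H)^{-1} H^T),
    so only a k x k system is solved instead of an m x m one.
    """
    k = H.shape[1]
    small = np.eye(k) / nu + H.T @ H            # k x k matrix
    return nu * (b - H @ np.linalg.solve(small, H.T @ b))

# Illustrative check against a direct m x m solve.
rng = np.random.default_rng(0)
m, k = 100, 3
H = rng.standard_normal((m, k))
b = rng.standard_normal(m)
x = smw_solve(H, nu=0.5, b=b)
x_direct = np.linalg.solve(np.eye(m) / 0.5 + H @ H.T, b)
print(np.allclose(x, x_direct))
```

The cost per step drops from O(m^3) to O(m k^2 + k^3), which is why the paper reports the algorithm scaling to large numbers of training points.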
Language
English
Department(s)
Computer Science
Journal or Book Title
IEEE Transactions on Neural Networks
Publication Year
2004
Issue Month/Season
March
DOI
10.1109/TNN.2004.824259
Publisher
IEEE
Rights Management
Carleton College does not own the copyright to this work and the work is available through the Carleton College Library following the original publisher policies regarding self-archiving. For more information on the copyright status of this work, refer to the current copyright holder.
RoMEO Color
Green
Preprint Archiving
Yes
Postprint Archiving
Yes
Publisher PDF Archiving
Yes
Contributing Organization
Carleton College
Format
application/pdf
Recommended Citation
D. R. Musicant and A. Feinberg, "Active Set Support Vector Regression," IEEE Transactions on Neural Networks, vol. 15, no. 2, pp. 268-275, March 2004. Available at: https://doi.org/10.1109/TNN.2004.824259. Accessed via Faculty Work. Computer Science. Carleton Digital Commons. https://digitalcommons.carleton.edu/cs_faculty/3
The definitive version is available at https://doi.org/10.1109/TNN.2004.824259
