IEEE Transactions on Neural Networks
Keywords: regression, active set, support vector
We present ASVR, a new active set strategy for solving a straightforward reformulation of the standard support vector regression problem. The algorithm is based on the successful ASVM algorithm for classification, and consists of solving a finite number of systems of linear equations whose dimension equals the (typically large) number of points to be approximated. However, by making use of the Sherman-Morrison-Woodbury formula, only a much smaller matrix of the order of the original input space is inverted at each step. The algorithm requires no specialized quadratic or linear programming code, merely a publicly available linear equation solver. ASVR is extremely fast, produces generalization error comparable to that of other popular algorithms, and is available on the web for download.
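The key computational trick described in the abstract is the Sherman-Morrison-Woodbury (SMW) identity, which lets a large m x m inverse be computed by solving only a small k x k system when the matrix has the low-rank form I/nu + H H^T. The sketch below illustrates that identity in isolation; the shapes, variable names, and the value of nu are illustrative assumptions, not taken from the ASVR paper's actual implementation.

```python
import numpy as np

# SMW identity as used by ASVM-style solvers: for nu > 0 and H of shape (m, k),
#   (I_m / nu + H H^T)^{-1} = nu * (I_m - H (I_k / nu + H^T H)^{-1} H^T),
# so when m (number of data points) >> k (input dimension + 1),
# only a k x k system is solved instead of inverting an m x m matrix.

def smw_inverse(H, nu):
    """Invert (I/nu + H @ H.T) via Sherman-Morrison-Woodbury."""
    m, k = H.shape
    small = np.eye(k) / nu + H.T @ H              # k x k, cheap to factor
    return nu * (np.eye(m) - H @ np.linalg.solve(small, H.T))

# Check against direct inversion on illustrative random data.
rng = np.random.default_rng(0)
H = rng.standard_normal((500, 5))                 # m = 500 points, k = 5
nu = 10.0
direct = np.linalg.inv(np.eye(500) / nu + H @ H.T)
assert np.allclose(smw_inverse(H, nu), direct)
```

The per-step cost thus scales with the input dimension rather than with the number of training points, which is what makes the active set iteration practical on large data sets.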
Carleton College does not own the copyright to this work and the work is available through the Carleton College Library following the original publisher's policies regarding self-archiving. For more information on the copyright status of this work, refer to the current copyright holder.
Publisher PDF Archiving
D. R. Musicant and A. Feinberg, "Active Set Support Vector Regression," IEEE Transactions on Neural Networks, vol. 15, no. 2, pp. 268-275, Jan 2004. Accessed via Carleton Digital Commons (Faculty Work, Computer Science).