The support vector machine (SVM) is one of the most popular classification models; it aims to separate two classes with the maximum margin between them. Previous studies have demonstrated the superiority of the SVM for high-dimensional, low-sample-size (HDLSS) data analysis problems. However, as the sample size increases, the numerical difficulties in computing the SVM become severe. Although a large number of SVM solvers exist in the literature, few are designed to exploit the special structure of the model. In this paper, we propose a highly efficient semismooth Newton based augmented Lagrangian method for solving the large-scale convex quadratic programming problem arising from the dual of the SVM, whose constraints consist of one linear equality constraint and a simple box constraint. By leveraging available error bound results to realize the asymptotic superlinear convergence of the augmented Lagrangian method, and by exploiting the second-order sparsity of the problem through the semismooth Newton method, the proposed algorithm solves these difficult problems efficiently. Numerical comparisons with a number of state-of-the-art solvers on real data sets demonstrate the high efficiency and robustness of our algorithm.
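For reference, the dual quadratic program with one linear equality constraint and a simple box constraint mentioned above has, in the standard soft-margin C-SVM formulation, the following shape (a sketch in common notation; the paper's exact symbols and scaling may differ):

```latex
% Standard dual of the soft-margin C-SVM classifier.
% Assumed notation: labels y_i \in \{-1,+1\}, kernel matrix K,
% Q_{ij} = y_i y_j K_{ij}, e the all-ones vector, C > 0 the penalty.
\begin{aligned}
\min_{\alpha \in \mathbb{R}^{n}} \quad & \tfrac{1}{2}\,\alpha^{\top} Q \alpha \;-\; e^{\top} \alpha \\
\text{s.t.} \quad & y^{\top} \alpha = 0
  && \text{(one linear equality constraint)}, \\
& 0 \le \alpha \le C e
  && \text{(simple box constraint)}.
\end{aligned}
```

It is this combination of a dense quadratic objective with a single equality constraint and elementwise bounds that the augmented Lagrangian and semismooth Newton machinery is designed to exploit.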