The sparse group Lasso is a widely used statistical model that encourages sparsity both at the group level and within each group. In this paper, we develop an efficient
augmented Lagrangian method for large-scale non-overlapping sparse group Lasso
problems with each subproblem being solved by a superlinearly convergent inexact
semismooth Newton method. Theoretically, we prove that, if the penalty parameter
is chosen sufficiently large, the augmented Lagrangian method converges globally at
an arbitrarily fast linear rate for the primal iterative sequence, the dual infeasibility,
and the duality gap of the primal and dual objective functions. Computationally, we
derive explicitly the generalized Jacobian of the proximal mapping associated with
the sparse group Lasso regularizer and fully exploit the underlying second-order sparsity through the semismooth Newton method. The efficiency and robustness of our
proposed algorithm are demonstrated by numerical experiments on both synthetic and real data sets.
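As a minimal sketch of the key computational object, the proximal mapping of the (non-overlapping) sparse group Lasso regularizer lam1*||x||_1 + lam2*sum_g ||x_g||_2 admits a well-known closed form: elementwise soft-thresholding followed by group-wise block soft-thresholding. The function name, unit group weights, and the `groups` partition format below are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def prox_sparse_group_lasso(x, groups, lam1, lam2):
    """Proximal mapping of lam1*||x||_1 + lam2*sum_g ||x_g||_2.

    Assumes non-overlapping groups given as a list of index arrays
    partitioning x, with unit group weights (illustrative simplification).
    """
    # Step 1: elementwise soft-thresholding (prox of the l1 term)
    v = np.sign(x) * np.maximum(np.abs(x) - lam1, 0.0)
    out = np.zeros_like(v)
    for g in groups:
        norm = np.linalg.norm(v[g])
        if norm > 0.0:
            # Step 2: block soft-thresholding (prox of the group l2 term)
            out[g] = max(0.0, 1.0 - lam2 / norm) * v[g]
    return out
```

For example, with a single group, x = [3, -1, 0.5], and lam1 = lam2 = 1, the l1 step yields [2, 0, 0] and the group step shrinks it to [1, 0, 0]; a large enough lam2 zeroes the whole group, which is the group-level sparsity mechanism.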