Sparse logistic regression has developed rapidly over the past two decades, from the original $\ell_1$-regularized model of Tibshirani (1996) to the sparsity-constrained models of Bahmani, Raj, and Boufounos (2013) and Plan and Vershynin (2013). This paper addresses sparsity-constrained logistic regression via the classical Newton method. We begin by analysing its first-order optimality condition to obtain a strong $\tau$-stationary point for some $\tau > 0$. This stationary point allows us to derive an equivalent system of stationary equations that can be solved efficiently by Newton's method. The proposed method, NSLR (Newton method for Sparse Logistic Regression), enjoys low computational complexity, a local quadratic convergence rate, and termination within finitely many steps. Numerical experiments on random and real data demonstrate its superior performance against seven state-of-the-art solvers.
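The τ-stationarity idea summarized above can be sketched in code. The following is a simplified illustration, not the paper's exact NSLR algorithm: it combines hard-thresholding support selection (the top-$s$ entries of $w - \tau \nabla f(w)$) with a Newton step restricted to that support, and adds a small ridge term `lam` purely for numerical stability. All names here are illustrative assumptions.

```python
import numpy as np

def logistic_grad_hess(X, y, w, lam=1e-3):
    """Gradient and Hessian of the averaged logistic loss, with a small
    ridge term lam added only for numerical stability (an assumption of
    this sketch, not part of the paper's model)."""
    m = len(y)
    z = y * (X @ w)
    p = np.exp(-np.logaddexp(0.0, z))      # sigmoid(-z), overflow-safe
    grad = -(X.T @ (y * p)) / m + lam * w
    d = p * (1.0 - p)
    hess = (X.T * d) @ X / m + lam * np.eye(X.shape[1])
    return grad, hess

def nslr_sketch(X, y, s, tau=1.0, iters=50, tol=1e-8):
    """Hard-thresholding/restricted-Newton sketch for logistic regression
    under the constraint ||w||_0 <= s. A hypothetical simplification of
    the NSLR idea, not the algorithm from the paper."""
    n = X.shape[1]
    w = np.zeros(n)
    for _ in range(iters):
        g, H = logistic_grad_hess(X, y, w)
        u = w - tau * g                     # gradient step
        T = np.argsort(-np.abs(u))[:s]      # support: top-s entries of |u|
        w_new = np.zeros(n)
        # Newton step on the subproblem restricted to the support T
        w_new[T] = w[T] - np.linalg.solve(H[np.ix_(T, T)], g[T])
        if np.linalg.norm(w_new - w) < tol:
            w = w_new
            break
        w = w_new
    return w
```

The support is re-selected at every iteration from the thresholded point, so the Newton system is always of size $s \times s$, which is the source of the low per-iteration cost mentioned in the abstract.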