We present a new solution framework for the generalized trust region subproblem (GTRS), which minimizes a quadratic objective subject to a quadratic constraint. Specifically, we derive a convex quadratic reformulation (CQR) of the GTRS that minimizes a linear objective over two convex quadratic constraints. We show that an optimal solution of the GTRS can be recovered from an optimal solution of the CQR. We further prove that, for the case under investigation, this CQR is equivalent to minimizing the maximum of the two convex quadratic functions appearing in its constraints. Although this minimax problem is nonsmooth, it is well structured and convex. We therefore develop two steepest descent algorithms, corresponding to two different line search rules, and prove a global sublinear convergence rate for both. We also establish a local linear convergence rate for the first algorithm by estimating the Kurdyka-{\L}ojasiewicz exponent at any optimal solution under mild conditions. Finally, we demonstrate the efficiency of our algorithms in numerical experiments.
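To make the minimax reformulation concrete, the following is a minimal sketch (not the paper's exact algorithms) of a steepest-descent-type method for minimizing $\max\{q_1(x), q_2(x)\}$ for two convex quadratics $q_i(x) = \tfrac12 x^\top A_i x + b_i^\top x + c_i$. When only one function is active, the method follows its negative gradient; when both are (nearly) active, it uses the negative of the minimum-norm convex combination of the two gradients, combined with an Armijo backtracking line search. All problem data below are made up for illustration.

```python
import numpy as np

# Two convex quadratics q_i(x) = 0.5 x'A_i x + b_i'x + c_i (illustrative data).
A1 = np.array([[2.0, 0.0], [0.0, 1.0]]); b1 = np.array([-1.0, 0.0]); c1 = 0.0
A2 = np.array([[1.0, 0.0], [0.0, 2.0]]); b2 = np.array([1.0, 0.0]);  c2 = 0.0

def q(A, b, c, x):   # quadratic value
    return 0.5 * x @ A @ x + b @ x + c

def grad(A, b, x):   # quadratic gradient
    return A @ x + b

def f(x):            # the nonsmooth convex objective: max of the two quadratics
    return max(q(A1, b1, c1, x), q(A2, b2, c2, x))

def steepest_direction(x, tol=1e-8):
    """Negative gradient of the active piece; at a (near-)tie, the negative
    minimum-norm point of the segment between the two gradients."""
    f1, f2 = q(A1, b1, c1, x), q(A2, b2, c2, x)
    g1, g2 = grad(A1, b1, x), grad(A2, b2, x)
    if f1 - f2 > tol:
        return -g1
    if f2 - f1 > tol:
        return -g2
    d = g1 - g2
    t = np.clip(d @ g1 / (d @ d + 1e-16), 0.0, 1.0)
    return -(g1 - t * d)          # -( (1-t) g1 + t g2 ), the min-norm subgradient

x = np.array([5.0, 5.0])
for _ in range(500):
    d = steepest_direction(x)
    if np.linalg.norm(d) < 1e-8:  # approximate stationarity of the max function
        break
    # Armijo backtracking line search on f
    f0, t = f(x), 1.0
    while f(x + t * d) > f0 - 1e-4 * t * (d @ d):
        t *= 0.5
        if t < 1e-12:
            break
    x = x + t * d

print(x, f(x))
```

For this instance both quadratics are active at the minimizer, so the min-norm tie-breaking rule is what allows the iterates to settle at the kink instead of oscillating across it.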