This talk is devoted to an efficient proximal-gradient (PG) method for solving variational inequality problems with a monotone and Lipschitz-continuous mapping in Hilbert spaces. In existing PG methods, the step size either requires knowledge of the Lipschitz constant of the mapping or a linesearch procedure, or is generated adaptively according to the progress of the algorithm but must decrease to a smaller positive number (or even zero) to guarantee convergence, which may not be practical. To overcome these drawbacks, we present a proximal extrapolated gradient algorithm with a larger step size, and we extend the acceptable range of the parameters that ensure convergence. Owing to this extended parameter range, the theory only guarantees the existence of a subsequence converging weakly to a solution of the problem; however, such a subsequence can be extracted easily in our algorithm by comparing two values per iteration, without any additional computation. The proposed method is as simple as the classical proximal gradient method, requiring only one proximal operator evaluation and one value of the mapping per iteration. We establish an ergodic convergence rate in the general case and an R-linear convergence rate in a special case under a strong monotonicity assumption. Numerical experiments illustrate the gains in efficiency from the larger step size.
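The abstract does not spell out the update rule, but the proximal extrapolated gradient template it refers to (evaluate the mapping at an extrapolated point, apply one proximal operator per iteration, and bound the step size by a local ratio instead of the global Lipschitz constant) can be sketched roughly as follows. This is a minimal illustrative sketch, not the talk's actual method: the names `peg` and `prox_box`, the parameter values, and the non-increasing step-size bound are all assumptions; in particular, the bound shown is the standard conservative one from the literature, whereas the talk's contribution is precisely a rule that admits larger steps.

```python
import numpy as np

def prox_box(x, lo=-1.0, hi=1.0):
    # Proximal operator of the indicator of a box = projection onto [lo, hi]^n.
    return np.clip(x, lo, hi)

def peg(F, prox, x0, lam0=1.0, alpha=0.5, c=0.45, iters=500):
    """Illustrative proximal extrapolated gradient sketch (assumed details).

    The mapping F is evaluated only at the extrapolated point
    y_k = x_k + alpha*(x_k - x_{k-1}), so each iteration costs one value of F
    and one prox, matching the per-iteration cost described in the abstract.
    The step lam_k is kept below c*||y_k - y_{k-1}|| / ||F(y_k) - F(y_{k-1})||,
    a local estimate of 1/L that avoids knowing the Lipschitz constant.
    """
    x_prev, x = x0.copy(), x0.copy()
    y_prev, Fy_prev = x0.copy(), F(x0)
    lam = lam0
    for _ in range(iters):
        y = x + alpha * (x - x_prev)
        Fy = F(y)
        dy = np.linalg.norm(y - y_prev)
        dF = np.linalg.norm(Fy - Fy_prev)
        if dF > 0:
            # Conservative non-increasing step bound (the scheme the talk improves on).
            lam = min(lam, c * dy / dF)
        x_prev, x = x, prox(x - lam * Fy)
        y_prev, Fy_prev = y, Fy
    return x
```

As a quick check, for the strongly monotone mapping F(x) = x - b over a box, the solution of the variational inequality is the projection of b onto the box, which the iteration above recovers.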