Abstract

In this paper, we propose a stochastic variance reduced gradient method with an adaptive step size, referred to as the SVRG-New BB method, for solving convex stochastic optimization problems. The method can be viewed as a hybrid of the SVRG algorithm and a new Barzilai-Borwein (BB) step-size mechanism. Under the assumption that the objective function is strongly convex, we prove that the algorithm converges linearly. Numerical experiments show that, with properly chosen parameters, the SVRG-New BB algorithm can outperform several existing algorithms.
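To make the ingredients concrete, the following is a minimal sketch of the standard SVRG-BB combination that the abstract builds on: an SVRG inner loop whose epoch step size is set by the classical Barzilai-Borwein quotient computed from successive snapshots. It is not the paper's New BB rule, whose exact form is not given in the abstract; the function names (`svrg_bb`, `grad_full`, `grad_i`) and the 1/m scaling are illustrative assumptions following the conventional SVRG-BB scheme.

```python
import numpy as np

def svrg_bb(grad_full, grad_i, x0, n, m, eta0, n_epochs, rng=None):
    """Sketch of SVRG with a classical Barzilai-Borwein (BB) step size.

    grad_full(x): full gradient of the objective at x.
    grad_i(x, i): gradient of the i-th component function at x.
    x0: starting point; n: number of component functions;
    m: inner-loop length; eta0: step size used in the first epoch.
    """
    rng = np.random.default_rng() if rng is None else rng
    x_tilde = x0.copy()
    eta = eta0
    x_prev, g_prev = None, None

    for _ in range(n_epochs):
        g_tilde = grad_full(x_tilde)  # full gradient at the current snapshot

        # Classical BB step size from the two most recent snapshots:
        # eta_s = (1/m) * ||s||^2 / (s^T y), with s = x_tilde - x_prev
        # and y = g_tilde - g_prev (only available after the first epoch).
        if g_prev is not None:
            s = x_tilde - x_prev
            y = g_tilde - g_prev
            eta = np.dot(s, s) / (m * np.dot(s, y))
        x_prev, g_prev = x_tilde.copy(), g_tilde.copy()

        # Inner loop: variance-reduced stochastic gradient steps.
        x = x_tilde.copy()
        for _ in range(m):
            i = rng.integers(n)
            v = grad_i(x, i) - grad_i(x_tilde, i) + g_tilde
            x = x - eta * v
        x_tilde = x  # use the last inner iterate as the next snapshot

    return x_tilde
```

In this standard scheme the 1/m factor scales the BB quotient to the epoch length; the New BB mechanism proposed in the paper replaces this step-size rule, and its precise form is developed in the body of the paper.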