This document summarizes a new method for solving regularized empirical risk minimization problems in the mini-batch setting. The proposed method, called Doubly Accelerated Stochastic Variance Reduced Gradient, combines inner and outer acceleration to improve the mini-batch efficiency of previous methods such as SVRG and AccProxSVRG. It achieves this by applying Nesterov's acceleration both within each inner loop and across the outer iterations of the AccProxSVRG algorithm. Numerical experiments demonstrate that the new method reaches a given optimization error with a smaller mini-batch size than prior methods.
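To make the "double acceleration" idea concrete, the following is a minimal sketch of an SVRG-style loop with Nesterov-type momentum applied both inside each epoch (inner acceleration) and across epoch snapshots (outer acceleration), here on a simple ridge-regression objective. The momentum coefficients, step size, and exact update form are illustrative assumptions for exposition, not the paper's precise scheme or parameter choices.

```python
import numpy as np

def doubly_accelerated_svrg(A, b, lam, lr=0.05, epochs=30, batch=8,
                            beta_in=0.5, beta_out=0.3, seed=0):
    """Illustrative sketch (not the paper's exact algorithm) of SVRG with
    momentum in both the inner loop and across outer snapshots, applied to
    f(x) = (1/2n)||Ax - b||^2 + (lam/2)||x||^2."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    x_prev = np.zeros(d)

    def full_grad(w):
        return A.T @ (A @ w - b) / n + lam * w

    for _ in range(epochs):
        # Outer acceleration: extrapolate the snapshot using momentum
        # across epochs before computing the full gradient.
        snapshot = x + beta_out * (x - x_prev)
        x_prev = x.copy()
        g_full = full_grad(snapshot)

        x_in = snapshot.copy()
        x_in_old = snapshot.copy()
        for _ in range(n // batch):
            idx = rng.integers(0, n, size=batch)
            Ai, bi = A[idx], b[idx]
            # Inner acceleration: Nesterov-style extrapolation point
            # within the epoch.
            y = x_in + beta_in * (x_in - x_in_old)
            # Variance-reduced mini-batch gradient at the extrapolated point.
            g_y = Ai.T @ (Ai @ y - bi) / batch + lam * y
            g_s = Ai.T @ (Ai @ snapshot - bi) / batch + lam * snapshot
            v = g_y - g_s + g_full
            x_in_old = x_in
            x_in = y - lr * v
        x = x_in
    return x
```

Because the variance-reduced gradient `v` is unbiased for the gradient at the extrapolation point `y`, momentum can be used with mini-batches without the noise blow-up that plain accelerated SGD suffers; this is the mechanism the summary refers to as combining inner and outer acceleration.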