First-order methods such as the proximal gradient method, which are based on forward–backward splitting techniques, have proved very effective for nonsmooth convex minimization problems, which arise in many practical applications across fields such as machine learning and image processing. In this paper, we propose several new forward–backward splitting algorithms that require fewer iterations to converge to an optimum. In addition, we derive convergence rates for the proposed formulations and show that they converge significantly faster than the traditional forward–backward algorithm. To demonstrate their practical applicability, we apply them to two real-world problems, one from machine learning and one from image processing: the first is regression on high-dimensional datasets, and the second is image deblurring. Numerical experiments conducted on several publicly available real datasets verify the theoretical results and demonstrate the superiority of our algorithms over classical first-order methods in terms of accuracy, the number of iterations required to converge, and the rate of convergence.
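For context, the classical forward–backward (proximal gradient) iteration that serves as the baseline here solves a composite problem min_x f(x) + g(x), with f smooth and g proximable, by alternating a gradient (forward) step on f with a proximal (backward) step on g. The sketch below is a minimal NumPy illustration of this standard baseline applied to the lasso regression problem, which matches the paper's first application area; the variable names, step-size choice, and synthetic problem instance are illustrative assumptions, not the paper's proposed algorithms or experimental setup.

```python
# Minimal sketch of the classical forward-backward (proximal gradient)
# baseline for the lasso problem:
#   min_x  (1/2) * ||Ax - b||^2 + lam * ||x||_1
# All names and the synthetic data below are illustrative assumptions.
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward_lasso(A, b, lam, n_iter=500):
    """Iterate x_{k+1} = prox_{gam * g}(x_k - gam * grad_f(x_k))."""
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of grad f: ||A||_2^2
    gam = 1.0 / L                      # constant step size in (0, 2/L)
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                       # forward step on smooth f
        x = soft_threshold(x - gam * grad, gam * lam)  # backward step on l1 term
    return x

# Tiny usage example on synthetic sparse-regression data.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(50)
x_hat = forward_backward_lasso(A, b, lam=0.1)
```

The fixed step size 1/L and plain (non-accelerated) update shown here are exactly what the proposed variants aim to improve on, typically by reducing the number of such iterations needed to reach a given accuracy.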