Stochastic gradient descent
Gradient descent
Computer science
Algorithm
Computational complexity theory
Function (biology)
Regression
Exponential growth
Linear regression
Big data
Artificial intelligence
Mathematics
Data mining
Machine learning
Statistics
Artificial neural network
Evolutionary biology
Biology
Mathematical analysis
Authors
J Nishchal, Neel Bhandari
Identifier
DOI:10.36227/techrxiv.14544000.v1
Abstract
Information is growing exponentially, and the world increasingly mines knowledge from Big Data. Machine Learning is the use of labelled data for automated learning and data analysis. Linear Regression is a statistical method for predictive analysis, and Gradient Descent minimizes its cost function, the mean squared error, by iteratively updating parameters along the negative gradient. This work presents an insight into three variants of the algorithm: Batch Gradient Descent, Stochastic Gradient Descent, and Mini-Batch Gradient Descent. Each is implemented on a linear regression dataset to determine its computational complexity and to study how factors such as the learning rate, batch size, and number of iterations affect its efficiency.
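The three variants discussed in the abstract differ only in how many samples contribute to each parameter update: all of them (batch), one at a time (stochastic), or a small subset (mini-batch). Below is a minimal sketch of this comparison on a synthetic one-dimensional linear regression problem; the function names, learning rates, and the toy dataset (y = 3x + 2 plus noise) are illustrative choices, not taken from the paper.

```python
import numpy as np

def batch_gradient_descent(X, y, lr=0.5, epochs=500):
    """Full-batch: one update per epoch using the gradient over all samples."""
    w, b = 0.0, 0.0
    n = len(X)
    for _ in range(epochs):
        err = w * X + b - y                      # residuals for every sample
        w -= lr * (2 / n) * np.dot(err, X)       # d(MSE)/dw
        b -= lr * (2 / n) * err.sum()            # d(MSE)/db
    return w, b

def stochastic_gradient_descent(X, y, lr=0.05, epochs=50, seed=0):
    """One update per sample, visiting samples in a shuffled order each epoch."""
    rng = np.random.default_rng(seed)
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            err = w * X[i] + b - y[i]            # residual of a single sample
            w -= lr * 2 * err * X[i]
            b -= lr * 2 * err
    return w, b

def mini_batch_gradient_descent(X, y, lr=0.2, epochs=100, batch_size=8, seed=0):
    """One update per mini-batch: a compromise between the two extremes."""
    rng = np.random.default_rng(seed)
    w, b = 0.0, 0.0
    for _ in range(epochs):
        idx = rng.permutation(len(X))
        for start in range(0, len(X), batch_size):
            batch = idx[start:start + batch_size]
            err = w * X[batch] + b - y[batch]
            w -= lr * (2 / len(batch)) * np.dot(err, X[batch])
            b -= lr * (2 / len(batch)) * err.sum()
    return w, b

# Synthetic data for y = 3x + 2 with small Gaussian noise (illustrative only).
rng = np.random.default_rng(42)
X = rng.uniform(0, 1, 64)
y = 3 * X + 2 + rng.normal(0, 0.05, 64)

results = {}
for name, fit in [("batch", batch_gradient_descent),
                  ("stochastic", stochastic_gradient_descent),
                  ("mini-batch", mini_batch_gradient_descent)]:
    w, b = fit(X, y)
    results[name] = (w, b)
    print(f"{name:>10}: w = {w:.3f}, b = {b:.3f}")
```

All three variants recover parameters close to the true (3, 2), but they trade off differently: batch descent computes an exact gradient at high per-step cost, stochastic descent makes many cheap noisy steps, and mini-batch descent balances gradient accuracy against update frequency, which is why batch size and learning rate interact in the efficiency study the abstract describes.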