Singular value decomposition
Gradient descent
Stochastic gradient descent
Matrix factorization
Computer science
Convergence (economics)
Rate of convergence
Rank (graph theory)
Matrix completion
Singular value
Matrix (chemical analysis)
Scalability
Algorithm
Mathematical optimization
Mathematics
Artificial intelligence
Artificial neural network
Eigenvector
Physics
Economics
Computer network
Channel (broadcasting)
Materials science
Quantum mechanics
Combinatorics
Database
Composite material
Gaussian distribution
Economic growth
Authors
Sandeep Raghuwanshi, R. K. Pateriya
Identifier
DOI: 10.1016/j.jksuci.2018.03.012
Abstract
The limitations of neighborhood-based Collaborative Filtering (CF) techniques on large-scale, sparse data are an obstacle to efficient recommendation systems: these techniques show poor accuracy and slow speed in generating recommendations. Model-based matrix factorization is an alternative approach used to overcome the aforementioned limitations of CF. Singular Value Decomposition (SVD) is a widely used technique for obtaining low-rank factors of the rating matrix, with Gradient Descent (GD) or Alternating Least Squares (ALS) used to optimize its error objective function. Most researchers have focused on prediction accuracy but have not considered the convergence rate of the learning approach. In this paper, we propose a new filtering technique that implements SVD with Stochastic Gradient Descent (SGD) optimization and provides an accelerated version of SVD that converges faster in its learning parameters with improved classification accuracy. Our proposed method accelerates SVD in the right direction and dampens oscillation by adding a momentum term to the parameter updates. To support our claim, we tested the proposed model on well-known real-world datasets (MovieLens100k, FilmTrust and YahooMovie). The proposed Accelerated Singular Value Decomposition (ASVD) outperformed the existing models, achieving a higher convergence rate and better classification accuracy.
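The core idea described in the abstract, SVD-style matrix factorization trained with SGD where a momentum term is added to the parameter updates to speed convergence and damp oscillation, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' ASVD implementation; the function name asvd_sgd, the hyperparameter values, and the toy data are all hypothetical.

```python
import numpy as np

def asvd_sgd(ratings, k=10, lr=0.01, reg=0.02, momentum=0.9, epochs=20, seed=0):
    """Factorize a sparse rating matrix with SGD plus momentum (illustrative sketch).

    ratings: list of (user, item, rating) triples with 0-based indices.
    Returns user-factor matrix P (n_users x k) and item-factor matrix Q (n_items x k).
    """
    rng = np.random.default_rng(seed)
    n_users = max(u for u, _, _ in ratings) + 1
    n_items = max(i for _, i, _ in ratings) + 1
    P = 0.1 * rng.standard_normal((n_users, k))
    Q = 0.1 * rng.standard_normal((n_items, k))
    vP = np.zeros_like(P)  # momentum buffer for user factors
    vQ = np.zeros_like(Q)  # momentum buffer for item factors

    for _ in range(epochs):
        for idx in rng.permutation(len(ratings)):   # visit observed ratings in random order
            u, i, r = ratings[idx]
            err = r - P[u] @ Q[i]                   # prediction error on this rating
            grad_p = -err * Q[i] + reg * P[u]       # regularized gradient w.r.t. P[u]
            grad_q = -err * P[u] + reg * Q[i]       # regularized gradient w.r.t. Q[i]
            vP[u] = momentum * vP[u] - lr * grad_p  # momentum accelerates consistent
            vQ[i] = momentum * vQ[i] - lr * grad_q  # directions and damps oscillation
            P[u] += vP[u]
            Q[i] += vQ[i]
    return P, Q


# Toy usage with made-up data: predict user 0's rating of item 2.
data = [(0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0), (1, 2, 1.0), (2, 1, 2.0), (2, 2, 4.5)]
P, Q = asvd_sgd(data, k=2, epochs=100)
print(P[0] @ Q[2])
```

Setting momentum to 0 reduces this to plain SGD matrix factorization, which makes it easy to compare convergence behavior with and without the momentum term.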