Randomized algorithms for large-scale dictionary learning

Keywords: Computer science · K-SVD · Algorithms · Singular value decomposition · Kernel (algebra) · Artificial intelligence · Matrix (mathematics) · Sparse approximation · Mathematics
Authors
Gang Wu, Jiali Yang
Source
Journal: Neural Networks [Elsevier]
Volume 179, Article 106628
Identifier
DOI: 10.1016/j.neunet.2024.106628
Abstract

Dictionary learning is an important sparse representation algorithm that has been widely used in machine learning and artificial intelligence. However, for the massive data sets of the big data era, classical dictionary learning algorithms are computationally expensive and can even be infeasible. To overcome this difficulty, we propose new dictionary learning methods based on randomized algorithms. The contributions of this work are as follows. First, we observe that the dictionary matrix is often numerically low-rank. Based on this property, we apply the randomized singular value decomposition (RSVD) to the dictionary matrix and propose a randomized algorithm for linear dictionary learning. Compared with the classical K-SVD algorithm, an advantage is that all the elements of the dictionary matrix can be updated simultaneously. Second, to the best of our knowledge, there are few theoretical results on why the matrix computation problems involved in dictionary learning can be solved inexactly. To fill this gap, we show the rationality of this randomized algorithm with inexact solving from a matrix perturbation analysis point of view. Third, based on the numerically low-rank property and the Nyström approximation of the kernel matrix, we propose a randomized kernel dictionary learning algorithm, and we bound the distance between the exact solution and the computed solution to show the effectiveness of the proposed randomized kernel dictionary learning algorithm. Fourth, we propose an efficient scheme for the testing stage in kernel dictionary learning. With this strategy, there is no need to form or store kernel matrices explicitly in either the training or the testing stage. Comprehensive numerical experiments are performed on some real-world data sets. The numerical results demonstrate the rationality of our strategies and show that the proposed algorithms are much more efficient than some state-of-the-art dictionary learning algorithms. The MATLAB codes of the proposed algorithms are publicly available from https://github.com/Jiali-yang/RALDL_RAKDL.
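The abstract does not spell out the algorithmic details, so the following is only a minimal, self-contained sketch of the two generic building blocks it names: a randomized SVD (Gaussian range finding plus a small dense SVD, in the Halko-Martinsson-Tropp style) and a Nyström approximation of a kernel matrix. It is written in Python/NumPy rather than the authors' MATLAB; the function names (rsvd, rbf_kernel, nystrom_factors), the RBF kernel choice, and all parameter values are illustrative assumptions, not the paper's implementation (see the linked repository for that).

```python
import numpy as np

def rsvd(A, rank, oversample=10, n_iter=2, seed=None):
    """Randomized SVD sketch: Gaussian range finding, power iterations,
    then an exact SVD of the small projected matrix (Halko et al. style)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    k = min(rank + oversample, min(m, n))
    Y = A @ rng.standard_normal((n, k))          # sample the range of A
    for _ in range(n_iter):                      # power iterations sharpen the
        Q, _ = np.linalg.qr(Y)                   # sketch when the spectrum
        Y = A @ (A.T @ Q)                        # decays slowly
    Q, _ = np.linalg.qr(Y)
    B = Q.T @ A                                  # small k x n matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :rank], s[:rank], Vt[:rank, :]

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian (RBF) kernel matrix between the rows of A and the rows of B."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def nystrom_factors(X, landmark_idx, gamma=0.5):
    """Nystrom approximation K ~= C @ pinv(W) @ C.T built from a landmark
    subset of columns, so the full n x n kernel matrix is never formed."""
    C = rbf_kernel(X, X[landmark_idx], gamma)    # n x m block of K
    W = C[landmark_idx, :]                       # m x m landmark block
    return C, np.linalg.pinv(W)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # A synthetic, numerically low-rank "dictionary-like" matrix
    D = rng.standard_normal((500, 60)) @ rng.standard_normal((60, 2000))
    U, s, Vt = rsvd(D, rank=60, seed=0)
    print("RSVD relative error:",
          np.linalg.norm(D - (U * s) @ Vt) / np.linalg.norm(D))

    X = rng.standard_normal((1000, 20))
    idx = rng.choice(1000, size=100, replace=False)
    C, W_pinv = nystrom_factors(X, idx)
    K = rbf_kernel(X, X)                         # formed here only to check the error
    print("Nystrom relative error:",
          np.linalg.norm(K - C @ W_pinv @ C.T) / np.linalg.norm(K))
```

As described in the abstract, the RSVD is what lets the whole dictionary matrix be updated at once (in contrast to the atom-by-atom SVD updates of K-SVD), and Nyström-style factors stand in for the explicit kernel matrix in both training and testing; the precise update rules and the perturbation and distance bounds are given in the article and its MATLAB repository.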