
Randomized algorithms for large-scale dictionary learning

Concepts
computer science · K-SVD · algorithm · singular value decomposition · kernel (algebra) · artificial intelligence · matrix (chemical analysis) · sparse approximation · mathematics · materials science · combinatorics · composite materials
Authors
Gang Wu, Jiali Yang
Source
Journal: Neural Networks [Elsevier BV]
Volume 179, Article 106628
Identifier
DOI: 10.1016/j.neunet.2024.106628
Abstract

Dictionary learning is an important sparse representation algorithm that has been widely used in machine learning and artificial intelligence. However, for the massive data sets of the big data era, classical dictionary learning algorithms are computationally expensive and can even be infeasible. To overcome this difficulty, we propose new dictionary learning methods based on randomized algorithms. The contributions of this work are as follows. First, we observe that the dictionary matrix is often numerically low-rank. Based on this property, we apply the randomized singular value decomposition (RSVD) to the dictionary matrix and propose a randomized algorithm for linear dictionary learning. Compared with the classical K-SVD algorithm, one advantage is that all the elements of the dictionary matrix can be updated simultaneously. Second, to the best of our knowledge, there are few theoretical results on why the matrix computation problems involved in dictionary learning can be solved inexactly. To fill this gap, we show the rationality of this randomized algorithm with inexact solving from a matrix perturbation analysis point of view. Third, based on the numerically low-rank property and a Nyström approximation of the kernel matrix, we propose a randomized kernel dictionary learning algorithm, and we bound the distance between the exact solution and the computed solution to show the effectiveness of the proposed algorithm. Fourth, we propose an efficient scheme for the testing stage of kernel dictionary learning. With this strategy, there is no need to form or store kernel matrices explicitly in either the training or the testing stage. Comprehensive numerical experiments are performed on several real-world data sets. The numerical results demonstrate the rationality of our strategies and show that the proposed algorithms are much more efficient than some state-of-the-art dictionary learning algorithms.
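The RSVD step mentioned in the abstract can be illustrated with a minimal NumPy sketch of a standard randomized range-finder (in the style of Halko, Martinsson, and Tropp). This is not the authors' implementation — their released code is MATLAB — and the function and parameter names here are our own illustrative choices:

```python
import numpy as np

def rsvd(A, k, p=5, seed=0):
    """Randomized SVD: rank-k approximation of A with oversampling p."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    # Sketch the column space of A with a Gaussian test matrix.
    Omega = rng.standard_normal((n, k + p))
    Y = A @ Omega                        # (m, k+p) sample of the range of A
    Q, _ = np.linalg.qr(Y)               # orthonormal basis for the sketch
    # Project A onto the small subspace and take an exact SVD there.
    B = Q.T @ A                          # (k+p, n)
    Uh, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ Uh
    return U[:, :k], s[:k], Vt[:k, :]

# Example: a numerically low-rank "dictionary" matrix of exact rank 10.
rng = np.random.default_rng(1)
D = rng.standard_normal((200, 10)) @ rng.standard_normal((10, 100))
U, s, Vt = rsvd(D, k=10)
err = np.linalg.norm(D - (U * s) @ Vt) / np.linalg.norm(D)
print(err)  # essentially zero, since D has exact rank 10
```

The cost is dominated by two matrix products with a tall-skinny sketch instead of a full SVD, which is the source of the speedup when the dictionary matrix is numerically low-rank.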
The MATLAB codes of the proposed algorithms are publicly available from https://github.com/Jiali-yang/RALDL_RAKDL.