Computer science
Machine learning
Scalability
Artificial intelligence
Big data
Blueprint
Proportion (ratio)
Data science
Open research
Data mining
Quantum mechanics
Database
Mechanical engineering
Physics
Engineering
World Wide Web
Authors
Meng Wang,Weijie Fu,Xiangnan He,Shijie Hao,Xindong Wu
Source
Journal: IEEE Transactions on Knowledge and Data Engineering
[Institute of Electrical and Electronics Engineers]
Date: 2020-01-01
Volume/Issue: 1-1
Citations: 81
Identifier
DOI: 10.1109/tkde.2020.3015777
Abstract
Machine learning can provide deep insights into data, allowing machines to make high-quality predictions, and it has been widely used in real-world applications such as text mining, visual classification, and recommender systems. However, most sophisticated machine learning approaches incur huge time costs when operating on large-scale data. This issue calls for Large-scale Machine Learning (LML), which aims to learn patterns from big data efficiently while retaining comparable performance. In this paper, we offer a systematic survey of existing LML methods to provide a blueprint for the future development of this area. We first divide these LML methods according to how they improve scalability: 1) model simplification, which reduces computational complexity; 2) optimization approximation, which improves computational efficiency; and 3) computation parallelism, which expands computational capability. We then categorize the methods in each perspective according to their targeted scenarios and introduce representative methods in line with their intrinsic strategies. Lastly, we analyze their limitations and discuss potential directions as well as open issues that are promising to address in the future.
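As a concrete illustration of the second perspective above (optimization approximation), the following is a minimal sketch of mini-batch stochastic gradient descent for logistic regression: each update approximates the full-data gradient with a small random subset, cutting per-iteration cost from O(n) to O(b). This example is not taken from the surveyed paper; the synthetic dataset, dimensions, and hyperparameters are assumptions chosen purely for illustration.

```python
import numpy as np

# Minimal sketch of "optimization approximation": mini-batch SGD for
# logistic regression. Each step uses a random mini-batch instead of the
# full dataset, so per-iteration cost is O(batch) rather than O(n).
# The synthetic data and hyperparameters below are illustrative assumptions.

rng = np.random.default_rng(0)
n, d, batch = 100_000, 50, 256          # samples, features, mini-batch size
w_true = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = (X @ w_true + 0.1 * rng.normal(size=n) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.zeros(d)
lr = 0.1
for step in range(2_000):
    idx = rng.integers(0, n, size=batch)           # sample a mini-batch
    Xb, yb = X[idx], y[idx]
    grad = Xb.T @ (sigmoid(Xb @ w) - yb) / batch   # stochastic gradient estimate
    w -= lr * grad

acc = np.mean((sigmoid(X @ w) > 0.5) == y)
print(f"training accuracy: {acc:.3f}")
```

The same trade-off (a cheaper, noisier update in exchange for far lower per-step cost) underlies many of the optimization-approximation methods the survey covers.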