Computer science
External data representation
Data redundancy
Redundancy (engineering)
Artificial intelligence
Data mining
Computer engineering
Machine learning
Pattern recognition (psychology)
Operating system
Authors
Lulu Ge,Keshab K. Parhi
Source
Journal: IEEE Circuits and Systems Magazine (Institute of Electrical and Electronics Engineers)
Date: 2020-01-01
Volume/Issue: 20 (2): 30-47
Citations: 151
Identifier
DOI:10.1109/mcas.2020.2988388
Abstract
Hyperdimensional (HD) computing is built upon a unique data type referred to as hypervectors, whose dimension is typically in the range of tens of thousands. Proposed to solve cognitive tasks, HD computing aims at calculating similarity among its data. Data transformation is realized by three operations: addition, multiplication, and permutation. Its ultra-wide data representation introduces redundancy against noise, and since information is evenly distributed over every bit of a hypervector, HD computing is inherently robust. Additionally, owing to the nature of these three operations, HD computing offers fast learning, high energy efficiency, and acceptable accuracy in learning and classification tasks. This paper introduces the background of HD computing and reviews data representation, data transformation, and similarity measurement. The near-orthogonality of random vectors in high dimensions presents opportunities for flexible computing. To balance the tradeoff between accuracy and efficiency, strategies include, but are not limited to, encoding, retraining, binarization, and hardware acceleration. Evaluations indicate that HD computing shows great potential in addressing problems using data in the form of letters, signals, and images. In particular, HD computing shows significant promise as a lightweight classifier to replace machine learning algorithms in the field of the Internet of Things (IoT).
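The three operations and the similarity measure mentioned in the abstract can be illustrated with a minimal sketch that is not taken from the paper: it assumes bipolar (±1) hypervectors of dimension 10,000 and cosine similarity, and the helper names (random_hv, bundle, bind, permute) are illustrative only.

```python
import numpy as np

DIM = 10_000                      # hypervector dimension, typically tens of thousands
rng = np.random.default_rng(0)

def random_hv():
    """Random bipolar hypervector; random hypervectors are quasi-orthogonal in high dimensions."""
    return rng.choice([-1, 1], size=DIM)

def bundle(*hvs):
    """Addition (bundling): element-wise sum followed by a sign; the result stays similar to its inputs."""
    return np.sign(np.sum(hvs, axis=0))

def bind(a, b):
    """Multiplication (binding): element-wise product; the result is dissimilar to both inputs."""
    return a * b

def permute(hv, shift=1):
    """Permutation: cyclic shift, commonly used to encode order in sequences."""
    return np.roll(hv, shift)

def cosine(a, b):
    """Similarity measurement between two hypervectors."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Demonstration of the properties described in the abstract.
a, b, c = random_hv(), random_hv(), random_hv()
print(round(cosine(a, b), 3))           # near 0: random hypervectors are quasi-orthogonal
s = bundle(a, b, c)                     # odd count avoids ties in the sign step
print(round(cosine(s, a), 3))           # clearly positive: the bundle resembles each input
print(round(cosine(bind(a, b), a), 3))  # near 0: binding maps away from its operands
```

A binary variant (0/1 components with XOR for binding, majority vote for bundling, and Hamming distance as the similarity measure) is also common in the literature; the bipolar form above is just one convention.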