Authors
Oriol Vinyals, Charles Blundell, Timothy Lillicrap, Koray Kavukcuoglu, Daan Wierstra
Source
Journal: Cornell University - arXiv
Date: 2016-06-13
Citations: 843
Abstract
Learning from a few examples remains a key challenge in machine learning. Despite recent advances in important domains such as vision and language, the standard supervised deep learning paradigm does not offer a satisfactory solution for learning new concepts rapidly from little data. In this work, we employ ideas from metric learning based on deep neural features and from recent advances that augment neural networks with external memories. Our framework learns a network that maps a small labelled support set and an unlabelled example to its label, obviating the need for fine-tuning to adapt to new class types. We then define one-shot learning problems on vision (using Omniglot, ImageNet) and language tasks. Our algorithm improves one-shot accuracy on ImageNet from 87.6% to 93.2% and from 88.0% to 93.8% on Omniglot compared to competing approaches. We also demonstrate the usefulness of the same model on language modeling by introducing a one-shot task on the Penn Treebank.
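The core mechanism the abstract describes — mapping a small labelled support set and an unlabelled example to a label without fine-tuning — amounts to an attention-weighted nearest-neighbour classifier over learned embeddings. A minimal sketch of that prediction step (assuming precomputed embedding vectors; the function name and cosine-similarity kernel are illustrative choices, not the paper's exact architecture, which also uses full-context embeddings):

```python
import numpy as np

def matching_predict(support_embeddings, support_labels, query_embedding, n_classes):
    """Predict a label distribution for one query from a labelled support set.

    support_embeddings : (k, d) array of embedded support examples
    support_labels     : (k,) integer class labels
    query_embedding    : (d,) embedded query example
    """
    # Cosine similarity between the query and each support embedding.
    s = support_embeddings / np.linalg.norm(support_embeddings, axis=1, keepdims=True)
    q = query_embedding / np.linalg.norm(query_embedding)
    sims = s @ q
    # Softmax attention over the support set.
    a = np.exp(sims - sims.max())
    a /= a.sum()
    # Predicted distribution: attention-weighted sum of one-hot support labels.
    onehot = np.eye(n_classes)[support_labels]
    return a @ onehot
```

Because the prediction is a weighted vote over the support set, a new class is handled simply by placing examples of it in the support set — no gradient updates are needed at test time.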