Artificial intelligence
Computer science
Discriminative model
Feature learning
Pattern recognition (psychology)
Similarity learning
Artificial neural network
Margin (machine learning)
Machine learning
Parametric statistics
Competitive learning
Unsupervised learning
Feature (linguistics)
Similarity (geometry)
Contextual image classification
Deep learning
Classifier (UML)
Semi-supervised learning
Feature selection
Feature extraction
Feature vector
Benchmark (surveying)
Convolutional neural network
Boosting (machine learning)
Image (mathematics)
Mathematics
Statistics
Linguistics
Philosophy
Authors
Zhirong Wu,Yuanjun Xiong,Stella X. Yu,Dahua Lin
Source
Venue: Computer Vision and Pattern Recognition (CVPR)
Date: 2018-06-01
Citations: 1821
Identifiers
DOI:10.1109/cvpr.2018.00393
Abstract
Neural net classifiers trained on data with annotated class labels can also capture apparent visual similarity among categories without being directed to do so. We study whether this observation can be extended beyond the conventional domain of supervised learning: Can we learn a good feature representation that captures apparent similarity among instances, instead of classes, by merely asking the feature to be discriminative of individual instances? We formulate this intuition as a non-parametric classification problem at the instance level, and use noise-contrastive estimation to tackle the computational challenges imposed by the large number of instance classes. Our experimental results demonstrate that, under unsupervised learning settings, our method surpasses the state-of-the-art on ImageNet classification by a large margin. Our method is also remarkable for consistently improving test performance with more training data and better network architectures. By fine-tuning the learned feature, we further obtain competitive results for semi-supervised learning and object detection tasks. Our non-parametric model is highly compact: With 128 features per image, our method requires only 600MB storage for a million images, enabling fast nearest neighbour retrieval at run time.
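The abstract describes an instance-level, non-parametric classification objective trained with noise-contrastive estimation over a compact bank of 128-dimensional features. The sketch below is a minimal illustration of that idea, not the authors' implementation: it keeps a memory bank of per-image embeddings, scores a batch against every stored instance with a temperature-scaled softmax, and retrieves nearest neighbours by cosine similarity at run time. The temperature, bank size, and function names are assumptions, and the dense softmax stands in for the NCE approximation the paper actually uses to avoid normalising over a million instance classes.

```python
import torch
import torch.nn.functional as F

# Illustrative sketch (not the authors' code): instance-level non-parametric
# classification against a memory bank of per-image embeddings.
# feature_dim = 128 comes from the abstract; temperature and bank size are
# assumptions chosen so the demo runs instantly. At paper scale the bank is
# 1e6 images x 128 dims x 4 bytes (fp32) ~= 512 MB, consistent with the
# ~600 MB storage figure quoted for a million images.

feature_dim = 128
temperature = 0.07            # assumed softmax temperature
num_images = 10_000           # shrunk from 1e6 for a quick, runnable demo

# Memory bank: one L2-normalised 128-d vector per training image.
memory_bank = F.normalize(torch.randn(num_images, feature_dim), dim=1)

def instance_discrimination_loss(features, indices):
    """Full instance-level softmax: every image is treated as its own class.

    The paper replaces this dense softmax with noise-contrastive estimation
    to avoid normalising over all instance classes; the dense version is kept
    here only because it is shorter to read.

    features: (batch, 128) L2-normalised embeddings from the backbone.
    indices:  (batch,) index of each image, used as its own label.
    """
    logits = features @ memory_bank.t() / temperature   # (batch, num_images)
    return F.cross_entropy(logits, indices)

def nearest_neighbours(query, k=5):
    """Run-time retrieval: cosine similarity against the compact bank."""
    sims = F.normalize(query, dim=1) @ memory_bank.t()
    return sims.topk(k, dim=1).indices

# Example usage with random stand-ins for backbone outputs:
batch_feats = F.normalize(torch.randn(8, feature_dim), dim=1)
batch_idx = torch.randint(0, num_images, (8,))
print(instance_discrimination_loss(batch_feats, batch_idx).item())
print(nearest_neighbours(batch_feats[:1]))
```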