Artificial neural network
Computer science
Field (mathematical analysis)
Artificial intelligence
Encoding (memory)
Domain knowledge
Convolutional neural network
Machine learning
Space (punctuation)
Mathematics
Operating system
Mathematical analysis
Author
Russell J. Stewart, Stefano Ermon
Source
Journal: Cornell University - arXiv
Date: 2016-01-01
Identifier
DOI: 10.48550/arxiv.1609.05566
Abstract
In many machine learning applications, labeled data is scarce and obtaining more labels is expensive. We introduce a new approach to supervising neural networks by specifying constraints that should hold over the output space, rather than direct examples of input-output pairs. These constraints are derived from prior domain knowledge, e.g., from known laws of physics. We demonstrate the effectiveness of this approach on real world and simulated computer vision tasks. We are able to train a convolutional neural network to detect and track objects without any labeled examples. Our approach can significantly reduce the need for labeled training data, but introduces new challenges for encoding prior knowledge into appropriate loss functions.
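The core idea in the abstract, supervising outputs with a known physical law instead of labels, can be illustrated with a minimal sketch. The example below is an assumption-laden toy, not the paper's implementation: for an object in free fall, any valid height trajectory y(t) = y0 + v0·t − (g/2)·t² has a constant second finite difference of −g·Δt², so a loss can penalize deviation from that constant without ever seeing a labeled height.

```python
def constraint_loss(preds, dt, g=9.8):
    """Mean squared violation of the free-fall constraint.

    preds: predicted heights at uniformly spaced times (len >= 3)
    dt:    time step between consecutive predictions
    """
    # Physics says every second difference of a true trajectory
    # equals -g * dt^2; penalize squared deviation from it.
    target = -g * dt * dt
    total = 0.0
    for i in range(len(preds) - 2):
        second_diff = preds[i + 2] - 2 * preds[i + 1] + preds[i]
        total += (second_diff - target) ** 2
    return total / (len(preds) - 2)
```

A network predicting heights from video frames could be trained by backpropagating this loss through its outputs; an exact parabolic trajectory incurs (near-)zero loss, while any trajectory violating the law is penalized, which is the label-free supervision signal the abstract describes.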