Keywords
Computer science, Domain knowledge, Domain (mathematical analysis), Artificial neural network, Artificial intelligence, Machine learning, Deep learning, Monotonic function, Function (biology), Deep neural network, Data modeling, Data mining, Mathematics, Mathematical analysis, Biology, Evolutionary biology, Database
Authors
Nikhil Muralidhar,Mohammad Raihanul Islam,Manish Marwah,Anuj Karpatne,Naren Ramakrishnan
Identifier
DOI:10.1109/bigdata.2018.8621955
Abstract
In recent years, the large amount of labeled data available has also helped steer research toward using minimal domain knowledge, e.g., in deep neural network research. However, in many situations, data is limited and of poor quality. Can domain knowledge be useful in such a setting? In this paper, we propose domain adapted neural networks (DANN) to explore how domain knowledge can be integrated into model training for deep networks. In particular, we incorporate loss terms for knowledge available as monotonicity constraints and approximation constraints. We evaluate our model on both synthetic data generated using the popular Bohachevsky function and a real-world dataset for predicting oxygen solubility in water. In both situations, we find that our DANN model outperforms its domain-agnostic counterpart, yielding an overall mean performance improvement of 19.5%, with worst- and best-case performance improvements of 4% and 42.7%, respectively.
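The abstract describes adding loss terms for two kinds of domain knowledge: monotonicity constraints (predictions should not decrease as an input increases) and approximation constraints (predictions should stay near a domain-supplied approximation). A minimal sketch of how such penalty terms might look, using hinge-style penalties and including the Bohachevsky function mentioned for the synthetic data; the exact loss formulation, tolerance `eps`, and weighting in the paper may differ:

```python
import numpy as np

def monotonicity_penalty(x, y_pred):
    """Hinge-style penalty: zero when predictions are non-decreasing
    in x, growing with the size of each monotonicity violation.
    (Illustrative; not necessarily the paper's exact formulation.)"""
    order = np.argsort(x)              # sort predictions by the input
    diffs = np.diff(y_pred[order])     # consecutive prediction differences
    return np.sum(np.maximum(0.0, -diffs))  # penalize negative steps

def approximation_penalty(y_pred, y_approx, eps=0.1):
    """Penalize predictions that stray more than a tolerance eps
    (hypothetical parameter) from a domain-supplied approximation."""
    return np.sum(np.maximum(0.0, np.abs(y_pred - y_approx) - eps))

def bohachevsky(x1, x2):
    """Bohachevsky function (variant 1), used in the paper to
    generate synthetic data."""
    return (x1**2 + 2.0 * x2**2
            - 0.3 * np.cos(3.0 * np.pi * x1)
            - 0.4 * np.cos(4.0 * np.pi * x2) + 0.7)
```

In a training loop, these penalties would typically be added to the standard empirical loss with tuning weights, so that constraint violations are discouraged even where labeled data is sparse or noisy.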