Keywords
Computer science, Ontology, Sentence, Natural language processing, Transformer, Upper ontology, Domain (mathematical analysis), Artificial intelligence, Context (archaeology), Focus (optics), Domain knowledge, Information retrieval, Epistemology, Optics, Physics, Mathematical analysis, Philosophy, Paleontology, Biology, Voltage, Quantum mechanics, Mathematics
Authors
Alcides Gonçalves Lopes, Joel Luís Carbonera, Daniela Schmidt, Luan Fonseca Garcia, Fabrício Henrique Rodrigues, Mara Abel
Identifier
DOI:10.1016/j.knosys.2023.110385
Abstract
The classification of domain entities into top-level ontology concepts remains an activity performed manually by an ontology engineer. Although some works automate this task with machine-learning approaches that take textual sentences as input, they require the domain entities to exist in external knowledge resources, such as pre-trained embedding models. In this context, this work proposes an approach that combines the term representing the domain entity and its informal definition into a single text sentence, without requiring external knowledge resources. This sentence is then used as the input to a deep neural network that contains a language model as a layer. We also present a methodology for extracting two novel datasets from the OntoWordNet ontology, based on the Dolce-Lite and Dolce-Lite-Plus top-level ontologies. Our experiments show that, using transformer-based language models, we achieve promising results in classifying domain entities into 82 top-level ontology concepts, reaching a micro F1-score of 94%.
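The following is a minimal sketch of the kind of classifier the abstract describes: a transformer encoder used as a layer inside a neural network that maps a domain entity, expressed as a single sentence combining its term and informal definition, to one of 82 top-level ontology concepts. The encoder choice (bert-base-uncased via the HuggingFace transformers library), the colon-based concatenation format, and the example entity are illustrative assumptions, not the authors' exact implementation.

import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

class TopLevelConceptClassifier(nn.Module):
    """Classifies a 'term: definition' sentence into a top-level ontology concept."""

    def __init__(self, encoder_name: str = "bert-base-uncased", num_classes: int = 82):
        super().__init__()
        # Pre-trained language model used as a layer of the network.
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden_size = self.encoder.config.hidden_size
        # Linear head over the 82 top-level ontology concepts.
        self.head = nn.Linear(hidden_size, num_classes)

    def forward(self, input_ids, attention_mask):
        outputs = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        # Use the [CLS] token representation as the sentence embedding.
        cls_embedding = outputs.last_hidden_state[:, 0, :]
        return self.head(cls_embedding)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = TopLevelConceptClassifier()

# The term and its informal definition are combined into one sentence,
# so no external knowledge resource (e.g., pre-trained entity embeddings) is needed.
# Hypothetical example entity for illustration only.
term, definition = "granite", "a coarse-grained intrusive igneous rock"
batch = tokenizer(f"{term}: {definition}", return_tensors="pt", truncation=True)
logits = model(batch["input_ids"], batch["attention_mask"])
predicted_concept = logits.argmax(dim=-1)  # index into the 82 top-level concepts

In practice such a model would be fine-tuned end to end on the extracted datasets with a standard cross-entropy loss; the sketch only shows the input construction and the network structure.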