Computer science
Table (database)
Margin (machine learning)
Text generation
Transformer
Embedding
Matching (statistics)
Artificial intelligence
Natural language processing
Metric (unit)
Data mining
Information retrieval
Theoretical computer science
Machine learning
Statistics
Voltage
Economics
Physics
Quantum mechanics
Mathematics
Operations management
Authors
Zhenyi Wang, Xiaoyang Wang, Bang An, Dong Yu, Changyou Chen
Identifier
DOI: 10.18653/v1/2020.acl-main.101
Abstract
Text generation from a knowledge base aims to translate knowledge triples into natural language descriptions. Most existing methods ignore the faithfulness between a generated text description and the original table, leading to generated information that goes beyond the content of the table. In this paper, for the first time, we propose a novel Transformer-based generation framework to achieve this goal. The core techniques in our method for enforcing faithfulness include a new table-text optimal-transport matching loss and a table-text embedding similarity loss, both based on the Transformer model. Furthermore, to evaluate faithfulness, we propose a new automatic metric specialized to the table-to-text generation problem. We also provide a detailed analysis of each component of our model in our experiments. Automatic and human evaluations show that our framework outperforms the state of the art by a large margin.
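The abstract names two content-matching losses without giving implementation details here. Below is a minimal, hypothetical PyTorch sketch (not the authors' code) of how such losses could be computed, assuming table cells and generated tokens are already encoded as embedding matrices table_emb (m, d) and text_emb (n, d); the uniform marginals, cosine cost, and entropic Sinkhorn approximation of optimal transport are all assumptions of this sketch.

    import torch
    import torch.nn.functional as F

    def sinkhorn_ot_loss(table_emb, text_emb, n_iters=50, eps=0.1):
        # Cost matrix: cosine distance between each table-cell embedding (m, d)
        # and each generated-token embedding (n, d).
        cost = 1.0 - F.cosine_similarity(
            table_emb.unsqueeze(1), text_emb.unsqueeze(0), dim=-1)  # (m, n)
        m, n = cost.shape
        # Hypothetical choice: uniform marginal weights over cells and tokens.
        mu = torch.full((m,), 1.0 / m)
        nu = torch.full((n,), 1.0 / n)
        K = torch.exp(-cost / eps)  # Gibbs kernel for entropic regularization
        u = torch.ones(m)
        for _ in range(n_iters):  # Sinkhorn fixed-point iterations
            v = nu / (K.t() @ u)
            u = mu / (K @ v)
        plan = u.unsqueeze(1) * K * v.unsqueeze(0)  # approximate transport plan
        return (plan * cost).sum()  # total transport cost, to be minimized

    def embedding_similarity_loss(table_emb, text_emb):
        # Penalize dissimilarity between mean-pooled table and text embeddings.
        return 1.0 - F.cosine_similarity(
            table_emb.mean(dim=0), text_emb.mean(dim=0), dim=0)

    # Usage sketch with random tensors standing in for real encoder outputs.
    table_emb = torch.randn(8, 256)   # 8 table cells, 256-dim embeddings
    text_emb = torch.randn(20, 256)   # 20 generated tokens
    loss = sinkhorn_ot_loss(table_emb, text_emb) \
        + embedding_similarity_loss(table_emb, text_emb)

The Sinkhorn loop is a standard entropic approximation of optimal transport; the paper may use a different cost function, marginal weighting, or matching scheme than assumed here.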