Authors
Kazutoshi Shinoda, Akiko Aizawa
Source
Journal: Cornell University - arXiv
Date: 2020-04-07
Citations: 1
Abstract
We present a deep generative model of question-answer (QA) pairs for machine reading comprehension. We introduce two independent latent random variables into our model in order to diversify answers and questions separately. We also study the effect of explicitly controlling the KL term in the variational lower bound in order to avoid the posterior collapse issue, where the model ignores latent variables and generates QA pairs that are almost the same. Our experiments on SQuAD v1.1 showed that variational methods can aid QA pair modeling capacity, and that the controlled KL term can significantly improve diversity while generating high-quality questions and answers comparable to those of the existing systems.
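The abstract states that the KL term in the variational lower bound is explicitly controlled to avoid posterior collapse, with two independent latent variables diversifying answers and questions separately, but it does not spell out the exact formulation. The sketch below is a minimal illustration assuming a common target-KL (capacity-control) style penalty of the form γ·|KL − C| applied to each latent independently; the function names, target values, and weights (controlled_elbo_loss, target_kl_q, gamma, etc.) are illustrative assumptions, not the authors' implementation.

```python
import torch

def gaussian_kl(mu, logvar):
    # KL(N(mu, sigma^2) || N(0, I)) summed over latent dims, averaged over batch
    return 0.5 * torch.sum(mu.pow(2) + logvar.exp() - logvar - 1, dim=-1).mean()

def controlled_elbo_loss(recon_nll_q, recon_nll_a,
                         mu_q, logvar_q, mu_a, logvar_a,
                         target_kl_q=5.0, target_kl_a=5.0, gamma=10.0):
    """Negative ELBO with explicitly controlled KL terms (hypothetical sketch).

    recon_nll_q / recon_nll_a: reconstruction NLL from the question / answer decoders.
    (mu_q, logvar_q), (mu_a, logvar_a): posteriors of the two independent latents.
    target_kl_*: nonzero KL values each latent is pushed toward (illustrative).
    gamma: weight of the |KL - target| penalty (illustrative).
    """
    kl_q = gaussian_kl(mu_q, logvar_q)
    kl_a = gaussian_kl(mu_a, logvar_a)
    # Rather than minimizing KL directly (which can collapse it to 0, so the
    # model ignores the latents), penalize deviation from a nonzero target
    # so both latents stay informative and QA pairs remain diverse.
    kl_penalty = gamma * ((kl_q - target_kl_q).abs() + (kl_a - target_kl_a).abs())
    return recon_nll_q + recon_nll_a + kl_penalty

# Toy usage with random posterior statistics
B, D = 4, 16
loss = controlled_elbo_loss(
    recon_nll_q=torch.tensor(3.2), recon_nll_a=torch.tensor(1.1),
    mu_q=torch.randn(B, D), logvar_q=torch.randn(B, D),
    mu_a=torch.randn(B, D), logvar_a=torch.randn(B, D),
)
print(loss.item())
```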