Automatic summarization
Computer science
Transformer
Health records
Task (project management)
Artificial intelligence
Natural language processing
Information retrieval
Machine learning
Health care
Engineering
Economic growth
Electrical engineering
Economics
Voltage
Systems engineering
Authors
Jangyeong Jeon, Junyeong Kim
Identifier
DOI: 10.1109/iceic57457.2023.10049932
Abstract
The self-attention mechanism in the Transformer has been highly successful. However, it struggles to achieve adequate performance on electronic health record (EHR) and text summarization tasks that operate on long sequences, because of the computational complexity of the self-attention calculation. This paper therefore surveys proposed sparse attention methods that operate on long sequences and examines a case in which they improve performance on the EHR task.
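To make the complexity argument in the abstract concrete, the following is a minimal NumPy sketch contrasting dense self-attention, whose (n, n) score matrix makes cost grow quadratically in sequence length n, with a sliding-window pattern, one common sparse attention scheme. This is an illustration of the general technique, not the specific methods surveyed in the paper; the function names and the window size w are assumptions chosen for the example.

import numpy as np

def softmax(x):
    # Subtract the max for numerical stability before exponentiating.
    x = x - x.max(axis=-1, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=-1, keepdims=True)

def full_attention(Q, K, V):
    # Dense self-attention: the (n, n) score matrix is the O(n^2) bottleneck.
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    return softmax(scores) @ V

def sliding_window_attention(Q, K, V, w=4):
    # Sparse attention: token i attends only to tokens within distance w,
    # so time and memory drop to O(n * w) instead of O(n^2).
    n, d = Q.shape
    out = np.zeros_like(V)
    for i in range(n):
        lo, hi = max(0, i - w), min(n, i + w + 1)
        scores = Q[i] @ K[lo:hi].T / np.sqrt(d)
        out[i] = softmax(scores) @ V[lo:hi]
    return out

# Toy comparison on 16 random 8-dimensional token embeddings.
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((16, 8)) for _ in range(3))
print(full_attention(Q, K, V).shape)            # (16, 8)
print(sliding_window_attention(Q, K, V).shape)  # (16, 8)

For long clinical notes in EHR data, the windowed variant keeps the per-token attention span fixed as n grows, which is the kind of trade-off the surveyed sparse attention methods exploit.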