The self-attention mechanism in the Transformer has been highly successful so far. However, it struggles to achieve adequate performance on electronic health record (EHR) and text summarization tasks that operate on long sequences, due to the quadratic computational complexity of the self-attention calculation. Therefore, this paper surveys proposed methods that use sparse attention to handle long sequences and examines how they improve performance on the EHR task.