Online class-incremental learning aims to learn from a continuous, single-pass data stream. During learning, however, catastrophic forgetting often occurs, erasing knowledge of previously seen classes. Most existing methods for mitigating catastrophic forgetting do not fully exploit the semantic information in the single-pass data stream. To address this issue, we propose an Attention-based Dual-View Consistency (ADVC) strategy. Specifically, we use an attention mechanism to assign a score to each region, identifying the features most useful for classification. This enables the model to thoroughly exploit the semantic information in the single-pass data stream and improves prediction accuracy. In addition, we consider the effectiveness of sample retrieval from a second view, the samples themselves, by applying data augmentation to the memory buffer to generate variants of old-class samples, further alleviating catastrophic forgetting. Extensive experiments demonstrate that our approach outperforms state-of-the-art methods on a range of benchmark datasets.
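As a rough illustration of the two views described above, the following PyTorch sketch shows how spatial attention scores might reweight feature-map regions and how memory-buffer samples might be augmented before replay. The names `RegionAttention` and `augment_memory_batch` are illustrative assumptions for this sketch, not the paper's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RegionAttention(nn.Module):
    """Illustrative spatial attention: scores each region of a feature map
    so that classification focuses on semantically important areas."""
    def __init__(self, channels):
        super().__init__()
        # One scalar attention logit per spatial region.
        self.score = nn.Conv2d(channels, 1, kernel_size=1)

    def forward(self, feats):                          # feats: (B, C, H, W)
        b, c, h, w = feats.shape
        logits = self.score(feats).view(b, -1)         # (B, H*W)
        attn = F.softmax(logits, dim=1).view(b, 1, h, w)
        return feats * attn                            # attention-weighted features

def augment_memory_batch(mem_images):
    """Illustrative augmentation of replayed old-class samples:
    a horizontal flip produces a simple variant for the second view."""
    return torch.flip(mem_images, dims=[-1])

# Usage with dummy tensors:
feats = torch.randn(8, 64, 7, 7)       # backbone features for a mini-batch
weighted = RegionAttention(64)(feats)  # semantic view: (8, 64, 7, 7)

mem = torch.randn(8, 3, 32, 32)        # old-class images from the memory buffer
variants = augment_memory_batch(mem)   # sample view: augmented replay batch
```

In this sketch, a consistency objective between predictions on a memory sample and its augmented variant (e.g., a KL-divergence term) would tie the two views together; the exact loss is left unspecified here, as the abstract does not detail it.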