ABSTRACT This paper introduces an enhanced BERT-DPCNN model for Chinese news text classification. Existing models struggle to balance accuracy and computational efficiency, especially on large-scale, high-dimensional text data. To address this, the proposed model combines BERT's pre-trained language representations with DPCNN's efficient convolutional structure to capture deep semantic information and key features from the text. In addition, the zebra optimization algorithm (ZOA) is incorporated to optimize the model's hyperparameters dynamically, overcoming the limitations of manual tuning in traditional models. By automatically tuning hyperparameters such as the batch size, learning rate, and number of convolutional filters, ZOA significantly improves the model's classification performance. Experimental results on the THUCNews Chinese news dataset show that the resulting ZOA-BERT-DPCNN model outperforms traditional methods, confirming its effectiveness for news text classification and its potential to further improve classification performance.
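The ZOA-driven hyperparameter search described above can be sketched as follows. This is a minimal illustration of the zebra optimization algorithm's two phases (foraging toward the pioneer zebra, then defense against predators), not the paper's implementation: the search ranges are illustrative, and a toy quadratic function stands in for the BERT-DPCNN model's validation loss, which in practice would require a full training run per candidate.

```python
import random

# Illustrative search ranges (assumptions, not the paper's exact settings)
SPACE = {
    "batch_size":    (16.0, 128.0),
    "learning_rate": (1e-5, 1e-3),
    "num_filters":   (64.0, 512.0),
}

def clamp(x, lo, hi):
    """Keep a candidate value inside its search range."""
    return max(lo, min(hi, x))

def zoa_optimize(fitness, space, pop_size=10, iters=30, seed=0):
    """Simplified zebra optimization algorithm (ZOA), minimizing `fitness`.

    Phase 1 (foraging): each zebra moves toward the pioneer (best) zebra.
    Phase 2 (defense): a zebra either flees a predator with a small,
    shrinking random step, or moves toward another randomly chosen zebra.
    A move is kept only if it improves the zebra's fitness.
    """
    rng = random.Random(seed)
    keys = list(space)
    pop = [{k: rng.uniform(*space[k]) for k in keys} for _ in range(pop_size)]
    scores = [fitness(z) for z in pop]
    best_i = min(range(pop_size), key=scores.__getitem__)
    best, best_score = dict(pop[best_i]), scores[best_i]

    for t in range(1, iters + 1):
        for i in range(pop_size):
            # Phase 1: foraging -- step toward the pioneer zebra (current best)
            z = pop[i]
            cand = {}
            for k in keys:
                r, I = rng.random(), rng.choice((1, 2))
                cand[k] = clamp(z[k] + r * (best[k] - I * z[k]), *space[k])
            if fitness(cand) < scores[i]:
                pop[i], scores[i] = cand, fitness(cand)

            # Phase 2: defense against predators
            z = pop[i]
            cand = {}
            if rng.random() < 0.5:
                # S1: flee a lion -- random walk with a radius that shrinks over time
                for k in keys:
                    lo, hi = space[k]
                    step = 0.01 * (2 * rng.random() - 1) * (1 - t / iters) * (hi - lo)
                    cand[k] = clamp(z[k] + step, lo, hi)
            else:
                # S2: move toward a randomly chosen herd-mate under attack
                j = rng.randrange(pop_size)
                for k in keys:
                    r, I = rng.random(), rng.choice((1, 2))
                    cand[k] = clamp(z[k] + r * (pop[j][k] - I * z[k]), *space[k])
            if fitness(cand) < scores[i]:
                pop[i], scores[i] = cand, fitness(cand)
            if scores[i] < best_score:
                best, best_score = dict(pop[i]), scores[i]
    return best, best_score

# Hypothetical stand-in for the model's validation loss: minimized at
# batch_size=64, learning_rate=2e-4, num_filters=250.
def toy_fitness(h):
    return ((h["batch_size"] - 64) ** 2 / 1e3
            + (h["learning_rate"] - 2e-4) ** 2 * 1e6
            + (h["num_filters"] - 250) ** 2 / 1e4)

best, score = zoa_optimize(toy_fitness, SPACE)
print(best, score)
```

In the paper's setting, `toy_fitness` would be replaced by a function that trains the BERT-DPCNN classifier with the candidate hyperparameters and returns its validation error, which is what makes the search expensive but fully automatic.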