Feature selection (FS) aims to remove irrelevant and redundant features in order to improve classification accuracy, and it is generally regarded as an NP-hard problem. Recently, particle swarm optimization (PSO) has shown promise for FS, but most previous PSO-based FS methods use an implicit representation, in which the particle length equals the number of original features. Such a representation not only incurs high memory and computational costs but also leads to a large search space when applied to high-dimensional data. In this paper, we propose a novel representation scheme called explicit representation (i.e., each particle directly encodes its selected feature subset) and redefine the particle update strategy for this new representation. Moreover, we adopt a feature grouping strategy that divides the original feature set into multiple groups according to feature importance. Finally, a size-adaptive expansion strategy is proposed, in which the swarm automatically determines the next feature group to use when enlarging the particle. The proposed algorithm, called ESAPSO, effectively reduces the particle size as well as the computational cost and memory consumption. We compare the proposed ESAPSO with several state-of-the-art algorithms on ten benchmark datasets. Experimental results show that ESAPSO usually achieves better classification performance with feature subsets of similar or smaller size. This study provides valuable and novel insight into particle representation for PSO-based feature selection.
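As a minimal illustration of the representation difference described above (a sketch only, not the paper's implementation), the following Python snippet contrasts an implicit binary-mask particle, whose length equals the number of original features, with an explicit particle that stores only the indices of the selected features. All names, sizes, and values here are illustrative assumptions.

```python
import numpy as np

# Illustrative sizes (assumptions, not taken from the paper's experiments).
n_features = 10_000   # dimensionality of the original feature set
k_selected = 50       # size of the selected feature subset

rng = np.random.default_rng(0)
selected = rng.choice(n_features, size=k_selected, replace=False)

# Implicit representation: one entry per original feature (length = n_features).
implicit_particle = np.zeros(n_features, dtype=np.int8)
implicit_particle[selected] = 1

# Explicit representation: the particle holds only the selected feature indices,
# so its length equals the subset size rather than n_features.
explicit_particle = np.sort(selected)

print(implicit_particle.nbytes)  # ~10,000 bytes per particle
print(explicit_particle.nbytes)  # ~400 bytes per particle
```

Under these assumed sizes, the explicit encoding stores orders of magnitude fewer entries per particle, which is the memory and search-space advantage the abstract attributes to ESAPSO.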