Person re-identification refers to matching people across non-overlapping camera views at different locations and times. Under changes in viewpoint, illumination, background, occlusion, and clothing, traditional methods cannot recognize persons effectively and reliably. In this paper, we propose a novel biometric metric learning method named Human Skeleton Mutual Learning person re-identification (HSMLP-Reid). HSMLP-Reid combines a new pedestrian local segmentation method proposed in this paper with global skeleton information to reduce the influence of background and local posture changes. First, a bottom-up method estimates the pedestrian's pose and skeleton, marking the joint points in the process. A new local segmentation method proposed in this paper, named joint segmentation, segments pedestrians locally and performs local block matching. Furthermore, we learn global skeleton information from joint distances defined on the 2D skeleton estimated by the bottom-up method, and use this information for global skeleton matching. Finally, local matching and global skeleton matching are trained by mutual learning. Both the local matching based on pedestrian joints and the global skeleton matching based on the pedestrian skeleton are grounded in biometrics. The model is trained with a classification loss and a metric learning loss; the metric loss combines the global skeleton distance and the local block metric distance. Extensive experiments on the large-scale Market-1501, CUHK03, and CUHK-SYSU datasets demonstrate that the proposed method achieves consistently superior performance and outperforms most state-of-the-art methods.
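The global skeleton descriptor built from pairwise joint distances can be sketched as follows; this is a minimal illustration assuming 2D keypoints from a bottom-up pose estimator, and the function name, joint count, and max-distance normalization are illustrative choices, not the paper's exact formulation:

```python
import numpy as np

def joint_distance_features(keypoints):
    """Build a global skeleton descriptor from 2D joint coordinates.

    keypoints: (J, 2) array of (x, y) joint positions produced by a
    bottom-up pose estimator (e.g. the joints it marks per pedestrian).
    Returns the unique pairwise joint distances, normalized by the
    largest distance so the descriptor is scale-invariant.
    """
    kp = np.asarray(keypoints, dtype=float)
    diff = kp[:, None, :] - kp[None, :, :]   # (J, J, 2) coordinate differences
    dist = np.linalg.norm(diff, axis=-1)     # (J, J) Euclidean distance matrix
    iu = np.triu_indices(len(kp), k=1)       # keep each joint pair once
    feats = dist[iu]
    scale = feats.max()
    return feats / scale if scale > 0 else feats

# Toy skeleton with 4 joints at the corners of a unit square
kp = [(0, 0), (0, 1), (1, 0), (1, 1)]
f = joint_distance_features(kp)
print(f.shape)  # (6,) — one distance per joint pair
```

Two such descriptors can then be compared with a simple distance (e.g. Euclidean) to serve as a global skeleton distance term in the metric loss.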