Graph Neural Networks (GNNs) are among the primary methods for molecular property prediction because they learn state-of-the-art representations from graph-structured molecular data. The Federated Learning (FL) paradigm, which allows multiple parties to collaborate on model training without sharing local data, is a natural fit for improving performance across such parties. In FL, however, the molecular graph data held by clients are not only non-independent and identically distributed (Non-IID) but also skewed in quantity. In this paper, we propose GFedKRL, a framework that clusters the graph embeddings uploaded by clients and then performs knowledge distillation and re-learning during client-server interaction within each cluster. We also analyze the risk of privacy leakage in GFedKRL and propose personalized local differential privacy, which protects privacy while better controlling the amount of injected noise, thereby improving model performance. In addition, to resist the impact of noisy data on client models, graph representation learning is strengthened with knowledge contrastive learning at the local clients. Finally, our approach outperforms four public benchmark methods on three experimental datasets.
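To make the personalized local differential privacy step concrete, the following is a minimal sketch of how a client might perturb its graph embedding before uploading it, assuming L2 clipping and the standard Gaussian mechanism; the function name `perturb_embedding` and the parameters `epsilon`, `delta`, and `clip_norm` are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def perturb_embedding(z, epsilon, delta=1e-5, clip_norm=1.0, rng=None):
    """Clip a client's graph embedding and add Gaussian noise calibrated
    to that client's personal privacy budget epsilon (assumed mechanism:
    smaller epsilon -> more noise -> stronger privacy)."""
    rng = rng or np.random.default_rng()
    # L2-clip so the mechanism's sensitivity is bounded by clip_norm.
    z = z * min(1.0, clip_norm / (np.linalg.norm(z) + 1e-12))
    # Standard Gaussian-mechanism noise scale for (epsilon, delta)-DP.
    sigma = clip_norm * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return z + rng.normal(0.0, sigma, size=z.shape)

# Each client can choose its own budget, e.g. a privacy-sensitive client
# might use epsilon=0.5 while a more tolerant one uses epsilon=4.0.
noisy = perturb_embedding(np.random.randn(128), epsilon=1.0)
```

The "personalized" aspect in this sketch is simply that each client selects its own `epsilon`, trading privacy against the utility of its uploaded embedding.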
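The local knowledge contrastive learning could likewise be instantiated as an InfoNCE-style loss that pulls a client's graph embedding toward the cluster-level knowledge received from the server and pushes it away from other clusters' prototypes; this is a hedged sketch under those assumptions, and `contrastive_loss`, `tau`, and the prototype tensors are hypothetical names rather than the paper's definitions.

```python
import torch
import torch.nn.functional as F

def contrastive_loss(z_local, z_positive, negatives, tau=0.5):
    """InfoNCE-style loss: treat the server-distilled cluster knowledge
    z_positive as the positive view of the local embedding z_local, and
    the prototypes of other clusters (negatives, shape (K, d)) as negatives."""
    z = F.normalize(z_local, dim=-1)
    pos = F.normalize(z_positive, dim=-1)
    neg = F.normalize(negatives, dim=-1)
    l_pos = (z * pos).sum(-1, keepdim=True) / tau   # (1,) similarity to positive
    l_neg = neg @ z / tau                           # (K,) similarities to negatives
    logits = torch.cat([l_pos, l_neg]).unsqueeze(0) # positive sits at index 0
    return F.cross_entropy(logits, torch.zeros(1, dtype=torch.long))

# Example: a 128-d local embedding contrasted against 8 negative prototypes.
loss = contrastive_loss(torch.randn(128), torch.randn(128), torch.randn(8, 128))
```

Under these assumptions, minimizing the loss makes noisy local embeddings consistent with the aggregated cluster knowledge, which is one plausible way to resist the impact of noisy data mentioned above.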