Federated learning, which allows distributed medical institutions to train a shared deep learning model while preserving privacy, has recently gained increasing attention. In practice, however, data heterogeneity across hospitals degrades model performance during training. In this paper, we propose a federated contrastive learning (FedCL) approach that integrates the idea of contrastive learning into the federated learning framework. Specifically, FedCL contrasts the local model with the global model, so that the local model gradually approaches the global model as the communication rounds increase, which improves the generalization ability of the model. We validate our method on two public datasets. Extensive experiments show that our method outperforms other federated learning algorithms on medical image classification.
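As a rough illustration of the contrastive objective described above, the following NumPy sketch treats the global model's representation of an input batch as the positive pair for the local representation, and the previous-round local model's representation as the negative, in the style of model-contrastive learning. This is a hypothetical sketch, not the paper's exact formulation: the loss form, function names, and temperature value are all assumptions.

```python
import numpy as np

def cosine_sim(a, b):
    """Row-wise cosine similarity between two (batch, dim) arrays."""
    a = a / np.linalg.norm(a, axis=-1, keepdims=True)
    b = b / np.linalg.norm(b, axis=-1, keepdims=True)
    return np.sum(a * b, axis=-1)

def model_contrastive_loss(z_local, z_global, z_prev, temperature=0.5):
    """Hypothetical contrastive term: pull the local representation z_local
    toward the global model's representation z_global (positive) and away
    from the previous local model's representation z_prev (negative).
    Equivalent to softmax cross-entropy with the positive pair as target."""
    pos = cosine_sim(z_local, z_global) / temperature
    neg = cosine_sim(z_local, z_prev) / temperature
    # numerically stable log(exp(pos) + exp(neg)) - pos, averaged over batch
    m = np.maximum(pos, neg)
    log_denom = m + np.log(np.exp(pos - m) + np.exp(neg - m))
    return float(np.mean(log_denom - pos))
```

In training, such a term would typically be added to the usual supervised loss with a weighting coefficient, so that minimizing it drives the local representations toward those of the global model round by round.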