Supervised deep hashing aims to learn hash functions using label information. Existing methods learn hash functions by employing either a pairwise/triplet loss to explore the point-to-point relation or a center loss to explore the point-to-class relation. However, these methods overlook both the collaboration between the two kinds of relations and the hardness of pairs. In this work, we propose a novel Self-Paced Relational Contrastive Hashing (SPRCH) method with a single learning objective that captures valuable discriminative information from hard pairs using both the point-to-point and point-to-class relations. To exploit these two kinds of relations, we propose the Relational Contrastive Hash (RCH) loss, which ensures that each anchor is closer in the Hamming space to all similar data points and to the corresponding class centers than to dissimilar ones. Moreover, the RCH loss mitigates the drastic imbalance between point-to-point pairs and point-to-class pairs by adjusting their weights. To prioritize hard pairs, we further propose a self-paced learning schedule that assigns higher weights to such pairs in the RCH loss; the weights are adapted dynamically according to pair similarities and training progress. In this way, the deep hashing model first learns universal patterns from the entire set of pairs and then gradually acquires more valuable discriminative information from hard pairs. Experimental results on four widely used image retrieval datasets demonstrate that the proposed SPRCH method significantly outperforms state-of-the-art supervised deep hashing methods. The source code will be made publicly available upon publication.
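To make the high-level description above concrete, the following is a minimal, illustrative sketch of a loss that combines point-to-point and point-to-class similarities in one contrastive objective with self-paced weighting; it is not the authors' released implementation, and all names (rch_loss, temperature, balance, pace) and the exact weighting form are assumptions for illustration only.

```python
# Illustrative sketch only, NOT the authors' code. A single objective that
# contrasts each anchor against both other samples (point-to-point) and class
# centers (point-to-class), and re-weights positive pairs by a self-paced
# factor that grows with training progress and with pair hardness
# (lower similarity = harder pair = larger weight later in training).
import torch
import torch.nn.functional as F


def rch_loss(codes, labels, centers, temperature=0.3, balance=1.0, pace=0.0):
    """codes: (B, K) relaxed hash codes (e.g., tanh outputs); labels: (B,) ints;
    centers: (C, K) learnable class centers; pace in [0, 1] tracks training progress."""
    codes = F.normalize(codes, dim=1)
    centers = F.normalize(centers, dim=1)

    # Point-to-point and point-to-class cosine similarities.
    sim_pp = codes @ codes.t() / temperature            # (B, B)
    sim_pc = codes @ centers.t() / temperature          # (B, C)

    # Positive masks: same-label samples (excluding self) and the true class center.
    same = labels.unsqueeze(0).eq(labels.unsqueeze(1))
    eye = torch.eye(len(labels), dtype=torch.bool, device=codes.device)
    pos_pp = same & ~eye                                 # (B, B)
    pos_pc = F.one_hot(labels, centers.size(0)).bool()   # (B, C)

    # Self-paced weights: early on (pace ~ 0) all positives count equally;
    # later, harder (less similar) positives receive larger weights.
    with torch.no_grad():
        w_pp = torch.exp(-pace * sim_pp)
        w_pc = torch.exp(-pace * sim_pc)

    # Shared denominator over all pairs, points and centers together.
    logits = torch.cat([sim_pp.masked_fill(eye, float('-inf')), sim_pc], dim=1)
    log_den = torch.logsumexp(logits, dim=1, keepdim=True)

    # Weighted log-likelihood of every positive pair; `balance` rescales the
    # point-to-class term so the many point-to-point pairs do not dominate.
    loss_pp = -((sim_pp - log_den) * w_pp * pos_pp).sum(1) / pos_pp.sum(1).clamp(min=1)
    loss_pc = -((sim_pc - log_den) * w_pc * pos_pc).sum(1)
    return (loss_pp + balance * loss_pc).mean()
```

In such a sketch, `pace` would typically be scheduled from 0 toward 1 over training (e.g., `pace = epoch / num_epochs`), so the model first fits all pairs uniformly and only later emphasizes the hard ones.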