Cross-domain recommendation (CDR) improves recommendation accuracy by transferring knowledge from information-rich domains to sparse ones, a significant advance in the effort to cope with data sparsity. Existing CDR works, however, still face challenges, including (1) ignoring the long-tail distribution of user-item interactions and (2) transferring only the domain-shared feature preferences of overlapping users. In this paper, we propose a CDR framework named joint Cross-Attention Transfer and Contrastive Learning for Cross-Domain Recommendation (CATCL). To address data sparsity and popularity bias within each domain, we first add random uniform noise to the original representation, maximize the mutual information between the original representation and its augmented view, and pre-train to obtain more uniformly distributed user/item representations. In addition, we introduce a cross-attention mechanism that extracts domain-shared and domain-specific user features, capturing the relevance of user preferences across domains. We then employ an element-wise attention component that dynamically assigns weights to domain-specific and domain-shared features, allowing different features to carry different importance in the rich and sparse domains. Experimental results on three public datasets demonstrate the effectiveness of the proposed framework compared with many strong state-of-the-art approaches.
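The intra-domain pre-training step described above (uniform-noise augmentation plus mutual-information maximization between a representation and its augmented view) can be sketched as follows. This is a minimal NumPy sketch assuming a SimGCL-style perturbation and an InfoNCE-style objective; the function names, noise scale `eps`, and temperature `tau` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def add_uniform_noise(emb, eps=0.1, rng=None):
    # Perturb each embedding with small random noise drawn uniformly,
    # kept in the same orthant as the embedding (a SimGCL-style trick)
    # and normalized per row so eps controls the perturbation magnitude.
    rng = rng or np.random.default_rng(0)
    noise = rng.uniform(-1.0, 1.0, emb.shape)
    noise = np.sign(emb) * np.abs(noise)
    noise /= np.linalg.norm(noise, axis=1, keepdims=True)
    return emb + eps * noise

def info_nce(view1, view2, tau=0.2):
    # InfoNCE lower-bounds the mutual information between the two views:
    # each row's matching pair is the positive, all other rows are negatives.
    v1 = view1 / np.linalg.norm(view1, axis=1, keepdims=True)
    v2 = view2 / np.linalg.norm(view2, axis=1, keepdims=True)
    logits = v1 @ v2.T / tau                 # pairwise cosine similarities
    pos = np.diag(logits)                    # positives on the diagonal
    loss = -pos + np.log(np.exp(logits).sum(axis=1))  # per-row log-softmax
    return loss.mean()
```

Minimizing `info_nce(emb, add_uniform_noise(emb))` pushes each embedding toward its own perturbed copy and away from other users/items in the batch, which encourages the more uniform distribution of representations the abstract refers to.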