Lack of knowledge is a common consequence of data incompleteness when learning from real-world data. To deal with this situation, this work utilizes transfer learning (TL) to reuse knowledge from different yet related domains that are complete. Owing to its powerful feature construction ability, genetic programming (GP) is used to construct feature-based transformations that map the feature space of the source domain to that of the target domain so that the differences between the two domains are reduced. Specifically, this work proposes a new multitree GP-based feature construction approach to TL in symbolic regression with missing values. The approach transfers knowledge about the importance of features and instances from the source domain to the target domain to improve learning performance. Moreover, new genetic operators are developed to encourage minimizing the distribution discrepancy between the transformed domain and the target domain. A new probabilistic crossover makes the well-constructed trees in an individual more likely to be selected for mating than the other trees, while a new mutation operator assigns a higher mutation probability to poorly constructed trees. The experimental results show that the proposed method not only achieves better performance than traditional learning methods but also outperforms two recent TL methods on real-world data sets under various incompleteness and learning scenarios.
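
The following is a minimal sketch, not the authors' implementation, of the probabilistic tree-selection idea behind the crossover and mutation operators described above: within a multitree individual, well-constructed trees are favored for crossover and poorly constructed trees are favored for mutation. The `Individual` structure, the per-tree quality scores, and all helper names are assumptions introduced purely for illustration.

```python
import random
from typing import List


class Individual:
    """A multitree GP individual: one constructed feature per tree (hypothetical structure)."""

    def __init__(self, trees: List[object], tree_quality: List[float]):
        self.trees = trees                # constructed-feature trees
        self.tree_quality = tree_quality  # assumed per-tree quality scores (higher = better)


def pick_tree_for_crossover(ind: Individual) -> int:
    """Select a tree for mating with probability proportional to its quality,
    so well-constructed trees are more likely to be crossed over."""
    total = sum(ind.tree_quality)
    weights = [q / total for q in ind.tree_quality]
    return random.choices(range(len(ind.trees)), weights=weights, k=1)[0]


def pick_tree_for_mutation(ind: Individual) -> int:
    """Select a tree for mutation with probability inversely related to its quality,
    so poorly constructed trees are more likely to be replaced or perturbed."""
    max_q = max(ind.tree_quality)
    inverted = [max_q - q + 1e-9 for q in ind.tree_quality]  # small constant avoids zero weights
    total = sum(inverted)
    weights = [w / total for w in inverted]
    return random.choices(range(len(ind.trees)), weights=weights, k=1)[0]
```

In this sketch the bias is implemented as simple roulette-wheel weighting over per-tree scores; the actual operators in the paper may define tree quality and selection probabilities differently (for example, based on each tree's contribution to reducing the distribution discrepancy).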