In injection molding, cooling accounts for the largest portion of the cycle time, so the effectiveness of the cooling system strongly affects both production efficiency and part quality; that effectiveness is limited by conventional cooling channels manufactured by drilling and casting. Although maturing additive manufacturing (AM) technology enables the design and fabrication of complex conformal cooling channels, the temperature variation caused by the non-uniform thickness distribution of the part remains unresolved, because existing conformal cooling designs make the channels conform to the part surface but not to the part thickness distribution. In this work, a machine-learning-aided design method is proposed to generate cooling systems that conform not only to the part surface but also to the local part thickness. Three commonly used conformal cooling channel topologies are considered: spiral, zig-zag, and porous. For each topology, a surrogate model is derived to approximate the relationship between the cooling channel design parameters, the part thickness, and the resulting part surface temperature. Based on the surrogate model, the design parameters of each type of cooling channel are optimized to minimize the variation of the part surface temperature. Finally, design cases are studied to validate the effectiveness of the proposed method: compared with conventional conformal cooling designs, the proposed method achieves much lower temperature variance together with a smaller coolant pressure drop.
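The surrogate-plus-optimization workflow summarized above can be illustrated with a minimal sketch. Everything here is hypothetical: the `simulate_temperature` function is an assumed toy stand-in for the mold-cooling simulation, the quadratic polynomial basis is one possible surrogate form (the paper does not specify its model class here), and the thickness-dependent channel pitch `d(t) = a + b*t` is an illustrative parameterization of a thickness-conforming design.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical toy "simulator": part surface temperature at a probe point as a
# function of channel pitch d (mm) and local part thickness t (mm). This is an
# assumed stand-in for the actual mold-cooling simulation.
def simulate_temperature(d, t):
    return 40.0 + 5.0 * t + 2.0 * d + 0.5 * d * t

rng = np.random.default_rng(0)
thicknesses = np.array([1.0, 2.0, 3.0])  # probe-point part thicknesses (mm)

# 1) Sample the design space and fit a quadratic surrogate T(d, t) by
#    least squares (one possible surrogate model class).
D = rng.uniform(2.0, 10.0, 50)
T = rng.uniform(1.0, 3.0, 50)
y = simulate_temperature(D, T)

def basis(d, t):
    d, t = np.asarray(d, float), np.asarray(t, float)
    return np.stack([np.ones_like(d), d, t, d * t, d**2, t**2], axis=-1)

coef, *_ = np.linalg.lstsq(basis(D, T), y, rcond=None)

def surrogate(d, t):
    return basis(d, t) @ coef

# 2) Let the channel pitch vary with local thickness, d(t) = a + b*t, and
#    optimize (a, b) to minimize the variance of the predicted temperature
#    across the probe points.
def temp_variance(params):
    a, b = params
    temps = surrogate(a + b * thicknesses, thicknesses)
    return float(np.var(temps))

baseline = temp_variance([6.0, 0.0])  # uniform pitch, ignores thickness
res = minimize(temp_variance, x0=[6.0, 0.0],
               bounds=[(2.0, 10.0), (-3.0, 3.0)])
```

In this toy setup the optimizer drives the predicted temperature variance well below the uniform-pitch baseline, mirroring the paper's idea that letting the channel design track the local part thickness evens out the surface temperature.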