Keywords
Quinary
Materials science
Alloy
Ternary
High-entropy alloy
Ductility
Valence electron
Thermodynamics
Composite material
Computer science
Electron
Creep
Physics
Quantum mechanics
Programming language
Authors
Xingge Xu, Hualei Zhang, Xiangdong Ding, Jun Sun
Identifier
DOI: 10.1016/j.jmst.2023.07.077
Abstract
Reducing the exploration space of multi-principal-element alloys is a key challenge in designing high-performance U-based high-entropy alloys (UHEAs). Here, the best multi-principal-element combination can be acquired efficiently because the proposed alloying strategy and screening criteria substantially reduce the alloy space and thus accelerate alloy design, avoiding the enormous number of random combinations required by a trial-and-error approach. To choose the best seed alloy and suitable dopants, the screening criteria include small anisotropy, high specific modulus, high dynamical stability, and high ductility. We thereby find a shortcut to design UHEAs, proceeding from typical binaries (UTi and UNb) to the ternary (UTiNb), quaternary (UTiNbTa), and quinary (UTiNbTaFe). Finally, we identify the best bcc senary UHEA (UTiNbTaFeMo), which exhibits the highest hardness and yield strength while maintaining good ductility among all the candidates. Compared with the overestimation from the empirical strength-hardness relationship, improved strength prediction can be achieved using a parameter-free theory that accounts for volume mismatch and the temperature effect on yield strength. This finding indicates that a larger volume mismatch corresponds to a higher yield strength, in agreement with the available measurements. Moreover, the dynamical stability and mechanical properties of the candidates are greatly enhanced as the number of principal elements increases, indicating the feasibility and effectiveness of the adopted alloying strategy. Increasing the number of principal elements raises the valence electron concentration (VEC); accordingly, the mechanical properties improve significantly with increasing VEC, in agreement with measurements for various other bcc HEAs. This work can speed up the research and development of advanced UHEAs by greatly reducing the alloy composition space.
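To make the screening criteria concrete, the sketch below shows how such a filter could be assembled for a cubic (bcc) candidate from its single-crystal elastic constants, using standard proxies: Born mechanical-stability conditions (a stand-in for the paper's phonon-based dynamical stability), the Zener anisotropy ratio, the Voigt-averaged Pugh ratio B/G as a ductility indicator, the specific modulus E/ρ, and the composition-weighted VEC rule. All thresholds, the elastic constants, the density, and the valence-electron count of 6 for U (5f³6d¹7s²) are illustrative assumptions, not values from the paper.

```python
from dataclasses import dataclass

# Nominal valence-electron counts (group numbers) used in the common VEC
# rule for HEAs; the value of 6 for U is an assumption (5f3 6d1 7s2).
VALENCE = {"U": 6, "Ti": 4, "Nb": 5, "Ta": 5, "Fe": 8, "Mo": 6}

def vec(composition: dict[str, float]) -> float:
    """Valence electron concentration: composition-weighted average."""
    total = sum(composition.values())
    return sum(x * VALENCE[el] for el, x in composition.items()) / total

@dataclass
class CubicElastic:
    """Single-crystal elastic constants (GPa) and density (g/cm^3)."""
    c11: float
    c12: float
    c44: float
    rho: float

    def born_stable(self) -> bool:
        # Born mechanical-stability criteria for a cubic lattice.
        return (self.c11 - self.c12 > 0
                and self.c11 + 2 * self.c12 > 0
                and self.c44 > 0)

    def zener_anisotropy(self) -> float:
        # A = 1 is elastically isotropic; "small anisotropy" means A near 1.
        return 2 * self.c44 / (self.c11 - self.c12)

    def moduli(self) -> tuple[float, float, float]:
        # Voigt averages: B (bulk), G (shear), E (Young's modulus), in GPa.
        b = (self.c11 + 2 * self.c12) / 3
        g = (self.c11 - self.c12 + 3 * self.c44) / 5
        e = 9 * b * g / (3 * b + g)
        return b, g, e

def passes_screen(ec: CubicElastic, a_tol=0.5, pugh_min=1.75,
                  spec_mod_min=8.0) -> bool:
    """Screen in the spirit of the abstract: stability, small anisotropy,
    high ductility proxy, high specific modulus. Thresholds are
    illustrative assumptions, not the paper's values."""
    if not ec.born_stable():
        return False
    b, g, e = ec.moduli()
    return (abs(ec.zener_anisotropy() - 1) < a_tol   # near-isotropic
            and b / g > pugh_min                     # Pugh ratio: ductile
            and e / ec.rho > spec_mod_min)           # GPa cm^3 / g

# Hypothetical elastic constants for illustration only (not paper data).
candidate = CubicElastic(c11=220.0, c12=130.0, c44=40.0, rho=12.0)
equimolar = {el: 1.0 for el in ["U", "Ti", "Nb", "Ta", "Fe", "Mo"]}
print(f"VEC(UTiNbTaFeMo) = {vec(equimolar):.2f}")  # ~5.67 under these counts
print("passes screen:", passes_screen(candidate))
```

Under this counting, adding each dopant from UTi toward UTiNbTaFeMo raises the composition-weighted VEC, which is the trend the abstract links to improved mechanical properties.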