Materials science
Alloys
Metallurgy
Thermal
Thermal stability
Chemical engineering
Thermodynamics
Physics
Engineering
Authors
Rou Ding,Junwang Deng,Xiaochun Liu,Yiyou Wu,Zhaowen Geng,Dan Li,Taomei Zhang,Chao Chen,Kechao Zhou
Identifier
DOI:10.1016/j.jallcom.2022.167894
Abstract
The trade-off between printability and performance of an Al–Ni–Sc alloy fabricated by additive manufacturing (AM) is demonstrated herein. By designing for eutectic solidification, a crack-free microstructure was obtained. The extremely high cooling rate during rapid solidification promoted the formation of spherical Al3Ni nano-particles. By further adding Sc, coherent L12 Al3Sc nano-particles were formed, which served as nucleation sites for α-Al, transforming columnar grains into equiaxed grains and providing additional strengthening. Superior mechanical properties, with a tensile strength of 445 MPa and a yield strength of 320 MPa at ambient temperature, were obtained. The Al–Ni–Sc alloy still exhibited an excellent tensile strength of 259 MPa and a yield strength of 248 MPa at 250 °C. After thermal exposure at 200–400 °C, the Al–Ni–Sc alloy consistently showed higher microhardness than the corresponding Al–Ni alloy, and both alloys retained their hardness at 300 °C for 50 h. The Al3Sc nano-precipitates provided additional hardness increments and compensated for the microhardness decrease caused by the coarsening of Al3Ni nano-particles during high-temperature exposure. The coarsening process, studied in the Al–Ni alloy, can be divided into two stages: a rapid coarsening stage during the first 0.5 h due to solute supersaturation, and a slower coarsening stage due to the decreased coarsening driving force. Following the classical LSW coarsening model and the Arrhenius equation, the diffusion activation energy of the second stage was calculated to be 23.03 kJ/mol, much lower than that of Ni self-diffusion in Al. This phenomenon was attributed to the numerous defects and grain boundaries formed during AM, which act as fast diffusion channels for atoms. The findings of this work provide guidance for understanding the differences between Al alloys produced by AM and by casting.