Materials science
Ductility
Strain rate
Fracture
Stress (mechanics)
Stress–strain curve
Composite material
Strain (mechanics)
Structural engineering
Titanium alloy
Deformation (mechanics)
Displacement (mechanics)
Alloy
Creep
Engineering
Authors
Dou Wang,Zejian Xu,Yang Han,Mengyu Su,Fenglei Huang
Identifier
DOI:10.1016/j.ijimpeng.2024.104898
Abstract
The ductility of metallic materials is known to be significantly affected by stress state and strain rate. However, systematic investigations of the effects of stress state and strain rate on ductile behavior are scarce, and only a few fracture models can account for both effects simultaneously. This work systematically investigates the ductile response of Ti-6Al-4V alloy over a broad range of stress states and strain rates and proposes a ductile fracture model that incorporates the coupled influence of stress state and strain rate. First, five types of specimens were employed to produce distinct stress states. The fracture strain of the material was then determined, using a combined experimental-numerical method, for strain rates spanning 10⁻³/s to 10³/s, stress triaxialities ranging from −0.7 to 0.4, and Lode angle parameters varying from −1 to 1. It was found that over this broad range of stress states the relation between fracture strain and stress state is not monotonic, and that fracture strain decreases as strain rate increases. Based on the experimental data, a novel ductile fracture model incorporating stress triaxiality, Lode angle parameter, and strain rate was established and implemented in ABAQUS/Explicit through the user material subroutine VUMAT. Finally, a validation test was conducted to demonstrate the validity of the proposed model. The results show satisfactory agreement between experiment and calculation for both fracture displacement and fracture morphology, indicating that the proposed model can predict ductile behavior across a wide range of stress states and strain rates. This work provides clear theoretical guidance for the design optimization and safety evaluation of Ti-6Al-4V structural components.
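For reference, the two stress-state measures named in the abstract have standard definitions in ductile fracture analysis; the sketch below uses those textbook forms together with a common logarithmic strain-rate coupling (the abstract does not give the authors' actual model expression, so the locus form shown is only illustrative):

```latex
% Standard stress-state measures (Bai-Wierzbicki convention):
% \sigma_m = mean (hydrostatic) stress, \bar{\sigma} = von Mises equivalent stress,
% J_3 = third invariant of the deviatoric stress tensor.
\[
  \eta = \frac{\sigma_m}{\bar{\sigma}}, \qquad
  \bar{\theta} = 1 - \frac{2}{\pi}\arccos\!\left(\frac{27}{2}\,\frac{J_3}{\bar{\sigma}^{3}}\right)
\]
% A generic coupled fracture locus with Johnson-Cook-style rate dependence
% (illustrative form only, not the paper's calibrated model):
\[
  \varepsilon_f\!\left(\eta,\bar{\theta},\dot{\varepsilon}\right)
  = f\!\left(\eta,\bar{\theta}\right)
    \left[1 + C \ln\!\left(\frac{\dot{\varepsilon}}{\dot{\varepsilon}_0}\right)\right]
\]
```

Since the paper reports that fracture strain decreases with increasing strain rate, the rate coefficient C in a form like this would be negative.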
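A fracture model of this kind is typically exercised in an explicit FE code (as via the VUMAT mentioned in the abstract) through incremental damage accumulation with element deletion. Below is a minimal Python sketch of that logic; the locus shape, all coefficients, and the reference rate are hypothetical placeholders, not the authors' calibration:

```python
import numpy as np

EPS_DOT_REF = 1e-3  # assumed quasi-static reference strain rate, 1/s

def fracture_strain(eta, theta_bar, eps_dot, c_rate=-0.05):
    """Illustrative fracture locus: a placeholder stress-state surface
    scaled by a logarithmic strain-rate factor (Johnson-Cook style).
    All coefficients here are hypothetical."""
    # Placeholder quasi-static locus: ductility falls with triaxiality,
    # with a mild Lode-angle modulation (shape is illustrative only).
    base = 0.6 * np.exp(-1.5 * eta) * (0.8 + 0.2 * theta_bar**2)
    rate_factor = 1.0 + c_rate * np.log(max(eps_dot, EPS_DOT_REF) / EPS_DOT_REF)
    return base * max(rate_factor, 0.05)  # keep the locus positive

def update_damage(D, d_eps_p, eta, theta_bar, eps_dot):
    """One integration-point update, as commonly done in a VUMAT:
    accumulate D += d(eq. plastic strain)/eps_f(state); fail at D >= 1."""
    D += d_eps_p / fracture_strain(eta, theta_bar, eps_dot)
    return D, D >= 1.0  # (updated damage, element-deletion flag)

# Example: a material point loaded at moderate triaxiality and 10^3/s.
D, failed = 0.0, False
while not failed:
    D, failed = update_damage(D, d_eps_p=0.01, eta=0.3, theta_bar=0.0,
                              eps_dot=1e3)
print(f"damage at element deletion: {D:.3f}")
```

The point of the sketch is the structure, not the numbers: damage grows faster wherever the current stress state and strain rate push the fracture locus lower, which is how a single model can reproduce failure across the wide range of triaxialities, Lode angles, and rates tested in the paper.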