Viscoelasticity
Rheology
Stress relaxation
Creep
Relaxation
Nonlinear system
Constitutive equation
Materials science
Stress (mechanics)
Power law
Mechanics
Mathematics
Physics
Thermodynamics
Composite material
Finite element method
Statistics
Authors
Y. J. F. Kpomahou,Koffi Judicaël Agbélélé,Arnaud Edouard Yamadjako,Bachir Koladé Adélakoun Ambelohoun
Source
Journal: International Journal of Materials Science and Applications
[Science Publishing Group]
Date: 2023-03-04
Identifier
DOI:10.11648/j.ijmsa.20231201.11
Abstract
Viscoelastic materials are widely used as devices for vibration control in modern engineering applications. They exhibit both viscous and elastic characteristics when undergoing deformation. They are mainly characterized by three time-dependent mechanical properties: creep, stress relaxation, and hysteresis. Among them, stress relaxation is one of the most important features in the characterization of viscoelastic materials. This phenomenon is defined as a time-dependent decrease in stress under a constant strain. Due to the inherent nonlinearity shown by the material response over a certain range of strain when viscoelastic materials are subjected to external loads, nonlinear rheological models are needed to better describe the experimental data. In this study, a single nonlinear differential constitutive equation is derived from a nonlinear rheological model composed of a generalized nonlinear Maxwell fluid model in parallel with a nonlinear spring obeying a power law, for the prediction of the stress relaxation behavior of viscoelastic materials. Under a constant strain history, the time-dependent stress is derived analytically for the cases where the positive power-law exponent satisfies α < 1 and α > 1. The Trust Region Method available in the MATLAB Optimization Toolbox is used to identify the material parameters. Significant correlations are found between experimental relaxation data taken from the literature and the exact analytical predictions. The obtained results show that the developed rheological model with integer- and non-integer-order nonlinearities accurately describes the experimental relaxation data of several viscoelastic materials.
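The abstract describes identifying material parameters by fitting relaxation data with a trust-region solver from the MATLAB Optimization Toolbox. Below is a minimal sketch of that fitting step in Python, using SciPy's trust-region reflective least-squares solver as an analogue. The relaxation expression used here (an equilibrium power-law spring in parallel with a single Maxwell branch), the parameter names, and the synthetic data are illustrative assumptions only; they are not the paper's derived closed-form solution.

# Sketch: parameter identification for a stress-relaxation curve with a
# trust-region least-squares fit (SciPy 'trf'), analogous in spirit to the
# MATLAB Optimization Toolbox procedure mentioned in the abstract.
# The model form below is an assumed surrogate, not the paper's equation.
import numpy as np
from scipy.optimize import least_squares

def stress_model(t, params, strain=0.05):
    # Illustrative relaxation stress under constant strain:
    # sigma(t) = E_inf * eps**alpha + E1 * eps * exp(-t / tau)
    E_inf, E1, tau, alpha = params
    return E_inf * strain**alpha + E1 * strain * np.exp(-t / tau)

def residuals(params, t, sigma_exp):
    # Residual vector minimized by the trust-region solver.
    return stress_model(t, params) - sigma_exp

# Synthetic "experimental" relaxation data, for demonstration only.
t_data = np.linspace(0.0, 100.0, 50)
true_params = np.array([2.0, 5.0, 12.0, 0.8])
rng = np.random.default_rng(0)
sigma_data = stress_model(t_data, true_params) + 0.002 * rng.normal(size=t_data.size)

# Trust-region reflective fit with simple positivity bounds on all parameters.
fit = least_squares(
    residuals,
    x0=np.array([1.0, 1.0, 5.0, 0.5]),
    bounds=(1e-6, np.inf),
    method="trf",
    args=(t_data, sigma_data),
)
print("Identified parameters (E_inf, E1, tau, alpha):", fit.x)

In practice the synthetic data above would be replaced by the measured stress-versus-time relaxation curve, and the quality of the identified parameters would be judged by the correlation between the fitted model and the experimental data, as done in the paper.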