Materials science
Ultimate tensile strength
Composite material
Cracking
Softening
Tensile testing
Hardening (computing)
Strain hardening exponent
Ductility (earth science)
Fiber-reinforced concrete
Tension (geology)
Structural engineering
Stress (linguistics)
Creep
Fiber
Philosophy
Engineering
Layer (electronics)
Linguistics
Authors
Zhidong Zhou, Pizhong Qiao
Source
Journal: Journal of Testing and Evaluation
[ASTM International]
Date: 2018-11-16
Volume/Issue: 48(4): 20170644-20170644
Citations: 29
Abstract
Ultra-high performance concrete (UHPC) is characterized by its superior strength, ductility, durability, and particularly its unique post-cracking performance in tension. Dog-bone–shaped specimens are widely used to determine the tensile behavior of UHPC, but there is no standard test method or specimen design for characterizing that behavior. In this study, an evolving strategy for designing a direct tension test (DTT) specimen is first developed using numerical finite element analysis. Seven series of UHPC DTT specimens, dimensioned to avoid local stress concentrations, are then tested experimentally. Results indicate that post-cracking localization within the gauge measurement region is guaranteed and that the DTT specimen is capable of fully capturing the tensile stress–strain response of UHPC. An idealized constitutive model with three linear phases is proposed to fit the experimental data and thus characterize the linear elastic, strain-hardening, and strain-softening behavior of UHPC in tension. Four tensile material parameters extracted from the experimental stress–strain curves are implemented in the idealized constitutive model, from which the multi-phase tensile responses of UHPC are reconstructed. It is found that most tensile material parameters extracted from the experimental stress–strain curves, including tensile strength, modulus of elasticity, and dissipated energy, increase with increasing volume fraction of steel fibers, curing age, and displacement loading rate, while the strain capacity at first cracking remains nearly constant. The DTT specimen developed can be used effectively to characterize the tensile behavior of ductile fiber-reinforced cementitious materials.
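The three-phase model described in the abstract lends itself to a short numerical sketch. The Python snippet below is a minimal illustration of such a trilinear tensile stress–strain law; the chosen parameterization (first-cracking point, peak point, ultimate softening strain) and all numeric values are hypothetical assumptions for illustration only, not the four parameters reported in the paper.

```python
import numpy as np

def trilinear_stress(strain, f_cr, eps_cr, f_t, eps_p, eps_u):
    """Idealized trilinear tensile model: linear elastic up to first
    cracking (eps_cr, f_cr), linear strain-hardening up to the peak
    (eps_p, f_t), then linear strain-softening to zero at eps_u.
    Parameter names and values are illustrative, not from the paper."""
    strain = np.asarray(strain, dtype=float)
    return np.piecewise(
        strain,
        [strain <= eps_cr,
         (strain > eps_cr) & (strain <= eps_p),
         (strain > eps_p) & (strain <= eps_u)],
        [lambda e: f_cr * e / eps_cr,                                      # elastic branch
         lambda e: f_cr + (f_t - f_cr) * (e - eps_cr) / (eps_p - eps_cr),  # hardening branch
         lambda e: f_t * (eps_u - e) / (eps_u - eps_p),                    # softening branch
         0.0])                                                             # fully softened

# Hypothetical UHPC-like values (stress in MPa, strain dimensionless).
eps = np.linspace(0.0, 0.009, 1000)
sig = trilinear_stress(eps, f_cr=8.0, eps_cr=0.0002, f_t=10.0, eps_p=0.003, eps_u=0.008)

# Dissipated energy per unit volume = area under the stress-strain curve
# (trapezoidal rule), one of the quantities discussed in the abstract.
energy = float(np.sum(0.5 * (sig[1:] + sig[:-1]) * np.diff(eps)))
print(f"peak stress = {sig.max():.2f} MPa, dissipated energy = {energy:.4f} MJ/m^3")
```

A piecewise-linear form like this makes the fitting problem transparent: each experimental curve reduces to a handful of anchor points, and quantities such as dissipated energy follow directly as the area under the reconstructed curve.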