Keywords
Void (composites), Composite number, Materials science, Transfer molding, Automotive industry, Composite material, Molding (decorative), Capillary action, Interconnectivity, Compression molding, Compaction, Mechanics, Mechanical engineering, Engineering, Computer science, Mold, Artificial intelligence, Aerospace engineering, Physics
Author(s)
Aouatif Saad, Adil Echchelh, Mohamed Hattabi, Mohammed El Ganaoui
Source
Journal: Composites: Mechanics, Computations, Applications (Begell House Inc.)
Date: 2018-01-01
Volume/Issue: 9(1): 51-93
Cited by: 5
Identifier
DOI: 10.1615/compmechcomputapplintj.v9.i1.50
Abstract
Liquid composite molding (LCM) processes are used to manufacture near-net-shape, geometrically complex composite parts. One of the current obstacles to larger-scale application of these processes is the formation of defects such as voids during resin injection. To meet aeronautical requirements, or the short injection cycles demanded by the automotive industry, the amount of air entrapped in the part before curing must be kept as low as possible. Air entrapment depends on the fibrous structure and on the injection parameters, more precisely on the fluid pressure and on the orientation of the flow front with respect to the fiber direction. Air entrapment is a key concern in the production of structural composite parts, since a high void content can lead to mechanical softening, early failure, or part rejection. Quantitative simulation of void formation is therefore important for the proper design and selection of material and processing parameters that minimize voids in composite parts. Despite several advances in void-content prediction via modeling and simulation, the void formation mechanisms in resin transfer molding (RTM) and similar processes are still not fully understood. This study reviews current approaches to modeling and simulating void formation and unsaturated flow in liquid composite molding. We examine modeling efforts that consider the mechanisms involved, such as void formation and transport, bubble compression, and gas dissolution. In particular, the capillary number is identified as a key parameter governing void formation and transport. The influence of voids on the global resin flow is also examined, and a state-of-the-art overview is presented.
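The capillary number highlighted in the abstract is the standard dimensionless ratio of viscous to surface-tension forces, Ca = μv/γ, often used in LCM studies in its modified form Ca* = μv/(γ cos θ), where θ is the resin/fiber contact angle. The Python sketch below is a minimal illustration of how this quantity is typically evaluated for an injection; the numerical values and the qualitative regime comments are illustrative assumptions, not results from the reviewed paper.

```python
import math

def capillary_number(viscosity_pa_s, front_velocity_m_s, surface_tension_n_m,
                     contact_angle_deg=None):
    """Return the capillary number Ca = mu * v / gamma.

    If a contact angle is given, the modified capillary number
    Ca* = mu * v / (gamma * cos(theta)) is returned instead.
    """
    ca = viscosity_pa_s * front_velocity_m_s / surface_tension_n_m
    if contact_angle_deg is not None:
        ca /= math.cos(math.radians(contact_angle_deg))
    return ca

# Illustrative RTM-like injection values (assumed, not taken from the paper):
mu = 0.2        # resin viscosity [Pa*s]
v = 1.0e-3      # flow-front velocity [m/s]
gamma = 0.035   # resin surface tension [N/m]
theta = 30.0    # resin/fiber contact angle [deg]

ca_star = capillary_number(mu, v, gamma, contact_angle_deg=theta)
print(f"Modified capillary number Ca* = {ca_star:.3e}")

# Commonly reported qualitative trend for dual-scale fibrous preforms:
# low Ca*  -> capillary-dominated flow, voids tend to form inside tows (micro-voids);
# high Ca* -> viscous-dominated flow, voids tend to form between tows (macro-voids).
```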