Keywords
Sedimentation
Membrane fouling
Ultrafiltration
Coagulation
Scaling
Membrane
Chemistry
Chromatography
Chemical engineering
Flocculation
Materials science
Geology
Biochemistry
Psychology
Paleontology
Organic chemistry
Sediment
Psychiatry
Engineering
Authors
Yanyan Ding, Tong Li, Kaipei Qiu, Baiwen Ma, Ruijun Wu
Identifier
DOI: 10.1016/j.envres.2021.110756
Abstract
Pre-coagulation is commonly combined with ultrafiltration (UF) to alleviate membrane fouling. Compared with conventional coagulation-sedimentation-UF (CSUF) processes, direct coagulation-UF (CUF) processes are widely believed to perform better because they form a looser cake layer. This study shows, however, that not only the density of the cake layer but also its thickness affects membrane fouling behavior, and both are influenced by the sedimentation time and the floc characteristics. Here, the membrane fouling performance of an Fe-based coagulation-UF process was systematically investigated at different sedimentation times. A critical threshold of 30 min was observed at the lab scale: below it, membrane fouling depended mainly on the cake layer density, so CUF outperformed CSUF; beyond 30 min, the cake layer thickness became the dominant factor, and CSUF performed better. Furthermore, the critical sedimentation time was shown to be determined by the floc characteristics. A lower water temperature induced irregular flocs with a lower fractal dimension, and the corresponding cake layer exhibited an almost constant density with increasing sedimentation time; in this case, CSUF was consistently superior to CUF because the cake layer thickness decreased. In contrast, a critical sedimentation time reappeared under acidic conditions owing to the higher floc fractal dimension. This work shows for the first time that the membrane fouling of CSUF depends on the sedimentation time and that CSUF can outperform CUF once the sedimentation time exceeds a critical threshold. This finding is crucial to the future development of coagulation-integrated UF processes.
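The density-versus-thickness trade-off described in the abstract can be illustrated with the standard Kozeny-Carman form of cake resistance in Darcy's law. This sketch is not taken from the paper; all parameter values (porosities, thicknesses, primary particle size, membrane resistance) are hypothetical and chosen only to show how a looser-but-thicker cake and a denser-but-thinner cake compare.

```python
# Illustrative sketch (assumed model, not the authors' method):
# Kozeny-Carman cake resistance + Darcy flux through membrane and
# cake in series. All numerical values below are hypothetical.

def cake_resistance(porosity, thickness_m, particle_diameter_m):
    """Kozeny-Carman hydraulic resistance of a cake layer (1/m)."""
    return (180.0 * (1.0 - porosity) ** 2 * thickness_m
            / (porosity ** 3 * particle_diameter_m ** 2))

def permeate_flux(delta_p_pa, viscosity_pa_s, r_membrane, r_cake):
    """Darcy permeate flux for resistances in series (m3/(m2*s))."""
    return delta_p_pa / (viscosity_pa_s * (r_membrane + r_cake))

# CUF-like case: little solids removal, so a thicker but looser cake.
r_loose_thick = cake_resistance(porosity=0.85, thickness_m=200e-6,
                                particle_diameter_m=0.1e-6)
# CSUF-like case after long settling: a thinner but denser cake.
r_dense_thin = cake_resistance(porosity=0.70, thickness_m=50e-6,
                               particle_diameter_m=0.1e-6)

for label, r_cake in [("loose/thick", r_loose_thick),
                      ("dense/thin", r_dense_thin)]:
    j = permeate_flux(delta_p_pa=1e5, viscosity_pa_s=1e-3,
                      r_membrane=1e12, r_cake=r_cake)
    print(f"{label}: R_cake = {r_cake:.2e} 1/m, flux = {j:.2e} m/s")
```

With these (arbitrary) numbers the denser, thinner cake still has the higher resistance, which mirrors the paper's point: neither density nor thickness alone determines fouling, so which of CUF and CSUF wins depends on how sedimentation time shifts both quantities.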