Membrane
Polyvinyl alcohol
Gelatin
Electrospinning
Glutaraldehyde
Materials science
Chemical engineering
Gentamicin sulfate
Biocompatibility
Curcumin
Layer (electronics)
Chemistry
Gentamicin
Chromatography
Composite material
Polymer
Organic chemistry
Biochemistry
Antibiotics
Engineering
Metallurgy
Authors
Ssu-Meng Huang,Shih-Ming Liu,Hua-Yi Tseng,Wen‐Cheng Chen
Source
Journal: Membranes
[MDPI AG]
Date: 2023-05-30
Volume/Issue: 13 (6): 564-564
Citations: 9
Identifier
DOI: 10.3390/membranes13060564
Abstract
Nanofibrous membranes made of hydrogels have high specific surface areas and are suitable as drug carriers. Multilayer membranes fabricated by continuous electrospinning can delay drug release by lengthening diffusion pathways, which is beneficial for long-term wound care. In this work, polyvinyl alcohol (PVA) and gelatin were used as membrane substrates, and layer-by-layer membranes with a sandwich PVA/gelatin/PVA structure were prepared by electrospinning at different drug loading concentrations and spinning times. The two outer layers were citric-acid-crosslinked PVA membranes electrospun from a gentamicin-loaded solution, and the middle layer was a curcumin-loaded gelatin membrane; the release behavior, antibacterial activity, and biocompatibility of the assembly were studied. According to the in vitro release results, the multilayer membrane released curcumin slowly, with a cumulative release about 55% lower than that of the single-layer membrane within 4 days. Most of the prepared membranes showed no significant degradation during immersion, and the multilayer membrane absorbed about five to six times its weight in phosphate-buffered saline. The antibacterial tests showed that the gentamicin-loaded multilayer membrane had a good inhibitory effect on Staphylococcus aureus and Escherichia coli. In addition, the layer-by-layer assembled membrane was non-cytotoxic but unfavorable to cell attachment at all gentamicin-loading concentrations; this feature could reduce secondary damage to the wound when the dressing is changed. This multilayer wound dressing could be applied to wounds in the future to reduce the risk of bacterial infection and help wounds heal.
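For reference, the release and absorption figures quoted above are usually reported with the following conventional definitions; these formulas are standard conventions in drug-release and swelling studies and are not defined in the abstract itself.

\[
\text{Cumulative release}(t)\;[\%] \;=\; \frac{m_{\text{released}}(t)}{m_{\text{loaded}}}\times 100,
\qquad
\text{PBS uptake} \;=\; \frac{W_{\text{wet}}-W_{\text{dry}}}{W_{\text{dry}}}
\]

Under these assumed definitions, a PBS uptake of 5 to 6 means the membrane absorbs roughly five to six times its dry weight of buffer, and the reported 55% difference compares the cumulative curcumin release of the multilayer membrane with that of the single-layer membrane over 4 days.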