Keywords
Catalysis, Materials science, Faradaic efficiency, Electrolyte, Grain boundary, Oxygen evolution, Nitrogen, Chemical engineering, Oxygen, Vacancy defect, X-ray photoelectron spectroscopy, Electron transfer, Ammonia production, Electrolysis, Electrochemistry, Inorganic chemistry, Chemistry, Electrode, Metallurgy, Physical chemistry, Microstructure, Crystallography, Engineering, Biochemistry, Organic chemistry
Authors
Xiu Zhong,Enxian Yuan,Fu Yang,Yang Liu,Hao Lü,Jun Yang,Fei Gao,Yu Zhou,Jianming Pan,Jiawei Zhu,Chao Yu,Chengzhang Zhu,Aihua Yuan,Edison Huixiang Ang
Identifier
DOI:10.1073/pnas.2306673120
Abstract
Electrocatalytic nitrogen reduction is a challenging process that requires both a high ammonia yield rate and a reasonable faradaic efficiency. To address this challenge, this study developed a catalyst by in situ anchoring of interfacially intergrown ultrafine MoO2 nanograins on N-doped carbon fibers. Optimizing the thermal treatment conditions generated abundant grain boundaries between the MoO2 nanograins, which increased the fraction of oxygen vacancies. This, in turn, improved electron transfer, creating highly active reactive sites and enabling efficient nitrogen trapping. The resulting optimal catalyst, MoO2/C700, outperformed commercial MoO2 and state-of-the-art N2 reduction catalysts, delivering an NH3 yield of 173.7 μg h−1 mgcat−1 and a faradaic efficiency of 27.6% at −0.7 V vs. RHE in 1 M KOH electrolyte. In situ X-ray photoelectron spectroscopy and density functional theory calculations validated the electronic structure effect and the advantage of N2 adsorption at oxygen vacancies, revealing the dominant interplay between N2 and the oxygen vacancies and the resulting electron transfer between nitrogen and Mo(IV). The study also traced the origin of the improved activity to the interfacial effect, and the optimal catalyst retained appreciable stability over 60 h of continuous electrolysis, demonstrating strong potential for practical N2 reduction. This work demonstrates the feasibility of enhancing electrocatalytic nitrogen reduction by engineering grain boundaries to promote oxygen vacancies, offering a promising avenue for efficient and sustainable ammonia production.
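For reference, the two reported metrics follow from standard electrochemical bookkeeping: the yield rate normalizes the detected NH3 mass by electrolysis time and catalyst loading, while the faradaic efficiency is the fraction of the total passed charge consumed by the 3-electrons-per-NH3 reduction. The sketch below illustrates this conventional calculation; all numerical inputs are hypothetical placeholders, not data from the paper.

```python
# Minimal sketch (assumed, not from the paper) of how the two headline
# metrics of electrocatalytic N2 reduction are conventionally computed.
# All numeric inputs below are hypothetical placeholders.

F = 96485.0      # Faraday constant, C mol^-1
M_NH3 = 17.031   # molar mass of NH3, g mol^-1
Z = 3            # electrons per NH3 (N2 + 6 H2O + 6 e- -> 2 NH3 + 6 OH-)

def nh3_yield_rate(c_ug_per_ml, v_ml, t_h, m_cat_mg):
    """NH3 yield rate in ug h^-1 mg_cat^-1 from the NH3 concentration
    measured in the electrolyte (e.g., by a colorimetric assay)."""
    return c_ug_per_ml * v_ml / (t_h * m_cat_mg)

def faradaic_efficiency(c_ug_per_ml, v_ml, q_c):
    """FE = z * F * n(NH3) / Q: fraction of the total passed charge Q
    that was consumed in producing NH3."""
    n_nh3 = c_ug_per_ml * 1e-6 * v_ml / M_NH3   # mol of NH3 produced
    return Z * F * n_nh3 / q_c

# Hypothetical constant-potential electrolysis run:
c, v, t, m, q = 0.35, 30.0, 2.0, 0.03, 0.66  # ug/mL, mL, h, mg, C
print(f"yield rate: {nh3_yield_rate(c, v, t, m):.1f} ug h^-1 mg_cat^-1")
print(f"faradaic efficiency: {faradaic_efficiency(c, v, q):.1%}")
```

Note that the competing hydrogen evolution reaction also draws charge, which is why faradaic efficiencies for N2 reduction are typically well below 100%.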