Keywords: stacking, materials science, heterojunction, ultrashort pulse, electron transfer, optoelectronics, density functional theory, van der Waals force, bilayer, exciton, charge (physics), chemical physics, optics, condensed matter physics, physics, chemistry, computational chemistry, molecule, quantum mechanics, biochemistry, organic chemistry, nuclear magnetic resonance, membrane, laser
Authors
Ziheng Ji,Hao Hong,Jin Zhang,Qi Zhang,Wei Huang,Ting Cao,Ruixi Qiao,Can Liu,Jing Liang,Chuanhong Jin,Liying Jiao,Kebin Shi,Sheng Meng,Kaihui Liu
Source
Journal: ACS Nano
[American Chemical Society]
Date: 2017-11-21
Volume/Issue: 11 (12): 12020-12026
Cited by: 131
Identifier
DOI: 10.1021/acsnano.7b04541
Abstract
Van der Waals-coupled two-dimensional (2D) heterostructures have attracted great attention recently due to their high potential for next-generation photodetectors and solar cells. Understanding the charge-transfer process between adjacent atomic layers is the key to designing optimal devices, as it directly determines the fundamental response speed and photon-electron conversion efficiency. However, it has been generally believed, and theoretical studies have suggested, that the charge-transfer behavior depends sensitively on the interlayer configuration, which is difficult to control accurately, introducing great uncertainty into device design. Here we investigate the ultrafast dynamics of interlayer charge transfer in a prototype heterostructure, the MoS2/WS2 bilayer, with various stacking configurations, by optical two-color ultrafast pump–probe spectroscopy. Surprisingly, we find that the charge transfer is robust against varying interlayer twist angle and interlayer coupling strength, occurring on a time scale of ∼90 fs. Our observation, together with atomic-resolution transmission electron microscopy characterization and time-dependent density functional theory simulations, reveals that the robust ultrafast charge transfer is attributable to heterogeneous interlayer stretching/sliding, which provides previously unknown additional channels for efficient charge transfer. Our results elucidate the origin of the robustness of the transfer rate against interlayer stacking configuration in optical devices based on 2D heterostructures, facilitating their application in ultrafast and highly efficient optoelectronic and photovoltaic devices in the near future.
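A minimal sketch (Python) of how a charge-transfer time constant on the order of the reported ∼90 fs might be extracted from a two-color pump–probe transient: the rise of the differential signal is modeled as an exponential charge-transfer rise convolved with a Gaussian instrument response and fitted with scipy. The model function, the assumed 30 fs instrument-response width, the noise level, and all parameter values are illustrative assumptions, not the authors' actual data or analysis.

import numpy as np
from scipy.special import erfc
from scipy.optimize import curve_fit

def ct_rise(t, amp, tau_ct, sigma, t0):
    """Exponential rise (time constant tau_ct) convolved with a Gaussian
    instrument response of width sigma, centered at t0 (all times in fs)."""
    tt = t - t0
    step = 0.5 * erfc(-tt / (np.sqrt(2.0) * sigma))            # IRF-broadened step
    decay = 0.5 * np.exp(sigma**2 / (2.0 * tau_ct**2) - tt / tau_ct) \
            * erfc((sigma**2 / tau_ct - tt) / (np.sqrt(2.0) * sigma))
    return amp * (step - decay)                                 # rises from 0 to amp

# Synthetic pump-probe transient: assumed 90 fs transfer time, 30 fs IRF width.
rng = np.random.default_rng(0)
delay = np.linspace(-200.0, 800.0, 401)                         # pump-probe delay (fs)
signal = ct_rise(delay, 1.0, 90.0, 30.0, 0.0) + 0.02 * rng.normal(size=delay.size)

# Fit the transient; the recovered tau_ct estimates the charge-transfer time.
popt, pcov = curve_fit(ct_rise, delay, signal, p0=[1.0, 50.0, 40.0, 10.0])
print(f"fitted charge-transfer time: {popt[1]:.1f} fs (true value 90 fs)")

In a two-color measurement of this kind, the probe is typically tuned to a resonance of one layer while the pump excites the other, so the signal rise tracks interlayer transfer; the specific resonances used in this work are not stated in the abstract.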