Phosphorene

Keywords
Phosphorene, Monolayer, Exfoliation, Bilayer, Materials science, Band gap, Absorption, Chemical physics, Absorption edge, Direct and indirect band gaps, Absorption spectroscopy, Substrate, Work function, Absorbance, Layers (electronics), Analytical chemistry, Nanotechnology, Graphene, Optoelectronics, Chemistry, Optics, Membranes, Composite materials, Organic chemistry, Physics, Oceanography, Geology, Biochemistry
Authors
Adam H. Woomer, Tyler W. Farnsworth, Jun Hu, Rebekah A. Wells, Carrie L. Donley, Scott C. Warren
Source
Journal: ACS Nano (American Chemical Society)
Date: 2015-08-21
Volume/Issue: 9 (9): 8869-8884
Citations: 450
Identifier
DOI: 10.1021/acsnano.5b02599
Abstract
Phosphorene, a two-dimensional (2D) monolayer of black phosphorus, has attracted considerable theoretical interest, although the experimental realization of monolayer, bilayer, and few-layer flakes has been a significant challenge. Here, we systematically survey conditions for liquid exfoliation to achieve the first large-scale production of monolayer, bilayer, and few-layer phosphorus, with exfoliation demonstrated at the 10 g scale. We describe a rapid approach for quantifying the thickness of 2D phosphorus and show that monolayer and few-layer flakes produced by our approach are crystalline and unoxidized, while air exposure leads to rapid oxidation and the production of acid. With large quantities of 2D phosphorus now available, we perform the first quantitative measurements of the material's absorption edge—which is nearly identical to the material's band gap under our experimental conditions—as a function of flake thickness. Our interpretation of the absorbance spectrum relies on an analytical method introduced in this work, allowing the accurate determination of the absorption edge in polydisperse samples of quantum-confined semiconductors. Using this method, we found that the band gap of black phosphorus increased from 0.33 ± 0.02 eV in bulk to 1.88 ± 0.24 eV in bilayers, a range that is larger than that of any other 2D material. In addition, we quantified a higher-energy optical transition (VB–1 to CB), which changes from 2.0 eV in bulk to 3.23 eV in bilayers. This work describes several methods for producing and analyzing 2D phosphorus while also yielding a class of 2D materials with unprecedented optoelectronic properties.
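The abstract reports extracting the absorption edge from absorbance spectra of quantum-confined flakes, but does not spell out the paper's analytical method for polydisperse samples. As a generic illustration only, the sketch below applies a conventional Tauc-plot extrapolation for a direct-gap semiconductor to synthetic data; the helper name tauc_band_gap, the fit window, and the spectrum are all hypothetical and are not the authors' method.

```python
# Generic Tauc-plot band-gap estimate (illustrative sketch, NOT the paper's
# method for polydisperse samples). For a direct gap, (alpha*E)^2 is roughly
# linear in photon energy E above the edge; extrapolating that line to zero
# gives an estimate of the gap. Absorbance stands in for the absorption
# coefficient, which is valid up to a constant for a fixed path length.
import numpy as np

def tauc_band_gap(energy_eV, absorbance, fit_window):
    """Fit a line to (A*E)^2 within fit_window (eV) and return its
    x-intercept as the estimated direct band gap in eV."""
    y = (absorbance * energy_eV) ** 2
    lo, hi = fit_window
    mask = (energy_eV >= lo) & (energy_eV <= hi)
    slope, intercept = np.polyfit(energy_eV[mask], y[mask], 1)
    return -intercept / slope  # x-intercept of the linear region

# Synthetic spectrum with a nominal 1.9 eV edge (cf. the ~1.88 eV bilayer
# value quoted in the abstract), plus a little noise.
E = np.linspace(1.5, 3.0, 300)
rng = np.random.default_rng(0)
A = np.sqrt(np.clip(E - 1.9, 0.0, None)) / E + 0.01 * rng.random(E.size)
print(f"Estimated direct gap: {tauc_band_gap(E, A, (2.0, 2.6)):.2f} eV")
```

For real data one would confirm that the chosen window sits on the linear portion of (A·E)² and check the result's sensitivity to that choice; polydispersity in flake thickness broadens the edge, which is precisely the complication the paper's analytical method is designed to handle.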