Microscopy
Two-photon excitation microscopy
Fluorescence microscopy
Exocytosis
Confocal microscopy
Biophysics
Super-resolution microscopy
Temporal resolution
Chemistry
Neuroscience
Biology
Source
Journal: Molecules and Cells [Springer Science+Business Media]
Date: 2008-06-26
Volume/Issue: 26 (2): 113-120
Citations: 18
Abstract
Laser light microscopy enables the observation of multiple simultaneously occurring events in living cells, a capability that is important for monitoring the spatiotemporal patterns of the molecular interactions underlying such events. Two-photon excited fluorescence microscopy (two-photon microscopy), a technology based on multiphoton excitation, is one of the most promising candidates for such imaging. Its advantages have spurred wider adoption of the method, especially in neurological studies. Multicolor excitation, one advantage of two-photon microscopy, has enabled quantification of the spatiotemporal patterns of [Ca²⁺]ᵢ and of single episodes of fusion-pore opening during exocytosis. In pancreatic acinar cells, we demonstrated for the first time the existence of sequential compound exocytosis, a process that has since been identified in a wide variety of secretory cells, including exocrine, endocrine, and blood cells. Our newly developed method, two-photon extracellular polar-tracer imaging-based quantification (TEPIQ), can be used to detect fusion pores and to determine the diameters of vesicles smaller than the diffraction-limited resolution. Furthermore, two-photon microscopy can obtain cross-sectional images from deep layers of nearly intact tissue samples over long observation times with excellent spatial resolution. Recently, we observed a neuron located deeper than 0.9 mm below the cortical surface in an anesthetized mouse. This microscopy also enables the monitoring of long-term changes in neuronal or glial cells in a living mouse. This minireview describes both the current and anticipated capabilities of two-photon microscopy, based on a discussion of previous publications and recently obtained data.
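For orientation only, the diffraction-limited resolution referred to above can be estimated from the standard Rayleigh criterion; the figures below are an illustrative calculation, not values from the paper, and assume a typical two-photon excitation wavelength of λ ≈ 900 nm and a water-immersion objective with NA ≈ 1.0:

r_{xy} \approx \frac{0.61\,\lambda}{\sqrt{2}\,\mathrm{NA}} = \frac{0.61 \times 900\ \mathrm{nm}}{1.41 \times 1.0} \approx 390\ \mathrm{nm}

Here the factor of √2 reflects the quadratic intensity dependence of two-photon excitation, which narrows the effective point-spread function relative to one-photon excitation at the same wavelength. Under these assumptions, secretory vesicles a few hundred nanometers across or smaller cannot be sized directly from their apparent image width, which is the regime the TEPIQ approach addresses.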