Chemistry
Molecular beacon
Intramolecular force
Small RNA
Displacement (psychology)
Biophysics
Computational biology
DNA
Stereochemistry
Biochemistry
Gene
Psychology
Biology
Oligonucleotide
Psychotherapist
Authors
Guohui Xue,Zhenming Sui,Baoqiang Chen,Zheng Xiao,Yuanyuan Yao,Hua Lin,Jianguo Xu
Source
Journal: Talanta
[Elsevier]
Date: 2024-12-01
Volume/Issue: 280: 126778-126778
Identifiers
DOI: 10.1016/j.talanta.2024.126778
Abstract
Given the critical role of miRNAs in regulating gene expression and their potential as biomarkers for various diseases, accurate and sensitive miRNA detection is essential for early diagnosis and monitoring of conditions such as cancer. In this study, we introduce a dimeric molecular beacon (Di-MB) based isothermal strand displacement amplification (ISDA) system (Di-MB-ISDA) for enhanced miRNA detection. The Di-MB system is composed of two monomeric MBs (Mono-MBs) connected by a double-stranded DNA linker with single-stranded sequences in the middle, facilitating binding with the flexible arms of the Mono-MBs. This design forms a compact, high-density structure, significantly improving biostability against nuclease degradation. In the absence of target miRNA, the Di-MB maintains its stable structure. When target miRNA is present, it binds to the stem-loop regions, causing the hairpin structure to unfold and expose the stem sequences. These sequences serve as templates for the built-in primers, triggering DNA replication through an intramolecular recognition mechanism. This spatial confinement effect accelerates the strand displacement reaction, allowing the target miRNA to initiate additional reaction cycles and amplify the detection signal. The Di-MB-ISDA system addresses key challenges such as poor biostability and limited sensitivity seen in traditional methods. By enhancing biostability and optimizing reaction conditions, this system demonstrates robust performance for miRNA detection with a detection limit of 100 pM. The findings highlight the potential of Di-MB-ISDA for sensitive and accurate miRNA analysis, paving the way for its application in biomedical research and disease diagnosis in complex biological samples.
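The amplification principle in the abstract — target miRNA opens a beacon, triggers strand displacement, and is then recycled to open further beacons — can be illustrated with a minimal kinetic sketch. This is a toy model, not the authors' analysis: the function name, rate constant, concentrations, and time step are all invented for illustration, and only the qualitative behavior (signal grows with target concentration because the target is not consumed) reflects the described mechanism.

```python
# Toy kinetic sketch of target-recycling strand displacement amplification.
# All rate constants and concentrations are illustrative assumptions,
# not values reported in the paper.

def simulate_isda(target_nM, beacon_nM=100.0, k=1e-3, dt=1.0, steps=3600):
    """Euler integration of d[signal]/dt = k * [target] * [beacon].

    The target strand is recycled rather than consumed, so each target
    molecule can open many beacons; signal accumulates until the beacon
    pool is exhausted.
    """
    beacon = beacon_nM
    signal = 0.0
    for _ in range(steps):
        rate = k * target_nM * beacon   # opening events per unit time
        consumed = min(rate * dt, beacon)
        beacon -= consumed              # opened beacons leave the pool
        signal += consumed              # each opened beacon adds signal
    return signal

# More target -> faster beacon opening -> larger signal at a fixed time.
low = simulate_isda(0.1)    # 0.1 nM = 100 pM, near the reported detection limit
high = simulate_isda(10.0)
```

Because the target acts catalytically, even a sub-nanomolar input eventually produces a measurable signal in this model, which is the qualitative rationale for the sensitivity gain claimed for ISDA-type schemes.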