Pre-training a Foundation Model for Universal Fluorescence Microscopy Image Restoration

Topics: Computer science · Artificial intelligence · Generalization · Task (project management) · Deep learning · Machine learning · Transfer learning · Engineering · Mathematics · Systems engineering · Mathematical analysis
Authors
Weimin Tan, Chenxi Ma
Source
Journal: Research Square
Identifier
DOI:10.21203/rs.3.rs-3208267/v1
Abstract

Fluorescence microscopy image restoration (FMIR) has attracted wide attention in the life sciences and has made significant progress thanks to deep learning (DL). However, most current DL-based FMIR methods must train a task-specific deep model from scratch on a dedicated dataset for each FMIR problem, such as super-resolution (SR), denoising, isotropic reconstruction, projection, and volume reconstruction. The performance and practicality of these models are limited by cumbersome training, the difficulty of obtaining high-quality training images, and poor generalization. Meanwhile, pre-trained foundation models have achieved major breakthroughs in computer vision (CV) and natural language processing (NLP), demonstrating the power of the pre-training and fine-tuning paradigm. Inspired by this success, we present a unified FMIR foundation model (UniFMIR) as a universal solution to diverse FMIR problems, offering higher restoration accuracy, better generalization, and efficient, low-cost training of task-specific models. Experiments on five FMIR tasks and nine datasets, covering a wide range of fluorescence microscopy imaging modalities and biological samples, demonstrate that UniFMIR can handle varied FMIR settings with a single model. Pre-trained on the large-scale dataset we collected, UniFMIR transfers the knowledge learned during pre-training to a specific FMIR setting through fine-tuning, yielding significant performance gains, revealing clear nanoscale cellular structures, and facilitating high-quality imaging of live samples. This work is the first to explore the potential of foundation models for FMIR, and we hope it inspires further research on DL-based FMIR and on the pre-training and development of FMIR models.
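To make the pre-training and fine-tuning paradigm described in the abstract concrete, the sketch below shows the general pattern in PyTorch: a shared restoration backbone is first optimized on a mixed, multi-task corpus of degraded/clean image pairs, its weights are saved, and it is then fine-tuned on one specific FMIR task with a smaller learning rate. All names here (TinyRestorationNet, random_pairs, the checkpoint path) are hypothetical placeholders for illustration only; this is not the authors' UniFMIR implementation, which uses a far larger network and real microscopy data.

```python
# Minimal sketch of the pre-train / fine-tune paradigm, under assumed
# placeholder names; not the authors' actual UniFMIR code.
import torch
import torch.nn as nn


class TinyRestorationNet(nn.Module):
    """Toy encoder-decoder standing in for a shared restoration backbone."""

    def __init__(self, channels: int = 32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Conv2d(channels, 1, 3, padding=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))


def train_steps(model: nn.Module, batch_fn, lr: float, steps: int) -> None:
    """Run a few L1-loss optimization steps on (degraded, clean) image pairs."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.L1Loss()
    for _ in range(steps):
        degraded, clean = batch_fn()
        opt.zero_grad()
        loss = loss_fn(model(degraded), clean)
        loss.backward()
        opt.step()


def random_pairs():
    """Placeholder loader for a mixed-task corpus (SR, denoising, ...)."""
    clean = torch.rand(4, 1, 64, 64)
    degraded = clean + 0.1 * torch.randn_like(clean)  # synthetic degradation
    return degraded, clean


if __name__ == "__main__":
    model = TinyRestorationNet()

    # 1) Pre-train the shared backbone on a large mixed-task corpus.
    train_steps(model, random_pairs, lr=1e-3, steps=10)
    torch.save(model.state_dict(), "pretrained_backbone.pt")

    # 2) Fine-tune from the pre-trained weights on one specific FMIR task
    #    (e.g. denoising one microscope/sample combination), typically with
    #    a smaller learning rate and far fewer labeled pairs.
    model.load_state_dict(torch.load("pretrained_backbone.pt"))
    train_steps(model, random_pairs, lr=1e-4, steps=10)
```

The key design point this illustrates is that only the fine-tuning stage is repeated per task; the expensive pre-training is done once and its learned representations are reused, which is what the abstract credits for the efficient, low-cost training of task-specific models.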