A Comprehensive Survey on Pretrained Foundation Models: A History from BERT to ChatGPT

Keywords: computer science, Transformer, pattern, autoregressive model, artificial intelligence, machine learning, initialization, feature extractor, graph, theoretical computer science, programming languages
Authors
Zhou, Ce,Li, Qian,Li, Chen,Yu, Jun,Liu, Yixin,Wang, Guangjing,Zhang, Kai,Ji, Cheng,Yan, Qiben,He, Lifang,Peng, Hao,Li, Jianxin,Wu, Jia,Liu, Ziwei,Xie, Pengtao,Xiong, Caiming,Pei, Jian,Yu, Philip S.,Sun, Lichao
Source
Journal: Cornell University - arXiv
Identifier
DOI: 10.48550/arxiv.2302.09419
Abstract

Pretrained Foundation Models (PFMs) are regarded as the foundation for downstream tasks across different data modalities. A pretrained foundation model, such as BERT, GPT-3, MAE, DALL-E, or ChatGPT, is trained on large-scale data and provides a reasonable parameter initialization for a wide range of downstream applications. The idea of pretraining behind PFMs plays an important role in the application of large models. Unlike earlier methods that apply convolutional and recurrent modules for feature extraction, the generative pre-training (GPT) method applies the Transformer as the feature extractor and is trained on large datasets with an autoregressive paradigm. Similarly, BERT applies Transformers to train on large datasets as a contextual language model. Recently, ChatGPT has shown promising success with large language models, applying an autoregressive language model with zero-shot or few-shot prompting. With the extraordinary success of PFMs, AI has made waves in a variety of fields over the past few years. Numerous methods, datasets, and evaluation metrics have been proposed in the literature, raising the need for an updated survey. This study provides a comprehensive review of recent research advancements, current and future challenges, and opportunities for PFMs in text, image, graph, and other data modalities. We first review the basic components and existing pretraining methods in natural language processing, computer vision, and graph learning. We then discuss advanced PFMs for other data modalities and unified PFMs that consider data quality and quantity. In addition, we discuss research on the fundamentals of PFMs, including model efficiency and compression, security, and privacy. Finally, we lay out key implications, future research directions, challenges, and open problems.
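The autoregressive pretraining objective the abstract mentions — maximize the likelihood of each token given its left context, i.e. minimize the loss L = -Σ_t log p(x_t | x_<t) — can be illustrated with a deliberately tiny sketch. This is not the survey's code: the corpus, the bigram "model" (simple counting standing in for a Transformer trained by gradient descent), and the function names are all invented for illustration; only the loss form matches the autoregressive paradigm described above.

```python
import math
from collections import Counter, defaultdict

# Toy corpus; a real PFM pretrains on large-scale text, but the
# autoregressive objective has the same shape.
corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# "Pretrain" a bigram model by counting co-occurrences
# (a stand-in for fitting Transformer parameters).
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def prob(prev, nxt):
    """Estimated p(x_t = nxt | x_{t-1} = prev)."""
    total = sum(counts[prev].values())
    return counts[prev][nxt] / total if total else 0.0

def neg_log_likelihood(tokens):
    """Autoregressive loss: -sum_t log p(x_t | x_{t-1})."""
    nll = 0.0
    for prev, nxt in zip(tokens, tokens[1:]):
        p = prob(prev, nxt)
        nll += -math.log(p) if p > 0 else float("inf")
    return nll

print(neg_log_likelihood("the cat sat on the mat .".split()))
```

During pretraining the model parameters (here, the counts) are adjusted to drive this loss down over the corpus; a Transformer does the same with a much richer conditional distribution over the full left context rather than a single previous token.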