Authors
Junkai Liu, Jiayi Wang, Hui Huang, Rui Zhang, Muyun Yang, Tiejun Zhao
Source
Journal: Communications in Computer and Information Science
Date: 2024-01-01
Pages: 49-59
Citations: 1
Identifier
DOI: 10.1007/978-981-97-1717-0_4
Abstract
Large Language Models (LLMs) have received widespread attention in industry. With the popularity of LLMs, almost all NLP tasks are being transformed into prompt-based language generation tasks. Moreover, LLMs can achieve superior results on entirely new tasks without fine-tuning, using only a few in-context examples. This paper describes our participation in the China Health Information Processing Conference (CHIP 2023). We focused on in-context learning (ICL), experimented with different combinations of demonstration retrieval strategies on the given task, and evaluated our proposed optimal strategy combination. The experimental results show that our retrieval strategies based on Chinese-LlaMA2-13B-chat achieved an average score of 40.27 and ranked first among five teams, confirming the effectiveness of our method.
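The abstract does not specify the retrieval strategies themselves, but demonstration retrieval for ICL typically means scoring a pool of labeled examples against the query and placing the most similar ones in the prompt. The sketch below illustrates this general pattern only; the similarity function (token overlap) and all names are illustrative assumptions, not the authors' method.

```python
# Minimal sketch of similarity-based demonstration retrieval for in-context
# learning. Assumptions: a simple token-overlap (Jaccard) scorer stands in for
# whatever retriever the paper actually uses; helper names are hypothetical.

def jaccard(a: str, b: str) -> float:
    """Token-overlap similarity between two texts (illustrative scorer)."""
    sa, sb = set(a.split()), set(b.split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def retrieve_demonstrations(query: str,
                            pool: list[tuple[str, str]],
                            k: int = 2) -> list[tuple[str, str]]:
    """Pick the k (input, output) pairs most similar to the query."""
    ranked = sorted(pool, key=lambda ex: jaccard(query, ex[0]), reverse=True)
    return ranked[:k]

def build_prompt(query: str, demos: list[tuple[str, str]]) -> str:
    """Concatenate retrieved demonstrations, then the new query, for the LLM."""
    parts = [f"Input: {x}\nOutput: {y}" for x, y in demos]
    parts.append(f"Input: {query}\nOutput:")
    return "\n\n".join(parts)
```

In practice the scorer would usually be a dense embedding model rather than token overlap, and the prompt template would follow the target task's format; the selection-then-concatenation structure stays the same.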