- Reference links:
- A Comprehensive Survey of Continual Learning: Theory, Method and Application, TPAMI 2024, THU: as of 2025-06-10, cited by 1034
- Efficient Continual Pre-training for Building Domain Specific Large Language Models, ACL 2024, Amazon: as of 2025-06-10, cited by 47
- Continual Learning for Large Language Models: A Survey, arXiv 2024, Monash University & Griffith University: as of 2025-06-10, cited by 145
- Continual Learning of Large Language Models: A Comprehensive Survey, ACM Computing Surveys 2024, DeepMind: as of 2025-06-10, cited by 96
- Continual Learning with Pre-Trained Models: A Survey, arXiv 2024, Nanjing University: as of 2025-06-10, cited by 98
- Continual Pre-Training of Large Language Models: How to (re)warm your model?, arXiv 2023-08, Concordia University: as of 2025-06-10, cited by 100
- Recent Advances of Foundation Language Models-based Continual Learning: A Survey, arXiv 2024-09 (ACM Computing Surveys 2025), ECNU: as of 2025-06-10, cited by 21
- Simple and Scalable Strategies to Continually Pre-train Large Language Models, arXiv 2024: as of 2025-06-10, cited by 82
  - Noted in "LLM 预训练和评估奖励模型的技巧" (Tips for LLM pre-training and evaluating reward models) by 北方的郎 on Zhihu: this recent paper offers valuable insights on how to continue pre-training an LLM with new data.
- Continual Learning of Natural Language Processing Tasks: A Survey, arXiv 2022, UIC: as of 2025-06-10, cited by 101
- Continual Lifelong Learning in Natural Language Processing: A Survey, 2020, UPC: as of 2025-06-10, cited by 273
- ECONET: Effective Continual Pretraining of Language Models for Event Temporal Reasoning, 2021, UCLA & USC: as of 2025-06-10, cited by 84
- EcomGPT-CT: Continual Pre-training of E-commerce Large Language Models with Semi-structured Data, arXiv 2023-12, THU & Alibaba: as of 2025-06-11, cited by 26
- ChatHome: Development and Evaluation of a Domain-Specific Language Model for Home Renovation, arXiv 2023-07-28, Beike: as of 2025-06-16, cited by 33