Fed: Budget LLMs: Active Knowledge Distillation for Efficient Classification of Large Text Corpora (English version)
Abstract
Large Language Models (LLMs) are highly accurate in classification tasks; however, substantial computational and financial costs hinder their large-scale deployment in dynamic environments. Knowledge Distillation (KD), in which an LLM "teacher" trains a smaller and more efficient "student" model, offers a promising solution to this problem. However, the distillation process itself often remains costly for large datasets, since it requires the teacher to label a vast number of samples while incurring …
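The teacher-student setup described above can be sketched minimally in NumPy. This is not the paper's method: a fixed linear scorer stands in for the LLM teacher, the "documents" are random feature vectors, and all names and hyperparameters are illustrative. The student is a small linear classifier fit to the teacher's soft label distributions by minimizing soft cross-entropy.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Hypothetical stand-in for the LLM teacher: a fixed linear scorer
# producing soft label distributions over 3 classes.
W_teacher = rng.normal(size=(5, 3))
X = rng.normal(size=(200, 5))           # 200 "documents" as 5-dim features
teacher_probs = softmax(X @ W_teacher)  # soft labels from the teacher

# Student: a small linear classifier trained to match the teacher's
# distributions (distillation via soft cross-entropy).
W_student = np.zeros((5, 3))
lr = 0.5
for _ in range(300):
    p = softmax(X @ W_student)
    # gradient of mean soft cross-entropy w.r.t. W_student
    grad = X.T @ (p - teacher_probs) / len(X)
    W_student -= lr * grad

def soft_ce(p_teacher, p_student):
    return -(p_teacher * np.log(p_student + 1e-12)).sum(axis=1).mean()

loss = soft_ce(teacher_probs, softmax(X @ W_student))
```

The costly step the abstract refers to is producing `teacher_probs`: with a real LLM teacher, each of those labels is an expensive model call, which is exactly what an active, budget-aware distillation scheme would try to minimize.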


