Browsing College of Engineering by Subject "Knowledge Distillation"
Enhancing Knowledge Distillation for Text Summarization
(2024, Master's Thesis) In the realm of natural language processing, recent advancements have been significantly shaped by the development of large pretrained Seq2Seq Transformer models, including BART, PEGASUS, and T5. These models have ...