AIBullish · arXiv CS AI · Feb 27
Decoder-based Sense Knowledge Distillation
Researchers have developed Decoder-based Sense Knowledge Distillation (DSKD), a new framework that integrates lexical resources into decoder-style large language models during training. The method enhances knowledge distillation performance while enabling generative models to inherit structured semantics without requiring dictionary lookup during inference.
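The summary above describes distillation in which dictionary-derived sense information shapes the training target only, so inference needs no lexicon access. The paper's exact loss is not given here; the sketch below is a hypothetical illustration of that general idea: the student's cross-entropy on the gold token is combined with a KL term toward a teacher distribution interpolated with a sense distribution. All function names and the `alpha`/`beta` weights are assumptions for illustration, not the authors' formulation.

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of raw scores.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def sense_kd_loss(student_logits, teacher_probs, sense_probs,
                  gold_idx, alpha=0.5, beta=0.3):
    """Hypothetical sense-aware distillation loss (illustrative only).

    The lexical-sense distribution is mixed into the teacher target at
    training time; at inference the student runs on its own, with no
    dictionary lookup required.
    """
    p = softmax(student_logits)
    # Training-time target: teacher distribution softened toward the
    # dictionary-derived sense distribution.
    target = [(1 - beta) * t + beta * s
              for t, s in zip(teacher_probs, sense_probs)]
    ce = -math.log(p[gold_idx])                       # supervised term
    kl = sum(t * math.log(t / q)                      # distillation term
             for t, q in zip(target, p) if t > 0)
    return (1 - alpha) * ce + alpha * kl
```

A student whose logits already favor the gold token and the mixed target incurs a smaller loss than one that favors a different token, which is the gradient signal that lets the generative model absorb the structured semantics during training.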