arXiv · CS AI · 4h ago
🧠
Reliability Gated Multi-Teacher Distillation for Low Resource Abstractive Summarization
Researchers developed EWAD and CPDP techniques for improving multi-teacher knowledge distillation in low-resource abstractive summarization. Experiments across Bangla and cross-lingual datasets show that logit-level knowledge distillation provides the most reliable gains, while more complex distillation schemes improve short summaries but degrade longer outputs.
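For readers unfamiliar with the baseline being compared, below is a minimal sketch of logit-level multi-teacher distillation with a reliability gate, assuming a PyTorch setup. The blurb does not describe the EWAD or CPDP formulations, so the fixed per-teacher reliability weights here are an illustrative assumption, not the paper's method.

```python
# Sketch: logit-level KD against a reliability-weighted mixture of teachers.
# The reliability weights are assumed/hypothetical; the paper's actual
# gating (EWAD/CPDP) is not specified in this summary.
import torch
import torch.nn.functional as F

def multi_teacher_kd_loss(student_logits: torch.Tensor,
                          teacher_logits: list[torch.Tensor],
                          reliability: torch.Tensor,
                          temperature: float = 2.0) -> torch.Tensor:
    """KL(teacher mixture || student) on temperature-softened logits.

    student_logits: (batch, seq_len, vocab)
    teacher_logits: list of tensors shaped like student_logits
    reliability:    (num_teachers,) non-negative gate weights summing to 1
    """
    # Soften each teacher's distribution and mix them by reliability.
    teacher_probs = torch.stack(
        [F.softmax(t / temperature, dim=-1) for t in teacher_logits]
    )  # (num_teachers, batch, seq_len, vocab)
    mixed = torch.einsum("k,kbsv->bsv", reliability, teacher_probs)

    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    # batchmean KL, scaled by T^2 as is standard in logit distillation.
    return F.kl_div(log_student, mixed, reduction="batchmean") * temperature ** 2

# Toy usage: two teachers, the second gated down as less reliable.
if __name__ == "__main__":
    b, s, v = 2, 5, 100
    student = torch.randn(b, s, v, requires_grad=True)
    teachers = [torch.randn(b, s, v), torch.randn(b, s, v)]
    weights = torch.tensor([0.7, 0.3])
    loss = multi_teacher_kd_loss(student, teachers, weights)
    loss.backward()
    print(f"KD loss: {loss.item():.4f}")
```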