y0news
#cross-entropy-loss · 1 article
AI · Neutral · arXiv – CS AI · 5d ago · 4/103
🧠

Rejuvenating Cross-Entropy Loss in Knowledge Distillation for Recommender Systems

Researchers propose Rejuvenated Cross-Entropy for Knowledge Distillation (RCE-KD) to improve knowledge distillation in recommender systems by addressing limitations of the Cross-Entropy (CE) loss when distilling a teacher model's rankings. The method splits the teacher's top-ranked items into subsets and uses adaptive sampling to better align the distillation objective with its theoretical assumptions.
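The subset-then-sample idea can be illustrated with a minimal sketch. Everything below is an assumption for illustration, not the paper's method: the catalog size, the contiguous subset split, the uniform per-subset sampling (where RCE-KD uses an adaptive strategy), and the plain softmax cross-entropy between teacher and student scores on the sampled items.

```python
import numpy as np

rng = np.random.default_rng(0)
n_items, top_k, n_subsets = 1000, 50, 5

# Hypothetical teacher/student ranking scores over the item catalog.
teacher_scores = rng.normal(size=n_items)
student_scores = rng.normal(size=n_items)

# Take the teacher's top-K items and split them into subsets
# (a contiguous split here, purely for illustration).
top_items = np.argsort(-teacher_scores)[:top_k]
subsets = np.array_split(top_items, n_subsets)

# Sample one item per subset (uniform here; an adaptive sampling
# strategy would reweight this choice), then compute a softmax
# cross-entropy pushing the student toward the teacher's ranking.
sampled = np.array([rng.choice(s) for s in subsets])
t_logits = teacher_scores[sampled]
s_logits = student_scores[sampled]

t_probs = np.exp(t_logits - t_logits.max())
t_probs /= t_probs.sum()
log_s = s_logits - s_logits.max()
log_s -= np.log(np.exp(log_s).sum())
ce_loss = -(t_probs * log_s).sum()
print(float(ce_loss))
```

Sampling from the head of the teacher's list rather than the whole catalog is what keeps the CE target concentrated on items the teacher actually ranks highly, which is the gap in plain CE distillation the summary alludes to.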