AIBullish · arXiv – CS AI · Mar 27 · 6/10
🧠
TAG-MoE: Task-Aware Gating for Unified Generative Mixture-of-Experts
Researchers propose TAG-MoE, a framework that improves unified image generation and editing models by making the model's expert-routing decisions task-aware rather than task-agnostic. The system uses hierarchical task semantic annotation and a predictive alignment regularizer to reduce interference between tasks and improve overall performance.
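The blurb describes task-aware gating at a high level only. Below is a minimal, hypothetical sketch of what conditioning a Mixture-of-Experts router on the task might look like: the router sees a learned task embedding concatenated with the token features, so the same tokens can route differently under "generate" vs. "edit". All names, shapes, and the concatenation scheme are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

D, T, E, K = 16, 4, 8, 2  # feature dim, num tasks, num experts, top-k

# Hypothetical parameters: a task-embedding table and a router weight
# matrix that consumes [token features ; task embedding].
task_embed = rng.normal(size=(T, D)) * 0.1
W_router = rng.normal(size=(2 * D, E)) * 0.1

def softmax(x):
    x = x - x.max(axis=-1, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=-1, keepdims=True)

def task_aware_gate(tokens, task_id):
    """Route each token to its top-k experts, conditioned on the task."""
    n = tokens.shape[0]
    ctx = np.broadcast_to(task_embed[task_id], (n, D))
    logits = np.concatenate([tokens, ctx], axis=-1) @ W_router
    probs = softmax(logits)                      # (n, E) routing weights
    topk = np.argsort(probs, axis=-1)[:, -K:]    # top-k expert indices
    return probs, topk

tokens = rng.normal(size=(5, D))
# Same tokens, different task ids -> generally different routing.
probs_gen, topk_gen = task_aware_gate(tokens, task_id=0)   # e.g. generation
probs_edit, topk_edit = task_aware_gate(tokens, task_id=1) # e.g. editing
```

A task-agnostic router, by contrast, would compute logits from the tokens alone, forcing every task to share one routing pattern; the task embedding is the minimal change that lets routing specialize per task.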