y0news

#parameter-orthogonality News & Analysis

1 article tagged with #parameter-orthogonality. AI-curated summaries with sentiment analysis and key takeaways from 50+ sources.

1 article
AI · Bullish · arXiv – CS AI · 6h ago · 6/10
🧠

Decomposing the Basic Abilities of Large Language Models: Mitigating Cross-Task Interference in Multi-Task Instruct-Tuning

Researchers propose BADIT, a novel approach to improve large language model training by decomposing shared parameters into orthogonal basic abilities, mitigating the cross-task interference problem that degrades performance in multi-task instruction-tuning. The method outperforms existing solutions on the SuperNI benchmark across 6 LLMs by maintaining parameter orthogonality through spherical clustering during training.
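The summary describes clustering parameter directions on the unit sphere so that distinct "basic abilities" stay near-orthogonal. A minimal sketch of that idea (not the paper's BADIT implementation; the function name, toy data, and two-ability setup are illustrative assumptions) is spherical k-means over per-task parameter-update vectors, with cross-centroid cosine similarity as a proxy for interference:

```python
import numpy as np

def spherical_kmeans(X, k, iters=50):
    """Cluster rows of X by cosine similarity (spherical k-means).

    Illustrative stand-in for the clustering step the summary mentions;
    BADIT's actual procedure is not specified here.
    """
    X = X / np.linalg.norm(X, axis=1, keepdims=True)  # project onto unit sphere
    centroids = X[:k].copy()                          # deterministic init
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        sims = X @ centroids.T                        # cosine similarity to centroids
        labels = sims.argmax(axis=1)
        for j in range(k):
            members = X[labels == j]
            if len(members):                          # skip empty clusters
                c = members.sum(axis=0)
                centroids[j] = c / np.linalg.norm(c)
    return labels, centroids

# Toy data: per-task update vectors scattered around two ability directions.
rng = np.random.default_rng(1)
base = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
X = np.vstack([base[i % 2] + 0.05 * rng.standard_normal(3) for i in range(40)])

labels, centroids = spherical_kmeans(X, k=2)
# Cross-centroid cosine near 0 means the recovered "abilities" are near-orthogonal.
cross_sim = float(centroids[0] @ centroids[1])
print(abs(cross_sim))
```

With well-separated directions, the two centroids land near the original axes and the cross-similarity stays close to zero; in the interference framing, a rising cross-similarity would signal that two abilities are sharing (and fighting over) the same parameter directions.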