AIBullish · arXiv · CS AI · 7h ago · 6/10
JumpLoRA: Sparse Adapters for Continual Learning in Large Language Models
Researchers introduce JumpLoRA, a framework that uses sparse adapters with JumpReLU gating to enable continual learning in large language models while mitigating catastrophic forgetting. The method dynamically isolates task-specific parameters, outperforming state-of-the-art approaches such as ELLA and substantially improving on IncLoRA.
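The summary names JumpReLU gating as the sparsification mechanism. As a rough illustration only (the paper's actual architecture is not given here), a JumpReLU gate zeroes any activation below a threshold and passes the rest through unchanged; applied to a LoRA adapter's low-rank intermediate, this leaves most adapter units inactive per input. The function names, shapes, and placement of the gate below are assumptions, not the authors' implementation:

```python
import numpy as np

def jumprelu(z, theta):
    # JumpReLU: pass z through unchanged where it exceeds the
    # threshold theta, hard-zero it elsewhere (a gate, not a shift).
    return z * (z > theta)

def sparse_lora_forward(x, W, A, B, theta):
    # Hypothetical sketch: base linear layer plus a LoRA update
    # whose rank-r intermediate activation is sparsified by the
    # JumpReLU gate, so only a few adapter units fire per input.
    h = x @ A.T              # project input down to rank-r space
    h = jumprelu(h, theta)   # most entries gate to zero -> sparse adapter
    return x @ W.T + h @ B.T # frozen base output + sparse low-rank update
```

Because sub-threshold units contribute exactly zero, a per-task threshold could in principle keep different tasks routed through largely disjoint adapter units, which is one plausible reading of "dynamically isolates task-specific parameters".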