AIBullish · arXiv CS AI · Feb 27 · 7/107
🧠
NoRA: Breaking the Linear Ceiling of Low-Rank Adaptation via Manifold Expansion
Researchers introduce NoRA (Non-linear Rank Adaptation), a parameter-efficient fine-tuning method that breaks the 'linear ceiling' of traditional LoRA by adding SiLU gating and structural dropout to the low-rank update. NoRA at rank 64 outperforms LoRA at rank 512, demonstrating substantial efficiency gains on complex reasoning tasks.
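To make the mechanism concrete, here is a minimal PyTorch sketch of what such an adapter could look like. This is not the paper's reference implementation: the class name `NoRAAdapter`, the separate gating branch, the reading of "structural dropout" as dropping whole rank components, and all hyperparameters are assumptions layered on the standard LoRA layout `W x + B A x`.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NoRAAdapter(nn.Module):
    """Hypothetical non-linear low-rank adapter.

    Layered on the standard LoRA update B @ A @ x; the SiLU gate and the
    rank-wise ("structural") dropout are assumptions inferred from the
    summary above, not the paper's reference code.
    """

    def __init__(self, in_features, out_features, rank=64, alpha=16.0, p=0.1):
        super().__init__()
        self.down = nn.Linear(in_features, rank, bias=False)  # A: project down to rank r
        self.gate = nn.Linear(in_features, rank, bias=False)  # gating branch (assumed)
        self.up = nn.Linear(rank, out_features, bias=False)   # B: project back up
        self.p = p                                            # structural-dropout rate
        self.scaling = alpha / rank
        nn.init.zeros_(self.up.weight)  # zero init: adapter starts as a no-op, as in LoRA

    def forward(self, x):
        # SiLU gating makes the update non-linear in x, unlike LoRA's linear B A x
        h = self.down(x) * F.silu(self.gate(x))
        if self.training and self.p > 0:
            # "Structural" dropout read as zeroing whole rank components with one
            # Bernoulli mask shared across the batch (an assumption).
            keep = (torch.rand(h.shape[-1], device=h.device) >= self.p).to(h.dtype)
            h = h * keep / (1.0 - self.p)
        return self.up(h) * self.scaling

# Usage: the adapter's output is added to the frozen layer's output.
adapter = NoRAAdapter(768, 768, rank=64)
delta = adapter(torch.randn(4, 768))
```

Under this sketch, even with the extra gating matrix, a rank-64 adapter holds roughly 3·64·d parameters per layer versus 2·512·d for the rank-512 LoRA it is compared against, about a fivefold reduction.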