arXiv – CS AI · 10h ago
AdaPreLoRA: Adafactor Preconditioned Low-Rank Adaptation
AdaPreLoRA targets a core challenge in fine-tuning large language models: adapter optimizers must balance update quality against memory cost. The method combines Adafactor-style preconditioning with Low-Rank Adaptation (LoRA), and reports competitive or superior performance across multiple benchmarks while keeping memory overhead comparable to that of standard LoRA optimizers.
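The summary does not spell out the paper's exact algorithm, but the two publicly documented ingredients it names are well known: LoRA adds a trainable low-rank update `B @ A` to a frozen weight, and Adafactor replaces a full second-moment matrix with factored row/column averages, cutting optimizer memory from O(nm) to O(n + m). A minimal sketch of how the two could be combined — applying factored preconditioning to each LoRA factor's gradient — might look as follows. All names (`adafactor_precondition`, `LoRALinear`) are hypothetical illustrations, not the paper's implementation, and the sketch omits Adafactor's update clipping and relative step sizing.

```python
import numpy as np

def adafactor_precondition(grad, row_avg, col_avg, beta2=0.999, eps=1e-30):
    """Adafactor-style factored second moment: keep only row/column EMAs
    of grad**2, then reconstruct a rank-1 estimate V ~ outer(r, c) / mean(r).
    (Hypothetical helper; clipping and relative step size are omitted.)"""
    row_avg = beta2 * row_avg + (1 - beta2) * (grad ** 2).mean(axis=1)
    col_avg = beta2 * col_avg + (1 - beta2) * (grad ** 2).mean(axis=0)
    v = np.outer(row_avg, col_avg) / (row_avg.mean() + eps)
    return grad / (np.sqrt(v) + eps), row_avg, col_avg

class LoRALinear:
    """Frozen weight W plus trainable low-rank update scale * B @ A (standard
    LoRA), with factored second-moment accumulators for each factor."""
    def __init__(self, d_in, d_out, rank=4, alpha=8.0, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.standard_normal((d_out, d_in)) * 0.02  # frozen
        self.A = rng.standard_normal((rank, d_in)) * 0.01   # trainable
        self.B = np.zeros((d_out, rank))                    # trainable, zero-init
        self.scale = alpha / rank
        # Per-factor accumulators: O(rows + cols) memory, not O(rows * cols)
        self.rA = np.zeros(rank);  self.cA = np.zeros(d_in)
        self.rB = np.zeros(d_out); self.cB = np.zeros(rank)

    def forward(self, x):
        return x @ (self.W + self.scale * self.B @ self.A).T

    def step(self, grad_A, grad_B, lr=1e-2):
        # Precondition each low-rank factor's gradient independently
        pA, self.rA, self.cA = adafactor_precondition(grad_A, self.rA, self.cA)
        pB, self.rB, self.cB = adafactor_precondition(grad_B, self.rB, self.cB)
        self.A -= lr * pA
        self.B -= lr * pB
```

The memory claim in the summary follows from the accumulator shapes: for a rank-r adapter on a d_out × d_in layer, the optimizer state is just four vectors (r + d_in + d_out + r floats), comparable to plain LoRA optimizers and far below Adam's two full moment buffers per factor.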