FedRot-LoRA: Mitigating Rotational Misalignment in Federated LoRA
Researchers propose FedRot-LoRA, a framework that mitigates rotational misalignment in federated LoRA fine-tuning of large language models. Because a LoRA update BA is unchanged by any orthogonal rotation R of its factors, (BR)(RᵀA) = BA, different clients' factors can drift into different rotations of the same subspace even when their products agree, so averaging the A and B factors separately degrades the aggregated adapter. FedRot-LoRA applies orthogonal transformations to align client updates before aggregation, improving training stability and performance without increasing communication costs.
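To make the general idea concrete, here is a minimal sketch of rotation-aware aggregation; it is not the authors' published algorithm. It assumes the alignment is solved as an orthogonal Procrustes problem: each client's A factor is rotated to best match a shared reference A_ref, B is counter-rotated so the product BA is preserved exactly, and only then are the factors averaged. The function names (align_client, aggregate), the reference A_ref, and the shapes are all hypothetical.

```python
import numpy as np

def align_client(B_i, A_i, A_ref):
    """Rotate one client's LoRA factors so A_i best matches A_ref
    (orthogonal Procrustes); the product B_i @ A_i is unchanged."""
    # Solve min_R ||R @ A_i - A_ref||_F over orthogonal R via SVD:
    # with A_ref @ A_i.T = U S Vt, the optimal rotation is R = U @ Vt.
    U, _, Vt = np.linalg.svd(A_ref @ A_i.T)
    R = U @ Vt                      # optimal r x r rotation
    # (B_i @ R.T) @ (R @ A_i) == B_i @ A_i since R is orthogonal
    return B_i @ R.T, R @ A_i

def aggregate(clients, A_ref):
    """Align every client's (B, A) pair to A_ref, then average factor-wise."""
    aligned = [align_client(B, A, A_ref) for B, A in clients]
    B_avg = np.mean([B for B, _ in aligned], axis=0)
    A_avg = np.mean([A for _, A in aligned], axis=0)
    return B_avg, A_avg

# Toy usage: rank-4 adapters for a 16x16 weight, 3 clients.
rng = np.random.default_rng(0)
d, r = 16, 4
A_ref = rng.normal(size=(r, d))                 # e.g. previous round's global A
clients = [(rng.normal(size=(d, r)), rng.normal(size=(r, d))) for _ in range(3)]
B_g, A_g = aggregate(clients, A_ref)
print(B_g.shape, A_g.shape)  # (16, 4) (4, 16)
```

The appeal of this style of alignment is that the counter-rotation leaves each client's effective update BA bit-for-bit intact; only the factorization is changed, so the factors become comparable across clients before averaging at no extra communication cost.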