y0news

#communication-efficiency News & Analysis

5 articles tagged with #communication-efficiency. AI-curated summaries with sentiment analysis and key takeaways from 50+ sources.

🧠 AI · Bullish · arXiv – CS AI · Mar 9 · 7/10

FLoRG: Federated Fine-tuning with Low-rank Gram Matrices and Procrustes Alignment

Researchers propose FLoRG, a new federated learning framework for efficiently fine-tuning large language models that reduces communication overhead by up to 2041x while improving accuracy. The method aggregates low-rank Gram matrices and applies Procrustes alignment to address the aggregation-error and decomposition-drift issues that arise in distributed AI training.
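The Procrustes alignment named in the summary refers to a classic problem with a closed-form SVD solution: find the orthogonal matrix that best maps one factor matrix onto another. A minimal sketch of that general primitive (our own illustration, not the FLoRG implementation; all names are ours):

```python
import numpy as np

def procrustes_align(A, B):
    """Find the orthogonal matrix R minimizing ||A R - B||_F
    (the orthogonal Procrustes problem, solved in closed form via SVD)."""
    U, _, Vt = np.linalg.svd(A.T @ B)
    return U @ Vt

# Toy check: rotate a random low-rank factor, then recover the rotation.
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 3))                   # a low-rank factor
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # a random orthogonal matrix
B = A @ Q                                         # the "drifted" factor
R = procrustes_align(A, B)
print(np.allclose(A @ R, B))  # → True: the recovered R realigns A with B
```

Aligning factors this way before averaging is what makes aggregating low-rank decompositions from different clients meaningful, since each client's factorization is otherwise only defined up to a rotation.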

🤖 AI × Crypto · Bullish · arXiv – CS AI · Mar 3 · 7/10

Communication-Efficient Quantum Federated Learning over Large-Scale Wireless Networks

Researchers present a novel quantum federated learning framework for large-scale wireless networks that combines quantum computing with privacy-preserving federated learning. The study introduces a sum-rate maximization approach based on the quantum approximate optimization algorithm (QAOA), reporting over 100% performance improvement compared to conventional methods.

🧠 AI · Bullish · arXiv – CS AI · Mar 11 · 5/10

FedLECC: Cluster- and Loss-Guided Client Selection for Federated Learning under Non-IID Data

Researchers propose FedLECC, a new client selection strategy for federated learning that improves AI model training efficiency in distributed environments. The method groups clients by data similarity and prioritizes those with higher loss, achieving up to 12% better accuracy while reducing communication overhead by 50%.
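The cluster-then-prioritize-by-loss idea can be sketched in a few lines. This toy (hypothetical names, not the FedLECC implementation) groups clients by their label histograms with a small k-means loop, then selects the highest-loss client from each cluster:

```python
import numpy as np

def select_clients(label_dists, losses, n_clusters=2, per_cluster=1, seed=0):
    """Group clients by data-distribution similarity (k-means on label
    histograms), then pick the highest-loss clients from each cluster."""
    rng = np.random.default_rng(seed)
    # -- simple k-means on the label histograms --
    centers = label_dists[rng.choice(len(label_dists), n_clusters, replace=False)]
    for _ in range(10):
        assign = np.argmin(((label_dists[:, None] - centers[None]) ** 2).sum(-1),
                           axis=1)
        for k in range(n_clusters):
            if np.any(assign == k):
                centers[k] = label_dists[assign == k].mean(0)
    # -- loss-guided selection within each cluster --
    selected = []
    for k in range(n_clusters):
        members = np.flatnonzero(assign == k)
        top = members[np.argsort(losses[members])[::-1][:per_cluster]]
        selected.extend(top.tolist())
    return sorted(selected)

# Four clients: two skewed toward class 0, two toward class 1.
dists = np.array([[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.2, 0.8]])
losses = np.array([0.5, 1.2, 0.3, 2.0])
print(select_clients(dists, losses))  # → [1, 3]: one high-loss client per cluster
```

Sampling across clusters keeps the selected cohort representative of the non-IID population, while the loss ranking focuses each round on the clients the global model currently fits worst.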

🧠 AI · Neutral · arXiv – CS AI · Mar 4 · 4/10

Adaptive Personalized Federated Learning via Multi-task Averaging of Kernel Mean Embeddings

Researchers propose a new Personalized Federated Learning approach that automatically learns optimal collaboration weights between agents without prior knowledge of data heterogeneity. The method uses kernel mean embedding estimation to capture statistical relationships between agents and includes a practical implementation for communication-constrained federated settings.
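Kernel mean embeddings let two agents' data distributions be compared directly from samples, e.g. via the squared maximum mean discrepancy (MMD), which is the distance between their embeddings in the kernel's feature space. A toy sketch of deriving collaboration weights from such similarities (our own illustration, not the paper's estimator):

```python
import numpy as np

def mmd_sq(X, Y, gamma=1.0):
    """Squared MMD between two samples under an RBF kernel -- the distance
    between their kernel mean embeddings, estimated from data."""
    def k(A, B):
        d2 = ((A[:, None] - B[None]) ** 2).sum(-1)
        return np.exp(-gamma * d2)
    return k(X, X).mean() + k(Y, Y).mean() - 2 * k(X, Y).mean()

def collaboration_weights(datasets, gamma=1.0):
    """Row-normalized exp(-MMD^2): statistically similar agents get larger
    weights (a hypothetical weighting rule, for illustration only)."""
    n = len(datasets)
    D = np.array([[mmd_sq(datasets[i], datasets[j], gamma) for j in range(n)]
                  for i in range(n)])
    W = np.exp(-D)
    return W / W.sum(axis=1, keepdims=True)

rng = np.random.default_rng(1)
agents = [rng.normal(0, 1, (50, 2)), rng.normal(0, 1, (50, 2)),
          rng.normal(5, 1, (50, 2))]  # the third agent has shifted data
W = collaboration_weights(agents)
print(W.round(2))  # agent 2's shifted data earns it lower weight from 0 and 1
```

Because the MMD is estimated purely from samples, no prior knowledge of the heterogeneity structure is needed, which is the property the summary highlights.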

🧠 AI · Neutral · arXiv – CS AI · Mar 3 · 4/10

CA-AFP: Cluster-Aware Adaptive Federated Pruning

Researchers propose CA-AFP, a new federated learning framework that combines client clustering with adaptive model pruning to address both statistical and system heterogeneity challenges. The approach achieves better accuracy and fairness while reducing communication costs compared to existing methods, as demonstrated on human activity recognition benchmarks.
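Magnitude pruning with a per-cluster sparsity ratio is the generic primitive behind "cluster-aware adaptive pruning": weaker device clusters keep fewer weights and therefore transmit less. A minimal sketch (the ratios and names are hypothetical, not CA-AFP's actual schedule):

```python
import numpy as np

def prune_by_magnitude(weights, ratio):
    """Zero out the smallest-magnitude fraction `ratio` of the weights,
    returning the pruned copy and the binary keep-mask."""
    flat = np.abs(weights).ravel()
    k = int(ratio * flat.size)
    if k == 0:
        return weights.copy(), np.ones_like(weights)
    thresh = np.partition(flat, k - 1)[k - 1]   # k-th smallest magnitude
    mask = (np.abs(weights) > thresh).astype(weights.dtype)
    return weights * mask, mask

# Hypothetical per-cluster ratios: weaker device clusters prune more.
cluster_ratios = {"strong": 0.3, "weak": 0.7}
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 4))  # a stand-in for one model layer
for name, r in cluster_ratios.items():
    pruned, mask = prune_by_magnitude(W, r)
    print(name, "kept", int(mask.sum()), "of", mask.size, "weights")
```

Only the surviving entries (plus the mask) need to travel between client and server, which is where the communication savings come from; making the ratio cluster-dependent is what adapts the scheme to system heterogeneity.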