y0news
#linear-complexity · 3 articles
AI · Bullish · arXiv – CS AI · 6d ago · 7/10

RMAAT: Astrocyte-Inspired Memory Compression and Replay for Efficient Long-Context Transformers

Researchers introduce RMAAT (Recurrent Memory Augmented Astromorphic Transformer), a new architecture inspired by brain astrocyte cells that addresses the quadratic attention cost of Transformer models on long sequences. The system uses recurrent memory tokens and adaptive compression to achieve linear complexity in sequence length while maintaining competitive accuracy on benchmarks.
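A minimal sketch of the general idea behind recurrent memory tokens (not the paper's implementation; all names, shapes, and the mean-pooling "compression" are illustrative): each chunk attends only to itself plus a fixed-size carried memory, so per-chunk cost is constant and total cost grows linearly with sequence length.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def recurrent_memory_pass(tokens, n_mem=8, chunk=64, d=32, seed=0):
    """Process a long sequence chunk-by-chunk with a fixed-size memory.

    Each chunk attends to itself plus the carried memory, so per-chunk
    attention costs O(chunk * (chunk + n_mem)); with chunk and n_mem
    fixed, total cost is linear in sequence length (vs. quadratic for
    full attention).
    """
    rng = np.random.default_rng(seed)
    Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))
    memory = np.zeros((n_mem, d))          # compressed summary of the past
    outputs = []
    for start in range(0, len(tokens), chunk):
        x = tokens[start:start + chunk]
        ctx = np.concatenate([memory, x])  # memory + current chunk only
        att = softmax((x @ Wq) @ (ctx @ Wk).T / np.sqrt(d))
        y = att @ (ctx @ Wv)
        outputs.append(y)
        # stand-in "compression": refresh memory slots by mean-pooling
        # segments of the chunk output (the paper's adaptive scheme differs)
        memory = np.stack([seg.mean(axis=0) for seg in np.array_split(y, n_mem)])
    return np.concatenate(outputs)

x = np.random.default_rng(1).standard_normal((256, 32))
out = recurrent_memory_pass(x)
print(out.shape)  # (256, 32)
```

Doubling the sequence length doubles the number of chunks but leaves the per-chunk work unchanged, which is the source of the linear scaling.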

AI · Bullish · arXiv – CS AI · Feb 27 · 7/10

ViT-Linearizer: Distilling Quadratic Knowledge into Linear-Time Vision Models

Researchers developed ViT-Linearizer, a distillation framework that transfers Vision Transformer knowledge into linear-time models, addressing quadratic complexity issues for high-resolution inputs. The method achieves 84.3% ImageNet accuracy while providing significant speedups, bridging the gap between efficient RNN-based architectures and transformer performance.
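The distillation step can be sketched with a generic soft-label objective (a standard knowledge-distillation loss, not necessarily ViT-Linearizer's exact objective; temperature, weighting, and all names are illustrative): the linear-time student is trained to match the quadratic teacher's softened class distribution.

```python
import numpy as np

def softmax(z, t=1.0):
    e = np.exp((z - z.max(axis=-1, keepdims=True)) / t)
    return e / e.sum(axis=-1, keepdims=True)

def distill_loss(teacher_logits, student_logits, t=2.0, alpha=0.5, labels=None):
    """Soft-label distillation: KL between the teacher's and student's
    temperature-softened class distributions, optionally mixed with a
    hard-label cross-entropy term on ground-truth labels."""
    p = softmax(teacher_logits, t)
    q = softmax(student_logits, t)
    # t*t rescales gradients to stay comparable across temperatures
    kl = np.sum(p * (np.log(p + 1e-9) - np.log(q + 1e-9)), axis=-1).mean() * t * t
    if labels is None:
        return kl
    ce = -np.log(softmax(student_logits)[np.arange(len(labels)), labels] + 1e-9).mean()
    return alpha * kl + (1 - alpha) * ce

teacher = np.array([[1.0, 2.0, 3.0]])   # e.g. ViT teacher logits
student = np.array([[0.5, 1.5, 3.5]])   # e.g. linear-time student logits
print(distill_loss(teacher, student))
```

When the student's logits match the teacher's exactly, the KL term vanishes; the loss penalizes the student only where its predicted distribution diverges from the teacher's.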