AIBullish · arXiv CS AI · 7h ago · 7/10
CoMeT: Collaborative Memory Transformer for Efficient Long Context Modeling
Researchers introduce CoMeT (Collaborative Memory Transformer), an architecture that lets large language models process arbitrarily long sequences with constant memory usage and linear time complexity. The system uses a dual-memory design, combining FIFO queues with gated updates, and reports strong performance on long-context tasks, including sequences of up to 1M tokens and real-world applications.
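The dual-memory idea (a bounded FIFO queue plus a gated long-term store) can be sketched roughly as follows. This is a minimal toy illustration, not the paper's actual method: the class name, gate form, and update rule are all assumptions made for clarity.

```python
from collections import deque
import numpy as np

class DualMemory:
    """Toy sketch of a dual-memory block: a fixed-size FIFO queue for
    recent hidden states plus a gated long-term summary vector.
    All names and the update rule are illustrative assumptions."""

    def __init__(self, dim, fifo_size, seed=0):
        rng = np.random.default_rng(seed)
        self.fifo = deque(maxlen=fifo_size)       # constant-size short-term memory
        self.summary = np.zeros(dim)              # gated long-term summary
        self.w_gate = rng.normal(0, 0.1, (dim,))  # toy gate parameters

    def update(self, h):
        # Before a full FIFO evicts its oldest state, fold that state
        # into the summary through a sigmoid gate, so total memory
        # stays constant no matter how long the input stream is.
        if len(self.fifo) == self.fifo.maxlen:
            old = self.fifo[0]
            g = 1.0 / (1.0 + np.exp(-(self.w_gate * old).sum()))
            self.summary = g * self.summary + (1.0 - g) * old
        self.fifo.append(h)

# Stream 100 "token states" through the block; memory never grows.
mem = DualMemory(dim=8, fifo_size=4)
for t in range(100):
    mem.update(np.full(8, float(t)))
assert len(mem.fifo) == 4
```

The constant-memory and linear-time claims follow from this shape: each token triggers one O(dim) gate update regardless of sequence length, and only `fifo_size` states plus one summary vector are ever retained.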