Toeplitz MLP Mixers are Low Complexity, Information-Rich Sequence Models
Researchers introduce the Toeplitz MLP Mixer (TMM), a transformer alternative that replaces the attention mechanism with triangular-masked (causal) Toeplitz matrix multiplication, achieving O(dn log n) training complexity and O(dn) inference complexity. TMMs demonstrate superior training efficiency, information retention, and in-context learning performance compared to existing sub-quadratic architectures.
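The O(dn log n) training cost follows from a standard fact: multiplying by a lower-triangular (causal) Toeplitz matrix is a causal convolution, which can be computed with FFTs rather than an explicit O(n^2) matrix product. The sketch below illustrates this trick with NumPy; it is an assumption-laden illustration of the underlying linear algebra, not the paper's actual implementation (function names and shapes are hypothetical).

```python
import numpy as np

def causal_toeplitz_mix(x, kernel):
    """Apply a lower-triangular Toeplitz matrix, defined by its first
    column `kernel` (length n), to each of the d feature columns of
    x (shape n x d) via FFT-based causal convolution.
    Cost: O(d n log n) instead of O(d n^2) for an explicit matmul.
    """
    n, _ = x.shape
    m = 2 * n  # zero-pad so the linear convolution has no circular wrap-around
    K = np.fft.rfft(kernel, m)              # spectrum of the Toeplitz column
    X = np.fft.rfft(x, m, axis=0)           # spectra of the token sequences
    y = np.fft.irfft(K[:, None] * X, m, axis=0)[:n]  # keep the causal part
    return y

# Sanity check against the explicit lower-triangular Toeplitz matmul.
rng = np.random.default_rng(0)
n, d = 8, 3
kernel = rng.standard_normal(n)
x = rng.standard_normal((n, d))
# T[i, j] = kernel[i - j] for i >= j, else 0 (triangular mask = causality).
T = np.array([[kernel[i - j] if i >= j else 0.0 for j in range(n)]
              for i in range(n)])
assert np.allclose(causal_toeplitz_mix(x, kernel), T @ x)
```

The O(dn) inference claim is consistent with the recurrent view of the same operation: at generation time each new token is produced by a running convolution update rather than by recomputing the full product.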