AI · Neutral · arXiv – CS AI · 10h ago · 6/10
🧠
Rethinking Random Transformers as Adaptive Sequence Smoothers for Sleep Staging
Researchers challenge the assumption that Transformers improve sleep staging by learning complex temporal dependencies, showing instead that random, untrained Transformers substantially boost performance by acting as adaptive sequence smoothers. The findings suggest that sleep-staging gains stem more from architectural inductive bias than from parameter learning, pointing toward simpler, more efficient models suited to edge deployment in healthcare systems.
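The core idea, a frozen, randomly initialized attention layer that mixes per-epoch features, can be sketched in a few lines of NumPy. The dimensions and random "features" below are illustrative assumptions, not the paper's actual setup:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
T, d = 10, 8  # toy sizes: a sequence of 10 sleep epochs, feature dim 8

# Stand-in per-epoch features (e.g. encoder outputs for 30 s epochs);
# random here purely for illustration
x = rng.standard_normal((T, d))

# Random, untrained projection matrices -- never fitted to any data
Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))

q, k, v = x @ Wq, x @ Wk, x @ Wv
attn = softmax(q @ k.T / np.sqrt(d), axis=-1)  # (T, T) mixing weights
smoothed = attn @ v  # each epoch becomes a weighted mix of its neighbours
```

Each row of `attn` is a convex combination (non-negative, summing to 1), so even with random weights the layer performs an input-dependent, i.e. adaptive, smoothing across the sequence of epochs, which is the mechanism the paper credits for the performance boost.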