MinMax Recurrent Neural Cascades
Researchers introduce MinMax Recurrent Neural Cascades, a neural network architecture that sidesteps the vanishing/exploding gradient problem by building its recurrence on MinMax algebra. The model offers theoretical expressivity comparable to finite-state machines while keeping gradients bounded, and it performs competitively both on synthetic tasks and in a 127M-parameter language model.
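To give intuition for why a min/max-based recurrence keeps gradients bounded, here is a minimal NumPy sketch. The cell structure, the weight names `W_min` and `W_max`, and the averaging blend are illustrative assumptions, not the paper's actual definition: the point is only that `min` and `max` pass gradients through with subderivatives of 0 or 1, so repeated application cannot multiplicatively shrink or blow up gradients the way repeated summation-plus-nonlinearity can.

```python
import numpy as np

def minmax_step(h, x, W_min, W_max):
    """One hypothetical MinMax-style recurrent update (a sketch, not the
    paper's cell). The new state combines the previous state with input
    candidates via elementwise min and max instead of summation; each
    partial derivative through min/max is 0 or 1, so gradients stay
    bounded across time steps."""
    cand_min = np.minimum(h, W_min @ x)   # elementwise min with an input candidate
    cand_max = np.maximum(h, W_max @ x)   # elementwise max with an input candidate
    return 0.5 * (cand_min + cand_max)    # blend; output stays between the candidates

# Tiny usage example with assumed shapes (state dim 2, input dim 1).
h = np.array([0.0, 1.0])
x = np.array([1.0])
W = np.array([[0.5], [0.5]])
new_h = minmax_step(h, x, W, W)
```

With these toy values, `W @ x` is `[0.5, 0.5]`, so the min candidate is `[0.0, 0.5]`, the max candidate is `[0.5, 1.0]`, and the blended state is `[0.25, 0.75]`.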