arXiv – CS AI · 9h ago
Flat Channels to Infinity in Neural Loss Landscapes
Researchers identify and characterize 'channels to infinity' in neural network loss landscapes: flat regions along which the outgoing weights of certain neurons diverge toward infinity while their incoming weight vectors converge to a shared value. Gradient-based optimizers frequently reach these structures, which in the limit collapse functionally to gated linear units, revealing surprising computational properties of fully connected layers.
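A minimal numerical sketch of the limiting behavior described above, under illustrative assumptions (tanh activations; the weight vectors `w`, `u` and constants `b`, `c` are hypothetical, not taken from the paper): two neurons whose input weights converge to a shared vector while their output weights diverge like ±c/ε behave, as ε → 0, like a gated linear unit — a linear readout gated by the activation's derivative.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=5)   # one input sample (illustrative)
w = rng.normal(size=5)   # shared input weight vector both neurons converge to
u = rng.normal(size=5)   # direction separating the two neurons' input weights
b, c = 0.7, 1.3          # free coefficients (hypothetical values)

def pair_output(eps):
    """Two tanh neurons: input weights w + eps*u and w (converging),
    output weights c/eps + b and -c/eps (diverging as eps -> 0)."""
    a1, a2 = c / eps + b, -c / eps
    return a1 * np.tanh((w + eps * u) @ x) + a2 * np.tanh(w @ x)

# Limiting gated linear unit: a linear term (u @ x) gated by
# tanh'(w @ x) = 1 - tanh(w @ x)**2, plus a residual tanh neuron.
glu_limit = c * (u @ x) * (1 - np.tanh(w @ x) ** 2) + b * np.tanh(w @ x)

for eps in (1e-1, 1e-2, 1e-3):
    print(eps, abs(pair_output(eps) - glu_limit))
```

The gap shrinks linearly with ε, so moving out along the channel (larger output weights, closer input weights) never changes the loss much: the pair already computes the gated linear unit to first order.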