AI Neutral · arXiv – CS AI · 7h ago · 6/10
🧠
Why Geometric Continuity Emerges in Deep Neural Networks: Residual Connections and Rotational Symmetry Breaking
Researchers identify why deep neural networks develop geometric continuity, a regime in which weight matrices in adjacent layers align along similar directions. The proposed mechanism combines two ingredients: residual connections, which synchronize gradient flow across layers, and symmetry-breaking nonlinearities, which anchor the weights to a shared coordinate frame and prevent the rotational drift that would otherwise destabilize the network's structure.
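The alignment the paper describes is straightforward to probe empirically. A minimal sketch of one such diagnostic, under the assumption that "aligning in similar directions" can be proxied by cosine similarity between flattened weight matrices of successive layers (the function name `layer_alignment` and this particular metric are illustrative choices, not the paper's own measurement):

```python
import numpy as np

def layer_alignment(weights):
    """Cosine similarity between successive flattened weight matrices.

    Values near 1 would indicate the kind of cross-layer geometric
    continuity described in the summary: adjacent layers pointing in
    similar directions in weight space. Values near 0 indicate no
    preferred shared direction.
    """
    sims = []
    for a, b in zip(weights[:-1], weights[1:]):
        va, vb = a.ravel(), b.ravel()
        sims.append(float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb))))
    return sims

rng = np.random.default_rng(0)
w = rng.standard_normal((8, 8))

# A layer compared with itself is perfectly aligned (similarity ~ 1.0);
# two independent random matrices are nearly orthogonal in weight space.
print(layer_alignment([w, w]))
print(layer_alignment([w, rng.standard_normal((8, 8))]))
```

In practice one would run a diagnostic like this on the trained weights of residual versus non-residual networks; the paper's claim predicts markedly higher alignment in the residual case.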