arXiv – CS AI · Feb 27
On the Complexity of Neural Computation in Superposition
Researchers establish theoretical foundations for neural computation in superposition, proving lower bounds showing that computing m' features requires at least Ω(√(m' log m')) neurons and Ω(m' log m') parameters. The work demonstrates an exponential complexity gap between computing features and merely representing them, and provides the first subexponential bounds on network capacity.
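To give a feel for how these asymptotic lower bounds scale, the sketch below evaluates them numerically. This is purely illustrative: the Ω(·) notation hides constant factors, and the choice of natural logarithm is an assumption, not something the bounds fix.

```python
import math

def neuron_lower_bound(m: int) -> float:
    """Scale of the paper's neuron lower bound, Omega(sqrt(m' log m')).

    Constant factors are omitted; natural log is an arbitrary choice.
    """
    return math.sqrt(m * math.log(m))

def param_lower_bound(m: int) -> float:
    """Scale of the paper's parameter lower bound, Omega(m' log m')."""
    return m * math.log(m)

if __name__ == "__main__":
    # The neuron count needed to *compute* m' features grows far more
    # slowly than m' itself, but far faster than the ~log m' dimensions
    # that superposition-style *representation* is often said to need.
    for m in (10**3, 10**6, 10**9):
        print(f"m'={m:>12,}  neurons >= ~{neuron_lower_bound(m):,.0f}  "
              f"params >= ~{param_lower_bound(m):,.0f}")
```

The gap between the two columns (roughly a quadratic factor) is why parameter count, not neuron count, is often the tighter budget in these constructions.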