arXiv · CS AI · Feb
On the Complexity of Neural Computation in Superposition
Researchers establish theoretical foundations for neural network superposition, proving lower bounds that require at least Ω(√(m' log m')) neurons and Ω(m' log m') parameters to compute m' features. The work demonstrates an exponential complexity gap between computing features and merely representing them, and provides the first subexponential bounds on network capacity.
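To get a feel for how these lower bounds scale, here is a minimal sketch (the function names are illustrative, and the hidden constants inside the Ω(·) bounds are unspecified, so only the leading terms are computed):

```python
import math

def neuron_lower_bound(m_prime: int) -> float:
    """Leading term of the Omega(sqrt(m' log m')) neuron lower bound."""
    return math.sqrt(m_prime * math.log(m_prime))

def parameter_lower_bound(m_prime: int) -> float:
    """Leading term of the Omega(m' log m') parameter lower bound."""
    return m_prime * math.log(m_prime)

# The neuron bound grows far more slowly than the feature count itself,
# which is what makes computation in superposition interesting:
for m in (1_000, 1_000_000):
    print(f"m'={m}: neurons >= ~{neuron_lower_bound(m):.0f}, "
          f"params >= ~{parameter_lower_bound(m):.0f}")
```

Note the sublinear neuron requirement: a million features need only on the order of tens of thousands of neurons, whereas the parameter count must grow slightly superlinearly in m'.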