AI · Neutral · arXiv – CS AI · Feb 27

On the Complexity of Neural Computation in Superposition

Researchers establish theoretical foundations for neural network superposition, proving that computing m′ features requires at least Ω(√(m′ log m′)) neurons and Ω(m′ log m′) parameters. The work demonstrates an exponential complexity gap between computing features and merely representing them, and provides the first subexponential bounds on network capacity.
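The gap the summary describes can be made concrete with a small sketch. Assuming, as in the superposition literature, that merely *representing* m′ features takes on the order of log m′ neurons, while the paper's lower bound says *computing* them takes on the order of √(m′ log m′), the two curves diverge quickly (function names and constant factors below are illustrative, not from the paper):

```python
import math

def neurons_to_represent(m):
    # Superposition can pack ~m features into O(log m) neurons
    # (Johnson-Lindenstrauss-style representation; constants omitted).
    return math.ceil(math.log2(m))

def neurons_to_compute(m):
    # Lower bound for *computing* m features: Omega(sqrt(m * log m))
    # neurons (constant factor omitted for illustration).
    return math.ceil(math.sqrt(m * math.log2(m)))

for m in (1_000, 1_000_000):
    print(f"m'={m}: represent ~{neurons_to_represent(m)}, "
          f"compute >= ~{neurons_to_compute(m)}")
```

Even at m′ = 10⁶ the representation cost stays around 20 neurons while the computation lower bound is in the thousands, which is the exponential gap the paper formalizes.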