y0news

#computational-complexity News & Analysis

6 articles tagged with #computational-complexity. AI-curated summaries with sentiment analysis and key takeaways from 50+ sources.

AI · Bullish · arXiv – CS AI · Mar 4 · 7/10

Neural Paging: Learning Context Management Policies for Turing-Complete Agents

Researchers introduce Neural Paging, a new architecture that addresses the computational bottleneck of finite context windows in Large Language Models by implementing a hierarchical system that decouples reasoning from memory management. The approach reduces computational complexity from O(N²) to O(N·K²) for long-horizon reasoning tasks, potentially enabling more efficient AI agents.
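As a rough illustration of the claimed scaling (the function names and this simple cost model are our assumptions, not details from the paper), the two costs can be compared directly for a context of N tokens and an active page of K tokens:

```python
def full_attention_cost(n: int) -> int:
    """Total attention cost over an N-token context: O(N^2)."""
    return n * n


def paged_attention_cost(n: int, k: int) -> int:
    """Cost if each of the N steps attends only within a K-token
    active page: O(N * K^2). A deliberate simplification of the claim."""
    return n * k * k
```

For N = 1,000,000 and K = 128, the paged cost is under 2% of the full-attention cost; the advantage appears whenever K² is much smaller than N.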

AI · Bullish · arXiv – CS AI · Mar 4 · 7/10

Channel-Adaptive Edge AI: Maximizing Inference Throughput by Adapting Computational Complexity to Channel States

Researchers developed a new channel-adaptive AI algorithm that maximizes inference throughput in 6G edge computing networks by dynamically adjusting computational complexity based on channel conditions. The system uses integrated communication and computation (IC²) to optimize both feature compression and model complexity for mobile edge inference.
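A minimal sketch of the trade-off being described, under assumptions of our own: a table of hypothetical compression variants (heavier on-device compression costs more FLOPs but sends fewer bits), Shannon capacity as a stand-in for channel state, and end-to-end throughput as the bottleneck of compute rate and uplink rate. None of these numbers come from the paper.

```python
import math

# Hypothetical variants: (name, on-device FLOPs per inference, feature bits sent).
# Heavier compression trades more compute for a smaller payload.
VARIANTS = [
    ("light", 1e8, 64_000),
    ("medium", 5e8, 16_000),
    ("heavy", 2e9, 4_000),
]


def channel_rate_bps(snr_db: float, bandwidth_hz: float = 1e6) -> float:
    """Shannon capacity as a toy proxy for the current channel state."""
    snr = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr)


def best_variant(snr_db: float, device_flops: float = 1e11):
    """Pick the variant maximizing inferences/sec, where throughput is
    bottlenecked by either device compute or the uplink."""
    rate = channel_rate_bps(snr_db)

    def throughput(variant):
        _, flops, bits = variant
        return min(device_flops / flops, rate / bits)

    return max(VARIANTS, key=throughput)
```

With these made-up constants the policy behaves as the summary suggests: a poor channel (−5 dB) selects heavy compression, a moderate one (10 dB) selects medium, and a strong one (45 dB) selects light compression.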

AI · Bullish · arXiv – CS AI · Mar 3 · 7/10

Expressive Power of Implicit Models: Rich Equilibria and Test-Time Scaling

Researchers provide a mathematical proof that implicit models can achieve greater expressive power through increased test-time computation, explaining how these memory-efficient architectures can match larger explicit networks. The study validates this scaling property across image reconstruction, scientific computing, operations research, and LLM reasoning domains.
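The test-time-scaling idea can be sketched with a toy implicit (equilibrium) layer of our own construction, not taken from the paper: the layer's output is a fixed point found by iteration, so spending more iterations at inference time yields a tighter equilibrium.

```python
import math


def implicit_layer(x: float, w: float = 0.5, iters: int = 10) -> float:
    """Toy implicit model: iterate z <- tanh(w*z + x) toward the
    equilibrium z* = tanh(w*z* + x). Since |w| < 1 the map is a
    contraction, so the iteration converges."""
    z = 0.0
    for _ in range(iters):
        z = math.tanh(w * z + x)
    return z


def equilibrium_residual(x: float, iters: int) -> float:
    """How far the iterate is from being a true fixed point."""
    z = implicit_layer(x, iters=iters)
    return abs(z - math.tanh(0.5 * z + x))
```

More test-time compute (iterations) shrinks the residual geometrically, which is the scaling behavior the paper formalizes for richer equilibria.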

AI · Neutral · arXiv – CS AI · Feb 27 · 7/10

On the Complexity of Neural Computation in Superposition

Researchers establish theoretical foundations for neural network superposition, proving lower bounds that require at least Ω(√m' log m') neurons and Ω(m' log m') parameters to compute m' features. The work demonstrates an exponential complexity gap between computing and merely representing features, and provides the first subexponential bounds on network capacity.
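To get a feel for what the bounds imply, they can be evaluated numerically. The constants, the log base, and the bracketing of the square root below are our reading of the O-notation in the summary, not the paper's exact statement.

```python
import math


def superposition_lower_bounds(m_features: int):
    """Evaluate the summary's asymptotic lower bounds with constants
    set to 1 (an assumption for illustration):
      neurons    >= ~ sqrt(m' * log m')
      parameters >= ~ m' * log m'
    """
    log_m = math.log2(m_features)
    return math.sqrt(m_features * log_m), m_features * log_m
```

For m' = 1,000,000 features this gives roughly 4,500 neurons and 2×10⁷ parameters: far fewer neurons than features, which is the point of superposition, yet not arbitrarily few.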

AI · Neutral · arXiv – CS AI · Mar 3 · 7/10

Universal NP-Hardness of Clustering under General Utilities

Researchers prove that clustering problems in machine learning are universally NP-hard, providing a theoretical explanation for why clustering algorithms often produce unstable results. The study demonstrates that major clustering methods such as k-means and spectral clustering inherit this fundamental computational intractability, explaining common failure modes such as convergence to local optima.
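The local-optimum failure mode the hardness result explains is easy to reproduce. Below is a minimal one-dimensional sketch (toy data and code of our own, not from the paper): Lloyd's k-means converges to different solutions depending on initialization, and one of them is far from optimal.

```python
def kmeans(points, init_centers, iters=20):
    """Plain Lloyd's algorithm in one dimension."""
    centers = list(init_centers)
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for p in points:
            nearest = min(range(len(centers)), key=lambda j: abs(p - centers[j]))
            clusters[nearest].append(p)
        # Keep a center in place if its cluster emptied out.
        centers = [sum(c) / len(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    return sorted(centers)


def cost(points, centers):
    """Sum of squared distances to the nearest center (k-means objective)."""
    return sum(min((p - c) ** 2 for c in centers) for p in points)


data = [0.0, 1.0, 9.0, 10.0]
good = kmeans(data, init_centers=[0.0, 10.0])  # converges to [0.5, 9.5]
bad = kmeans(data, init_centers=[5.0, 20.0])   # stuck: one center never captures a point
```

The second run ends with one center serving every point and the other stranded, a fixed point of Lloyd's updates with a much higher objective value, which is exactly the instability the NP-hardness result accounts for.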

AI · Neutral · Hugging Face Blog · Feb 2 · 5/10

NPHardEval Leaderboard: Unveiling the Reasoning Abilities of Large Language Models through Complexity Classes and Dynamic Updates

The NPHardEval Leaderboard introduces a new evaluation framework for assessing large language models' reasoning capabilities through computational complexity classes with dynamic updates. By drawing problems from different complexity categories and refreshing them over time, the leaderboard aims to test LLM reasoning more rigorously.
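The "dynamic updates" idea can be sketched as regenerating fresh instances of a hard problem each evaluation round, with a brute-force oracle for grading. Subset sum is our illustrative choice here, not necessarily one of the leaderboard's actual tasks, and all names below are hypothetical.

```python
import itertools
import random


def make_subset_sum_instance(n: int, rng: random.Random):
    """Generate a fresh, guaranteed-satisfiable subset-sum instance.
    Regenerating instances per round is the anti-memorization idea
    behind dynamic benchmark updates."""
    nums = [rng.randint(1, 50) for _ in range(n)]
    witness = rng.sample(nums, rng.randint(1, n))  # plants a solution
    return nums, sum(witness)


def oracle(nums, target):
    """Exponential brute-force ground truth (fine at toy sizes)."""
    return any(sum(combo) == target
               for r in range(1, len(nums) + 1)
               for combo in itertools.combinations(nums, r))
```

A model's answers on such instances can then be scored against the oracle, and instances from easier classes (e.g. problems solvable in polynomial time) can be generated the same way to compare performance across complexity categories.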