y0news

#regression News & Analysis

7 articles tagged with #regression. AI-curated summaries with sentiment analysis and key takeaways from 50+ sources.

AI · Neutral · arXiv – CS AI · Mar 4 · 7/10

Covering Numbers for Deep ReLU Networks with Applications to Function Approximation and Nonparametric Regression

Researchers have derived tight bounds on covering numbers for deep ReLU neural networks, providing fundamental insights into network capacity and approximation capabilities. The work removes a log^6(n) factor from the best known sample complexity rate for estimating Lipschitz functions via deep networks, establishing optimality in nonparametric regression.
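For context, the ε-covering number whose growth the paper bounds is the standard quantity from approximation theory (textbook definition, not quoted from the paper itself): the smallest number of functions needed so that every member of the class is within ε of one of them.

```latex
N(\varepsilon, \mathcal{F}, \|\cdot\|)
  = \min\Bigl\{ n \in \mathbb{N} : \exists\, f_1,\dots,f_n
      \ \text{s.t.}\ \sup_{f \in \mathcal{F}} \min_{1 \le i \le n} \|f - f_i\| \le \varepsilon \Bigr\}
```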

AI · Neutral · arXiv – CS AI · Mar 5 · 5/10

Zono-Conformal Prediction: Zonotope-Based Uncertainty Quantification for Regression and Classification Tasks

Researchers introduce zono-conformal prediction, a new uncertainty quantification method for machine learning that uses zonotope-based prediction sets instead of traditional intervals. The approach is more computationally efficient and less conservative than existing conformal prediction methods while maintaining statistical coverage guarantees for both regression and classification tasks.
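As background for what the zonotope construction generalizes, here is a minimal sketch of standard split conformal regression, which produces the interval prediction sets the paper improves on (synthetic 1-D data and a least-squares line; not the authors' zonotope method).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data: y = 2x + noise
x = rng.uniform(-1, 1, 200)
y = 2 * x + rng.normal(0, 0.1, 200)

# Split into a fitting half and a calibration half
x_fit, y_fit = x[:100], y[:100]
x_cal, y_cal = x[100:], y[100:]

# Fit a simple least-squares line on the fitting half
slope, intercept = np.polyfit(x_fit, y_fit, 1)
predict = lambda t: slope * t + intercept

# Calibrate: conformal quantile of absolute residuals on held-out data
alpha = 0.1  # target 90% coverage
scores = np.abs(y_cal - predict(x_cal))
level = np.ceil((len(scores) + 1) * (1 - alpha)) / len(scores)
q = np.quantile(scores, level)

# Prediction set for a new input is the interval [yhat - q, yhat + q]
x_new = 0.5
lo, hi = predict(x_new) - q, predict(x_new) + q
```

The interval width `2q` is the same everywhere; the zonotope sets described in the summary replace these axis-aligned intervals with richer geometric shapes while keeping the same finite-sample coverage guarantee.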

AI · Neutral · arXiv – CS AI · Mar 4 · 5/10

Eliciting Numerical Predictive Distributions of LLMs Without Autoregression

Researchers developed a method to extract numerical prediction distributions from Large Language Models without costly autoregressive sampling by training probes on internal representations. The approach can predict statistical functionals like mean and quantiles directly from LLM embeddings, potentially offering a more efficient alternative for uncertainty-aware numerical predictions.
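The probing idea can be illustrated on synthetic vectors standing in for LLM hidden states; everything below (the linear encoding `W`, the ridge probe, the noise level) is an illustrative assumption, not the authors' setup.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for LLM hidden states: each "embedding" is a noisy
# linear image of the quantity we want to read out (here, a mean).
n, d = 500, 32
true_means = rng.uniform(0, 10, n)
W = rng.normal(size=(1, d))                       # hypothetical encoding
emb = true_means[:, None] * W + rng.normal(0, 0.5, (n, d))

# Ridge-regression probe: read the mean back out of the embeddings
lam = 1e-3
A = emb.T @ emb + lam * np.eye(d)
w = np.linalg.solve(A, emb.T @ true_means)

pred = emb @ w
mae = np.abs(pred - true_means).mean()
```

The point of the sketch: once a functional such as the mean is linearly decodable from internal representations, a single probe evaluation replaces many autoregressive samples.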

AI · Neutral · arXiv – CS AI · Mar 26 · 4/10

Deep Neural Regression Collapse

Researchers have extended Neural Collapse theory to regression problems, discovering that Deep Neural Regression Collapse (NRC) occurs across multiple layers in neural networks, not just the final layer. The study reveals that collapsed layers learn structured representations where features align with target dimensions and covariance, providing insights into the simple structures that deep networks learn for regression tasks.

AI · Neutral · arXiv – CS AI · Feb 27 · 4/10

Revisiting Chebyshev Polynomial and Anisotropic RBF Models for Tabular Regression

Researchers developed smooth-basis regression models, including anisotropic RBF networks and Chebyshev polynomial regressors, that compete with tree ensembles on tabular regression tasks. Testing across 55 datasets showed these models achieve accuracy similar to tree ensembles while offering better generalization properties and smooth prediction surfaces suitable for optimization applications.
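A minimal sketch of regression in a Chebyshev basis using NumPy's `numpy.polynomial.chebyshev` routines, on a synthetic smooth target; the degree and data are arbitrary choices for illustration, not taken from the paper.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

rng = np.random.default_rng(2)

# Toy tabular-style target: smooth nonlinear function of one feature
x = rng.uniform(-1, 1, 300)
y = np.sin(3 * x) + 0.1 * rng.normal(size=300)

# Least-squares fit in the Chebyshev basis (degree is a free choice)
coefs = C.chebfit(x, y, deg=8)

# The fitted model gives a smooth, differentiable prediction surface,
# unlike the piecewise-constant surface of a tree ensemble
grid = np.linspace(-1, 1, 5)
pred = C.chebval(grid, coefs)
err = np.abs(pred - np.sin(3 * grid)).max()
```

The smoothness is what makes such models attractive downstream: gradients of the prediction surface exist everywhere, so the fitted model can be plugged into gradient-based optimizers.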

AI · Neutral · arXiv – CS AI · Feb 27 · 4/10

Model Agreement via Anchoring

Researchers developed a new mathematical technique called 'anchoring' to control disagreement between machine learning models trained independently. The method provides bounds under which disagreement can be driven to zero for four common ML algorithms: stacked aggregation, gradient boosting, neural networks, and regression trees.

AI · Bullish · arXiv – CS AI · Mar 3 · 4/10

Machine Learning Grade Prediction Using Students' Grades and Demographics

Researchers developed a unified machine learning framework that predicts both pass/fail outcomes and continuous grades for secondary school students with up to 96% accuracy. The study of 4424 students demonstrates how AI can enable early identification of at-risk students and optimize educational resource allocation through data-driven predictions.
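The pass/fail half of such a framework can be sketched with a plain logistic regression on grade-like features; the data below is synthetic and the features are hypothetical stand-ins, since the study's 4424-student dataset and actual models are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-in for the study's features: two prior exam grades
# plus one binary demographic flag
n = 1000
prior = rng.uniform(0, 20, (n, 2))
demo = rng.integers(0, 2, (n, 1)).astype(float)

# Pass/fail label: average prior grade above 10, with some noise
y = ((prior.mean(axis=1) + rng.normal(0, 1, n)) > 10).astype(float)

# Standardize features and add a bias column
feats = np.hstack([prior, demo])
feats = (feats - feats.mean(axis=0)) / feats.std(axis=0)
X = np.hstack([feats, np.ones((n, 1))])

# Plain logistic regression trained by batch gradient descent
w = np.zeros(X.shape[1])
for _ in range(1000):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.5 * X.T @ (p - y) / n

acc = ((1.0 / (1.0 + np.exp(-X @ w)) > 0.5) == (y > 0.5)).mean()
```

In an early-warning setting like the one described, the probability output (not just the hard label) is what lets schools rank students by risk and allocate support accordingly.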