y0news

#spectral-analysis News & Analysis

9 articles tagged with #spectral-analysis. AI-curated summaries with sentiment analysis and key takeaways from 50+ sources.

AI · Bearish · arXiv – CS AI · 4d ago · 7/10

From Dispersion to Attraction: Spectral Dynamics of Hallucination Across Whisper Model Scales

Researchers propose the Spectral Sensitivity Theorem to explain hallucinations in large ASR models like Whisper, identifying a phase transition between dispersive and attractor regimes. Analysis of model eigenspectra reveals that intermediate models experience structural breakdown while large models compress information, decoupling from acoustic evidence and increasing hallucination risk.
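The eigenspectrum diagnostic described above can be illustrated with a toy dispersion measure. This is an illustrative sketch, not the paper's code: the matrices, the dispersion statistic, and the regime labels are assumptions for demonstration.

```python
import numpy as np

def spectral_dispersion(W: np.ndarray) -> float:
    """Spread of eigenvalue magnitudes of a layer matrix: a high value
    suggests a dispersive regime; a low value suggests an attractor-like
    regime where spectral mass concentrates on a few modes."""
    eigvals = np.linalg.eigvals(W)
    return float(np.std(np.abs(eigvals)))

rng = np.random.default_rng(0)
n = 64
# A generic random matrix: eigenvalues spread over a disk (dispersive)...
dispersive = rng.standard_normal((n, n)) / np.sqrt(n)
# ...versus a near-rank-one matrix whose spectrum collapses onto one mode.
u = rng.standard_normal((n, 1))
attractor = (u @ u.T) / n + 0.01 * rng.standard_normal((n, n))

d_disp = spectral_dispersion(dispersive)
d_attr = spectral_dispersion(attractor)
```

The dispersive matrix typically shows a wider eigenvalue-magnitude spread than the near-rank-one one, loosely mirroring the phase transition the paper describes.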

AI · Bullish · arXiv – CS AI · Apr 10 · 7/10

SpecQuant: Spectral Decomposition and Adaptive Truncation for Ultra-Low-Bit LLMs Quantization

SpecQuant introduces a novel quantization framework using spectral decomposition to compress large language models to 4-bit precision for both weights and activations, achieving only 1.5% accuracy loss on LLaMA-3 8B while enabling 2x faster inference and 3x memory reduction. The technique exploits frequency domain properties to preserve essential signal components while suppressing high-frequency noise, addressing a critical challenge in deploying LLMs on edge devices.
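The frequency-domain idea can be sketched in miniature: transform each weight row, zero out the high-frequency tail, then quantize the smoothed weights to 4 bits. The row-wise FFT, the fixed `keep_frac`, and the uniform quantizer are all assumptions for illustration, not SpecQuant's actual algorithm.

```python
import numpy as np

def quantize_4bit(x: np.ndarray) -> np.ndarray:
    """Uniform symmetric 4-bit quantization (integer levels in [-8, 7]),
    returned in dequantized form."""
    max_abs = np.abs(x).max()
    scale = max_abs / 7 if max_abs > 0 else 1.0
    q = np.clip(np.round(x / scale), -8, 7)
    return q * scale

def spectral_quantize(W: np.ndarray, keep_frac: float = 0.5) -> np.ndarray:
    """Toy SpecQuant-style pipeline: FFT each weight row, truncate the
    highest frequencies, then 4-bit quantize the smoothed weights."""
    F = np.fft.rfft(W, axis=1)
    k = int(F.shape[1] * keep_frac)
    F[:, k:] = 0  # suppress high-frequency components before quantizing
    smoothed = np.fft.irfft(F, n=W.shape[1], axis=1)
    return quantize_4bit(smoothed)

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 16))
W_q = spectral_quantize(W)
```

The intuition matches the summary: truncation removes the high-frequency noise that low-bit quantizers otherwise spend their limited levels on.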

AI · Bullish · arXiv – CS AI · Mar 16 · 7/10

Learnable Koopman-Enhanced Transformer-Based Time Series Forecasting with Spectral Control

Researchers propose a new family of learnable Koopman operators that combine linear dynamical systems theory with deep learning for time series forecasting. The approach integrates with existing transformer architectures like PatchTST and Autoformer, offering improved stability and interpretability in predictive models.
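The core Koopman idea, advancing the system with a linear operator, can be sketched in closed form as a DMD-style least-squares fit. The paper learns the operator jointly with a transformer; this toy omits that and fits it directly on snapshot pairs.

```python
import numpy as np

def fit_koopman(X: np.ndarray, Y: np.ndarray) -> np.ndarray:
    """Least-squares Koopman operator K with Y ≈ K @ X,
    where columns of X and Y are consecutive state snapshots."""
    return Y @ np.linalg.pinv(X)

# Toy linear system x_{t+1} = A x_t: the fitted operator should recover A.
A = np.array([[0.9, 0.1],
              [-0.1, 0.9]])
rng = np.random.default_rng(0)
x = rng.standard_normal(2)
traj = [x]
for _ in range(50):
    traj.append(A @ traj[-1])
traj = np.array(traj).T          # shape (2, 51): columns are snapshots
K = fit_koopman(traj[:, :-1], traj[:, 1:])
```

Because the dynamics are exactly linear here, `K` recovers `A`; the paper's contribution is making such operators learnable and spectrally controlled inside nonlinear forecasting models.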

AI · Neutral · arXiv – CS AI · Mar 4 · 6/10

The Malignant Tail: Spectral Segregation of Label Noise in Over-Parameterized Networks

Researchers identify the 'Malignant Tail' phenomenon where over-parameterized neural networks segregate signal from noise during training, leading to harmful overfitting. They demonstrate that Stochastic Gradient Descent pushes label noise into high-frequency orthogonal subspaces while preserving semantic features in low-rank subspaces, and propose Explicit Spectral Truncation as a post-hoc solution to recover optimal generalization.
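Post-hoc spectral truncation can be sketched as keeping only the top singular directions of a trained weight matrix. The rank choice and the synthetic signal/noise split below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def spectral_truncation(W: np.ndarray, rank: int) -> np.ndarray:
    """Explicit Spectral Truncation sketch: keep the top singular
    directions (low-rank semantic signal) and discard the tail where
    SGD is argued to park label noise."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return (U[:, :rank] * s[:rank]) @ Vt[:rank]

rng = np.random.default_rng(0)
# Rank-4 "semantic" signal plus small full-rank "label noise".
signal = rng.standard_normal((32, 4)) @ rng.standard_normal((4, 32))
noise = 0.01 * rng.standard_normal((32, 32))
W = signal + noise
W_clean = spectral_truncation(W, rank=4)
```

Truncating to the signal rank lands closer to the clean weights than the noisy matrix is, which is the generalization-recovery effect the paper attributes to cutting the "malignant tail".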

AI · Bullish · arXiv – CS AI · Mar 9 · 6/10

Energy-Driven Adaptive Visual Token Pruning for Efficient Vision-Language Models

Researchers developed E-AdaPrune, an energy-driven adaptive pruning framework that optimizes Vision-Language Models by dynamically allocating visual tokens based on image information density. The method shows up to 0.6% average improvement across benchmarks, with a notable 5.1% boost on reasoning tasks, while adding only 8ms latency per image.
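The energy-driven allocation can be caricatured as scoring tokens by norm and sizing the keep-budget by how concentrated that energy is. This heuristic is hypothetical; the summary does not specify E-AdaPrune's actual scoring or allocation rule.

```python
import numpy as np

def adaptive_token_prune(tokens: np.ndarray, budget_frac: float = 0.5) -> np.ndarray:
    """Score visual tokens by L2 'energy'; flat energy profiles
    (information-dense images) keep a larger share of tokens than
    profiles dominated by a few high-energy tokens."""
    energy = np.linalg.norm(tokens, axis=1)
    concentration = energy.max() / (energy.mean() + 1e-8)  # ~1 when flat
    keep = max(1, int(len(tokens) * budget_frac / concentration))
    idx = np.argsort(energy)[::-1][:keep]  # highest-energy tokens first
    return tokens[np.sort(idx)]            # preserve original token order
```

A flat 8-token input keeps the full half-budget of 4 tokens, while a peaky one is pruned more aggressively.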

AI · Neutral · arXiv – CS AI · Mar 3 · 7/10

StaTS: Spectral Trajectory Schedule Learning for Adaptive Time Series Forecasting with Frequency Guided Denoiser

Researchers introduce StaTS, a new diffusion model for time series forecasting that learns adaptive noise schedules and uses frequency-guided denoising. The model addresses limitations of fixed noise schedules in existing diffusion models by incorporating spectral regularization and data-adaptive scheduling for improved structural preservation.
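The frequency-guided ingredient can be sketched as a spectral penalty comparing the magnitude spectra of prediction and target, so the denoiser is rewarded for preserving dominant frequencies rather than just pointwise values. This is a bare-bones assumption-laden toy; StaTS's learned schedules and denoiser are much richer.

```python
import numpy as np

def spectral_regularizer(pred: np.ndarray, target: np.ndarray) -> float:
    """Frequency-domain penalty: mean squared difference between the
    FFT magnitude spectra of prediction and target series."""
    P = np.abs(np.fft.rfft(pred))
    T = np.abs(np.fft.rfft(target))
    return float(np.mean((P - T) ** 2))

t = np.linspace(0, 1, 128, endpoint=False)
target = np.sin(2 * np.pi * 4 * t)
same_spectrum = np.sin(2 * np.pi * 4 * t + 0.3)  # phase shift only
wrong_freq = np.sin(2 * np.pi * 9 * t)           # wrong dominant frequency
```

A phase-shifted copy of the target incurs almost no spectral penalty, while a series at the wrong frequency is penalized heavily, which is the structural-preservation behavior the summary describes.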

AI · Bullish · arXiv – CS AI · Mar 3 · 6/10

Towards Principled Dataset Distillation: A Spectral Distribution Perspective

Researchers propose Class-Aware Spectral Distribution Matching (CSDM), a new dataset distillation method that addresses performance issues on imbalanced datasets. The technique achieves 14% improvement over existing methods on CIFAR-10-LT with enhanced stability on long-tailed data distributions.
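The class-aware spectral matching can be sketched as comparing singular-value spectra of real and synthetic batches per class, then averaging across classes so minority classes count equally. The loss form and per-class averaging here are illustrative assumptions, not CSDM's exact objective.

```python
import numpy as np

def spectral_distribution_loss(real: np.ndarray, synth: np.ndarray) -> float:
    """Mean squared gap between the singular-value spectra of a real
    and a synthetic batch for one class."""
    s_real = np.linalg.svd(real, compute_uv=False)
    s_synth = np.linalg.svd(synth, compute_uv=False)
    k = min(len(s_real), len(s_synth))
    return float(np.mean((s_real[:k] - s_synth[:k]) ** 2))

def csdm_loss(real_by_class: dict, synth_by_class: dict) -> float:
    """Average per-class spectral losses so that long-tail (minority)
    classes contribute as much as head classes."""
    return float(np.mean([
        spectral_distribution_loss(real_by_class[c], synth_by_class[c])
        for c in real_by_class
    ]))
```

Per-class averaging is the ingredient aimed at the imbalanced, long-tailed setting the summary highlights: a head class cannot dominate the matching objective.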