
#svd News & Analysis

5 articles tagged with #svd. AI-curated summaries with sentiment analysis and key takeaways from 50+ sources.

AI · Bullish · arXiv – CS AI · Apr 14 · 7/10

SVD-Prune: Training-Free Token Pruning For Efficient Vision-Language Models

SVD-Prune introduces a training-free token pruning method for Vision-Language Models that uses Singular Value Decomposition to reduce computational overhead. The approach maintains model performance while cutting the number of vision tokens to just 16-32, addressing efficiency challenges in multimodal AI systems without any retraining.
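
The summary doesn't spell out the paper's exact pruning criterion, but the core idea of scoring tokens with SVD can be sketched in a few lines. Below is a minimal, hypothetical illustration (not SVD-Prune's actual method): rank each vision token by its energy in the top singular directions of the token matrix and keep the highest-scoring 32.

```python
# Hypothetical sketch, not SVD-Prune's exact criterion: score each vision
# token by its energy in the top singular directions of the token matrix.
import numpy as np

def svd_prune_tokens(tokens: np.ndarray, keep: int = 32) -> np.ndarray:
    """tokens: (num_tokens, dim) vision-token embeddings; returns `keep` tokens."""
    # tokens ~= U @ diag(S) @ Vt captures the low-rank structure of the token set.
    U, S, Vt = np.linalg.svd(tokens, full_matrices=False)
    r = min(keep, len(S))
    # Leverage-style score: token i's weighted energy in the top-r directions.
    scores = (U[:, :r] ** 2 * S[:r] ** 2).sum(axis=1)
    top = np.argsort(scores)[::-1][:keep]
    return tokens[np.sort(top)]  # keep original token order for positional context

# Example: prune 576 ViT patch tokens down to 32.
x = np.random.randn(576, 768).astype(np.float32)
print(svd_prune_tokens(x).shape)  # (32, 768)
```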

AI · Bullish · arXiv – CS AI · Mar 17 · 7/10

ERC-SVD: Error-Controlled SVD for Large Language Model Compression

Researchers propose ERC-SVD, a new compression method for large language models that uses error-controlled singular value decomposition to reduce model size while maintaining performance. The method addresses truncation loss and error propagation issues in existing SVD-based compression techniques by leveraging residual matrices and selectively compressing only the last few layers.
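
As a rough illustration of error-controlled SVD compression (the rank-selection rule below is an assumption, not ERC-SVD's actual procedure), one can pick the smallest truncation rank whose Frobenius reconstruction error stays under a relative budget:

```python
# Assumed rank-selection rule, for illustration only: keep the smallest
# rank whose Frobenius reconstruction error meets a relative error budget.
import numpy as np

def compress_with_error_budget(W: np.ndarray, rel_err: float = 0.05):
    """Factor W ~= A @ B so that ||W - A @ B||_F <= rel_err * ||W||_F."""
    U, S, Vt = np.linalg.svd(W, full_matrices=False)
    # err2[r] = squared reconstruction error if we keep the first r singular values.
    err2 = np.concatenate([np.cumsum((S ** 2)[::-1])[::-1], [0.0]])
    budget2 = (rel_err * np.linalg.norm(W)) ** 2
    r = max(int(np.argmax(err2 <= budget2)), 1)  # first rank under budget
    A = U[:, :r] * S[:r]  # (m, r)
    B = Vt[:r]            # (r, n)
    return A, B

W = np.random.randn(512, 512)
A, B = compress_with_error_budget(W, rel_err=0.2)
print(A.shape[1], np.linalg.norm(W - A @ B) / np.linalg.norm(W))
```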

AI · Bullish · arXiv – CS AI · Mar 5 · 7/10

Spectral Surgery: Training-Free Refinement of LoRA via Gradient-Guided Singular Value Reweighting

Researchers have developed Spectral Surgery, a training-free method to improve LoRA (Low-Rank Adaptation) model performance by reweighting singular values based on gradient sensitivity. The technique achieves significant performance gains (up to +4.4 points on CommonsenseQA) by adjusting only about 1,000 scalar coefficients without requiring retraining.
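
The gradient-sensitivity weighting itself isn't detailed in the summary, but the mechanical step of reweighting a LoRA update's singular values is easy to sketch. In the snippet below, `alpha` is a placeholder standing in for the paper's gradient-derived coefficients:

```python
# Sketch of singular-value reweighting for a LoRA update delta_W = B @ A.
# `alpha` is a placeholder for the gradient-derived coefficients in the paper.
import numpy as np

def reweight_lora_update(A: np.ndarray, B: np.ndarray, alpha: np.ndarray) -> np.ndarray:
    """A: (r, in_dim), B: (out_dim, r); alpha scales the top singular values."""
    delta_w = B @ A                                      # rank-r LoRA update
    U, S, Vt = np.linalg.svd(delta_w, full_matrices=False)
    k = min(len(alpha), len(S))
    S[:k] *= alpha[:k]                                   # reweight the spectrum
    return (U * S) @ Vt                                  # refined update, same shape

# ~r scalars per adapted layer, consistent with the "about 1,000
# scalar coefficients" the summary reports across a whole model.
A, B = np.random.randn(16, 768), np.random.randn(768, 16)
print(reweight_lora_update(A, B, alpha=np.linspace(1.2, 0.8, 16)).shape)
```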

Llama
AI · Bullish · arXiv – CS AI · Mar 3 · 7/10

AdaRank: Adaptive Rank Pruning for Enhanced Model Merging

Researchers introduce AdaRank, a new AI model merging framework that adaptively selects optimal singular directions from task vectors to combine multiple fine-tuned models. The technique addresses cross-task interference in existing SVD-based approaches by dynamically pruning problematic components at test time, achieving state-of-the-art performance within roughly 1% of the individually fine-tuned models.
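
A hedged sketch of the underlying operation: truncate each task vector (fine-tuned weights minus base weights) to its top singular directions before summing. AdaRank selects those directions adaptively at test time; the fixed rank below is a simplifying assumption.

```python
# Simplified task-vector merging with SVD truncation; AdaRank chooses the
# retained directions adaptively at test time, here the rank is fixed.
import numpy as np

def merge_task_vectors(base: np.ndarray, finetuned: list, rank: int = 8) -> np.ndarray:
    merged = base.copy()
    for W in finetuned:
        tau = W - base                                   # task vector
        U, S, Vt = np.linalg.svd(tau, full_matrices=False)
        merged += (U[:, :rank] * S[:rank]) @ Vt[:rank]   # top singular directions only
    return merged

base = np.random.randn(256, 256)
models = [base + 0.01 * np.random.randn(256, 256) for _ in range(3)]
print(merge_task_vectors(base, models).shape)  # (256, 256)
```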

AI · Neutral · arXiv – CS AI · Apr 15 · 6/10

Orthogonal Subspace Projection for Continual Machine Unlearning via SVD-Based LoRA

Researchers propose an SVD-based orthogonal subspace projection method for continual machine unlearning that prevents interference between sequential deletion tasks in neural networks. The approach maintains model performance on retained data while effectively removing the influence of the unlearned data, addressing a critical limitation of naive LoRA fusion methods.
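
To make the subspace idea concrete, here is a minimal, assumption-laden sketch: span a "protected" subspace with the top right-singular vectors of earlier unlearning updates, then project each new update onto its orthogonal complement so it cannot disturb prior deletions. The basis construction is an illustrative guess, not necessarily the paper's.

```python
# Illustrative only: protect earlier deletions by projecting each new
# unlearning update onto the orthogonal complement of their subspace.
import numpy as np

def project_out_prior_subspace(delta_w: np.ndarray, prev_updates: list, rank: int = 16) -> np.ndarray:
    if not prev_updates:
        return delta_w
    # Top right-singular vectors of past updates span the protected subspace
    # (one plausible basis choice; the paper's construction may differ).
    _, _, Vt = np.linalg.svd(np.vstack(prev_updates), full_matrices=False)
    V = Vt[:rank].T                          # (in_dim, rank), orthonormal columns
    return delta_w - (delta_w @ V) @ V.T     # remove components inside the subspace

prev = [np.random.randn(64, 128) for _ in range(2)]
safe = project_out_prior_subspace(np.random.randn(64, 128), prev)
V = np.linalg.svd(np.vstack(prev), full_matrices=False)[2][:16].T
print(np.allclose(safe @ V, 0.0))  # new update no longer touches the protected subspace
```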