y0news

#parameter-reduction News & Analysis

2 articles tagged with #parameter-reduction. AI-curated summaries with sentiment analysis and key takeaways from 50+ sources.

AI · Bullish · arXiv – CS AI · Apr 7 · 6/10
🧠

Training Transformers in Cosine Coefficient Space

Researchers have developed a method for training transformer neural networks in the discrete cosine transform (DCT) domain, matching the performance of standard training while using only 52% of the parameters. The technique requires no architectural changes: standard linear layers are simply replaced with spectral layers that store DCT coefficients instead of full weight matrices.
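As a rough illustration of the idea, a "spectral" linear layer can keep only a small block of low-frequency DCT coefficients as its trainable parameters and expand them into a dense weight matrix on the fly. This is a minimal NumPy sketch under that assumption; the class name, the choice to keep a square coefficient block, and the 52% budget split are illustrative, not the paper's exact formulation.

```python
# Hypothetical sketch: parameterize an m x n weight matrix by a keep x keep
# block of low-frequency DCT coefficients, reconstructed via truncated
# orthonormal DCT-II bases. All names here are illustrative.
import numpy as np

def dct_basis(n: int) -> np.ndarray:
    """Orthonormal DCT-II basis matrix (n x n); column j is the j-th cosine mode."""
    k = np.arange(n)[:, None]          # sample index
    j = np.arange(n)[None, :]          # frequency index
    B = np.cos(np.pi * (k + 0.5) * j / n)
    B[:, 0] *= 1.0 / np.sqrt(n)        # DC mode normalization
    B[:, 1:] *= np.sqrt(2.0 / n)       # AC mode normalization
    return B

class SpectralLinear:
    def __init__(self, m: int, n: int, keep: int):
        # Trainable parameters: keep x keep coefficients instead of m x n weights.
        self.coeffs = np.random.randn(keep, keep) * 0.02
        self.Bm = dct_basis(m)[:, :keep]   # truncated basis for the output dim
        self.Bn = dct_basis(n)[:, :keep]   # truncated basis for the input dim

    def weight(self) -> np.ndarray:
        # Inverse transform: expand the coefficient block to a dense m x n matrix.
        return self.Bm @ self.coeffs @ self.Bn.T

    def __call__(self, x: np.ndarray) -> np.ndarray:
        return x @ self.weight().T

layer = SpectralLinear(m=64, n=64, keep=46)  # 46*46 / (64*64) ~ 52% of the parameters
y = layer(np.random.randn(8, 64))
print(y.shape)  # (8, 64)
```

The layer behaves like an ordinary dense layer at the interface, which is why no architectural changes are needed: only the internal weight storage differs, and gradients flow into the coefficient block through the fixed cosine bases.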

🏢 Perplexity
AI · Neutral · Hugging Face Blog · Jan 23 · 5/10
🧠

SmolVLM Grows Smaller – Introducing the 256M & 500M Models!

Hugging Face has released smaller variants of its SmolVLM vision-language model at 256M and 500M parameters. These more compact versions of the existing model could make the technology more accessible and efficient to run across a wider range of applications and hardware.