AI · Neutral · arXiv – CS AI · 5h ago
Activation Outliers in Transformer Quantization: Reproduction, Statistical Analysis, and Deployment Tradeoffs

Researchers reproduced and analyzed severe accuracy degradation in BERT transformer models under post-training quantization, showing validation accuracy dropping from 89.66% to 54.33%. The study found that structured activation outliers intensify with model depth, and that mixed-precision quantization is the most effective mitigation strategy.
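The mitigation described above can be sketched in a few lines: channels whose activation magnitudes dwarf the rest are kept in full precision, while the remaining channels are quantized to int8. This is an illustrative toy, not the paper's method; the outlier rule (peak magnitude exceeding 20× the median channel peak) and all function names are assumptions for demonstration.

```python
import numpy as np

def quantize_int8(x):
    """Symmetric per-tensor int8 quantization: one scale from max |x|."""
    scale = np.abs(x).max() / 127.0
    q = np.clip(np.round(x / scale), -127, 127)
    return (q * scale).astype(np.float32)  # dequantized view

def mixed_precision_quantize(acts, outlier_ratio=20.0):
    """Keep outlier channels in full precision, quantize the rest.

    A channel counts as an 'outlier' if its peak |activation| exceeds
    outlier_ratio times the median channel peak (hypothetical rule,
    for illustration only).
    """
    ch_peak = np.abs(acts).max(axis=0)               # per-channel peak
    outliers = ch_peak > outlier_ratio * np.median(ch_peak)
    out = acts.copy()
    out[:, ~outliers] = quantize_int8(acts[:, ~outliers])
    return out, outliers

# Toy activations: 4 channels, one with BERT-style outlier magnitudes.
rng = np.random.default_rng(0)
acts = rng.normal(0, 1, size=(64, 4)).astype(np.float32)
acts[:, 2] *= 60.0                                   # inject an outlier channel

mixed, outliers = mixed_precision_quantize(acts)
naive = quantize_int8(acts)                          # one scale for everything

print("outlier channels:", np.flatnonzero(outliers))
print("naive mean abs error:", np.abs(naive - acts).mean())
print("mixed mean abs error:", np.abs(mixed - acts).mean())
```

The naive path picks a single scale dominated by the outlier channel, so the well-behaved channels lose most of their resolution; the mixed path quantizes only the non-outlier channels with a scale matched to their range, which is why mixed precision recovers accuracy in the setting the article describes.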