arXiv – CS AI · Feb 27
GRAU: Generic Reconfigurable Activation Unit Design for Neural Network Hardware Accelerators

Researchers propose GRAU, a reconfigurable activation unit design for neural network hardware accelerators that approximates activation functions via piecewise linear fitting with power-of-two slopes. The design reduces LUT consumption by over 90% compared to traditional multi-threshold activation units while supporting mixed-precision quantization and multiple nonlinear activation functions.
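The core idea of power-of-two-slope piecewise linear fitting is that each segment's slope is rounded to the nearest power of two, so in hardware the multiply becomes a cheap bit shift. Below is a minimal software sketch of that idea, not the paper's actual GRAU design: the function names (`fit_pow2_segments`, `eval_pwl`), the choice of sigmoid, the segment grid, and the midpoint-matched intercepts are all illustrative assumptions.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def fit_pow2_segments(f, xs):
    """Fit each segment [xs[i], xs[i+1]] with a slope rounded to the
    nearest power of two (hardware shift), intercept chosen so the
    segment matches f at its midpoint. Assumes f has positive slope
    on the fitted range; real designs also handle zero/negative slopes."""
    segs = []
    for x0, x1 in zip(xs, xs[1:]):
        slope = (f(x1) - f(x0)) / (x1 - x0)
        # Quantize slope to 2^k -- in hardware, y = (x >> -k) + b
        k = 2.0 ** round(math.log2(slope))
        mid = (x0 + x1) / 2.0
        segs.append((x0, x1, k, f(mid) - k * mid))
    return segs

def eval_pwl(segs, x):
    """Evaluate the piecewise linear approximation, clamping x to the
    fitted range (mimicking saturation at the unit's input bounds)."""
    x = min(max(x, segs[0][0]), segs[-1][1])
    for x0, x1, k, b in segs:
        if x <= x1:
            return k * x + b

segs = fit_pow2_segments(sigmoid, list(range(-4, 5)))
print(abs(eval_pwl(segs, 0.3) - sigmoid(0.3)))  # small approximation error
```

Because every slope is a power of two, a hardware segment needs only a comparator (to pick the segment), a shifter, and an adder, rather than a full multiplier or a large lookup table, which is the source of the LUT savings the summary describes.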