arXiv · CS AI · Feb 27
GRAU: Generic Reconfigurable Activation Unit Design for Neural Network Hardware Accelerators
Researchers propose GRAU, a new reconfigurable activation unit design for neural network hardware accelerators that uses piecewise linear fitting with power-of-two slopes. The design reduces LUT consumption by over 90% compared to traditional multi-threshold activators while supporting mixed-precision quantization and nonlinear functions.
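The core idea, piecewise linear fitting with power-of-two slopes, means each segment's slope multiplication can be realized as a bit shift in hardware. The sketch below illustrates the concept on a 3-segment sigmoid fit; the breakpoints, shift amounts, and intercepts are hypothetical values for illustration, not parameters from the GRAU paper.

```python
def pwl_pow2_approx(x, breakpoints, shifts, intercepts):
    """Evaluate a piecewise linear function whose segment slopes are
    powers of two (2**-shift), so hardware replaces multipliers with shifts."""
    for (lo, hi), s, b in zip(breakpoints, shifts, intercepts):
        if lo <= x < hi:
            # slope = 2**-s: a multiply becomes a right shift by s bits
            return x * (2.0 ** -s) + b
    # saturate outside the fitted range (sigmoid-style clamping)
    return 0.0 if x < breakpoints[0][0] else 1.0

# Hypothetical 3-segment fit of sigmoid on [-4, 4)
breakpoints = [(-4.0, -1.0), (-1.0, 1.0), (1.0, 4.0)]
shifts      = [4, 2, 4]          # slopes 1/16, 1/4, 1/16
intercepts  = [0.25, 0.5, 0.75]

for x in (-5.0, 0.0, 0.5, 5.0):
    print(round(pwl_pow2_approx(x, breakpoints, shifts, intercepts), 4))
```

Because every slope is a power of two, the evaluation needs only comparators, shifters, and adders; swapping in a different table of breakpoints and shifts reconfigures the unit for another nonlinear function, which is what makes the design generic.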