Normalized Architectures are Natively 4-Bit
Researchers demonstrate that nGPT, a normalized Transformer architecture that constrains its weights and hidden representations to the unit hypersphere, trains stably at 4-bit precision without additional quantization interventions. Because the dot product of two unit vectors is a cosine bounded in [-1, 1], activations stay within a fixed dynamic range, and per-coordinate rounding errors tend to average out across dimensions; this stronger signal-to-noise ratio enables efficient training of models up to 30B parameters.
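A minimal sketch of the core intuition, not nGPT's actual training recipe: the `quantize_4bit` helper and the fixed scale below are illustrative assumptions, showing only why unit-norm vectors are friendly to low-bit quantization.

```python
import numpy as np

def quantize_4bit(x: np.ndarray, scale: float) -> np.ndarray:
    """Hypothetical symmetric 4-bit fake-quantization: round each
    coordinate to one of 16 signed levels over [-scale, scale],
    then dequantize for comparison against full precision."""
    levels = 7  # signed int4 spans [-8, 7]; 7 positive levels
    q = np.clip(np.round(x / scale * levels), -8, 7)
    return q * scale / levels

rng = np.random.default_rng(0)
d = 256

# Project two random vectors onto the unit hypersphere, analogous to
# how nGPT constrains its weights and hidden states.
a = rng.normal(size=d); a /= np.linalg.norm(a)
b = rng.normal(size=d); b /= np.linalg.norm(b)

# Unit norm bounds every dot product in [-1, 1] (it is a cosine) and
# keeps coordinate magnitudes near 1/sqrt(d), so one fixed
# quantization scale covers both vectors.
scale = max(np.abs(a).max(), np.abs(b).max())

dot_fp = a @ b                                              # full-precision reference
dot_q = quantize_4bit(a, scale) @ quantize_4bit(b, scale)   # 4-bit version
print(f"fp32 dot: {dot_fp:+.4f}   int4 dot: {dot_q:+.4f}")
```

The two printed values land close together: each coordinate's rounding error is roughly zero-mean, so the errors largely cancel across the d terms of the dot product, which is one way to read the signal-to-noise argument above.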