arXiv – CS AI · 1d ago

Moonwalk: Inverse-Forward Differentiation

Researchers introduce Moonwalk, a new algorithm that solves backpropagation's memory limitations by eliminating the need to store intermediate activations during neural network training. The method uses vector-inverse-Jacobian products and submersive networks to reconstruct gradients in a forward sweep, enabling training of networks more than twice as deep under the same memory constraints.
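The core primitive the summary mentions, the vector-inverse-Jacobian product, has a convenient identity behind it: by the inverse function theorem, the inverse Jacobian of an invertible layer f at x equals the Jacobian of f⁻¹ at y = f(x), so the product can be computed as an ordinary VJP through the inverse map, with no Jacobian materialized or inverted. A minimal JAX sketch of that identity (the elementwise affine layer and all names are illustrative, not from the paper):

```python
import jax
import jax.numpy as jnp

# Illustrative invertible elementwise layer: y = x * exp(s) + t
s = jnp.array([0.5, -0.3, 1.2])
t = jnp.array([1.0, 0.0, -2.0])

def f(x):
    return x * jnp.exp(s) + t

def f_inv(y):
    return (y - t) * jnp.exp(-s)

x = jnp.array([0.7, -1.1, 2.3])
y = f(x)
v = jnp.array([1.0, 2.0, 3.0])

# Naive route: materialize the Jacobian of f at x, invert it, left-multiply.
J = jax.jacobian(f)(x)
viJ_naive = v @ jnp.linalg.inv(J)

# Inverse function theorem: J_f(x)^{-1} = J_{f_inv}(y), so the
# vector-inverse-Jacobian product is just a VJP through f_inv at y.
_, vjp_fn = jax.vjp(f_inv, y)
(viJ,) = vjp_fn(v)

assert jnp.allclose(viJ, viJ_naive, atol=1e-5)
```

Because the VJP route never forms the Jacobian, it scales to large layers where explicit inversion would be infeasible, which is what makes gradient reconstruction in a forward sweep practical.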