y0news
#moores-law · 3 articles
AI · Bullish · OpenAI News · May 5 · 7/10
🧠

AI and efficiency

A new analysis reveals that the compute required to train a neural network to AlexNet-level ImageNet classification performance has halved every 16 months since 2012. Reaching that performance now takes 44 times less compute than in 2012, far outpacing Moore's Law, which would have yielded only an 11x cost reduction over the same period.
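The two figures are consistent with simple exponential compounding. A minimal sanity-check sketch, assuming a window of roughly 87 months (AlexNet in mid-2012 to the analysis point in late 2019, an endpoint inferred from the stated numbers, not given in the summary) and the conventional 24-month Moore's Law doubling period:

```python
# Sanity check on the digest's figures; the 87-month window and the
# 24-month Moore's Law doubling period are assumptions, not from the article.
months_elapsed = 87

# Compute needed for AlexNet-level accuracy halves every 16 months.
efficiency_gain = 2 ** (months_elapsed / 16)

# Moore's Law baseline: doubling every 24 months.
moore_gain = 2 ** (months_elapsed / 24)

print(f"Algorithmic efficiency gain: ~{efficiency_gain:.0f}x")    # ~43x
print(f"Moore's Law gain, same window: ~{moore_gain:.0f}x")       # ~12x
```

The computed ~43x and ~12x land close to the article's 44x and 11x; the small gaps come from rounding the window length.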

AI · Bullish · OpenAI News · May 16 · 7/10
🧠

AI and compute

Analysis reveals that the compute used in the largest AI training runs has grown exponentially since 2012, doubling every 3.4 months for a total increase of more than 300,000x, compared with the roughly 7x growth Moore's Law would have delivered over the same period. If this dramatic acceleration in computational requirements continues, AI systems may soon possess capabilities far beyond current levels.
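The same compounding arithmetic can be inverted to see what window the 300,000x figure implies. A minimal sketch, taking the 300,000x growth and 3.4-month doubling time from the summary and assuming a 24-month Moore's Law doubling period for the comparison:

```python
import math

# Figures from the summary; the 24-month Moore's Law period is an assumption.
growth = 300_000
doubling_months = 3.4

doublings = math.log2(growth)                 # ~18.2 doublings
months_elapsed = doublings * doubling_months  # ~62 months (~5 years)

# Moore's Law growth over that same window.
moore_gain = 2 ** (months_elapsed / 24)       # ~6x, near the article's ~7x

print(f"{doublings:.1f} doublings over ~{months_elapsed:.0f} months")
print(f"Moore's Law gain, same window: ~{moore_gain:.1f}x")
```

The implied ~5-year span matches the 2012 starting point of the analysis, and the ~6x Moore's Law baseline agrees with the article's ~7x to within rounding of the window.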

AI · Neutral · Hugging Face Blog · Oct 26 · 1/10
🧠

Large Language Models: A New Moore's Law?

The article title suggests an exploration of whether Large Language Models follow a Moore's Law-like trajectory of exponential improvement. However, no article body was provided, so the specific claims, data, and implications cannot be summarized.