y0news

#energy-efficiency News & Analysis

32 articles tagged with #energy-efficiency. AI-curated summaries with sentiment analysis and key takeaways from 50+ sources.

AI · Bullish · arXiv – CS AI · Feb 27 · 6/10 · 5

Spark: Modular Spiking Neural Networks

Researchers have introduced Spark, a new modular framework for spiking neural networks that aims to improve energy efficiency and data processing compared to traditional neural networks. The framework demonstrates its capabilities by solving complex problems like the sparse-reward cartpole using simple plasticity mechanisms, potentially advancing continuous learning approaches similar to biological systems.
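The summary does not show Spark's actual API, but the energy argument for spiking networks is easy to illustrate with a generic leaky integrate-and-fire (LIF) neuron: computation is event-driven, and only sufficiently driven neurons ever fire. A minimal sketch (names and parameters are illustrative, not Spark's):

```python
import numpy as np

def lif_step(v, i_in, tau=20.0, v_thresh=1.0, v_reset=0.0, dt=1.0):
    """One Euler step of a leaky integrate-and-fire neuron.

    The membrane potential v leaks toward rest while integrating the
    input current i_in; crossing v_thresh emits a spike and resets v.
    """
    v = v + dt * (-v + i_in) / tau
    spike = v >= v_thresh
    v = np.where(spike, v_reset, v)
    return v, spike.astype(float)

# Drive a population of 4 neurons with constant currents. The weakest
# input settles below threshold and never spikes, so that neuron costs
# essentially nothing -- the sparsity SNNs exploit for efficiency.
v = np.zeros(4)
currents = np.array([0.5, 1.5, 3.0, 6.0])
spike_counts = np.zeros(4)
for _ in range(200):
    v, s = lif_step(v, currents)
    spike_counts += s
print(spike_counts)  # stronger inputs produce more spikes
```

Plasticity rules like those the paper mentions would then adjust synaptic weights based on these spike timings rather than on dense gradient updates.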

AI · Neutral · IEEE Spectrum – AI · Feb 23 · 6/10 · 8

AI’s Math Tricks Don’t Work for Scientific Computing

AI engineer Laslo Hunhold has developed takums, a new number format designed specifically for scientific computing that preserves dynamic range even at low bit widths. Unlike AI-optimized formats, which work well for machine learning but fail in scientific applications, takums address the distinct computational needs of physics, biology, and engineering simulations.
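The dynamic-range problem is easy to demonstrate with the standard IEEE formats that low-bit AI types inherit their trade-offs from (takum itself is not available in NumPy, so this sketch only shows the failure mode it targets; the 1.5e8 "plasma temperature" value is an invented example):

```python
import numpy as np

# Dynamic range collapses as IEEE floats drop bits: a quantity that a
# simulation stores comfortably in float64 overflows to inf in float16.
for dtype in (np.float64, np.float32, np.float16):
    info = np.finfo(dtype)
    print(dtype.__name__, info.tiny, info.max)

plasma_temp = 1.5e8              # Kelvin-scale value, fine in float64
print(np.float16(plasma_temp))   # inf -- exceeds float16's max of 65504
```

A format that tapers precision to extend dynamic range at the same bit width is aimed at exactly this kind of overflow/underflow in scientific workloads.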

AI · Bullish · OpenAI News · Jan 20 · 5/10 · 5

Stargate Community

Stargate Community announces a community-first approach to AI infrastructure development, emphasizing locally tailored plans that incorporate community input, energy requirements, and workforce considerations. This initiative represents a decentralized model for AI infrastructure deployment.

AI · Neutral · MIT News – AI · Jan 9 · 6/10 · 4

3 Questions: How AI could optimize the power grid

The article explores how AI technologies, while increasing energy demands, can simultaneously help optimize power grids to make them more efficient and cleaner. This presents a dual narrative where AI both challenges and potentially solves energy infrastructure problems.

AI · Neutral · arXiv – CS AI · Apr 6 · 4/10

An Initial Exploration of Contrastive Prompt Tuning to Generate Energy-Efficient Code

Researchers explored using Contrastive Prompt Tuning (CPT) to improve Large Language Models' ability to generate energy-efficient code, combining contrastive learning with parameter-efficient fine-tuning. The study tested CPT across Python, Java, and C++ on three different models, finding consistent accuracy improvements for two models but variable efficiency gains depending on model, language, and task complexity.
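The paper's exact prompt template is not reproduced here, but the contrastive idea — showing the model a negative (inefficient) and a positive (efficient) example of the same task — can be sketched as plain prompt construction. Everything below (function name, template wording, example snippets) is hypothetical:

```python
# Hypothetical contrastive prompt: pair an inefficient and an efficient
# solution so the model is steered toward the energy-efficient pattern.
INEFFICIENT = """def total(xs):
    out = 0
    for i in range(len(xs)):
        out = out + xs[i]
    return out"""

EFFICIENT = """def total(xs):
    return sum(xs)  # built-in loop runs in C, fewer bytecode ops"""

def contrastive_prompt(task, negative, positive):
    return (
        f"Task: {task}\n"
        f"# Inefficient (avoid):\n{negative}\n"
        f"# Efficient (prefer):\n{positive}\n"
        "Now write an energy-efficient solution for the task below.\n"
    )

prompt = contrastive_prompt("sum a list of numbers", INEFFICIENT, EFFICIENT)
print(prompt)
```

In the paper this signal is combined with parameter-efficient fine-tuning rather than used purely at inference time, which is where the model- and language-dependent results come from.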

AI · Neutral · arXiv – CS AI · Mar 6 · 4/10

ASFL: An Adaptive Model Splitting and Resource Allocation Framework for Split Federated Learning

Researchers propose ASFL, an adaptive split federated learning framework that optimizes machine learning model training across wireless networks by splitting computation between clients and central servers. The framework reduces training delay by up to 75% and energy consumption by 80% compared to baseline approaches while maintaining faster convergence rates.
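The core mechanic of split learning — the client runs the first part of the model and ships only a small intermediate activation to the server — can be sketched with a toy two-layer network. ASFL's actual contribution (adaptively choosing the split point and allocating wireless resources per client) is not modeled here; layer sizes and names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy split of a 2-layer MLP between a client and a server.
W_client = rng.normal(size=(32, 8))   # layer kept on the client device
W_server = rng.normal(size=(8, 2))    # remainder hosted on the server

def client_forward(x):
    # Client computes its layer; only this small ReLU activation
    # ("smashed data") needs to cross the wireless link.
    return np.maximum(x @ W_client, 0.0)

def server_forward(h):
    # Server finishes the forward pass from the uploaded activation.
    return h @ W_server

x = rng.normal(size=(1, 32))
h = client_forward(x)          # shape (1, 8) is all that is transmitted
y = server_forward(h)
print(h.shape, y.shape)
```

Moving the split point earlier shrinks client compute but grows the transmitted activation, and vice versa — the trade-off ASFL optimizes per client and channel condition.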

AI · Neutral · arXiv – CS AI · Mar 3 · 4/10 · 4

Multimodal Modular Chain of Thoughts in Energy Performance Certificate Assessment

Researchers developed a Multimodal Modular Chain of Thoughts (MMCoT) framework using Vision-Language models to automate Energy Performance Certificate assessments from visual data. Testing on 81 UK residential properties showed significant improvements over traditional prompting methods, offering a cost-effective solution for energy efficiency evaluation in data-scarce regions.

Page 2 of 2