y0news

#neuromorphic-computing News & Analysis

11 articles tagged with #neuromorphic-computing. AI-curated summaries with sentiment analysis and key takeaways from 50+ sources.

AI · Bullish · arXiv – CS AI · 2d ago · 7/10
🧠

Towards Green Wearable Computing: A Physics-Aware Spiking Neural Network for Energy-Efficient IMU-based Human Activity Recognition

Researchers have developed PAS-Net, a physics-aware spiking neural network that dramatically reduces power consumption in wearable IMU-based human activity recognition systems. The architecture achieves state-of-the-art accuracy while cutting energy consumption by up to 98% through sparse integer operations and an early-exit mechanism, establishing a new standard for ultra-low-power edge computing on battery-constrained devices.
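
The early-exit mechanism is a general pattern: easy inputs leave the network at a cheap intermediate classifier instead of traversing every layer. PAS-Net's actual heads, thresholds, and spiking blocks are not described here, so the sketch below is purely illustrative, with `early_exit_predict` a hypothetical name and ReLU layers standing in for spiking blocks:

```python
import numpy as np

def early_exit_predict(layers, exits, x, confidence_threshold=0.9):
    """Run layers in order; return at the first exit head whose softmax
    confidence clears the threshold, skipping the remaining layers."""
    for layer, exit_head in zip(layers, exits):
        x = np.maximum(layer @ x, 0.0)         # cheap block (ReLU stand-in for a spiking layer)
        logits = exit_head @ x
        probs = np.exp(logits - logits.max())  # numerically stable softmax
        probs /= probs.sum()
        if probs.max() >= confidence_threshold:
            return int(probs.argmax()), float(probs.max())
    return int(probs.argmax()), float(probs.max())  # fall through to the last exit
```

With early exits, average energy tracks input difficulty rather than worst-case depth; the paper's 98% figure additionally relies on sparse integer arithmetic, which this float sketch does not model.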

AI · Bullish · arXiv – CS AI · 3d ago · 7/10
🧠

Ge²mS-T: Multi-Dimensional Grouping for Ultra-High Energy Efficiency in Spiking Transformer

Researchers introduce Ge²mS-T, a novel Spiking Vision Transformer architecture that optimizes energy efficiency while maintaining training and inference performance through multi-dimensional grouped computation. The approach addresses fundamental limitations in existing SNN paradigms by balancing memory overhead, learning capability, and energy consumption simultaneously.

AI · Bullish · arXiv – CS AI · Mar 26 · 7/10
🧠

Physics-driven human-like working memory outperforms digital networks in dynamic vision

Researchers have developed a physics-driven AI system called Intrinsic Plasticity Network (IPNet) that uses magnetic tunnel junctions to create human-like working memory. The system demonstrates 18x error reduction in dynamic vision tasks while reducing memory-energy overhead by over 90,000x compared to traditional digital AI systems.

AI · Bullish · arXiv – CS AI · Mar 16 · 7/10
🧠

SRAM-Based Compute-in-Memory Accelerator for Linear-decay Spiking Neural Networks

Researchers developed an SRAM-based compute-in-memory accelerator for spiking neural networks that uses linear decay approximation instead of exponential decay, achieving 1.1x to 16.7x reduction in energy consumption. The innovation addresses the bottleneck of neuron state updates in neuromorphic computing by performing in-place decay directly within memory arrays.
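
The decay substitution can be sketched numerically. In a leaky integrate-and-fire neuron the membrane potential classically leaks exponentially; replacing that with a fixed linear subtraction is far cheaper to compute inside a memory array. The snippet below is a minimal sketch of the arithmetic only — the step size and time constant are illustrative, and the in-memory, in-place aspect of the accelerator is not modeled:

```python
import numpy as np

def decay_exponential(v, tau=20.0, dt=1.0):
    """Classic LIF leak: v <- v * exp(-dt/tau), one multiply by a
    precomputed factor per neuron per timestep."""
    return v * np.exp(-dt / tau)

def decay_linear(v, step=0.05):
    """Linear approximation: subtract a fixed amount and clamp at
    zero -- just a subtraction, cheap to do in place in an SRAM array."""
    return np.maximum(v - step, 0.0)
```

For small `dt/tau` the two agree closely (exp(-0.05) ≈ 0.951 vs. a 0.05 subtraction from a unit potential), which is why the approximation can preserve accuracy while removing the multiply from the update path.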

AI · Neutral · arXiv – CS AI · 1d ago · 6/10
🧠

EMBER: Autonomous Cognitive Behaviour from Learned Spiking Neural Network Dynamics in a Hybrid LLM Architecture

Researchers present EMBER, a hybrid architecture combining spiking neural networks with large language models where the SNN acts as a persistent, biologically-inspired memory substrate that autonomously triggers LLM reasoning. The system demonstrates emergent autonomous behavior, initiating unprompted user contact after learning associations during idle periods, suggesting a fundamental shift in how AI systems could coordinate cognition and action.

AI · Neutral · arXiv – CS AI · 3d ago · 6/10
🧠

Practical Bayesian Inference for Speech SNNs: Uncertainty and Loss-Landscape Smoothing

Researchers demonstrate that applying Bayesian inference to Spiking Neural Networks (SNNs) for speech processing smooths the irregular loss landscape caused by threshold-based spike generation. Testing on speech datasets shows improved performance metrics and more regular predictive landscapes compared to deterministic approaches.
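
Bayesian inference over network weights is commonly approximated by Monte Carlo sampling: draw many weight settings from a posterior, run the network under each, and average. The sketch below is a hypothetical illustration of that general idea for a one-step thresholded unit (the paper's actual inference scheme is not specified here); the sample spread doubles as an uncertainty estimate:

```python
import numpy as np

def bayesian_predict(x, w_mean, w_std, n_samples=50, threshold=1.0, rng=None):
    """Average hard-threshold (spike) outputs over weight samples drawn
    from a Gaussian posterior N(w_mean, w_std)."""
    rng = rng or np.random.default_rng(0)
    outs = []
    for _ in range(n_samples):
        w = rng.normal(w_mean, w_std)                 # one posterior weight sample
        v = w @ x                                     # membrane potential, one step
        outs.append((v >= threshold).astype(float))   # threshold -> spike / no spike
    outs = np.stack(outs)
    return outs.mean(axis=0), outs.std(axis=0)        # spike probability, uncertainty
```

Averaging over weight samples turns the discontinuous threshold into a smooth spike probability, which is the intuition behind the smoothed loss landscape the paper reports.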

AI · Bullish · arXiv – CS AI · Apr 7 · 6/10
🧠

Neuromorphic Computing for Low-Power Artificial Intelligence

Researchers outline how neuromorphic computing could overcome energy efficiency limits in classical CMOS technology for AI applications. The approach requires co-design across materials, circuits, and algorithms to achieve brain-inspired compute-in-memory architectures.

AI · Bullish · arXiv – CS AI · Mar 27 · 6/10
🧠

Reconstructing Spiking Neural Networks Using a Single Neuron with Autapses

Researchers propose TDA-SNN, a novel spiking neural network framework that uses a single neuron with time-delayed autapses to reconstruct traditional multilayer architectures. The approach significantly reduces neuron count and memory requirements while maintaining competitive performance, though at the cost of increased temporal latency.
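
An autapse is a neuron's synapse onto itself; with a transmission delay it acts as a short memory loop, letting one unit carry state that would otherwise live in separate neurons. The toy `autapse_neuron` below is illustrative only (the weights, delay, and threshold are made up, and TDA-SNN's reconstruction scheme is more involved): a single integrate-and-fire unit whose own spike re-enters its input after a fixed delay:

```python
def autapse_neuron(inputs, w_in=0.8, w_auto=0.5, delay=2, threshold=1.0):
    """One spiking neuron with a time-delayed self-connection: each
    emitted spike travels through a `delay`-step buffer and is added
    back into the membrane potential when it arrives."""
    buffer = [0.0] * delay            # spikes in flight on the autapse
    v, spikes = 0.0, []
    for x in inputs:
        v += w_in * x + w_auto * buffer.pop(0)  # integrate input + delayed self-spike
        s = 1.0 if v >= threshold else 0.0
        if s:
            v = 0.0                   # reset after firing
        buffer.append(s)
        spikes.append(s)
    return spikes
```

The trade-off the summary mentions is visible here: depth is exchanged for time, since information must circulate through the delay loop over several timesteps instead of passing through stacked layers at once.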

AI · Bullish · arXiv – CS AI · Feb 27 · 6/10
🧠

Spark: Modular Spiking Neural Networks

Researchers have introduced Spark, a new modular framework for spiking neural networks that aims to improve energy efficiency and data processing compared to traditional neural networks. The framework demonstrates its capabilities by solving complex problems like the sparse-reward cartpole using simple plasticity mechanisms, potentially advancing continuous learning approaches similar to biological systems.

AI · Neutral · arXiv – CS AI · Mar 3 · 4/10
🧠

Synaptic bundle theory for spike-driven sensor-motor system: More than eight independent synaptic bundles collapse reward-STDP learning

Researchers developed a spike-driven sensor-motor system that identifies critical limits for neuronal learning. The study found that learning collapses when the number of motor neurons or independent synaptic bundles exceeds certain thresholds, providing insights into biological spike-based control mechanisms.

AI · Neutral · arXiv – CS AI · Mar 2 · 5/10
🧠

Modelling and Simulation of Neuromorphic Datasets for Anomaly Detection in Computer Vision

Researchers introduce ANTShapes, a Unity-based simulation framework that generates synthetic neuromorphic vision datasets to address the scarcity of Dynamic Vision Sensor data. The tool creates configurable 3D scenes with randomly-behaving objects for training anomaly detection and object recognition systems in event-based computer vision.