y0news

#spiking-neural-networks News & Analysis

15 articles tagged with #spiking-neural-networks. AI-curated summaries with sentiment analysis and key takeaways from 50+ sources.

15 articles
AI · Bullish · arXiv – CS AI · 5d ago · 7/10
🧠

Towards Green Wearable Computing: A Physics-Aware Spiking Neural Network for Energy-Efficient IMU-based Human Activity Recognition

Researchers have developed PAS-Net, a physics-aware spiking neural network that dramatically reduces power consumption in wearable IMU-based human activity recognition systems. The architecture achieves state-of-the-art accuracy while cutting energy consumption by up to 98% through sparse integer operations and an early-exit mechanism, establishing a new standard for ultra-low-power edge computing on battery-constrained devices.
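
The early-exit idea can be sketched generically. This is an illustrative sketch only, not PAS-Net's actual architecture; the function names, heads, and confidence threshold are all hypothetical. The principle is that a lightweight classifier after each stage lets easy inputs leave the network early, so deeper (more energy-hungry) layers run only when needed:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def early_exit_inference(x, stages, exit_heads, conf_threshold=0.9):
    """Generic early-exit inference: after each stage, a small head
    produces class logits; if its top-class confidence clears the
    threshold, skip the remaining stages and return immediately."""
    h = x
    for i, (stage, head) in enumerate(zip(stages, exit_heads)):
        h = stage(h)
        probs = softmax(head(h))
        if probs.max() >= conf_threshold:
            return probs.argmax(), i          # exited early at stage i
    return probs.argmax(), len(stages) - 1    # ran the full network
```

On easy inputs the first head is confident and the call returns at stage 0; harder inputs fall through to deeper stages, which is where the energy savings on battery-constrained wearables would come from.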

AI · Bullish · arXiv – CS AI · 6d ago · 7/10
🧠

Ge²mS-T: Multi-Dimensional Grouping for Ultra-High Energy Efficiency in Spiking Transformer

Researchers introduce Ge²mS-T, a novel Spiking Vision Transformer architecture that optimizes energy efficiency while maintaining training and inference performance through multi-dimensional grouped computation. The approach addresses fundamental limitations in existing SNN paradigms by balancing memory overhead, learning capability, and energy consumption simultaneously.

AI · Bullish · arXiv – CS AI · Mar 17 · 7/10
🧠

SPARQ: Spiking Early-Exit Neural Networks for Energy-Efficient Edge AI

SPARQ introduces a unified framework combining spiking neural networks, quantization-aware training, and reinforcement-learning-guided early exits for energy-efficient edge AI. The system achieves up to 5.15% higher accuracy than conventional quantized SNNs while reducing system energy consumption by a factor of over 330 and cutting synaptic operations by over 90%.

AI · Bullish · arXiv – CS AI · Mar 16 · 7/10
🧠

SRAM-Based Compute-in-Memory Accelerator for Linear-decay Spiking Neural Networks

Researchers developed an SRAM-based compute-in-memory accelerator for spiking neural networks that uses linear decay approximation instead of exponential decay, achieving 1.1x to 16.7x reduction in energy consumption. The innovation addresses the bottleneck of neuron state updates in neuromorphic computing by performing in-place decay directly within memory arrays.
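
The decay substitution can be illustrated with a minimal leaky integrate-and-fire (LIF) update. This is a sketch under assumptions, not the paper's circuit: the function names and parameter values are hypothetical. The point is that replacing the multiplicative exponential decay with a constant subtraction turns the state update into an operation that maps naturally onto in-place arithmetic inside a memory array:

```python
import numpy as np

def lif_step_exponential(v, spikes_in, w, tau=2.0, v_th=1.0):
    """Standard LIF update: the membrane potential decays
    exponentially (a multiply per neuron per timestep)."""
    v = v * np.exp(-1.0 / tau) + spikes_in @ w
    out = (v >= v_th).astype(np.float64)
    v = v * (1.0 - out)                 # reset neurons that fired
    return v, out

def lif_step_linear(v, spikes_in, w, decay=0.25, v_th=1.0):
    """Linear-decay approximation: subtract a constant each step,
    clamped at zero. A fixed subtraction is far cheaper to perform
    in place within an SRAM compute-in-memory array."""
    v = np.maximum(v - decay, 0.0) + spikes_in @ w
    out = (v >= v_th).astype(np.float64)
    v = v * (1.0 - out)
    return v, out
```

The approximation trades the smooth exponential leak for a piecewise-linear one; the reported 1.1x-16.7x energy reduction comes from eliminating the per-neuron multiply in the state-update bottleneck.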

AI · Bullish · arXiv – CS AI · Mar 5 · 7/10
🧠

mlx-snn: Spiking Neural Networks on Apple Silicon via MLX

Researchers have released mlx-snn, the first spiking neural network library built natively for Apple's MLX framework, targeting Apple Silicon hardware. The library demonstrates 2-2.5x faster training and 3-10x lower GPU memory usage compared to existing PyTorch-based solutions, achieving 97.28% accuracy on MNIST classification tasks.

AI · Bullish · arXiv – CS AI · Mar 3 · 7/10
🧠

Learning Internal Biological Neuron Parameters and Complexity-Based Encoding for Improved Spiking Neural Networks Performance

Researchers developed a novel learning approach for spiking neural networks that optimizes both synaptic weights and intrinsic neuronal parameters, achieving improvements in classification accuracy of up to 13.50 percentage points. The study introduces a biologically inspired SNN-LZC classifier that achieves 99.50% accuracy with sub-millisecond inference latency.

AI · Neutral · arXiv – CS AI · 4d ago · 6/10
🧠

EMBER: Autonomous Cognitive Behaviour from Learned Spiking Neural Network Dynamics in a Hybrid LLM Architecture

Researchers present EMBER, a hybrid architecture combining spiking neural networks with large language models where the SNN acts as a persistent, biologically-inspired memory substrate that autonomously triggers LLM reasoning. The system demonstrates emergent autonomous behavior, initiating unprompted user contact after learning associations during idle periods, suggesting a fundamental shift in how AI systems could coordinate cognition and action.

AI · Neutral · arXiv – CS AI · 6d ago · 6/10
🧠

Practical Bayesian Inference for Speech SNNs: Uncertainty and Loss-Landscape Smoothing

Researchers demonstrate that applying Bayesian inference to Spiking Neural Networks (SNNs) for speech processing smooths the irregular loss landscape caused by threshold-based spike generation. Testing on speech datasets shows improved performance metrics and more regular predictive landscapes compared to deterministic approaches.

AI · Bullish · arXiv – CS AI · Apr 7 · 6/10
🧠

Event-Driven Neuromorphic Vision Enables Energy-Efficient Visual Place Recognition

Researchers developed SpikeVPR, a bio-inspired visual place recognition system using event-based cameras and spiking neural networks that achieves comparable performance to deep networks while using 50x fewer parameters and consuming 30-250x less energy. The neuromorphic approach enables real-time deployment on mobile platforms for autonomous robot navigation.

AI · Bullish · arXiv – CS AI · Mar 27 · 6/10
🧠

Reconstructing Spiking Neural Networks Using a Single Neuron with Autapses

Researchers propose TDA-SNN, a novel spiking neural network framework that uses a single neuron with time-delayed autapses to reconstruct traditional multilayer architectures. The approach significantly reduces neuron count and memory requirements while maintaining competitive performance, though at the cost of increased temporal latency.

AI · Bullish · arXiv – CS AI · Mar 17 · 6/10
🧠

Collapse or Preserve: Data-Dependent Temporal Aggregation for Spiking Neural Network Acceleration

Researchers developed Temporal Aggregated Convolution (TAC) to accelerate spiking neural networks by aggregating spike frames before convolution, achieving a 13.8x speedup on rate-coded data. The study reveals that the optimal temporal aggregation strategy depends on the data type: collapsing temporal dimensions for rate-coded data while preserving them for event-based data.
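
The collapse-before-convolve trick can be sketched with a 1x1 convolution. This is an illustrative sketch of the aggregation principle, not the paper's TAC operator; the function names and shapes are assumptions. Because convolution is linear, summing binary spike frames over the time axis first and convolving the aggregated integer frame once gives the same result as convolving at every timestep and then summing:

```python
import numpy as np

def conv1x1_per_step(spikes, weight):
    """Baseline: apply the (1x1) convolution at every timestep,
    then sum over time -- T separate matmuls."""
    T = spikes.shape[0]                            # spikes: (T, C_in, H, W)
    out = np.stack([np.einsum('oc,chw->ohw', weight, spikes[t])
                    for t in range(T)])            # weight: (C_out, C_in)
    return out.sum(axis=0)

def conv1x1_tac(spikes, weight):
    """Temporal aggregation: sum the binary spike frames first,
    then convolve the aggregated frame once -- one matmul instead
    of T. Valid for rate-coded data, where only spike counts matter."""
    agg = spikes.sum(axis=0)                       # (C_in, H, W), integer counts
    return np.einsum('oc,chw->ohw', weight, agg)
```

This identity is exactly why the strategy is data-dependent: for event-based data the fine-grained spike timing carries information, so collapsing the time axis would discard it.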

๐Ÿข Nvidia
AI · Bullish · arXiv – CS AI · Mar 17 · 6/10
🧠

CATFormer: When Continual Learning Meets Spiking Transformers With Dynamic Thresholds

Researchers introduce CATFormer, a new spiking neural network architecture that mitigates catastrophic forgetting in continual learning through dynamic threshold neurons. The framework uses context-adaptive thresholds and task-agnostic inference to maintain knowledge across multiple learning tasks without performance degradation.

AI · Bullish · arXiv – CS AI · Mar 3 · 6/10
🧠

Scalable Multi-Task Learning through Spiking Neural Networks with Adaptive Task-Switching Policy for Intelligent Autonomous Agents

Researchers have developed SwitchMT, a novel methodology using Spiking Neural Networks with adaptive task-switching for multi-task learning in autonomous agents. The approach addresses task interference issues and demonstrates competitive performance in multiple Atari games while maintaining low power consumption and network complexity.

AI · Bullish · arXiv – CS AI · Feb 27 · 6/10
🧠

Spark: Modular Spiking Neural Networks

Researchers have introduced Spark, a new modular framework for spiking neural networks that aims to improve energy efficiency and data processing compared to traditional neural networks. The framework demonstrates its capabilities by solving tasks such as sparse-reward CartPole using simple plasticity mechanisms, potentially advancing continual learning approaches similar to those of biological systems.

AI · Neutral · arXiv – CS AI · Mar 3 · 4/10
🧠

Accuracy-Efficiency Trade-Offs in Spiking Neural Networks: A Lempel-Ziv Complexity Perspective on Learning Rules

Researchers developed a framework using Lempel-Ziv complexity to evaluate trade-offs between accuracy and computational efficiency in spiking neural networks. The study found that gradient-based learning achieves highest accuracy but at high computational cost, while bio-inspired learning rules offer better efficiency trade-offs for temporal pattern recognition tasks.
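
The underlying measure can be sketched concretely. This is a standard LZ76 phrase-counting implementation applied to a binary spike train, offered as an illustration of the metric itself, not as the paper's evaluation framework: scan the sequence left to right and count each phrase that has not appeared earlier, so regular spike trains score low and irregular ones score high:

```python
def lz76_complexity(bits):
    """Lempel-Ziv (LZ76) complexity of a binary sequence: the number
    of distinct phrases found in a single left-to-right scan."""
    s = ''.join('1' if b else '0' for b in bits)
    i, c = 0, 0
    while i < len(s):
        k = 1
        # grow the current phrase until it is no longer a substring
        # of everything seen before its last symbol
        while i + k <= len(s) and s[i:i + k] in s[:i + k - 1]:
            k += 1
        c += 1          # one new phrase completed
        i += k
    return c
```

A constant spike train compresses into two phrases while irregular activity keeps spawning new ones, which is what makes the measure usable as a proxy for the temporal richness (and computational cost) of spiking activity under different learning rules.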