15 articles tagged with #spiking-neural-networks. AI-curated summaries with sentiment analysis and key takeaways from 50+ sources.
AI · Bullish · arXiv – CS AI · 5d ago · 7/10
🧠 Researchers have developed PAS-Net, a physics-aware spiking neural network that dramatically reduces power consumption in wearable IMU-based human activity recognition systems. The architecture achieves state-of-the-art accuracy while cutting energy consumption by up to 98% through sparse integer operations and an early-exit mechanism, establishing a new standard for ultra-low-power edge computing on battery-constrained devices.
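The summary doesn't give PAS-Net's exact exit policy, but a confidence-thresholded early exit over SNN timesteps can be sketched as follows (the function name, accumulation scheme, and threshold are illustrative, not the paper's):

```python
import numpy as np

def infer_with_early_exit(logits_per_step, conf_threshold=0.9):
    """Accumulate per-timestep class logits and stop as soon as the
    running softmax confidence clears the threshold, skipping the
    remaining timesteps and the energy they would cost."""
    acc = np.zeros_like(logits_per_step[0], dtype=float)
    for t, logits in enumerate(logits_per_step, start=1):
        acc += logits
        p = np.exp(acc - acc.max())     # numerically stable softmax
        p /= p.sum()
        if p.max() >= conf_threshold:
            break                       # confident enough: exit early
    return int(p.argmax()), t           # predicted class, timesteps used

# An easy input where class 0 dominates every step exits well before T=8
cls, used = infer_with_early_exit([np.array([2.0, 0.0, 0.0])] * 8)
```

The energy win comes from `used` being much smaller than the full timestep budget on easy inputs, while hard inputs still run to completion.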
AI · Bullish · arXiv – CS AI · 6d ago · 7/10
🧠 Researchers introduce Ge²mS-T, a novel Spiking Vision Transformer architecture that optimizes energy efficiency while maintaining training and inference performance through multi-dimensional grouped computation. The approach addresses fundamental limitations in existing SNN paradigms by balancing memory overhead, learning capability, and energy consumption simultaneously.
AI · Bullish · arXiv – CS AI · Mar 17 · 7/10
🧠 SPARQ introduces a unified framework combining spiking neural networks, quantization-aware training, and reinforcement learning-guided early exits for energy-efficient edge AI. The system achieves up to 5.15% higher accuracy than conventional quantized SNNs while reducing system energy consumption more than 330-fold and cutting synaptic operations by over 90%.
AI · Bullish · arXiv – CS AI · Mar 16 · 7/10
🧠 Researchers developed an SRAM-based compute-in-memory accelerator for spiking neural networks that uses linear decay approximation instead of exponential decay, achieving a 1.1x to 16.7x reduction in energy consumption. The innovation addresses the bottleneck of neuron state updates in neuromorphic computing by performing in-place decay directly within memory arrays.
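As a rough sketch of the idea (the accelerator's fixed-point details aren't in the summary; the constants and function names here are illustrative), the linear approximation replaces the per-neuron multiply of exponential membrane decay with a fixed subtraction that a memory array can apply in place:

```python
import numpy as np

TAU, DT, V_TH = 20.0, 1.0, 1.0   # time constant, timestep, threshold (illustrative)

def lif_step_exp(v, i_in):
    """Reference LIF update: multiplicative (exponential) membrane decay,
    which costs one multiply per neuron per timestep."""
    v = v * np.exp(-DT / TAU) + i_in
    spiked = v >= V_TH
    return np.where(spiked, 0.0, v), spiked   # reset to 0 on spike

def lif_step_linear(v, i_in, decay=DT / TAU * V_TH):
    """Approximated LIF update: fixed subtractive (linear) decay, the kind
    of in-place operation an SRAM array can perform without a multiplier."""
    v = np.maximum(v - decay, 0.0) + i_in
    spiked = v >= V_TH
    return np.where(spiked, 0.0, v), spiked
```

Under sustained input both variants integrate toward threshold and fire; the linear version merely trades a small accuracy loss in the decay curve for a much cheaper per-step operation.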
AI · Bullish · arXiv – CS AI · Mar 5 · 7/10
🧠 Researchers have released mlx-snn, the first spiking neural network library built natively for Apple's MLX framework, targeting Apple Silicon hardware. The library demonstrates 2-2.5x faster training and 3-10x lower GPU memory usage compared to existing PyTorch-based solutions, achieving 97.28% accuracy on MNIST classification tasks.
AI · Bullish · arXiv – CS AI · Mar 3 · 7/10
🧠 Researchers developed a novel learning approach for spiking neural networks that optimizes both synaptic weights and intrinsic neuronal parameters, achieving up to 13.50 percentage point improvements in classification accuracy. The study introduces a biologically inspired SNN-LZC classifier that achieves 99.50% accuracy with sub-millisecond inference latency.
AI · Neutral · arXiv – CS AI · 4d ago · 6/10
🧠 Researchers present EMBER, a hybrid architecture combining spiking neural networks with large language models where the SNN acts as a persistent, biologically-inspired memory substrate that autonomously triggers LLM reasoning. The system demonstrates emergent autonomous behavior, initiating unprompted user contact after learning associations during idle periods, suggesting a fundamental shift in how AI systems could coordinate cognition and action.
AI · Neutral · arXiv – CS AI · 6d ago · 6/10
🧠 Researchers demonstrate that applying Bayesian inference to Spiking Neural Networks (SNNs) for speech processing smooths the irregular loss landscape caused by threshold-based spike generation. Testing on speech datasets shows improved performance metrics and more regular predictive landscapes compared to deterministic approaches.
AI · Bullish · arXiv – CS AI · Apr 7 · 6/10
🧠 Researchers developed SpikeVPR, a bio-inspired visual place recognition system using event-based cameras and spiking neural networks that achieves comparable performance to deep networks while using 50x fewer parameters and consuming 30-250x less energy. The neuromorphic approach enables real-time deployment on mobile platforms for autonomous robot navigation.
AI · Bullish · arXiv – CS AI · Mar 27 · 6/10
🧠 Researchers propose TDA-SNN, a novel spiking neural network framework that uses a single neuron with time-delayed autapses to reconstruct traditional multilayer architectures. The approach significantly reduces neuron count and memory requirements while maintaining competitive performance, though at the cost of increased temporal latency.
AI · Bullish · arXiv – CS AI · Mar 17 · 6/10
🧠 Researchers developed Temporal Aggregated Convolution (TAC) to accelerate spiking neural networks by aggregating spike frames before convolution, achieving a 13.8x speedup on rate-coded data. The study reveals that the optimal temporal aggregation strategy depends on the data type: collapse the temporal dimension for rate-coded data but preserve it for event-based data.
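The equivalence this exploits for rate-coded data follows from the linearity of convolution: summing the T spike frames first and convolving once gives the same result as convolving every frame and summing afterward, at roughly 1/T the convolution cost. A minimal 1-D NumPy sketch of that identity (not the paper's implementation):

```python
import numpy as np

def conv1d_valid(x, w):
    """Plain 'valid' 1-D convolution (correlation form), no bias."""
    n = len(x) - len(w) + 1
    return np.array([np.dot(x[i:i + len(w)], w) for i in range(n)])

rng = np.random.default_rng(0)
T, N, K = 8, 16, 3                                  # timesteps, input length, kernel size
spikes = (rng.random((T, N)) < 0.3).astype(float)   # rate-coded binary spike frames
w = rng.standard_normal(K)

# Per-timestep path: T convolutions, then a sum over time
per_step = sum(conv1d_valid(spikes[t], w) for t in range(T))

# Aggregated path: sum the spike frames first, then a single convolution
aggregated = conv1d_valid(spikes.sum(axis=0), w)
```

For event-based data the temporal structure itself carries information, which is why the study keeps the time dimension there instead of collapsing it.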
🟢 Nvidia
AI · Bullish · arXiv – CS AI · Mar 17 · 6/10
🧠 Researchers introduce CATFormer, a new spiking neural network architecture that addresses catastrophic forgetting in continual learning through dynamic threshold neurons. The framework uses context-adaptive thresholds and task-agnostic inference to maintain knowledge across multiple learning tasks without performance degradation.
AI · Bullish · arXiv – CS AI · Mar 3 · 6/10
🧠 Researchers have developed SwitchMT, a novel methodology using Spiking Neural Networks with adaptive task-switching for multi-task learning in autonomous agents. The approach addresses task interference issues and demonstrates competitive performance in multiple Atari games while maintaining low power consumption and network complexity.
AI · Bullish · arXiv – CS AI · Feb 27 · 6/10
🧠 Researchers have introduced Spark, a new modular framework for spiking neural networks that aims to improve energy efficiency and data processing compared to traditional neural networks. The framework demonstrates its capabilities by solving problems such as the sparse-reward CartPole task using simple plasticity mechanisms, potentially advancing continual-learning approaches similar to biological systems.
AI · Neutral · arXiv – CS AI · Mar 3 · 4/10
🧠 Researchers developed a framework using Lempel-Ziv complexity to evaluate trade-offs between accuracy and computational efficiency in spiking neural networks. The study found that gradient-based learning achieves the highest accuracy but at high computational cost, while bio-inspired learning rules offer better efficiency trade-offs for temporal pattern recognition tasks.
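The summary doesn't specify the paper's exact estimator, but a common LZ76-style phrase-counting measure applied to a binarized spike train can be sketched as follows (the string-scanning implementation is illustrative):

```python
import numpy as np

def lz_complexity(seq):
    """LZ76-style complexity: count the phrases produced by scanning the
    sequence left to right, extending each phrase for as long as it still
    occurs somewhere earlier in the string."""
    s = "".join(map(str, seq))
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1          # one new phrase completed
        i += l
    return c

rng = np.random.default_rng(0)
regular = [0, 1] * 25                          # periodic spike train: few phrases
noisy = (rng.random(50) < 0.5).astype(int)     # random spike train: many phrases
```

Higher complexity indicates a less compressible, more information-rich spike train, which is the kind of quantity the framework trades off against classification accuracy.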