Spiking Neural Networks in Imaging: A Review and Case Study
Abstract
1. Introduction
- Compare pre-processing techniques, training approaches, and encoding methods.
- Survey architectures applied to imaging tasks.
- Evaluate implementations on hardware platforms, including neuromorphic processors, FPGA-based accelerators, and ASICs.
- Identify gaps and propose directions for future research.
2. Background
2.1. Brief History
2.2. Neuron Models
2.2.1. Biological Neurons
2.2.2. Artificial Neurons
2.2.3. Spiking Neurons
2.3. Spike Encoding
2.3.1. Rate and Population Coding
2.3.2. Temporal Coding
2.4. SNN Image Processing
2.5. Summary
3. State of the Art
3.1. Sensor Integration
3.1.1. Sensor Types
3.1.2. Pre-Processing Techniques
3.1.3. Encoding Methods
3.2. Datasets, Benchmarks, and Applications
3.2.1. Datasets
3.2.2. Applications
3.2.3. Metrics
3.3. Spiking Architectures
3.3.1. Feed-Forward Networks
3.3.2. Recurrent Networks
3.3.3. Attention-Based Networks
3.3.4. Neuron and Threshold Variations
3.3.5. Beyond Binary Spikes
3.4. Training SNNs
3.4.1. ANN-SNN Conversion
3.4.2. Supervised Direct Training
3.4.3. Unsupervised Direct Training
3.4.4. Hybrid Training
3.5. Hardware and Accelerators
3.5.1. Neuromorphic Processors
3.5.2. FPGA-Based Accelerators
3.5.3. ASICs and Integrated Vision Chips
3.5.4. Memristive Approaches
3.5.5. On-Chip Learning
3.6. Summary
4. Comparative Analysis
4.1. Analysis of Reported Results
4.1.1. Data Collection Methodology
- Hardware platform: CPU/GPU, FPGA, ASIC, or neuromorphic processors.
- Sensor type: SPAD arrays, event-based (DVS and spike cameras), ToF (including LiDAR), or CIS.
- Performance metrics: Accuracy, latency, and energy efficiency where available. Where the input resolution was not specified, the resolution of the training images was used as a proxy.
- Training method: ANN-SNN conversion or direct training, together with the datasets used for each application.
- Application simplification: Recognition tasks were grouped under classification, and detection tasks were grouped under classification and regression (bounding box estimation); a sketch of the resulting per-study record is given after this list.
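
To make the categorisation concrete, the sketch below shows one way each surveyed work could be recorded against these fields. This is a minimal illustration only: the class name, field names, and example values are assumptions for this sketch, not the authors' actual data-collection schema or reported results.

```python
# Minimal sketch of a per-study record mirroring the categories above
# (hardware, sensor, metrics, training method, application). All names
# and example values are hypothetical placeholders.
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class SurveyedWork:
    reference: str                                # citation key, e.g. "[xx]"
    hardware: str                                 # "CPU/GPU", "FPGA", "ASIC", or "Neuromorphic"
    sensor: str                                   # "SPAD", "Event (DVS/Spike)", "ToF/LiDAR", or "CIS"
    training: str                                 # "ANN-SNN conversion", "Direct", or "Hybrid"
    application: str                              # "Classification", "Classification + Regression", ...
    dataset: str                                  # dataset(s) used for the application
    accuracy: Optional[float] = None              # task accuracy, where reported
    latency_ms: Optional[float] = None            # inference latency, where reported
    energy_uj: Optional[float] = None             # energy per inference, where reported
    resolution: Optional[Tuple[int, int]] = None  # input resolution (training images as proxy)


# Illustrative entry; the values are placeholders, not results from a cited work.
example = SurveyedWork(reference="[xx]", hardware="FPGA", sensor="Event (DVS/Spike)",
                       training="Direct", application="Classification",
                       dataset="DVS Gesture", accuracy=0.95, latency_ms=1.2,
                       resolution=(128, 128))
```
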
4.1.2. Model Complexity Trade-Offs
4.1.3. Deployment Bottlenecks
4.1.4. Encoding Suitability
4.1.5. Application Focus and Sensor Integration
4.2. Limitations of Existing Methods
4.2.1. Dataset and Generalisation Gaps
4.2.2. Training and Conversion Inefficiencies
4.2.3. Hardware Limitations
4.2.4. Narrow Application Focus
4.2.5. Energy and Latency Trade-Offs
4.2.6. Algorithm-Hardware Mismatch
4.2.7. Inconsistent Reporting and Benchmarks
4.3. Summary of Insights
5. Case Study: Spiking Neural Network-Based TDC-Less dToF
5.1. Introduction
5.2. Architecture
5.3. Results
5.4. Conclusion
6. Challenges and Future Direction
6.1. Scaling and Ecosystem
6.2. Hardware Bottlenecks and Roofline Models
6.3. Sensor Integration and Data Bottlenecks
6.4. Training and Algorithmic Inefficiencies
6.5. Applications Beyond Vision
6.6. Standardisation and Benchmarks
6.7. Summary
7. Conclusions
7.1. Summary of Findings
- Application bias: Most of the studies focus on classification and regression tasks, with segmentation and reconstruction being far less represented.
- Dataset limitations: Research is dominated by small, constrained, or custom datasets (e.g., MNIST, CIFAR-10, DVS Gesture), raising concerns about robustness and generalisation to high-resolution, real-world imaging.
- Training inefficiencies: ANN-SNN conversion remains the most widely adopted training strategy, but it often increases latency and energy consumption. Direct training and hybrid methods show promise but are computationally expensive and under-explored at scale.
- Hardware gap: Most evaluations remain at the CPU/GPU simulation level. Dedicated FPGA, ASIC, or neuromorphic processor implementations are comparatively rare.
- Energy and latency trade-offs: Compact models achieve efficiency at the expense of accuracy, while large models scale poorly in energy and latency. Reported efficiency gains are often based on estimates rather than measured benchmarks.
- Sensor–network co-design: Few works explicitly co-optimise sensor design with network architectures. Novel sensors such as ToF devices remain largely unexplored in hardware-integrated SNN pipelines.
- Inconsistent reporting: Lack of standardised benchmarks for energy, latency, and throughput hampers fair comparison and slows progress.
7.2. Outlook
7.3. Closing Remarks
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
- LeCun, Y.; Bengio, Y.; Hinton, G. Deep Learning. Nature 2015, 521, 436–444. [Google Scholar] [CrossRef]
- Abiodun, O.I.; Jantan, A.; Omolara, A.E.; Dada, K.V.; Mohamed, N.A.; Arshad, H. State-of-the-Art in Artificial Neural Network Applications: A Survey. Heliyon 2018, 4, e00938. [Google Scholar] [CrossRef] [PubMed]
- Anumol, C.S. Advancements in CNN Architectures for Computer Vision: A Comprehensive Review. In Proceedings of the 2023 Annual International Conference on Emerging Research Areas: International Conference on Intelligent Systems (AICERA/ICIS), Kanjirapally, India, 16–18 November 2023; pp. 1–7. [Google Scholar]
- Sze, V.; Chen, Y.-H.; Yang, T.-J.; Emer, J.S. Efficient Processing of Deep Neural Networks: A Tutorial and Survey. Proc. IEEE 2017, 105, 2295–2329. [Google Scholar] [CrossRef]
- Deng, L.; Wu, Y.; Hu, X.; Liang, L.; Ding, Y.; Li, G.; Zhao, G.; Li, P.; Xie, Y. Rethinking the Performance Comparison between SNNs and ANNs. Neural Netw. 2020, 121, 294–307. [Google Scholar] [CrossRef]
- Maass, W. Networks of Spiking Neurons: The Third Generation of Neural Network Models. Neural Netw. 1997, 10, 1659–1671. [Google Scholar] [CrossRef]
- Pfeiffer, M.; Pfeil, T. Deep Learning with Spiking Neurons: Opportunities and Challenges. Front. Neurosci. 2018, 12, 774. [Google Scholar] [CrossRef]
- Gerstner, W.; Kistler, W.M. Spiking Neuron Models: Single Neurons, Populations, Plasticity, 1st ed.; Cambridge University Press: Cambridge, UK, 2002; ISBN 978-0-511-81570-6. [Google Scholar]
- McCulloch, W.S.; Pitts, W.H. A Logical Calculus of the Ideas Immanent in Nervous Activity. Bull. Math. Biophys. 1943, 5, 115–133. [Google Scholar] [CrossRef]
- Rosenblatt, F. The Perceptron: A Probabilistic Model for Information Storage and Organization in the Brain. Psychol. Rev. 1958, 65, 386–408. [Google Scholar] [CrossRef]
- Thorpe, S.; Imbert, M. Biological Constraints on Connectionist Modeling; Elsevier: Amsterdam, The Netherlands, 1989; pp. 63–92. [Google Scholar]
- Benjamin, B.V.; Gao, P.; McQuinn, E.; Choudhary, S.; Chandrasekaran, A.R.; Bussat, J.-M.; Alvarez-Icaza, R.; Arthur, J.V.; Merolla, P.A.; Boahen, K. Neurogrid: A Mixed-Analog-Digital Multichip System for Large-Scale Neural Simulations. Proc. IEEE 2014, 102, 699–716. [Google Scholar] [CrossRef]
- Akopyan, F.; Sawada, J.; Cassidy, A.; Alvarez-Icaza, R.; Arthur, J.; Merolla, P.; Imam, N.; Nakamura, Y.; Datta, P.; Nam, G.-J.; et al. TrueNorth: Design and Tool Flow of a 65 mW 1 Million Neuron Programmable Neurosynaptic Chip. IEEE Trans. Comput.-Aided Des. Integr. Circuits Syst. 2015, 34, 1537–1557. [Google Scholar] [CrossRef]
- Furber, S.B.; Galluppi, F.; Temple, S.; Plana, L.A. The SpiNNaker Project. Proc. IEEE 2014, 102, 652–665. [Google Scholar] [CrossRef]
- Hodgkin, A.L.; Huxley, A.F. A Quantitative Description of Membrane Current and Its Application to Conduction and Excitation in Nerve. J. Physiol. 1952, 117, 500–544. [Google Scholar] [CrossRef]
- Izhikevich, E.M. Simple Model of Spiking Neurons. IEEE Trans. Neural Netw. 2003, 14, 1569–1572. [Google Scholar] [CrossRef]
- Izhikevich, E.M. Which Model to Use for Cortical Spiking Neurons? IEEE Trans. Neural Netw. 2004, 15, 1063–1070. [Google Scholar] [CrossRef]
- Dubey, S.R.; Singh, S.K.; Chaudhuri, B.B. Activation Functions in Deep Learning: A Comprehensive Survey and Benchmark. Neurocomputing 2022, 503, 92–108. [Google Scholar] [CrossRef]
- Brunel, N.; Van Rossum, M.C.W. Lapicque’s 1907 Paper: From Frogs to Integrate-and-Fire. Biol. Cybern. 2007, 97, 337–339. [Google Scholar] [CrossRef] [PubMed]
- Eshraghian, J.K.; Ward, M.; Neftci, E.O.; Wang, X.; Lenz, G.; Dwivedi, G.; Bennamoun, M.; Jeong, D.S.; Lu, W.D. Training Spiking Neural Networks Using Lessons From Deep Learning. Proc. IEEE 2023, 111, 1016–1054. [Google Scholar] [CrossRef]
- Adrian, E.D.; Zotterman, Y. The Impulses Produced by Sensory Nerve Endings. J. Physiol. 1926, 61, 465–483. [Google Scholar] [CrossRef] [PubMed]
- Auge, D.; Hille, J.; Mueller, E.; Knoll, A. A Survey of Encoding Techniques for Signal Processing in Spiking Neural Networks. Neural Process. Lett. 2021, 53, 4693–4710. [Google Scholar] [CrossRef]
- Brette, R. Philosophy of the Spike: Rate-Based vs. Spike-Based Theories of the Brain. Front. Syst. Neurosci. 2015, 9, 151. [Google Scholar] [CrossRef]
- Zhang, J.; Dong, B.; Zhang, H.; Ding, J.; Heide, F.; Yin, B.; Yang, X. Spiking Transformers for Event-Based Single Object Tracking. In Proceedings of the 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), New Orleans, LA, USA, 18–24 June 2022; pp. 8791–8800. [Google Scholar]
- Zhang, Z.; Yang, H.; Li, J.; Chong, S.W.; Eshraghian, J.K.; Yong, K.-T.; Vigolo, D.; McGuire, H.M.; Kavehei, O. Neuromorphic Imaging Cytometry on Human Blood Cells. Neuromorphic Comput. Eng. 2025, 5, 024001. [Google Scholar] [CrossRef]
- Zhang, C.; Kang, L.; Yang, X.; Guo, G.; Feng, P.; Yu, S.; Liu, L. A 1000 Fps Spiking Neural Network Tracking Algorithm for On-Chip Processing of Dynamic Vision Sensor Data. In Proceedings of the 2022 IEEE International Conference on Integrated Circuits, Technologies and Applications (ICTA), Xi’an, China, 28–30 October 2022; pp. 178–179. [Google Scholar]
- Zhang, Z.; Yang, H.; Eshraghian, J.K.; Li, J.; Yong, K.-T.; Vigolo, D.; McGuire, H.M.; Kavehei, O. Cell Detection with Convolutional Spiking Neural Network for Neuromorphic Cytometry. APL Mach. Learn. 2024, 2, 026117. [Google Scholar] [CrossRef]
- Kang, Y.; Yang, G.; Park, C. Event-Based White Blood Cell Classification Using Convolutional Spiking Neural Networks. In Proceedings of the 2024 IEEE International Conference on Metaverse Computing, Networking, and Applications (MetaCom), Hong Kong, China, 12–14 August 2024; pp. 301–304. [Google Scholar]
- Zhu, L.; Wang, X.; Chang, Y.; Li, J.; Huang, T.; Tian, Y. Event-Based Video Reconstruction via Potential-Assisted Spiking Neural Network. arXiv 2022, arXiv:2201.10943. [Google Scholar]
- Massa, R.; Marchisio, A.; Martina, M.; Shafique, M. An Efficient Spiking Neural Network for Recognizing Gestures with a DVS Camera on the Loihi Neuromorphic Processor. arXiv 2020, arXiv:2006.09985. [Google Scholar]
- Gehrig, M.; Shrestha, S.B.; Mouritzen, D.; Scaramuzza, D. Event-Based Angular Velocity Regression with Spiking Networks. arXiv 2020, arXiv:2003.02790. [Google Scholar]
- Shen, Z.; Corradi, F. Spike Vision: A Fully Spiking Neural Network Transformer-Inspired Model for Dynamic Vision Sensors. In Proceedings of the 2024 58th Asilomar Conference on Signals, Systems, and Computers, Pacific Grove, CA, USA, 27–30 October 2024; pp. 1537–1541. [Google Scholar]
- Zhao, R.; Xiong, R.; Zhang, J.; Yu, Z.; Zhu, S.; Ma, L.; Huang, T. Spike Camera Image Reconstruction Using Deep Spiking Neural Networks. IEEE Trans. Circuits Syst. Video Technol. 2024, 34, 5207–5212. [Google Scholar] [CrossRef]
- Hasssan, A.; Meng, J.; Seo, J.-S. Spiking Neural Network with Learnable Threshold for Event-Based Classification and Object Detection. In Proceedings of the 2024 International Joint Conference on Neural Networks (IJCNN), Yokohama, Japan, 30 June–5 July 2024; pp. 1–8. [Google Scholar]
- Moustakas, G.; Tsilikas, I.; Bogris, A.; Mesaritakis, C. Neuromorphic Imaging Flow Cytometry Combined with Adaptive Recurrent Spiking Neural Networks. Opt. Express 2025, 33, 34180. [Google Scholar] [CrossRef]
- MacLean, J.I.; Stewart, B.D.; Gyongy, I. TDC-Less Direct Time-of-Flight Imaging Using Spiking Neural Networks. IEEE Sens. J. 2024, 24, 33838–33846. [Google Scholar] [CrossRef]
- Lielamurs, E.; Sayed, I.; Cvetkovs, A.; Novickis, R.; Zencovs, A.; Celitans, M.; Bizuns, A.; Dimitrakopoulos, G.; Koszescha, J.; Ozols, K. A Distributed Time-of-Flight Sensor System for Autonomous Vehicles: Architecture, Sensor Fusion, and Spiking Neural Network Perception. Electronics 2025, 14, 1375. [Google Scholar] [CrossRef]
- Zhou, S.; Chen, Y.; Li, X.; Sanyal, A. Deep SCNN-Based Real-Time Object Detection for Self-Driving Vehicles Using LiDAR Temporal Data. IEEE Access 2020, 8, 76903–76912. [Google Scholar] [CrossRef]
- Zhuang, G.; Bing, Z.; Huang, K.; Knoll, A. Toward Neuromorphic Perception: Spike-Driven Lane Segmentation for Autonomous Driving Using LiDAR Sensor. In Proceedings of the 2023 IEEE 26th International Conference on Intelligent Transportation Systems (ITSC), Bilbao, Spain, 24 September 2023; IEEE: Bilbao, Spain, 2023; pp. 2448–2453. [Google Scholar]
- Zang, Z.; Li, X.; Li, D.D.U. Spiking Neural Network Enhanced Hand Gesture Recognition Using Low-Cost Single-Photon Avalanche Diode Array. arXiv 2024, arXiv:2402.05441. [Google Scholar] [CrossRef]
- Shawkat, M.S.A.; Adnan, M.M.; Febbo, R.D.; Murray, J.J.; Rose, G.S. A Single Chip SPAD Based Vision Sensing System With Integrated Memristive Spiking Neuromorphic Processing. IEEE Access 2023, 11, 19441–19457. [Google Scholar] [CrossRef]
- Lin, Y.; Charbon, E. Spiking Neural Networks for Active Time-Resolved SPAD Imaging. In Proceedings of the 2024 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), Waikoloa, HI, USA, 3–8 January 2024; pp. 8132–8141. [Google Scholar]
- Kirkland, P.; Kapitany, V.; Lyons, A.; Soraghan, J.; Turpin, A.; Faccio, D.; Di Caterina, G. Imaging from Temporal Data via Spiking Convolutional Neural Networks. In Proceedings of the Emerging Imaging and Sensing Technologies for Security and Defence V; and Advanced Manufacturing Technologies for Micro- and Nanosystems in Security and Defence III; Farsari, M., Rarity, J.G., Kajzar, F., Szep, A., Hollins, R.C., Buller, G.S., Lamb, R.A., Laurenzis, M., Camposeo, A., Persano, L., et al., Eds.; SPIE: Online Only, 2020; Volume 11540, p. 115400J. [Google Scholar]
- Yang, X.; Yao, C.; Kang, L.; Luo, Q.; Qi, N.; Dou, R.; Yu, S.; Feng, P.; Wei, Z.; Liu, J.; et al. A Bio-Inspired Spiking Vision Chip Based on SPAD Imaging and Direct Spike Computing for Versatile Edge Vision. IEEE J. Solid-State Circuits 2024, 59, 1883–1898. [Google Scholar] [CrossRef]
- Yang, X.; Lei, F.; Tian, N.; Shi, C.; Wang, Z.; Yu, S.; Dou, R.; Feng, P.; Qi, N.; Wei, Z.; et al. A 10,000-Inference/s Bio-Inspired Spiking Vision Chip Based on an End-to-End SNN Embedding Image Signal Enhancement. IEEE J. Solid-State Circuits 2025, 1–17. [Google Scholar] [CrossRef]
- Lemaire, E.; Moretti, M.; Daniel, L.; Miramond, B.; Millet, P.; Feresin, F.; Bilavarn, S. An FPGA-Based Hybrid Neural Network Accelerator for Embedded Satellite Image Classification. In Proceedings of the 2020 IEEE International Symposium on Circuits and Systems (ISCAS), Seville, Spain, 12–14 October 2020; pp. 1–5. [Google Scholar]
- Shi, X.; Hao, Z.; Yu, Z. SpikingResformer: Bridging ResNet and Vision Transformer in Spiking Neural Networks. arXiv 2024, arXiv:2403.14302. [Google Scholar]
- Zhou, Z.; Zhu, Y.; He, C.; Wang, Y.; Yan, S.; Tian, Y.; Yuan, L. Spikformer: When Spiking Neural Network Meets Transformer. arXiv 2023, arXiv:2209.15425. [Google Scholar]
- Yao, M.; Hu, J.; Zhou, Z.; Yuan, L.; Tian, Y.; Xu, B.; Li, G. Spike-Driven Transformer. arXiv 2023, arXiv:2307.01694. [Google Scholar]
- Datta, G.; Deng, H.; Aviles, R.; Liu, Z.; Beerel, P.A. Bridging the Gap Between Spiking Neural Networks & LSTMs for Latency & Energy Efficiency. In Proceedings of the 2023 IEEE/ACM International Symposium on Low Power Electronics and Design (ISLPED), Vienna, Austria, 7–8 August 2023; pp. 1–6. [Google Scholar]
- Chen, X.; Yang, Q.; Wu, J.; Li, H.; Tan, K.C. A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks. IEEE Trans. Pattern Anal. Mach. Intell. 2024, 46, 3064–3078. [Google Scholar] [CrossRef] [PubMed]
- Wang, Z.; Yu, N.; Liao, Y. Activeness: A Novel Neural Coding Scheme Integrating the Spike Rate and Temporal Information in the Spiking Neural Network. Electronics 2023, 12, 3992. [Google Scholar] [CrossRef]
- Luu, N.T.; Luu, D.T.; Nam, P.N.; Thang, T.C. Improvement of Spiking Neural Network with Bit Plane Coding. In Proceedings of the 2024 IEEE 16th International Conference on Computational Intelligence and Communication Networks (CICN), Indore, India, 22–23 December 2024; pp. 1220–1224. [Google Scholar]
- Wang, W.; Zhou, S.; Li, J.; Li, X.; Yuan, J.; Jin, Z. Temporal Pulses Driven Spiking Neural Network for Time and Power Efficient Object Recognition in Autonomous Driving. In Proceedings of the 2020 25th International Conference on Pattern Recognition (ICPR), Milan, Italy, 10–15 January 2021; IEEE: Milan, Italy, 2021; pp. 6359–6366. [Google Scholar]
- Kim, S.; Park, S.; Na, B.; Yoon, S. Spiking-YOLO: Spiking Neural Network for Energy-Efficient Object Detection. arXiv 2019, arXiv:1903.06530. [Google Scholar]
- Xiao, Y.; He, P.; Deng, H.; Xie, T.; Jing, M.; Zuo, L. Multi-Bit Mechanism: Towards Ultra-Low Time Steps for Spiking Neural Networks. In Advanced Intelligent Computing Technology and Applications; Huang, D.-S., Zhang, C., Zhang, Q., Pan, Y., Eds.; Lecture Notes in Computer Science; Springer Nature: Singapore, 2025; Volume 15862, pp. 378–389. [Google Scholar]
- Hu, Y.; Deng, L.; Wu, Y.; Yao, M.; Li, G. Advancing Spiking Neural Networks Toward Deep Residual Learning. IEEE Trans. Neural Netw. Learn. Syst. 2025, 36, 2353–2367. [Google Scholar] [CrossRef]
- Stöckl, C.; Maass, W. Optimized Spiking Neurons Can Classify Images with High Accuracy through Temporal Coding with Two Spikes. Nat. Mach. Intell. 2021, 3, 230–238. [Google Scholar] [CrossRef]
- Nowshin, F.; An, H.; Yi, Y. Towards Energy-Efficient Spiking Neural Networks: A Robust Hybrid CMOS-Memristive Accelerator. ACM J. Emerg. Technol. Comput. Syst. 2024, 20, 1–20. [Google Scholar] [CrossRef]
- Voelker, A.; Kajić, I.; Eliasmith, C. Legendre Memory Units: Continuous-Time Representation in Recurrent Neural Networks; Curran Associates Inc.: Red Hook, NY, USA, 2019; pp. 15570–15579. [Google Scholar]
- Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, Ł.; Polosukhin, I. Attention Is All You Need. In Proceedings of the Advances in Neural Information Processing Systems; Guyon, I., Luxburg, U.V., Bengio, S., Wallach, H., Fergus, R., Vishwanathan, S., Garnett, R., Eds.; Curran Associates, Inc.: Red Hook, NY, USA, 2017; Volume 30. [Google Scholar]
- Yao, M.; Hu, J.; Hu, T.; Xu, Y.; Zhou, Z.; Tian, Y.; Xu, B.; Li, G. Spike-Driven Transformer V2: Meta Spiking Neural Network Architecture Inspiring the Design of Next-Generation Neuromorphic Chips. arXiv 2024, arXiv:2404.03663v1. [Google Scholar]
- Pan, Y.; Jiang, H.; Chen, J.; Li, Y.; Zhao, H.; Zhou, Y.; Shu, P.; Wu, Z.; Liu, Z.; Zhu, D.; et al. EG-SpikeFormer: Eye-Gaze Guided Transformer on Spiking Neural Networks for Medical Image Analysis. arXiv 2024, arXiv:2410.09674. [Google Scholar]
- Xu, Q.; Gao, Y.; Shen, J.; Li, Y.; Ran, X.; Tang, H.; Pan, G. Enhancing Adaptive History Reserving by Spiking Convolutional Block Attention Module in Recurrent Neural Networks. arXiv 2024, arXiv:2401.03719. [Google Scholar]
- Orchard, G.; Frady, E.P.; Rubin, D.B.D.; Sanborn, S.; Shrestha, S.B.; Sommer, F.T.; Davies, M. Efficient Neuromorphic Signal Processing with Loihi 2. In Proceedings of the 2021 IEEE Workshop on Signal Processing Systems (SiPS), Coimbra, Portugal, 19–21 October 2021; IEEE: Coimbra, Portugal, 2021; pp. 254–259. [Google Scholar]
- Guo, Y.; Chen, Y.; Liu, X.; Peng, W.; Zhang, Y.; Huang, X.; Ma, Z. Ternary Spike: Learning Ternary Spikes for Spiking Neural Networks. Proc. AAAI Conf. Artif. Intell. 2024, 38, 12244–12252. [Google Scholar] [CrossRef]
- Wang, X.; Zhang, Y. MT-SNN: Enhance Spiking Neural Network with Multiple Thresholds. arXiv 2023, arXiv:2303.11127. [Google Scholar]
- Hao, Z.; Shi, X.; Liu, Y.; Yu, Z.; Huang, T. LM-HT SNN: Enhancing the Performance of SNN to ANN Counterpart through Learnable Multi-Hierarchical Threshold Model. arXiv 2024, arXiv:2402.00411. [Google Scholar]
- Feng, L.; Liu, Q.; Tang, H.; Ma, D.; Pan, G. Multi-Level Firing with Spiking DS-ResNet: Enabling Better and Deeper Directly-Trained Spiking Neural Networks. In Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence, Vienna, Austria, 23–29 July 2022; pp. 2471–2477. [Google Scholar]
- Qi, H.; Lian, S.; Li, X.; Tang, H. Adaptive Multi-Level Firing for Direct Training Deep Spiking Neural Networks. In Proceedings of the 2024 International Joint Conference on Neural Networks (IJCNN), Yokohama, Japan, 30 June–5 July 2024; pp. 1–8. [Google Scholar]
- Fan, L.; Shen, H.; Lian, X.; Li, Y.; Yao, M.; Li, G.; Hu, D. A Multisynaptic Spiking Neuron for Simultaneously Encoding Spatiotemporal Dynamics. Nat. Commun. 2025, 16, 7155. [Google Scholar] [CrossRef]
- Diehl, P.U.; Neil, D.; Binas, J.; Cook, M.; Liu, S.-C.; Pfeiffer, M. Fast-Classifying, High-Accuracy Spiking Deep Networks through Weight and Threshold Balancing. In Proceedings of the 2015 International Joint Conference on Neural Networks (IJCNN), Killarney, Ireland, 12–17 July 2015; pp. 1–8. [Google Scholar]
- Rueckauer, B.; Lungu, I.-A.; Hu, Y.; Pfeiffer, M.; Liu, S.-C. Conversion of Continuous-Valued Deep Networks to Efficient Event-Driven Networks for Image Classification. Front. Neurosci. 2017, 11, 682. [Google Scholar] [CrossRef]
- Sengupta, A.; Ye, Y.; Wang, R.; Liu, C.; Roy, K. Going Deeper in Spiking Neural Networks: VGG and Residual Architectures. Front. Neurosci. 2019, 13, 95. [Google Scholar] [CrossRef]
- Ding, J.; Yu, Z.; Tian, Y.; Huang, T. Optimal ANN-SNN Conversion for Fast and Accurate Inference in Deep Spiking Neural Networks. In Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence, Montreal, QC, Canada, 19–27 August 2021; pp. 2328–2336. [Google Scholar]
- Deng, S.; Gu, S. Optimal Conversion of Conventional Artificial Neural Networks to Spiking Neural Networks. arXiv 2021, arXiv:2103.00476. [Google Scholar]
- Bu, T.; Fang, W.; Ding, J.; Dai, P.; Yu, Z.; Huang, T. Optimal ANN-SNN Conversion for High-Accuracy and Ultra-Low-Latency Spiking Neural Networks. arXiv 2023, arXiv:2303.04347. [Google Scholar]
- Neftci, E.O.; Mostafa, H.; Zenke, F. Surrogate Gradient Learning in Spiking Neural Networks: Bringing the Power of Gradient-Based Optimization to Spiking Neural Networks. IEEE Signal Process. Mag. 2019, 36, 51–63. [Google Scholar] [CrossRef]
- Masquelier, T.; Thorpe, S.J. Unsupervised Learning of Visual Features through Spike Timing Dependent Plasticity. PLoS Comput. Biol. 2007, 3, e31. [Google Scholar] [CrossRef] [PubMed]
- Kheradpisheh, S.R.; Ganjtabesh, M.; Thorpe, S.J.; Masquelier, T. STDP-Based Spiking Deep Convolutional Neural Networks for Object Recognition. Neural Netw. 2018, 99, 56–67. [Google Scholar] [CrossRef] [PubMed]
- Rathi, N.; Srinivasan, G.; Panda, P.; Roy, K. Enabling Deep Spiking Neural Networks with Hybrid Conversion and Spike Timing Dependent Backpropagation. arXiv 2020, arXiv:2005.01807. [Google Scholar] [CrossRef]
- Davies, M.; Srinivasa, N.; Lin, T.-H.; Chinya, G.; Cao, Y.; Choday, S.H.; Dimou, G.; Joshi, P.; Imam, N.; Jain, S.; et al. Loihi: A Neuromorphic Manycore Processor with On-Chip Learning. IEEE Micro 2018, 38, 82–99. [Google Scholar] [CrossRef]
- Kreiser, R.; Renner, A.; Leite, V.R.C.; Serhan, B.; Bartolozzi, C.; Glover, A.; Sandamirskaya, Y. An On-Chip Spiking Neural Network for Estimation of the Head Pose of the iCub Robot. Front. Neurosci. 2020, 14, 551. [Google Scholar] [CrossRef] [PubMed]
- Shukla, R.; Lipasti, M.; Van Essen, B.; Moody, A.; Maruyama, N. REMODEL: Rethinking Deep CNN Models to Detect and Count on a NeuroSynaptic System. Front. Neurosci. 2019, 13, 4. [Google Scholar] [CrossRef]
- Patiño-Saucedo, A.; Rostro-Gonzalez, H.; Serrano-Gotarredona, T.; Linares-Barranco, B. Event-Driven Implementation of Deep Spiking Convolutional Neural Networks for Supervised Classification Using the SpiNNaker Neuromorphic Platform. Neural Netw. 2020, 121, 319–328. [Google Scholar] [CrossRef] [PubMed]
- Ju, X.; Fang, B.; Yan, R.; Xu, X.; Tang, H. An FPGA Implementation of Deep Spiking Neural Networks for Low-Power and Fast Classification. Neural Comput. 2020, 32, 182–204. [Google Scholar] [CrossRef]
- Kakani, V.; Li, X.; Cui, X.; Kim, H.; Kim, B.-S.; Kim, H. Implementation of Field-Programmable Gate Array Platform for Object Classification Tasks Using Spike-Based Backpropagated Deep Convolutional Spiking Neural Networks. Micromachines 2023, 14, 1353. [Google Scholar] [CrossRef]
- Yang, L. Transporter: A 128×4 SPAD Imager with On-Chip Encoder for SNN Processing. In Proceedings of the International Image Sensor Workshop (IISW) 2025, Kobe, Japan, 2–5 June 2025. [Google Scholar]
- Roldan, J.B.; Maldonado, D.; Aguilera-Pedregosa, C.; Moreno, E.; Aguirre, F.; Romero-Zaliz, R.; García-Vico, A.M.; Shen, Y.; Lanza, M. Spiking Neural Networks Based on Two-Dimensional Materials. Npj 2D Mater. Appl. 2022, 6, 63. [Google Scholar] [CrossRef]
- Kwon, D.; Lim, S.; Bae, J.-H.; Lee, S.-T.; Kim, H.; Seo, Y.-T.; Oh, S.; Kim, J.; Yeom, K.; Park, B.-G.; et al. On-Chip Training Spiking Neural Networks Using Approximated Backpropagation With Analog Synaptic Devices. Front. Neurosci. 2020, 14, 423. [Google Scholar] [CrossRef] [PubMed]
- Vohra, S.K.; Thomas, S.A.; Sakare, M.; Das, D.M. Circuit Implementation of On-Chip Trainable Spiking Neural Network Using CMOS Based Memristive STDP Synapses and LIF Neurons. Integration 2024, 95, 102122. [Google Scholar] [CrossRef]
- Tian, M.; Lu, J.; Gao, H.; Wang, H.; Yu, J.; Shi, C. A Lightweight Spiking GAN Model for Memristor-Centric Silicon Circuit with On-Chip Reinforcement Adversarial Learning. In Proceedings of the 2022 IEEE International Symposium on Circuits and Systems (ISCAS), Austin, TX, USA, 27 May–1 June 2022; pp. 3388–3392. [Google Scholar]
- AMD. AMD FPGAs Overview. 2025. Available online: https://www.amd.com/en/products/adaptive-socs-and-fpgas/fpga.html (accessed on 22 October 2025).
- Gyongy, I.; Dutton, N.A.W.; Henderson, R.K. Direct Time-of-Flight Single-Photon Imaging. IEEE Trans. Electron Devices 2022, 69, 2794–2805. [Google Scholar] [CrossRef]
- Taneski, F.; Abbas, T.A.; Henderson, R.K. Laser Power Efficiency of Partial Histogram Direct Time-of-Flight LiDAR Sensors. J. Light. Technol. 2022, 40, 5884–5893. [Google Scholar] [CrossRef]
- Sheehan, M.P.; Tachella, J.; Davies, M.E. Spline Sketches: An Efficient Approach for Photon Counting Lidar. IEEE Trans. Comput. Imaging 2024, 10, 863–875. [Google Scholar] [CrossRef]
- Ingle, A.; Maier, D. Count-Free Single-Photon 3D Imaging With Race Logic. IEEE Trans. Pattern Anal. Mach. Intell. 2025, 47, 7292–7303. [Google Scholar] [CrossRef] [PubMed]
- Tontini, A.; Mazzucchi, S.; Passerone, R.; Broseghini, N.; Gasparini, L. Histogram-Less LiDAR Through SPAD Response Linearization. IEEE Sens. J. 2024, 24, 4656–4669. [Google Scholar] [CrossRef]
- Milanese, T.; Zhao, J.; Hearn, B.; Charbon, E. Histogram-Less Direct Time-of-Flight Imaging Based on a Machine Learning Processor on FPGA. In Proceedings of the IISW23 International Image Sensor Workshop 2023, Crieff, UK, 21–25 May 2023. [Google Scholar]
- Yamazaki, K.; Vo-Ho, V.-K.; Bulsara, D.; Le, N. Spiking Neural Networks and Their Applications: A Review. Brain Sci. 2022, 12, 863. [Google Scholar] [CrossRef]
- Patanwala, S.M.; Gyongy, I.; Mai, H.; Aßmann, A.; Dutton, N.A.W.; Rae, B.R.; Henderson, R.K. A High-Throughput Photon Processing Technique for Range Extension of SPAD-Based LiDAR Receivers. IEEE Open J. Solid-State Circuits Soc. 2022, 2, 12–25. [Google Scholar] [CrossRef]
- Rasmussen, D. NengoDL: Combining Deep Learning and Neuromorphic Modelling Methods. Neuroinformatics 2019, 17, 611–628. [Google Scholar] [CrossRef] [PubMed]
- Rajakumari, V.; Pradhan, K.P. BTBT Based LIF Junctionless FET Neuron with Plausible Mimicking Efficiency. IEEE Trans. Nanotechnol. 2023, 22, 172–177. [Google Scholar] [CrossRef]
- Mushtaq, U.; Akram, M.W.; Prasad, D.; Islam, A. An Energy and Area-Efficient Spike Frequency Adaptable LIF Neuron for Spiking Neural Networks. Comput. Electr. Eng. 2024, 119, 109562. [Google Scholar] [CrossRef]
- Kudithipudi, D.; Schuman, C.; Vineyard, C.M.; Pandit, T.; Merkel, C.; Kubendran, R.; Aimone, J.B.; Orchard, G.; Mayr, C.; Benosman, R.; et al. Neuromorphic Computing at Scale. Nature 2025, 637, 801–812. [Google Scholar] [CrossRef]
- Bouvier, M.; Valentian, A.; Mesquida, T.; Rummens, F.; Reyboz, M.; Vianello, E.; Beigne, E. Spiking Neural Networks Hardware Implementations and Challenges: A Survey. ACM J. Emerg. Technol. Comput. Syst. 2019, 15, 1–35. [Google Scholar] [CrossRef]
- Verhelst, M.; Benini, L.; Verma, N. How to Keep Pushing ML Accelerator Performance? Know Your Rooflines! IEEE J. Solid-State Circuits 2025, 60, 1888–1905. [Google Scholar] [CrossRef]
- Delic, D.; Afshar, S. Neuromorphic Computing for Compact LiDAR Systems. In More-than-Moore Devices and Integration for Semiconductors; Iacopi, F., Balestra, F., Eds.; Springer International Publishing: Cham, Switzerland, 2023; pp. 191–240. ISBN 978-3-031-21610-7. [Google Scholar]
- Malcolm, K.; Casco-Rodriguez, J. A Comprehensive Review of Spiking Neural Networks: Interpretation, Optimization, Efficiency, and Best Practices. arXiv 2023, arXiv:2303.10780. [Google Scholar]
- Sabbella, H.; Mukherjee, A.; Kandappu, T.; Dey, S.; Pal, A.; Misra, A.; Ma, D. The Promise of Spiking Neural Networks for Ubiquitous Computing: A Survey and New Perspectives. arXiv 2025, arXiv:2506.01737. [Google Scholar]
- Jungerman, S.; Leblang, M.; Gupta, S.; Sadekar, K. Visionsim. 2025. Available online: https://github.com/WISION-Lab/visionsim (accessed on 20 October 2025).
- MacLean, J.I.; Stewart, B.D.; Gyongy, I. Stress Testing of Spiking Neural Network-Based TDC-Less dToF. In Proceedings of the 2025 International Image Sensor Workshop, Kobe, Japan, 2–5 June 2025. [Google Scholar]

| Sensor Type | Output | Pre-Processing | Example Operations | Example Works |
|---|---|---|---|---|
| Event-based (DVS, Spike) | Asynchronous events | Temporal Structuring | Fixed-window event binning, voxel grid | [25,26,27,29,30,32,35] |
| | | Event Noise Filtering | Thresholding | |
| | | Signal Normalisation | Polarity | |
| ToF/LiDAR | Depth image | Temporal Structuring | Voxel grid | [37,38,39] |
| | | Event Noise Filtering | Histogram | |
| SPAD | Photon arrival timestamps | Temporal Structuring | Photon arrival time encoding | [42,44,45] |
| | Intensity | Event Noise Filtering | Spatio-temporal filters, histogram thresholding | |
| CIS | Continuous-intensity frames | Signal Normalisation | Crop, resize, scale, feature normalisation | [46] |
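
As a concrete illustration of the "fixed-window event binning, voxel grid" operation listed for event-based sensors in the table above, the sketch below accumulates a stream of (timestamp, x, y, polarity) events into a spike-count voxel grid. The function name, the two-polarity channel layout, and the randomly generated test events are assumptions chosen for the example, not the pipeline of any specific cited work.

```python
# Sketch of fixed-window event binning into a voxel grid (assumed layout:
# time bins x polarity channels x height x width).
import numpy as np


def events_to_voxel_grid(events, num_bins, height, width):
    """events: (N, 4) array of (t, x, y, polarity), polarity in {0, 1}.
    Returns a (num_bins, 2, height, width) spike-count tensor."""
    grid = np.zeros((num_bins, 2, height, width), dtype=np.float32)
    t = events[:, 0]
    # Normalise timestamps into [0, num_bins) across the fixed window.
    t_norm = (t - t.min()) / max(t.max() - t.min(), 1e-9) * num_bins
    bin_idx = np.clip(t_norm.astype(int), 0, num_bins - 1)
    x = events[:, 1].astype(int)
    y = events[:, 2].astype(int)
    p = events[:, 3].astype(int)
    # Accumulate one count per event at its (bin, polarity, y, x) position.
    np.add.at(grid, (bin_idx, p, y, x), 1.0)
    return grid


# Example: 1000 synthetic events over a 64x64 array, binned into 5 time steps.
rng = np.random.default_rng(0)
ev = np.stack([rng.uniform(0, 1e5, 1000),          # timestamps (arbitrary units)
               rng.integers(0, 64, 1000),          # x coordinates
               rng.integers(0, 64, 1000),          # y coordinates
               rng.integers(0, 2, 1000)], axis=1)  # polarity
voxels = events_to_voxel_grid(ev, num_bins=5, height=64, width=64)
print(voxels.shape)  # (5, 2, 64, 64)
```

The resulting tensor can then be fed to an SNN one time bin per simulation step, which is how fixed-window binning bridges asynchronous events and frame-based spiking layers.
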
| Platform | Architecture | Core Count/Scale | On-Chip Memory (MB) | Power Consumption (W) *1 | Technology Node (nm) | Key Features | Example Works |
|---|---|---|---|---|---|---|---|
| Loihi 1/2 (Intel) | Digital Neuromorphic | 128 Neuro-cores | 33/37 *2 | 1.5/1 | 14/4 | On-chip learning, graded spikes, programmable neurons, mixed-precision synapses, flexible memory allocation | [65,82] |
| TrueNorth (IBM) | Digital Neuromorphic | 4096 cores | 53 *3 | 0.3 | 28 | Binary spikes, fixed-weight synapses | [13] |
| SpiNNaker (Manchester University) | Digital Neuromorphic | Up to 1M cores | 128 | 1 | 130 | Many-core ARM-based system, asynchronous routing | [14] |
| FPGA | Reconfigurable Digital | Model dependent | 2–20 *4 | 0.7–4 | 16–28 *4 | Reconfigurable logic, hardware-level flexibility, quick prototyping, sensor co-design capability | [32,46,86,87] |
| ASIC | Analogue/Digital/Mixed Signal | Custom | 0.5 | 0.9 | 55/180 | Fully custom design, CiM achieves very low power and fast inference | [44,45] |
| Memristive | Analogue/Mixed Signal | Array-Based | In-memory crossbar | 0.003 | 65 | CiM, collocated memory and computation for ultra-low-energy processing | [41] |