Article

EPCNet: Implementing an ‘Artificial Fovea’ for More Efficient Monitoring Using the Sensor Fusion of an Event-Based and a Frame-Based Camera

by Orla Sealy Phelan 1,2, Dara Molloy 3, Roshan George 1,2, Edward Jones 1,2, Martin Glavin 1,2 and Brian Deegan 1,2,*

1 Department of Electrical and Electronic Engineering, University of Galway, University Road, H91 TK33 Galway, Ireland
2 Ryan Institute, University of Galway, University Road, H91 TK33 Galway, Ireland
3 Valeo Vision Systems, Tuam, H54 Y276 Galway, Ireland
* Author to whom correspondence should be addressed.
Sensors 2025, 25(15), 4540; https://doi.org/10.3390/s25154540
Submission received: 2 May 2025 / Revised: 7 July 2025 / Accepted: 10 July 2025 / Published: 22 July 2025
(This article belongs to the Section Sensing and Imaging)

Abstract

Efficient object detection is crucial for real-time monitoring applications such as autonomous driving and security systems. Modern RGB cameras can produce high-resolution images for accurate object detection. However, increased resolution results in increased network latency and power consumption. To limit this latency, Convolutional Neural Networks (CNNs) typically impose an input-resolution limit, so high-resolution images must be down-sampled before inference, causing significant information loss. Event-based cameras are neuromorphic vision sensors with high temporal resolution, low power consumption, and high dynamic range, making them preferable to regular RGB cameras in many situations. This project proposes the fusion of an event-based camera with an RGB camera to mitigate the trade-off between temporal resolution and accuracy, while minimising power consumption. The cameras are calibrated to create a multi-modal stereo vision system in which pixel coordinates can be projected between the event and RGB camera image planes. This calibration is used to project bounding boxes, detected by clustering of events, into the RGB image plane, thereby cropping each RGB frame instead of down-sampling it to meet the CNN's input-size requirements. Using the Common Objects in Context (COCO) dataset evaluator, the average precision (AP) for the bicycle class in RGB scenes improved from 21.08 to 57.38, and AP increased across all classes from 37.93 to 46.89. To reduce system latency, a novel object detection approach is proposed in which the event camera acts as a region proposal network and a classification algorithm is run on the proposed regions, achieving a 78% latency improvement over the baseline.
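The abstract outlines an event-guided "artificial fovea" pipeline: event clusters propose regions, the stereo calibration projects those regions into the RGB image plane, and the CNN is run on crops rather than a down-sampled full frame. The following is a minimal sketch of that idea, not the authors' implementation: the names (event_xy, H_event_to_rgb, detector) are illustrative placeholders, DBSCAN stands in for whichever event-clustering method the paper uses, and a single homography is a simplifying assumption for the calibrated projection between image planes.

```python
# Hedged sketch of an event-guided "artificial fovea" crop, assuming:
#  - event_xy: (N, 2) array of event pixel coordinates from the event camera
#  - H_event_to_rgb: 3x3 homography approximating the calibrated projection
#  - detector: any callable returning (class, score, (x0, y0, x1, y1)) per crop
import numpy as np
import cv2
from sklearn.cluster import DBSCAN

def event_region_proposals(event_xy, eps=8.0, min_samples=20):
    """Cluster event coordinates and return bounding boxes in the event plane."""
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit(event_xy).labels_
    boxes = []
    for label in set(labels) - {-1}:  # -1 marks DBSCAN noise points
        pts = event_xy[labels == label]
        boxes.append((*pts.min(axis=0), *pts.max(axis=0)))
    return boxes

def project_box(box, H_event_to_rgb):
    """Project a box into the RGB image plane via the (assumed) homography."""
    x0, y0, x1, y1 = box
    corners = np.float32([[x0, y0], [x1, y0], [x1, y1], [x0, y1]]).reshape(-1, 1, 2)
    proj = cv2.perspectiveTransform(corners, H_event_to_rgb).reshape(-1, 2)
    return proj[:, 0].min(), proj[:, 1].min(), proj[:, 0].max(), proj[:, 1].max()

def foveated_detect(rgb_frame, event_xy, H_event_to_rgb, detector, pad=16):
    """Run the detector on event-proposed crops instead of a down-sampled frame."""
    h, w = rgb_frame.shape[:2]
    detections = []
    for box in event_region_proposals(event_xy):
        x0, y0, x1, y1 = project_box(box, H_event_to_rgb)
        x0, y0 = max(int(x0) - pad, 0), max(int(y0) - pad, 0)
        x1, y1 = min(int(x1) + pad, w), min(int(y1) + pad, h)
        crop = rgb_frame[y0:y1, x0:x1]
        for cls, score, (bx0, by0, bx1, by1) in detector(crop):
            # Map crop-relative detections back to full-frame coordinates.
            detections.append((cls, score, (bx0 + x0, by0 + y0, bx1 + x0, by1 + y0)))
    return detections
```

In this sketch the event camera effectively plays the role of a region proposal network, as described in the abstract; the classifier only ever sees small, full-resolution crops, which is where the reported accuracy and latency gains would come from.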
Keywords: neuromorphic camera; object detection; multi-modal fusion; stereo camera
