Methodology for Automatically Detecting Pan–Tilt–Zoom CCTV Camera Drift in Advanced Traffic Management System Networks
Abstract
1. Introduction
2. Literature Review
2.1. Early Cameras in Traffic Management
2.2. Developments in Image Analysis
2.3. Preliminary Applications in Transportation
2.4. The Need for Camera Calibration
2.5. Camera Drift Detection
2.6. Approaches for Image Comparison
2.6.1. SSIM
2.6.2. SIFT
2.6.3. SURF
3. Study Objectives
- Define PTZ CCTV drift using a set of images and recorded PTZ values from one camera;
- Apply the mean SSIM (mSSIM) algorithm and determine whether it is feasible to detect drift with the collected samples (an illustrative mSSIM comparison is sketched after this list);
- Scale the methodology to 49 cameras along a 53-mile route on Interstate 465, using one sample for each day of the week (seven per week), and demonstrate its systematic application;
- Reduce the number of samples from seven per week to two per week (Sunday and Saturday) to show that the methodology is still valid at lower sample sizes (important when considering a real-time deployment);
- Establish criteria to classify a potential drift alert;
- Establish analysis criteria to detect and eliminate false positives;
- Analyze generated alerts to establish the drift detection rate recommendations.
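The objectives above hinge on comparing an archived reference frame against later frames with the mean structural similarity index (mSSIM). The paper does not publish its implementation, so the following is only a minimal sketch using OpenCV and scikit-image's `structural_similarity` (which returns the mean SSIM over its sliding windows); the file names are hypothetical.

```python
# Minimal sketch: compare two CCTV frames with mean SSIM (mSSIM).
# Assumes OpenCV and scikit-image are available; file names are illustrative only.
import cv2
from skimage.metrics import structural_similarity as ssim

def mean_ssim(original_path: str, comparison_path: str) -> float:
    """Return the mean SSIM between two frames captured at the same PTZ preset."""
    original = cv2.imread(original_path, cv2.IMREAD_GRAYSCALE)
    comparison = cv2.imread(comparison_path, cv2.IMREAD_GRAYSCALE)
    if original is None or comparison is None:
        raise FileNotFoundError("Could not read one of the input images.")
    # Resize the comparison frame if the resolutions differ (defensive step).
    if original.shape != comparison.shape:
        comparison = cv2.resize(comparison, (original.shape[1], original.shape[0]))
    # structural_similarity returns the mean SSIM over its sliding windows.
    return ssim(original, comparison, data_range=255)

if __name__ == "__main__":
    score = mean_ssim("camera12_original.jpg", "camera12_2023-08-15.jpg")
    print(f"mSSIM = {score:.3f}")  # values near 1.0 indicate little scene change
```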
4. Motivation
5. Methods
1. Capture the “original image”.
2. Capture the “comparison image”.
3. Compare the images using the mSSIM algorithm.
4. Store the mSSIM value for the two compared images.
5. Compute the 2-day average mSSIM.
6. Check the 2-day average mSSIM against the alerting threshold:
   a. If greater than or equal to the alerting threshold, do not send an alert;
   b. If less than the alerting threshold, send an alert.
7. Repeat steps 2–6 at set intervals during the operation of the camera (a minimal sketch of this loop follows the list).
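A minimal sketch of steps 2–6, assuming the `mean_ssim` helper from the earlier sketch. The 0.3 threshold follows the alerting criterion used later for the multi-camera analysis; the capture function is a placeholder for whatever mechanism retrieves a frame from the ATMS camera feed.

```python
# Minimal sketch of one pass of the alerting loop (steps 2-6).
from collections import deque

ALERT_THRESHOLD = 0.3  # follows the 0.3-or-less alerting criterion described in the paper

def run_drift_check(capture_comparison, original_path: str, history: deque) -> None:
    """One pass of the loop: capture, compare, average over two samples, check threshold."""
    comparison_path = capture_comparison()               # step 2 (placeholder capture function)
    score = mean_ssim(original_path, comparison_path)    # step 3, helper from the earlier sketch
    history.append(score)                                # step 4 (history created with maxlen=2)
    two_day_avg = sum(history) / len(history)            # step 5: 2-day average mSSIM
    if two_day_avg < ALERT_THRESHOLD:                    # step 6b: below threshold -> alert
        print(f"DRIFT ALERT: 2-day average mSSIM = {two_day_avg:.3f}")
    # step 6a: at or above the threshold, no alert is sent

# Example usage: create history = deque(maxlen=2) once per camera,
# then call run_drift_check(...) at each set interval (step 7).
```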
5.1. Sample Selection Criteria
- Images captured at 16:00 UTC ± 10 min, to best ensure consistent overhead lighting;
- Images that share the PTZ values of the “original image,” defined in the next section (an illustrative filtering sketch follows this list).
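As an illustration of this filter, suppose the image metadata has been collected into a pandas DataFrame; the column names `timestamp`, `pan`, `tilt`, and `zoom` below are assumptions, not taken from the paper.

```python
# Illustrative comparison-sample filter (column names are assumed, not from the paper).
import pandas as pd

def select_comparison_samples(df: pd.DataFrame, original_ptz: tuple) -> pd.DataFrame:
    """Keep frames near 16:00 UTC (±10 min) whose PTZ preset matches the original image."""
    ts = pd.to_datetime(df["timestamp"], utc=True)
    in_window = (ts.dt.hour * 60 + ts.dt.minute - 16 * 60).abs() <= 10  # 16:00 UTC ± 10 min
    pan, tilt, zoom = original_ptz
    same_ptz = (df["pan"] == pan) & (df["tilt"] == tilt) & (df["zoom"] == zoom)
    return df[in_window & same_ptz]
```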
5.2. Original Image Selection Criteria
- Images whose times occur at 16:00 UTC ± 1 h;
- Images that have the most common observed combination of PTZ values;
- Images that occur within the first week of August 2023 (a selection sketch follows this list).
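Under the same assumed DataFrame layout as the previous sketch, the original-image criteria amount to restricting the candidates to the first week of August 2023 within 16:00 UTC ± 1 h and keeping the most frequently observed PTZ combination; this is only an illustrative reading of the criteria above.

```python
# Illustrative "original image" selection (column names are assumed, not from the paper).
import pandas as pd

def select_original_image(df: pd.DataFrame) -> pd.Series:
    """Pick a frame with the most common PTZ preset near 16:00 UTC, first week of Aug 2023."""
    ts = pd.to_datetime(df["timestamp"], utc=True)
    first_week = (ts >= pd.Timestamp("2023-08-01", tz="UTC")) & (
        ts < pd.Timestamp("2023-08-08", tz="UTC")
    )
    in_window = (ts.dt.hour * 60 + ts.dt.minute - 16 * 60).abs() <= 60  # 16:00 UTC ± 1 h
    candidates = df[first_week & in_window]
    # Most frequently observed (pan, tilt, zoom) combination among the candidates.
    pan, tilt, zoom = candidates.groupby(["pan", "tilt", "zoom"]).size().idxmax()
    modal = candidates[
        (candidates["pan"] == pan) & (candidates["tilt"] == tilt) & (candidates["zoom"] == zoom)
    ]
    return modal.iloc[0]  # first frame captured with the modal PTZ preset
```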
5.3. Single Camera
5.4. Multiple Cameras
- Extract all aggregated bins with an average mean SSIM value of 0.3 or less;
- Group extracted bins by camera number;
- Flag the first bin of any run of at least two consecutive alerted bins as a new drift alert (a grouping sketch follows this list).
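A sketch of this alert grouping, assuming the aggregated bins live in a DataFrame with hypothetical columns `camera`, `bin_start`, and `avg_mssim`; a run of two or more alerted bins produces one new drift alert, reported at the bin where the run begins.

```python
# Illustrative multi-camera alert grouping (column names are assumed, not from the paper).
import pandas as pd

def new_drift_alerts(bins: pd.DataFrame) -> pd.DataFrame:
    """Return the first bin of each run of >= 2 consecutive alerted bins, per camera."""
    alerts = []
    for camera, group in bins.sort_values("bin_start").groupby("camera"):
        alerted = (group["avg_mssim"] <= 0.3).to_numpy()  # bins at or below the 0.3 criterion
        run = 0
        for i, flag in enumerate(alerted):
            run = run + 1 if flag else 0
            # The run reaches length 2 exactly once: report the bin where it started.
            if run == 2:
                alerts.append(group.iloc[i - 1])
    return pd.DataFrame(alerts)
```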
6. Results
7. Discussion
7.1. Limitations
7.2. Future Research
8. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Malackowski, H.A. Techniques For Reducing Traffic Management Center Camera Positioning Latency For Accelerated Incident Response. Master’s Thesis, Purdue University Graduate School, West Lafayette, IN, USA, 2024. [Google Scholar]
- Mathew, J.K.; Malackowski, H.A.; Koshan, Y.; Gartner, C.; Desai, J.; Li, H.; Cox, E.D.; Habib, A.; Bullock, D.M. Joint Transportation Research Program. Development of Latitude/Longitude (and Route/Milepost) Positioning Traffic Management Cameras; Purdue University: West Lafayette, IN, USA, 2023. [Google Scholar]
- Castleman, M. A History of Traffic Management Technology. 2019. Available online: https://streets.mn/2019/10/16/a-history-of-traffic-management-technology/ (accessed on 3 October 2024).
- Stehr, R.A. Minnesota Department of Transportation Experience in the Application of Advanced Traffic Management Systems; ASCE: Minneapolis, MN, USA, 1991; pp. 458–462. [Google Scholar]
- Anonymous. City Keeps an Eye on Traffic with Video Technology. Am. City Cty. 2003, 118, 22. [Google Scholar]
- Anonymous. Optelecom-NKF Awarded Projects by Imtech. Telecomworldwire 2008. Available online: https://www.proquest.com/trade-journals/optelecom-nkf-awarded-projects-imtech/docview/190999863/se-2?accountid=13360 (accessed on 3 October 2024).
- McAuliffe, T.P. Pan, Tilt, Zoom, Better than Ever. Access Control Secur. Syst. 2002, 45, 50. [Google Scholar]
- Careless, J. Government Video; Future Publishing Ltd.: New York, NY, USA; London, UK, 2011; pp. 20–23. [Google Scholar]
- Anonymous. Caltrans “Converts” for the Future. Commun. News 2004, 41, 26–30. [Google Scholar]
- Petkovic, D.; Wilder, J.; Ejiri, M.; Haralick, R.M.; Jain, R.; Ruetz, P.A.; Sklansky, J.; Swonger, C.W. Machine Vision in the 1990s: Applications and How to Get There. Mach. Vis. Appl. 1991, 4, 113–126. [Google Scholar] [CrossRef]
- Rosin, P.; Ellis, T. Image Difference Threshold Strategies and Shadow Detection. In Proceedings of the British Machine Vision Conference, Birmingham, UK, 11–14 September 1995; Volume 1. [Google Scholar] [CrossRef]
- Lowe, D.G. Distinctive Image Features from Scale-Invariant Keypoints. Int. J. Comput. Vis. 2004, 60, 91–110. [Google Scholar] [CrossRef]
- LeCun, Y.; Bengio, Y.; Laboratories, T.B. Convolutional Networks for Images, Speech, and Time-Series. In The Handbook of Brain Theory and Neural Networks; ACM: New York, NY, USA, 1998. [Google Scholar]
- Bruzzone, L.; Prieto, D.F. Automatic Analysis of the Difference Image for Unsupervised Change Detection. IEEE Trans. Geosci. Remote Sens. 2000, 38, 1171–1182. [Google Scholar] [CrossRef]
- Lowe, D.G. Object Recognition from Local Scale-Invariant Features. In Proceedings of the Seventh IEEE International Conference on Computer Vision, Kerkyra, Greece, 20–27 September 1999; Volume 2, pp. 1150–1157. [Google Scholar]
- Kodratoff, Y.; Moscatelli, S. Machine Learning for Object Recognition and Scene Analysis. Int. J. Patt. Recogn. Artif. Intell. 1994, 08, 259–304. [Google Scholar] [CrossRef]
- Wang, Z.; Bovik, A.C.; Sheikh, H.R.; Simoncelli, E.P. Image Quality Assessment: From Error Visibility to Structural Similarity. IEEE Trans. Image Process. 2004, 13, 600–612. [Google Scholar] [CrossRef]
- Nilsson, J.; Akenine-Möller, T. Understanding SSIM. arXiv 2020, arXiv:2006.13846. [Google Scholar]
- Sampat, M.P.; Wang, Z.; Gupta, S.; Bovik, A.C.; Markey, M.K. Complex Wavelet Structural Similarity: A New Image Similarity Index. IEEE Trans. Image Process. 2009, 18, 2385–2401. [Google Scholar] [CrossRef]
- Mak, C.L.; Fan, H.S.L. Heavy Flow-Based Incident Detection Algorithm Using Information From Two Adjacent Detector Stations. J. Intell. Transp. Syst. 2006, 10, 23–31. [Google Scholar] [CrossRef]
- Schoepflin, T.N. Algorithms for Estimating Mean Vehicle Speed Using Uncalibrated Traffic Management Cameras. Ph.D. Thesis, University of Washington, Seattle, WA, USA, 2003. [Google Scholar]
- Thornton, J.; Baran-Gale, J.; Yahr, A. An Assessment of the Video Analytics Technology Gap for Transportation Facilities. In Proceedings of the 2009 IEEE Conference on Technologies for Homeland Security, Waltham, MA, USA, 11–12 May 2009; pp. 135–142. [Google Scholar]
- Zheng, X.-Q.; Wu, Z.; Xu, S.-X.; Guo, M.-M.; Lin, Z.-X.; Zhang, Y.-Y. Video-Based Measurement and Data Analysis of Traffic Flow on Urban Expressways. Acta Mech. Sin. 2011, 27, 346–353. [Google Scholar] [CrossRef]
- Galego, R.; Bernardino, A.; Gaspar, J. Auto-Calibration of Pan-Tilt Cameras Including Radial Distortion and Zoom. In Proceedings of the Advances in Visual Computing; Bebis, G., Boyle, R., Parvin, B., Koracin, D., Fowlkes, C., Wang, S., Choi, M.-H., Mantler, S., Schulze, J., Acevedo, D., et al., Eds.; Springer: Berlin/Heidelberg, Germany, 2012; pp. 169–178. [Google Scholar]
- Dadashi, N.; Stedmon, A.W.; Pridmore, T.P. Semi-Automated CCTV Surveillance: The Effects of System Confidence, System Accuracy and Task Complexity on Operator Vigilance, Reliance and Workload. Appl. Ergon. 2013, 44, 730–738. [Google Scholar] [CrossRef]
- Mathew, J.K.; Malackowski, H.A.; Gartner, C.M.; Desai, J.; Cox, E.D.; Habib, A.F.; Bullock, D.M. Methodology for Automatically Setting Camera View to Mile Marker for Traffic Incident Management. J. Transp. Technol. 2023, 13, 708–730. [Google Scholar] [CrossRef]
- Lisanti, G.; Masi, I.; Pernici, F.; Del Bimbo, A. Continuous Localization and Mapping of a Pan–Tilt–Zoom Camera for Wide Area Tracking. Mach. Vis. Appl. 2016, 27, 1071–1085. [Google Scholar] [CrossRef]
- Bay, H.; Tuytelaars, T.; Van Gool, L. SURF: Speeded Up Robust Features. In Computer Vision—ECCV 2006; Springer: Berlin/Heidelberg, Germany, 2006; pp. 404–417. [Google Scholar]
- Eslami, M.A.; Rzasa, J.R.; Milner, S.D.; Davis, C.C. Dual Dynamic PTZ Tracking Using Cooperating Cameras. Intell. Control Autom. 2015, 6, 45–54. [Google Scholar] [CrossRef]
- Mao, K.; Xu, Y.; Wang, R. A General Calibration Method for Dual-PTZ Cameras Based on Feedback Parameters. Appl. Sci. 2022, 12, 9148. [Google Scholar] [CrossRef]
- Bastanlar, Y. A Simplified Two-View Geometry Based External Calibration Method for Omnidirectional and PTZ Camera Pairs. Pattern Recognit. Lett. 2016, 71, 1–7. [Google Scholar] [CrossRef]
- Liao, H.-C.; Pan, M.-H.; Hwang, H.-W.; Chang, M.-C.; Chen, P.-C. An Automatic Calibration Method Based On Feature Point Matching For The Cooperation Of Wide-Angle And Pan-Tilt-Zoom Cameras. Inf. Technol. Control 2011, 40, 41–47. [Google Scholar] [CrossRef]
- Hillemann, M.; Jutzi, B. UCalMiCeL—Unified Intrinsic And Extrinsic Calibration Of A Multi-Camera-System And A Laserscanner. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, IV-2-W3, 17–24. [Google Scholar] [CrossRef]
- Wang, Z.; Simoncelli, E.P.; Bovik, A.C. Multiscale Structural Similarity for Image Quality Assessment. In Proceedings of the Thirty-Seventh Asilomar Conference on Signals, Systems & Computers, Pacific Grove, CA, USA, 9–12 November 2003; pp. 1398–1402. [Google Scholar]
- Zeng, K.; Wang, Z. 3D-SSIM for Video Quality Assessment. In Proceedings of the 2012 19th IEEE International Conference on Image Processing, Orlando, FL, USA, 30 September–3 October 2012; pp. 621–624. [Google Scholar]
- Chen, S.; Zhang, Y.; Li, Y.; Chen, Z.; Wang, Z. Spherical Structural Similarity Index for Objective Omnidirectional Video Quality Assessment. In Proceedings of the 2018 IEEE International Conference on Multimedia and Expo (ICME), San Diego, CA, USA, 23–27 July 2018; pp. 1–6. [Google Scholar]
- Castillo-Carrión, S.; Guerrero-Ginel, J.-E. SIFT Optimization and Automation for Matching Images from Multiple Temporal Sources. Int. J. Appl. Earth Obs. Geoinf. 2017, 57, 113–122. [Google Scholar] [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).