A Secure Traffic Police Remote Sensing Approach via a Deep Learning-Based Low-Altitude Vehicle Speed Detector through UAVs in Smart Cities: Algorithm, Implementation and Evaluation
Abstract
1. Introduction
- A more reliable and secure drone-based solution for verifying vehicle speed is offered as an alternative to the approaches in current studies.
- An effective alternative is provided to fixed traffic cameras and roadside checks, addressing the many reported cases of accidents, damage, and injuries involving police officers.
- A quick speed-check system is introduced that can be used in places with limited access and in conjunction with existing speed detection systems.
- To implement our approach, a low-altitude drone equipped with the MobileNet-SSD algorithm was used to measure vehicle speed, with network connectivity to the traffic police.
- Several scenarios under various conditions, such as road intersections and settings with abrupt speed changes, were considered to provide more accurate findings.
- To increase the solution's speed-detection accuracy, the relevant cases were tested and calibrated with the drone. The system runs on a Raspberry Pi 4B, whose processing speed and memory capacity and bandwidth allow the deep learning-based computer vision module to run smoothly.
- In addition to the contributions above, a movable camera was mounted on top of the drone, making it possible to assess our solution at various altitudes and tilt angles along the X and Y axes.
- Finally, a graphical user interface (GUI) was designed and implemented that records the environment's status, identifies problematic conditions based on speed parameters, and sends alarms to the appropriate authorities. The GUI also controls the camera's status, including its movement.
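The MobileNet-SSD detection stage named in the contributions above can be sketched as follows. This is an illustrative post-processing helper, not the authors' exact code; the helper name and the VOC label subset are assumptions for the example, and the raw network output layout is the one produced by OpenCV's `cv2.dnn` module.

```python
# Post-processing for a MobileNet-SSD forward pass, as used in the
# vehicle-detection stage. Illustrative sketch only: the helper name and
# the vehicle label subset are assumptions, not the paper's exact code.
import numpy as np

# PASCAL VOC classes that count as vehicles for speed checking
VEHICLE_CLASSES = {"car", "bus", "motorbike"}

def filter_vehicle_detections(detections, labels, conf_threshold=0.4):
    """Keep detections whose class is a vehicle and whose confidence
    passes the threshold (the parameter table below uses 0.4).

    `detections` has the (1, 1, N, 7) layout returned by OpenCV's
    cv2.dnn forward(): each row is
    [batch_id, class_id, confidence, x1, y1, x2, y2],
    with box corners normalised to [0, 1].
    """
    kept = []
    for det in np.asarray(detections).reshape(-1, 7):
        class_id, confidence = int(det[1]), float(det[2])
        if confidence >= conf_threshold and labels[class_id] in VEHICLE_CLASSES:
            kept.append((labels[class_id], confidence, tuple(det[3:7])))
    return kept
```

The surviving boxes would then be handed to the centroid tracker described in Section 2.3.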
2. Materials and Methods
2.1. System Materials
2.2. Deep Learning Model Architecture
2.3. Vehicle Detection and Tracking
2.3.1. Vehicle Detection Approach
2.3.2. Vehicle Speed Estimation Using an Improved VASCAR Approach
2.4. System Calibration
2.4.1. Indoor System Calibration
- Step 1: Calculate the horizontal, vertical, and diagonal FOV at a fixed distance from the object and at different heights.
- Step 2: Extract the camera's FOV in different tilted positions on the x-axis to check for any orientation effects on the Pi camera's FOV.
- Step 3: Extract the camera's FOV in different tilted directions of the y-axis while the camera is 20 cm from the object. The same trapezoid principle as in the previous test applies, but in this case the trapezoidal frame appears along the vertical side.
- Step 4: Finally, study the camera's performance with a random test at an arbitrary height and angle to calculate the FOV.
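The FOV values reported in the indoor calibration tables are consistent with the standard pinhole relation FOV = 2·arctan(L / 2D), where L is the length that exactly fills the frame and D is the distance from the lens; for example, a 20 cm horizontal length at 20 cm gives the ~53.13° H_FOV seen below. A minimal helper (the function name is ours, for illustration):

```python
import math

def field_of_view_deg(visible_length_cm, distance_cm):
    """Angle subtended by a target of known length that exactly fills
    the frame at a known distance from the lens (pinhole model)."""
    return math.degrees(2 * math.atan(visible_length_cm / (2 * distance_cm)))

# Horizontal FOV for a 20 cm length filling the frame at 20 cm from the lens
print(round(field_of_view_deg(20, 20), 4))  # 53.1301
```

The same formula reproduces the vertical and diagonal FOV columns from the measured VL and DI_L values.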
2.4.2. Outdoor System Calibration
3. Experimentation and Results
3.1. Indoor System Calibration
3.2. Outdoor System Calibration
3.3. System Software Parameters Calibration
3.4. Vehicle Speed Estimation Optimization
3.5. Real Vehicle Test
3.6. Vehicle Detection and Tracking
4. Discussion
5. Conclusions and Future Work
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Nomenclature
Symbol/Term | Definition
---|---
AVG H_FOV | Average horizontal field of view
Band values |
Centre of points A |
Centre of points B |
Distance between x, y |
Distance in meters between A & B |
d/t | Distance over time
DL | Distance from lens
DI_L | Diagonal length
D_FOV | Diagonal FOV
Field of view |
HL | Horizontal length
H_FOV | Horizontal FOV
km/h | Kilometers per hour
L1 | Length of base 1 of the trapezoidal image area
L2 | Length of base 2 of the trapezoidal image area
Median of the trapezoidal image area |
Normalized band |
OUT | Out object
PPM | Pixels per meter
Pixel spacing between a & b |
UAV | Unmanned aerial vehicle
VL1 | Vertical length of base 1
VL2 | Vertical length of base 2
VMT | Vertical median of trapezoidal frame
V_FOV | Vertical FOV
The mean of each red, green, and blue band |
Timestamp between a and b |
Scaling factor for normalization |
H (m) | HL (cm) | H_FOV (Degree) | VL (cm) | V_FOV (Degree) | DI_L (cm) | D_FOV (Degree) |
---|---|---|---|---|---|---|
0.25 | 19.5 | 51.9784 | 15.5 | 42.3626 | 23 | 59.7978 |
0.5 | 19.6 | 52.2097 | 15.4 | 42.1134 | 23.2 | 60.2274 |
0.75 | 20 | 53.1301 | 14.9 | 40.8607 | 23.1 | 60.0128 |
1 | 20 | 53.1301 | 15 | 41.1120 | 22.8 | 59.3662 |
1.25 | 20.3 | 53.8155 | 15.2 | 41.6135 | 23.5 | 60.8684 |
1.5 | 19.8 | 52.6708 | 15.5 | 42.3626 | 23 | 59.7978 |
1.75 | 19.6 | 52.2097 | 15.3 | 41.8637 | 23 | 59.7978 |
2 | 20.3 | 53.8155 | 15.2 | 41.6135 | 23 | 59.7978 |
2.25 | 20.3 | 53.8155 | 15.2 | 41.6135 | 22.9 | 59.5822 |
2.5 | 20.1 | 53.3590 | 15.1 | 41.3630 | 23 | 59.7978 |
2.75 | 19.5 | 51.9784 | 15.1 | 41.3630 | 23 | 59.7978 |
3 | 20 | 53.1301 | 15.1 | 41.3630 | 23.1 | 60.0128 |
3.25 | 20.1 | 53.3590 | 15 | 41.1120 | 23 | 59.7978 |
3.5 | 19.9 | 52.9006 | 15 | 41.1120 | 23 | 59.7978 |
AOV_X (Degrees) | DL (cm) | HL (cm) | H_FOV (Degree) | VL1 (cm) | VL2 (cm) | VMT (cm) | V_FOV (Degree) | DI_L (cm) | D_FOV (Degree)
---|---|---|---|---|---|---|---|---|---|
90 | Obj. out of scope | Null | 0 | Null | Null | Null | 0 | Null | 0
75 | Obj. out of scope | Null | 0 | Null | Null | Null | 0 | Null | 0
60 | Obj. out of scope | Null | 0 | Null | Null | Null | 0 | Null | 0
45 | 22.5 | 23 | 54.1441 | 15 | 18 | 16.5 | 40.272 | 35 | 75.7499 |
30 | 21.5 | 21 | 52.0591 | 15 | 17 | 16 | 40.819 | 32 | 73.3122 |
15 | 21 | 21 | 53.1301 | 15 | 15 | 15 | 39.307 | 30 | 71.0753 |
0 | 20 | 20 | 53.1301 | 15 | 15 | 15 | 41.112 | 23 | 59.7978 |
−15 | 21 | 21 | 53.1301 | 15 | 16 | 15.5 | 40.512 | 30 | 71.0753 |
−30 | 21.5 | 21 | 52.0591 | 15 | 17 | 16 | 40.819 | 32 | 73.3122 |
−45 | 22.5 | 21 | 50.0337 | 15 | 18 | 16.5 | 40.272 | 35 | 75.7499 |
−60 | Obj. out of scope | Null | 0 | Null | Null | Null | 0 | Null | 0
−75 | Obj. out of scope | Null | 0 | Null | Null | Null | 0 | Null | 0
−90 | Obj. out of scope | Null | 0 | Null | Null | Null | 0 | Null | 0
AOV_Y (Degrees) | DL (cm) | VL2 (cm) | VMT (cm) | H_FOV (Degree) | VL1 (cm) | V_FOV (Degrees) | DI_L (cm) | D_FOV (Deg) |
---|---|---|---|---|---|---|---|---|
90 | Obj. out of scope | Null | Null | 0 | Null | 0 | Null | 0
75 | Obj. out of scope | Null | Null | 0 | Null | 0 | Null | 0
60 | 40 | 42 | 31 | 42.3626 | 16.1 | 29.113 | 39 | 64.342 |
45 | 34 | 36.5 | 28.25 | 45.1200 | 15.8 | 31.246 | 35 | 63.553 |
30 | 27 | 32 | 26 | 51.4199 | 15.5 | 33.196 | 32 | 63.215 |
15 | 24 | 27 | 23.5 | 52.1711 | 15.2 | 35.842 | 30 | 65.100 |
0 | 20 | 20 | 20 | 53.1301 | 15 | 41.112 | 23 | 59.797 |
−15 | 24 | 26.5 | 23.25 | 51.6887 | 15 | 35.757 | 30 | 65.657 |
−30 | 27 | 32 | 26 | 51.4199 | 15.4 | 32.993 | 32 | 63.215 |
−45 | 34 | 36 | 28 | 44.7602 | 15.6 | 31.132 | 35 | 64.010 |
−60 | 40 | 41.5 | 30.75 | 42.0510 | 15.9 | 28.991 | 39 | 64.761 |
−75 | Obj. out of scope | Null | Null | 0 | Null | 0 | Null | 0
−90 | Obj. out of scope | Null | Null | 0 | Null | 0 | Null | 0
Distance From Road (cm) | FOV (cm) | FOV (Degree) | AVG H_FOV (Degree) | FOV (Degree) |
---|---|---|---|---|
300 | 335 | 48.2 | 51.12 | 53.1 |
350 | 385 | 48.8 | ||
400 | 430 | 48.8 | ||
450 | 465 | 51.6 | ||
500 | 515 | 51.7 | ||
550 | 570 | 51.5 | ||
600 | 570 | 53.5 | ||
650 | 650 | 53.1 | ||
700 | 770 | 51.53 | ||
1600 | 1500 | 51.53 |
Component | Description | Value (Unit) |
---|---|---|
max_disappear | Maximum consecutive frames for an object to be allowed to pass before deregistering it | 15 frames |
max_distance | Maximum distance between centroid to associate an object | 1.75 m |
track_object | Number of frames to track for object | 4 frames |
confidence | Minimum confidence or probability of detection | 0.4 |
frame_width | Frame width in pixels | 480 pixels |
speed_estimation_zone | Speed estimation columns | 4 (A, B, C, D) |
distance | Distance from road to camera | 16 m |
speed_limit | Speed limit | To Be Set |
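Given the parameters above, the improved VASCAR estimate can be sketched as follows: a timestamp is recorded as the vehicle centroid crosses each of the four speed-estimation columns (A, B, C, D), each column pair yields a distance-over-time estimate, and the pair estimates are averaged. The function and its argument layout are illustrative assumptions, not the authors' exact implementation.

```python
# Improved-VASCAR-style speed estimate: average distance/time over every
# pair of estimation columns. Illustrative sketch; the function name and
# argument layout are assumptions for the example.
from itertools import combinations

def vascar_speed_kmh(timestamps, positions_px, ppm):
    """Mean speed in km/h over all pairs of estimation columns.

    timestamps   -- dict: column name -> time (s) the centroid crossed it
    positions_px -- dict: column name -> x position of that column (pixels)
    ppm          -- pixels per metre (PPM) at the road plane
    """
    estimates = []
    for a, b in combinations(sorted(timestamps), 2):
        dt = abs(timestamps[b] - timestamps[a])
        if dt == 0:
            continue  # same frame; no usable estimate for this pair
        metres = abs(positions_px[b] - positions_px[a]) / ppm
        estimates.append(metres / dt * 3.6)  # m/s -> km/h
    return sum(estimates) / len(estimates)
```

Averaging over all six pairs, rather than using a single pair as in classic VASCAR, damps out timing jitter from dropped or delayed frames.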
Speed (km/h) | Direction 1 (y1) | Difference | Direction 2 (y2) | Difference |
---|---|---|---|---|
10 | 7.76 | 2.24 | 6.96 | 3.04 |
20 | 15.62 | 4.38 | 15.29 | 4.71 |
30 | 23.18 | 6.82 | 24.91 | 5.09 |
40 | 33.729 | 6.271 | 54.26 | −14.26 |
50 | 43.1 | 6.9 | 32.83 | 17.17 |
60 | 73.08 | −13.08 | 73.32 | −13.32 |
70 | 83.22 | −13.22 | 80.5 | −10.5 |
UAV Height (m) | Number of Vehicles on Street | Number of Detected Vehicles | Detection Error (%)
---|---|---|---|
0.7 | 32 | 42 | 31.25 |
1.0 | 19 | 24 | 26.32 |
1.25 | 25 | 31 | 24.00 |
1.50 | 32 | 38 | 18.75 |
1.75 | 34 | 40 | 17.65 |
2.50 | 34 | 38 | 11.76 |
3.0 | 34 | 38 | 11.76 |
X-Axis Change (Degree) | Number of Vehicles on Street | Number of Detected Vehicles | Detection Error (%)
---|---|---|---|
−15 | 39 | 36 | 7.69 |
+15 | 26 | 30 | 15.38 |
Y-Axis Change (Degree) | Number of Vehicles on Street | Number of Detected Vehicles | Detection Error (%)
---|---|---|---|
−30 | 22 | 28 | 27.27 |
−15 | 18 | 21 | 16.67 |
+15 | 21 | 25 | 19.05 |
+30 | The road is out of range |
Random Parameter | Number of Vehicles on Street | Number of Detected Vehicles | Detection Error (%)
---|---|---|---|
UAV height 1.5 m | 49 | 58 | 18.37 |
UAV height 1.0 m | 34 | 44 | 29.41 |
UAV height 2.0 m | 42 | 46 | 9.52 |
x-axis −15 degree | 36 | 39 | 8.33 |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Moshayedi, A.J.; Roy, A.S.; Taravet, A.; Liao, L.; Wu, J.; Gheisari, M. A Secure Traffic Police Remote Sensing Approach via a Deep Learning-Based Low-Altitude Vehicle Speed Detector through UAVs in Smart Cites: Algorithm, Implementation and Evaluation. Future Transp. 2023, 3, 189-209. https://doi.org/10.3390/futuretransp3010012