A Study on Recent Developments and Issues with Obstacle Detection Systems for Automated Vehicles
Abstract
1. Introduction
- Radar: Performs mapping at medium to long range. Outperforms cameras and LIDAR in the worst weather conditions, but lacks the fine resolution required for object identification.
- LIDAR: Provides 360-degree high-resolution mapping, from short to long range. Limited by harsh environments with low reflective targets.
- Ultrasound: Low cost and good performance in short-range measurement. Suitable for parking assistance due to its fast response over a relatively short range.
- Camera: Provides a complete picture of the environment in a variety of situations and can accurately read road signals and colours, but is limited by the visibility conditions within the driving environment.
- Infrared (IR): Offers the best night-vision support among these sensors. LIDAR can also be used at night because of its ability to work in low-visibility environments.
- Combining different technologies gives more accurate detection, surveillance and recognition of the driving environment and its surroundings, including vehicles, pedestrians, lanes and other objects; however, further effort is required to develop more precise sensor-fusion systems.
- Data collected from different technologies is not homogeneous; as a result, a sophisticated data fusion mechanism is needed for accurate data analytics.
- In extreme weather conditions (rain, snow, fog) and in some specific urban situations, current technological developments, although quite advanced, are not sophisticated enough to guarantee 100% precision and accuracy of obstacle detection; hence, further work is needed.
2. The Purpose of Obstacle Detection System
3. Technology and Existing Systems
3.1. Background
3.2. RADAR/LIDAR
- i. Filtering: A pre-processing step that separates non-ground objects from ground information, reducing the data size and shortening subsequent calculation time.
- ii. Data structuring: The geometry of detected objects is encoded with X, Y, Z coordinates and then fitted into grids [14].
- iii. Segmentation: The Point Cloud Library (PCL), an open-source C++ library [19], creates clusters based on a Euclidean clustering algorithm.
- iv. Cluster detection: With the aid of statistical measures and histograms, the target cluster is separated from the other clusters using visualization software (by its nature, this process can be quite complicated and time-consuming).
- v. Detection with software: Finally, Terrasolid applies a progressive densification algorithm to classify and label the target category and other objects; for example, in Anandakumar’s research, detecting buildings against a background that includes vegetation.
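Steps i–iv above can be illustrated with a minimal sketch. This is not the PCL or Terrasolid implementation: the ground threshold, cluster radius and the toy point cloud are illustrative assumptions, and the brute-force neighbour search stands in for PCL's k-d-tree-backed Euclidean cluster extraction.

```python
import numpy as np

def remove_ground(points, ground_z=0.2):
    """Step i (filtering): drop points at or below an assumed ground height."""
    return points[points[:, 2] > ground_z]

def euclidean_cluster(points, radius=1.0):
    """Steps iii-iv: group points whose mutual distance is within `radius`,
    mimicking Euclidean cluster extraction (brute-force search for clarity)."""
    n = len(points)
    labels = np.full(n, -1)          # -1 = not yet assigned to a cluster
    cluster = 0
    for seed in range(n):
        if labels[seed] != -1:
            continue
        stack = [seed]
        labels[seed] = cluster
        while stack:                 # flood-fill all points within reach
            i = stack.pop()
            d = np.linalg.norm(points - points[i], axis=1)
            for j in np.where((d < radius) & (labels == -1))[0]:
                labels[j] = cluster
                stack.append(j)
        cluster += 1
    return labels

# Toy cloud: two well-separated objects above a flat ground plane.
cloud = np.array([
    [0.0, 0.0, 0.0], [1.0, 0.0, 0.1],   # ground returns
    [0.0, 0.0, 1.0], [0.2, 0.0, 1.1],   # object A
    [5.0, 5.0, 1.0], [5.1, 5.0, 1.2],   # object B
])
non_ground = remove_ground(cloud)
labels = euclidean_cluster(non_ground)
print(len(non_ground), len(set(labels)))  # prints: 4 2
```

Ground filtering removes two of the six points, and the remaining four points separate into two clusters, one per object, which step v would then classify.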
3.3. Vision Cameras
3.4. Sonar/Ultrasound Sensor
3.5. Combining Different Technologies
- Rear—single laser sensor: detects vehicles behind (especially their velocity and distance).
- Front—two independent LIDAR sensors at 45-degree corners: detect obstacles in curves and when turning.
- Front—two powerful long-range sensors (e.g., vision cameras): provide a full understanding of the environment, such as traffic signals, pedestrians and vehicles; during cruise control, a range of over 200 m must be covered.
- Front—single infrared camera: detects pedestrians at night.
- Side—four short-range (20–30 m) RADAR sensors: determine whether a vehicle is alongside or in the blind spot.
- Corner—sonar sensors at the four corners: detect obstacles when starting and reversing (measuring range less than 2 m).
- Inside—GPS, IMU and odometry modules: obtain the exact location of the car.
- Inside—computer system: performs data fusion and processing; output is displayed on the control panel.
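A suite like the one above can be captured as a simple configuration structure and queried for coverage. This is only a sketch: the 200 m camera range and the 2 m and 20–30 m figures come from the list, while the laser, LIDAR and infrared ranges are placeholder values.

```python
from dataclasses import dataclass

@dataclass
class Sensor:
    kind: str
    position: str       # rear / front / side / corner / inside
    max_range_m: float  # 0.0 for internal modules with no detection range

SUITE = [
    Sensor("laser", "rear", 100.0),              # placeholder range
    Sensor("lidar", "front", 80.0),              # placeholder range
    Sensor("lidar", "front", 80.0),
    Sensor("camera", "front", 200.0),            # cruise-control requirement
    Sensor("infrared-camera", "front", 50.0),    # placeholder range
    *[Sensor("radar", "side", 30.0) for _ in range(4)],
    *[Sensor("sonar", "corner", 2.0) for _ in range(2)],
    Sensor("gps-imu-odometry", "inside", 0.0),
    Sensor("computer", "inside", 0.0),
]

def max_range(suite, position):
    """Longest detection range available at a given placement."""
    return max((s.max_range_m for s in suite if s.position == position),
               default=0.0)

print(max_range(SUITE, "front"))   # 200.0 — meets the >200 m cruise need
print(max_range(SUITE, "corner"))  # 2.0 — short-range sonar only
```

Representing the suite as data makes it easy for the on-board computer (or a design tool) to verify that every placement has at least one sensor with the range its function requires.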
3.6. Data Fusion
- Inertial Measurement Unit (IMU): an electronic device that dynamically measures the relative position of a vehicle. It provides high-frequency updates but can give inaccurate results due to measurement error [49].
- Wheel odometry: receives wheel-speed data from a dedicated sensor to estimate the vehicle velocity [50].
- Global Positioning System (GPS): the GPS receiver provides an absolute and exact location, although it suffers from issues such as infrequent updates, noise, and the signal being easily obstructed by surrounding solid structures, e.g., mountains or buildings [48].
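The complementary nature of these sources — high-rate relative measurements that drift (IMU, odometry) versus infrequent but absolute fixes (GPS) — is commonly reconciled with a Kalman-style filter. The one-dimensional sketch below is illustrative only: the noise levels, update rates and motion model are assumptions, not figures from the cited systems.

```python
import random

def fuse(position_est, variance_est, measurement, meas_variance):
    """One scalar Kalman update: blend the dead-reckoned estimate with an
    absolute GPS fix, weighting each by its (inverse) uncertainty."""
    gain = variance_est / (variance_est + meas_variance)
    position = position_est + gain * (measurement - position_est)
    variance = (1.0 - gain) * variance_est
    return position, variance

random.seed(1)
true_pos = 0.0
pos, var = 0.0, 0.01            # dead-reckoned estimate and its variance

for step in range(100):
    true_pos += 1.0                        # vehicle advances 1 m per step
    pos += 1.0 + random.gauss(0, 0.05)     # odometry/IMU: small per-step error
    var += 0.05 ** 2                       # uncertainty grows while coasting
    if step % 10 == 9:                     # infrequent absolute GPS fix
        gps = true_pos + random.gauss(0, 1.0)
        pos, var = fuse(pos, var, gps, 1.0 ** 2)

print(abs(pos - true_pos))  # fused error stays small despite drift
```

Without the periodic GPS corrections the dead-reckoned error would grow without bound; with them, the estimate is repeatedly pulled back toward the absolute position, while the high-rate odometry bridges the gaps between fixes.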
3.7. Shadows and Colours
4. Weather and Obstacle Detection
4.1. Statistics
4.2. Rain
4.3. Fog
4.4. Snow
5. Obstacle Detection in Urban Areas
5.1. Exhaust from Front Vehicle
5.2. Reflection from Glass and Smooth Surfaces
5.3. Small Obstacles the Vehicle Can Pass by, Drive on, Over and Through
5.4. Detection of Emergency Service Vehicles
5.5. Opening Vehicle’s Doors
5.6. Autonomous Vehicle Driving in a Smart City Context
6. Conclusions
7. Future Work
Author Contributions
Funding
Conflicts of Interest
References
- Jeppsson, H.; Östling, M.; Lubbe, N. Real life safety benefits of increasing brake deceleration in car-to-pedestrian accidents: Simulation of Vacuum Emergency Braking. Accid. Anal. Prev. 2018, 111, 311–320. [Google Scholar] [CrossRef]
- AASHTO. American Association of State Highway and Transportation Officials; Highway Safety Manual; AASHTO: Washington, DC, USA, 2010. [Google Scholar]
- Andreopoulos, A.; Tsotsos, J.K. 50 Years of object recognition: Directions forward. Comput. Vis. Image Underst. 2013, 117, 827–891. [Google Scholar] [CrossRef]
- Poczter, S.L.; Jankovic, L.M. The Google car: Driving toward a better future? J. Bus. Case Stud. 2014, 10, 7–14. [Google Scholar] [CrossRef] [Green Version]
- Xu, X.; Fan, C.K. Autonomous vehicles, risk perceptions and insurance demand: An individual survey in China. Transp. Res. Part A 2018. [Google Scholar] [CrossRef]
- Yang, Z.; Pun-Cheng, L.S.C. Vehicle detection in intelligent transportation systems and its applications under varying environments: A review. Image Vis. Comput. 2018, 69, 143–154. [Google Scholar] [CrossRef]
- Hane, C.; Sattler, T.; Pollefeys, M. Obstacle detection for self-driving cars using only monocular cameras and wheel odometry. In Proceedings of the International Conference on Intelligent Robots and Systems, Hamburg, Germany, 28 September–2 October 2015; pp. 5101–5108. [Google Scholar]
- Pelliccione, P.; Knauss, E.; Heldal, R.; Ågren, S.M.; Mallozzi, P.; Alminger, A.; Borgentun, D. Automotive Architecture Framework: The experience of Volvo Cars. J. Syst. Archit. 2017, 77, 83–100. [Google Scholar] [CrossRef]
- Gruyer, D.; Magnier, V.; Hamdi, K.; Claussmann, L.; Orfila, O.; Rakotonirainy, A. Perception, information processing and modeling: Critical stages for autonomous driving applications. Annu. Rev. Control 2017, 44, 324–340. [Google Scholar] [CrossRef]
- Kim, D.; Choi, J.; Yoo, H.; Yang, U.; Sohn, K. Rear obstacle detection system with fisheye stereo camera using HCT. Expert Syst. Appl. 2015, 42, 6295–6305. [Google Scholar] [CrossRef]
- Favarò, F.; Eurich, S.; Nader, N. Autonomous vehicles’ disengagements: Trends, triggers and regulatory limitations. Accid. Anal. Prev. 2018, 110, 136–148. [Google Scholar] [CrossRef]
- Budzan, S.; Kasprzyk, J. Fusion of 3D laser scanner and depth images for obstacle recognition. Opt. Lasers Eng. 2016, 77, 230–240. [Google Scholar] [CrossRef]
- Van Brummelen, J.; O’Brien, M.; Gruyer, D.; Najjaran, H. Autonomous vehicle perception: The technology of today and tomorrow. Transp. Res. Part C 2018, 89, 384–406. [Google Scholar] [CrossRef]
- Baltsavias, E. Airborne laser scanning: Basic relations and formulas. ISPRS J. Photogramm. Remote Sens. 1999, 54, 199–214. [Google Scholar] [CrossRef]
- Filgueira, A.; González-Jorge, H.; Lagüela, S.; Díaz-Vilariño, L.; Arias, P. Quantifying the influence of rain in LiDAR performance. Measurement 2017, 95, 143–148. [Google Scholar] [CrossRef]
- Mockel, S.; Scherer, F.; Schuster, P.F. Multi-sensor obstacle detection on railway track. In Proceedings of the IEEE IV2003 Intelligent Vehicles Symposium, Proceedings (Cat. No.03TH8683), Columbus, OH, USA, 9–11 June 2003; pp. 42–46. [Google Scholar]
- Sun, Z.; Bebis, G.; Miller, R. On-Road Vehicle Detection: A Review. IEEE Trans. Pattern Anal. Mach. Intell. 2006, 28, 694–711. [Google Scholar] [PubMed]
- Ramiya, A.M.; Nidamanuri, R.R.; Krishnan, R. Segmentation based building detection approach from LiDAR point cloud. Egypt. J. Remote Sens. Space Sci. 2017, 20, 71–77. [Google Scholar] [CrossRef] [Green Version]
- Rusu, R.B.; Cousins, S. 3D is here: Point Cloud Library (PCL). In Proceedings of the IEEE International Conference on Robotics and Automation, Shanghai, China, 9–13 May 2011. [Google Scholar]
- Javanmardi, E.; Gu, Y.; Javanmardi, M.; Kamijo, S. Autonomous vehicle self-localization based on abstract map and multichannel. IATSS Res. 2018, 43, 1–13. [Google Scholar] [CrossRef]
- Wang, H.; Wang, B.; Liu, B.; Meng, X.; Yang, G. Pedestrian recognition and tracking using 3D LiDAR for autonomous. Robot. Auton. Syst. 2017, 88, 71–78. [Google Scholar] [CrossRef]
- Liu, G.; Zhou, M.; Wang, L.; Wang, H.; Guo, X. A blind spot detection and warning system based on millimeter wave radar for driver assistance. Optik 2017, 135, 353–365. [Google Scholar] [CrossRef]
- Gibbs, G.; Jia, H.; Madani, I. Obstacle Detection with ultrasonic sensors and signal analysis metrics. Transp. Res. Procedia 2017, 28, 173–182. [Google Scholar] [CrossRef]
- Devantech. SRF02 Ultrasonic range finder-Technical Specification. 10 January 2006. Available online: http://www.robot-electronics.co.uk/htm/srf02tech.htm (accessed on 16 April 2019).
- INFINITI-USA. 2014 Infiniti QX70-AroundView® Monitor with Moving Object Detection and Sonar System. 26 July 2013. Available online: https://www.youtube.com/watch?v=7JwJj6BlpJ0 (accessed on 30 June 2019).
- Rasshofer, R.H.; Gresser, K. Automotive radar and lidar systems for next generation driver assistance functions. Adv. Radio Sci. 2005, 3, 205–209. [Google Scholar] [CrossRef] [Green Version]
- Langer, D.; Thorpe, C.E. Sonar Based Outdoor Vehicle Navigation and Collision Avoidance; The Robotics Institute, Carnegie Mellon University: Pittsburgh, PA, USA, 1992. [Google Scholar]
- Chen, X.; Kundu, K.; Zhu, Y.; Berneshawi, A.G.; Ma, H.; Fidler, S.; Urtasun, R. 3D Object Proposals for Accurate Object Class Detection. In Advances in Neural Information Processing Systems; MIT Press: Cambridge, MA, USA, 2015; pp. 424–432. [Google Scholar]
- Wang, G.; Xiao, D.; Gu, J. Review on Vehicle Detection Based on Video for Traffic Surveillance. In Proceedings of the International Conference on Automation and Logistics, Qingdao, China, 1–3 September 2008. [Google Scholar]
- Yet, W.C.; Qidwai, U. Intelligent surround sensing using fuzzy inference system. In Proceedings of the 2005 IEEE Sensors, Irvine, CA, USA, 30 October–3 November 2005; pp. 1034–1037. [Google Scholar]
- Otto, C.; Gerber, W.; León, F.P.; Wirnitzer, J. A joint integrated probabilistic data association filter for pedestrian tracking across blind regions using monocular camera and radar. In Proceedings of the 2012 IEEE Intelligent Vehicles Symposium, Alcala de Henares, Spain, 3–7 June 2012; pp. 636–641. [Google Scholar]
- Asvadi, A.; Garrote, L.; Premebida, C.; Peixoto, P.; Nunes, U.J. Multimodal vehicle detection: Fusing 3D-LIDAR and color camera data. Pattern Recognit. Lett. 2017, 1–10. [Google Scholar] [CrossRef]
- Dent, M.; Marinov, M. Introducing Automated Obstacle Detection to British Level Crossings. In Sustainable Rail Transport: Proceedings of RailNewcastle 2017; a collection of articles presented at the RailExchange conference in October 2017 at Newcastle University, Newcastle upon Tyne, UK; Fraszczyk, A., Marinov, M., Eds.; Springer: Berlin/Heidelberg, Germany, 2018; pp. 37–80. [Google Scholar] [CrossRef] [Green Version]
- Sabu, B.; Marinov, M. An Obstacle Detection System for Freight Yards. IF Ing. Ferrov. 2018, 73, 539. [Google Scholar]
- Kirk, R. Cars of the future: The Internet of things in the automotive industry. Netw. Secur. 2015, 2015, 16–18. [Google Scholar] [CrossRef]
- Seif, H.G.; Hu, X. Autonomous Driving in the iCity—HD Maps as a Key Challenge of the Automotive Industry. Engineering 2016, 2, 159–162. [Google Scholar] [CrossRef] [Green Version]
- Ramasamy, S.; Sabatini, R.; Gardi, A.; Liu, J. LIDAR obstacle warning and avoidance system for unmanned aerial vehicle sense-and-avoid. Aerosp. Sci. Technol. 2016, 55, 344–358. [Google Scholar] [CrossRef]
- Mousazadeh, H.; Jafarbiglu, H.; Abdolmaleki, H.; Omrani, E.; Monhaseri, F.; Abdollahzadeh, M.R.; Mohammadi-Aghdam, A.; Kiapei, A.; Salmani-Zakaria, Y.; Makhsoos, A. Developing a navigation, guidance and obstacle avoidance algorithm for an unmanned surface vehicle (USV) by algorithms fusion. Ocean Eng. 2018, 159, 56–65. [Google Scholar] [CrossRef]
- Wei, P.; Cagle, L.; Reza, T.; Ball, J.; Gafford, J. LiDAR and Camera Detection Fusion in a Real-Time Industrial Multi-Sensor Collision Avoidance System. Electronics 2018, 7, 84. [Google Scholar] [CrossRef] [Green Version]
- SensLTech. SensL Solid State LiDAR Design Consideration. 8 February 2017. Available online: https://www.youtube.com/watch?v=npnAr1BlQhw&t=240s&list=PL4zcvv-9jq2lyaY70fWG8SiUgs5A2sJo0&index=3 (accessed on 13 March 2019).
- Chen, Y.; Zhao, D.; Lv, L.; Zhang, Q. Multi-task learning for dangerous object detection in autonomous driving. Inf. Sci. 2018, 432, 559–571. [Google Scholar] [CrossRef]
- Diaz-Cabrera, M.; Cerri, P.; Medici, P. Robust real-time traffic light detection and distance estimation using a single camera. Expert Syst. Appl. 2015, 42, 3911–3923. [Google Scholar] [CrossRef]
- Shi, W.; Alawieh, M.B.; Li, X.; Yu, H. Algorithm and hardware implementation for visual perception system in autonomous vehicle: A survey. Integr. VLSI J. 2017, 59, 148–156. [Google Scholar] [CrossRef]
- Árnason, J.I.; Jepsen, J.; Koudal, A.; Schmidt, M.R.; Serafin, S. Volvo intelligent news: A context aware multi modal. Pervasive Mob. Comput. 2014, 14, 95–111. [Google Scholar] [CrossRef]
- Häne, C.; Heng, L.; Lee, G.H.; Fraundorfer, F.; Furgale, P.; Sattler, T.; Pollefeys, M. 3D visual perception for self-driving cars using a multi-camera system: Calibration, mapping, localization, and obstacle detection. Image Vis. Comput. 2017, 68, 14–27. [Google Scholar] [CrossRef] [Green Version]
- Ruder, M.; Mohler, N.; Ahmed, F. An obstacle detection system for automated trains. Driver assistance and operations control. In Proceedings of the IEEE IV2003 Intelligent Vehicles Symposium, Proceedings (Cat. No.03TH8683), Columbus, OH, USA, 9–11 June 2003; pp. 180–185. [Google Scholar]
- Soták, M.; Labun, J. The new approach of evaluation differential signal of airborne FMCW radar-altimeter. Aerosp. Sci. Technol. 2012, 17, 1–6. [Google Scholar] [CrossRef]
- Yao, Y.; Xu, X.; Zhu, C.; Chan, C.Y. A hybrid fusion algorithm for GPS/INS integration during GPS outages. Measurement 2017, 103, 42–51. [Google Scholar] [CrossRef]
- Jiang, W.; Yin, Z. Combining passive visual cameras and active IMU sensors for persistent pedestrian tracking. J. Vis. Commun. Image Represent. 2017, 48, 419–431. [Google Scholar] [CrossRef]
- De la Escalera, A.; Izquierdo, E.; Martín, D.; Musleh, B.; García, F.; Armingol, J.M. Stereo visual odometry in urban environments based on detecting ground features. Robot. Auton. Syst. 2016, 80, 1–10. [Google Scholar] [CrossRef]
- Im, J.H.; Im, S.H.; Jee, G.I. Vertical Corner Feature Based Precise Vehicle Localization Using 3D LIDAR in Urban Area. Sensors 2016, 16, 1268. [Google Scholar] [CrossRef] [Green Version]
- Yan, G.; Yu, M.; Yu, Y.; Fan, L. Real-time vehicle detection using histograms of oriented gradients and AdaBoost classification. Optik 2016, 127, 7941–7951. [Google Scholar] [CrossRef]
- NagaRaju, C.; NagaMani, S.; Rakesh Prasad, G.; Sunitha, S. Morphological Edge Detection Algorithm Based on Multi-Structure Elements of Different Directions. Int. J. Inf. Commun. Technol. Res. 2011, 1, 37–43. [Google Scholar]
- Betke, M.; Haritaoglu, E.; Davis, L.S. Real-Time Multiple Vehicle Detection and Tracking from a Moving Vehicle. Mach. Vis. Appl. 2000, 12, 621–631. [Google Scholar] [CrossRef] [Green Version]
- Meher, S.K.; Murty, M.N. Efficient method of moving shadow detection and vehicle classification. Int. J. Electron. Commun. 2013, 67, 665–670. [Google Scholar] [CrossRef]
- National Highway Traffic Safety Administration. The Effectiveness of Amber Rear Turn Signals for Reducing Rear Impacts; Technical Report No. DOT HS 811 115; National Highway Traffic Safety Administration: Washington, DC, USA, 2009.
- Lee, S.E.; Wierwille, W.W.; Klauer, S.G. Enhanced Rear Lighting and Signaling Systems: Literature Review and Analyses of Alternative System Concepts; National Highway Traffic Safety Administration: Washington, DC, USA, 2002.
- Goerick, C.; Noll, D.; Werner, M. Artificial Neural Networks in real-time car detection and tracking applications. Pattern Recognit. Lett. 1996, 17, 335–343. [Google Scholar] [CrossRef]
- Juang, C.F.; Chen, G.C.; Liang, C.W.; Lee, D. Stereo-camera-based object detection using fuzzy color histogramsand a fuzzy classifier with depth and shape estimations. Appl. Soft Comput. 2016, 46, 753–766. [Google Scholar] [CrossRef]
- Cucchiara, R.; Piccardi, M. Vehicle Detection under Day and Night Illumination. In Proceedings of the International Symposia on Intelligent Industrial Automation, Genova, Italy, 1–4 June 1999. [Google Scholar]
- Wu, B.F.; Huang, H.Y.; Chen, C.J.; Chen, Y.H.; Chang, C.W.; Chen, Y.L. A vision-based blind spot warning system for daytime and nighttime driver assistance. Comput. Electr. Eng. 2013, 39, 846–862. [Google Scholar] [CrossRef]
- Kim, J.H.; Batchuluun, G.; Park, K.R. Pedestrian detection based on faster R-CNN in nighttime by fusing deep convolutional features of successive images. Expert Syst. Appl. 2018, 114, 15–33. [Google Scholar] [CrossRef]
- Sarkar, B.; Saha, S.; Pal, P.K. A novel method for computation of importance weights in Monte Carlo localization on line segment-based maps. Robot. Auton. Syst. 2015, 74, 51–65. [Google Scholar] [CrossRef]
- Kim, B.; Son, J.; Sohn, K. Illumination invariant road detection based on learning method. In Proceedings of the IEEE Intelligent Transportation Systems Conference, Washington, DC, USA, 5–7 October 2011; pp. 1009–1014. [Google Scholar]
- Bassani, M.; Catani, L.; Cirillo, C.; Mutani, G. Night-time and daytime operating speed distribution in urban arterials. Transp. Res. Part F 2016, 42, 56–69. [Google Scholar] [CrossRef]
- Rasshofer, R.H.; Spies, M.; Spies, H. Influences of weather phenomena on automotive laser radar systems. Adv. Radio Sci. 2011, 9, 49–60. [Google Scholar] [CrossRef] [Green Version]
- Azam, S.; Islam, M.M. Automatic license plate detection in hazardous condition. J. Vis. Commun. Image Represent. 2016, 36, 172–186. [Google Scholar] [CrossRef]
- Hasirlioglu, S.; Riener, A.; Huber, W.; Wintersberger, P. Effects of Exhaust Gases on Laser Scanner Data Quality at Low Ambient Temperatures. In Proceedings of the IEEE Intelligent Vehicles Symposium, Redondo Beach, CA, USA, 11–14 June 2017. [Google Scholar]
- McKnight, D.; Miles, R. Impact of Reduced Visibility Conditions on Laser-Based DP Sensors; Marine Technology Society: Washington, DC, USA, 2014. [Google Scholar]
- Wang, K.; Liu, Y. Can Beijing fight with haze? Lessons can be learned from London and Los Angeles. Nat. Hazards 2014, 72, 1265–1274. [Google Scholar] [CrossRef]
- Zhang, D.; Liu, J.; Li, B. Tackling Air Pollution in China—What do We Learn from the Great Smog of 1950s in London. Sustainability 2014, 6, 5322–5338. [Google Scholar] [CrossRef] [Green Version]
- Zhu, J.; Dolgov, D.; Ferguson, D. Methods and Systems for Detecting Weather Conditions Including Fog Using Vehicle Onboard Sensors. U.S. Patent US8983705B2, 17 March 2015. [Google Scholar]
- Radecki, P.; Campbell, M.; Matzen, K. All Weather Perception: Joint Data Association, Tracking, and Classification for autonomous vehicles. arXiv 2016, arXiv:1605.02196. [Google Scholar]
- Cornick, M.; Koechling, J.; Stanley, B.; Zhang, B. Localizing Ground Penetrating RADAR: A Step toward Robust Autonomous Ground Vehicle Localization. J. Field Robot. 2016, 23, 82–102. [Google Scholar] [CrossRef]
- Jo, J.; Tsunoda, Y.; Stantic, B.; Liew, A.W.C. A Likelihood-Based Data Fusion Model for the Integration of Multiple Sensor Data: A Case Study with Vision and Lidar Sensors. In Robot Intelligence Technology and Applications 4; Advances in Intelligent Systems and Computing; Springer International Publishing: Cham, Switzerland, 2017; Volume 447, pp. 489–500. [Google Scholar]
- Chen, Y.; Han, C. Night-time pedestrian detection by visual-infrared video fusion. In Proceedings of the 7th World Congress on Intelligent Control and Automation, Chongqing, China, 25–27 June 2008; pp. 5079–5084. [Google Scholar]
- Olmeda, D.; de la Escalera, A.; Armingol, J.M. Far infrared pedestrian detection and tracking for night driving. Robotica 2011, 29, 495–505. [Google Scholar] [CrossRef]
- Franco, V.; Kousoulidou, M.; Muntean, M.; Ntziachristos, L.; Hausberger, S.; Dilara, P. Road vehicle emission factors development: A review. Atmos. Environ. 2013, 70, 84–97. [Google Scholar] [CrossRef]
- Guo, H.; Zhang, Q.; Shi, Y.; Wang, D. On-road remote sensing measurements and fuel-based motor vehicle emission inventory in Hangzhou, China. Atmos. Environ. 2007, 41, 3095–3107. [Google Scholar] [CrossRef]
- HanwhaTechwinEurope. Samsung Thermal Camera Sees through Smoke-Thermal vs Optical Camera. 15 June 2010. Available online: https://www.youtube.com/watch?v=uz0Ee8hFudY (accessed on 17 August 2018).
- Zhang, T.H.; Tang, C.W. Multiple-target tracking on mixed images with reflections and occlusions. J. Vis. Commun. Image Represent. 2018, 52, 45–57. [Google Scholar] [CrossRef]
- Pham, C.C.; Jeon, J.W. Robust object proposals re-ranking for object detection in autonomous driving using convolutional neural networks. Signal Process. Image Commun. 2017, 53, 110–122. [Google Scholar] [CrossRef]
- Yuan, Y.; Zhao, Y.; Wang, X. Day and Night Vehicle Detection and Counting in Complex Environment. In Proceedings of the 28th International Conference on Image and Vision Computing, Wellington, New Zealand, 27–29 November 2013. [Google Scholar]
- Weiss, Y. Deriving intrinsic images from image sequences. In Proceedings of the IEEE International Conference on Computer Vision, Vancouver, BC, Canada, 7–14 July 2001; Volume 2, pp. 68–75. [Google Scholar]
- Mani, S. Intelligent Pothole Detection. 19 June 2018. Available online: https://www.youtube.com/watch?v=w6RMC_io--U (accessed on 23 August 2018).
- Miller, J.S.; Bellinger, W.Y. Distress Identification Manual for the Long-Term Pavement Maintenance Program; Federal Highway Administration: Washington, DC, USA, 2003.
- Repairer Driven News. Jaguar Land Rover Pothole Detection System. 11 June 2015. Available online: https://www.youtube.com/watch?v=KQIL5585pPA (accessed on 23 August 2018).
- Ouma, Y.O.; Hahn, M. Pothole detection on asphalt pavements from 2D-colour pothole images using fuzzy c-means clustering and morphological reconstruction. Autom. Constr. 2017, 83, 196–211. [Google Scholar] [CrossRef]
- Koch, C.; Brilakis, I. Pothole detection in asphalt pavement images. Adv. Eng. Inform. 2011, 25, 507–515. [Google Scholar] [CrossRef]
- Huidrom, L.; Das, L.K.; Sud, S.K. Method for automated assessment of potholes, cracks and patches from road surface video clips. Procedia Soc. Behav. Sci. 2013, 104, 312–321. [Google Scholar] [CrossRef] [Green Version]
- Derevitskii, I.; Kurilkin, A.; Bochenina, K. Use of video data for analysis of special transport movement. Procedia Comput. Sci. 2017, 119, 262–268. [Google Scholar] [CrossRef]
- D. f. &. r. r. CC TUBE. OPEN DOORS DAY! CAR CRASH COMPILATION. 6 May 2016. Available online: https://www.youtube.com/watch?v=imy4xYr9GM0 (accessed on 29 June 2019).
- McIntyre, S.; Gugerty, L.; Duchowski, A. Brake lamp detection in complex and dynamic environments: Recognizing limitations of visual attention and perception. Accid. Anal. Prev. 2012, 45, 588–599. [Google Scholar] [CrossRef] [PubMed]
- eMarketer. 2 Billion Consumers Worldwide to Get Smartphones by 2016. 11 December 2014. Available online: www.emarketer.com/Article/2-Billion-Consumers-World-wide-Smartphones-by-2016/1011694 (accessed on 27 November 2019).
- Vasseur, J.; Dunkels, A. Smart Cities and Urban Networks. In Interconnecting Smart Objects with IP: The Next Internet; Morgan Kaufmann: Burlington, MA, USA, 2010; pp. 335–351. [Google Scholar]
- Fagnant, D.J.; Kockelman, K. Preparing a nation for autonomous vehicles: Opportunities, barriers and policy recommendations. Transp. Res. Part A 2015, 77, 167–181. [Google Scholar] [CrossRef]
- Atiyeh, C. Predicting Traffic Patterns, One Honda at a Time. MSN Auto, 25 June 2012. [Google Scholar]
- Bullis, K. How Vehicle Automation Will Cut Fuel Consumption. MIT’s Technology Review, 24 October 2011. [Google Scholar]
- Kunze, R.; Ramakers, R.; Henning, K.; Jeschke, S. Organization of electronically coupled truck platoons on German motorways. In Proceedings of the Intelligent Robotics and Applications: Second International Conference, Singapore, 16–18 December 2009; Volume 5928, pp. 135–146. [Google Scholar]
- Inspex-H2020-project. Overall Presentation of Inspex Project. 20 March 2017. Available online: http://www.inspex-ssi.eu/Pages/Presentation-overall.aspx (accessed on 13 September 2019).
| Vehicles | Autonomous Freeway Driving | Autonomous Lane Change | Semi-Autonomous Parking | Semi-Autonomous Braking |
|---|---|---|---|---|
| BMW 750i xDrive (BMW, 2017; Sherman, 2016) | √ | √ | √ | |
| Ford (high-end production vehicles) (Company, 2014) | √ | √ | √ | |
| 2015 Infiniti Q50S (Sherman, 2016) | √ | √ | | |
| Lexus RX (Lexus, 2017) | √ | √ | | |
| Mercedes-Benz E- and S-Class (Sherman, 2016; Mercedes-Benz, 2013; Ulrich, 2014; Tingwall, 2013; Vanderbilt, 2012) | √ | √ | √ | |
| Otto Semi-Trucks (Stewart, 2016) | √ | √ | | |
| Renault GT Nav (Renault, 2017) | √ | √ | | |
| Tesla Model S (Golson, 2016; Sherman, 2016) | √ | √ | √ | √ |
| Volvo XC90 (Volvo, 2016) | √ | √ | √ | |
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Yu, X.; Marinov, M. A Study on Recent Developments and Issues with Obstacle Detection Systems for Automated Vehicles. Sustainability 2020, 12, 3281. https://doi.org/10.3390/su12083281