Human–Machine Interface in Transport Systems: An Industrial Overview for More Extended Rail Applications
Abstract
1. Introduction
2. Organization of State-of-the-Art Investigation
- Automatic and semiautomatic control systems developed by manufacturers of surface vehicles outside the railway field, including trucks, cars and ships, were detailed. Without claiming universality, selection criteria considered the potential application of automated driving functions to railway vehicles. They were required to ensure the high level of safety typical of guided transport systems and the possibility of a fast switch from autopilot to manual operation and vice versa.
- Simulators were developed for various transport systems, including rail, cars, aviation and integrated solutions. Their selection criteria were in line with those applied in Section 3, with the additional requirement of consolidated use for driver training in the respective operational fields. For rail simulation, the focus was on systems developed by the authors in previous research activities that were able to deal with operational situations, including traffic conflicts (e.g., nodes, junctions, level crossings). Some interactive functions were also imported from game simulators; however, these had a confined role due to the need to ensure safety levels compatible with strict railway requirements.
- An extended literature review on the performance of supporting tools for driver assistance was conducted. The focus was predominantly on the role of human behaviors and the extent to which various experimental applications take them into account, together with a broad review of gesture control systems as operational tools to put such assistance into practice.
- A summary analysis of the results and the corresponding conclusions was presented.
3. Systems Developed by Manufacturers of Non-Rail Surface Vehicles
3.1. Trucks
3.2. Cars
- Eight cameras providing 360° visibility;
- Twelve ultrasonic sensors for the detection of hard and soft objects at a distance, with double the accuracy of previous systems;
- A forward-facing radar with high processing capability, providing additional data on the surrounding environment at wavelengths able to penetrate heavy rain, fog, dust and even the car ahead.
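To make the listed sensor suite concrete, the following Python sketch models the camera, ultrasonic and radar channels and a naive readiness check before the autopilot is engaged. All class names, fields and numeric ranges are illustrative assumptions and are not taken from any manufacturer's software.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SensorChannel:
    """One sensing channel of the vehicle (illustrative model)."""
    name: str
    count: int
    max_range_m: float
    operational: bool = True

@dataclass
class SensorSuite:
    """Hypothetical suite loosely based on the configuration listed above."""
    channels: List[SensorChannel] = field(default_factory=list)

    def ready_for_autopilot(self) -> bool:
        # Naive readiness rule: every declared channel must be operational.
        return all(ch.operational for ch in self.channels)

suite = SensorSuite(channels=[
    SensorChannel("surround_camera", count=8, max_range_m=250.0),   # assumed range
    SensorChannel("ultrasonic", count=12, max_range_m=8.0),         # assumed range
    SensorChannel("forward_radar", count=1, max_range_m=160.0),     # assumed range
])

print(suite.ready_for_autopilot())  # True while all channels report operational
```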
- The steering wheel retracts, providing more room for passengers;
- The pedals retract, creating a flat surface on the footwell;
- The driver and front passenger can turn around to face the passengers in the rear seats;
- Various displays provide information about the surrounding area.
3.3. Ships
4. Simulators of Transport Systems
4.1. Rail Simulators
- Immersion in a sonic and visual environment;
- Integration between real components and a simulated environment;
- Management of driver information.
- A driving simulator desk used in combination with a 3D representation of tracks;
- A traffic simulator, acting both as a single train management tool and for railway traffic control;
- A test bench connected with onboard ERTMS equipment, in compliance with specifications and rules.
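A minimal sketch of how these three blocks could be coupled in a simulation loop is shown below. The interfaces and names (DrivingDesk, TrafficSimulator, ErtmsTestBench) are assumptions made for illustration and do not reproduce the architecture of any specific product.

```python
class DrivingDesk:
    """Driver's desk coupled to a 3D track representation (illustrative)."""
    def read_controls(self) -> dict:
        return {"traction": 0.4, "brake": 0.0}

class TrafficSimulator:
    """Single-train management and network traffic control (illustrative)."""
    def __init__(self):
        self.speed_kmh = 0.0

    def advance(self, controls: dict, dt_s: float) -> dict:
        # Crude kinematics: traction accelerates, braking decelerates.
        self.speed_kmh += (40.0 * controls["traction"] - 80.0 * controls["brake"]) * dt_s
        self.speed_kmh = max(self.speed_kmh, 0.0)
        return {"speed_kmh": self.speed_kmh, "conflict_ahead": False}

class ErtmsTestBench:
    """Stub standing in for the onboard ERTMS equipment interface (illustrative)."""
    CEILING_KMH = 160.0
    def supervise(self, state: dict) -> str:
        return "INTERVENTION" if state["speed_kmh"] > self.CEILING_KMH else "NORMAL"

def simulation_step(desk, traffic, ertms, dt_s=0.1):
    controls = desk.read_controls()
    state = traffic.advance(controls, dt_s)
    return state, ertms.supervise(state)

state, supervision = simulation_step(DrivingDesk(), TrafficSimulator(), ErtmsTestBench())
print(state, supervision)
```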
4.2. Car Simulators
- Sensors for the control, communication and processing of dashboard information;
- Images of road scenes projected on five frontal screens covering a 200° × 40° visual field;
- A device providing rear-view images;
- Quadraphonic sound reproducing internal (motor, rolling, starter) and external traffic noises.
- A finger-tracking system;
- Tactile displays and dynamic content;
- Windshield or glasshouse reflection studies based on physically accurate reflection simulations;
- Testing and validation of head-up displays, specifying and improving optical performance and the quality of the content.
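One elementary check underlying head-up display validation is symbol legibility against the background luminance. The short Python function below computes a conventional contrast ratio and compares it with an assumed threshold; the function name, the example luminance values and the threshold are illustrative and are not VRXPERIENCE parameters.

```python
def hud_contrast_ratio(symbol_luminance_cd_m2: float,
                       background_luminance_cd_m2: float) -> float:
    """Contrast ratio CR = (L_symbol + L_background) / L_background."""
    return (symbol_luminance_cd_m2 + background_luminance_cd_m2) / background_luminance_cd_m2

# Illustrative check against an assumed daytime legibility threshold.
MIN_DAYTIME_CR = 1.2  # assumption for this sketch, not a normative value
cr = hud_contrast_ratio(symbol_luminance_cd_m2=12000.0,
                        background_luminance_cd_m2=8000.0)
print(cr, cr >= MIN_DAYTIME_CR)
```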
4.3. Aviation Simulators
- An intuitive lesson plan builder;
- A 3D map of flight paths with event markers;
- Increased information density;
- Ergonomic redesign of interiors (Figure 5).
- A visual system with high-definition commercial projectors;
- Up to 220° × 80° field-of-view direct projection dome, with full chin window coverage tailored to helicopter training operations.
- Human cockpit operations analysis module, with human factor methods demonstrated in the project prototype;
- Semantic virtual cockpit, with a semantic virtual scene-graph, knowledge-based reasoning on objects and intelligent querying functions, providing a scene-graph and human-task data processing and management engine;
- Virtual cockpit design environment, with human ergonomics evaluation software based upon the Airbus flight simulator, to develop a new general user interface for cockpits.
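The idea of a semantic scene-graph with knowledge-based querying can be illustrated in a few lines of code: cockpit objects become graph nodes annotated with semantic tags, and queries filter the graph by those tags. The structure below is a simplified sketch under these assumptions, not the iVISION implementation.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class SceneNode:
    """A cockpit object with semantic annotations (illustrative)."""
    name: str
    tags: Dict[str, str] = field(default_factory=dict)
    children: List["SceneNode"] = field(default_factory=list)

def query(node: SceneNode, **criteria) -> List[SceneNode]:
    """Return all nodes whose tags match every given key/value pair."""
    hits = [node] if all(node.tags.get(k) == v for k, v in criteria.items()) else []
    for child in node.children:
        hits.extend(query(child, **criteria))
    return hits

cockpit = SceneNode("cockpit", children=[
    SceneNode("PFD", tags={"type": "display", "pilot_task": "monitor_attitude"}),
    SceneNode("gear_lever", tags={"type": "control", "pilot_task": "configure_landing"}),
])

print([n.name for n in query(cockpit, type="display")])  # ['PFD']
```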
4.4. Integrated Simulators
5. Support Tools for Driver Assistance
5.1. Human Factors and Their Limits
- To explain results in a pedagogical way [17];
5.2. Gesture Control Technology
6. Analysis of Results and Conclusions
- DRIVE PILOT, which provides automation with a fallback working mode to switch to driver responsibility;
- iNEXT-COPILOT, which works by voice or tactile command to switch to/from automatic driving.
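Both systems essentially implement a transition policy between manual and automatic driving, with a fallback that hands responsibility back to the driver. The sketch below encodes such a policy as a small state machine; the states, events and the hands-on-wheel confirmation are generic assumptions for illustration and do not reproduce either manufacturer's logic.

```python
from enum import Enum, auto

class Mode(Enum):
    MANUAL = auto()
    AUTOMATIC = auto()
    FALLBACK = auto()   # automation degraded, driver asked to take over

class DrivingModeManager:
    """Illustrative manual/automatic switching with a fallback hand-back step."""
    def __init__(self):
        self.mode = Mode.MANUAL

    def engage_autopilot(self, driver_request: bool, system_healthy: bool) -> Mode:
        if driver_request and system_healthy and self.mode is Mode.MANUAL:
            self.mode = Mode.AUTOMATIC
        return self.mode

    def report_degradation(self) -> Mode:
        if self.mode is Mode.AUTOMATIC:
            self.mode = Mode.FALLBACK   # request driver take-over
        return self.mode

    def driver_takes_over(self, hands_on_wheel: bool) -> Mode:
        if hands_on_wheel:
            self.mode = Mode.MANUAL
        return self.mode

mgr = DrivingModeManager()
mgr.engage_autopilot(driver_request=True, system_healthy=True)
mgr.report_degradation()
print(mgr.driver_takes_over(hands_on_wheel=True))  # Mode.MANUAL
```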
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Deng, Q. A General Simulation Framework for Modeling and Analysis of Heavy-Duty Vehicle Platooning. IEEE Trans. Intell. Transp. Syst. 2016, 17, 3252–3262. [Google Scholar] [CrossRef]
- Daimler. Highway Pilot. The Autopilot for Trucks. 2020. Available online: https://www.daimler.com/innovation/case/autonomous/highway-pilot-2.html (accessed on 28 January 2021).
- Tesla. Il Futuro Della Guida. 2020. Available online: https://www.tesla.com/it_IT/autopilot?redirect=no (accessed on 28 January 2021).
- Maximilian, J. 2019. Available online: https://commons.wikimedia.org/wiki/File:BMW_Vision_iNEXT_IAA_2019_JM_0166.jpg (accessed on 28 January 2021).
- Rekdalsbakken, W.; Styve, A. Simulation of Intelligent Ship Autopilots. In Proceedings of the 22nd European Conference on Modelling and Simulation, Nicosia, Cyprus, 3–6 June 2008. [Google Scholar]
- PSCHITT-Rail Collaborative. Hybrid, Intermodal Simulation Platform in Land Transport—Rail. Available online: https://www.uphf.fr/LAMIH/en/PSCHITT-Rail (accessed on 28 January 2021).
- OKTAL SYDAC. Conception. 2020. Available online: https://www.oktalsydac.com/en/ (accessed on 28 January 2021).
- IFSTTAR. Institut Français des Sciences et Technologies des Transports, de l’Aménagement et des Réseaux. 2020. Available online: https://www.ifsttar.fr/en/exceptional-facilities/simulators/ (accessed on 28 January 2021).
- NVIDIA DRIVE. Scalable AI Platform for Autonomous Driving. 2019. Available online: https://www.nvidia.com/en-us/self-driving-cars/drive-platform/ (accessed on 28 January 2021).
- Ansys. VRX Dynamic Driving Experience. 2020. Available online: https://www.ansys.com/products/systems/ansys-vrxperience (accessed on 28 January 2021).
- Ansys. Ansys VRXPERIENCE HMI. 2020. Available online: https://www.ansys.com/products/systems/ansys-vrxperience/hmi (accessed on 28 January 2021).
- Epagnoux, S. CAE Flight Simulator. 2020. Available online: https://commons.wikimedia.org/wiki/File:CAE-flight-simulator-Lockheed-Martin-Boeing-Airbus-aerospace-industry-Canada-EDIWeekly.jpg (accessed on 28 January 2021).
- CAE. CAE 3000 Series Flight Simulator. 2020. Available online: https://www.cae.com/civil-aviation/training-equipment-and-aviation-services/training-equipment/full-flight-simulators/cae3000/ (accessed on 28 January 2021).
- Alsim. Alsim Flight Training Solutions. Alsim Simulators & Technology. 2020. Available online: https://www.alsim.com/simulators (accessed on 28 January 2021).
- Vanderhaegen, F.; Richard, P. MissRail: A platform dedicated to training and research in railway systems. In Proceedings of the International Conference HCII, Heraklion, Greece, 22–27 June 2014; pp. 544–549. [Google Scholar]
- Vanderhaegen, F. MissRail® and Innorail. 2015. Available online: http://www.missrail.org (accessed on 28 January 2021).
- Vanderhaegen, F. Pedagogical learning supports based on human–systems inclusion applied to rail flow control. Cogn. Technol. Work 2019. [Google Scholar] [CrossRef]
- Vanderhaegen, F.; Jimenez, V. The amazing human factors and their dissonances for autonomous Cyber-Physical & Human Systems. In Proceedings of the First IEEE Conference on Industrial Cyber-Physical Systems, Saint-Petersburg, Russia, 15–18 May 2018; pp. 597–602. [Google Scholar]
- Fond, G.; MacGregor, A.; Leboyer, M.; Michalsen, A. Fasting in mood disorders: Neurobiology and effectiveness. A review of the literature. Psychiatry Res. 2013, 209, 253–258. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Smith, A. Effects of chewing gum on cognitive function, mood and physiology in stressed and non-stressed volunteers. Nutr. Neurosci. 2010, 13, 7–16. [Google Scholar] [CrossRef] [PubMed]
- Onyper, S.V.; Carr, T.L.; Farrar, J.S.; Floyd, B.R. Cognitive advantages of chewing gum. Now you see them, now you don’t. Appetite 2011, 57, 321–328. [Google Scholar] [CrossRef] [PubMed]
- Mori, F.; Naghsh, F.A.; Tezuka, T. The effect of music on the level of mental concentration and its temporal change. In Proceedings of the 6th International Conference on Computer Supported Education, Barcelona, Spain, 1–3 April 2014; pp. 34–42. [Google Scholar] [CrossRef]
- Chtourou, H.; Briki, W.; Aloui, A.; Driss, T.; Souissi, N.; Chaouachi, A. Relationship between music and sport performance: Toward a complex and dynamical perspective. Sci. Sports 2015, 30, 119–125. [Google Scholar] [CrossRef]
- Stanton, N.A.; Young, M.S. Driver behaviour with adaptive cruise control. Ergonomics 2005, 48, 1294–1313. [Google Scholar] [CrossRef] [Green Version]
- Schömig, N.; Hargutt, V.; Neukum, A.; Petermann Stock, I.; Othersen, I. The interaction between highly automated driving and the development of drowsiness. Procedia Manuf. 2015, 3, 6652–6659. [Google Scholar] [CrossRef] [Green Version]
- Vogelpohl, T.; Kühn, M.; Hummel, T.; Vollrath, M. Asleep at the automated wheel: Sleepiness and fatigue during highly automated driving. Accid. Anal. Prev. 2019, 126, 70–84. [Google Scholar] [CrossRef]
- Borghini, G.; Astolfi, L.; Vecchiato, G.; Mattia, D.; Babiloni, F. Measuring neurophysiological signals in aircraft pilots and car drivers for the assessment of mental workload, fatigue and drowsiness. Neurosci. Biobehav. Rev. 2014, 44, 58–75. [Google Scholar] [CrossRef]
- Thomas, L.C.; Gast, C.; Grube, R.; Craig, K. Fatigue detection in commercial flight operations: Results using physiological measures. Procedia Manuf. 2015, 3, 2357–2364. [Google Scholar] [CrossRef] [Green Version]
- Wanyan, X.; Zhuang, D.; Zhang, H. Improving pilot mental workload evaluation with combined measures. BioMed Mater. Eng. 2014, 24, 2283–2290. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Pereda-Baños, A.; Arapakis, I.; Barreda-Ángeles, M. On human information processing in information retrieval (position paper). In Proceedings of the SIGIR Workshop Neuro-Physiological Methods IR, Santiago, Chile, 13 August 2015. [Google Scholar]
- Hensch, A.C.; Rauh, N.; Schmidt, C.; Hergeth, S.; Naujoks, F.; Krems, J.F.; Keinath, A. Effects of secondary tasks and display position on glance behavior during partially automated driving. Transp. Res. Part F Traffic Psychol. Behav. 2020, 68, 23–32. [Google Scholar] [CrossRef]
- De Winter, J.C.; Happee, R.; Martens, M.H.; Stanton, N.A. Effects of adaptive cruise control and highly automated driving on workload and situation awareness: A review of the empirical evidence. Transp. Res. Part F Traffic Psychol. Behav. 2014, 27, 196–217. [Google Scholar] [CrossRef] [Green Version]
- Merat, N.; Jamson, A.H.; Lai, F.C.; Daly, M.; Carsten, O.M. Transition to manual: Driver behaviour when resuming control from a highly automated vehicle. Transp. Res. Part F Traffic Psychol. Behav. 2014, 27, 274–282. [Google Scholar] [CrossRef] [Green Version]
- Di Stasi, L.L.; Contreras, D.; Cañas, J.J.; Cándido, A.; Maldonado, A.; Catena, A. The consequences of unexpected emotional sounds on driving behaviour in risky situations. Saf. Sci. 2010, 48, 1463–1468. [Google Scholar] [CrossRef]
- Sanderson, P.; Crawford, J.; Savill, A.; Watson, M.; Russell, W.J. Visual and auditory attention in patient monitoring: A formative analysis. Cogn. Technol. Work 2004, 6, 172–185. [Google Scholar] [CrossRef]
- Jakus, G.; Dicke, C.; Sodnikv, J. A user study of auditory, head-up and multi-modal displays in vehicles. Appl. Ergon. 2015, 46, 184–192. [Google Scholar] [CrossRef]
- Geitner, C.; Biondi, F.; Skrypchuk, L.; Jennings, P.; Birrell, S. The comparison of auditory, tactile, and multimodal warnings for the effective communication of unexpected events during an automated driving scenario. Transp. Res. Part F Traffic Psychol. Behav. 2019, 65, 23–33. [Google Scholar] [CrossRef]
- Salminen, K.; Farooq, A.; Rantala, J.; Surakka, V.; Raisamo, R. Unimodal and Multimodal Signals to Support Control Transitions in Semiautonomous Vehicles. In Proceedings of the 11th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Utrecht, The Netherlands, 22–25 September 2019; Association for Computing Machinery: New York, NY, USA, 2019; pp. 308–318. [Google Scholar]
- Dalton, B.H.; Behm, D.G. Effects of noise and music on human and task performance: A systematic review. Occup. Ergon. 2007, 7, 143–152. [Google Scholar]
- Prince-Paul, M.; Kelley, C. Mindful communication: Being present. Semin. Oncol. Nurs. 2017, 33, 475–482. [Google Scholar] [CrossRef]
- Vanderhaegen, F. Dissonance engineering: A new challenge to analyze risky knowledge when using a system. Int. J. Comput. Commun. Control 2014, 9, 750–759. [Google Scholar] [CrossRef]
- Vanderhaegen, F. A rule-based support system for dissonance discovery and control applied to car driving. Expert Syst. Appl. 2016, 65, 361–371. [Google Scholar] [CrossRef]
- Vanderhaegen, F. Towards increased systems resilience: New challenges based on dissonance control for human reliability in Cyber-Physical & Human Systems. Annu. Rev. Control 2017, 44, 316–322. [Google Scholar]
- Dufour, A. Driving assistance technologies and vigilance: Impact of speed limiters and cruise control on drivers’ vigilance. In Proceedings of the Seminar on the Impact of Distracted Driving and Sleepiness on Road Safety, Paris, France, 15 April 2014. [Google Scholar]
- JTSB. Aircraft Serious Incident—Investigation Report; Report AI2008–01; JTSB: Tokyo, Japan, 2008. Available online: https://www.mlit.go.jp/jtsb/eng-air_report/JA767F_JA8967.pdf (accessed on 28 January 2021).
- Galluscio, E.H.; Fjelde, K. Eye movement and reaction time measures of the effectiveness of caution signs. Saf. Sci. 1993, 16, 627–635. [Google Scholar] [CrossRef]
- Rosch, J.L.; Vogel-Walcutt, J.J. A review of eye-tracking applications as tools for training. Cogn. Technol. Work 2013, 15, 313–327. [Google Scholar] [CrossRef]
- De Winter, J.C.F.; Eisma, Y.B.; Cabrall, C.D.D.; Hancock, P.A.; Stanton, N.A. Situation awareness based on eye movements in relation to the task environment. Cogn. Technol. Work 2018, 21, 99–111. [Google Scholar] [CrossRef] [Green Version]
- Samima, S.; Sarma, S.; Samanta, D.; Prasad, G. Estimation and quantification of vigilance using ERPs and eye blink rate with a fuzzy model-based approach. Cogn. Technol. Work 2019, 21, 517–533. [Google Scholar] [CrossRef]
- Beatty, J. Task-evoked pupillary responses, processing load, and the structure of processing resources. Psychol. Bull. 1982, 91, 276–292. [Google Scholar] [CrossRef]
- Fletcher, K.; Neal, A.; Yeo, G. The effect of motor task precision on pupil diameter. Appl. Ergon. 2017, 65, 309–315. [Google Scholar] [CrossRef]
- Fogarty, C.; Stern, J.A. Eye movements and blinks: Their relationship to higher cognitive processes. Int. J. Psychophysiol. 1989, 8, 35–42. [Google Scholar] [CrossRef]
- Benedetto, S.; Pedrotti, M.; Minin, L.; Baccino, T.; Re, A.; Montanari, R. Driver workload and eye blink duration. Transp. Res. Part F Traffic Psychol. Behav. 2011, 14, 199–208. [Google Scholar] [CrossRef]
- Tsai, Y.F.; Viirre, E.; Strychacz, C.; Chase, B.; Jung, T.P. Task performance and eye activity: Predicting behavior relating to cognitive workload. Aviat. Space Environ. Med. 2007, 78, 176–185. [Google Scholar]
- Recarte, M.A.; Pérez, E.; Conchillo, A.; Nunes, L.M. Mental workload and visual impairment: Differences between pupil, blink, and subjective rating. Span. J. Psychol. 2008, 11, 374–385. [Google Scholar] [CrossRef] [Green Version]
- Findlay, J.M. Visual selection, covert attention and eye movements. In Active Vision: The Psychology of Looking and Seeing; Oxford Psychology Series; Oxford University Press: Oxford, UK, 2003; pp. 35–54. [Google Scholar] [CrossRef]
- Taelman, J.; Vandeput, S.; Spaepen, A.; Van Huffel, S. Influence of mental stress on heart rate and heart rate variability. In 4th European Conference of the International Federation for Medical and Biological Engineering Proceedings; Springer: Berlin/Heidelberg, Germany, 2009; pp. 1366–1369. [Google Scholar] [CrossRef]
- Geisler, F.C.M.; Vennewald, N.; Kubiak, T.; Weber, H. The impact of heart rate variability on subjective well-being is mediated by emotion regulation. Personal. Individ. Differ. 2010, 49, 723–728. [Google Scholar] [CrossRef]
- Pizziol, S.; Dehais, F.; Tessier, C. Towards human operator state assessment. In Proceedings of the 1st International Conference on Application and Theory of Automation in Command and Control Systems, Barcelona, Spain, 26–27 May 2011; IRIT Press: Oxford, UK, 2011; pp. 99–106. [Google Scholar]
- Hidalgo-Muñoz, A.R.; Mouratille, D.; Matton, N.; Causse, M.; Rouillard, Y.; El-Yagoubi, R. Cardiovascular correlates of emotional state, cognitive workload and time on-task effect during a realistic flight simulation. Int. J. Psychophysiol. 2018, 128, 62–69. [Google Scholar] [CrossRef] [Green Version]
- Salomon, R.; Ronchi, R.; Dönz, J.; Bello-Ruiz, J.; Herbelin, B.; Martet, R.; Faivre, N.; Schaller, K.; Blanke, O. The insula mediates access to awareness of visual stimuli presented synchronously to the heartbeat. J. Neurosci. 2016, 36, 5115–5127. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Vanderhaegen, F.; Wolff, M.; Ibarboure, S.; Mollard, R. Heart-Computer synchronization Interface to control human-machine symbiosis: A new human availability support for cooperative systems. In Proceedings of the 14th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Human-Machine Systems, Tallinn, Estonia, 16–19 September 2019. [Google Scholar] [CrossRef]
- Vanderhaegen, F. Multilevel organization design: The case of the air traffic control. Control Eng. Pract. 1997, 5, 391–399. [Google Scholar] [CrossRef]
- Vanderhaegen, F. Toward a model of unreliability to study error prevention supports. Interact. Comput. 1999, 11, 575–595. [Google Scholar] [CrossRef]
- Vanderhaegen, F. Human-error-based design of barriers and analysis of their uses. Cogn. Technol. Work 2010, 12, 133–142. [Google Scholar] [CrossRef]
- Dehzangi, O.; Rajendra, V.; Taherisadr, M. Wearable driver distraction identification on the road via continuous decomposition of galvanic skin responses. Sensors 2018, 18, 503. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Chen, L.L.; Zhao, Y.; Ye, P.F.; Zhang, J.; Zou, J.Z. Detecting driving stress in physiological signals based on multimodal feature analysis and kernel classifiers. Expert Syst. Appl. 2017, 85, 279–291. [Google Scholar] [CrossRef]
- Collet, C.; Salvia, E.; Petit-Boulanger, C. Measuring workload with Electrodermal activity during common braking actions. Ergonomics 2014, 57, 886–896. [Google Scholar] [CrossRef]
- De Naurois, C.J.; Bourdin, C.; Stratulat, A.; Diaz, E.; Vercher, J.L. Detection and prediction of driver drowsiness using artificial neural network models. Accid. Anal. Prev. 2019, 126, 95–104. [Google Scholar] [CrossRef]
- Ngxande, M.; Tapamo, J.R.; Burke, M. Driver drowsiness detection using behavioral measures and machine learning techniques: A review of state-of-art techniques. In Proceedings of the 2017 Pattern Recognition Association of South Africa and Robotics and Mechatronics (PRASA-RobMech), Bloemfontein, South Africa, 30 November–1 December 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 156–161. [Google Scholar]
- Zhao, L.; Wang, Z.; Wang, X.; Liu, Q. Driver drowsiness detection using facial dynamic fusion information and a DBN. IET Intell. Transp. Syst. 2017, 12, 127–133. [Google Scholar] [CrossRef]
- Lim, S.; Yang, J.H. Driver state estimation by convolutional neural network using multimodal sensor data. Electron. Lett. 2016, 52, 1495–1497. [Google Scholar] [CrossRef] [Green Version]
- Shukla, J.; Barreda-Ángeles, M.; Oliver, J.; Puig, D. Efficient wavelet-based artefact removal for Electrodermal activity in real-world applications. Biomed. Signal Process. Control 2018, 42, 45–52. [Google Scholar] [CrossRef]
- Li, J.; Cheng, K.; Wang, S.; Morstatter, F.; Trevino, R.P.; Tang, J.; Liu, H. Feature selection: A data perspective. ACM Comput. Surv. 2017, 50, 94. [Google Scholar] [CrossRef] [Green Version]
- Koifman, V. SoftKinetic CARlib. 2016. Available online: http://www.f4news.com/2016/06/24/softkinetic-carlib/ (accessed on 28 January 2021).
- Dhall, P. EyeDrive: A Smart Drive. BWCIO BUSINESSWORLD. 2019. Available online: http://bwcio.businessworld.in/article/EyeDrive-A-smart-drive-/05-07-2019-172905/ (accessed on 28 January 2021).
- Boulestin, R. L’Haptix Transforme Toute Surface en Interface Tactile. 2013. Available online: https://www.silicon.fr/lhaptix-transforme-toute-surface-en-interface-tactile-88560.html (accessed on 28 January 2021).
- Ganguly, B.; Vishwakarma, P.; Biswas, S.; Rahul, S. Kinect Sensor Based Single Person Hand Gesture Recognition for Man–Machine Interaction. Comput. Adv. Commun. Circuits Lect. Notes Electr. Eng. 2020, 575, 139–144. [Google Scholar]
- Saha, S.; Lahiri, R.; Konar, A. A Novel Approach to Kinect-Based Gesture Recognition for HCI Applications. In Handbook of Research on Emerging Trends and Applications of Machine Learning; IGI Global: Hershey, PA, USA, 2020; pp. 62–78. [Google Scholar]
- Georgiou, O.; Biscione, V.; Harwood, A.; Griffiths, D.; Giordano, M.; Long, B.; Carter, T. Haptic In-Vehicle Gesture Controls. In Proceedings of the 9th International ACM Conference on Automotive User Interfaces and Interactive Vehicular Applications, Automotive, Oldenburg, Germany, 24–27 September 2017. [Google Scholar]
- He, S.; Yang, C.; Wang, M.; Cheng, L.; Hu, Z. Hand gesture recognition using MYO armband. In Proceedings of the Chinese Automation Congress, Jinan, China, 20–22 October 2017; pp. 4850–4855. [Google Scholar]
- Wong, A.M.H.; Furukawa, M.; Ando, H.; Maeda, T. Dynamic Hand Gesture Authentication using Electromyography (EMG). In Proceedings of the IEEE/SICE International Symposium on System Integration, Honolulu, HI, USA, 12–15 January 2020; pp. 300–304. [Google Scholar]
- Anderson, T. OK, Google. We’ve Got Just the Gesture for You: Hand-Tracking Project Soli Coming to Pixel 4. The Register. 2019. Available online: https://www.theregister.co.uk/2019/07/30/google_project_soli_coming_to_pixel_4/ (accessed on 28 January 2021).
- Raphael, J.R. Project Soli in Depth: How Radar-Detected Gestures Could Set the Pixel 4 Apart. COMPUTERWORLD. 2019. Available online: https://www.computerworld.com/article/3402019/google-project-soli-pixel-4.html (accessed on 28 January 2021).
- Priest, D. The Fibaro Swipe Makes Your Hand the Remote. CNET. 2016. Available online: https://www.cnet.com/reviews/fibaro-swipe-preview/ (accessed on 28 January 2021).
- Shankland, S. 'Minority Report' Gesture Control Is About to Get Very Real. CNET. 2018. Available online: https://www.cnet.com/news/sony-builds-eyesight-gesture-control-tech-into-xperia-touch/ (accessed on 28 January 2021).
- Zhao, L. Gesture Control Technology: An Investigation on the Potential Use in Higher Education; University of Birmingham, IT Innovation Centre: Birmingham, UK, 2016. [Google Scholar]
- Malavasi, G.; Ricci, S. Simulation of stochastic elements in railway systems using self-learning processes. Eur. J. Oper. Res. 2001, 131, 262–272. [Google Scholar] [CrossRef]
- Ricci, S.; Tieri, A. Check and forecasting of railway traffic regularity by a Petri Nets based simulation model. Ing. Ferrov. 2009, 9, 723–767. [Google Scholar]
- Ricci, S.; Capodilupo, L.; Tombesi, E. Discrete Events Simulation of Intermodal Terminals Operation: Modelling Techniques and Achievable Results. Civ. Comp. Proc. 2016. [Google Scholar] [CrossRef]
- Fang, J.; Yan, D.; Qiao, J.; Xue, J. DADA: A Large-scale Benchmark and Model for Driver Attention Prediction in Accidental Scenarios. arXiv 2019, arXiv:1912.12148. [Google Scholar]
- Lin, S.; Wang, K.; Yang, K.; Cheng, R. KrNet: A Kinetic Real-Time Convolutional Neural Network for Navigational Assistance. Lecture Notes in Computer Science. 2018. Available online: https://link.springer.com/book/10.1007/978-3-319-94274-2 (accessed on 28 January 2021).
- CARBODIN. Car Body Shells, Doors and Interiors. Grant Agreement n. 881814. In H2020 Shift2Rail Joint Undertaking; European Commission: Bruxelles, Belgium, 2019. [Google Scholar]
|  | KINECT | LEAP MOTION | MYO BRACELET |
|---|---|---|---|
| Strengths |  |  |  |
| Weaknesses |  |  |  |
| Opportunities |  |  |  |
| Threats |  |  |  |
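Whichever capture device is used (Kinect, Leap Motion or the Myo bracelet), the recognized gesture ultimately has to be mapped onto a driving-desk command through an explicit confidence gate, since railway safety requirements forbid acting on uncertain recognitions. The mapping below is a hypothetical Python illustration; the gesture labels, commands and threshold are assumptions.

```python
from typing import Optional

# Hypothetical mapping from recognized gestures to desk commands.
GESTURE_TO_COMMAND = {
    "swipe_left": "previous_screen",
    "swipe_right": "next_screen",
    "open_palm": "acknowledge_alert",
    "fist": "request_manual_mode",
}

CONFIDENCE_THRESHOLD = 0.9  # assumption: reject uncertain recognitions

def gesture_to_command(gesture: str, confidence: float) -> Optional[str]:
    """Return a desk command only for known gestures recognized with high confidence."""
    if confidence < CONFIDENCE_THRESHOLD:
        return None
    return GESTURE_TO_COMMAND.get(gesture)

print(gesture_to_command("open_palm", 0.95))  # 'acknowledge_alert'
print(gesture_to_command("open_palm", 0.60))  # None (below threshold)
```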
| Name | Operating Features | Automation Levels | Vehicles |
|---|---|---|---|
| MAN PLATOONING | Driver assistance and control | Driver always keeps hands on the wheel | Trucks |
| HIGHWAY PILOT | Autopilot | Driver can manually engage autonomous driving | Trucks |
| AUTOPILOT 2.5 | Autopilot | Driver always keeps hands on the wheel | Cars |
| iNEXT-COPILOT | Easy switch to automatic mode | Driver can engage autonomous driving manually or by voice command | Cars |
| THE FALCO | Full automation | No human intervention; ergonomic HMI with control levers and touch screens for call-up and control | Ships |
| Name | Operating Features | Vehicles |
|---|---|---|
| PSCHITT-RAIL | Movement capture via eye trackers and physiological measurement sensors; six-degrees-of-freedom motion | Trains |
| SPICA RAIL | Increasing disturbances to evaluate human behavior, plus a supervision platform | Trains |
| OKTAL SYDAC | Exact replicas of the cab for a realistic driving experience | Trains |
| IFSTTAR-RAIL | Driving and rail traffic supervision simulation | Trains |
| IFSTTAR TS2 | Impact of internal and external factors on driver behavior, with a fixed-base cab HMI | Cars |
| NVIDIA DRIVE | Interface among environment, vehicles and traffic scenarios via open-platform sensors | Cars |
| VRX-2019 | Autonomous vehicle cockpit HMI reproduced by advanced sensors | Cars |
| CAE 7000XR | Full-flight simulation with adaptation to operator needs and easy access to advanced functions | Aircraft |
| CAE 3000 | Helicopter flight in normal and unusual/dangerous conditions | Helicopters |
| EXCALIBUR MP521 | Flight control with graphical user interface and data editor in a capsule with a six-axis motion system, visual and instrument touch displays | Aircraft |
| ALSIM | Flight training and interchangeable cockpit configuration with a high-performance visual system | Aircraft |
| iVISION | Semantic virtual cockpit design environment and validation of human-centered operations with an analysis module | Aircraft |
| MISSRAIL | Automated driving assistance combining accident scenarios (pedestrians, trains, cars) with human factor control | Integrated |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).