Development of a Test Bench for Fault Diagnosis in the Caution and Warning Panels of the UH-60 Helicopter
Abstract
1. Introduction
2. Context and Previous Work
3. Materials and Methods
3.1. System Design Processes
3.1.1. Definition and Baseline of Stakeholder Expectations
3.1.2. Generation and Baseline of Technical Requirements
3.1.3. Logical and Behavioral Decomposition
3.2. Technical Management Processes
3.3. Description of the Diagnostic Logic
4. Results
4.1. Product Implementation
4.2. Product Verification and Validation
- Reference value: the nominal voltage stated in the technical manual of the UH-60 helicopter, where such a reference is available;
- Measurements 1–4: experimental data collected from two in-service panels (two measurements each) under a controlled environment;
- Measurement 5: data from a panel with a known failure;
- Normal operating mean: the arithmetic mean of measurements 1–4, representing the expected average behavior under normal conditions;
- Normal operating range: the difference between the maximum and minimum values recorded across measurements 1–4, indicating signal dispersion;
- Variation vs. reference: the absolute difference between the normal operating mean and the reference value from the technical manual, highlighting possible discrepancies;
- Variation vs. faulty panel: the absolute difference between the normal operating mean and the faulty panel value, used to evaluate sensitivity to faults;
- Standard deviation normal operation: the dispersion of the signal under normal system conditions (measurements 1–4). Low values (<0.02 VDC) confirm high consistency and repeatability;
- Standard deviation vs. reference: the deviation of the average signal from the value indicated in the technical manual of the UH-60 helicopter. In several cases, deviations exceeded ±2 VDC, suggesting a possible discrepancy with actual conditions;
- Standard deviation vs. failure condition: the variability between normal operation and the fault condition (measurement 5). High deviations, up to 5.63 VDC, indicate abnormal signal behavior related to system faults;
- Percent variation vs. reference: the proportional deviation of the observed mean from the UH-60 helicopter technical manual value, providing a normalized view of the discrepancy. When the manual lacks a reference value, the variation is recorded as 100%;
- Percentage variation vs. failure condition: the percentage change between the normal operating mean and the failure condition. Significant deviations (greater than 300% in night vision goggle modes) reflect sensitivity to failure and help identify critical failure indicators (a computational sketch of these metrics follows this list).
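
As a reading aid, the following minimal Python sketch shows how the metrics defined above can be reproduced for a single signal. It is not the authors' implementation; the aggregation choices (sample standard deviation for the four normal-operation readings, population standard deviation when the reference or faulty value is pooled with those readings, and the normal-operation mean as the denominator of the percentage variations) are inferred from the tabulated values and should be treated as assumptions, and all names are illustrative.

```python
import statistics

def signal_metrics(measurements, reference=None, faulty=None):
    """Statistics for one signal: four normal-operation readings (VDC),
    an optional manual reference value, and an optional faulty-panel reading."""
    mean = statistics.mean(measurements)
    metrics = {
        "mean": round(mean, 2),
        "range": round(max(measurements) - min(measurements), 2),
        # Sample standard deviation of the four normal-operation readings.
        "stdev_normal": round(statistics.stdev(measurements), 3),
    }
    if reference is not None:
        # Signed difference from the manual reference, as reported in the tables.
        metrics["variation_vs_reference"] = round(mean - reference, 2)
        # Population standard deviation with the reference pooled into the sample.
        metrics["stdev_vs_reference"] = round(statistics.pstdev(measurements + [reference]), 3)
        metrics["pct_vs_reference"] = round(abs(mean - reference) / abs(mean) * 100)
    else:
        # When the manual provides no reference, the variation is reported as 100%.
        metrics["pct_vs_reference"] = 100
    if faulty is not None:
        metrics["variation_vs_faulty"] = round(abs(mean - faulty), 2)
        metrics["stdev_vs_faulty"] = round(statistics.pstdev(measurements + [faulty]), 3)
        metrics["pct_vs_faulty"] = round(abs(mean - faulty) / abs(mean) * 100)
    return metrics

# Inverter Relay signal in dimming mode (readings taken from the tables below):
print(signal_metrics([24.18, 24.17, 24.16, 24.18], reference=25, faulty=12.57))
# mean 24.17, range 0.02, stdev_normal 0.01, variation_vs_reference -0.83,
# stdev_vs_reference 0.331, pct_vs_reference 3, variation_vs_faulty 11.6,
# stdev_vs_faulty 4.641, pct_vs_faulty 48
```
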
5. Analysis and Discussion
5.1. Tests
5.2. Cost Analysis
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
DURC Statement
Acknowledgments
Conflicts of Interest
References
- Kinnison, H.A.; Siddiqui, T. Aviation Maintenance Management, 2nd ed.; McGraw-Hill Education: New York, NY, USA, 2013. [Google Scholar]
- Li, R.; Wang, J.; Liao, H.; Huang, N. A new method for reliability allocation of avionics connected via an airborne network. J. Netw. Comput. Appl. 2015, 48, 14–21. [Google Scholar] [CrossRef]
- Raza, A.; Ulansky, V. Through-life maintenance cost of digital avionics. Appl. Sci. 2021, 11, 715. [Google Scholar] [CrossRef]
- Beshears, R. Diagnostics model enhanced maintenance. In Proceedings of the Annual Reliability and Maintainability Symposium (RAMS), Orlando, FL, USA, 28–31 January 2019. [Google Scholar] [CrossRef]
- Barut, S.; Dominguez, N. NDT diagnosis automation: A key to efficient production in the aeronautic industry. In Proceedings of the 19th World Conference on Non-Destructive Testing, Munich, Germany, 13–17 June 2016; Available online: http://ndt.net/?id=19184 (accessed on 18 January 2025).
- Mofokeng, T.; Mativenga, P.T.; Marnewick, A. Analysis of aircraft maintenance processes and cost. Procedia CIRP 2020, 90, 467–472. [Google Scholar] [CrossRef]
- Gunes, T.; Turhan, U.; Acikel, B. An assessment of aircraft maintenance technician competency. Int. J. Aviat. Sci. Technol. 2020, 1, 22–29. [Google Scholar] [CrossRef]
- Gordo, Y.A.M.; Loaiza, C.A.S. Diseño e implementación de un dispositivo para pruebas de un actuador electro hidráulico mecánico del AFCS del helicóptero UH-60. Ing. Reg. 2008, 5, 81–89. [Google Scholar] [CrossRef]
- AeroTrain Corp. AeroTrain Maintenance and Training Solutions. Available online: http://aerotrain.aero/page-products (accessed on 19 January 2025).
- Wang, H.; Durak, U.; Hartmann, S. Design and development of research aviation training device. In Proceedings of the ASIM 2018 Conference, Heilbronn, Germany, 8–9 March 2018; Available online: https://www.researchgate.net/publication/324114955_Design_and_Development_of_Research_Aviation_Training_Device (accessed on 19 January 2025).
- Laguna Benet, G. Test Bench Design for Smart Cooling Device and Implementation in LabVIEW. Bachelor’s Thesis, Universitat de Lleida, Cataluña, Spain, 2017. Available online: https://recercat.cat/handle/10459.1/60167 (accessed on 19 January 2025).
- Patterson-Hine, A.; Hindson, W.; Sanderfer, D.; Deb, S. A model-based health monitoring and diagnostic system for the UH-60 helicopter. In Proceedings of the AHS 57th Annual Forum, Washington, DC, USA, 9–11 May 2001. [Google Scholar]
- Cabrera-Arias, C.A.; Garay-Rairan, F.S.; Contreras-Gutiérrez, D.C.; Gómez-Vargas, O.E. Modelo para el desarrollo de proyectos de innovación en tecnología para la aviación: Caso de estudio banco de pruebas digital para las pruebas caza fallas de la GCU1. Rev. Esc. Adm. Neg. 2021, 89, 235–256. [Google Scholar] [CrossRef]
- Radaei, M. Mathematical Programming for Optimization of Integrated Modular Avionics; SAE Technical Papers Series; SAE International: Warrendale, PA, USA, 2021. [Google Scholar] [CrossRef]
- Madhava, G.; Mohan, T.M.; Chakraborty, S.; Nama, N. An approach to develop a remotely operated integrated avionics test suit. INCOSE Int. Symp. 2016, 26, 264–275. [Google Scholar] [CrossRef]
- He, M.; He, J.; Scherer, S. Model-based real-time robust controller for a small helicopter. Mech. Syst. Signal Process. 2021, 146, 107022. [Google Scholar] [CrossRef]
- Qi, G.; Huang, D. Modeling and dynamical analysis of a small-scale unmanned helicopter. Nonlinear Dyn. 2019, 98, 2131–2145. [Google Scholar] [CrossRef]
- D’Souza, M.; Kashi, R.N. Avionics self-adaptive software: Towards formal verification and validation. In Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2019; Volume 11319, pp. 3–23. [Google Scholar] [CrossRef]
- Annighoefer, B.; Roeseler, C.; Neumann, S.; Garmann, B.; Schaefer, M. The adaptive avionics platform. IEEE Aerosp. Electron. Syst. Mag. 2019, 34, 6–17. [Google Scholar] [CrossRef]
- Mu, X.; Li, G.; Yu, Y. Advanced helicopter cockpit ergonomic design concepts. In Complex Systems Design & Management; Springer: Cham, Switzerland, 2021; pp. 121–131. [Google Scholar] [CrossRef]
- División de Aviación Asalto Aéreo del Ejército Nacional de Colombia. Datos Estadísticos de Reparaciones del Caution Warning Panel Obtenidos del Sistema ERP SAP en el Período Comprendido Entre el 2020–2024; DAVAA: Bogotá, Colombia, 2024. [Google Scholar]
- Gemünden, H.G.; Lehner, P.; Kock, A. The project-oriented organization and its contribution to innovation. Int. J. Proj. Manag. 2018, 36, 147–160. [Google Scholar] [CrossRef]
- Hamid, S.S.; Nasir, M.H.N.M.; Sahibuddin, S.; Nor, M.K.M. Managing software projects with team software process (TSP). In Enterprise Resource Planning: Concepts, Methodologies, Tools, and Applications; IGI Global: Hershey, PA, USA, 2013; Volumes 1–3, pp. 84–117. [Google Scholar] [CrossRef]
- Jones, C. Software Methodologies: A Quantitative Guide, 1st ed.; CRC Press Taylor & Francis Group: Boca Raton, FL, USA, 2017. [Google Scholar] [CrossRef]
- O’Regan, G. Software process improvement. In Concise Guide to Software Engineering; Springer: Cham, Switzerland, 2017; pp. 239–254. [Google Scholar] [CrossRef]
- US Army. TM 1-1520-237-23&P Interactive Electronic Technical Manual for Field Maintenance Manual for Army Model Helicopters UH-60A, UH-60L, EH-60A, HH-60A, and HH-60L, Including Parts Information (EM 0013); Rev. 09; Department of the Army: Washington, DC, USA, 2017. [Google Scholar]
- Karasubasi, M.; Koktas, Y.; Sagirkaya, H. Model-based testing of aircraft interfaces. In Proceedings of the AUTOTESTCON 2022, National Harbor, MD, USA, 29 August 2022. [Google Scholar] [CrossRef]
- Song, Z. The Test and Launch Control Technology for Launch Vehicles, 1st ed.; Springer: Singapore, 2018. [Google Scholar] [CrossRef]
- Mirzaeian, M.; Langridge, S. Creating a virtual test bed using a dynamic engine model with integrated controls to support in-the-loop hardware and software optimization and calibration. Energies 2021, 14, 652. [Google Scholar] [CrossRef]
- Larios, D.F.; Barbancho, J.; Biscarri, F.; Monedero, I. A research study for the design of a portable and configurable ground test system for the A400M aircraft. Int. J. Aerosp. Eng. 2019, 2019, 5167575. [Google Scholar] [CrossRef]
- National Aeronautics and Space Administration (NASA). NASA Systems Engineering Handbook; NASA/SP-2007-6105 Rev1; NASA: Washington, DC, USA, 2016. Available online: https://ntrs.nasa.gov/search.jsp?R=20170001761 (accessed on 22 January 2025).
- Honour, E. Systems engineering return on investment. INCOSE Int. Symp. 2010, 20, 1422–1439. [Google Scholar] [CrossRef]
- International Council on Systems Engineering (INCOSE). Systems Engineering Handbook: A Guide for System Life Cycle Process and Activities, 4th ed.; INCOSE: San Diego, CA, USA, 2015; Available online: https://www.researchgate.net/publication/280570921 (accessed on 22 January 2025).
- Schön, E.M.; Thomaschewski, J.; Escalona, M.J. Agile requirements engineering: A systematic literature review. Comput. Stand. Interfaces 2017, 49, 79–91. [Google Scholar] [CrossRef]
- Dalpiaz, F.; Brinkkemper, S. Agile requirements engineering: From user stories to software architectures. In Proceedings of the IEEE International Conference on Requirements Engineering, Notre Dame, IN, USA, 16–20 August 2021; pp. 504–505. [Google Scholar] [CrossRef]
- Mumtaz, M.; Ahmad, N.; Ashraf, M.U.; Alghamdi, A.M.; Bahaddad, A.A.; Almarhabi, K.A. Iteration causes, impact, and timing in software development lifecycle: An SLR. IEEE Access 2022, 10, 65355–65375. [Google Scholar] [CrossRef]
- Kooli, M.; Kaddachi, F.; Di Natale, G.; Bosio, A.; Benoit, P.; Torres, L. Computing reliability: On the differences between software testing and software fault injection techniques. Microprocess. Microsyst. 2017, 50, 102–112. [Google Scholar] [CrossRef]
- Zhang, H.; Liu, W.; Xiong, H.; Dong, X. Analyzing data flow diagrams by combination of formal methods and visualization techniques. J. Vis. Lang. Comput. 2018, 48, 41–51. [Google Scholar] [CrossRef]
- Napolitano, E.V.; Masciari, E.; Ordonez, C. Integrating flow and structure in diagrams for data science. In Proceedings of the 2024 IEEE International Conference on Big Data (BigData), Washington, DC, USA, 15–18 December 2024; pp. 5769–5774. [Google Scholar] [CrossRef]
- Lanzotti, F.G.; Di Gironimo, G.; Korzeniowska, J.; Imbriani, V.; Mazzone, G.; You, J.H.; Marzullo, D. Systems engineering procedure for requirements management of divertor system of tokamak reactors. Fusion Eng. Des. 2023, 194, 113917. [Google Scholar] [CrossRef]
- Smagin, D.I.; Savelev, R.S.; Satin, A.A. Methods for the design of modern on-board systems of advanced aircraft. In Proceedings of the 2019 IEEE 10th International Conference on Mechanical and Aerospace Engineering (ICMAE), Brussels, Belgium, 22–25 July 2019; pp. 97–101. [Google Scholar] [CrossRef]
- Potthoff, N.; Feismann, T.; Denkelmann, R.; Kiffmeier, M.; Ruebartsch, M.; Frei, S. Optimization of power and signal distribution systems for advanced safety features. In Proceedings of the AmE 2020—Automotive meets Electronics, Dortmund, Germany, 10–11 March 2020. [Google Scholar]
- Bester, J.E.; Mpanda, A.; Polfliet, N.; Van Den Keybus, J.; Hazemeyer, G.M. Design of an electrical distribution test rig for validation of equipment for more electrical aircraft. In Proceedings of the 2013 15th European Conference on Power Electronics and Applications (EPE), Lille, France, 2–6 September 2013. [Google Scholar] [CrossRef]
- Al-Roomi, A.R. Fundamentals of power system protection. In Optimal Coordination of Power Protective Devices with Illustrative Examples; Wiley: Hoboken, NJ, USA, 2021; pp. 57–77. [Google Scholar] [CrossRef]
- Dello Sterpaio, L.; Nannipieri, P.; Marino, A.; Fanucci, L. Design of a spacewire/spacefibre EGSE system based on PXI-industry standard. In Proceedings of the 2019 IEEE International Workshop on Metrology for Aerospace (MetroAeroSpace), Turin, Italy, 19–21 June 2019; pp. 22–26. [Google Scholar] [CrossRef]
- Nerkar, S.S. Temperature control using PLC and LabVIEW integration. Int. J. Futur. Revolut. Comput. Sci. Commun. Eng. 2018, 4, 325–330. Available online: http://www.ijfrcsce.org (accessed on 8 February 2025).
- ISO 9241-210:2019; Ergonomics of Human-System Interaction—Part 210: Human-Centred Design for Interactive Systems. International Organization for Standardization: Geneva, Switzerland, 2019.
- Huang, D.; Yang, C.; Li, B. Rack-level cooling and server-level cooling. In Data Center Handbook: Plan, Design, Build, and Operations of a Smart Data Center; Wiley: Hoboken, NJ, USA, 2021; pp. 227–237. [Google Scholar] [CrossRef]
- Santos, A.S.; Madureira, A.M.; Varela, L.R. A self-parametrization framework for meta-heuristics. Mathematics 2022, 10, 475. [Google Scholar] [CrossRef]
- Mayorga-Ponce, R.B.; Reyes-Torres, S.B.; Baltazar-Téllez, R.M.; Martínez-Alamilla, A. Medidas de dispersión. Educ. Y Salud Boletín Científico Inst. De Cienc. De La Salud Univ. Autónoma Del Estado De Hidalgo 2021, 9, 77–79. [Google Scholar] [CrossRef]
- Brooke, J. SUS: A quick and dirty usability scale. In Usability Evaluation in Industry, 1st ed.; Taylor & Francis Group: London, UK, 1996; pp. 207–212. [Google Scholar] [CrossRef]
- Sauro, J. Measuring Usability with the System Usability Scale (SUS)—MeasuringU. Available online: https://measuringu.com/sus/ (accessed on 6 April 2025).
- Brigada de Aviación Ejército No. 32. Plan De Prototipado: Mejoramiento Del Proceso De Mantenimiento Del Panel De Precaución Y Advertencia De Los Helicópteros UH-60 Series, Por Medio De Una Solución Tecnológica; Ejército Nacional de Colombia: Nilo, Colombia, 2024. [Google Scholar]
Variable | Description |
---|---|
Repair capability | 80% of repairs are outsourced to external suppliers. |
Repair time | 4–6 months per unit (measured from shipment of the component to the external supplier until its receipt back at the organization). |
Diagnostic time | 1–3 days per unit. |
External supplier repair cost | $46,000,000 COP (average). |
New component cost | $75,000,000 COP (average). |
Optimal stock | 10 units in stock vs. 14 minimum units required. |
Category | Description | Acceptance Criteria |
---|---|---|
Functionality | Ability to perform automated tests based on maintenance manuals. | The system must automatically execute tests following the procedures specified in the manuals. |
Functionality | Generation of detailed and customized test reports. | The reports must include complete data and be customizable according to user needs. |
Functionality | User management with specific roles and permissions. | The system must allow the creation and management of users with different levels of access and permissions. |
Functionality | Integration with existing maintenance management systems. | The system must seamlessly integrate with the maintenance management systems already in use. |
Performance | Real-time data processing. | The system must process and analyze test data in real-time. |
Performance | High reliability under conditions of high operational load. | The system must maintain optimal performance even under high-load conditions. |
Usability | Intuitive and easy-to-use user interface. | Users must be able to navigate and use the system intuitively without extensive training. |
Usability | Accessibility for users with different levels of technical skills. | The system must be accessible to users with varying levels of technical skill. |
Security | Compliance with aeronautical security standards. | The system must comply with all applicable aeronautical security regulations and standards. |
Security | Protection against unauthorized access and cyberattacks. | The system must implement robust security measures to prevent unauthorized access and attacks. |
Scalability | Expandability to support more users and larger data volumes in the future. | The system must be scalable to accommodate growth in the number of users and data processed. |
Maintenance | Ease of software updates and maintenance. | The system must allow for efficient updates and maintenance with minimal downtime. |
Category | Requirement | Description |
---|---|---|
Must have | Intuitive user interface | The interface must be easy to use, allowing users to perform tests and generate reports without extensive training. |
Must have | Integration with existing systems | The software must be integrated with existing maintenance management systems to facilitate data transfer and operational consistency. |
Must have | Automated test execution | It must enable the automation of tests to improve efficiency and reduce human error. |
Must have | Security and compliance with aeronautical standards | The system must comply with all safety regulations and industry standards in the aeronautical sector. |
Must have | Generation of detailed and customizable reports | It must be capable of generating detailed reports that can be customized according to the specific needs of users. |
Should have | Multilingual functionality | The interface should support multiple languages to be used by personnel from different regions. |
Should have | Predictive analysis module | It should include predictive analysis functionalities to anticipate failures and improve preventive maintenance. |
Should have | Support for mobile devices | It should be accessible from mobile devices for greater flexibility in usage. |
Could have | Integration with augmented reality | It could include augmented reality capabilities to guide technicians in the maintenance process. |
Could have | Real-time chat functionality for technical support | It could incorporate a real-time chat system to provide instant technical support. |
Could have | Advanced user interface customization | It could allow users to customize the appearance and layout of the interface according to their individual preferences. |
Won’t have | Integration with financial management systems | It will not be integrated with financial management systems in this phase of the project, as it is not essential for the test bench operation. |
Won’t have | Gamification for training and education | Gamification will not be included at this stage, as the focus is on the operational functionality of the test bench. |
Won’t have | Big data analysis for advanced predictive maintenance | Big data analysis will not be addressed in this phase, as it requires additional resources and will be considered in future versions of the system. |
Signal (VDC) | Reference Value (Manual) | Measurement 1 (Panel 1) | Measurement 2 (Panel 1) | Measurement 3 (Panel 2) | Measurement 4 (Panel 2) | Measurement 5 (Faulty Panel) | Mean Normal Operation | Normal Operation Range | Normal Operation Variation vs. Reference | Normal Operation Variation vs. Failure |
---|---|---|---|---|---|---|---|---|---|---|
Normal condition (no capsules activated) | ||||||||||
Fault In/Out | No reference | −1.24 | −1.21 | −1.22 | −1.25 | −1.25 | −1.23 | 0.04 | −1.23 | 0.02 |
Inverter Relay | No reference | 5.54 | 5.57 | 5.53 | 5.56 | 7.77 | 5.55 | 0.04 | 5.55 | 2.22 |
BRT/DIM Advisory | 25 | 17.87 | 17.86 | 17.87 | 17.87 | 17.87 | 17.87 | 0.01 | −7.13 | 0.00 |
BRT/DIM Caution | 25 | 18.08 | 18.07 | 18.08 | 18.08 | 18.08 | 18.08 | 0.01 | −6.92 | 0.00 |
Master Warning | 25 | 17.75 | 17.74 | 17.75 | 17.75 | 17.75 | 17.75 | 0.01 | −7.25 | 0.00 |
Maximum brightness mode (all capsules activated) | ||||||||||
Fault In/Out | 22 to 26 | 17.86 | 17.86 | 17.74 | 17.86 | 15.46 | 17.86 | 0.02 | −8.15 | 2.40 |
Inverter Relay | No reference | 7.29 | 7.06 | 7.29 | 7.3 | 10.23 | 7.29 | 0.03 | 7.29 | 2.94 |
BRT/DIM Advisory | 25 | 17.87 | 17.8 | 17.79 | 17.87 | 17.13 | 17.86 | 0.03 | −7.14 | 0.73 |
BRT/DIM Caution | 25 | 18.08 | 17.99 | 17.99 | 18.08 | 17.14 | 18.00 | 0.03 | −7.00 | 0.86 |
Master Warning | 25 | 17.75 | 17.67 | 17.67 | 17.75 | 17.13 | 17.74 | 0.03 | −7.26 | 0.61 |
Activation of dimming mode | ||||||||||
Fault In/Out | No reference | 3.06 | 3.07 | 3.04 | 3.05 | −1.02 | 3.06 | 0.03 | 3.06 | 4.08 |
Inverter Relay | 25 | 24.18 | 24.17 | 24.16 | 24.18 | 12.57 | 24.17 | 0.02 | −0.83 | 11.60 |
BRT/DIM Advisory | 3 to 4.5 | 3.86 | 3.86 | 3.862 | 3.87 | 15.78 | 3.86 | 0.01 | −0.64 | 11.92 |
BRT/DIM Caution | 3 to 4.5 | 3.82 | 3.81 | 3.82 | 3.82 | 15.45 | 3.82 | 0.01 | −0.68 | 11.63 |
Master Warning | 3 to 4.5 | 3.82 | 3.83 | 3.81 | 3.82 | 15.43 | 3.82 | 0.02 | −0.68 | 11.61 |
Activation of NVG (Night Vision Goggles) mode | ||||||||||
Fault In/Out | No reference | 3.04 | 3.03 | 3.04 | 3.04 | −1.6 | 3.04 | 0.01 | 3.04 | 4.64 |
Inverter Relay | 25 | 24.14 | 24.16 | 24.14 | 24.16 | 12.56 | 24.15 | 0.02 | −0.85 | 11.59 |
BRT/DIM Advisory | 3 to 4.5 | 3.86 | 3.84 | 3.85 | 3.83 | 17.75 | 3.85 | 0.03 | −0.66 | 13.91 |
BRT/DIM Caution | 3 to 4.5 | 3.8 | 3.83 | 3.85 | 3.83 | 17.9 | 3.83 | 0.04 | −0.68 | 14.08 |
Master Warning | 3 to 4.5 | 3.8 | 3.83 | 3.85 | 3.84 | 17.62 | 3.83 | 0.04 | −0.67 | 13.79 |
Signal (VDC) | Standard Deviation Normal Operation | Standard Deviation Normal Operation vs. Reference | Standard Deviation Normal Operation vs. Failure | % Variation Normal Operation vs. Reference | % Variation Normal Operation vs. Failure |
---|---|---|---|---|---|
Normal condition (no capsules activated) | |||||
Fault In/Out | 0.018 | 0.492 | 0.02 | 100% | 2% |
Inverter Relay | 0.018 | 2.220 | 0.89 | 100% | 40% |
BRT/DIM Advisory | 0.005 | 2.853 | 0.00 | 40% | 0% |
BRT/DIM Caution | 0.005 | 2.769 | 0.00 | 38% | 0% |
Master Warning | 0.005 | 2.901 | 0.00 | 41% | 0% |
Maximum brightness mode (all capsules activated) | |||||
Fault In/Out | 0.010 | 3.258 | 0.96 | 46% | 13% |
Inverter Relay | 0.013 | 2.915 | 1.18 | 100% | 40% |
BRT/DIM Advisory | 0.015 | 2.857 | 0.29 | 40% | 4% |
BRT/DIM Caution | 0.015 | 2.799 | 0.35 | 39% | 5% |
Master Warning | 0.015 | 2.905 | 0.24 | 41% | 3% |
Activation of dimming mode | |||||
Fault In/Out | 0.013 | 1.222 | 1.63 | 100% | 133% |
Inverter Relay | 0.010 | 0.331 | 4.64 | 3% | 48% |
BRT/DIM Advisory | 0.005 | 0.255 | 4.77 | 16% | 308% |
BRT/DIM Caution | 0.005 | 0.273 | 4.65 | 18% | 305% |
Master Warning | 0.008 | 0.272 | 4.64 | 18% | 304% |
Activation of NVG (Night Vision Goggles) mode | |||||
Fault In/Out | 0.005 | 1.215 | 1.86 | 100% | 153% |
Inverter Relay | 0.012 | 0.340 | 4.64 | 4% | 48% |
BRT/DIM Advisory | 0.013 | 0.262 | 5.56 | 17% | 362% |
BRT/DIM Caution | 0.017 | 0.270 | 5.63 | 18% | 368% |
Master Warning | 0.017 | 0.267 | 5.52 | 17% | 360% |
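
For the discussion that follows, these statistics can also feed a simple screening rule: a reading is suspicious when it drifts from the normal-operation mean by more than a tolerance band derived from the observed dispersion. The sketch below illustrates that idea only; the band parameters are illustrative assumptions and are not thresholds prescribed by the UH-60 technical manual or by the authors.

```python
def flag_signal(value, normal_mean, normal_stdev, k=3.0, min_band_vdc=0.5):
    """Return (is_anomalous, deviation) for a single reading.

    The tolerance band is k standard deviations around the normal-operation
    mean, with a minimum width in VDC so that near-zero dispersion
    (stdev around 0.01 VDC in the tables) does not trigger false alarms.
    k and min_band_vdc are illustrative tuning parameters."""
    band = max(k * normal_stdev, min_band_vdc)
    deviation = abs(value - normal_mean)
    return deviation > band, deviation

# Master Warning signal in NVG mode: normal mean 3.83 VDC, stdev 0.017 VDC (tables above).
# The faulty-panel reading of 17.62 VDC is flagged; a nominal 3.84 VDC reading is not.
for reading in (17.62, 3.84):
    anomalous, dev = flag_signal(reading, normal_mean=3.83, normal_stdev=0.017)
    print(f"{reading:6.2f} VDC -> anomalous={anomalous}, deviation={dev:.2f} VDC")
```
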
Category | Description | Criteria |
---|---|---|
Satisfaction | How much the user likes the digital interface and ergonomics of the test bench. | Likert scale; Very dissatisfied (1) to Very satisfied (5). |
Usability | Ability of the test bench to be employed by users to accomplish their tasks. | System Usability Scale (SUS) [51]. |
Efficiency | Reduction in the time required to repair the component when using the test bench. | Likert scale; Very significant (1) to Not significant (5). |
Understandability | How difficult it is to understand the information in the diagnostic report provided by the test bench. | Likert scale; Very easy (1) to Very difficult (5). |
Information quality | Whether the information provided by the test equipment supports the decisions needed to repair the component. | Likert scale; Strongly disagree (1) to Strongly agree (5). |
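
Usability in the table above is scored with the System Usability Scale [51]. As a reference for readers, the following sketch applies the standard SUS scoring formula to one questionnaire; the responses shown are purely illustrative and are not data from the study.

```python
def sus_score(responses):
    """Standard SUS scoring: ten responses on a 1-5 scale; odd-numbered items
    contribute (response - 1), even-numbered items contribute (5 - response);
    the sum is multiplied by 2.5 to give a 0-100 score."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires ten responses on a 1-5 scale")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)  # index 0 is item 1 (odd-numbered)
                for i, r in enumerate(responses))
    return total * 2.5

# One illustrative questionnaire (not study data):
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # 85.0
```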