Development and Reliability Review of an Assessment Tool to Measure Competency in the Seven Elements of the Risk Management Process: Part Two—Evaluation of the Individual Results from the RISKometric Tool
Abstract
1. Introduction
1.1. Background
1.2. Research Programme
- The development of the risk scenario exercise;
- The degree of association between RISKometric and individual risk scenario exercise ratings (Part A);
- The degree of association between the control group’s risk scenario exercise ratings (Part B).
1.3. Scenario Planning Tasks
2. Materials and Methods
2.1. Participants
2.2. Procedure and Material
- E1 “establishing the context” competency was tested using ownership, roles, and scope issues to establish appropriate stakeholders and legal and regulatory frameworks.
- E2 “risk identification” competency was tested using downside and upside (opportunity) risks, competing trade and infrastructure requirements, multiple consequences, and other ambiguities such as fitness for work. The scenario also included financial penalties, compressed timeframes, and uncertain weather conditions.
- E3 “risk analysis” competency was measured by including a range of event likelihoods and the multiple consequences that could result from the hazards.
- E4 “risk evaluation” competency was measured on factors such as comparing the estimated levels of risk against the benchmark criteria, weighing potential benefits against adverse outcomes, determining whether the risk was acceptable or tolerable, and prioritising mitigation according to “so far as reasonably practicable” principles.
- E5 “risk treatment” competency was measured in terms of tolerating, treating, terminating or transferring the risk, in combination with the methodology to get the job done. Because the task could be undertaken in multiple ways (e.g., by fixed or extending ladder, platform ladder, trestle ladder, scaffolding, or elevated work platform), this element had the most extensive set of criteria for consideration. However, all elements in the risk management process were weighted equally.
- E6 “communication and consultation” competency was measured by whether the participant determined who should be engaged and which information sources were relevant.
- E7 “monitor and review” competency was measured by whether the participant checked that controls were in place and established their effectiveness for the given scenario.
- Read the risk scenario narrative in conjunction with the supporting diagram;
- Undertake a risk assessment by any means you wish, but you must do it without assistance from another person;
- You can take notes and use other resources to complete the risk assessment;
- You do not have to finish the task in one sitting;
- While undertaking the task, record your thought processes and decisions either using the camera or your voice recorder on your computer;
- It is important that you verbalise your thoughts and decisions while undertaking the task;
- Once finished, send the voice/video file to the researcher, and
- The process should not take more than 30 min, based on eight people completing a pilot risk scenario exercise with an average completion time of 27 min.
- Twenty submitted video files (8 with supporting notations);
- Four submitted voice files (2 with supporting notations), and
- Two submitted written responses only.
2.3. Analysis Strategy
- First-round completion of the risk scenario exercise (n = 26 participants), and
- Second-round completion of the risk scenario exercise by the control group (n = 8).
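Each participant’s exercise submission was scored per element and summarised by the median and quartiles reported in the Results tables. A minimal sketch of that summary step using only the Python standard library; the ratings below are illustrative values on the 0–5 scale, not the study data:

```python
from statistics import median, quantiles

def summarise(scores):
    """Return (median, Q1, Q3) for one element's ratings across participants."""
    q1, _, q3 = quantiles(scores, n=4, method="inclusive")
    return median(scores), q1, q3

# Illustrative 0-5 ratings for one element across eight participants.
ratings = [2, 1, 3, 2, 4, 1, 3, 2]
med, q1, q3 = summarise(ratings)
print(f"median={med:.2f}  Q1={q1:.2f}  Q3={q3:.2f}")
```

The `method="inclusive"` option interpolates quartiles from the observed range, which is the usual choice when the scores are a complete set of ratings rather than a sample.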
3. Results
3.1. The First Round of Individual Risk Scenario Exercise Results (n = 26)
3.1.1. Inter-Rater Reliability
3.1.2. Comparison of the First-Round Individual Risk Scenario Exercise Results with Study 1 Results (n = 26)
3.2. Risk Scenario Exercise Results, Round One and Two, Control Group (n = 8)
3.2.1. Inter-Rater Reliability—The Control Group—Round Two
3.2.2. Comparison of the Control Group’s Risk Scenario Exercise Results across Rounds One and Two (n = 8)
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
| General Criteria | Specific Criteria | Objective Evidence (0/P/F) | Score (0–5) |
|---|---|---|---|
| Did the participant identify the affected parties and the parties that affect the task? | | | |
| Did the participant identify the appropriate legal frameworks? | | | |
| Did the participant identify the criteria that risks will be evaluated against? | | | |
| Element | Observer | Median | Quartile Array 1 (Q1) | Quartile Array 3 (Q3) |
|---|---|---|---|---|
| E1—“establishing the context” | O1 | 2.00 | 1.00 | 3.00 |
| | O2 | 2.00 | 1.00 | 3.00 |
| E2—“risk identification” | O1 | 3.00 | 2.00 | 4.00 |
| | O2 | 3.00 | 2.00 | 4.00 |
| E3—“risk analysis” | O1 | 2.00 | 1.00 | 2.75 |
| | O2 | 1.00 | 0.25 | 2.00 |
| E4—“risk evaluation” | O1 | 0.00 | 0.00 | 2.00 |
| | O2 | 1.00 | 0.00 | 2.00 |
| E5—“risk treatment” | O1 | 2.00 | 1.00 | 2.75 |
| | O2 | 2.00 | 1.00 | 3.00 |
| E6—“communication and consultation” | O1 | 1.00 | 0.00 | 2.75 |
| | O2 | 1.00 | 0.00 | 2.00 |
| E7—“monitor and review” | O1 | 0.00 | 0.00 | 1.00 |
| | O2 | 0.00 | 0.00 | 1.00 |
| Element | Rater | Correlation Coefficient (r) | Median | Quartile Array 1 (Q1) | Quartile Array 3 (Q3) |
|---|---|---|---|---|---|
| Context | Self | 0.074 | 3.00 | 2.25 | 4.00 |
| | Upline | 0.391 | 3.00 | 3.00 | 4.00 |
| | Peers | 0.864 ** | 2.00 | 1.00 | 3.00 |
| | Downline | 0.953 ** | 2.00 | 2.00 | 3.00 |
| | Exercise | 1 | 2.00 | 1.00 | 3.00 |
| Risk identification | Self | 0.225 | 3.00 | 3.00 | 4.00 |
| | Upline | 0.508 | 4.00 | 3.00 | 4.00 |
| | Peers | 0.891 ** | 2.00 | 2.00 | 3.00 |
| | Downline | 0.908 ** | 3.00 | 2.00 | 4.00 |
| | Exercise | 1 | 3.00 | 2.00 | 4.00 |
| Risk analysis | Self | 0.110 | 3.00 | 2.25 | 4.00 |
| | Upline | 0.626 ** | 4.00 | 2.00 | 4.00 |
| | Peers | 0.922 ** | 1.00 | 1.00 | 2.00 |
| | Downline | 0.943 ** | 1.50 | 1.00 | 2.75 |
| | Exercise | 1 | 1.00 | 1.00 | 2.00 |
| Risk evaluation | Self | 0.189 | 3.00 | 2.00 | 4.00 |
| | Upline | 0.329 | 3.00 | 2.50 | 4.00 |
| | Peers | 0.763 ** | 1.00 | 1.00 | 2.00 |
| | Downline | 0.987 ** | 1.50 | 1.00 | 2.00 |
| | Exercise | 1 | 1.00 | 0.00 | 2.00 |
| Risk treatment | Self | 0.003 | 3.00 | 3.00 | 4.00 |
| | Upline | 0.286 | 3.00 | 2.00 | 4.00 |
| | Peers | 0.804 ** | 2.00 | 1.00 | 2.00 |
| | Downline | 0.867 ** | 2.00 | 1.00 | 3.00 |
| | Exercise | 1 | 2.00 | 1.00 | 3.00 |
| Communication and consultation | Self | −0.211 | 3.00 | 2.00 | 4.00 |
| | Upline | 0.266 | 4.00 | 2.50 | 4.00 |
| | Peers | 0.676 ** | 1.00 | 1.00 | 2.00 |
| | Downline | 0.846 ** | 1.00 | 1.00 | 2.00 |
| | Exercise | 1 | 1.00 | 0.00 | 2.00 |
| Monitoring and reviewing | Self | −0.247 | 3.00 | 2.00 | 4.00 |
| | Upline | 0.315 | 3.00 | 2.00 | 4.00 |
| | Peers | 0.690 ** | 1.00 | 1.00 | 2.00 |
| | Downline | 0.751 ** | 1.00 | 1.00 | 1.75 |
| | Exercise | 1 | 0.00 | 0.00 | 1.00 |
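The correlation coefficients above measure how closely each rater group’s RISKometric scores track the exercise scores. Because the scores are ordinal (0–5), a rank correlation is the natural fit; the paper reports r without naming the method here, so Spearman’s rho is an assumption in this sketch, and the score vectors are illustrative, not the study data:

```python
def _ranks(values):
    """Average ranks (1-based), with tied values sharing the mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # Extend j over the run of values tied with values[order[i]].
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of 1-based positions i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho: the Pearson correlation of the two rank vectors."""
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Illustrative 0-5 scores only (e.g., peer RISKometric vs. exercise ratings).
peers = [2, 3, 1, 4, 2, 5, 3, 1]
exercise = [2, 2, 1, 4, 3, 5, 3, 0]
rho = spearman(peers, exercise)
```

A rho near 1 would mirror the strong peer and downline associations reported in the table, while values near 0 would mirror the weak self-rating associations.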
| Element | Observer | Median | Quartile Array 1 (Q1) | Quartile Array 3 (Q3) |
|---|---|---|---|---|
| E1—“establishing the context” | O1 | 2.00 | 1.00 | 3.00 |
| | O2 | 2.00 | 1.00 | 2.00 |
| E2—“risk identification” | O1 | 3.00 | 2.00 | 3.25 |
| | O2 | 2.00 | 2.00 | 3.00 |
| E3—“risk analysis” | O1 | 1.00 | 0.00 | 1.00 |
| | O2 | 1.00 | 0.00 | 1.25 |
| E4—“risk evaluation” | O1 | 0.00 | 0.00 | 0.00 |
| | O2 | 0.00 | 0.00 | 0.25 |
| E5—“risk treatment” | O1 | 1.50 | 1.00 | 2.25 |
| | O2 | 2.00 | 2.00 | 3.00 |
| E6—“communication and consultation” | O1 | 0.00 | 0.00 | 1.00 |
| | O2 | 0.50 | 0.00 | 1.00 |
| E7—“monitor and review” | O1 | 0.00 | 0.00 | 0.00 |
| | O2 | 0.00 | 0.00 | 0.25 |
| Element | Round | Median | Quartile Array 1 (Q1) | Quartile Array 3 (Q3) |
|---|---|---|---|---|
| E1—“establishing the context” | R1 | 1.50 | 1.00 | 3.00 |
| | R2 | 2.00 | 1.00 | 2.00 |
| E2—“risk identification” | R1 | 3.00 | 2.00 | 3.25 |
| | R2 | 2.00 | 2.00 | 3.00 |
| E3—“risk analysis” | R1 | 1.00 | 0.75 | 1.00 |
| | R2 | 1.00 | 0.00 | 1.00 |
| E4—“risk evaluation” | R1 | 0.50 | 0.00 | 1.00 |
| | R2 | 0.00 | 0.00 | 0.25 |
| E5—“risk treatment” | R1 | 1.50 | 1.00 | 2.25 |
| | R2 | 2.00 | 1.75 | 3.00 |
| E6—“communication and consultation” | R1 | 0.00 | 0.00 | 0.50 |
| | R2 | 0.00 | 0.00 | 1.00 |
| E7—“monitor and review” | R1 | 0.00 | 0.00 | 0.25 |
| | R2 | 0.00 | 0.00 | 0.25 |
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Marling, G.; Horberry, T.; Harris, J. Development and Reliability Review of an Assessment Tool to Measure Competency in the Seven Elements of the Risk Management Process: Part Two—Evaluation of the Individual Results from the RISKometric Tool. Safety 2022, 8, 71. https://doi.org/10.3390/safety8040071