Statistical Predictors of Project Management Maturity
Abstract
1. Introduction
2. Background
2.1. Project Management Maturity
2.2. Project Management Maturity Models
2.3. Example of Project Management Maturity Model Application
2.4. Kerzner Project Management Maturity Model
Example of KPMMM in Practice
2.5. Organization Performance Measurement
2.6. Figueiredo Performance Measurement Model
3. Materials and Method
4. Results and Discussion
4.1. Respondents Profile
4.2. Results of the Project Management Maturity Assessment
5. Conclusions and Recommendations
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
CMMI | Capability Maturity Model Integration |
DA | Desirable Attribute |
DBSCAN | Density-Based Spatial Clustering of Applications with Noise |
FPMM | Figueiredo Performance Measurement Model |
GMM | Gaussian Mixture Model |
IT | Information Technology |
KPMMM | Kerzner Project Management Maturity Model |
MSE | Mean Squared Error |
OLS | Ordinary Least Squares |
OOT | Organization Operation Type |
OPM3 | Organizational Project Management Maturity Model |
OS | Organizational Structure |
OV | Observed Variable |
P2MM | PRINCE2 Maturity Model |
P3M3 | Portfolio, Program, and Project Management Maturity Model |
PCR | Principal Component Regression |
PLS | Partial Least Squares |
PM | Project Management |
PMBOK | Project Management Body of Knowledge |
PMI | Project Management Institute |
PMM | Project Management Maturity |
PMMM | Project Management Maturity Model |
PMO | Project Management Office |
PMS | Performance Management System |
PRINCE2 | Projects in Controlled Environments |
RR | Ridge Regression |
References
- Future-Focused Culture. 2020. Available online: https://www.pmi.org/learning/thought-leadership/pulse/pulse-of-the-profession-2020# (accessed on 5 February 2023).
- McLeod, L.; Doolin, B.; MacDonell, S.G. Perspective-based understanding of project success. Proj. Manag. J. 2012, 43, 68–86.
- Ika, L.A. Project success as a topic in project management journals. Proj. Manag. J. 2011, 40, 16–19.
- PMI’s Pulse of the Profession. 2018. Available online: https://www.pmi.org/-/media/pmi/documents/public/pdf/learning/thought-leadership/pulse/pulse-of-the-profession-2018.pdf (accessed on 5 February 2023).
- Celani de Souza, H.J. Sistema de Avaliação de Maturidade em Gerenciamento de Projetos Fundamentado em Pesquisa Quantitativa [A Project Management Maturity Assessment System Based on Quantitative Research]. Ph.D. Thesis, Universidade Estadual Paulista, Guaratinguetá, Brazil, 2011. Available online: http://hdl.handle.net/11449/103058 (accessed on 5 February 2023).
- PMI’s Pulse of the Profession. 2017. Available online: https://www.pmi.org/-/media/pmi/documents/public/pdf/learning/thought-leadership/pulse/pulse-of-the-profession-2017.pdf (accessed on 5 February 2023).
- Shao, J.M.R.; Turner, J.R. Measuring program success. Proj. Manag. J. 2012, 43, 37–49.
- Shenhar, A.J.; Dvir, D.; Levy, O.; Maltz, A.C. Project success: A multidimensional strategic concept. Long Range Plann. 2001, 34, 699–725.
- Andersen, E.S.; Jessen, S.A. Project maturity in organisations. Int. J. Proj. Manag. 2003, 21, 457–461.
- Kerzner, H. Project Management Best Practices, 4th ed.; Wiley: Hoboken, NJ, USA, 2018.
- Kerzner, H. Project Management: A Systems Approach to Planning, Scheduling and Controlling, 13th ed.; Wiley: Hoboken, NJ, USA, 2022.
- Cooke-Davies, T.J.; Arzymanow, A. The maturity of project management in different industries: An investigation into variations between project management models. Int. J. Proj. Manag. 2003, 21, 471–478.
- Yazici, H.J. The role of project management maturity and organizational culture in perceived performance. Proj. Manag. J. 2011, 40, 14–33.
- Aubry, M.; Hobbs, B.; Thuillier, D. The contribution of the project management office to organisational performance. Int. J. Manag. Proj. Bus. 2009, 2, 141–148.
- Ibbs, C.W.; Kwak, Y.H. Assessing project management maturity. Proj. Manag. J. 2000, 31, 32–43.
- Jiang, J.J.; Klein, G.; Hwang, H.; Huang, J.; Hung, S. Assessing project management maturity. Inf. Manag. 2004, 41, 279–288.
- Pasian, B.; Sankaran, S.; Boydell, S. Project management maturity: A critical analysis of existing and emergent factors. Int. J. Manag. Proj. Bus. 2012, 5, 146–157.
- Paulk, M.C. A history of the capability maturity model for software. Softw. Qual. Prof. 2009, 12, 5–19.
- Kwak, Y.H.; Ibbs, C.W. Project management process maturity (PM)² model. J. Manag. Eng. 2002, 18, 150–155.
- The Pathway to OPM3. 2004. Available online: https://www.pmi.org/learning/library/pathway-organizational-project-management-maturity-8221 (accessed on 5 February 2023).
- Crawford, J.K. Project Management Maturity Model, 4th ed.; CRC: Boca Raton, FL, USA, 2021.
- P3M3 | Portfolio, Programme, and Project Management Maturity Model | Axelos. 2004. Available online: https://www.axelos.com/for-organizations/p3m3 (accessed on 5 February 2023).
- Brookes, N.; Clark, R. Using maturity models to improve project management practice. In Proceedings of the 20th Annual Conference of the Production and Operations Management Society, Orlando, FL, USA, 1–4 May 2009. Available online: https://www.pomsmeetings.org/ConfProceedings/011/FullPapers/011-0288.pdf (accessed on 5 February 2023).
- Celani de Souza, H.J.; Salomon, V.A.P.; Sanches da Silva, C.E.; Aguiar, D.C. Project management maturity: An analysis with fuzzy expert systems. Braz. J. Prod. Oper. Manag. 2012, 9, 29–41.
- Burmann, A.; Meister, S. Practical application of maturity models in healthcare: Findings from multiple digitalization case studies. In Proceedings of the 14th International Joint Conference on Biomedical Engineering Systems and Technologies, Online, 11–13 February 2021. Available online: https://www.scitepress.org/PublishedPapers/2021/102286/pdf/index.html (accessed on 5 February 2023).
- CMMI Institute. Available online: https://cmmiinstitute.com/pars (accessed on 28 March 2023).
- Kerzner, H. Using the Project Management Maturity Model: Strategic Planning for Project Management, 3rd ed.; Wiley: Hoboken, NJ, USA, 2019.
- Ortiz-Barrios, M.; Miranda-De la Hoz, C.; López-Meza, P.; Petrillo, A.; De Felice, F. A case of food supply chain management with AHP, DEMATEL, and TOPSIS. J. Multi-Criteria Decis. Anal. 2019, 27, 104–128.
- Kumar, P.; Nirmala, R.; Mekoth, N. Relationship between performance management and organizational performance. Acme Intellects Int. J. Res. Manag. Soc. Sci. Technol. 2015, 9, 1–13.
- Schermerhorn, J.R., Jr.; Bachrach, D.G. Management, 14th ed.; Wiley: Hoboken, NJ, USA, 2020.
- Kaplan, R.S.; Norton, D.P. The balanced scorecard–Measures that drive performance. Harv. Bus. Rev. 1992, 70, 71–79. Available online: https://hbr.org/1992/01/the-balanced-scorecard-measures-that-drive-performance-2 (accessed on 5 February 2023).
- Ghalayini, A.M.; Noble, J.S.; Crowe, T.J. An integrated dynamic performance measurement system for improving manufacturing competitiveness. Int. J. Prod. Econ. 1997, 48, 207–225.
- Neely, A.; Richards, H.; Mills, J.; Platts, K.; Bourne, M. Designing performance measures: A structured approach. Int. J. Oper. Prod. Man. 1997, 17, 1131–1152.
- Figueiredo, M.A.D.; Macedo-Soares, T.D.L.A.; Fuks, S.; Figueiredo, L.C. Definição de atributos desejáveis para auxiliar a auto-avaliação dos novos sistemas de medição de desempenho organizacional [Definition of desirable attributes to aid the self-assessment of new organizational performance measurement systems]. Gest. Prod. 2005, 12, 305–315.
- Zhang, Y.; Li, S. High performance work practices and firm performance: Evidence from the pharmaceutical industry in China. Int. J. Hum. Resour. Man. 2009, 11, 2331–2348.
- Ricci, L. The Impact of Performance Management System Characteristics on Perceived Effectiveness of the System and Engagement. Master’s Thesis, San Jose State University, San Jose, CA, USA, 2016.
- Project Management Institute, Inc. A Guide to the Project Management Body of Knowledge (PMBOK® Guide), 7th ed.; PMI: Newtown Square, PA, USA, 2021.
- Kim, H.; Choi, I.; Lim, J.; Sung, S. Business Process-Organizational Structure (BP-OS) performance measurement model and problem-solving guidelines for efficient organizational management in an ontact work environment. Sustainability 2022, 14, 14574.
- Brown, C.J. A comprehensive organizational model for the effective management of project management. S. Afr. J. Bus. Manag. 2008, 39, 1–10.
- Perry, M.P. Business Driven PMO Setup; J. Ross Publishing: Fort Lauderdale, FL, USA, 2009.
- Gupta, D.A.K. Growth and challenges in service sector: Literature review, classification and directions for future research. Int. J. Manag. Bus. Stud. 2012, 2, 55–58. Available online: http://www.ijmbs.com/22/akgupta.pdf (accessed on 5 February 2023).
- Garvin, D.A. Building a learning organization. Harv. Bus. Rev. 1993, 71, 78–91. Available online: https://hbr.org/1993/07/building-a-learning-organization (accessed on 5 February 2023).
- Bititci, U.S.; Turner, T.; Begemann, C. Dynamics of performance measurement systems. Int. J. Oper. Prod. Man. 2000, 20, 692–704.
- Neely, A.; Mills, J.; Platts, K.; Richards, H. Performance measurement system design: Developing and testing a process-based approach. Int. J. Oper. Prod. Man. 2000, 20, 1119–1145.
- Neely, A.; Adams, C.; Kennerley, M. The Performance Prism: The Scorecard for Measuring and Managing Business Success; Prentice Hall: London, UK, 2002.
- Dixon, J.R.; Nanni, A.J., Jr.; Vollmann, T.E. The New Performance Challenge: Measuring Operations for World-Class Competition; Dow Jones–Irwin: Homewood, IL, USA, 1990.
- Christopher, W.F.; Thor, C.G. Handbook for Productivity Measurement and Improvement; Productivity Press: Cambridge, MA, USA, 1993.
- Thor, C.G. Ten rules for building a measurement system. Qual. Product. Manag. 1993, 9, 7–10.
- Forza, C. Survey research in operations management: A process-based perspective. Int. J. Oper. Prod. Man. 2002, 22, 152–194.
- Hair, J.F., Jr.; Black, W.C.; Babin, B.J.; Anderson, R.E. Multivariate Data Analysis, 8th ed.; Cengage: Andover, UK, 2019.
- Yeniay, O.; Goktas, A. A comparison of partial least squares regression with other prediction methods. Hacet. J. Math. Stat. 2002, 31, 99–111.
- Data Analysis, Statistical & Process Improvement Tools. 2023. Available online: https://www.minitab.com/en-us/ (accessed on 5 February 2023).
- The Future of Work. 2019. Available online: https://www.pmi.org/-/media/pmi/documents/public/pdf/learning/thought-leadership/pulse/pulse-of-the-profession-2019.pdf (accessed on 5 February 2023).
- Miao, J.; Forget, B.; Smith, K. Analysis of correlations and their impact on convergence rates in Monte Carlo eigenvalue simulations. Ann. Nucl. Energy 2016, 92, 81–95.
- Everitt, B.S.; Dunn, G. Applied Multivariate Data Analysis; Wiley: New York, NY, USA, 1991.
- Ulaga, W.; Reinartz, W.J. Hybrid offerings: How manufacturing firms combine goods and services successfully. J. Market. 2011, 75, 5–23.
- Dudoit, S.; Fridlyand, J. Bagging to improve the accuracy of a clustering procedure. Bioinformatics 2003, 19, 1090–1099.
- Gelbard, R.; Goldman, O.; Spiegler, I. Investigating diversity of clustering methods: An empirical comparison. Data Knowl. Eng. 2007, 63, 155–166.
- Beyond Agility. 2021. Available online: https://www.pmi.org/learning/thought-leadership/pulse/pulse-of-the-profession-2021 (accessed on 5 February 2023).
- Berssaneti, F.T.; Carvalho, M.M. Identification of variables that impact project success in Brazilian companies. Int. J. Proj. Manag. 2015, 33, 638–649.
- Success in Disruptive Times. 2021. Available online: https://www.pmi.org/-/media/pmi/documents/public/pdf/learning/thought-leadership/pulse/pulse-of-the-profession-2018.pdf (accessed on 5 February 2023).
Model | Proposal | Maturity Levels |
---|---|---|
Berkeley | William Ibbs and Young Kwak, 2000 | 5 |
CMMI | Carnegie Mellon University, 1984 | 5 |
KPMMM | Harold Kerzner, 2001 | 5 |
OPM3 | Project Management Institute, 2003 | 4 |
PM Solutions | Kent Crawford, 2002 | 5 |
P2MM | United Kingdom government, 1989 | 5 |
Company | Appraisal ID | Expiration | Maturity Level |
---|---|---|---|
IBM | 57090 | 29 June 2025 | 5 |
McKinsey | 63183 | 29 November 2025 | 1 and 2 |
Level 1 Embryonic | Level 2 Executive | Level 3 Line Management | Level 4 Growth | Level 5 Maturity |
---|---|---|---|---|
+8 | +10 | +10 | +3 | −4 |
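The scores above can be read against an achievement threshold, as a hedged sketch in Python. The threshold of +6 is an assumption drawn from Kerzner's Level 2 assessment convention (positive totals indicate strength at that level) and should be verified against the KPMMM instrument actually used:

```python
# Hedged sketch: interpreting KPMMM level scores against a pass threshold.
# Assumption: a total of +6 or higher means the level is considered achieved
# (threshold borrowed from Kerzner's Level 2 instrument, not stated in the table).
THRESHOLD = 6

levels = {
    "Level 1 Embryonic": 8,
    "Level 2 Executive": 10,
    "Level 3 Line Management": 10,
    "Level 4 Growth": 3,
    "Level 5 Maturity": -4,
}

achieved = {name: score >= THRESHOLD for name, score in levels.items()}
for name, ok in achieved.items():
    print(f"{name}: {'achieved' if ok else 'not yet achieved'}")
```

Under this assumed threshold, the example organization has consolidated the first three levels but not yet the growth and maturity levels.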
Author | Definition |
---|---|
John Schermerhorn, 1984 [30] | An assessment and evaluation process for the efficiency and effectiveness of people, resources, and technology. |
R. Kaplan and D. Norton, 1996 [31] | A metric system used to quantify the efficiency and effectiveness of an action. |
Alaa Ghalayini et al., 1997 [32] | A way for organizations to control process improvements to achieve their goals. |
Andy Neely et al., 1997 [33] | Allow decisions and actions to be performed based on information (data collection, data analysis, interpretation, and dissemination of the results) and quantify process efficiency and the effectiveness of past actions. |
M. Figueiredo et al., 2005 [34] | A set of people, methods, tools, and indicators structured to collect, describe, and represent data, the main goal of which is to generate information. |
Y. Zhang and S. Li, 2009 [35] | A systematic analysis and objective evaluation of the management of projects completed. |
Laura Ricci, 2016 [36] | Interrelated and independent performance management elements that influence one another to increase employee and organizational performance to ultimately enhance organizational effectiveness. |
Attribute | OV | Concepts |
---|---|---|
Learning | 7 | Information supply capability to contribute to knowledge and organizational behavior management [31,32,34,42]. |
Evaluation | 7 | Information supply capability for a global performance evaluation to identify issues and solutions [43,44]. |
Balancing | 5 | Information supply capability according to different performance dimensions to allow a multidimensional perception of organizational behavior [31,34,45]. |
Clarity | 10 | Information supply capability to use user-friendly indicators for all distinct users in different hierarchy levels for reliable decision making [40,45]. |
Agility | 4 | PMS capability to continuously monitor the external and internal organization environments for rapid decision making [32,43]. |
Flexibility | 6 | PMS capability to adapt quickly to organizational changes [32,43]. |
Monitoring | 3 | PMS capability to monitor external and internal organization environments for detecting issues and potential issues [32,43]. |
Integration | 4 | PMS capability to interact with the entire organization’s KPI, aligned with the strategies, tactics, and operational targets [32,43]. |
Alignment | 7 | PMS capability to use KPIs aligned to the organization’s strategy and process to have a clear perception of global performance [31,32,43,45]. |
Participation | 5 | PMS capability to allow stakeholders active participation in all project life cycles [32,34,46,47,48]. |
Causal relationship | 5 | PMS capability to allow inter-relationships among all KPIs to facilitate understanding strategies to ramp up the expected business results [31,44,45]. |
Metric | Description |
---|---|
Mean Squared Error (MSE) | It calculates the average squared difference between the predicted values and the actual values of the response variable. It provides a measure of the average prediction error, with smaller values indicating better predictive accuracy. |
Root Mean Squared Error (RMSE) | It is the square root of the MSE. It provides a measure of the average prediction error in the original units of the response variable, which can be more interpretable than MSE. |
Mean Absolute Error (MAE) | It calculates the average absolute difference between the predicted values and the actual values. It measures the average magnitude of the prediction errors without considering their direction. Smaller MAE values indicate better predictive accuracy. |
R-squared (coefficient of determination) | It measures the proportion of the variance in the response variable, which is explained by the predictor variables. In the context of predictive accuracy, it indicates how much of the variation in the response variable is captured by the model’s predictions. Higher R-squared values indicate better predictive accuracy. |
Mean Absolute Percentage Error (MAPE) | It calculates the average percentage difference between the predicted values and the actual values. It provides a measure of the average relative prediction error. MAPE is commonly used when the magnitude of the errors is important to consider. |
Accuracy, Precision, Recall, F1-score (for classification problems) | These metrics are used to evaluate the performance of classification models. Accuracy measures the proportion of correctly classified instances, precision quantifies the proportion of correctly predicted positive instances among all predicted positive instances, recall calculates the proportion of correctly predicted positive instances among all actual positive instances, and the F1-score is the harmonic mean of precision and recall. Higher values for these metrics indicate better predictive accuracy in classification tasks. |
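The regression metrics in the table follow directly from their definitions; the following minimal sketch implements them with only the standard library (the sample values are invented for illustration):

```python
# Reference implementations of the regression accuracy metrics above.
import math

def mse(y_true, y_pred):
    # Average squared difference between predictions and actuals.
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    # Square root of MSE, expressed in the original units of the response.
    return math.sqrt(mse(y_true, y_pred))

def mae(y_true, y_pred):
    # Average absolute prediction error, ignoring direction.
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def r_squared(y_true, y_pred):
    # Proportion of the variance in y_true captured by the predictions.
    mean_t = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_t) ** 2 for t in y_true)
    return 1 - ss_res / ss_tot

def mape(y_true, y_pred):
    # Average relative error, in percent (undefined if any actual is zero).
    return 100 * sum(abs((t - p) / t) for t, p in zip(y_true, y_pred)) / len(y_true)

actual = [3.0, 5.0, 2.0, 8.0]     # illustrative observed responses
predicted = [2.5, 5.5, 2.0, 7.0]  # illustrative model predictions
print(mse(actual, predicted), rmse(actual, predicted),
      mae(actual, predicted), r_squared(actual, predicted))
```

In practice a statistics package computes these directly, but the hand-rolled versions make the relationships explicit (e.g., RMSE is exactly the square root of MSE).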
Practice | Description |
---|---|
Hierarchical clustering | Often constructed using hierarchical clustering algorithms, which iteratively merge or split clusters based on similarity or dissimilarity measures. Hierarchical clustering can be agglomerative (bottom–up) or divisive (top–down). |
Visualization of clustering relationships | Provide a visual representation of the clustering relationships among objects or variables. Objects or variables are represented as leaves in the dendrogram, and branches indicate the merging or splitting of clusters at different levels of similarity or dissimilarity. The height or length of the branches in the dendrogram represents the magnitude of the dissimilarity or distance between the clusters being merged or split. |
Distance or similarity measures | A distance or similarity measure is calculated to quantify the dissimilarity or similarity between objects or variables. Common distance measures include Euclidean distance, Manhattan distance, or correlation-based distances. Similarity measures can be based on cosine similarity, Pearson correlation coefficient, or other similarity metrics, depending on the type of data and the research question. |
Cut-off threshold | Allows flexibility in choosing the desired number of clusters or groups by setting a cut-off threshold. The cut-off threshold determines at which dissimilarity or distance level the dendrogram is pruned, resulting in a specific number of clusters. Different cut-off thresholds can lead to different numbers of clusters and, therefore, different interpretations of the data. |
Interpretation and analysis | Provide insights into the inherent structure and relationships in the data. They can aid in identifying clusters or groups of similar objects or variables, detecting outliers, and assessing the stability of the clustering results. Dendrograms can also be used as a basis for further analysis, such as identifying characteristics or patterns within specific clusters or comparing clusters across different datasets. |
Variations and customization | Can be customized based on the specific needs and characteristics of the data. Various visualization techniques, such as circular dendrograms or interactive dendrograms, can be employed to enhance the representation and interpretation of the clustering relationships. Additionally, color coding, labeling, and annotation can be used to provide additional information or context within the dendrogram. |
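The agglomerative construction and cut-off threshold described above can be sketched in a few lines of standard-library Python. This is a didactic single-linkage, Euclidean-distance version with invented data; a real analysis would typically use a library such as scipy.cluster.hierarchy, which also renders the dendrogram:

```python
# Hedged sketch: bottom-up (agglomerative) clustering with a cut-off threshold.
# Single linkage and Euclidean distance are illustrative choices only.
import math

def agglomerate(points, cutoff):
    # Start with every point in its own cluster, then repeatedly merge the two
    # closest clusters (single linkage) until the nearest pair is farther apart
    # than the cut-off -- the code equivalent of pruning a dendrogram.
    clusters = [[p] for p in points]
    while len(clusters) > 1:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(math.dist(a, b) for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        d, i, j = best
        if d > cutoff:
            break  # dendrogram pruned at this dissimilarity level
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return clusters

# Two well-separated pairs of 2D points (illustrative data).
data = [(0.0, 0.0), (0.2, 0.1), (5.0, 5.0), (5.1, 4.9)]
groups = agglomerate(data, cutoff=1.0)
print(len(groups))
```

Raising the cut-off above the between-group distance would merge everything into one cluster, which is exactly the sensitivity to the threshold noted in the table.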
Clustering Method | Description |
---|---|
K-Means Clustering | It is a simple and widely used clustering algorithm. It partitions the data into K clusters, where K is a user-defined parameter. Each data point belongs to the cluster whose mean (centroid) is nearest to it. Strengths: Fast convergence, scalable, works well with large datasets and well-separated clusters. Weaknesses: Sensitive to initial cluster centers, does not handle non-spherical clusters well. |
Hierarchical Clustering | It builds a tree-like structure of nested clusters. It can be agglomerative (bottom–up) or divisive (top–down) in nature. Strengths: No need to specify the number of clusters beforehand, can handle non-spherical clusters, and provides a dendrogram for visualization. Weaknesses: Computationally expensive for large datasets, difficult to interpret for very large datasets. |
Density-Based Spatial Clustering of Applications with Noise (DBSCAN) | It groups together data points based on density and identifies noise points as outliers. It requires two parameters: the neighborhood radius (eps) and the minimum number of points within that radius required to form a cluster. Strengths: Robust to outliers, can discover clusters of arbitrary shapes, no need to specify the number of clusters, and efficient for large datasets. Weaknesses: Sensitivity to the choice of parameters, does not perform well for datasets with varying density. |
Gaussian Mixture Model (GMM) | GMM is a probabilistic model that represents data points as a mixture of several Gaussian distributions. It estimates the parameters of these distributions to fit the data. Strengths: Can handle clusters with different shapes, provides probabilistic cluster assignments, and allows soft clustering. Weaknesses: Sensitive to the initialization of parameters, can converge to local optima, and can be computationally expensive. |
Mean Shift Clustering | It is an iterative algorithm that shifts data points toward the mode of data distribution. It is non-parametric and can automatically determine the number of clusters. Strengths: Works well with non-spherical clusters, does not require specifying the number of clusters, and robust to outliers. Weaknesses: Computationally expensive, sensitivity to kernel bandwidth (a parameter), and might converge to local optima. |
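The first method in the table, K-means, is compact enough to sketch from scratch. This illustrative standard-library version uses fixed initial centroids, sidestepping the sensitivity to random initialization listed as a weakness (production code would use a library implementation with k-means++ seeding):

```python
# Hedged sketch of Lloyd's algorithm for K-means (illustrative data and
# fixed initial centroids; not the study's actual implementation).
import math

def kmeans(points, centroids, iters=20):
    for _ in range(iters):
        # Assignment step: each point joins the cluster of its nearest centroid.
        clusters = [[] for _ in centroids]
        for p in points:
            idx = min(range(len(centroids)), key=lambda i: math.dist(p, centroids[i]))
            clusters[idx].append(p)
        # Update step: move each centroid to the mean of its assigned points.
        centroids = [
            tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else cent
            for cl, cent in zip(clusters, centroids)
        ]
    return centroids, clusters

# Two obvious groups in 2D, K = 2.
data = [(1.0, 1.0), (1.2, 0.8), (0.9, 1.1), (8.0, 8.0), (8.2, 7.9)]
centroids, clusters = kmeans(data, centroids=[(0.0, 0.0), (9.0, 9.0)])
print(centroids)
```

The alternation of assignment and update steps is shared, in spirit, by GMM's expectation-maximization fitting; DBSCAN and mean shift instead derive clusters from local density and need no preset K.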
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Celani de Souza, H.J.; Salomon, V.A.P.; Sanches da Silva, C.E. Statistical Predictors of Project Management Maturity. Stats 2023, 6, 868–888. https://doi.org/10.3390/stats6030054