Sustainability
  • Article
  • Open Access

29 September 2020

Evaluation of Key Performance Indicators (KPIs) for Sustainable Postgraduate Medical Training: An Opportunity for Implementing an Innovative Approach to Advance the Quality of Training Programs at the Saudi Commission for Health Specialties (SCFHS)

1 Saudi Commission for Health Specialties, Riyadh 11614, Saudi Arabia
2 College of Medicine, King Saud Bin-Abdul-Aziz University for Health Sciences, Jeddah 14611, Saudi Arabia
3 Urology Section, Department of Surgery, King Abdulaziz Medical City, Ministry of National Guard, Jeddah 14815, Saudi Arabia
4 King Abdulaziz University, Jeddah 21589, Saudi Arabia
This article belongs to the Special Issue Big Data, Knowledge Management and IoT: New Perspectives for New Challenges in Disruptive Times

Abstract

The Kingdom of Saudi Arabia is undergoing a major transformation in response to its revolutionary Vision 2030, in which healthcare reform is one of the top priorities. With the objective of improving healthcare and allied professional performance in the Kingdom to meet international standards, the Saudi Commission for Health Specialties (SCFHS) has recently developed a strategic plan that focuses on expanding training programs’ capacity to align with the increasing demand for the country’s healthcare workforce, providing comprehensive quality assurance and control to ensure that training programs uphold high quality standards, and providing advanced training programs benchmarked against international standards. In this research paper, we describe our approach to developing a general framework for key performance indicators (KPIs) and the related metrics, with the aim of contributing to new strategies for better, future-ready medical training. We present the results of a survey conducted in the Kingdom of Saudi Arabia (KSA) for the enhancement of the quality of postgraduate medical training. Recent developments in the field of learning analytics present an opportunity for utilizing big data and artificial intelligence in the design and implementation of socio-technical systems with significant potential social impact. We summarize the key aspects of the Training Quality Assurance Initiative and suggest a new approach for designing a new data and services ecosystem for personalized health professionals’ training in the KSA. The study also contributes to the theoretical knowledge on the integration of sustainability and medical training and education by proposing a framework that can enhance future initiatives from various health organizations.

1. Introduction

Sustainability in healthcare is a key initiative towards digital transformation. Numerous emerging technologies provide a novel ecosystem for data and services exploitation. Applied data science and sophisticated computational information processing in the healthcare domain define new highways for research and innovation. This research work aims to develop an overall unique quality framework for healthcare sustainability and technological innovation in the Kingdom of Saudi Arabia (KSA). It summarizes leading-edge research conducted by the Saudi Commission for Health Specialties (SCFHS).

1.1. The Saudi Commission for Health Specialties’ Quality Assurance Initiative for Sustainable Personalized Medical Training

The SCFHS has recently developed a strategic plan that focuses on expanding training programs’ capacity to meet the increasing demand for the country’s healthcare workforce, providing comprehensive quality assurance and control, ensuring that training programs uphold high quality, and benchmarking advanced training programs against international standards. This research paper summarizes a bold initiative towards the implementation of this strategic plan for more efficient patient-centric healthcare, as well as of an interactive data ecosystem for enhanced decision making in the context of the activities and priorities of the SCFHS.
The SCFHS was established by Royal Decree in 1993 to supervise and evaluate training programs, as well as to set regulations and standards for the practice of health and allied professions. Since its establishment, the number of residency and fellowship training programs has increased.
Across the country and the Gulf region, 1200 programs are conducted, covering 79 health specialties [1]. These newly initiated training programs respond to the anticipated increase in demand for health professionals and training needs over the years. Moreover, there will likely be an expansion in training programs over the coming years that will require sound development planning to ensure high-quality training for health professionals.
Some countries have taken the initiative to pursue quality measurement in competency-based postgraduate training [2,3,4]. Saudi Arabia, through the SCFHS, is among the list of those countries that developed a system that measures quality outcomes of postgraduate training.
The SCFHS has launched the Training Quality Assurance Initiative (TQAI) which aims to develop a system for setting quality standards, benchmarks, processes, and improvement guidelines for the postgraduate training programs.

1.2. Governance of Training Programs in SCFHS

The latest developments in the big data and analytics domain bring forward the significance of defining efficiency and performance metrics as a basis for the design of a data ecosystem for personalized value-adding services. The healthcare domain is one of the most critical with great social impact.
Key performance indicators (KPIs) serve as a way of guiding clinicians toward improved clinical practice and are being used in different fields of medicine [5,6,7]. Santana et al. [8] identified nine criteria that KPIs should meet (a minimal code sketch of how such a KPI record can be encoded follows the list). Specifically, KPIs should be:
  • Targeting important improvements (there is a direct connection with sustainability),
  • Precisely defined (measurable metrics, impose efficiency and enhanced decision making—they also provide a systematic way for monitoring performance over time with significant managerial implications in the medical sector),
  • Reliable,
  • Valid,
  • Implemented with risk adjustment,
  • Implemented with reasonable cost,
  • Implemented with low data collection effort,
  • Achieving results which can be easily interpreted, and
  • Global (overall evaluation).
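To make these criteria operational, each indicator can be stored as a structured record that a monitoring system can evaluate automatically. The following minimal Python sketch is our illustration only (it is not the SCFHS implementation, and all class and field names are hypothetical); it encodes a KPI with its identifier, domain, target, and measured value:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class KPI:
    """One key performance indicator: precisely defined, measurable,
    and easy to interpret, per the Santana et al. criteria."""
    code: str                      # e.g., "GMH QI 1.1", the identifier style used in Section 4
    name: str                      # short human-readable description
    domain: str                    # "reaction", "learning", or "training governance"
    target: float                  # target as a proportion, e.g., 0.80 for 80%
    higher_is_better: bool = True  # False for upper-bound targets such as burnout
    value: Optional[float] = None  # measured value; None while the data source is pending

    def achieved(self) -> Optional[bool]:
        """True/False once data exist; None while the source is unavailable."""
        if self.value is None:
            return None
        return self.value >= self.target if self.higher_is_better else self.value <= self.target

# Example: trainees' satisfaction, target 80%, measured 69% (see Section 4.2.1)
print(KPI("GMH QI 1.1", "Trainees' satisfaction", "reaction", 0.80, value=0.69).achieved())  # False
```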
In June 2017, the SCFHS’s Secretary General requested the formation of a Quality Indicators Committee (QIC) to assess the postgraduate training programs’ outcomes based on the set quality indicators. These indicators are measurable features that provide evidence reflecting quality and, hence, serve as a basis for raising the quality of training provided to health professionals.
The committee was formed through the collaborative efforts of the two main divisions of the SCFHS (i.e., Academic Affairs, and Planning and Corporate Excellence Administration). It consisted of the Chief of Academic Affairs, the Chief of Planning and Corporate Excellence, the Executive Director of Training (and Assistant), the Executive Director of Assessment (and Assistant), the Director of the Quality Department, and the Director of Knowledge Management. The objectives of the QIC are to:
  • Improve the quality of postgraduate medical training (PGMT),
  • Provide means for objective assessment of residency programs and training centers,
  • Provide guidance to residency programs and training centers periodically and objectively, and
  • Assist program directors in reviewing the conduct and educational quality of their programs.
Thus, this research paper presents the determined outcome measures based on the quality indicators to identify what elements needed to be modified, which areas needed to be improved, and/or what areas might be developed to fulfill the objectives of the SCFHS. Hence, the main aim of this research article is to evaluate the residency training programs being conducted throughout Saudi Arabia through the core set of preformed KPIs. The main objectives are to evaluate the quality of training programs across the nation, assess the degree of achievement of pre-determined KPIs, and provide baseline data that support decisions for the improvement plan.
As an ultimate objective, this research contributes a novel framework for the integration of KPIs in a sophisticated data and services ecosystem in the SCFHS that can be used in similar organizations worldwide.
This research paper is organized as follows. Section 2 integrates multidisciplinary research on analytics, KPIs, personalized medicine, and medical education. In Section 3, we present our integrated research methodology, with an emphasis on demographics, methods deployed, and research objectives. The analysis and key findings for the training programs in the SCFHS, including the statistical analysis, are presented in Section 4. The main findings are discussed in Section 5. The key contribution of our research, the SCFHS Decision Making Model for Training Programs, is then summarized in Section 6. Finally, in Section 7, we present the conclusions and future research aims for this work.

2. Critical Literature Review on KPIs in Healthcare Training Programs for Sustainable Healthcare

The analysis of performance and efficiency in training programs, technology-enhanced learning, and the systematic analysis of teaching, learning, and training behavior are key initiatives today. In fact, in recent years, a new multidisciplinary domain has emerged: learning and training analytics. It provides critical insights into the methodologies, measurement tools, and trusted key performance indicators that enhance decision making in the training and learning process [9,10,11]. In Table 1, we summarize some key approaches in the literature that inform our methodological approach.
Table 1. Overview of literature review for the perceptions and utilization of learning analytics and key performance indicators (KPIs) for learning and training.
What is more, a systematic evolution in the areas of big data analytics and data science [12,13] offers new unforeseen opportunities for the design, implementation, and functioning of data and services ecosystems, aiming to promote enhanced decision making. Within this diverse area, issues related to data governance, standardization of data, availability of data, and advanced data mining permit new insights and analytical capabilities. Furthermore, the social impact of big data research and the direct connection of sophisticated analytics to strategic management and decision making are also evident in various studies.
In a parallel discussion on sustainable higher education [14,15], researchers emphasize the need to measure the impact of investments in training programs and to maximize the return on investment in social capital. A bold literature section also communicates that socially inclusive economic development must be an intervention of innovative education and enhanced skills and capabilities of citizens.
Within the training domain, there is fast-developing literature related to learning analytics and key performance indicators that provide significant insights into our research. In the next paragraphs, we provide a compact synthesis of complementary approaches [16,17,18,19,20].
Selected research works in the domain of learning analytics and KPIs provide further insights, especially when dealing with the learning, teaching, and training process. Different methodological approaches emphasize diverse pedagogical approaches and interpret them in meaningful sets of KPIs and learning analytics [25,26,27,28]. Other researchers propose KPI interventions that integrate the execution of the learning process with management and decision making at the highest level within organizations [29,30,31,32,33,34]. In an effort to interpret the literature review of learning analytics and to provide the pillars of our research methodology that will be presented in the next section, some critical insights are useful.
In the context of the Saudi Commission for Healthcare Specialties, thousands of medical training programs are offered. Our interest in maintaining a total quality initiative incorporates the development and monitoring of a reliable, trusted, and efficient set of KPIs. In this research study, we emphasize exactly this aspect. We present the first run and execution of a full set of KPIs related to the total quality initiative. This is a constructive knowledge management approach associated with big data analytics research and significant implications for data science and sustainability.
The standardization of KPIs and learning analytics for measuring the attitudes of residents on postgraduate medical training programs in the Kingdom of Saudi Arabia is a significant effort towards enhanced decision making. In this research study, we analyze the impact of using KPIs in order to develop a framework for sustainable medical education. We also introduce, in Section 6, the SCFHS Training Quality Approach—Medical Training Quality Framework. In fact, we contribute to the body of knowledge of the learning analytics and KPI domain by attaching a managerial approach for total quality management in medical education.
Our approach is also related to the latest developments in data science. It also contributes to the sustainability domain and brings forward new challenges and requirements for the digital transformation of healthcare systems. The necessity of having flexible, sustainable postgraduate medical education requires the design and implementation of socio-technical services that interconnect knowledge, knowledge processes, knowledge providers, and stakeholders [35,36,37].
In the next section, we describe the first phase of the quality initiative in the SCFHS. We focus on the design, implementation, and interpretation of a new, innovative KPI and learning analytics framework for sustainable medical education. As explained in this critical literature review, the main purpose of this framework is to support the following seven complementary objectives:
  • Measurement tool for enhanced decision making
  • Learning behavior analysis and adjustment
  • Predicting performance and personalizing learning experience
  • Real-time monitoring tool of learning process
  • Customizing learning feedback
  • Standardization of instruction flow
  • Interoperable technology-enhanced learning systems

3. Research Methodology

The main aim of this research paper is to analyze the interconnections of quality assurance in medical training programs in the KSA as a key enabler for a big data ecosystem in the KSA for the provision of personalized medical training. Our research is directly related to the Vision 2030 for digital transformation and is also a bold initiative for analyzing how sustainable data science can promote integrated sophisticated medical services with social impact.
The three integrated research objectives of our study are summarized as follows.
Research Objective 1: KPI definitions for efficient medical training. The main aim is to define and to operationalize a set of KPIs for efficient medical training.
Research Objective 2: KPIs’ social impact and sustainability in medical education. The main aim is to understand and to interpret how a set of measurable, compact, and reliable KPIs can be used as the basis of a data ecosystem for efficient medical training.
Research Objective 3: Sustainable data science for medical education and digital transformation of healthcare. We analyze the main impact of our key contribution towards a long-term sustainable plan for the introduction of personalized medical training services for individuals and the community in the KSA.

3.1. Study Design

This research uses a prospective, analytical, cross-sectional study design and represents the quality outcome measures of training programs and services provided by the SCFHS in 2018. The key performance indicator list was created by experts in quality and health profession education at the SCFHS. The initial list consisted of 28 KPIs but, after in-depth review and discussion, the committee omitted five, and 23 KPIs were selected for the final list. The theoretical framework of these KPIs was based on the Kirkpatrick model, which is used to analyze the impact of training objectively, to assess how well trainees learned, and to improve their learning in the future [11]. The four-level training model, created in 1959, represents a sequence of ways to evaluate training programs. As one moves from one level to the next, the process becomes more difficult and time-consuming, but it also provides more valuable information. The KPIs are grouped into domains that are described in Table 2.
Table 2. The List of Generated KPIs and Their Descriptors.
There were 23 general KPIs created, defined, and agreed upon by the Quality Indicators Committee, which were intended to provide objective evidence in measuring the level of performance and its progress. Establishing a positive and trusting relationship with the postgraduate training stakeholders is at the core of the SCFHS reform strategy and success. To achieve this, all stakeholders were included in the process of reviewing the 23 agreed-upon KPIs in the GMH quality matrix. They were asked specifically whether these KPIs are measurable, usable, reliable, simple to understand, available (in terms of data), robust, and clearly defined. The selected KPIs were distributed to all stakeholders for review, feedback, and approval (20 December 2017).

3.2. Sources of Data

The sources of data collected for the 23 KPIs were numerous, including primary and secondary data collection methods.
Primary sources were self-administered, semi-structured, validated questionnaires, consisting of a series of closed and open-ended questions, distributed to program directors and trainees.
The secondary sources of data were collected from the academic affairs department databases at the Saudi Commission for Health Specialties (SCFHS), which includes databases from four different sections (admission, accreditation, training, and assessment sections databases). Table 3 represents the matrix of the data sources for each KPI.
Table 3. The Sources of Data for Each KPI.

3.3. Validity and Reliability of the Primary Data Collection Tool (i.e., Survey)

The questionnaire was validated first for content validity by content experts; face validity was then assessed with the help of a medical educationist, who found that the survey fulfilled its objectives and that the questions flowed in a logical sequence. For the reliability of the questionnaire, a pilot study was conducted with 40 participants, and the questionnaire was modified according to the respondents’ feedback. Cronbach’s alpha was calculated as above 0.7, which suffices for reliability.
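For reference, Cronbach’s alpha for a k-item scale is α = k/(k−1) · (1 − Σ item variances / variance of the total score). A minimal Python sketch of this computation follows; the pilot data below are randomly generated placeholders, not the study’s responses:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of each respondent's total score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical pilot: 40 respondents x 10 Likert items, mirroring the 40-participant pilot
rng = np.random.default_rng(seed=1)
pilot = rng.integers(1, 6, size=(40, 10))
print(f"alpha = {cronbach_alpha(pilot):.2f}")  # the study accepted alpha above 0.7
```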

3.4. Program Directors’ Survey

A well-designed interview questionnaire for program directors was developed through key informant interviews and extensive focus group discussions of the Quality Indicators Committee (QIC) panels. The questionnaire consisted of 53 questions structured into four main sections. The first section covered the demographic details of the program directors; it had 14 questions exploring personal information and the base of the participants’ practice. The second section, on supervision and training, contained 15 questions. The third section, on educational content, had five questions. The fourth section, on support and resources, had 19 questions.
Initially, the SCFHS had a list of 1145 program directors in the directory; after extensive checking and verification of the list, some duplication and wrong information was found, so the final list came to 1095 program directors in the country. During the survey period (December 2018 to February 2019), and following a consecutive sampling technique, all the 1097 program directors were invited. An online link was created for the survey and sent to all program directors through email, but only 464 replied. Another 252 program directors were contacted and interviewed through a computer-assisted telephone interview (CATI) by SCFHS personnel. The total number of program directors who responded was 716 (65.3%).

3.5. Trainees Survey

The survey questionnaires for residents were developed by the PGMT Quality Indicators Committee (QIC) to produce an error-free measure of care quality; measures should be based on characteristics of best practice such as validity, reliability, and transparency. The questionnaire was created on 25 September 2018, published on 1 October of the same year, and closed on 4 August 2019. The questionnaire consisted of six sections. Eight questions covered demographics; the second section’s 14 questions asked about the trainees’ educational activities; the third section’s three questions addressed satisfaction with the training program. Section four had eight questions on perception and personal experience. The fifth and sixth sections had three and five questions on research participation and satisfaction with the SCFHS, respectively.
There were a total of 13,688 residents working in different specialties throughout Saudi Arabia; only 3696 (27%) of the residents agreed to participate in the online survey. The trainers were left out of the survey due to time constraints.
A total of 41 questions, representing indicators of the training programs’ quality, were validated by expert and QIC panels for clarity and content relevance. The KPI working group is responsible for collecting and analyzing data each year and for making sure that the indicators remain precise and appropriate, in contrast to Toussaint et al.’s suggestion that reporting should be done on a quarterly basis [12].

3.6. Data Analysis

Data were analyzed using the statistical program SPSS, version 24. A test was considered significant if the p-value was < 0.05. All data were analyzed based on each KPI’s parameters and targets.

3.6.1. Data Management

After the data were collected and entered into Microsoft Excel, respondents’ data were rechecked for blank/empty or any typo errors, and then the file was exported to SPSS format for further management and analysis.
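A minimal pandas sketch of this data-management step is shown below; the file and column names are hypothetical, and the SPSS export uses the third-party pyreadstat package as one possible route into SPSS format:

```python
import pandas as pd
import pyreadstat  # third-party package that reads/writes SPSS .sav files

# Load the collected responses from Excel (hypothetical file and column names)
df = pd.read_excel("responses.xlsx")

# Recheck for blank/empty entries and obvious typos before analysis
print(df.isna().sum())  # missing values per column
df["gender"] = df["gender"].str.strip().str.capitalize()
df = df[df["gender"].isin(["Male", "Female"])]  # drop mis-keyed categories

# Export to SPSS format for further management and analysis
pyreadstat.write_sav(df, "responses.sav")
```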

3.6.2. Descriptive Statistics

For the presentation of descriptive statistics, categorical data (gender, region distribution, qualification, etc.) were reported as frequencies (n) and percentages (%), while numerical data such as age and years of experience were presented as mean ± standard deviation.
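These summaries map directly onto pandas operations; a short sketch with made-up respondent data:

```python
import pandas as pd

# Hypothetical respondent data, for illustration only
df = pd.DataFrame({
    "region": ["Central", "Western", "Central", "Eastern", "Central", "Western"],
    "age": [44, 39, 51, 47, 42, 45],
})

# Categorical variables: frequency (n) and percentage (%)
freq = df["region"].value_counts()
pct = (freq / len(df) * 100).round(1)
print(pd.DataFrame({"n": freq, "%": pct}))

# Numerical variables: mean ± standard deviation
print(f"age: {df['age'].mean():.1f} ± {df['age'].std(ddof=1):.1f}")
```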

3.6.3. Inferential Statistics

A Chi-square test was used to assess the association between two categorical variables, such as gender and region distribution.
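The same test can be reproduced from a contingency table with SciPy; in this sketch the counts are invented for illustration and do not come from the study data:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical gender x region contingency table (rows: male, female)
table = np.array([
    [220, 180, 160],  # male counts per region
    [ 90, 110,  70],  # female counts per region
])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")  # significant if p < 0.05
```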

4. Analysis and Main Findings

In this section, we provide the initial analysis of our data. We provide the statistical analysis for the key aspects of the research tools (surveys) discussed in the previous section. In Section 5, we will summarize the key interpretations and the impact of our research towards personalized medical training and the digital transformation of healthcare in the KSA.

4.1. Demographic Features of Our Respondents

4.1.1. Section One: Program Directors’ Basic Characteristics

There were 716 program directors who completed the questionnaire and were included in the final analysis, for an overall response rate of 65.3% (716/1097). The respondents were mainly Saudi nationals (647, 92%) and predominantly male (560, 71.7%) (Figure 1), with males more prevalent in the northern and southern regions (Figure 2).
Figure 1. Gender Distribution among Program Directors.
Figure 2. Gender Distribution of Program Directors in Each Country Region.
Almost half (320, 44.7%) of the program directors were in the age group between 41 and 50 years (Figure 3 and Figure 4).
Figure 3. Age Distribution among Program Directors.
Figure 4. Gender Distribution in Each Age Group among Program Directors.
The Saudi board was the most common highest qualification among our program directors (58.4%), followed by graduates of regional programs (15%), such as the Arab, Egyptian, or Syrian boards. Figure 5 describes the distribution of primary qualifications among program directors; the option “other” included boards in Australia, Pakistan, South Africa, etc. In Figure 6, we also provide an overview of the qualification distribution among different regions.
Figure 5. The Primary Qualification of the Program Directors.
Figure 6. Qualification Distribution among Different Regions.
The largest group of program directors (44%) were recently appointed, with less than two years in the job, while 15% have held the post for more than six years (see Figure 7). In Figure 8, we also show the time spent as program director by gender.
Figure 7. Time Spent in Years as Program Director.
Figure 8. Time Spent as Program Director by Gender.
Program directors who responded to our survey resided in over 27 cities. Riyadh was the most common city of residence (38.4%), followed by Jeddah (17.9%) (see Figure 9).
Figure 9. Program Directors’ City of Residence.
The respondents covered around 60 different appointed programs (Figure 10). Program directors from the discipline of medicine and its subspecialties, pediatrics, surgical specialties, dentistry, and OB/GYN made up over two-thirds of the respondents (Figure 11).
Figure 10. Programs of the Appointed Program Directors Surveyed.
Figure 11. Gender Distribution of Program Directors among Disciplines.
In Figures 12–21, we present additional characteristics and qualitative features of the participants in our quality initiative, including the age of program directors in each discipline (Figure 12) and the health sectors of the included program directors (Figure 13).
Figure 12. Age of Program Directors in Each Discipline.
Figure 13. Health Sectors for the Included Program Directors.
Figure 14. Other Appointed Administrative Positions for the Program Directors.
Figure 15. Percentage of Program Directors with Degree in Health Profession Education among Disciplines.
Figure 16. Percentage of Program Directors with Degree in Health Profession Education in Different Regions.
Figure 17. Gender Distribution among Trainees.
Figure 18. Distribution of Trainees among Disciplines.
Figure 19. Level of Trainees in Our Study Sample.
Figure 20. Distribution of Trainees among Different Regions in Saudi Arabia.
Figure 21. The Distribution of Trainees among Different Cities.
Nearly half of the program directors work in the Ministry of Health (49.4%), followed by the Ministry of Defense (16.2%); and only three directors work in the Royal Commission for Jubail and Yanbu.
Nearly half of the program directors hold other administrative positions at their centers, including head of a department or section (29.9%) (see Figure 14, below).
Figure 15 and Figure 16 present some further data about our participants. Over 27% of the PDs have other special qualifications, including medical education, hospital administration, public health, clinical epidemiology, and others, but only 8.9% of the program directors hold a degree or certificate in health profession education.

4.1.2. Trainees’ Demographic Data

There were 3696 trainees who participated in this self-administered, structured, closed, and open-ended online survey. The response rate of the trainees through the online link survey was 27%. There were 1932 (52.3%) male respondents and most of the respondents were Saudi nationals, 91.9% (see Figure 17).
The respondents represented all major disciplines; however, most of the trainees were in medical disciplines (90.9%) (see Figure 18, below).
Residents in training represented the most common respondents (94.5%), while fellows accounted for 5.5%. One-third of the residents were at the R2 level (1270, 34.4%), followed by 23.3% at the R3 level (n = 860) (see Figure 19, below).
Figure 20 shows that more than one-third of the trainees are in the Central Region (40.4%) of Saudi Arabia, followed by approximately one-third in the Western Region (34.5%); a very small number are in the Northern Region (0.4%).
The highest number of trainees work in the city of Riyadh (39.6%), followed by 20.1% in Jeddah; only 0.4% work in the city of Hail (see Figure 21, below).

4.2. Results of Key Performance Indicator (KPI) Domains

Reaction domain of Kirkpatrick
The initial set of KPIs was based on the reaction domain of the Kirkpatrick training evaluation model, which measures the value of the training program among stakeholders, i.e., trainees, trainers, and program directors. There were four KPIs developed based on this domain.

4.2.1. Percentage of Trainees’ Satisfaction (GMH QI 1.1)

This is a score-based KPI, with a target set at 80% satisfaction. It is an accumulative score over eight domains retrieved from the trainees’ questionnaire: residents’ satisfaction with academic activities in their training centers, with other residents in their programs, with the administrative component of their training, with the program (and recommending it to others), with their training center, with their competency, with their specialty, and with SCFHS functions. The calculated score for overall trainees’ satisfaction was 69%; more details are given in Figure 22. Thus, the target has not yet been achieved.
Figure 22. Trainees’ Satisfaction Scores.
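Read as an unweighted average (our assumption, since the exact weighting of the accumulative score is not specified), the overall score is simply the mean of the eight domain scores; the values below are placeholders chosen to average near the reported 69%:

```python
# Hypothetical per-domain satisfaction proportions for the eight domains listed above
domain_scores = {
    "academic activities": 0.71, "other residents": 0.78,
    "administration": 0.60, "program and recommendation": 0.68,
    "training center": 0.70, "competency": 0.72,
    "specialty": 0.75, "SCFHS functions": 0.58,
}
overall = sum(domain_scores.values()) / len(domain_scores)
print(f"overall trainees' satisfaction = {overall:.0%} (target 80%)")  # 69%
```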

4.2.2. Percentage of Trainers’ Satisfaction (GMH QI 1.2)

This KPI is defined as the total number of satisfied trainers divided by the total number of surveyed trainers in that period of time. The target score was set at an 80% satisfaction rate. Unfortunately, these data were unavailable; trainers were not surveyed last year.
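Most of the KPIs in this section share this “numerator divided by denominator in a period” form, so a single helper covers them; this is our illustrative generalization, not SCFHS code, and the counts below are made up:

```python
from typing import Optional

def ratio_kpi(numerator: Optional[int], denominator: Optional[int]) -> Optional[float]:
    """Generic ratio-style KPI, e.g., satisfied trainers / surveyed trainers.
    Returns None when the underlying data are unavailable, as for GMH QI 1.2."""
    if numerator is None or not denominator:
        return None
    return numerator / denominator

print(ratio_kpi(None, None))  # None: trainers were not surveyed in this cycle
print(ratio_kpi(552, 800))    # 0.69, i.e., a 69% rate on made-up counts
```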

4.2.3. Percentage of Program Directors’ (PDs’) Satisfaction (GMH QI 1.3)

This is a score-based KPI, with a target set at 80% satisfaction. This accumulative score contains four main satisfaction domains: satisfaction with training faculty; satisfaction with evaluation process; satisfaction with educational components; and satisfaction with the training center’s governance, education environment, and resources. The overall score for PDs’ satisfaction was calculated at 76%, slightly below our target value of 80%; this is explained in detail in Figure 23.
Figure 23. Program Directors’ Satisfaction Scores.

4.2.4. Percentage of Trainees’ Burnout (GMH QI 1.4)

This KPI was measured through the trainees’ questionnaire and defined as the number of trainees who feel burnout [38,39,40,41,42,43,44,45,46,47,48,49] divided by the total number of surveyed trainees in that period of time. The target value was set at a level equal to/below 10%. The calculated score was 66.7% and the breakdown score is demonstrated in Figure 24, Figure 25 and Figure 26.
Figure 24. The Prevalence of Burnout among Residents.
Figure 25. The Mean Burnout Scores According to the Trainees’ Level of Training.
Figure 26. The Mean Burnout Scores According to the Trainees’ City of Residency.
Learning domain of Kirkpatrick
The second set of KPIs focused on the learning domain, i.e., the capability of developing new knowledge, skills, and attitudes; it contains 13 different KPIs.

4.2.5. Percentage of PDs Who Attended PD Training Course Offered by the SCFHS (GMH QI 2.1)

This KPI is defined as the number of program directors who attended the course divided by the total number of program directors in the same period. No target value was set for this KPI. In total, 38.4% of program directors received such courses, an improvement over last year’s figure of 15% (see Figure 27 and Figure 28, below).
Figure 27. Percentage of Program Directors Who Received Orientation by Gender.
Figure 28. The Percentage of Orientation Received by Program Directors by Region.

4.2.6. Percentage of Trainers in PGMT Programs Who Successfully Completed SCFHS Training (GMH QI 2.2)

This KPI is defined as the total number of trainers certified by the SCFHS in 2018. The data source for this KPI was the Training Department Database. The target was set at 200. In 2018, 276 trainers completed the SCFHS training course, so this KPI achieved its target and is significantly higher than in 2017, when the score was 66% of the target.

4.2.7. Percentage of Surveyors Who Have Successfully Completed SCFHS’s Accreditation Training (GMH QI 2.3)

This KPI is defined as the total number of certified surveyors by SCFHS divided by the total number of surveyors. The target was set at a level of 70%. The calculated score for this year was 67%, as per the feedback from the Accreditation Department Database.

4.2.8. Percentage of Trainees’ Compliance with Minimal Procedure, Case Exposure Policies Required Competency Index (GMH QI 2.4)

This KPI’s target value was set at 90%; however, we have not yet identified the data source to be used, and the KPI has not yet been measured. We expect these data to become available through the One45 program by 2020 and will work towards that.

4.2.9. Percentage of Trainees Who Have Received Trainees’ Evaluation by Program in a Specific Period (GMH QI 2.5)

This KPI is defined as the total number of trainees who received their evaluation within two weeks of the end of every rotation (see Figure 29, below). The target was set at 100% compliance. Ideally, this score should be collected from the trainees but, unfortunately, the item was not included in their survey; therefore, the only available data were from the program directors’ survey. The mean compliance score was estimated at 74.2%. This compliance rate was nearly identical in each region, with p = 0.9 (ANOVA test).
Figure 29. The Satisfaction Rate of the Program Directors with the Compliance of the Training Faculty in Submitting Trainees’ Evaluations in Time.
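The regional comparison reported above can be reproduced with a one-way ANOVA in SciPy; the per-region compliance scores in this sketch are invented for illustration:

```python
from scipy.stats import f_oneway

# Hypothetical per-region compliance scores (%) from the program directors' survey
central = [75, 73, 76, 74, 72]
western = [74, 75, 73, 75, 74]
eastern = [73, 76, 74, 74, 75]
f_stat, p = f_oneway(central, western, eastern)
print(f"F = {f_stat:.2f}, p = {p:.2f}")  # a large p (e.g., 0.9) indicates no regional difference
```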

4.2.10. Percentage of Research Included in Curriculum (GMH QI 2.6)

This is defined as the number of programs that include research-related content. The target value was set at 70%. This KPI’s source of information is the Training Department Database but, unfortunately, these data are not yet available. Research participation was documented in 33.1% of the trainees’ surveys.

4.2.11. Percentage of Programs with Burnout Policy (GMH QI 2.7)

This KPI’s target value was set at a 100% compliance rate. The source of information was the program directors’ survey, which measured compliance with such a policy at 69%. In the future, this KPI should also draw on accreditation data.

4.2.12. Percentage of Compliance with Implementing Incorporated e-log System in Each Program (GMH QI 2.8)

This KPI is defined as the number of programs that are implementing an incorporated e-log system divided by the number of all programs. The target value was set at 30%. The source of such information is the Training Department Database. Unfortunately, the score is not available, as the data were not yet received.

4.2.13. Percentage of Trainees Who Fulfilled Their Promotion Criteria (GMH QI 2.9)

This KPI target was set at 70%, and the Training Department Database is the main source of this information. Unfortunately, the score is not available, as the data were not yet received.

4.2.14. Percentage of Trainees Who Passed the Board Exam (GMH QI 2.10)

This KPI target was set at a 70% pass rate in the board exam, and the Training Department Database (and Assessment Department Database) is the main source of this information. The calculated score for this KPI in 2018 was 79%.

4.2.15. Percentage of Programs That Incorporated Simulation in Their Curricula (GMH QI 2.11)

The target value of this KPI was set at 50% or more. At present, the sources of this information were the program directors’ survey (Q 3.1) and the trainees’ survey (Q10). The score from the program directors’ survey was only 3%, while the score from the trainees’ survey was 34.2%; the accumulative mean score was only 18.6%.

4.2.16. Percentage of Programs with Trainees Receiving Annual Master Rotation Plan (GMH QI 2.12)

This KPI is defined as the number of trainees receiving their annual master rotation plan before 1 October. The target value of this KPI was set above 90%. The source of this information was the program directors’ survey. Around 73% of the program directors apply for the annual master rotation plan for residents in their center early in the academic year, as per program directors’ feedback (see Figure 30 and Figure 31, below).
Figure 30. The Rate of Applying for Annual Rotation Plan for Residents Early in the Academic Year.
Figure 31. The Rate of Applying for Annual Rotation Plan for Residents Early in the Academic Year by Region.

4.2.17. Percentage of Programs in Compliance with the Annual Master Plan (GMH QI 2.13)

This KPI is defined as the number of trainees who completed their block rotation. The target value of this KPI was set above 90%. The source of this information was the program directors’ survey. The mean score of adherence to the master rotation plan set early in the academic year was 76.9%; this rate was similar across genders, types of training center, and regions (see Figure 32, below).
Figure 32. The Compliance Rate (Adherence) to the Master Rotation Plan Set Early in the Academic Year.
Training domain of Kirkpatrick
The final set of KPIs is focused on the training governance domain and it contains six different KPIs.

4.2.18. Percentage of Programs with Complete Goals and Objectives for Residency Programs (GMH QI3.1)

This is defined as the total number of programs with a clear statement outlining the goals and educational objectives of their residency program. The target value for this KPI was set at 80%. Clear goals and objectives for every rotation were provided to 87.7% of residents (see Figure 33, below). This was similar across genders and regions. However, program directors who had received orientation about the roles and responsibilities of the job were significantly better at providing residents with those objectives in every rotation.
Figure 33. The Rate of Providing the Residents with Goals and Objectives of Every Rotation among Program Directors Who Did/or Did Not Receive Orientation on Their Roles and Responsibilities.

4.2.19. Percentage of Completed Trainer Evaluations by Trainee per Program (GMH QI 3.2)

The target score for this KPI was set at 70%. The trainees had a chance to evaluate the training faculty in a structured process in 48.7% of cases (see Figure 34, Figure 35 and Figure 36, below). This did not differ significantly by gender, region, or type of program (independent or joint). However, a significant difference was found when the program director spent 2–3 h managing training activities compared with other time frames.
Figure 34. The Rate of Enabling the Trainees to Evaluate the Faculty in a Structured Process According to the Gender of Program Director.
Figure 35. The Rate of Enabling the Trainees to Evaluate the Faculty in a Structured Process by Region.
Figure 36. The Rate of Enabling the Trainees to Evaluate the Faculty in a Structured Process According to the Time Spent by Program Directors to Manage the Training Activities.

4.2.20. Percentage of Adherence to Accreditation Requirements (GMH QI 3.3)

This is defined as the rate of programs maintaining accreditation standards during the validity period. The target value of this KPI was set at over 90% compliance. The source of information is the Accreditation Department Database.

4.2.21. Percentage of PD Turnover Rate (GMH QI 3.4)

This is the number of program directors who did not complete their full term. The target was set below 10%. Unfortunately, these data are not available; the source of such information should be the Training Department Database.

4.2.22. Percentage of Accreditation Compliance Score (GMH QI 3.5)

This is the number of centers that met accreditation standards. The target value of this KPI is to be determined.

4.2.23. Percentage of Violations with the Matching Regulations (GMH QI 3.6)

This is the total number of matching violations in a given year. The target value was set below 1%. The source of this information is the Admission and Registration Database. The rate of matching violations for 2018 was 4%.
In Table 4, we provide a summary of the key performance indicator results.
Table 4. Summary of Key Performance Indicator Results.
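Table 4’s status column can be derived mechanically from each KPI’s target, direction, and measured value; the sketch below uses a few of the values reported above (the tuple layout is ours, not part of the published table):

```python
# (code, target, measured value or None, higher_is_better) for a few KPIs from Section 4.2
kpis = [
    ("GMH QI 1.1", 0.80, 0.69, True),    # trainees' satisfaction
    ("GMH QI 1.4", 0.10, 0.667, False),  # trainees' burnout (upper-bound target)
    ("GMH QI 2.10", 0.70, 0.79, True),   # board exam pass rate
    ("GMH QI 2.9", 0.70, None, True),    # promotion criteria: data not yet received
]
for code, target, value, higher in kpis:
    if value is None:
        status = "no data"
    elif (value >= target) if higher else (value <= target):
        status = "achieved"
    else:
        status = "not achieved"
    print(f"{code}: {status}")
```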

6. The SCFHS Sustainable Medical Education Framework

Our research study, based on the implementation of key performance indicators to benchmark, maintain, and improve the quality of postgraduate medical training, provides interesting insights into the integration of knowledge management, medical education, and big data analytics research for sustainability. In an effort to codify the key findings and generalize their implications, we present, in Figure 37, a proposed Sustainable Medical Education Framework. This framework can be used as a road map as well as a policy-making document for the digital transformation of the SCFHS.
Figure 37. The SCFHS Sustainable Medical Education Framework.
The basic assumption of our proposed Sustainable Medical Education Framework is the integration of four integral pillars of efficiency.
Sophisticated big data ecosystem: At the first layer, a sophisticated big data ecosystem provides a sustainable infrastructure for the aggregation and analysis of diverse data required for medical training and education as well as for advanced decision making in the context of the SCFHS and its stakeholders.
Data quality and governance allow transparency, collaboration, and effective communication for the optimization of medical training. Special services are also facilitating data integration and flow for the provision of value-adding services. Multi-source feedback for all the training programs in all the phases of the relevant workflow also permits flexible communication. A unified big data framework exploits a sophisticated KPI-driven process for measuring and improving performance in parallel with the execution of medical training.
A big data ecosystem requires:
  • Data quality;
  • Transparency, collaboration, and effective communication;
  • Data integration and flow;
  • Multi-source feedback; and
  • A big data unified framework
Quality and performance management: The second pillar in the Sustainable Medical Education Framework is an integral quality and performance management component. The definition, implementation, execution, and monitoring of diverse, multi-faceted KPIs provide a sophisticated quality assurance and monitoring component. The availability of a quality rating system, together with the exploitation of quality benchmarks, provides a transparent methodological way toward enhanced quality and performance management. The execution of strategies for quality assurance, as presented in this research, operates as an analytical meta-level for performance management. The set of KPIs that was designed and implemented offers an initial testbed for further execution. In future developments, additions beyond Kirkpatrick’s framework are possible; an emphasis on Bloom’s taxonomy of educational goals or on sophisticated learning analytics for active learning are just two possible directions.
The SCFHS quality and performance management should include
  • KPI management,
  • A quality rating system, and
  • Quality benchmarks.
Enhanced decision-making capability: The orchestration of the previous two layers, namely the big data ecosystem and quality and performance management, interconnects with the next capability in the Sustainable Medical Education Framework. Performance evaluation, monitoring, and control appear as transparent, ubiquitous processes that offer real-time knowledge and critical data for significant decisions in the SCFHS and, more broadly, in relevant organizations worldwide. This pillar also cultivates a continuous improvement culture oriented towards innovation and medical education impact. In the context of innovation, significant effort is invested in integrating emerging technologies and modern healthcare management into the medical education process. The provision of new services, e.g., a sophisticated Social Network of Medical Experts for personalized medical education, must be a priority. Furthermore, decision making must be a continuous driver of enhancement and improvement toward measurable education impact.
Enhanced decision making improves
  • Performance,
  • Monitoring and control,
  • Innovation,
  • Resource utilization, and
  • Education impact.
Sustainability in medical education: The final component in the Sustainable Medical Education Framework is the sustainability layer. All the investments and the initiatives presented in this research paper must be considered as developmental efforts that interconnect various performance capabilities in a long-term, viable, sustainable educational strategy for the healthcare domain.
Within the sustainability strategy, various actions could be critical milestones. The continuous sustainable investment in research competence of the members of SCFHS, as well as a parallel integration of the healthcare industry with academia and medical education institutions, can lead to a unique, innovative, sustainable Medical Education and Innovation Startup Ecosystem. Through this Saudi Medical Innovation Startup Ecosystem, the vision of digital transformation of the healthcare sector can be achieved in an easier way. In future research, we plan to operationalize the requirements for the Medical Innovation Startup Ecosystem and also specify some key projects in the context of the SCFHS. One of the key priorities within this sustainable vision is community development and enhancement through the so-called Saudi Social Network of Medical Experts in the context of SCFHS.
Sustainability includes
  • Saudi Social Network of the SCFHS,
  • Medical Education and Innovation Startup Ecosystem,
  • Research competence,
  • Social impact,
  • SCFHS Digital Transformation, and
  • Vision 2030
The implications of our study for the training quality department, in the context of the previously presented Sustainable Medical Education Framework, are significant. Specific and general actions towards efficiency improvements were evident in the survey data; they are presented in Figure 38 as our proposed Medical Training Quality Framework, which constructively interprets the key findings of the previous section.
Figure 38. The SCFHS Training Quality Approach—Medical Training Quality Framework.
Namely, four collaborative, integrative, value-adding processes enhance the medical training impact as well as the innovation capability and the social impact: training excellence, research excellence, a simulation and virtual reality cloud medical laboratory, and work–life balance. Together, these summarize socio-technical factors and contribute to greater medical education and training impact. In future research, we plan to investigate the determinants of this model further.
The exploitation of our recommended Medical Training Quality Framework can be performed within the context of the previously presented Sustainable Medical Education Framework. The provision of the data and services ecosystem in the SCFHS orchestrates a holistic management of training, research, and technological resources within a fertile life–work balance education context.
The first pillar is the training excellence strategy: a transparent, organization-wide strategy for the management of medical training and education as a key enabler of Vision 2030 goals in the KSA. The key dimensions of this approach are summarized as follows:
  • Integrated training excellence organization-wide;
  • A unified ecosystem of smart medical educational content and distance learning facilities;
  • A blended virtual and physical space for reading, study, and research;
  • Flexible procedures and cloud training services;
  • Advanced notification and counseling procedures;
  • Awareness campaigns; and
  • Institutional leadership.
The overall idea is that training must be designed, executed, and supported with an integrated strategy for effective educational active learning strategies, learning analytics for quality assurance, and sophisticated socio-technical systems.
The second pillar is related to research excellence. Promoting, supporting, and implementing total quality management of research excellence is a bold requirement for the evolution of the SCFHS towards the achievement of the Vision 2030 goals. The integration of research into medical training programs will require resources, but the impact will be high. Furthermore, the cultivation of a research culture, the systematic development of research skills, and the implementation of a virtual space for research skills development and research execution are additional aspects of the research excellence approach in the SCFHS. The ultimate objective is the enhancement of the visibility and impact of Saudi medical research, together with a well-designed publication and awareness program. The research excellence dimension incorporates, among others, the following services and key capabilities:
  • Research integration into training,
  • A unified research skills lab and research cloud service,
  • Best practices and standardization of the research process,
  • Collective research towards higher accomplishment,
  • Visibility and impact of Saudi medical research, and
  • An advanced publication program.
The third dimension is related to an open, cloud-based simulation lab with advanced virtual reality/augmented reality (VR/AR) capabilities. The implementation of this open educational and training space will significantly increase the exposure of the members and beneficiaries of the SCFHS to research, with a significant impact on research capability. Special effort must also be paid to the development and integration of VR/AR medical content for training within the open, cloud-based simulation lab. The key dimensions of the simulation lab are summarized as follows, at a general, abstract level:
  • Integrated open simulation lab;
  • Saudi Agora of Virtual and Augmented Reality Medical Lab and Content Laboratory; and
  • Virtual and augmented reality content for medical training and labs
The last dimension of the Medical Training Quality Framework is related to the human factor (work–life balance). This is a very sensitive area in our training quality strategy. The design of rewards for training efficiency and performance, together with time and space reserved for sophisticated research activities, is necessary for higher satisfaction rates and greater training impact. Counseling and a professional social network for Saudi medical experts promote the feeling of belonging to a greater community of achievers and contributors to the social good of healthcare in the KSA. The following is a partial list of considerations for work–life balance (human factors):
  • Rewards for training efficiency and performance,
  • Time and space reservation for sophisticated research activities,
  • Advanced professional social networking of medical experts, and
  • Counseling.
Our research contributes to the theoretical knowledge of the integration of sustainability and medical training and education in multiple ways. The proposed SCFHS Sustainable Medical Education Framework is a novel theoretical abstraction that facilitates management and innovation in medical training and education. From a theoretical point of view, it organizes and integrates multidisciplinary contributions from the epistemology of medical education and information systems research and constructs a managerial framework that focuses on the sustainability aspect of the phenomenon.
It is also one of the first studies worldwide that connects medical education and training with sustainability. We propose a concrete sustainability layer in our framework with emphasis on the following pillars:
  • Integration of social network of medical experts, towards continuous input and feedback from the community of the SCFHS;
  • Establishment of a Medical Education and Innovation Startup Ecosystem capable of bringing to life and launching smart services and applications for domains of special interest for the evolution of medical research and education;
  • The development of a continuous research competence developmental process integrated into the sustainability framework; and
  • The direct integration of medical education and training excellence to the digital transformation vision. The quality initiative described in this research study is a first proof of concept approach for the capacity of multidisciplinary approaches to promote this vision to functional managerial and training practices.
The next connection of our research study to sustainability relates to the impact of the key findings. It is our strategic decision to enhance all the medical training programs based on the findings and related recommendations of this study. This is a bold move towards sustainable medical education. The design of transparent and participatory managerial procedures will enhance the quality of the value proposition of the SCFHS to its members and to the Saudi economy and society.
Our focus on KPIs in this research study is intentional. It establishes a managerial framework for sustainable medical training through a reliable, trusted, and transparent process in which the community of the SCFHS is active. The interpretation of this approach from sustainability’s point of view is also significant. We promote sustainability in medical training by connecting the beneficiaries and stakeholders of the SCFHS through the training and quality assurance process. We emphasize human capital, and we set up an integrated approach for measuring diverse aspects of quality and performance. The proposed framework and the quality initiative are bold responses to the need of Saudi healthcare for a continuous improvement approach towards innovation and training excellence. The preparation of highly qualified health specialists is a key aspect of sustainability in healthcare.

7. Conclusions and Future Research

This research study promotes interdisciplinary research with a significant social impact. We discussed the implementation of key performance indicators to benchmark, maintain, and improve the quality of postgraduate medical training. We analyzed the details, the scientific background, and the implementation of the Training Quality Assurance Initiative by the Saudi Commission for Health Specialties as a joint knowledge management, big data analytics, and sustainability endeavor. The first significant contribution of this research is the introduction of a full set of measurable, trusted, and reliable KPIs defined for efficient medical training. At a glance, 23 KPIs were used to measure the efficiency of medical training along three dimensions, namely training governance, learning, and reaction. We plan, in the near future, to update this functional set of KPIs with additional ones related to active learning strategies, as well as to community engagement in the context of a newly developed Social Network of Medical Experts. Another direction for the future enhancement of this analytics approach is to place additional emphasis on research and social impact.
Another significant dimension of our research study was the synthesis of the key findings of this Training Quality Assurance Initiative towards the generalization of findings for future exploitation and the communication of best practices. As a result, we introduced the Sustainable Medical Education Framework, with four integrative components, namely the big data and analytics ecosystem, SCFHS quality and performance management, enhanced decision-making capability, and sustainability. This framework can serve as a sophisticated decision-making roadmap for the promotion of social impact and sustainability in the context of medical training and education. The entire research paper also initiates a multidisciplinary debate on the role of human factors [50,51,52,53], knowledge management, KPI research, and sustainable data science for medical education and the digital transformation of healthcare. We also introduced the SCFHS Training Quality Approach and the relevant Medical Training Quality Framework as key enablers of personalized medical training services for individuals and the community in the KSA.
In the next developmental phases, we look forward to integrating social networks for medical expertise sharing and community building [54] as a strategic enabler of leadership and innovation [55]. Our special interest in the resilience and wellbeing of healthcare professions trainees and healthcare professionals during public health crises [56] challenges knowledge creation [57] and the design of timely new medical training programs capable of promoting the vision of patient-centric healthcare [58].
Our research contributes to the body of knowledge in knowledge management, KPI research, sustainable data science, and sustainability, with empirical evidence from a large-scale survey on postgraduate medical training in the KSA. The overall contribution to quality assurance in medical training will be analyzed further in the near future through the integration of additional KPIs for active learning, research enhancement, and social impact.

Author Contributions

The authors contributed equally to this research work and were involved in an integrated way in all stages of the research, including conceptualization, methodology, software, validation, formal analysis, investigation, resources, data curation, writing—original draft preparation, writing—review and editing, visualization, supervision, project administration, and funding acquisition. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Khoja, T.; Rawaf, S.; Qidwai, W.; Rawaf, D.; Nanji, K.; Hamad, A. Health Care in Gulf Cooperation Council Countries: A review of challenges and opportunities. Cureus 2017, 9, e1586. [Google Scholar] [CrossRef]
  2. ten Cate, O.; Scheele, F. Competency-based postgraduate training: Can we bridge the gap between theory and clinical practice? Acad. Med. 2007, 82, 542–547. [Google Scholar] [CrossRef]
  3. Kjaer, N.K.; Kodal, T.; Shaughnessy, A.F.; Qvesel, D. Introducing competency-based postgraduate medical training: Gains and losses. Int. J. Med. Educ. 2011, 2, 110–115. [Google Scholar] [CrossRef][Green Version]
  4. Scheele, F.; Teunissen, P.; Van Luijk, S.; Heineman, E.; Fluit, L.; Mulder, H.; Meininger, A.; Wijnen-Meijer, M.; Glas, G.; Sluiter, H.; et al. Introducing competency-based postgraduate medical education in the Netherlands. Med. Teach. 2008, 30, 248–253. [Google Scholar] [CrossRef]
  5. Rees, C.J.; Bevan, R.; Zimmermann-Fraedrich, K.; Rutter, M.D.; Rex, D.; Dekker, E.; Ponchon, T.; Bretthauer, M.; Regula, J.; Saunders, B.; et al. Expert opinions and scientific evidence for colonoscopy key performance indicators. Gut 2016, 65, 2045–2060. [Google Scholar] [CrossRef]
  6. Sandhu, S.S.C. Benchmarking, key performance indicators and maintaining professional standards for cataract surgery in Australia. Clin. Exp. Ophthalmol. 2015, 43, 505–507. [Google Scholar] [CrossRef]
  7. Raitt, J.; Hudgell, J.; Knott, H.; Masud, S. Key performance indicators for pre-hospital emergency anaesthesia—A suggested approach for implementation. Scand. J. Trauma Resusc. Emerg. Med. 2019, 27, 42. [Google Scholar] [CrossRef]
  8. Santana, M.J.; Stelfox, H.T.; Trauma Quality Indicator Consensus Panel. Development and evaluation of evidence-informed quality indicators for adult injury care. Ann. Surg. 2014, 259, 186–192. [Google Scholar] [CrossRef]
  9. Lytras, M.D.; Aljohani, N.R.; Hussain, A.; Luo, J.; Zhang, X.Z. Cognitive Computing Track Chairs’ Welcome & Organization. In Proceedings of the Companion of the Web Conference, Lyon, France, 23–27 April 2018. [Google Scholar]
  10. Lytras, M.D.; Raghavan, V.; Damiani, E. Big data and data analytics research: From metaphors to value space for collective wisdom in human decision making and smart machines. Int. J. Semant. Web Inf. Syst. 2017, 13, 1–10. [Google Scholar] [CrossRef]
  11. Visvizi, A.; Daniela, L.; Chen, C.W. Beyond the ICT- and sustainability hypes: A case for quality education. Comput. Hum. Behav. 2020. [Google Scholar] [CrossRef]
  12. Alkhammash, E.H.; Jussila, J.J.; Lytras, M.D.; Visvizi, A. Annotation of Smart Cities Twitter Microcontents for Enhanced Citizen’s Engagement. IEEE Access 2019, 7, 116267–116276. [Google Scholar] [CrossRef]
  13. Lytras, M.D.; Mathkour, H.I.; Abdalla, H.; Al-Halabi, W.; Yanez-Marquez, C.; Siqueira, S.W.M. An emerging social- and emerging computing-enabled philosophical paradigm for collaborative learning systems: Toward high effective next generation learning systems for the knowledge society. Comput. Hum. Behav. 2015, 51, 557–561. [Google Scholar] [CrossRef]
  14. Visvizi, A.; Lytras, M.D. Editorial: Policy Making for Smart Cities: Innovation and Social Inclusive Economic Growth for Sustainability. J. Sci. Technol. Policy Mak. 2018, 9, 1–10. [Google Scholar]
  15. Visvizi, A.; Lytras, M.D. Transitioning to Smart Cities: Mapping Political, Economic, and Social Risks and Threats; Elsevier-US: New York, NY, USA, 2019. [Google Scholar]
  16. Arnaboldi, M. The Missing Variable in Big Data for Social Sciences: The Decision-Maker. Sustainability 2018, 10, 3415. [Google Scholar] [CrossRef]
  17. Olszak, C.M.; Mach-Król, M. A Conceptual Framework for Assessing an Organization’s Readiness to Adopt Big Data. Sustainability 2018, 10, 3734. [Google Scholar] [CrossRef]
  18. Kent, P.; Kulkarni, R.; Sglavo, U. Finding Big Value in Big Data: Unlocking the Power of High Performance Analytics. In Big Data and Business Analytics; Liebowitz, J., Ed.; CRC Press Taylor & Francis Group, LLC: Boca Raton, FL, USA, 2013; pp. 87–102. ISBN 9781466565784. [Google Scholar]
  19. Kharrazi, A.; Qin, H.; Zhang, Y. Urban big data and sustainable development goals: Challenges and opportunities. Sustainability 2016, 8, 1293. [Google Scholar] [CrossRef]
  20. Wielki, J. The Opportunities and Challenges Connected with Implementation of the Big Data Concept. In Advances in ICT for Business, Industry and Public Sector; Mach-Król, M., Olszak, C.M., Pełech-Pilichowski, T., Eds.; Springer: Cham, Switzerland, 2015; pp. 171–189. ISBN 978-3-319-11327-2. [Google Scholar]
  21. Lytras, M.D.; Aljohani, N.R.; Visvizi, A.; De Pablos, P.O.; Gasevic, D. Advanced Decision-Making in Higher Education: Learning Analytics Research and Key Performance Indicators. Behav. Inf. Technol. 2018, 37, 937–940. [Google Scholar] [CrossRef]
  22. Zhang, J.; Zhang, X.; Jiang, S.; de Pablos, P.O.; Sun, Y. Mapping the Study of Learning Analytics in Higher Education. Behav. Inf. Technol. 2018, 37, 1142–1155. [Google Scholar] [CrossRef]
  23. Rajkaran, S.; Mammen, K.J. Identifying Key Performance Indicators for Academic Departments in a Comprehensive University through a Consensus-Based Approach: A South African Case Study. J. Sociol. Soc. Anthropol. 2014, 5, 283–294. [Google Scholar] [CrossRef]
  24. Asif, M.; Awan, M.U.; Khan, M.K.; Ahmad, N. A Model for Total Quality Management in Higher Education. Qual. Quant. 2013, 47, 1883–1904. [Google Scholar] [CrossRef]
  25. Varouchas, E.; Sicilia, M.-A.; Sánchez-Alonso, S. Towards an Integrated Learning Analytics Framework for Quality Perceptions in Higher Education: A 3-Tier Content, Process, Engagement Model for Key Performance Indicators. Behav. Inf. Technol. 2018, 37, 1129–1141. [Google Scholar] [CrossRef]
  26. Varouchas, E.; Sicilia, M.-Á.; Sánchez-Alonso, S. Academics’ Perceptions on Quality in Higher Education Shaping Key Performance Indicators. Sustainability 2018, 10, 4752. [Google Scholar] [CrossRef]
  27. Sanyal, B.C.; Martin, M. Quality assurance and the role of accreditation: An overview. In Higher Education in the World 2007: Accreditation for Quality Assurance: What Is at Stake? Palgrave Macmillan: New York, NY, USA, 2007; pp. 3–23. [Google Scholar]
  28. McDonald, R.; Van Der Horst, H. Curriculum alignment, globalization, and quality assurance in South African higher education. J. Curric. Stud. 2007, 39, 6. [Google Scholar] [CrossRef]
  29. Deming, W. Improvement of quality and productivity through action by management. Natl. Product. Rev. 2000, 1, 12–22. [Google Scholar] [CrossRef]
  30. Dlačić, J.; Arslanagić, M.; Kadić-Maglajlić, S.; Marković, S.; Raspor, S. Exploring perceived service quality, perceived value, and repurchase intention in higher education using structural equation modelling. Total Qual. Manag. Bus. Excell. 2014, 25, 141–157. [Google Scholar] [CrossRef]
  31. Suryadi, K. Key Performance Indicators Measurement Model Based on Analytic Hierarchy Process and Trend-Comparative Dimension in Higher Education Institution. In Proceedings of the 9th International Symposium on the Analytic Hierarchy Process for Multi-criteria Decision Making (ISAHP), Viña del Mar, Chile, 2–6 August 2007. [Google Scholar]
  32. Chalmers, D. Teaching and Learning Quality Indicators in Australian Universities. In Proceedings of the Institutional Management in Higher Education (IMHE) Conference, Paris, France, 8–10 September 2008. [Google Scholar]
  33. Varouchas, E.; Lytras, M.; Sicilia, M.A. Understanding Quality Perceptions in Higher Education: A Systematic Review of Quality Variables and Factors for Learner Centric Curricula Design. In EDULEARN16—8th Annual International Conference on Education and New Learning Technologies; IATED: Barcelona, Spain, 2016; pp. 1029–1035. [Google Scholar]
  34. Yarime, M.; Tanaka, Y. The Issues and Methodologies in Sustainability Assessment Tools for Higher Education Institutions: A Review of Recent Trends and Future Challenges. J. Educ. Sustain. Dev. 2012, 6, 63–77. [Google Scholar] [CrossRef]
  35. Sheng, Y.; Yu, Q.; Chen, L. A Study on the Process Oriented Evaluation System of Undergraduate Training Programs for Innovation and Entrepreneurship. Creat. Educ. 2016, 7, 2330–2337. [Google Scholar] [CrossRef]
  36. Toussaint, N.D.; McMahon, L.P.; Dowling, G.; Söding, J.; Safe, M.; Knight, R.; Fair, K.; Linehan, L.; Walker, R.G.; A Power, D. Implementation of renal key performance indicators: Promoting improved clinical practice. Nephrology (Carlton) 2015, 20, 184–193. [Google Scholar] [CrossRef]
  37. Pencheon, D. The Good Indicators Guide: Understanding How to Use and Choose Indicators. London: Association of Public Health Observatories & NHS Institute for Innovation and Improvement. 2008. Available online: http://fingertips.phe.org.uk/documents/The%20Good%20Indicators%20Guide.pdf (accessed on 10 September 2020).
  38. Lunsford, L.D.; Kassam, A.; Chang, Y.F. Survey of United States neurosurgical residency program directors. Neurosurgery 2004, 54, 239–245, discussion 245–247. [Google Scholar] [CrossRef]
  39. Ahmadi, M.; Khorrami, F.; Dehnad, A.; Golchin, M.H.; Azad, M.; Rahimi, S. A Survey of Managers’ Access to Key Performance Indicators via HIS: The Case of Iranian Teaching Hospitals. Stud. Health. Technol. Inform. 2018, 248, 233–238. [Google Scholar]
  40. Aggarwal, S.; Kusano, A.S.; Carter, J.N.; Gable, L.; Thomas, C.R., Jr.; Chang, D.T. Stress and Burnout among Residency Program Directors in United States Radiation Oncology Programs. Int. J. Radiat. Oncol. Biol. Phys. 2015, 93, 746–753. [Google Scholar] [CrossRef]
  41. Ishak, W.W.; Lederer, S.; Mandili, C.; Nikravesh, R.; Seligman, L.; Vasa, M.; Ogunyemi, D.; Bernstein, C.A. Burnout during residency training: A literature review. J. Grad. Med. Educ. 2009, 1, 236–242. [Google Scholar] [CrossRef]
  42. Krebs, R.; Ewalds, A.L.; van der Heijden, P.T.; Penterman, E.J.M.; Grootens, K.P. Burn-out, commitment, personality and experiences during work and training; survey among psychiatry residents. Tijdschr. Psychiatr. 2017, 59, 87–93. [Google Scholar]
  43. Gouveia, P.A.D.C.; Ribeiro, M.H.C.; Aschoff, C.A.M.; Gomes, D.P.; Silva, N.A.F.D.; Cavalcanti, H.A.F. Factors associated with burnout syndrome in medical residents of a university hospital. Rev. Assoc. Med. Bras. 2017, 63, 504–511. [Google Scholar] [CrossRef]
  44. Dyrbye, L.N.; West, C.P.; Satele, D.; Boone, S.; Tan, L.J.; Sloan, J.; Shanafelt, T.D. Burnout among U.S. medical students, residents, and early career physicians relative to the general U.S. population. Acad. Med. 2014, 89, 443–451. [Google Scholar] [CrossRef]
  45. Shoimer, I.; Patten, S.; Mydlarski, P.R. Burnout in dermatology residents: A Canadian perspective. Br. J. Dermatol. 2018, 178, 270–271. [Google Scholar] [CrossRef]
  46. Porter, M.; Hagan, H.; Klassen, R.; Yang, Y.; Seehusen, D.A.; Carek, P.J. Burnout and Resiliency Among Family Medicine Program Directors. Fam. Med. 2018, 50, 106–112. [Google Scholar] [CrossRef]
  47. Chaukos, D.; Chad-Friedman, E.; Mehta, D.H.; Byerly, L.; Celik, A.; McCoy, T.H., Jr.; Denninger, J.W. Risk and Resilience Factors Associated with Resident Burnout. Acad. Psychiatry. 2017, 41, 189–194. [Google Scholar] [CrossRef]
  48. Holmes, E.G.; Connolly, A.; Putnam, K.T.; Penaskovic, K.M.; Denniston, C.R.; Clark, L.H.; Rubinow, D.R.; Meltzer-Brody, S. Taking Care of Our Own: Multispecialty Study of Resident and Program Director Perspectives on Contributors to Burnout and Potential Interventions. Acad. Psychiatry. 2017, 41, 159–166. [Google Scholar] [CrossRef]
  49. Dyrbye, L.; Shanafelt, T. A narrative review on burnout experienced by medical students and residents. Med. Educ. 2016, 50, 132–149. [Google Scholar] [CrossRef]
  50. Jagsi, R.; Griffith, K.A.; Jones, R.; Perumalswami, C.R.; Ubel, P.; Stewart, A. Sexual harassment and discrimination experiences of academic medical faculty. JAMA 2016, 315, 2120–2121. [Google Scholar] [CrossRef]
  51. Karim, S.; Duchcherer, M. Intimidation and harassment in residency: A review of the literature and results of the 2012 Canadian Association of Interns and Residents National Survey. Can. Med. Educ. J. 2014, 5, e50–e57. [Google Scholar] [CrossRef]
  52. Fnais, N.; Soobiah, C.; Chen, M.H.; Lillie, E.; Perrier, L.; Tashkhandi, M.; Straus, S.E.; Mamdani, M.; Al-Omran, M.; Tricco, A.C. Harassment and discrimination in medical training: A systematic review and meta-analysis. Acad. Med. 2014, 89, 817–827. [Google Scholar] [CrossRef]
  53. Bates, C.K.; Jagsi, R.; Gordon, L.K.; Travis, E.; Chatterjee, A.; Gillis, M.; Means, O.; Chaudron, L.; Ganetzky, R.; Gulati, M.; et al. It Is Time for Zero Tolerance for Sexual Harassment in Academic Medicine. Acad. Med. 2018, 93, 163–165. [Google Scholar] [CrossRef]
  54. Giroux, C.; Moreau, K. Leveraging social media for medical education: Learning from patients in online spaces. Med. Teach. 2020, 42, 970–972. [Google Scholar] [CrossRef]
  55. McKimm, J.; McLean, M. Rethinking health professions’ education leadership: Developing ‘eco-ethical’ leaders for a more sustainable world and future. Med. Teach. 2020, 42, 855–860. [Google Scholar] [CrossRef]
  56. Wald, H. Optimizing resilience and wellbeing for healthcare professions trainees and healthcare professionals during public health crises—Practical tips for an ‘integrative resilience’ approach. Med. Teach. 2020, 42, 744–755. [Google Scholar] [CrossRef]
  57. Naeve, A.; Yli-Luoma, P.; Kravcik, M.; Lytras, M.D. A modelling approach to study learning processes with a focus on knowledge creation. Int. J. Technol. Enhanc. Learn. 2018, 1, 1–34. [Google Scholar] [CrossRef]
  58. Spruit, M.; Lytras, M. Applied Data Science in Patient-centric Healthcare. Telemat. Inform. 2018, 35, 643–653. [Google Scholar] [CrossRef]
