Competences of Flexible Professionals: Validation of an Invariant Instrument across Mexico, Chile, Uruguay, and Spain

Abstract: The purpose of this study was to validate and test latent mean differences in a second-order factorial structure for self-assessed competences across four Spanish-speaking countries (Spain, Chile, Mexico, and Uruguay). Self-assessments of 11,802 higher education graduates regarding their own level of competences were examined. To compare assessments from different countries, we first found support for metric and scalar invariance in a second-order factor structure comprising innovation, cooperation, knowledge management, communication, organizational, and participative competences. The latent mean differences observed in our data lend support to earlier findings in the context of universities from these four countries. These findings have important managerial implications for institutional evaluations developed by national accreditation bodies and for the identification of competence requirements by the labor market. In addition, our research provides a powerful tool for students and employers, as it contains valuable information about what competences students should be expected to have acquired when finishing their studies.


Introduction
The inclusion of competences in study programs has most often been considered an opportunity to improve quality in higher education [1,2]. Similarly, Latin American institutions have been working for the last decade on the development of quality assurance programs in higher education. These programs aim to set out the standards to be provided by universities and, particularly, the competences every student could expect to develop at university. Most of them have been implemented by the National Accreditation Commission (CNAP) in Chile, the Higher Education Accreditation Council (COPAES) in Mexico, and the National System of Accreditation and Promotion of Quality in Higher Education in Uruguay. As a result, the majority of universities have upgraded existing study programs and initiated new courses with this approach.
Prior to examining these required competences, a definition for the term "competence" should be given. Many authors have attempted to define the concept of "competence", but there is currently no generally accepted definition. In fact, there is so much discussion around this term that it is difficult to find a definition capable of fitting all the approaches that use it [3]. The understanding of competence depends to a high extent on the cultural context [4]. The term "competency", often used in the US, refers to a particular behavior that can be learned and assessed as job performance [5][6][7]. This approach belongs to a body of literature that emphasizes the importance of matching competences to the requirements of each workplace.
The alternative term "competence", prevalent in the UK, has been used to refer to the set of learning outcomes to be acquired as a result of a training period that qualifies the learner to perform a particular task or occupation. As clearly pointed out in [8], a competency is a part of a generic competence, which can be used in real performance contexts. The well-known taxonomy of competences developed in [9], which denominates a combination of mental skills (knowledge), the affective domain (attitudes), and the psychomotor domain concerning manual or physical abilities (skills), has become an essential reference for this approach. This influential taxonomy is widely considered the basis for a multi-dimensional framework of competence underpinning the European Qualifications Framework [10]. Following this approach, competences have been defined as individual capacities, skills, and aptitudes that have a direct positive effect on productivity gains [11]. It has also been pointed out that a competence is more than just knowledge and skills, as it involves the ability to meet complex demands by drawing on and mobilizing psychosocial resources (including skills and attitudes) [12]. Throughout this paper, we use the term "competence" in accordance with this approach (KSA: Knowledge, Skills, and Attitudes). More recent research has turned its attention to competence measurement and assessment through systematic approaches [13,14].
The definition and selection of key competences and learning outcomes concerning the student's workload for teaching, learning, and assessment activities have received much attention over the last two decades [15][16][17]. In this context, competences for sustainable development have seen renewed importance as a tool for preparing graduates to transform our society into a more sustainable one [18]. To accomplish this aim, study programs are currently being improved to provide graduates with a complete set of sustainability competences [19] through appropriate frameworks for teaching and learning, involving specific teaching methodologies and alliances with other stakeholders [20]. The definition of this set of competences for sustainable development remains unclear due to the complexity of their articulation in higher education programs [21]. While some programs have focused on the integration of emotional intelligence, other approaches have failed to address action taking, personal commitment, or system and future orientation [22]. Conceptualizations of individual competences for sustainability vary to a great extent across countries due to cross-cultural validity issues [23].

Cross-Cultural Differences in Self-Assessment of Competences
Formal research concerning measurement invariance is particularly important in cross-cultural research. Through the application of this analysis to diverse groups of individuals, we may obtain critical information for the judicious use of latent construct assessments [24][25][26]. According to scientific contributions in the framework of the Tuning project, European and Latin American graduates differ in their opinions about the competences they possess. European graduates considered particular competences to be important for them, such as the capacity for analysis and synthesis, problem-solving, the ability to work autonomously, and information management skills. In contrast, Latin American graduates underlined that the commitment to quality, ethical commitment, and the ability to make decisions were also relevant [17,27].
There is a vast amount of literature on the assessment of competences by European graduates. Overall, graduates seemed to feel better prepared for their job than the job actually required. In all cases, a large majority of flexible graduates thought that their level of competences was high enough to meet employers' requirements. Nonetheless, these flexible graduates also experienced a shortage of competences pertaining to the realm of authority, the ability to mobilize the capacity of others, and the ability to perform well under pressure [28]. In particular, widespread dissatisfaction was found among Spanish graduates [29]. A sizable proportion of these graduates felt that their professional careers did not match their academic performance in higher education. Moreover, they associated the difficulties of their transition to the labor market with the excessively theoretical, generalist, and obsolete approach of their studies, whilst a substantial part of the education acquired at university was regarded as irrelevant [30].

Classifications of Competences
As previously mentioned, various heterogeneous approaches have been put forward to address the question of what competences graduates should possess. Unfortunately, there appears to be little agreement on this issue [31]. The term "generic competence" is generally understood to mean those competences which provide the basis for continuous learning, problem-solving, and analytical thinking. In the literature, generic competences refer to systemic, instrumental, and interpersonal competences [27]. On the other hand, the term "specific competence" has been applied to vocational or field-specific knowledge, skills, and attitudes [32,33]. There are three categories of specific competences, depending on their specificity to firms, tasks, or the economic sector [34].
Regarding the relationship between competences and the labor market, it has been examined to what extent specific and generic competences can predict labor market outcomes [35]. On the other hand, several authors have attempted to characterize the link between different learning environments and competences. Proactive learning environments have been found to foster reflective competences [36,37], whereas other authors point to their effectiveness in the acquisition of generic and specific competences [38].
From the employers' perspective, some authors have studied what the labor market requires of Spanish university graduates: vocational and generic competences, the latter category being divided into interpersonal, methodological, and knowledge-related competences [39]. Furthermore, the particular approach of conflicts of interest between firms and apprentices has been called into question, using this basic division of industry-specific and generic skills [40].
Based on a quantitative approach, a remarkable increase in new classifications of competences has been found in the literature. Some authors differentiated between management competences, as opposed to general-academic and discipline-specific competences, within the context of the EU's Targeted Socio-Economic Research (TSER) program [33]. A more exhaustive classification differentiated between generic, socio-emotional, participative, specialized, organizational, rule-application, physical, and methodological competences [41]. However, further analysis dropped the rule-application and physical competences from the list [42], while including an item concerning the ability to assert one's authority. Although this item does not refer to a particular piece of knowledge or skill, its attitudinal approach, based on discipline and organizational routines, is considered a useful resource in one's identity work [13].
More recent evidence highlights the importance of a reduced number of competence dimensions, such as cognitive, professional, social-reflexive, and physical (or manual) skills [43]. Competences defined in the framework of the Latin American Tuning project were classified into learning capabilities, social values, interpersonal skills, and technological and international skills [17]. Other proposals of generic competences refer to the mobilization of human resources, functional flexibility, innovation, and knowledge management, whereas specific competences mainly concern professional expertise [31].
Since 2009, much more information on the issue of a common classification of competences has become available in the Spanish context. Some authors suggested a division into methodological, social, participative, and specialized competences [44]. It has also been concluded that competences in higher education can be divided into six groups, namely interpersonal competences, knowledge management, communication, organizational skills, innovation, and participative competences [45], which is the factor structure this paper is based on.

Current Study
In light of the above, to the best of our knowledge, no one has studied how the Bologna principles concerning competences have been implemented in Latin American universities. Moreover, despite this interest in a common classification of competences, there is little agreement on which competences should be emphasized throughout higher education studies. Lastly, although the analysis of competences in European universities is an undeniably interesting issue, current solutions to this question do not appear to be well grounded in quantitative procedures of data analysis. While most previous works used exploratory procedures, such as exploratory factor analysis, this research aimed to develop a more rigorous methodology for addressing the question. This paper outlines a new approach to the issue of finding a set of competences in higher education across countries according to the classification of competences provided by [45].

Participants and Procedure
The participants in this study were 11,802 higher education graduates from four Spanish-speaking countries: 4680 graduates (39.7%) were from Spain, 3994 (33.8%) from Mexico, 2554 (21.6%) from Chile, and 574 (4.9%) from Uruguay. Data were obtained in the framework of two different research projects: the Spanish participants were surveyed in "The Flexible Professional in the Knowledge Society" (REFLEX) project during the period 2005-2006. Participants from Chile, Mexico, and Uruguay were interviewed in the follow-up project in Latin America, "El Profesional Flexible en la Sociedad del Conocimiento" (PROFLEX). In this project, interviews with graduates were carried out between 2007 and 2008. In both projects, the questionnaire was administered to graduates of any degree of higher education, according to the International Standard Classification of Education (ISCED). This includes short-cycle graduates as well as Bachelor's (ISCED 6), Master's (ISCED 7), and Doctoral (ISCED 8) degrees.
Both projects were implemented in different universities in Europe, Latin America, and Japan. Spain, Chile, Mexico, and Uruguay were specifically selected for the analysis of measurement invariance following a two-fold strategy. Firstly, Spanish-speaking countries were selected to avoid potential confusion due to language misunderstandings [25,27]. Secondly, representative samples at the national level were obtained only for these four countries.
All participants had obtained their degrees across a large number of universities: 33 in Spain, 17 in Chile, 9 in Mexico, and 12 in Uruguay. The average age of Spanish participants was 30.5 years (SD = 3.3), and 65.7% of the graduates in this sample were female. The average age of Mexican graduates was 28.5 years (SD = 3.8), and 54.5% were female. In the case of Chilean graduates, the average age was 29.6 years (SD = 3.8), and 55.4% of the participants were female. Finally, in the Uruguayan group, the average age was 28.6 years (SD = 3.9), and the sample was 59.8% female. All participants were volunteers and had previously been informed of the aim and purpose of the study, as well as of their right to withdraw from the questionnaire at any time, during or after data collection.

Instrument
The instrument was composed of 19 items in which graduates were required to rate their own level of generic competences from 1 (very low) to 7 (very high), so that higher scores represented the perception of a higher level of competence. The scale measured six constructs of competences, namely (a) innovation, (b) interpersonal, (c) knowledge management, (d) communication, (e) organizational, and (f) participative competences, as well as the general construct "competence", represented by a second-order factor [21]. Each latent factor was measured with three to four items, and its internal consistency was considered acceptable, despite the low Cronbach's alpha values obtained for the second factor, "Cooperation" (α = 0.695), and the fourth factor, "Communication" (α = 0.677), due to the low number of items in the Cooperation subscale [32], as shown in Table 1 together with additional descriptive statistics.
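The internal consistency values above can be reproduced directly from raw item scores. The following is a minimal sketch of Cronbach's alpha for a subscale (not the original analysis code; the ratings matrix is hypothetical and used only for illustration):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

# Illustrative 1-7 self-ratings for a three-item subscale (hypothetical data)
ratings = np.array([
    [5, 6, 5],
    [3, 4, 3],
    [6, 6, 7],
    [4, 5, 4],
    [2, 3, 2],
], dtype=float)
alpha = cronbach_alpha(ratings)
```

Alpha rises with the average inter-item correlation but also with the number of items, which is why a short subscale such as Cooperation can show a depressed value even when its items covary reasonably well.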

Data Analysis
This model hypothesized a priori that six first-order factors and a single second-order factor explained the variability found in the observed data (Model 0). Several criteria were used to evaluate the goodness-of-fit of this scale and to test measurement invariance. Maximum likelihood estimation was used to obtain all model parameters. To correct for non-normality, robust statistics were used [46]. However, given the very large sample size and the well-known sensitivity of the chi-square statistic to sample size, the appearance of a statistically significant model misfit was not surprising [47,48]. Therefore, the overall absolute model fit for each country was assessed using other goodness-of-fit indexes: the Root Mean Square Error of Approximation (RMSEA), the Comparative Fit Index (CFI), and the Standardized Root Mean Square Residual (SRMR). Confidence intervals (CI) for RMSEA values were also reported to assess the accuracy of the analysis [49,50]. RMSEA values less than 0.05 indicated an acceptable model fit, representing a close approximation to the population [49], whereas CFI values near 1.0 were considered optimal, and values greater than 0.90 indicated a satisfactory fit [51]. Finally, an SRMR value under 0.08 is generally considered an indicator of good fit [48]. The analysis was computed using the EQS software, version 6.2 [32].
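The absolute fit indexes above are simple functions of the model chi-square. As an illustration of the standard formulas (not output from the EQS runs reported here; the example values are hypothetical), RMSEA and CFI can be computed as:

```python
import math

def rmsea(chi2: float, df: int, n: int) -> float:
    """Root Mean Square Error of Approximation from a model chi-square."""
    # Excess chi-square over its expectation, per degree of freedom and case
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

def cfi(chi2_model: float, df_model: int, chi2_base: float, df_base: int) -> float:
    """Comparative Fit Index: improvement over the independence (baseline) model."""
    d_model = max(chi2_model - df_model, 0.0)
    d_base = max(chi2_base - df_base, d_model)
    return 1.0 if d_base == 0.0 else 1.0 - d_model / d_base
```

For example, a model with χ² = 150 on 100 degrees of freedom in a sample of 1001 cases yields an RMSEA of about 0.022, comfortably below the 0.05 cut-off, even though the chi-square itself would be statistically significant.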
Once the factorial structure was established, a multi-group confirmatory factor analysis was performed to test the validity of the scale under study in Spain, Chile, Mexico, and Uruguay by building and assessing several nested models. The evidence of multi-group invariance rested on a set of goodness-of-fit indexes, including both overall (CFI, RMSEA, and SRMR) and incremental (∆CFI and ∆χ²) indexes [23]. Whenever a non-significant change in χ² was observed and the change in CFI was lower than or equal to 0.01, we considered that the invariance criteria were met [52][53][54]. Differences in scaled (corrected) chi-square tests were computed using the robust procedure [55].
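The nested-model decision rule described above can be sketched as follows. The Satorra-Bentler scaled difference test combines the uncorrected chi-squares and the robust scaling factors of the two models; the values used in the test call are hypothetical, and `scipy` is assumed to be available:

```python
from scipy.stats import chi2 as chi2_dist

def scaled_chi2_difference(t0: float, df0: int, c0: float,
                           t1: float, df1: int, c1: float):
    """Satorra-Bentler scaled chi-square difference test.

    (t0, df0, c0): uncorrected chi-square, degrees of freedom, and scaling
    factor of the more constrained (nested) model; (t1, df1, c1): the same
    quantities for the less constrained comparison model.
    """
    cd = (df0 * c0 - df1 * c1) / (df0 - df1)  # difference-test scaling correction
    trd = (t0 - t1) / cd                      # scaled difference statistic
    ddf = df0 - df1
    return trd, ddf, chi2_dist.sf(trd, ddf)  # statistic, df, p-value

def invariance_criteria_met(delta_cfi: float, p_value: float,
                            alpha: float = 0.05) -> bool:
    """Invariance is retained when the chi-square change is non-significant
    and the CFI change does not exceed 0.01 (the rule used in this paper)."""
    return p_value > alpha and abs(delta_cfi) <= 0.01
```

Note that the correction factor `cd` can occasionally turn negative in small samples, in which case the strictly positive variant of the test should be used instead.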
The primary purpose of determining the evidence of configural invariance (Model 1) across the samples was to establish a well-fitting multi-group baseline model [53]. The next stage in the procedure was to evaluate first-order metric invariance (Model 2) for the four countries. At the following level, we tested scalar invariance (Model 3) to examine whether the scores from different countries had the same origin of measurement. To determine whether intercepts were invariant across countries, a nested model (Model 3) was specified in which the intercepts of the measured variables were constrained to be equal across groups. Among these, we included the intercepts of the measured variables whose factor loadings had previously been fixed to 1.00. Finally, we evaluated the invariance of the second-order factor loadings by introducing equality constraints on all of them (Model 4). Effect sizes were assessed subsequently (Model 5), following [56]. As in the case of first-order latent mean differences, analogous conditions were specified for the second-order model to avoid misspecification problems. Therefore, equality constraints were placed on second-order factor loadings or, where applicable, on the variances of latent factors. However, given that the estimation of variance parameters for dependent variables was not consistent with the hypothesized model, the residual variances of the first-order factors were constrained to 1.0 (Model 5). Furthermore, the latent factor means for the Spanish sample, as the reference group, were fixed to zero, given the need to fix an arbitrary origin for the latent factor intercepts at this level of invariance [32]. The standardized effect size was computed following [57].
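For the latent mean comparisons of Model 5, a standardized effect size can be obtained by dividing the estimated latent mean difference by the square root of a pooled latent factor variance. This is a minimal sketch of that idea, not the exact computation of [57]; all argument names and example values are hypothetical:

```python
import math

def latent_mean_effect_size(mean_diff: float,
                            var_g1: float, n_g1: int,
                            var_g2: float, n_g2: int) -> float:
    """Standardized latent mean difference between two groups.

    mean_diff: estimated latent mean difference (reference group fixed to 0);
    var_g1, var_g2: estimated latent factor variances of the two groups;
    n_g1, n_g2: group sample sizes, used to pool the variances.
    """
    pooled_var = ((n_g1 - 1) * var_g1 + (n_g2 - 1) * var_g2) / (n_g1 + n_g2 - 2)
    return mean_diff / math.sqrt(pooled_var)
```

Because the metric of a latent factor is arbitrary, standardizing by the pooled factor variance is what makes mean differences comparable across competences and across country pairs.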

Validation of the Measurement Instrument
As shown in Table 2, testing of the initially hypothesized model for each group yielded a marginally good fit. However, exploration of the modification indexes suggested that the model fit would improve if a new factor loading was added between Item 13 (Ability to assert your authority) and Factor 5 (Organizational skills). As this modification was meaningful and coherent with the theoretical background [13], the parameter corresponding to this factor loading was freely estimated. After carrying out the related modification, the model was estimated for each country. Goodness-of-fit indexes showed a remarkable improvement in the fit in the four countries, as can be seen in Table 2. The improvement in the corrected difference of the χ 2 statistic was significant in all cases. Moreover, the absolute increase in the CFI index equaled or surpassed the required threshold of 0.01 [52], indicating that the data fitted the model better. The unstandardized estimates for the additional factor loading between item 13 and factor 5 were 0.762, 0.918, 1.089, and 1.174 for Spain, Mexico, Chile, and Uruguay, respectively.

Factorial Invariance Analysis
Reasonable evidence of configural invariance across countries was achieved, as shown in Table 3. These results indicate a well-fitting multi-group baseline model against which to compare all subsequently specified invariance models [52]. Next, first-order factor loading equality (metric invariance) was tested by constraining factor loadings to be equal across countries (Model 2), obtaining a satisfactory fit of the multi-sample data. The decrease in the CFI index (−0.003) between Model 1 and Model 2 indicated that factor loadings were not substantially different across countries. Subsequently, scalar invariance of the model was assessed yielding a satisfactory fit to the multi-sample data. Goodness-of-fit results from Model 3 showed a remarkable improvement in the overall fit. Similarly, differences in CFI were negligible (i.e., ∆CFI = 0.007). Subsequently, the invariance of second-order factor loadings was examined in Model 4, leading to a good fit of the multi-sample data. Lastly, the computed difference in the robust CFI values between Model 4 and the configural Model 1 was ∆CFI = 0.005, which confirmed the invariance of all first and second order factor loadings. Invariant factor loadings can be examined in Appendix A.

Latent Factor Mean Differences
Given the evidence of invariant factor loadings and intercepts, factor means could then be compared across countries. Model 5 showed a reasonable fit to the data. As shown in Table 4, Mexican, Chilean, and Uruguayan graduates obtained, on average, higher scores in all competences. The largest difference was found in participative competences in all countries, followed by communication competences in Chile and Uruguay and knowledge management in Uruguay. Differences from Spanish graduates in knowledge management and communication were moderate in Chile and large in Mexico and Uruguay, and positive in all three countries.

Discussion
This study examined the construct validity of an instrument for measuring self-assessed competences and its invariance across Spain, Chile, Mexico, and Uruguay, according to previous research. All items seemed to work well in these four countries, regardless of cultural differences. Results showed that the instrument comprised six first-order factors: interpersonal competences, knowledge management, communication, organizational skills, innovation, and participative competences [45], as well as a global competence factor.
Thus, our work led us to conclude that this factorial structure is invariant across countries, which confirms previous findings in the literature about the classifications of competences. However, the most valuable contribution of this work is the comparison of self-assessed competences by higher education graduates across countries. We obtained comprehensive results showing that competences can be quantified and compared across different contexts. The methodology we devised in this paper represents a powerful tool for the assessment of competences acquired in higher education.
An interesting result of this work was the definition of a knowledge management construct. Thanks to this factor, our instrument constitutes a novelty with regard to the traditional division between generic and specific competences. As reported in this work, we found evidence that the knowledge management factor is composed of learning processes and specific competences, not limited to the field of study of the graduate's degree. Our research is in line with previous results [31], which also point out the existence of this factor. In fact, other works also refer to knowledge management competences [53] or professionally knowledgeable graduates [43]. On the other hand, the reference to specific competences is present in most of the previous research using this particular term, or the words "specialized" and "technical" [27,34]. Most of these works also corroborate the need to develop learning process skills [17], general-cognitive abilities [43], and methodological or theoretical competences [42,44]. This broad agreement strengthened our confidence in the conclusion that specific competences and learning skills should be combined in the same knowledge management construct.
Regarding interpersonal competences, the current study differs in the term used to refer to this construct. Our interpersonal competences construct includes an item about the ability to mobilize the capacities of others. This supports the previous definition of a participative competence construct [18] as the ability to construct the environment, make decisions, and assume responsibility, among other tasks. The term "participative competences" has been frequently used in previous works to refer to this construct [41,42,44]. Other works have also used the term "mobilization of human resources" [31,58], whose meaning is hardly distinguishable from the item included in the interpersonal competences construct defined by this work. However, we should be aware that our construct does not consider the socio-emotional approach, which is undeniably essential in the graduates' workplaces.
Both constructs, knowledge management and interpersonal competences, were found to be consistent with well-established models of competences. Moreover, this work reinforces the conclusions suggested by more recent works regarding innovation, communication, participative, and organizational competences. The innovation competences construct is in complete agreement with works published during the last decade [31,58]. However, in contrast to them, we found that the item about awareness of new opportunities should be included in the participative competences construct rather than in the innovation construct. This slight discordance could be due to the wider definition of the construct in [31], which combines the innovation and knowledge management constructs.
Likewise, our conclusions regarding communicative competences are in line with [59]. The only difference is the inclusion of the item regarding the use of computers and the internet in the communication construct. We classified this item into the innovation construct, as other authors did in their research [31]. The scarcity of references to communication skills does not mean that previous works have discarded them from their classifications of competences. Instead, communication has frequently been combined with cooperation skills.
Given an adequate level of configural, first-order metric, scalar, and second-order metric invariance, differences between latent means were tested. As expected, Mexican, Chilean, and Uruguayan graduates gave higher self-assessments of their own level of competences than Spanish graduates. These results offer compelling evidence that Spanish graduates tend to feel less self-confident about their competences. This result could be due to the well-known difficulties in Spain during the last decade, which have become apparent in the transition from higher education to the labor market [29]. Therefore, Mexican, Chilean, and Uruguayan graduates may be likely to rate their own competences higher than their Spanish counterparts, as only some of the latter consider that their professional careers match their academic performance in higher education. Additionally, and according to previous research, Latin American academics, graduates, and employers considered all competences to be important [17].
The largest differences with reference to Spanish graduates were found in Uruguay, followed by Mexico and Chile, whereas the highest assessments corresponded to participative competences, communication competences, and knowledge management. These findings are consistent with the fact that Latin American students point to participative competences as among the most important, through the commitment to quality, ethical commitment, and the ability to make decisions [5]. Results concerning knowledge management were also in complete agreement with this report. Finally, although the approach to communication skills is solely focused on the ability to communicate in a second language, this competence obtained the greatest difference between its rated importance and its rated achievement. Therefore, the latent mean differences observed in our data lend support to earlier findings in the context of Latin American universities.
Overall, our results are in satisfactory agreement with earlier classifications of competences, although slight differences were also observed in some constructs. Further research could benefit from our conclusions as a meaningful contribution to previous classifications of competences. However, the most remarkable result to emerge from the data is the validation of a second-order factorial structure. This result implies that a general construct of competence underlies each of the constructs defined in this work. To the best of our knowledge, no other authors have provided evidence of such a hierarchical factorial structure in their measurement models.
From a practical point of view, the findings of our research have important managerial implications for quality assurance programs in higher education developed by accreditation bodies. Our results could be exploited to establish a common framework of key competences against which study plans and academic programs could be assessed. We are confident that the definition of a set of competences will make it easier for academics to create new teaching and learning activities aimed at developing particular learning outcomes, as well as to assess the level of competences achieved by students. With this background, universities will be able to determine whether the Bologna guidelines concerning competence-based education have been successfully implemented in new degree proposals. In addition, our research provides a powerful tool for students and employers, as it contains valuable information about what competences students should be expected to have acquired when finishing their studies.
A number of limitations may have influenced the conclusions of this work. The first is the limited selection of items in the questionnaire, which omits some interesting approaches, such as the emotional dimension of teamwork. Another limitation inherent to our methodology is that we focused on testing the invariance of factor loadings and intercepts. Although we could have examined the equality of other parameters, such as measurement error covariances or factor variances, we considered these parameters of little interest for this research. Finally, there is a lack of updated datasets dealing with the research question of this paper. We analyzed a dataset gathered about a decade ago, as we could not find any more recent international assessment of higher education graduates in Spain and the Latin American countries.

Conclusions
In conclusion, this paper presented a robust measurement instrument for self-assessed competences and described to what extent graduates' perceptions differ according to the country where they studied. Our research provided a framework for defining hierarchical constructs of competences in higher education and offered considerable insight into the implementation of competence-based education in Latin American countries, as substantiated by the findings of the present study.
This study contributed toward enhancing our understanding of competences in higher education. In our view, the strength of our study lies in the definition of a methodology for the quantification and comparison of self-assessed competences across different contexts through the validation of an invariant measurement instrument. The present findings might help to identify areas for improvement in the processes of teaching and learning. Specifically, our approach would lend itself well to use in the assessment of the competences required by employers recruiting higher education graduates in the labor market. The use of standardized instruments may be useful for updating the contents of current study programs according to these requirements and the self-assessments provided by graduates.
Our research could also be a useful aid in the assessment of learning outcomes in the curricula of higher education students. Some universities are already working on the development of an evaluation system to provide a score in competences for each student, serving as a complement to academic marks. Policymakers could also encourage stakeholders to develop institutional accreditation processes based on this methodology. Meanwhile, we are confident that future students and employers may use these findings to examine what should be expected from each professional profile.

Conflicts of Interest:
The authors declare no conflict of interest.