Article

Assessment by Competences in Social Sciences: Secondary Students' Perception Based on the EPECOCISO Scale

by José-María Álvarez-Martínez-Iglesias 1,*, Francisco-Javier Trigueros-Cano 2, Pedro Miralles-Martínez 2 and Jesús Molina-Saorín 1
1 Department of Didactics and School Organization, University of Murcia, 30100 Murcia, Spain
2 Department of Didactics of Mathematics and Social Sciences, University of Murcia, 30100 Murcia, Spain
* Author to whom correspondence should be addressed.
Sustainability 2020, 12(23), 10056; https://doi.org/10.3390/su122310056
Submission received: 26 September 2020 / Revised: 24 November 2020 / Accepted: 27 November 2020 / Published: 2 December 2020

Abstract: The purpose of this work is to determine students' perception of the level of development of competences in the area of Social Sciences, Geography, and History. It takes into account what has been learned in the area, the level of difficulty of its acquisition, the assessment tools used, and the transfer of learning to real situations. The development of an original and unprecedented scale, called Evaluation of the Perception of Competences from the Social Sciences (EPECOCISO), has made it possible, through a descriptive, quantitative instrument-validation study with intentional sampling, to gather the opinion of more than 1400 students in the 4th year of secondary education (Spain). The instrument presents good psychometric attributes: the sample is balanced by gender (51%, 49%), and the scale is both reliable and valid. Finally, a KMO above 0.9 indicates excellent sampling adequacy, supporting the factorial analysis of the instrument.

1. Introduction

The emergence in the 21st century of the new pedagogical concept of competencies represented a new challenge and impetus for the concept of educational evaluation. In this sense, there is a need to educate in competences since, among other aspects, this favors interdisciplinarity with the rest of the subjects, as in the case of language teaching. However, for this approach to be effective, a high level of procedural content and the application of knowledge to specific situations are necessary, which calls for a greater presence of formative evaluation in the educational process [1]. In Spain, these competencies have become integral elements of the curriculum both in the Organic Law on Education (OLE) and in the Organic Law on the Improvement and Quality of Education (OLIQUE). All of these legislative changes show the need to develop evaluation systems that gather information on the three types of content (concepts, procedures, and attitudes) and also on the adequate mobilization of these contents, through authentic situations that require students to solve real problems in which they can apply their knowledge creatively [2]. This also requires a wide range of assessment test formats that help to verify complex thinking or problem-solving skills both during and at the end of the process [3]. Such assessment tests are already being designed in other European countries and have proven effective in detecting which historical and geographic thinking skills students have, and how far they have progressed in acquiring competencies in subjects such as social sciences in secondary education.
Furthermore, the implementation, development, and evaluation of competencies must be carried out transversally in all areas. In the opinion of [4], geography and history are disciplines that make possible the development and learning of these basic educational competences, not only the social and civic ones, redefined in the current OLIQUE as "Social and civic competences". It is necessary to overcome an interpretation of competencies that links them only to competitiveness. Being competent means that students interact and are able to argue and make proposals for improvement [5,6]. These operations require knowledge about what society is like and how it works, how human relations have been generated and modified over time, and what consequences the actions carried out by individuals and groups have had and still have [7]. Understanding the meanings of human actions in different contexts is part of the essence of the social sciences. The most competent person is not necessarily the one who accumulates the greatest amount of scholarly information on a subject, whether geographical, historical, or otherwise, but the one who knows how to use it correctly in the right context. Such skills are part of the methods with which geographers and historians work. Incorporating the geographic and historical method into education therefore seems a good strategy for training more competent people [8].
At present, numerous studies show that a teaching method based on the memorization of content is not the most appropriate for achieving significant learning in students; the teaching of history is therefore a broad and complex challenge that must revise disciplinary knowledge and travel the path of historical thought [9,10].
Even recent opinion articles have pointed out the urgency of alleviating the pressing lack of knowledge of history among young people [11]. There is, therefore, a real demand that specialists in social science education, among others, have to meet in order to alleviate the levels of ignorance of some subjects, which are fundamental for understanding the society in which we live. In this context, teachers who teach the area of social sciences in secondary education must be aware that social knowledge is essential for the formation of competent persons capable of functioning in today's world. However, in order to be effective and to meet the social demands that the curricula make explicit, certain routines must be overcome and teaching practice reformulated. It is not a matter of dispensing with disciplinary knowledge but rather of using it in a manner that is more linked to current challenges and to the experiences of students. In this sense, it is necessary to understand the close link between the social, geographic, and historical formation of students and the development of democratic, critical, and responsible citizenship. This does not mean that social sciences should necessarily be at the service of civic and citizen education, but rather that they should contribute to it through the values derived from the teaching of geography and history, values that foster the training of individuals to recognize and orient themselves in today's world. To achieve this, geographic and historical knowledge must be connected to the acquisition of complex cognitive abilities and social, civic, and educational skills [12,13].
The inclusion of basic competencies in the education system implies revising the current concept of evaluation, in which the acquisition of conceptual knowledge is still mainly valued in the field of social sciences, and in which the exam enjoys almost uncontested supremacy as a measurement instrument [14,15]. Research on social science teaching shows that assessment procedures and criteria are still linked to culturalist purposes, to a supposed "objectivity", to the almost exclusive use of the textbook as teaching material, and to the predominance of contents that are excessively conceptual and disconnected from social reality [16,17]. The evaluation of disciplinary content is carried out as if it were static, imperishable, and immutable knowledge. Evaluating facts and data out of context is a general trend and, most worryingly, teaching staff give more importance to quantity than to quality [18]. Directing education towards the achievement of basic educational skills requires a change in the conception of teaching and classroom practice that also involves reflection on the nature of student assessment and the methods and instruments used for it. The idea behind this approach is that learning cannot be conceived exclusively or mainly as the acquisition of disciplinary knowledge, as has traditionally been the case in most curricular areas, particularly in the social sciences; teachers must also take into account the ability to apply this knowledge in new situations that may arise in everyday adult life [19].
These circumstances are partly a consequence of the fact that evaluation is still confused with examination, perhaps due to the influence of international assessments (PISA, PIRLS, ICCS, etc.) which supposedly seek excellence and measure the quality of educational institutions and systems, classifying students into three main categories: excellent, ordinary, and failed. In this sense, the introduction of basic competences can be an opportunity to democratize assessment models in teaching. The final objective would be to carry out what some authors have called authentic assessment [20,21]. With this concept, we want to link student assessment with the real needs of future citizens and with professional performance: finding out what students know or are able to do through new assessment strategies and procedures, including complex and contextualized tasks that require assessment over time and not only at the end of the process [22,23]. To this end, it is important to pay attention to all elements of the curriculum, such as methodology and resources, encouraging the joint participation of all educational agents in the evaluation carried out in the school, and exploring new evaluation instruments that go beyond the final exam [24]. In this sense, combining quantitative and qualitative methods in student assessment is indispensable for the correct development of complex cognitive skills and competences in the area of Social Sciences. For this reason, there is an urgent need to design assessment tests that measure students' cognitive skills in social sciences, with the aim of evaluating not only students' knowledge in disciplines such as geography and history but also the skills and abilities related to them.
The results of these tests will allow the development of a scale of progression in the acquisition of competences that does not currently exist at the national level, but which is being developed and applied in Northern European countries with better student academic performance, such as the United Kingdom and Finland [25].
The aim of this research is to find out students' perception of the level of development of competences according to what they have learned in the area of Social Sciences, Geography, and History by applying a scale called EPECOCISO. Two research questions were posed to this end:
RQ1: To analyze the validity of the construct by means of exploratory factorial analysis, the Kaiser–Meyer–Olkin test (KMO test), and the Bartlett sphericity test.
RQ2: To determine the relationship between competences, acquired knowledge, evaluation instruments, and the transfer of learning in Social Sciences, Geography, and History.

2. Materials and Methods

2.1. Design

The research design was based on a descriptive study following a quantitative methodology, whereby a questionnaire (see Appendix A) called Evaluation of the Perception of Competences from the Social Sciences (EPECOCISO) was developed. Once the scale was applied, an exploratory factorial analysis of the data was carried out, from which seven factors were extracted.
When extracting the components, the maximum likelihood method was used on the covariance matrix [26], resulting in the seven factors presented in Table 1. In addition, the standardized values of each variable were calculated and, subsequently, the factorial analysis was carried out on the standardized variables so as to operate on the same scale. A Varimax rotation was then applied to rotate the factorial solution, minimizing the number of variables with high saturations in each factor and thereby simplifying the interpretation of the results. The factorial solution was not obtained by fixing a maximum number of factors in advance; rather, those factors whose eigenvalue was greater than 1 were retained [27].
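As an illustration, the retention rule described above (standardize the variables, then keep the factors whose eigenvalue on the correlation matrix exceeds 1, the Kaiser criterion) can be sketched as follows. The data here are synthetic, generated so that two latent factors drive six observed variables, since the original responses are not reproduced in the article:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic example: 6 observed variables driven by 2 latent factors,
# so the eigenvalue-greater-than-1 rule should retain 2 factors.
latent = rng.standard_normal((500, 2))
loadings = np.array([[0.9, 0.0], [0.8, 0.1], [0.85, 0.0],
                     [0.0, 0.9], [0.1, 0.8], [0.0, 0.85]])
X = latent @ loadings.T + 0.3 * rng.standard_normal((500, 6))

# Standardize each variable so all operate on the same scale (z-scores).
Z = (X - X.mean(axis=0)) / X.std(axis=0)

# Eigenvalues of the correlation matrix, in descending order.
eigvals = np.linalg.eigvalsh(np.corrcoef(Z, rowvar=False))[::-1]
retained = int(np.sum(eigvals > 1.0))
print(retained)  # 2 factors retained
```

In SPSS, this corresponds to the default "eigenvalues greater than 1" extraction setting.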

2.2. Sample

The research was developed in the Region of Murcia (Spain), where the sample size was calculated according to [28] from a total population of 14,714 students in the fourth year of secondary education. For a significance level of 0.05 and an error of no more than 3%, no fewer than 996 subjects must be surveyed, a figure widely exceeded in this research. Thus, according to the sample size analysis carried out with the help of the Research Support Service of the University of Murcia, the sample of the population studied is representative: 1573 students were surveyed, far more than necessary for the conclusions to be extrapolated with adequate scientific rigor. An intentional type of sampling was used, with a standard error of 0.7% for the total universe (according to the results of the STATSTM analysis).
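The minimum of 996 subjects can be reproduced with Cochran's formula under a finite-population correction. This is a sketch under stated assumptions (95% confidence, maximum variability p = 0.5, and a 3% margin of error), since the article does not spell out the exact parameters used:

```python
import math

N = 14714  # total fourth-year secondary students in the region
z = 1.96   # critical value for 95% confidence (assumed)
p = 0.5    # assumed proportion maximizing the required sample
e = 0.03   # assumed 3% margin of error

# Cochran's sample size with finite-population correction.
n = (N * z**2 * p * (1 - p)) / (e**2 * (N - 1) + z**2 * p * (1 - p))
print(math.ceil(n))  # 996
```

Under these assumptions, the formula returns 996, matching the threshold reported above.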
It should be noted that before the final selection of the sample, and in order to detect possible hidden errors in the questionnaire (in terms of comprehensibility for students, internal organization, etc.), it was decided to carry out a prior application with a group of students (n = 35), also belonging to a compulsory secondary school in the Region of Murcia. This phase was carried out under the same conditions as those reproduced later, during the mass application of the instrument. It was precisely this initial application phase that allowed the wording of some items of the scale to be refined.
Another aspect worth noting is that, in terms of gender, this is a practically equivalent group (51% ♂; 49% ♀), whose ages range between 15 and 18 years. Moreover, the total number of subjects in the sample (n = 1422) was obtained after eliminating all those data collection instruments that contained errors (or missing data), finally reaching a very high confidence level (over 95%, according to the analysis with STATS) with a maximum error of 0.7% [29]. Overall, the final sample comprises students from a total of 18 secondary schools. More specifically, as can be seen in Figure 1, these are schools belonging to the municipalities of the Northwest, Huerta de Murcia, Vega Alta, Vega Media, Bajo Guadalentín, Mar Menor, Valle del Ricote, Altiplano, and Campo de Cartagena, a variety which undoubtedly enriches the validity of the data obtained.

2.3. Instrument

The instrument reveals the significant relationships between the variables that influence the perception of the level of development of competences according to what has been learned in Social Sciences, Geography, and History; the degree of difficulty students have in assimilating them; the instruments used to evaluate the degree of achievement of competences; and the transfer of what has been learned in social sciences to real situations. Regarding its characteristics, it is balanced (because the number of students is similar), reliable (because of the stability and consistency of what has been measured), and valid (because it measures what it is intended to measure). Students answered on a Likert-type scale of six options, choosing the one they most identified with (1 = totally disagree; 2 = disagree; 3 = neither agree nor disagree; 4 = quite agree; 5 = totally agree; NS = don't know).
With regard to the design of the instrument, it was developed in four major stages: the construction and definition of the items on the scale, the analysis by expert judges to validate the data collection instrument, the application of the scale and, finally, the data analysis. The items of the EPECOCISO scale were developed in two ways: firstly, an exhaustive documentary analysis was carried out in the area of Didactics of the Social Sciences and, secondly, a consultation was conducted with experts of recognized experience from secondary or higher education centers. At all times, the aim was to evaluate the relevance of the questions designed, the degree of fit to the dimensions, and the semantic suitability and comprehensibility of the wording of each item. The group of judges comprised nine professional experts in areas related to the content or nature of the research (evaluation, social perception, and research methodology); to preserve an item, agreement from at least 75% of the judges was required. As a result of this process, 44 of the 52 initial items were retained (after discarding 8), precisely because they best described and delimited the students' perception of competences. Likewise, once the experts' grammatical suggestions and proposals had been taken into account, these indicators were screened through a more refined process. Finally, after the first (pilot) application of the questionnaire to a group of selected students (n = 35), four more indicators were discarded, leaving 40 in the end.
The scale for the Perception of evaluation in Social Sciences, Geography and History, and its relation to the development of competences (EPECOCISO) was definitively established with the following structure: (1) objective of the questionnaire and instructions, (2) sociodemographic data, (3) student perception of the competences according to what has been learned in the area of Social Sciences, Geography, and History, made up of eight items, (4) student perception of the degree of difficulty that the competences have in assimilating them, made up of another eight questions, (5) students’ perception of the instruments used to assess the degree of achievement of the competences, which also consists of eight questions, (6) students’ perception of the transfer of what they have learned in Social Sciences to a real situation, consisting of 16 questions divided into two blocks, one to give an opinion on general questions and the other to answer questions applied to a practical case related to a possible job offer; giving rise to the 40 final items of the questionnaire.
The reliability of the questionnaire was calculated following the Kuder–Richardson approach, applying Cronbach's α coefficient for each of the seven factors analyzed. As a result of this process, and following the recommendations in the specialized literature [30], items whose extracted α coefficients did not reach the reference value for satisfactory reliability were discarded, resulting in an original printed instrument whose data were later captured with an optical reader.
As can be imagined for this type of study, before starting the data collection, the appropriate permission was requested from the responsible academic authorities. Once it was granted, it was applied to the different groups of fourth year students.
On the other hand, the criteria of applicability and efficiency were also considered, following the recommendations of [31], so that the instrument would be easy to apply and would take little of the participants' time, given the large sample involved. The scale was administered by the research team in person in each of the classrooms of the participating centers, where the questionnaires were distributed among the students for their response (taking between 15 and 20 min). With a coefficient α of 0.914, we can say that this is an instrument with high reliability.

3. Results

The answers collected in the questionnaires were organized into a database which, once codified, was imported into SPSS version 24, with which the information was analyzed. Firstly, an exploratory factorial analysis was carried out in which, given the large study sample (as well as the diversity of variables), the data could be reduced by grouping them into factors (as shown in Table 1).
On this basis, it is possible to analyze how the different variables correlate with each other, studying the statistical significance of these relationships and ensuring independence between the established groups. Factorial analysis requires correlation between the items; in this case, the suitability of the analysis was assessed using the Kaiser–Meyer–Olkin measure of sampling adequacy (KMO test) and Bartlett's sphericity test, the results of both being fully satisfactory. According to [32], if the KMO value were below 0.6, it would be considered inappropriate (and not at all relevant) to carry out a factorial analysis; given that in this research the KMO value is 0.926, it should be regarded as fully reliable and acceptable, making an analysis of this type entirely advisable. Secondly, Bartlett's sphericity test allows us to test the null hypothesis that the correlations between the items are null [33], so a significant result (p < 0.05) is satisfactory. Since the resulting value in this research was p = 0.000, this type of analysis is fully pertinent in this study (the null hypothesis of the sphericity of the data can be rejected).
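Both suitability checks can be computed directly from a correlation matrix. A minimal sketch with synthetic data (the real item responses are not available), using the standard formulas for the KMO index and Bartlett's chi-square statistic:

```python
import numpy as np
from scipy import stats

def kmo(R):
    """Kaiser-Meyer-Olkin measure of sampling adequacy for correlation matrix R."""
    inv = np.linalg.inv(R)
    d = np.sqrt(np.diag(inv))
    partial = -inv / np.outer(d, d)  # anti-image (partial) correlations
    off = ~np.eye(len(R), dtype=bool)
    r2 = np.sum(R[off] ** 2)
    p2 = np.sum(partial[off] ** 2)
    return r2 / (r2 + p2)

def bartlett_sphericity(R, n):
    """Bartlett's test of the null hypothesis that R is an identity matrix."""
    k = len(R)
    chi2 = -(n - 1 - (2 * k + 5) / 6) * np.log(np.linalg.det(R))
    dof = k * (k - 1) / 2
    return chi2, stats.chi2.sf(chi2, dof)

# Synthetic items sharing one strong common factor: KMO should be well
# above 0.6 and Bartlett's test clearly significant.
rng = np.random.default_rng(1)
common = rng.standard_normal((1422, 1))
X = common + 0.4 * rng.standard_normal((1422, 8))
R = np.corrcoef(X, rowvar=False)
chi2, p = bartlett_sphericity(R, 1422)
print(round(kmo(R), 3), p < 0.05)
```

A KMO near 1 means the partial correlations are small relative to the raw correlations, exactly the situation in which a factorial solution is appropriate.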
Finally, in order to facilitate the interpretation of the factorial solution, a rotation was applied using the Varimax method which, as we know, is a type of orthogonal rotation (it considers the factors independent of each other) that minimizes the number of variables with high saturations in each factor. In this way, the interpretation of the factors is greatly facilitated, since the linked items make up a smaller number of factors (thus simplifying the naming of each factor). For this research, the variance explained is 83.113% of the total variance, as can be seen in Table 2, which includes all the factors whose eigenvalues are greater than 1.
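The Varimax rotation itself is an orthogonal transformation of the loading matrix and can be sketched with the classic SVD-based iteration (Kaiser's method); the loading matrix below is invented for illustration, with unrotated loadings spread across both factors:

```python
import numpy as np

def varimax(L, tol=1e-10, max_iter=500):
    """Rotate loading matrix L (variables x factors) by the varimax criterion."""
    p, k = L.shape
    R = np.eye(k)
    crit = 0.0
    for _ in range(max_iter):
        Lr = L @ R
        # Gradient of the varimax criterion, solved via SVD (Kaiser's method).
        B = L.T @ (Lr ** 3 - Lr * (Lr ** 2).sum(axis=0) / p)
        u, s, vt = np.linalg.svd(B)
        R = u @ vt
        if s.sum() - crit < tol:
            break
        crit = s.sum()
    return L @ R

# Invented unrotated loadings: every variable saturates on both factors.
L = np.array([[0.80, 0.60],
              [0.75, 0.65],
              [0.70, -0.70],
              [0.65, -0.75]])
Lr = varimax(L)

# After rotation each variable loads mainly on a single factor, while the
# communalities (row sums of squared loadings) are unchanged.
print(np.round(Lr, 2))
```

Because the rotation is orthogonal, it changes which factor each variable saturates on without changing the total variance each variable shares with the factor solution.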
In this sense, the first factor (1), called "Perception of the application of the acquisition of key competences in life in society", is made up of seven items which indicate the relationship between the key competences and their application in the student's real life (it explains 58.1% of the total variance). The second factor (2), called "Perception of competences in terms of learning", is made up of another seven items which express the relationship between the competences acquired and the knowledge learned (explaining 24.9% of the total variance). The third factor (3) is made up of five items that explain 4.9% of the total variance and can be interpreted as the degree of difficulty involved in the assimilation of the competences. The fourth factor (4) is made up of two items which explain 3.3% of the total variance and is interpreted as the perception of the methodology used by the teaching staff in order to acquire the competences. The fifth factor (5) is made up of three items that explain 3.2% of the total variance and defines the perception of the importance of mathematical competence in the social sciences. A sixth factor (6), made up of two items, explains 2.9% of the total variance and represents the students' perception of the transfer of what they have learned to a real situation. The seventh and last factor is interpreted as the perception of the assessment instruments used to evaluate the degree of achievement of competences and is made up of two items which explain 2.3% of the total variance. After analyzing the matrix of residuals of the reproduced correlations for each of the seven factors studied, only 5% of the residuals reached absolute values greater than 0.05, so the fit of the model can be considered good, explaining 83.1% of the total variance.
Along the same lines, the internal consistency coefficient (Cronbach's alpha) was calculated in order to estimate the reliability and internal consistency between items; this index showed a value of 0.914 for the scale, indicating high quality (recall that this index reaches a maximum value of 1, so the closer it is to this value, the greater the reliability of the instrument). In this sense, and according to more or less tacit agreement among authors [14,15,16,17,18,19,20,21,22,23,24], alpha values higher than 0.7 or 0.8 (depending on the source) are considered sufficient to guarantee the reliability of the scale. From this point of view, the EPECOCISO scale has a factorial structure that is fully adequate for evaluating how students in the fourth year of Obligatory Secondary Education perceive the competences linked to the area of Social Sciences. In order to check the possible relationships between the different factors, Pearson correlations were computed.
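Cronbach's alpha follows directly from the item variances and the variance of the total score. A minimal sketch on simulated Likert-style responses (the real data are not available); the simulated items share a common trait, so alpha lands in the acceptable range discussed above:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) matrix of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Simulated 1-5 Likert responses driven by one common trait plus noise.
rng = np.random.default_rng(2)
trait = rng.standard_normal((1422, 1))
scores = np.clip(np.rint(3 + trait + 0.8 * rng.standard_normal((1422, 7))), 1, 5)
alpha = cronbach_alpha(scores)
print(round(alpha, 3))
```

Alpha rises both with the number of items and with their average inter-item correlation, which is why per-factor values are reported alongside the overall 0.914.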
In Table 3, we have highlighted all those correlations that, being statistically significant, express positive or negative relationships between the factors. In this sense, positive correlations were detected between factor 1 (F1) and factors F2, F4, F5, F6, and F7 (which means that the more F1 increases, the more factors F2, F4, F5, F6, and F7 also increase). Along with these, positive correlations were also found between different factors, of which the following are worth highlighting: the relationship found between factor 2 (F2) with factors F1, F4, F5, F6, and F7; the relationship between factor F4 with F1, F2, F5, F6, and F7; between F5 with F1, F2, F4, F6, and F7; F6 with F1, F2, F4, F5 and F7; and finally, F7 with F1, F2, F4, F5, and F6.
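The pattern reported in Table 3 (significant positive pairwise correlations among F1, F2, F4, F5, F6, and F7) can be screened programmatically. A sketch with hypothetical factor scores in which one factor partly drives the rest; the coefficients here are invented for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
f1 = rng.standard_normal(1422)
factors = {"F1": f1}
for name in ("F2", "F4", "F5", "F6", "F7"):
    # Hypothetical scores: each factor shares some variance with F1.
    factors[name] = 0.5 * f1 + rng.standard_normal(1422)

names = list(factors)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        r, p = stats.pearsonr(factors[a], factors[b])
        if p < 0.05 and r > 0:  # keep significant positive pairs only
            print(f"{a}-{b}: r = {r:.2f}")
```

With n = 1422, even modest correlations reach significance, which is worth bearing in mind when interpreting the many significant pairs in Table 3.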

4. Discussion

As we observed in the previous section, and in response to the first research question, the instrument presents a high degree of reliability and validity. As for the second research question, given the variety and richness of the data available for analysis, it was considered appropriate to apply cross tabulations as descriptive statistics. This technique allows analysis of the direction in which the key informants' responses to the items within each of the factors specified above are polarized.
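One way to formalize such a cross-table reading is a chi-square test of independence on the contingency table of two items. The table below is invented for illustration (respondents cross-classified by agreement level on a methodology item and a difficulty item):

```python
import numpy as np
from scipy import stats

# Invented 3x3 cross table: rows = agreement with a methodology item,
# columns = agreement with a difficulty item (disagree/neutral/agree).
table = np.array([[120,  60,  30],
                  [ 70, 150,  80],
                  [ 25,  90, 160]])

chi2, p, dof, expected = stats.chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, significant: {p < 0.05}")
```

A significant result indicates that the two response distributions are associated, i.e., that answers on the two items polarize together rather than independently.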
In this line, the existence of a positive relationship between factors F3 and F4 stands out, insofar as an adequate methodology for the teaching of competences reduces the difficulty of assimilating them, making this process easier for students to perceive than when working with a traditional methodology. Factor F3 was also correlated with F7, showing that the assimilation of competences is easy for students as long as the assessment instruments proposed in the questionnaire are used (summaries, maps, etc.). Similarly, correlating factor F4 with factor F7 shows that students who are satisfied with the assessment instruments are also satisfied with the methodology used by the teaching staff, revealing a close relationship between examination and methodology. A correlation between factor F2 and factor F6 highlights concordant values between the two, insofar as, for the students, what they have learned in the subject allows them to acquire a series of competences, while working by competences facilitates the transfer of learning to situations in their daily lives. Something similar happens when correlating F2 with F4: a degree of positive agreement in both factors indicates that an adequate methodology (such as that suggested through the designed questionnaire) helps students acquire the competences through the contents worked on in the classroom.
To conclude, it was also considered pertinent to correlate factors F4 and F6, extracting a positive relationship in terms of the degree of agreement: participants who favor the methodology suggested in the questionnaire (use of images, commentary on texts, documentaries) also consider that the transfer of what has been learned by those means to a real, significant situation would be greater; that is to say, an adequate methodology would help students better transfer the knowledge and competences acquired to everyday situations.
In addition to the analysis extracted from these cross tables (referenced ut supra), it is also worth making other considerations that give this study greater scope (given the power of its instrument). Following the development of the research process, it becomes clear that the first of the factors configured to explain the proposed scale (F1) incorporates a series of indicators expressing the correspondence between the key competences and their transfer to the student's real life: the greater the students' perception of these skills, the greater their assessment of their own preparation to undertake actions related to them (knowing how to calculate costs, design itineraries for third parties, create presentations, identify heritage elements on leisure trips, etc.). The items which make up this factor explain the highest accumulated percentage of the total variance and show to what extent the perception of competences is linked to its most applied facet. It is therefore a factor that confirms the importance, already detected in the specialized literature [34], of the contribution that the area of Social Sciences makes to the development of key competences in secondary education. The second of the factors extracted (F2) groups together a series of indicators referring to the perception that students have of the competences according to what they have learned during the course. A whole series of items are included here that capture the students' perception of the key competences and the contribution that everything dealt with (what has been learned) in the area of Social Sciences has made to their development, highlighting the marked transversal nature of this area.
In this sense, it expresses the value that what has been learned in Social Sciences, Geography, and History has for students when it comes to expressing their emotions (in a way that others can understand), forging a social commitment to the environment that surrounds them, developing technological skills, awakening the initiative to undertake social participation, better understanding artistic productions, or developing the much-needed critical sense implicit in the construction of learning itself.
According to this study, and in line with other authors, the development of this type of competence represents a decisive advance in reducing students' uncertainty when taking standardized tests (generally exams) [35,36], so working on competences of this nature is recommended. Closely related to the previous and subsequent factors is F3: depending on the students' perception, the possibility of transfer, the methodology used, and the way the competences are assessed, students will show greater or lesser difficulty in acquiring them.
Another factor (F4) refers to the methodology used by the teaching team and shows that improvements in written expression, in the comprehension of the texts students read, and in the interpretation of graphs result from the use of adequate methodologies that, in the end, facilitate the calculation, interpretation, or synthesis of information; prominent among such methodologies are the writing of summaries and the use of graphs for analysis. Another of the factors analyzed (F5) refers to the importance that students give to mathematical competence within the Social Sciences: a positive association was verified among the three items that analyze the relevance of that competence. Mathematical knowledge thus emerges as vital support for solving the problems that arise in everyday life, and is granted outstanding value for adequately understanding certain concepts related to the (nearby) economy. The sixth factor (F6) values the knowledge learned and its transfer to a real situation (applicable to other knowledge or close to the student's daily life). It is directly related to the second factor (F2), insofar as the competences enable students to put historical, geographical, cultural, or artistic concepts into practice in real-life situations relevant to the conservation of the environment and natural resources, as well as to forge their own criteria about them, together with the importance of their contribution as citizens to that conservation. Finally, the seventh factor (F7) focuses exclusively on assessment instruments (including objective tests), evaluating the students' perception of the practice and results obtained in the subject's assessments according to the type of test (multiple-choice or essay).
These instruments prove to be valuable measures of the skills and knowledge acquired in the subject in relation to history and its evolution, as well as of the students' critical analysis of historical facts and their impact on societies throughout history.

5. Conclusions

Following the recommendations of the European Parliament and the Council (Recommendation 2006/962/EC of 18 December 2006 on key competences for lifelong learning), as well as the OECD's 2019 Skills Strategy, competences are defined as "a combination of knowledge, skills, and attitudes appropriate to the context; [...] they are those which all individuals need for their personal fulfillment and development, as well as for active citizenship, social inclusion and employment" [37].
Beyond the OECD, several authors and research works establish competences as a key factor in education [38,39]. Given their importance, this study delimited a set of categories (factors) in order to relate them to what has been learned in the field of Social Sciences. On this basis (and as has been shown), each factor relates, directly or indirectly, to the students' perception of the acquisition of these competences through knowledge, facts, or situations that occur in the classroom. Thus, the first factor (F1), for example, shows how students perceive the acquisition of skills for their application in society; that is, whether these skills are valid for decision-making, for travel, or even for preparing a curriculum vitae that allows them to undertake an active search for employment. Likewise, the correlation analysis between factors F3 and F7 has shown a positive relationship with the development of assessment by competences; however, we suspect that, in practice, such assessment has completely forgotten the subject to whom all possible options for improvement are directed: the student. For this reason, giving students a voice (collecting their perception) becomes a task inseparable from the authentic assessment discussed in the specialized literature [40], and we contribute to that empowerment with this work as much as with others already developed along this line. Finally, in view of the data obtained in this research, we are in a position to state that the instrument is appropriate and relevant for use within the field of Social Sciences, and we invite the scientific community that wishes to expand this line of research to apply it.
For our part, based on the information provided by this phase of our study (complete in its own meaning, content, and structure), and with the instrument validated here meeting the objectives of the research, our next aim will be to design proposals for action that could facilitate teachers' understanding of the teaching and learning process by incorporating competences as a navigation chart (always from the perspective proposed in this study).
For all the above reasons, it can be concluded that this questionnaire is a valid tool for measuring the transfer of the competences acquired by students in particular and of learning in general. Similarly, it can be used to evaluate methodological and assessment processes and to adapt them more closely to the individual reality of the student. One limitation, however, is that the questionnaire is focused on the field of Social Sciences and its methodologies, so this must be taken into account if it is used to assess the teaching-learning process.

Author Contributions

Conceptualization, J.-M.Á.-M.-I. and J.M.-S.; Data curation, J.-M.Á.-M.-I.; Formal analysis, J.-M.Á.-M.-I. and J.M.-S.; Funding acquisition, F.-J.T.-C. and P.M.-M.; Investigation, F.-J.T.-C., J.-M.Á.-M.-I., and J.M.-S.; Methodology, F.-J.T.-C. and P.M.-M.; Project administration, P.M.-M.; Resources, F.-J.T.-C.; Writing—review and editing, J.-M.Á.-M.-I., P.M.-M., and J.M.-S. All authors have read and agreed to the published version of the manuscript.

Funding

This research has been financed by the projects "The evaluation of competencies and the development of cognitive abilities on history in Obligatory Secondary Education" (EDU2015-65621-C3-2-R), subsidized by the Spanish Ministry of Economy and Competitiveness and co-financed with EU ERDF funds, and "Methodological concepts and active learning methods to improve teachers' teaching skills" (PGC2018-094491-B-C33), subsidized by the Ministry of Science, Innovation and Universities and co-financed with EU ERDF funds.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

QUESTIONNAIRE FOR FOURTH YEAR STUDENTS
"Perception of assessment in Social Sciences, Geography and History, and its relation to the development of competences (EPECOCISO Scale)"
Students’ perception of basic competences according to what they have learned in the area of Social Sciences, Geography, and History
  • After studying Social Sciences, I find the development of competence in linguistic communication very useful, as with it I will be able, among other things, to use language to express my emotions, experiences, and opinions, so that others can understand me.
  • I believe that mathematical competence is necessary, as I will know how to apply mathematics in order to be able to solve problems related to the Social Sciences.
  • I consider that the training received in Social Sciences helps me to realize the importance of competence in the knowledge of and interaction with the physical world, as it allows me to be committed to the environment around me and to society.
  • What I have learned at the institute, in Social Sciences, makes me see how essential competence is in the treatment of information and digital competence, since it allows me to develop skills and abilities to use more and/or better information and communication technologies (ICT).
  • I consider the contents worked on in Social Sciences for the development of social and citizen competence to be very valuable, as they have enabled me to be participative at a social level (taking part in elections, creating associations, etc.).
  • According to what I have learned in Social Sciences, I think that cultural and artistic competence is very interesting, as it makes it easier for me to express myself, communicate, and perceive and understand different realities and productions in the world of art and culture.
  • What I learn in Social Sciences helps me to understand how fundamental competence is for learning to learn, since with it I am aware of what I know and what I need to learn in order to build my own knowledge.
  • Thanks to the subject of Social Sciences, Geography, and History I have managed to appreciate the benefit of developing the competence of autonomy and personal initiative since it facilitates having one’s own initiative to imagine and carry out something important and with a critical sense.
Students’ perception of the degree of difficulty they have in assimilating the basic skills
9. In relation to what I learned in the Social Sciences course, it would be very easy for me to write and present a formal complaint in an establishment, to file a complaint, or to present an instance in a body such as a Town Hall.
10. It is very easy for me to apply the mathematics that I have learned in school when it comes to properly interpreting and deducing information on economics or some economic graphs, data related to the Social Sciences.
11. Based on what I learned in the Social Sciences course, I find it difficult to locate existing industries in an area or region, to indicate which rivers I would cross if I made an exit or a journey, or to draw up a route taking into account the possible obstacles that we may encounter due to human constructions.
12. As a result of my work in Social Sciences at the Institute, when I read a document, it is very difficult for me to know how to differentiate the really important information from that which is only complementary or filling in.
13. Knowing how to express my own ideas and listen to the ideas of others, respecting them even if they are different from my own, with the aim of dialogue and reaching agreements to resolve conflicts, is something that would be very difficult for me depending on what I have learned in the Social Sciences subject.
14. It is very difficult for me to know how or with what criteria I have to evaluate a creation to know if it is a work of art or something without value, using what we have studied in Social Sciences.
15. I think that learning which strategies to use or which resources to resort to in order to know my rights and obligations as a citizen is easy, based on what I have learned in Social Sciences up to now.
16. With what I have learned in Social Sciences, it is normally difficult for me and generates a lot of insecurity in trying to face or assume the problems that happen to me on a daily basis.
Students’ perception of the instruments used to assess the degree of achievement of basic competences
17. I have found that the summaries I make in the subject of Social Sciences are a good strategy for me to improve my written expression and understanding of the texts I read.
18. In the Social Science subject, the tables or graphs that I make are a good strategy to calculate, interpret, or deduce data.
19. Watching videos or documentaries and making a later commentary in Social Sciences helps develop my critical spirit and modify my consumption habits.
20. Having to make or interpret maps in Social Sciences is a good strategy to assess my ability to obtain, analyze, and synthesize different types of information.
21. I believe that taking an exam with short or multiple choice questions in the social sciences is a good strategy to evaluate my knowledge and critical analysis of societies, their evolution, and changes in them.
22. I consider that the way I am evaluated in the subject of Social Sciences (exams, tasks, activities...) does not allow me to demonstrate my knowledge of different cultural and artistic manifestations, nor my attitude towards heritage and cultural life.
23. I consider that taking an exam with developmental questions in the subject of Social Sciences is a good strategy to evaluate my knowledge and skills.
24. I consider that taking an oral exam (or an oral presentation) in Social Sciences, is a good strategy to be able
Students’ perception of the transfer of what they have learned in Social Sciences to a real situation
25. What I have learned in Social Sciences is useful to me when I have to participate in debates and be able to express myself in public.
26. Studying Social Sciences I have been able to see that some of the things I have learned in Mathematics have usefulness and I can apply them in my daily life.
27. The historical, geographical, cultural, or artistic knowledge that I have learned in the subject of Social Sciences seems to me essential to develop my own criteria and to better understand how I should help to care for the environment and use natural resources in a responsible way.
28. The search for and use of information through digital media and the knowledge that I have acquired in the subject of Social Sciences about history, culture, art or geography, seem to me very useful when it comes to obtaining information and data that I can use in other areas of my life.
29. My attitude of respect and help to others, or my acceptance and understanding of people who have different ideas from mine (or who have a different culture or religion), has improved because I have studied the subject of Social Sciences.
30. What I have learned in Social Sciences helps me to appreciate and value the cultural and historical heritage of the Region of Murcia (archaeological remains, museums, etc.).
31. I think that what I have learned in Social Sciences can be useful to relate it to other knowledge I already have, or serve as a basis to learn new things or to learn them in another way.
32. On many occasions, what I have learned in Social Sciences has helped me to have my own criteria when deciding on a situation in my life, assuming advantages and risks.
33. What I have learned in Social Sciences, Geography and History could be useful to write a good letter of introduction, to elaborate my curriculum vitae, and also to answer adequately the questions asked in the interview.
34. I think that having learned how to interpret maps in the Social Sciences subject, I could use it to calculate the economic cost of the trip, to choose the most appropriate means of transport according to the distance, to adjust the trips to the number of passengers, to draw up the visits, to calculate the price of fuel, etc.
35. I believe that, with what I have learned in the Social Sciences course, I am capable of designing and organizing a trip for some clients, taking into account the possible routes, human customs, and elements of the landscape that they would cover.
36. I believe that the knowledge I have acquired in Social Sciences enables me to create a video (or slide show) that is both motivating and attractive, and that allows future clients to get to know different aspects of the cities to be visited so that they can be convinced and sign up for the trip.
37. I have difficulties in resolving conflicts that may arise due to the different particular interests that may exist within a group, so, with what I have learned in Social Sciences, I believe that I am able to dialogue with future travelers with a constructive attitude and reach democratic agreements, without having to prepare myself for this.
38. With what I have learned in Social Sciences I believe that I have acquired sufficient skills, abilities, knowledge, or capacities to know what would be the heritage, cultural or leisure elements that I should include in the design of a trip.
39. What I have worked on up to now in Social Sciences has prepared me sufficiently to be able to apply for this job call and do everything specified in it.
40. Assuming that the company already has a model information dossier for group trips, I believe that with what I have learned in Social Sciences I will be able to use that dossier to create a new one, updating it, improving it, and incorporating the data I have obtained; all this through the planning and implementation of a new project that adapts to the needs of the travelers.

References

  1. Gómez, C.J.; Miralles, P. Historical skills in compulsory education: Assessment, inquiry based strategies and students argumentation. NAER J. 2016, 5, 130–136. [Google Scholar]
  2. Villardón, L. Evaluación del aprendizaje para promover el desarrollo de competencias. Educatio Siglo XXI 2006, 24, 57–76. [Google Scholar]
  3. Castro, M. ¿Qué sabemos de la medida de las competencias? Características y problemas psicométricos en la evaluación de competencias. Bordón 2011, 63, 109–123. [Google Scholar]
  4. López Facal, R. Competencias y Enseñanza de las Ciencias Sociales. Íber 2013, 74, 5–8. [Google Scholar]
  5. Fordham, M. Tradition, authority and disciplinary practice in history education. Educ. Philos. Theory 2017, 49, 631–642. [Google Scholar] [CrossRef]
  6. Bingham, T.; Conner, M. The New Social Learning: A Guide to Transforming Organizations through Social Media; American Society for Training & Development: San Francisco, CA, USA, 2010. [Google Scholar]
  7. López, R.; Miralles, P.; Prats, J.; Gómez, C.J. Pensamiento histórico, enseñanza de la historia y competencias educativas. In Educación Histórica y Desarrollo de Competencias; López, R., Miralles, P., Prats, J., Gómez, C.J., Eds.; Graó: Barcelona, Spain, 2017; pp. 7–22. [Google Scholar]
  8. González, E. El desconocimiento de la Historia. Canarias7. 2017. Available online: https://www.canarias7.es/opinion/firmas/el-desconocimiento-de-la-historia-JJ2803391 (accessed on 16 September 2020).
  9. Thorp, R.; Persson, A. On historical thinking and the history educational challenge. Educ. Philos. Theory 2020, 52, 891–901. [Google Scholar] [CrossRef] [Green Version]
  10. Bertram, C. Doing history?: Assessment in history classrooms at a time of curriculum reform. J. Educ. 2018, 45, 155–177. [Google Scholar]
  11. Gómez, C.J.; Miralles, P. Los contenidos de ciencias sociales y las capacidades cognitivas en los exámenes de tercer ciclo de Educación Primaria ¿Una evaluación en competencias? Revista Complutense de Educación 2013, 24, 91–121. [Google Scholar]
  12. Páez, D.; Bobowik, M.; Liu, J. Social representations of the past and competences in history education. In Palgrave Handbook of Research in Historical Culture and Education; Carretero, M., Berger, S., Grever, M., Eds.; Palgrave Macmillan: London, UK, 2017; pp. 491–510. [Google Scholar]
  13. Dalton-Puffer, C.; Bauer-Marschallinger, S. Cognitive Discourse Functions meet Historical Competences: Towards an integrated pedagogy in CLIL history education. J. Immers. Content Based Lang. Educ. 2019, 7, 30–60. [Google Scholar] [CrossRef]
  14. Gómez, C.J.; Miralles, P. ¿Pensar históricamente o memorizar el pasado? La evaluación de los contenidos históricos en la educación obligatoria en España. Revista de Estudios Sociales 2015, 52, 52–68. [Google Scholar] [CrossRef]
  15. Gómez, C.J.; Monteagudo, J.; López Facal, R. El examen y la evaluación de los contenidos de ciencias sociales en tercer ciclo de Educación Primaria. Capacidades, conceptos y procedimientos. Revista Electrónica Interuniversitaria de Formación del Profesorado 2012, 40, 37–49. [Google Scholar]
  16. Rodríguez, E.A. Hacia una evaluación auténtica de la lectura de obras literarias en estudiantes de enseñanza media. Revista Electrónica Diálogos Educativos 2017, 10, 2–13. [Google Scholar]
  17. Gómez, C.J.; Miralles, P. Los Espejos de Clío. Usos y Abusos de la Historia en el Ámbito Escolar; Sílex: Madrid, Spain, 2017; pp. 10–13. [Google Scholar]
  18. Tiana, A. Análisis de las competencias básicas como núcleo curricular en la educación obligatoria española. Bordón. Revista de Pedagogía 2011, 63, 63–75. [Google Scholar]
  19. Cárdenas, J.A.; Suárez, M.I. Evaluación auténtica: Una alternativa para posibilitar la comprensión del aprendizaje en el aula. Magazín Aula Urbana 2018, 111, 1–8. [Google Scholar]
  20. Morris, R.V. Drama and authentic assessment in a social studies classroom. Soc. Stud. 2001, 92, 41–44. [Google Scholar] [CrossRef]
  21. Trillo, F. Competencias docentes y evaluación auténtica: ¿falla el protagonista? Revista Perspectiva Educacional 2005, 45, 85–103. [Google Scholar]
  22. Alfageme, M.B.; Miralles, P. Instrumentos de evaluación para centrar nuestra enseñanza en el aprendizaje de los estudiantes. Íber 2009, 60, 8–20. [Google Scholar]
  23. Virta, A. New and persisting challenges for history, social studies and citizenship education in Finnish compulsory education-highlights from recent assessments and research. Revista Latinoamericana de Estudios Educativos 2014, 10, 49–66. [Google Scholar]
  24. Gallati, F. Research, Projects and Experiences in Didactics of History and Heritage from the Dipast Center of the University of Bologna, Italy. In Handbook of Research on Citizenship and Heritage Education; IGI Global: Hershey, PA, USA, 2020. [Google Scholar]
  25. Pérez, C. Métodos Estadísticos Avanzados con SPSS; Thomson: Madrid, Spain, 2005; pp. 1–5. [Google Scholar]
  26. Vallejo, P.M. Estadística Aplicada a las Ciencias Sociales; Universidad Pontificia Comillas: Madrid, Spain, 2008. [Google Scholar]
  27. Hernández, R.; Fernández, C.; Baptista, P. Metodología de la Investigación, 2nd ed.; McGraw Hill: Mexico City, Mexico, 2008; pp. 22–26. [Google Scholar]
  28. Malhotra, N. Pesquisa de Marketing: Uma Orientação Aplicada, 3rd ed.; Bookman: Porto Alegre, Brazil, 2009; pp. 34–39. [Google Scholar]
  29. Hastad, D.N.; Lacy, A.C. Measurement and Evaluation in Physical Education and Exercise Science; Allyn and Bacon: Boston, MA, USA, 2001. [Google Scholar]
  30. Hair, J.F., Jr.; Black, W.C.; Babin, B.J.; Anderson, R.E. Multivariate Data Analysis, 7th ed.; Prentice Hall: Upper Saddle River, NJ, USA, 2010. [Google Scholar]
  31. Molina Saorín, J.; Miralles Martínez, P.; Trigueros Cano, F.J. La evaluación en ciencias sociales, geografía e historia: Percepción del alumnado tras la aplicación de la escala EPEGEHI-1. Educación XX1 2014, 17, 289–311. [Google Scholar] [CrossRef] [Green Version]
  32. Stufflebeam, D.L.; Shinkfield, A.J. Evaluation Theory, Models, and Applications; Jossey Bass: San Francisco, CA, USA, 2007. [Google Scholar]
  33. Bricklin, B.; Bricklin, M. Causas Psicológicas del Bajo Rendimiento Escolar; Pax Mexico: México D.F., Mexico, 1988; pp. 1–5. [Google Scholar]
  34. Estévez, E.H. Nuevas ideas sobre el aprendizaje. México Revista de Investigación y Práctica 1996, 1, 42–47. [Google Scholar]
  35. Gómez, C.J.; Ortuño, J.; Molina, S. Aprender a pensar históricamente. Retos para la historia en el siglo XXI. Tempo e Argumento 2014, 6. [Google Scholar] [CrossRef]
  36. Nieto, T.F.; Pastor, V.L. Evaluación auténtica, coevaluación y uso de las TIC en educación física: Un estudio de caso en secundaria. Revista Infancia, Educación y Aprendizaje 2017, 3, 42–46. [Google Scholar] [CrossRef]
  37. OECD. Estrategia de Competencias de la OCDE 2019. Competencias para Construir un Futuro Mejor; OECD Publishing: Paris, France; Fundación Santillana: Madrid, Spain, 2019. [Google Scholar] [CrossRef]
  38. Frey, B.; Schmitt, V.; Allen, J. Defining authentic classroom Assessment. Pract. Assess. Res. Eval. 2012, 17, 1–18. [Google Scholar]
  39. Garzón Artacho, E.; Martínez, T.S.; Ortega Martín, J.L.; Marín Marín, J.A.; Gómez García, G. Teacher Training in Lifelong Learning—The Importance of Digital Competence in the Encouragement of Teaching Innovation. Sustainability 2020, 12, 2852. [Google Scholar] [CrossRef] [Green Version]
  40. Ruiz, M.V.; Molina-Saorín, J.M. La evaluación auténtica de los procesos educativos. Revista Iberoamericana de Educación 2014, 64, 11–25. [Google Scholar] [CrossRef]
Figure 1. Distribution of research participants grouped by county.
Table 1. Descriptive statistics of the subscales.
| Scale | N | Mean | Std. Dev. |
|---|---|---|---|
| F1: Perception of the application of the acquisition of key competences in social life | 1422 | 3.3107 | 0.84607 |
| F2: Perception of the competences based on what they have learned | 1422 | 3.5080 | 0.77057 |
| F3: Perception of the degree of difficulty in assimilating the competences | 1422 | 2.5762 | 0.78134 |
| F4: Perception of the methodology used for the acquisition of competences | 1422 | 3.6558 | 1.02768 |
| F5: Perception of the importance of mathematical competence in the Social Sciences | 1422 | 3.0038 | 0.94818 |
| F6: Perception of the transfer of what has been learned to a real situation | 1422 | 3.6309 | 0.92828 |
| F7: Perception of the instruments used to assess the degree of achievement of the competences | 1422 | 3.5723 | 0.79740 |

Valid N = 1422.
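As a rough illustration of how the figures in Table 1 are produced, the sketch below averages each student's Likert answers into a subscale score and then reports the mean and sample standard deviation of those scores. The demo answers are fabricated for illustration, not the study's data.

```python
import numpy as np

def subscale_stats(responses):
    """Mean and sample standard deviation of per-student subscale scores.

    `responses` is a 2-D array: one row per student, one column per item
    of the subscale, answered on a 1-5 Likert scale. Each student's
    subscale score is the mean of their item answers.
    """
    scores = np.asarray(responses, dtype=float).mean(axis=1)
    # ddof=1 gives the sample standard deviation, as SPSS reports it
    return scores.mean(), scores.std(ddof=1)

# Fabricated answers from four students on a three-item subscale
demo = [[4, 5, 3], [3, 3, 4], [5, 4, 4], [2, 3, 3]]
mean, sd = subscale_stats(demo)
```

With 1422 rows of real answers per subscale, the same two numbers would fill one row of Table 1.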
Table 2. Total variance explained after rotation of the selected factors: rescaled matrix maximum likelihood extraction method.
| Factor | Total (Initial Eigenvalues) | % of Variance | Cumulative % | Total (Extraction Sums of Squared Loadings) | % of Variance | Cumulative % |
|---|---|---|---|---|---|---|
| 1 | 4.229 | 58.131 | 58.131 | 4.229 | 58.131 | 58.131 |
| 2 | 3.049 | 24.982 | 83.113 | 3.049 | 24.982 | 83.113 |
| 3 | 2.764 | 4.916 | 88.029 | | | |
| 4 | 2.657 | 3.385 | 91.414 | | | |
| 5 | 2.508 | 3.261 | 94.675 | | | |
| 6 | 2.419 | 2.981 | 97.656 | | | |
| 7 | 2.374 | 2.344 | 100.000 | | | |
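The percentage columns in Table 2 follow the usual factor-analysis arithmetic: each factor's share of variance is its eigenvalue divided by the total, and the cumulative column is a running sum. A minimal sketch with hypothetical eigenvalues (not the study's values):

```python
import numpy as np

def variance_explained(eigenvalues):
    """Percent of variance and cumulative percent for each extracted
    factor, given the eigenvalues of the (rescaled) matrix."""
    ev = np.asarray(eigenvalues, dtype=float)
    pct = 100.0 * ev / ev.sum()       # each factor's share of total variance
    return pct, np.cumsum(pct)        # cumulative % is a running sum

# Hypothetical four-factor toy example
pct, cum = variance_explained([2.0, 1.0, 0.6, 0.4])
```

The last cumulative value is always 100% when all factors are listed, as in the final row of Table 2.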
Table 3. Pearson’s correlation between factors.
| Factors | | Factor 1 | Factor 2 | Factor 3 | Factor 4 | Factor 5 | Factor 6 | Factor 7 |
|---|---|---|---|---|---|---|---|---|
| F1 | Pearson Cor. | 1 | 0.584 ** | −0.036 | 0.464 ** | 0.403 ** | 0.556 ** | 0.286 ** |
| | Sig. (2-tailed) | | 0.000 | 0.304 | 0.000 | 0.000 | 0.000 | 0.000 |
| | N | 1036 | 762 | 814 | 981 | 929 | 977 | 947 |
| F2 | Pearson Cor. | 0.584 ** | 1 | 0.027 | 0.406 ** | 0.416 ** | 0.564 ** | 0.294 ** |
| | Sig. (2-tailed) | 0.000 | | 0.443 | 0.000 | 0.000 | 0.000 | 0.000 |
| | N | 762 | 1030 | 802 | 961 | 908 | 958 | 937 |
| F3 | Pearson Cor. | −0.036 | 0.027 | 1 | −0.057 | 0.055 | −0.061 | 0.025 |
| | Sig. (2-tailed) | 0.304 | 0.443 | | 0.069 | 0.086 | 0.051 | 0.433 |
| | N | 814 | 802 | 1086 | 1016 | 966 | 1014 | 984 |
| F4 | Pearson Cor. | 0.464 ** | 0.406 ** | −0.057 | 1 | 0.352 ** | 0.388 ** | 0.282 ** |
| | Sig. (2-tailed) | 0.000 | 0.000 | 0.069 | | 0.000 | 0.000 | 0.000 |
| | N | 981 | 961 | 1016 | 1422 | 1219 | 1282 | 1265 |
| F5 | Pearson Cor. | 0.403 ** | 0.416 ** | 0.055 | 0.352 ** | 1 | 0.299 ** | 0.206 ** |
| | Sig. (2-tailed) | 0.000 | 0.000 | 0.086 | 0.000 | | 0.000 | 0.000 |
| | N | 929 | 908 | 966 | 1219 | 1300 | 1189 | 1164 |
| F6 | Pearson Cor. | 0.556 ** | 0.564 ** | −0.061 | 0.388 ** | 0.299 ** | 1 | 0.250 ** |
| | Sig. (2-tailed) | 0.000 | 0.000 | 0.051 | 0.000 | 0.000 | | 0.000 |
| | N | 977 | 958 | 1014 | 1282 | 1189 | 1394 | 1248 |
| F7 | Pearson Cor. | 0.286 ** | 0.294 ** | 0.025 | 0.282 ** | 0.206 ** | 0.250 ** | 1 |
| | Sig. (2-tailed) | 0.000 | 0.000 | 0.433 | 0.000 | 0.000 | 0.000 | |
| | N | 947 | 937 | 984 | 1265 | 1164 | 1248 | 1369 |
* p ≤ 0.10; ** p ≤ 0.05; *** p ≤ 0.01.
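The varying N in each cell of Table 3 indicates pairwise deletion: each correlation uses only the students who answered both subscales involved, so every pair of factors has its own sample size. A minimal sketch of that computation, with fabricated data:

```python
import numpy as np

def pairwise_pearson(x, y):
    """Pearson correlation with pairwise deletion: a student enters the
    computation only if both subscale scores are present (not NaN),
    which is why each cell of a correlation table reports its own N."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    ok = ~(np.isnan(x) | np.isnan(y))          # keep complete pairs only
    n = int(ok.sum())
    r = np.corrcoef(x[ok], y[ok])[0, 1]
    return r, n

# Fabricated subscale scores; the fifth student skipped subscale x
r, n = pairwise_pearson([1, 2, 3, 4, float("nan")], [2, 4, 6, 8, 1])
```

Significance would then be tested against the t distribution with n − 2 degrees of freedom (e.g. via `scipy.stats.pearsonr` on the retained pairs).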