Article

Design and Initial Validation of a Questionnaire on Prospective Teachers’ Perceptions of the Landscape

by Rubén Fernández Álvarez 1,* and José Fernández 2
1 Department of Geography, School of Education and Tourism, University of Salamanca, 05003 Ávila, Spain
2 Department of Geography, University of Salamanca, 37007 Salamanca, Spain
* Author to whom correspondence should be addressed.
Educ. Sci. 2021, 11(3), 112; https://doi.org/10.3390/educsci11030112
Submission received: 11 January 2021 / Revised: 23 February 2021 / Accepted: 5 March 2021 / Published: 9 March 2021
(This article belongs to the Special Issue An Educational Approach to Landscape)

Abstract: This research focuses on the design, construction, and validation of a questionnaire that seeks to analyse the perception of the landscape amongst undergraduates studying for a Degree in Primary School Teaching at Salamanca University. The process involved both qualitative and quantitative techniques to test the content's validity and the construct's reliability and suitability, through the participation of a panel of expert judges and a sample of 432 subjects. This was followed by an exploratory factor analysis (EFA) of the data provided by the cohort, which led to a study of the questionnaire's core characteristics, a reduction in its size, and the validation of its pertinence.

1. Introduction

Since the signing of the European Landscape Convention (ELC) of the Council of Europe on 20 October 2000, the landscape has had a regulatory document with an international scope [1] that provides it with legal certainty in those countries adhering to the treaty [2,3,4]. The ELC's articles provide, among other aspects, common rules to ensure that all the countries involved share the same point of departure regarding the treatment of the landscape [5]. A series of measures are proposed to enable countries to design and develop what is necessary for managing and organising the landscape [6,7] and for educating people accordingly [8]. In addition, the ELC seeks to foster a network of shared experiences across countries with a view to promoting those teaching and management techniques that are providing the best results [3,4]. Furthermore, the ELC ensures that each country retains the freedom to adapt its decision-making process to its own idiosyncrasies and characteristics [3], thereby allowing measures suited to social and territorial requirements to be proposed [3,7]. The ELC is designed to cover all matters involving the landscape [6]: it does not focus solely on the management and organisation of landscape resources or on territorial features with an impact upon a given landscape [7], as it also considers that landscape teaching and education from early schooling onwards will help to raise awareness of it [8]. At the same time, it proposes training professionals to acquire the right skills for managing and teaching about the landscape [8,9]. The latter considerations will be the focus of our attention here, namely, the landscape's importance in education and in teacher training, with the ELC as the international referent.
The landscape has traditionally featured in schooling as part of different subjects (art, history, literature, geography, etc.), in which it has been studied according to the overall perspective adopted in each specific case [5]. This interdisciplinary nature [10] favours its inclusion in the teaching–learning process and provides teachers with an instrument for applying a mainstream approach to territorial issues [11,12]. The landscape has also been a major component of the subject of geography because of both its integrating nature [10] and its classroom use according to two main approaches: the landscape as curricular content and the landscape as a didactic instrument [13]. This means that the teaching of geography takes the landscape into account when training the teachers responsible for early schooling in the form of primary education [14]. At this stage, the landscape is part of the syllabus that pupils are required to learn [13], and although it appears in a range of subjects, as noted earlier, it plays a key role in the social sciences, and especially in geography [14], due to the twin purpose it serves as both content and didactic instrument. Primary education in Spain is divided into six years, with a total of five core subjects: Social Sciences, Natural Sciences, Spanish Language and Literature, Maths, and a Foreign Language [15]. The Social Sciences, whose syllabus consists mainly of content involving geography and history, are therefore taught throughout the six years of primary education, which highlights the importance of suitably instructing teachers in landscape-related matters [14], including its interpretation and assessment. It is necessary to know how the landscape is perceived in order to value it as an integrated educational resource for teaching curricular content, values (conservation and respect for landscape diversity) [16,17], and cultural and identity issues [18].
The lived landscape provides nuances and singularities that differentiate it from the immediately surrounding areas [18,19,20]. Teachers must therefore be able to understand how this specific landscape is interpreted by their students, because the students' behaviour in relation to the landscape (conservation, feelings, etc.) will be conditioned by the way they perceive it [16,19,20,21]. Moreover, teachers will adapt the way they teach according to their own knowledge, perception, and sensitivities regarding the landscape [19]. It is likewise necessary to identify how the landscape is perceived, analysed, and interpreted [22] from the teacher's perspective. In addition, the landscape carries a significant symbolic charge associated with identity and emotions [20]. The teaching–learning process of the landscape has been evolving and adapting to different teaching methodologies and didactic instruments [19], conditioned by the landscape's subjectivity [20,22], a subjective character that influences the teaching process [20]. According to Zanato [20], the landscape has a variety of didactic characteristics that can be summarised in three functions: a hermeneutic function, a pragmatic function, and a social function. Zanato [20] also emphasises the possibilities the landscape offers from a pedagogical perspective: it is a didactic resource that facilitates the development of critical thinking [20]. When students compare landscapes, they can identify the cultural aspects underlying each one and learn habits of respect for different territorial identities [18]. In other words, the landscape must be identified as something more than a didactic instrument and more than curricular content; it must be treated as an element of identity and of people's comfort. The perception of the landscape by teachers and students, as well as its symbolic charge, will influence the way it is interpreted, taught, and learnt [20].
This informs the need to provide prospective teachers with sufficient knowledge and skills to conduct the teaching–learning process [23], so that their future primary pupils can achieve the subject's goals and the specific targets in this stage of schooling. Teacher training in Spain involves a Degree in Primary School Teaching, and this is the qualification upon which we shall be focusing our research, seeking to analyse the knowledge these prospective teachers have of the landscape. Furthermore, the teaching–learning processes involving the landscape need to adapt to new teaching methods and include, among other aspects, Information and Communication Technologies (ICTs) [24,25]. New instruments are appearing that support the teaching both of and with the landscape; for example, augmented reality (AR), virtual reality (VR), and 3D modelling. All these resources are now seen as useful instruments in the teaching process [24].
Faced with this new paradigm for teaching the landscape, there is a need to understand students' perception of it: whether they identify it as a didactic resource or instrument and as curricular content, which methods or techniques they consider most suitable for teaching it, and whether they look upon ICTs as a useful resource [23]. The survey technique is one way of addressing this problem [26]. Different types of instruments and procedures furnish us with information and enable descriptive research based on survey studies, including interviews and questionnaires [26,27]. This research used questionnaires as the data-gathering instruments, allowing a larger sample and information from a higher number of participants [27]. Education research often uses these kinds of tools for collecting information [26] and for obtaining, amongst other things, the perception that students and future teachers have of some aspect of their profession [26]. It is commonplace to encounter studies in the field of education that use a questionnaire for data-gathering purposes [26], as it provides a general snapshot of some specific aspect that the subjects in the sample need to master or understand as part of their training process or professional careers.
The aim here, therefore, is the design and validation of an ad hoc questionnaire using qualitative and quantitative analysis techniques [26] for creating a valid and reliable instrument for garnering information on the knowledge of students studying for their Degree in Primary School Teaching as regards the landscape (Cuestionario sobre Percepción del Paisaje—CPP) [Questionnaire on Landscape Perception]. This article has the following specific objectives regarding the CPP: analyse the validity of its content, study its construct validity, and confirm its reliability.

2. Materials and Methods

2.1. Participants: Panel of Experts and Sample

The instrument’s validation process (validity of content, validity of construct, and reliability) involved three kinds of participants. In the first case, the validation of content was performed by a panel of experts [28,29,30,31,32,33]. This provided us with more significant information on specific aspects of the content to be measured with the questionnaire and allowed us to analyse the consistency between the study’s objectives and those of the instrument [29,31,34,35], as well as the pertinence of the items [36]. Following the suggestion made by Escobar-Pérez and Cuervo-Martínez [34] (pp. 30–31), the experts’ judgements were implemented through the following steps: define the panel of experts’ remit (i.e., validate the content of the measuring instrument); select the experts taking part; explain the measuring instrument’s purpose; explain the dimensions and indicators measured by each of the questionnaire’s items; determine the weighting of the questionnaire’s dimensions; design the experts’ assessment scorecards; and draw conclusions from the assessment. Our research involved five experts [31,37,38,39,40] in geography, in the teaching of the landscape, and in working with questionnaires, three of whom were male and two female. Some scholars recommend between two and ten experts, indicating that this number will depend on their level of expertise in the subject [31,41]. The participating experts were selected by means of causal or deliberate non-probability sampling [42,43], as, according to García and Cabero; Merino-Barrero, Valero-Valenzuela, and Moreno-Murcia; and Singh [44,45,46], they had to meet a series of requirements related to their experience and backgrounds in a specific field of education and research.
Our case involved university lecturers in three knowledge areas (Social Sciences, Natural Sciences, and the Humanities) with a proven research record in landscape-related topics and more than fifteen years’ service in teaching, following the guidelines provided by Galán [47], who recommends a minimum term of service of five years within the field of knowledge [47].
Secondly, continuing with the content validation process, deliberate non-probability sampling [48] was used to select two students from each year and from each campus, providing a total of sixteen students (eight females and eight males). The following selection criteria were applied: parity between sexes and enrolment as a student for the Degree in Primary School Teaching at Salamanca University. This trial run was used to identify those aspects that could be confusing or prompt wrong interpretations in the drafting of the questionnaire’s items [43,48]. Besides analysing syntax, this application also enabled us to measure the questionnaire’s duration and check the wording of the instructions [26,48]. This meant that sixteen students with the same characteristics as those in the overall sample could assess the measuring instrument. None of them were included in the final sample, and they were pledged to secrecy so as not to interfere with the results. In this case, unlike the experts, the students undertaking the abridged trial run did not have a specific instrument for this purpose; instead, they answered only the CPP and simply added comments on the instructions and on any items they found confusing.
Finally, the analysis of the construct’s validity and the questionnaire’s reliability involved a sample made up of undergraduates studying for their Degree in Primary School Teaching at Salamanca University. Simple random sampling was applied to all the students enrolled in the degree’s four-year course at the university’s two campuses, in Ávila and Salamanca. Out of the overall number of students enrolled, 432 subjects answered the questionnaire, accounting for approximately 60% of the total (N = 715), whereby our sample exceeded the 200 subjects that some scholars consider to be the minimum sample size [49,50,51]. Of these, 172 students were enrolled at the Ávila campus (Escuela Universitaria de Educación y Turismo—University College for Education and Tourism) and 260 at the Faculty of Education in Salamanca. The sample was broken down into 65.74% females and 34.26% males, with an age range of 18–38, although 87.73% of the sample were 24 or under. In terms of the distribution by year, 28.24% (122) were first-years, 33.33% (144) in second, 27.55% (119) in third, and the remaining 10.88% (47) in fourth.

2.2. Instruments

The entire process involved the use of two measuring instruments: a questionnaire for measuring the expert judges’ rating of the items in the CPP and the main questionnaire considered in this research.
Firstly, the questionnaire compiled for gathering the expert judges’ ratings was based on the models proposed by Backhoff, Aguilar, and Larrazolo; García and Cabero; and Martín-Romera and Molina [36,43,44]. They informed the design of a questionnaire divided into two blocks (see Figure 1). The first block focused on identifying the expert and on gathering data related to their time of service in teaching and research. The second block consisted of a total of 70 items in which the judges had to use a score of 1 to 10 [48] to rate each item’s consistency, relevance, and clarity [44]. Furthermore, each item had a space set aside for comments so that the experts could add any remarks they considered necessary [43]. The aim was therefore to gather information that would allow us to conduct the statistical calculations for measuring the validity of the content and thus discard or modify those items that did not comply with minimum quality requirements.
The second instrument was the CPP, and it was designed to measure the students’ perception of the landscape as a concept, as a teaching tool and as educational content. It began with a brief outline of the study and of the researchers involved, as well as specifying how long it should take to fill in and providing basic instructions for doing so. This questionnaire was divided into two main blocks: the first one was used to gather sociodemographic data on the subjects taking part (see Figure 2); the second one, in turn, was used for actually exploring the students’ perception of the landscape.
The sociodemographic data were collated through twelve items, involving an open format (4) and a closed one (8) [48] in which the subjects provided information on the year they were studying, campus, sex, age, town/city, pre-university studies in geography, and their parents’ occupation and studies, amongst others.
The second part of the questionnaire initially involved 70 items scored on a five-point Likert scale, chosen to address the matter of continuity [51,52]: fully disagree (1); disagree (2); neither agree nor disagree (3); agree (4); fully agree (5) [53]. In this case, and according to McMillan and Schumacher [48], the decision was made to include an intermediate category (3) so that the subjects were not forced to select an option that they did not agree with. Based on a literature review and according to the theoretical underpinnings, the 70 variables were grouped into four dimensions, although they were randomly presented in the questionnaire [48]: Theoretical knowledge of the landscape and its features; Landscape and a sense of identity; Measures for managing and protecting the landscape; and Pedagogical expertise (teaching–learning the landscape). The variables were introduced through statements [53] to which the subjects indicated their degree of agreement or disagreement [53]. The dimension dealing with Theoretical knowledge of the landscape comprised 22 variables, all designed to identify the student’s potential understanding of the landscape as a theoretical concept and its component features [54]. This was based on a series of variables centred around the definition of the term landscape (as provided by the ELC), for subsequently identifying whether the subjects had any understanding of the features (units) that make up the landscape and their influence on its development or dynamism. The ELC’s definition was used because numerous scholars consider it to be both all-inclusive and broad enough to group together all the landscape’s features, whether anthropic or natural [6,55,56,57,58], and because it includes educational considerations by linking landscape and society through perception [5,59].
What is more, there was also a variable that focused exclusively on the ELC, thereby seeking to discover whether the students were familiar with this convention. The second dimension, Landscape and a sense of identity, involved variables for verifying whether the subjects considered there might be a sense of identity linked to the landscape [9,54,60,61,62,63,64], and regarding their experiences in it [65]. Accordingly, this group consisted of eight variables, all of which focused on aspects related to the subjective values the landscape may evoke, such as positive or negative feelings, moods, and territorial identity or belonging. The dimension called Measures for managing and protecting the landscape had ten variables for discovering whether the students had any notion of the measures adopted accordingly (of a natural or cultural nature), as well as their opinion on those aspects of the landscape that might be more significant when designing these management and protection measures [60,66,67]. The most numerous dimension was the one involving Pedagogical expertise (teaching–learning the landscape), which comprised 30 items overall. The aim here was to identify the students’ perception of the landscape in the teaching–learning process from the viewpoint of the landscape as a didactic instrument and as part of the syllabus, as well as single out those teaching methods considered more appropriate [13,62,67,68,69,70,71]. This group included teaching methods and techniques, curricular knowledge on the landscape as content in primary education, the use of the landscape as an instrument for teaching certain aspects of geography [72] (e.g., climate, plant life, relief, and land usages) and the adoption of specific approaches for teaching the landscape (e.g., fieldwork [72], photos, and textbook). The teaching methods considered included items focusing on constructivism [73,74,75,76] and the flipped classroom [77].

2.3. Procedure

2.3.1. Design and Content Validation

This study adopted two approaches: a descriptive one [48,78] and a mainstream one [27], using the survey technique [48] and involving an ad hoc questionnaire as the data-gathering instrument [27]. Based on the guidelines for designing a questionnaire proposed by Cohen, Manion, and Morrison [26] (p. 472), the process was organised into four stages (see Figure 3), arising from the need to design and validate a questionnaire on landscape perception amongst undergraduates studying for their degree in primary school teaching.
Stage 1 (see Figure 3) was devoted to the literature review and the selection of the measuring instrument. A thorough study of the literature [26] revealed theoretical aspects on the landscape and its teaching. This process brought us up to speed with the state-of-the-art and enabled us to study the different data-gathering mechanisms used [26]. At the same time, this thorough review enabled us, according to the guidelines provided by Martín-Romera and Molina [43] (p. 198), to define the theoretical variables, dimensions and indicators corresponding to the research goals proposed. Thus, now in stage 2 (see Figure 3), we began the operationalization process that led to the conversion of the theoretical aspects into items on the questionnaire [26,43]. This gave way to the questionnaire’s initial design [26,43], identifying those sociodemographic details needed to obtain and draft the items considering the theoretical contributions obtained during the literature review [43].
Stage 3 (see Figure 3) covered the validation of the questionnaire’s content and its final design. This involved calling upon experts to give their opinion [26,27,28,29,30,31,32,33] and a brief trial run involving a small group of students with similar characteristics to the subjects in the sample [43,48]. The experts submitted their opinions in September 2019 by completing a questionnaire in which they assessed the coherence, relevance, and clarity of the items, using a score from 1 to 10 to rate each one. The judges’ scores were analysed using Aiken’s V content validity coefficient [79]. This involved using the critical value V0 = 0.70 [80]. Aiken’s V coefficient was calculated using the formula adapted by Penfield and Giacobbi [81]:
$$V = \frac{\bar{X} - l}{k}$$
where $\bar{X}$ = mean of the judges’ ratings; $l$ = lowest possible rating (1); $k$ = range of possible ratings (highest minus lowest, i.e., 10 − 1 = 9). Aiken’s V reports the judges’ rating of each item.
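To make the calculation concrete, Aiken's V for a single item can be computed directly from the judges' scores. The following sketch is an illustrative plain-Python implementation (not the authors' actual code, which used Microsoft Excel), with this study's rating bounds as defaults:

```python
def aikens_v(ratings, lowest=1, highest=10):
    """Aiken's V for one item: (mean rating - lowest) / (highest - lowest).

    Returns a value in [0, 1]; items scoring below the critical value
    (0.70 in this study) would be discarded.
    """
    mean = sum(ratings) / len(ratings)
    return (mean - lowest) / (highest - lowest)
```

For example, five judges rating an item 8, 9, 7, 8, 8 give a mean of 8, so V = (8 − 1)/9 ≈ 0.78, above the 0.70 cut-off.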
The critical value was used to discard those items [82] whose score was below or outside the confidence interval set by the Wilson score method [83], whereby the lower and upper thresholds were calculated according to the following equations:
$$L = \frac{2nkV + z^2 - z\sqrt{4nkV(1-V) + z^2}}{2(nk + z^2)}$$
$$U = \frac{2nkV + z^2 + z\sqrt{4nkV(1-V) + z^2}}{2(nk + z^2)}$$
where $L$ = the interval’s lower threshold; $U$ = the interval’s upper threshold; $z$ = the critical value of the standard normal distribution; $V$ = Aiken’s V; $n$ = number of judges; $k$ = the rating range, as above. With these equations the confidence interval can be calculated. The equations were adapted by Penfield and Giacobbi [81] in order to use Aiken’s V as a ratio.
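The two thresholds can be computed together; the sketch below is an illustrative implementation of the Penfield–Giacobbi interval under the assumption of five judges and a 1–10 rating scale (so a rating range of 9), not the spreadsheet the authors used:

```python
import math

def aiken_confidence_interval(v, n_judges, k, z=1.96):
    """Wilson-score confidence interval for Aiken's V
    (Penfield & Giacobbi adaptation).

    v: Aiken's V for the item; n_judges: number of judges;
    k: rating range (highest minus lowest possible rating);
    z: standard-normal critical value (1.96 for a 95% interval).
    """
    nk = n_judges * k
    root = z * math.sqrt(4 * nk * v * (1 - v) + z ** 2)
    denom = 2 * (nk + z ** 2)
    lower = (2 * nk * v + z ** 2 - root) / denom
    upper = (2 * nk * v + z ** 2 + root) / denom
    return lower, upper
```

An item would then be retained when the critical value 0.70 lies at or below the interval computed for its V.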
The analysis of the experts’ ratings was followed by the rejection or modification of any problematic items. The final questionnaire then underwent a trial run with a small group of students. The trial was held in a classroom during class-time in order to replicate the same conditions as the overall sample would experience when administered the final questionnaire as a pilot test [84]. The trial students simply assessed the wording of the items and reported any that were difficult to understand [85]. The experts’ ratings and the brief trial run therefore informed the definitive version of the questionnaire.

2.3.2. Construct’s Validity

Stage 4 involved an analysis of the construct’s validity, its reliability, and the questionnaire’s definitive version. A pilot test was conducted in November 2019. With the prior permission of the lecturer responsible for the subject, a member of the research team administered the questionnaires to the students for their completion in the classroom and provided the instructions on how to do so. The confidentiality and anonymity of the data gathered were guaranteed at all times. The trial lasted 35 min once the instructions had been given and any queries resolved. Once the questionnaire had been returned, the students’ results were used to study the construct’s validity through an exploratory factor analysis (EFA) [51] with a view to detecting an underlying structure in the set of variables analysed [86]. An EFA first requires analysing the normality of the sample distribution by calculating the coefficients of skewness and kurtosis [51], taking (−1, 1) as the acceptable interval in both cases [87]. Variables scoring outside this interval are discarded from the test. This was followed by Bartlett’s test of sphericity and the KMO (Kaiser–Meyer–Olkin) sampling test for verifying the data’s adequacy for the EFA [50,51,52,88]. A sample is considered adequate for the application of an EFA when the KMO value is >0.70 and Bartlett’s test of sphericity yields p < 0.001 [50,51,52,88]. The next step involved an EFA with a principal components extraction method and a varimax rotation, which enabled us to remove those variables whose saturation in the factor was lower than 0.40 [89]. The result of the EFA prompted the creation of a series of factors, considering solely those with an eigenvalue ≥1 [51,90,91] and taking into account other aspects such as the interpretability of the solution found and the initial theory [51] (p. 1161). In turn, the reliability analysis involved Cronbach’s alpha [92].
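The first screening step described above, dropping variables whose skewness or excess kurtosis falls outside (−1, 1), can be sketched as follows. This is an illustrative NumPy implementation using population-moment formulas, not the SPSS procedure the authors ran, and the function name is our own:

```python
import numpy as np

def screen_for_normality(data, bound=1.0):
    """Return a boolean mask of columns whose skewness and excess
    kurtosis both lie within [-bound, bound].

    data: (n_subjects, n_variables) array of item scores.
    """
    data = np.asarray(data, dtype=float)
    centred = data - data.mean(axis=0)
    sd = data.std(axis=0)
    skew = (centred ** 3).mean(axis=0) / sd ** 3
    excess_kurtosis = (centred ** 4).mean(axis=0) / sd ** 4 - 3.0
    return (np.abs(skew) <= bound) & (np.abs(excess_kurtosis) <= bound)
```

Variables failing the mask would be excluded before the KMO, Bartlett, and EFA steps.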
All the statistical calculations involved the IBM SPSS Statistics 26 package, except for those tests related to Aiken’s V coefficient and its confidence interval, which were calculated with Microsoft Excel.

3. Results

3.1. Content Validity and Reliability

Content validity was calculated according to two key processes: the panel of expert judges [31,41] and the trial run involving a small group of subjects (n = 16). The expert judges analysed and rated each one of the items in the measuring instrument, as well as the overall questionnaire. The five judges concurred on the comments made about the questionnaire’s first block referring to sociodemographic data, and did not request any changes to the variables in it. As regards the variables in the questionnaire’s second block, which dealt with the students’ knowledge of the landscape, the judges assessed the items by rating their consistency, relevance, and clarity from 1 to 10 points. They also added any comments they wished to make. The assessment of the variables was followed by the calculation of Aiken’s V in order to discard those items that scored below the critical value established (0.70) [80] (see Table 1). In this case, items 3, 19, 21, 30, 35, 40, 44, 47, and 53 recorded an Aiken’s V of less than 0.70 and were removed, and would not therefore be included in the validation process involving 16 students or in the final questionnaire.
In qualitative terms, the judges provided a number of comments on certain variables. One such comment that was common to all the experts involved the item referring to the definition of landscape proposed by the ELC. This definition initially appeared in only one item. The experts recommended dividing it into two new variables, which meant that instead of having one item that read ‘The landscape is any part of the territory people perceive, whose character is the outcome of the action and interaction of natural and/or human factors’, there were two items worded as follows: ‘The landscape is any part of the territory that people perceive’ and ‘The landscape is the outcome of the action and interaction of natural and/or human factors’. This simplified the variable, rendering it easier to understand. The experts also unanimously agreed that it would be more appropriate to modify the way of answering the item on the students’ familiarity with the ELC, replacing a five-point Likert-type scale with a Yes/No dichotomous variable.
In qualitative terms, the questionnaire’s technical structure and its content were also validated. In this case, a small sample of subjects (n = 16) with the same characteristics as the subjects that would later be administered the final questionnaire studied its features and completed it. This helped to quantify the estimated time needed to carry out the task [43] and uncover any issues involved in interpreting the variables’ meaning [84]. The average time required for completing the questionnaire was x̄ = 28.31 min (SD = 2.99), and no comments were made on the variables’ wording, whereby it was understood that there was no issue with the way the questionnaire was drafted. The subjects’ assessment indicated that the approximate time for satisfactorily undertaking the task was around 35 min.
The reliability analysis involved the calculation of overall internal consistency via the Cronbach’s alpha coefficient [48], with the result being 0.860. It may therefore be concluded that the questionnaire has a high degree of internal consistency [93,94], as it recorded a coefficient close to 1, the value considered to indicate an ideal correlation [48,93].
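Cronbach's alpha itself is straightforward to compute from the item-score matrix. The sketch below is an illustrative NumPy implementation (the study used IBM SPSS Statistics 26); it applies the standard formula α = k/(k−1) · (1 − Σσ²ᵢ/σ²ₜ):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects, n_items) score matrix.

    Computed as k/(k-1) * (1 - sum of item variances / variance of
    the total score), using unbiased (ddof=1) variances.
    """
    items = np.asarray(items, dtype=float)
    n_items = items.shape[1]
    sum_item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return n_items / (n_items - 1) * (1.0 - sum_item_var / total_var)
```

Perfectly correlated items yield α = 1, while uncorrelated noise yields α near 0, which is why a value of 0.860 is read as high internal consistency.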

3.2. Construct Validity

Following the ratings provided by the experts and their comments, the next step was to administer a pilot test to 432 undergraduates studying for a Degree in Primary School Teaching at Salamanca University. Firstly, following the indications provided by Ferrando and Anguiano-Carrasco; Lloret-Segura, Ferreres-Traver, Hernández-Baeza, and Tomás-Marco; and Morales, Urosa, and Blanco [50,51,88], the sample size was deemed suitable for analysing the instrument’s technical quality. A series of statistical calculations were made, with their results being reported in the following paragraphs.
The statistical treatment began with an analysis of kurtosis and skewness to verify whether each variable’s data approached a normal distribution. Values between −1 and 1 were considered valid for both kurtosis and skewness [50], and those scoring outside this interval were discarded to avoid any errors in the EFA. This led to the removal of eighteen variables that did not comply with a normal distribution. These variables were analysed to identify the underlying problem, finding that they alluded to highly specific theoretical issues on the landscape or to banal matters that did not add anything to the study, whereby their removal had no impact on the final data on landscape perception.
The verification of the normality of the distribution was followed by an analysis of the pertinence of the factorial test. This involved calculating the KMO measure and Bartlett’s test of sphericity, with results of 0.814 (KMO) and a significance of p < 0.001 (Bartlett) (see Table 2). The results were satisfactory in both cases for applying an EFA [50,88].
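For intuition, the overall KMO statistic compares the zero-order correlations among the variables with their anti-image (partial) correlations: values near 1 mean the partials are small and a factor analysis is appropriate. A minimal NumPy sketch (illustrative only; the study computed KMO in SPSS) might look like:

```python
import numpy as np

def kmo_overall(data):
    """Overall Kaiser-Meyer-Olkin measure of sampling adequacy.

    data: (n_subjects, n_variables) array. KMO is the ratio of the
    sum of squared off-diagonal correlations to that sum plus the
    sum of squared off-diagonal partial correlations; this study
    required KMO > 0.70.
    """
    corr = np.corrcoef(data, rowvar=False)
    inv = np.linalg.inv(corr)
    d = np.sqrt(np.diag(inv))
    partial = -inv / np.outer(d, d)          # anti-image correlations
    np.fill_diagonal(partial, 0.0)
    np.fill_diagonal(corr, 0.0)
    r2 = (corr ** 2).sum()
    p2 = (partial ** 2).sum()
    return r2 / (r2 + p2)
```

Data with a genuine common-factor structure produces a KMO well above that of independent noise, whose KMO hovers around 0.5.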
An EFA conducted on the remaining variables revealed six factors that explain 50.803% of the variance [95]. All those variables that did not reach a minimum loading of 0.40 on their assigned factor were removed from the questionnaire [89], leaving twenty variables distributed across the six factors (see Table 3 and Table 4).
Five factors had at least three variables with loadings of over 0.40, whereby we could assume there were enough items to define them appropriately [88,91,96]. By contrast, one of the factors had only two variables, which meant it was not fully defined, even though both recorded loadings above 0.80. These variables were not removed from the questionnaire because the coefficients with which they loaded exceeded 0.50 [88,91] and signalled a relationship between them. In turn, this could be creating subconstructs, or factors that form part of other factors [97], which might be combining with each other. The case in question involved two variables used to put forward a definition of landscape, theoretically grounded in the definition provided by the ELC. It was therefore deemed necessary to include new definitions of the term landscape in order to create a more robustly defined construct. This would require a sample of N > 500 subjects and a confirmatory factor analysis (CFA) to confirm that the variables centred on the definition of landscape group into the same factor.
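The loading-based retention rule applied above can be illustrated with a minimal extraction sketch (unrotated principal-component loadings from the correlation matrix; the extraction and rotation method actually used in the study may differ):

```python
import numpy as np

def principal_loadings(data: np.ndarray, n_factors: int) -> np.ndarray:
    """Unrotated loadings from the eigendecomposition of the correlation
    matrix: loading = eigenvector * sqrt(eigenvalue)."""
    R = np.corrcoef(data, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(R)
    order = np.argsort(eigvals)[::-1][:n_factors]
    return eigvecs[:, order] * np.sqrt(eigvals[order])

def retained_items(loadings: np.ndarray, cutoff: float = 0.40) -> list:
    """Keep only items whose strongest absolute loading reaches the cut-off."""
    return [i for i, row in enumerate(np.abs(loadings)) if row.max() >= cutoff]
```

Items filtered out by `retained_items` correspond, in the study, to the variables discarded for failing the 0.40 criterion.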
Factor number 1 consisted of four variables in all, recording an eigenvalue of 3.915 and explaining 19.573% of the variance. It involved items 51, 57, 58, and 59, all of which addressed theoretical aspects of the possibilities the landscape provides as a teaching instrument. This factor was referred to as The landscape as a didactic instrument. In turn, the second factor was called Qualities of the landscape and teaching practice. It had an eigenvalue of 1.613 and explained 8.065% of the variance, consisting of five items (items 29, 36, 48, 55, and 62) that addressed theoretical issues involving the landscape’s qualities and the part it plays as content in the teaching–learning process. Factor 3 had an eigenvalue of 1.377 and explained 6.886% of the variance, grouping three items (items 8, 38, and 52). It was referred to as Anthropic features in teaching the landscape and consisted of a series of variables designed to capture the presence of cultural aspects in the dynamics of the landscape and in its teaching. Factor number 4 explained 5.855% of the variance, grouping three items (items 17, 18, and 31), with an eigenvalue of 1.171. This factor theoretically explained socialisation in the landscape teaching–learning process; it grouped variables measuring aspects centred on the social construct and the perception of the landscape, and was referred to as Social context and landscape. Factor 5 involved the definition of landscape, recording an eigenvalue of 1.090 and explaining 5.449% of the variance. It consisted of two items (items 1 and 2). In theoretical terms, it grouped those variables involving the definition of the concept of landscape, and was referred to as Concept of landscape. As noted earlier, there was a need to define two new variables focusing on the definition of landscape to provide a more robust definition of this factor’s characteristics [91].
The last of these factors had an eigenvalue of 1.001 and explained 5.003% of the variance. It was referred to as Interpreting and teaching the landscape. It focused on issues involving the positive or negative feelings the landscape arouses when it is contemplated and how this may have an impact on the teaching process. It comprised items 4, 41, and 61.
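The percentages reported for the six factors are consistent with the usual convention of dividing each eigenvalue by the number of items analysed (assuming here the 20-item pool retained after filtering), as the following check illustrates:

```python
# Reported eigenvalues for the six extracted factors
eigenvalues = [3.915, 1.613, 1.377, 1.171, 1.090, 1.001]
n_items = 20  # assumption: percentages computed over the 20 retained items

# Percentage of variance explained per factor and in total
pct = [round(100 * ev / n_items, 3) for ev in eigenvalues]
total = round(sum(pct), 3)
```

The total computed this way agrees with the reported cumulative figure of 50.803% up to rounding of the published eigenvalues.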

3.3. Final Questionnaire on Landscape Perception: Cuestionario Sobre Percepción del Paisaje-CPP

The various processes involved in validating the questionnaire on the perception of the landscape, both qualitative and quantitative, culminated in the design of the Cuestionario sobre Percepción del Paisaje—CPP (see Table 5). The EFA made it possible to reduce the number of variables and to analyse the internal, underlying structure grouping them [51,86,88]. This ultimately led to the design of a reliable questionnaire that can be used to interpret [51] the perception that students have of the landscape during their education. The final questionnaire consists of two blocks: the first compiles sociodemographic data, and the second focuses on the students’ perception of the landscape as theoretical content and as a didactic instrument in the teaching–learning process. The first block is unchanged from the way it was presented in the second section of this paper. By contrast, the second block has undergone substantial changes, resulting in a questionnaire made up of 21 variables in total (20 involving a Likert-type scale and one requiring a dichotomous yes/no answer). This has reduced the number of variables, discarding those that made no significant contribution to the study and those that could distort the final results.

4. Discussion and Conclusions

Following the ELC’s postulates, the signatory countries are to put in place measures for managing and organising the landscape and for educating in this subject [6,7,8]. As far as Spain is specifically concerned, schemes have been rolled out to include the population at large in these processes [5], and the landscape has a noticeably greater presence as curricular content in basic education [13]. This last aspect, education and the landscape’s presence in the teaching–learning process, is the one that has focused our attention here. According to Pedroli and Van Mansvelt; García; Fernández [8,13,14], the landscape currently plays a dual role in this process: on the one hand, it continues to be a didactic instrument in the teaching of geography; on the other, it is a further content item in the general syllabus. This means there is a need to analyse the perception that prospective teachers have of the landscape, as they will be the ones to teach it. This article has therefore addressed the design and validation of a questionnaire for identifying this perception. The results obtained after administering the questionnaire to a sample of undergraduates studying for their Degree in Primary School Teaching at Salamanca University support the instrument’s reliability and validity. Yet at the same time, we should proceed with caution, for according to Álvarez-García et al. and Arribas [84,98], questionnaires may contain biases when the subjects surveyed reply to the items in what they consider to be the most appropriate manner and according to their prior responses, without expressing their true opinion. Hence the need for subsequent studies to include qualitative analyses that assess the subjects’ answers and compare the results across different cohorts [26,48].
Based on a proposal made by Cohen, Manion, and Morrison [26] for the design and construction of a questionnaire, this research conducted a literature review on theoretical aspects of the landscape and its teaching. This was used to define a series of variables and construct an initial questionnaire made up of 70 items involving a Likert-type assessment scale. This was followed by a quantitative and qualitative content validity analysis conducted by a panel of expert judges [28,31,41] and a brief trial run involving a small group of students [42,48]. This initial validation process led to the removal of the nine variables that the judges did not deem suitable for the study according to the scores they gave them, and the average time for completing the questionnaire was established. The instrument’s overall reliability, analysed by means of Cronbach’s alpha, recorded a result of 0.860, indicating a high overall internal consistency [48,93,94]. After confirming the sample’s suitability for an EFA through an analysis of the normality of the distributions and calculation of the KMO statistic (0.814) and Bartlett’s test of sphericity (p < 0.001), a further eighteen items were removed from the questionnaire, as they did not record a normal distribution and could be problematic when conducting the EFA. Their analysis and individual assessment revealed that they all involve theoretical matters that might be challenging for students studying subjects other than geography. We should not forget that the sample consists of students studying for their Degree in Primary School Teaching.
The EFA provided an initial view of the structure underpinning the variables [51,52], which in this case were grouped into six factors in total. The literature review, with the ELC as a reference, was used to design an initial questionnaire that in theoretical terms was arranged into four dimensions (Theoretical understanding of the landscape; Landscape and a sense of identity; Measures for the management and protection of the landscape; and Pedagogical knowledge) and 70 variables. The EFA has indicated that there are six constructs (The landscape as a didactic instrument; Qualities of the landscape and teaching practice; Anthropic features in teaching the landscape; Social context and landscape; Concept of landscape; and Interpreting and teaching the landscape) and has helped us reduce the number of variables to 21. According to Frías-Navarro and Pascual; Lloret-Segura et al.; Pérez and Medrano [51,52,97], the EFA is also a sound technique for reducing the number of variables and avoiding overly long questionnaires that may make the subjects in the sample lose interest. In addition, it provides the guidelines for validating a measuring instrument of proven reliability, although it requires a CFA calculated on the basis of a matrix of data gathered from subjects other than those involved in the pilot task and, according to Kline [99], with a larger sample of N > 500. The CFA is used to confirm whether the same dimensions are obtained in other cohorts with the same characteristics, as we now have a starting hypothesis in which we can set the number of factors to be extracted [100,101]. According to Batista-Foguet, Coenders, and Alonso; Fernández; Herrero; Urrutia et al. [102,103,104,105], this analysis will make it possible to confirm the dimensions and underlying characteristics of the questionnaire’s variables and to suitably conclude its validation process.
In short, this research has produced the design and initial validation of a questionnaire for analysing the perception of the landscape amongst undergraduates studying for a Degree in Primary School Teaching at Salamanca University: the Cuestionario sobre Percepción del Paisaje-CPP. In the future, this analysis must be confirmed with a CFA certifying the results obtained with the EFA.

Author Contributions

Conceptualization, R.F.Á.; methodology, R.F.Á.; software, R.F.Á. and J.F.; formal analysis, R.F.Á. and J.F.; investigation, R.F.Á. and J.F.; resources, R.F.Á.; data curation, J.F.; writing-original draft preparation, R.F.Á.; writing-review and editing, R.F.Á.; visualization, R.F.Á.; supervision, R.F.Á.; project administration, R.F.Á.; funding acquisition, R.F.Á. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the University of Salamanca under grant OMEU (Observatorio de la Marca España en Europa) and Ministerio de Ciencia, Innovación y Universidades under grant CSO2017-84623-R La Realidad Aumentada como herramienta para explicación del paisaje. Aplicaciones a la docencia y al turismo.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Fernández, R. La aplicación de Landscape Character Assessment a los espacios de montaña media: El paisaje del macizo de Las Villuercas. Ciudad. Territ. Estud. Territ. 2015, 185, 499–518. [Google Scholar]
  2. Nogué, J.; Puigbert, L.; Sala, P.; Bretcha, G. Landscape and Public Participation. The Experience of the Landscape Catalogues of Catalonia; Catalan Landscape Observatory and Regional Government of Catalunya: Barcelona, Spain, 2010. [Google Scholar]
  3. Ortega, M. El Convenio Europeo del Paisaje: Claves para un compromiso. Ambienta 2007, 63, 18–26. [Google Scholar]
  4. Zoido, F. El paisaje, patrimonio público y recurso para la mejora de la democracia. Boletín Inst. Andal. Patrim. Hist. 2004, 50, 66–73. [Google Scholar]
  5. Fernández, R.; Plaza, J.I. Participación ciudadana y educación en materia de paisaje en el marco del Convenio Europeo del Paisaje en España. Cuad. Geogr. 2019, 58, 262–286. [Google Scholar] [CrossRef]
  6. Prieur, M.; Durousseau, S. Landscape and Public participation. In Landscape and Sustainable Development: Challenges of the European Landscape Convention; Prieur, M., Luginbühl, Y., Zoido, F., De Montmollin, B., Pedroli, B., Van Mansvelt, J.D., Durousseau, S., Eds.; Council of Europe: Strasbourg, France, 2006; pp. 165–208. [Google Scholar]
  7. Zoido, F. Landscape and spatial planning policies. In Landscape and Sustainable Development: Challenges of the European Landscape Convention; Prieur, M., Luginbühl, Y., Zoido, F., De Montmollin, B., Pedroli, B., Van Mansvelt, J.D., Durousseau, S., Eds.; Council of Europe: Strasbourg, France, 2006; pp. 52–88. [Google Scholar]
  8. Pedroli, B.; Van Mansvelt, J.D. Landscape and awareness-raising, training and education. In Landscape and Sustainable Development: Challenges of the European Landscape Convention; Prieur, M., Luginbühl, Y., Zoido, F., De Montmollin, B., Pedroli, B., Van Mansvelt, J.D., Durousseau, S., Eds.; Council of Europe: Strasbourg, France, 2006; pp. 119–142. [Google Scholar]
  9. Casas, M.; Puig, J.; Erneta, L. El paisaje en el contexto curricular de la LOMCE: Una oportunidad educativa, ¿aprovechada o desaprovechada? Didact. Geogr. 2017, 18, 39–68. [Google Scholar]
  10. Delgado, E. El paisaje en la formación de maestros, un recurso educativo de alto interés para la Educación Primaria. Tabanque Rev. Pedagog. 2015, 28, 117–138. [Google Scholar]
  11. García, A. Un enfoque innovador en didáctica del paisaje: Escenario y secuencia geográfica. In Innovación en la Enseñanza de la Geografía ante los Desafíos Sociales y Territoriales; De Miguel, R., De Lázaro, M.L., Marrón, M.J., Eds.; Instituto Fernando el Católico: Zaragoza, Spain, 2013; pp. 257–277. [Google Scholar]
  12. Tort, J. El paisaje como pedagogía del territorio. Didact. Geogr. 2004, 6, 133–153. [Google Scholar]
  13. García, A. El paisaje: Un desafío curricular y didáctico. Rev. Didact. Especificas 2018, 4, 7–26. [Google Scholar]
  14. Fernández, R. La enseñanza del paisaje desde una concepción constructivista: Propuesta didáctica. Dedica. Rev. Educ. Hum. 2019, 15, 135–159. [Google Scholar] [CrossRef]
  15. BOE (Boletín Oficial del Estado—Spain’s Official State Gazette). Royal Decree 126/2014, of 28 February, establishing the basic syllabus for Primary Education, 2014. Available online: https://www.boe.es/buscar/pdf/2014/BOE-A-2014-2222-consolidado.pdf (accessed on 1 July 2020).
  16. Busquets, J. La educación en paisaje. Iber. Didact. Cienc. Soc. Geogr. Hist. 2010, 65, 7–17. [Google Scholar]
  17. Nardi, A. El paisaje como instrumento de intermediación cultural en la escuela. Iber. Didactica Cienc. Soc. Geogr. Hist. 2010, 65, 25–37. [Google Scholar]
  18. Domínguez, A.; López, R. Patrimonio, paisaje y educación: Formación inicial del profesorado y educación cívica del alumnado de primaria. Clio. Hist. Hist. Teach. 2014, 40, 1–26. [Google Scholar]
  19. Gómez-Zotano, J.; Riesco-Chueca, P. Landscape learning and teaching: Innovations in the context of The European Landscape Convention. In Proceedings of the INTED2010 Conference, Valencia, Spain, 8–10 March 2010; pp. 1–13. [Google Scholar]
  20. Zanato, O. Lo sguardo sul paesaggio da una prospettiva pedagogico-ambientale. In Il Paesaggio Vicino a Noi. Educazione, Consapevolezza, Responsabilità; Castiglioni, B., Celi, M., Gamberoni, E., Eds.; Museo Civico di Storia Naturale e Archeologia: Montebelluna, Italy, 2007. [Google Scholar]
  21. Santana, D.; Morales, A.J.; Souto, X.M. Las representaciones sociales del paisaje en los trabajos de campo con Educación Primaria. In Nuevas Perspectivas Conceptuales y Metodológicas para la Educación Geográfica; Martínez, R., Tonda, E.M., Eds.; Asociación Española de Geografía: Córdoba, Spain, 2014; pp. 167–182. [Google Scholar]
  22. Castiglioni, B. Education on landscape: Theoretical and practical approaches in the frame of the European Landscape Convention. In Geographical View on Education for Sustainable Development; Reinfried, S., Schleicher, Y., Rempfler, A., Eds.; Geographiedidaktische Forschungen: Lucerne, Switzerland, 2007. [Google Scholar]
  23. McMahon, M.; Pospisil, R. Laptops for a digital lifestyle: Millennial students and wireless mobile technologies. Available online: https://www.researchgate.net/publication/49280225_Laptops_for_a_Digital_Lifestyle_Millennial_Students_and_Wireless_Mobile_Technologies/stats (accessed on 8 September 2020).
  24. Cabero, J.; Barroso, J. The educational possibilities of Augmented Reality. New Approaches Educ. Res. 2016, 5, 44–50. [Google Scholar]
  25. Prensky, M. Digital Natives, Digital Immigrants. Horizon 2001, 9, 1–20. [Google Scholar]
  26. Cohen, L.; Manion, L.; Morrison, K. Research Methods in Education; Routledge: New York, NY, USA, 2011. [Google Scholar]
  27. Torrado, M. Estudios de Encuesta. In Metodología de la Investigación Educativa, 5th ed.; Bisquerra, R., Ed.; La Muralla: Madrid, Spain, 2016; pp. 223–249. [Google Scholar]
  28. Ayuga, E.; González, C.; Ortiz, M.A.; Martínez, E. Diseño de un cuestionario para evaluar conocimientos básicos de estadística de estudiantes del último curso de ingeniería. Form. Univ. 2012, 5, 21–32. [Google Scholar] [CrossRef] [Green Version]
  29. Cabero, J.; Llorente, M.C. La aplicación del juicio de experto como técnica de evaluación de las tecnologías de la información (TIC). Eduweb. Rev. Tecnol. Inf. Comun. Educ. 2013, 7, 11–22. [Google Scholar]
  30. Ding, C.; Hershberger, S. Assessing content validity and content equivalence using structural equation modelling. Struct. Equ. Modeling Multidiscip. J. 2002, 9, 283–297. [Google Scholar] [CrossRef]
  31. Escurra, L.M. Cuantificación de la validez de contenido por criterio de jueces. Rev. Psicol. 1988, 6, 103–111. [Google Scholar]
  32. Garrido, M.; Romero, S.; Ortega, E.; Zagalaz, M. Diseño de un cuestionario para niños sobre los padres y madres en el deporte (CHOPMD). J. Sport Health Res. 2011, 3, 153–164. [Google Scholar]
  33. Oloruntegbe, K.O.; Zamri, S.; Saat, R.M.; Alam, G.M. Development and validation of measuring instruments of contextualization of science among Malaysian and Nigerian serving and preservice chemistry teachers. Int. J. Phys. Sci. 2010, 5, 2075–2083. [Google Scholar]
  34. Escobar-Pérez, J.; Cuervo-Martínez, A. Validez de contenido y juicio de expertos: Una aproximación a su utilización. Av. En Med. 2008, 6, 27–36. [Google Scholar]
  35. Utkin, L.V. A method for processing the unreliable expert judgments about parameters of probability distributions. Eur. J. Oper. Res. 2005, 175, 385–398. [Google Scholar] [CrossRef]
  36. Backhoff, E.; Aguilar, J.; Larrazolo, N. Metodología para la validación de contenidos de exámenes normativos. Rev. Mex. Psicol. 2006, 23, 79–86. [Google Scholar]
  37. Delgado-Rico, E.; Carretero-Dios, H.; Ruch, W. Content validity evidences in test development: An applied perspective. Intern. J. Clin. Health Psych. 2012, 31, 67–74. [Google Scholar]
  38. Gable, R.K.; Wolf, J.K. Instrument Development in the Affective Domain: Measuring Attitudes and Values in Corporate and School Settings; Kluwer Academic: Boston, MA, USA, 1993. [Google Scholar]
  39. Grant, J.S.; Davis, L.L. Selection and use of content experts for instrument development. Res. Nurs. Health 1997, 20, 269–274. [Google Scholar] [CrossRef]
  40. Lynn, M. Determination and quantification of content validity. Nurs. Res. 1986, 35, 382–385. [Google Scholar] [CrossRef]
  41. McGartland, D.; Berg-Weger, M.; Tebb, S.S.; Lee, E.S.; Rauch, S. Objectifying content validity: Conducting a content validity study in social work research. Soc. Work Res. J. 2003, 27, 94–104. [Google Scholar]
  42. Chacón, S.; Pérez-Gil, J.A.; Holgado, F.P.; Lara, A. Evaluación de la calidad universitaria: Validez de contenido. Psicothema 2001, 13, 294–301. [Google Scholar]
  43. Martín-Romera, A.; Molina, E. Valor del conocimiento pedagógico para la docencia en Educación Secundaria: Diseño y validación de un cuestionario. Estud. Pedagog. 2017, 43, 195–220. [Google Scholar] [CrossRef] [Green Version]
  44. García, E.; Cabero, J. Diseño y valoración de un cuestionario dirigido a describir la evaluación en procesos de educación a distancia. Edutec-E 2011, 35, 1–26. [Google Scholar]
  45. Merino-Barrero, J.; Valero-Valenzuela, A.; Moreno-Murcia, J. Análisis psicométrico del cuestionario estilos de enseñanza en Educación Física (EEEF). Rev. Int. Med. Cienc. Act. Fis. Deporte 2017, 17, 225–241. [Google Scholar]
  46. Singh, K. Quantitative Social Research Methods; Sage Publications: London, UK, 2007. [Google Scholar]
  47. Galán, M. Desarrollo y validación de contenido de la nueva versión de un instrumento para clasificación de pacientes. Rev. Latinoam. Enferm. 2011, 19, 1–9. [Google Scholar]
  48. McMillan, J.H.; Schumacher, S. Investigación Educativa. Una Introducción Conceptual, 5th ed.; Pearson Educación: Madrid, Spain, 2011. [Google Scholar]
  49. De Winter, J.C.F.; Dodou, D.; Wieringa, P.A. Exploratory factor analysis with small sample sizes. Multivar. Behav. Res. 2009, 44, 147–181. [Google Scholar] [CrossRef]
  50. Ferrando, P.J.; Anguiano-Carrasco, C. El análisis factorial como técnica de investigación en Psicología. Pap. Psicólogo 2010, 31, 18–33. [Google Scholar]
  51. Lloret-Segura, S.; Ferreres-Traver, A.; Hernández-Baeza, A.; Tomás-Marco, I. El análisis factorial exploratorio de los ítems: Una guía práctica, revisada y actualizada. An. Psicol. 2014, 30, 1151–1169. [Google Scholar] [CrossRef]
  52. Frías-Navarro, D.; Pascual, M. Prácticas del análisis factorial exploratorio (AFE) en la investigación sobre la conducta del consumidor y marketing. Suma Psicol. 2012, 19, 45–58. [Google Scholar]
  53. Joshi, A.; Kale, S.; Chandel, S.; Pal, D.K. Likert Scale: Explored and Explained. Br. J. Appl. Sci. Technol. 2015, 7, 396–403. [Google Scholar] [CrossRef]
  54. Zube, E.H.; Sell, J.L.; Taylor, L.G. Landscape perception: Research, application and theory. Landsc. Urban Plan. 1982, 9, 1–33. [Google Scholar] [CrossRef]
  55. Mata, R. Métodos de estudio del paisaje e instrumentos para su gestión. Consideraciones a partir de experiencias de planificación territorial. In El Paisaje y la Gestión del Territorio. Criterios Paisajísticos en la Ordenación del Territorio y el Urbanismo; Mata, R., Torroja, A., Eds.; Diputación (Provincial Council) de Barcelona: Barcelona, Spain, 2006; pp. 199–239. [Google Scholar]
  56. Oliva, J.; Iso, A. Diseños metodológicos para la planificación participativa del paisaje. Empiria. Rev. Metodol. Cienc. Soc. 2014, 27, 95–120. [Google Scholar]
  57. Serrano, D. Paisajes y políticas públicas. Investig. Geográficas 2007, 42, 109–123. [Google Scholar] [CrossRef] [Green Version]
  58. Zoido, F. El paisaje, ideas para la actuación. In Estudios Sobre el Paisaje; Martínez de Pisón, E., Ed.; Fundación Duques de Soria: Madrid, Spain, 2000; pp. 293–311. [Google Scholar]
  59. Calcagno, A. Landscape and education. In Landscape Dimensions. Reflections and Proposals for the Implementation of the European Landscape Convention; Council of Europe: Strasbourg, France, 2017; pp. 55–119. [Google Scholar]
  60. Beauchamp, E.; Clements, T.; Milner-Gulland, E.J. Investigating Perceptions of Land Issues in a Threatened Landscape in Northern Cambodia. Sustainability 2019, 11, 5881. [Google Scholar] [CrossRef] [Green Version]
  61. Cornwall, A. Locating citizen participation. Inst. Dev. Stud. Bull. 2002, 33, 9–19. [Google Scholar] [CrossRef] [Green Version]
  62. Liceras, A. Didáctica del paisaje. Didáctica Cienc. Soc. Geogr. Hist. 2013, 74, 85–93. [Google Scholar]
  63. Peng, J.; Yan, S.; Strijker, D.; Wu, Q.; Chen, W.; Ma, Z. The influence of place identity on perceptions of landscape change: Exploring evidence from rural land consolidation projects in Eastern China. Land Use Policy 2020, 99. [Google Scholar] [CrossRef]
  64. Rodríguez, E.J.; Granados, C.S.T.; Santo-Tomás, R. Landscape perception in Peri-Urban Areas: An expert-based methodological approach. Landsc. Online 2019, 75, 1–22. [Google Scholar] [CrossRef] [Green Version]
  65. Yli-Panula, E.; Persson, C.; Jeronen, E.; Eloranta, V.; Pakula, H.M. Landscape as experienced place and worth conserving in the drawings of Finnish and Swedish students. Educ. Sci. 2019, 9, 93. [Google Scholar] [CrossRef] [Green Version]
  66. Council of Europe. Explanatory Report to the European Landscape Convention; Council of Europe: Florence, Italy, 2000. [Google Scholar]
  67. Iurk, M.C.; Biondi, D.; Dlugosz, F.L. Perception, landscape and environmental education: An investigation with students from the municipality of Irati, state of Paraná, Brazil. Floresta 2018, 48, 143–152. [Google Scholar] [CrossRef] [Green Version]
  68. Batllorí, R.; Serra, J.M. D’ensenyar geografía a través del paisatge a educar en paisatge. Doc. Anal. Geogr. 2017, 63, 617–630. [Google Scholar]
  69. García, A. Perspectivas de futuro en el aprendizaje del paisaje. Didact. Geogr. 2019, 20, 55–77. [Google Scholar]
  70. Liceras, A. Observar e Interpretar el Paisaje. Estrategias Didácticas; Grupo Editorial Universitario: Granada, Spain, 2003. [Google Scholar]
  71. Nevrelová, M.; Ružicková, J. Educational Potential of Educational Trails in Terms of Their Using in the Pedagogical Process (Outdoor Learning). Eur. J. Contemp. Educ. 2019, 8, 550–561. [Google Scholar]
  72. García, A. El itinerario geográfico como recurso didáctico para la valoración del paisaje. Didact. Geogr. 2004, 6, 79–95. [Google Scholar]
  73. Coll, C. Constructivismo y educación escolar: Ni hablamos siempre de los mismo ni lo hacemos siempre desde la misma perspectiva epistemológica. Anu. Psicol. 1996, 69, 153–178. [Google Scholar]
  74. Díaz, F.; Hernández, G. Estrategias Docentes Para un Aprendizaje Significativo. Una Interpretación Constructivista; McGraw Hill: Mexico City, Mexico, 2002. [Google Scholar]
  75. García, A. Aplicación didáctica del aprendizaje basado en problemas al análisis geográfico. Rev. Didact. Especificas 2010, 2, 43–60. [Google Scholar]
  76. Solé, I.; Coll, C. Los profesores y la concepción constructivista. In El Constructivismo en el Aula; Coll, C., Mauri, T., Miras, M., Solé, I., Zabala, A., Eds.; Graó: Barcelona, Spain, 1993; pp. 7–24. [Google Scholar]
  77. Zamora-Polo, F.; Corrales-Serrano, M.; Sánchez-Martín, J.; Espejo-Antúnez, L. Nonscientific University Students Training in General Science Using an Active-Learning Merged Pedagogy: Gamification in a Flipped Classroom. Educ. Sci. 2019, 9, 297. [Google Scholar] [CrossRef] [Green Version]
  78. Mateo, J. La Investigación “Ex-Post-Facto”; Universitat Oberta de Catalunya: Barcelona, Spain, 1997. [Google Scholar]
  79. Aiken, L.R. Three coefficients for analyzing the reliability and validity of ratings. Educ. Psychol. Meas. 1985, 45, 131–142. [Google Scholar] [CrossRef]
  80. Charter, R.A. A Breakdown of reliability coefficients by test type and reliability method, and the clinical implications of low reliability. J. Gen. Psychol. 2003, 130, 290–304. [Google Scholar] [CrossRef]
  81. Penfield, R.D.; Giacobbi, P.R. Applying a score confidence interval to Aiken’s item content relevance index. Meas. Phys. Educ. Exerc. Sci. 2004, 8, 213–225. [Google Scholar] [CrossRef]
  82. Merino, C.; Livia, J. Intervalos de confianza asimétricos para el índice de validez de contenido: Un programa Visual Basic para la V de Aiken. An. Psicol. 2009, 21, 169–171. [Google Scholar]
  83. Wilson, E.B. Probable inference, the law of succession and statistical inference. J. Am. Stat. Assoc. 1927, 22, 209–212. [Google Scholar] [CrossRef]
  84. Álvarez-García, O.; Sureda-Negre, J.; Comas-Forgas, R. Diseño y validación de un cuestionario para evaluar la alfabetización ambiental del profesorado de primaria en formación inicial. Profr. Rev. Curric. Form. Profr. 2018, 22, 265–284. [Google Scholar] [CrossRef]
  85. León-Larios, F.; Gómez-Baya, D. Diseño y validación de un cuestionario sobre conocimientos de sexualidad responsable en jóvenes. Rev. Esp. Salud Publica. 2020, 92, 1–15. [Google Scholar]
  86. Méndez, C.; Rondón, M.A. Introducción al análisis factorial exploratorio. Rev. Colomb. Psiquiatr. 2012, 41, 197–207. [Google Scholar]
  87. Forero, C.G.; Maydeu-Olivares, A.; Gallardo-Pujol, D. Factor analysis with ordinal indicators: A Monte Carlo study comparing DWLS and ULS estimation. Struct. Equ. Modeling 2009, 16, 625–641. [Google Scholar] [CrossRef]
  88. Costello, A.B.; Osborne, J. Best practices in exploratory factor analysis: Four recommendations for getting the most from your analysis. Pract. Assess. Res. Eval. 2005, 10, 1–9. [Google Scholar]
  89. Hair, J.F.; Anderson, R.E.; Tatham, R.; Black, W.C. Análisis Multivariante, 5th ed.; Prentice Hall: Madrid, Spain, 1999. [Google Scholar]
  90. Kaiser, H.F. The varimax criterion for analytic rotation in factor analysis. Psychometrika 1958, 23, 187–200. [Google Scholar] [CrossRef]
  91. Landa, M.R.; Ramírez, M.Y. Diseño de un cuestionario de satisfacción de estudiantes para un curso de nivel profesional bajo modelo de aprendizaje invertido. Rev. Paginas Educ. 2018, 11, 153–175. [Google Scholar]
  92. Hernández, R.; Fernández, C.; Baptista, P. Metodología de la Investigación; Mc-Graw Hill: Mexico City, Mexico, 2006. [Google Scholar]
  93. Pegalajar, M.C. Diseño y validación de un cuestionario sobre percepciones de futuros docentes hacia las TIC para el Desarrollo de prácticas inclusivas. Pixel-Bit. Rev. Medios Educ. 2015, 47, 89–104. [Google Scholar] [CrossRef]
  94. Santos, M.A.; Jover, G.; Naval, C.; Álvarez, J.L.; Vázquez, V.; Sotelino, A. Diseño y validación de un cuestionario sobre práctica docente y actitud del profesorado universitario hacia la innovación (CUPAIN). Educ. Xxi 2017, 20, 39–71. [Google Scholar] [CrossRef] [Green Version]
  95. Merenda, P. A guide to the proper use of factor analysis in the conduct and reporting of research: Pitfalls to avoid. Meas. Eval. Couns. Dev. 1997, 30, 156–163. [Google Scholar] [CrossRef]
  96. Kim, J.; Mueller, C.W. Factor analysis, statistical methods and practical issues. In Factor Analysis and Related Techniques; Lewis-Beck, M., Ed.; Sage Publications: London, UK, 1994; pp. 75–155. [Google Scholar]
  97. Pérez, E.R.; Medrano, L. Análisis Factorial Exploratorio: Bases conceptuales y metodológicas. Rev. Argent. Cienc. Comport. 2010, 2, 58–66. [Google Scholar]
  98. Arribas, M. Diseño y validación de cuestionarios. Matronas Prof. 2004, 5, 23–29. [Google Scholar]
  99. Kline, P. An Easy Guide to Factor Analysis; Sage: Newbury Park, CA, USA, 1994. [Google Scholar]
  100. Long, J.S. Confirmatory Factor Analysis: A preface to LISREL; Paper Series on Quantitative Applications in the Social Sciences: London, UK, 1983. [Google Scholar]
  101. Stapleton, C.D. Basic concepts in Exploratory Factor Analysis (EFA) as a tool to evaluate score validity: A right-brained approach. In Proceedings of the Annual Meeting of the Southwest Educational Research Association, Austin, TX, USA, 24 January 1997. [Google Scholar]
  102. Batista-Foguet, J.M.; Coenders, G.; Alonso, J. Análisis Factorial Confirmatorio. Su utilidad en la validación de cuestionarios relacionados con la salud. Med. Clin. 2004, 122, 21–27. [Google Scholar] [CrossRef]
  103. Fernández, A. Aplicación del análisis factorial confirmatorio a un modelo de medición del rendimiento académico en lectura. Rev. Cienc. Econ. 2015, 33, 39–66. [Google Scholar] [CrossRef]
  104. Herrero, J. El análisis factorial confirmatorio en el estudio de la estructura y estabilidad de los instrumentos de evaluación: Un ejemplo con el Cuestionario de Autoestima CA-14. Interv. Psicosoc. 2010, 19, 289–300. [Google Scholar] [CrossRef]
  105. Urrutia, M.; Barrios, S.; Gutiérrez, M.; Mayorga, M. Métodos óptimos para determinar validez de contenido. Educ. Med. Super. 2014, 28, 1–9. [Google Scholar]
Figure 1. Model of scorecard for the judges’ assessment of the items on the questionnaire. Source: compiled according to Backhoff, Aguilar, and Larrazolo; García and Cabero; Martín-Romera and Molina [36,43,44].
Figure 2. Sociodemographic data on the sample.
Figure 3. Diagram of the methodological process for the design and validation of the Questionnaire on Landscape Perception (CPP) based on the proposal made by Cohen, Manion, and Morrison. Source: adapted from Cohen et al. [26] (p. 472).
Table 1. Mean, Aiken’s V, and confidence interval of the assessments made by expert judges.
Item | x̄ | Aiken's V | CI (95%) | Item | x̄ | Aiken's V | CI (95%)
Item 1 | 8.4 | 0.74 | 0.59–0.85 | Item 36 | 8.8 | 0.78 | 0.63–0.88
Item 2 | 8.4 | 0.74 | 0.59–0.85 | Item 37 | 9 | 0.8 | 0.65–0.89
Item 3 | 5 | 0.40 | 0.25–0.55 | Item 38 | 9.4 | 0.84 | 0.70–0.92
Item 4 | 9.2 | 0.82 | 0.67–0.91 | Item 39 | 9 | 0.8 | 0.65–0.89
Item 5 | 9.2 | 0.82 | 0.67–0.91 | Item 40 | 5.6 | 0.46 | 0.32–0.61
Item 6 | 9.8 | 0.88 | 0.74–0.95 | Item 41 | 9.8 | 0.88 | 0.74–0.95
Item 7 | 9 | 0.8 | 0.65–0.89 | Item 42 | 9.4 | 0.84 | 0.70–0.92
Item 8 | 8.8 | 0.78 | 0.63–0.88 | Item 43 | 9.2 | 0.82 | 0.67–0.91
Item 9 | 9.4 | 0.84 | 0.70–0.92 | Item 44 | 6.2 | 0.52 | 0.37–0.67
Item 10 | 9.2 | 0.82 | 0.67–0.91 | Item 45 | 9 | 0.8 | 0.65–0.89
Item 11 | 9.8 | 0.88 | 0.74–0.75 | Item 46 | 9.4 | 0.84 | 0.70–0.92
Item 12 | 8.8 | 0.78 | 0.63–0.88 | Item 47 | 6.4 | 0.54 | 0.39–0.68
Item 13 | 9.6 | 0.86 | 0.72–0.94 | Item 48 | 9 | 0.8 | 0.65–0.89
Item 14 | 10 | 0.9 | 0.77–0.96 | Item 49 | 9.8 | 0.88 | 0.74–0.95
Item 15 | 10 | 0.9 | 0.77–0.96 | Item 50 | 9.4 | 0.84 | 0.70–0.92
Item 16 | 9.8 | 0.88 | 0.74–0.95 | Item 51 | 9 | 0.8 | 0.65–0.89
Item 17 | 10 | 0.9 | 0.77–0.96 | Item 52 | 8.6 | 0.76 | 0.61–0.87
Item 18 | 9.2 | 0.82 | 0.67–0.91 | Item 53 | 3 | 0.2 | 0.10–0.35
Item 19 | 4 | 0.3 | 0.18–0.45 | Item 54 | 8.6 | 0.76 | 0.61–0.87
Item 20 | 8.8 | 0.78 | 0.63–0.88 | Item 55 | 9.4 | 0.84 | 0.70–0.92
Item 21 | 5.8 | 0.48 | 0.33–0.63 | Item 56 | 9 | 0.8 | 0.65–0.89
Item 22 | 9 | 0.8 | 0.65–0.89 | Item 57 | 9.4 | 0.84 | 0.70–0.92
Item 23 | 8.2 | 0.72 | 0.57–0.84 | Item 58 | 9.2 | 0.82 | 0.67–0.91
Item 24 | 9 | 0.8 | 0.65–0.89 | Item 59 | 9.4 | 0.84 | 0.70–0.92
Item 25 | 9.6 | 0.86 | 0.72–0.94 | Item 60 | 9.4 | 0.84 | 0.70–0.92
Item 26 | 8.2 | 0.72 | 0.57–0.84 | Item 61 | 8.8 | 0.78 | 0.63–0.88
Item 27 | 9.4 | 0.84 | 0.70–0.92 | Item 62 | 8.6 | 0.76 | 0.61–0.87
Item 28 | 9.6 | 0.86 | 0.72–0.94 | Item 63 | 10 | 0.9 | 0.77–0.96
Item 29 | 9 | 0.8 | 0.65–0.89 | Item 64 | 9.2 | 0.82 | 0.67–0.91
Item 30 | 4.8 | 0.38 | 0.25–0.53 | Item 65 | 9.4 | 0.84 | 0.70–0.92
Item 31 | 9.8 | 0.88 | 0.74–0.95 | Item 66 | 9.2 | 0.82 | 0.67–0.91
Item 32 | 8.8 | 0.78 | 0.63–0.88 | Item 67 | 9.4 | 0.84 | 0.70–0.92
Item 33 | 9.8 | 0.88 | 0.74–0.98 | Item 68 | 9.8 | 0.88 | 0.74–0.95
Item 34 | 9 | 0.8 | 0.65–0.89 | Item 69 | 9.4 | 0.84 | 0.70–0.92
Item 35 | 5.8 | 0.48 | 0.33–0.63 | Item 70 | 9.4 | 0.84 | 0.70–0.92
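The V values in Table 1 follow Aiken's formula, V = (x̄ − l)/k, where x̄ is the judges' mean rating, l the scale minimum, and k the scale range; every figure in the table is reproduced with l = 1 and k = 10. A minimal sketch; the five ratings below are illustrative only, chosen to give the mean of 8.4 reported for Item 1:

```python
def aiken_v(ratings, lo=1, k=10):
    """Aiken's V: the mean judge rating rescaled to [0, 1].

    lo is the scale minimum and k the scale range; the values in
    Table 1 are consistent with lo = 1 and k = 10.
    """
    mean = sum(ratings) / len(ratings)
    return (mean - lo) / k

# Hypothetical ratings from five judges with mean 8.4 (as for Item 1):
print(round(aiken_v([8, 8, 8, 9, 9]), 2))  # 0.74
```

The confidence intervals reported alongside V are typically score-type intervals on the rescaled proportion; they are not recomputed here.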
Table 2. KMO and Bartlett tests.
KMO and Bartlett Tests
Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy: 0.814
Bartlett's test of sphericity: approx. chi-squared = 1719.359; df = 190; signif. = 0.000
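Bartlett's statistic in Table 2 is computed from the determinant of the p × p item correlation matrix R as χ² = −(n − 1 − (2p + 5)/6)·ln|R|, with p(p − 1)/2 degrees of freedom. A sketch using a simulated data matrix rather than the study data; only the df, which depends solely on the 20 retained items, necessarily matches the table:

```python
import numpy as np

def bartlett_sphericity(R, n):
    """Bartlett's test of sphericity for a p x p correlation matrix R
    estimated from n observations."""
    p = R.shape[0]
    _, logdet = np.linalg.slogdet(R)          # log-determinant of R
    chi2 = -(n - 1 - (2 * p + 5) / 6) * logdet
    df = p * (p - 1) // 2
    return chi2, df

# Illustrative data: 432 cases (the study's sample size) x 20 items.
rng = np.random.default_rng(0)
X = rng.normal(size=(432, 20))
R = np.corrcoef(X, rowvar=False)
chi2, df = bartlett_sphericity(R, n=432)
print(df)  # 190, matching Table 2
```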
Table 3. Factors forthcoming from the exploratory factor analysis (EFA).
Total Variance Explained
Component | Initial Eigenvalues (Total / % of Variance / Cumulative %) | Extraction Sums of Squared Loadings (Total / % of Variance / Cumulative %) | Rotation Sums of Squared Loadings (Total / % of Variance / Cumulative %)
1 | 3.915 / 19.573 / 19.573 | 3.915 / 19.573 / 19.573 | 2.450 / 12.248 / 12.248
2 | 1.613 / 8.065 / 27.638 | 1.613 / 8.065 / 27.638 | 1.996 / 9.980 / 22.228
3 | 1.377 / 6.886 / 34.524 | 1.377 / 6.886 / 34.524 | 1.595 / 7.977 / 30.205
4 | 1.171 / 5.855 / 40.379 | 1.171 / 5.855 / 40.379 | 1.436 / 7.179 / 37.383
5 | 1.090 / 5.449 / 45.828 | 1.090 / 5.449 / 45.828 | 1.389 / 6.946 / 44.330
6 | 1.001 / 5.003 / 50.830 | 1.001 / 5.003 / 50.830 | 1.300 / 6.500 / 50.830
7 | 0.962 / 4.810 / 55.640
8 | 0.916 / 4.578 / 60.218
9 | 0.906 / 4.530 / 64.748
10 | 0.816 / 4.082 / 68.830
11 | 0.771 / 3.856 / 72.686
12 | 0.737 / 3.684 / 76.370
13 | 0.720 / 3.598 / 79.968
14 | 0.682 / 3.409 / 83.377
15 | 0.645 / 3.226 / 86.603
16 | 0.613 / 3.067 / 89.670
17 | 0.570 / 2.850 / 92.520
18 | 0.542 / 2.711 / 95.230
19 | 0.497 / 2.485 / 97.715
20 | 0.457 / 2.285 / 100.000
Extraction and rotation sums are reported only for the six retained components. Extraction method: principal component analysis.
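Each cumulative-% column in Table 3 is the running sum of the per-component percentages (which are in turn eigenvalue/p × 100 for the p = 20 standardized items, since the eigenvalues of a correlation matrix sum to p). A quick check against the initial solution; the final value, 50.831, differs from the table's 50.830 only by last-digit rounding of the underlying unrounded figures:

```python
from itertools import accumulate

# % of variance for the six retained components (initial solution, Table 3)
pct = [19.573, 8.065, 6.886, 5.855, 5.449, 5.003]
cumulative = [round(c, 3) for c in accumulate(pct)]
print(cumulative)  # [19.573, 27.638, 34.524, 40.379, 45.828, 50.831]
```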
Table 4. Rotated component matrix.
Rotated Component Matrix a
Item | Component | Loading
Item 57 | 1 | 0.767
Item 58 | 1 | 0.630
Item 59 | 1 | 0.522
Item 51 | 1 | 0.480
Item 36 | 2 | 0.665
Item 55 | 2 | 0.587
Item 48 | 2 | 0.517
Item 62 | 2 | 0.490
Item 29 | 2 | 0.442
Item 38 | 3 | 0.713
Item 52 | 3 | 0.693
Item 8 | 3 | 0.403
Item 17 | 4 | 0.846
Item 31 | 4 | 0.450
Item 18 | 4 | 0.439
Item 1 | 5 | 0.804
Item 2 | 5 | 0.802
Item 4 | 5 | 0.716
Item 41 | 6 | 0.463
Item 61 | 6 | −0.447
Extraction method: principal component analysis. Rotation method: Varimax with Kaiser normalisation. a. Rotation converged in 12 iterations.
Table 5. Final Questionnaire on Landscape Perception—CPP.
Number of the Item in the Questionnaire Prior to Validation | Number in Final Questionnaire | Variable | Scale
Item 0 | Item 1 | Are you familiar with the European Landscape Convention? | YES/NO
Item 1 | Item 2 | The landscape is any part of the territory that people perceive | 1 to 5
Item 2 | Item 3 | The landscape is everything that can be seen from a given place | 1 to 5
Item 29 | Item 4 | The landscape needs to be taught through fieldwork or day trips | 1 to 5
Item 36 | Item 5 | The landscape should be taught as a mainstream subject in primary education | 1 to 5
Item 48 | Item 6 | The landscape is a dynamic environment | 1 to 5
Item 62 | Item 7 | The landscape is part of the syllabus in primary education | 1 to 5
Item 38 | Item 8 | The landscape should be taught through a textbook | 1 to 5
Item 52 | Item 9 | The cultural landscape is made up of human features | 1 to 5
Item 8 | Item 10 | The cultural landscape is more important than the natural landscape | 1 to 5
Item 17 | Item 11 | The social context we live in conditions the way we study the landscape | 1 to 5
Item 31 | Item 12 | Pupils should know how to identify anthropic features in order to interpret the landscape | 1 to 5
Item 18 | Item 13 | A combination of photographs and fieldwork is useful for studying the landscape’s social features | 1 to 5
Item 4 | Item 14 | Contemplating any kind of landscape generates positive feelings | 1 to 5
Item 41 | Item 15 | Simulation techniques enable me to understand how the landscape and the feelings it conveys change | 1 to 5
Item 61 | Item 16 | There is a need to further our understanding of the landscape because it is a feature that arouses positive feelings | 1 to 5
Item 57 | Item 17 | The city helps me to understand the urban landscape | 1 to 5
Item 58 | Item 18 | The landscape helps me to teach about rural life | 1 to 5
Item 59 | Item 19 | The main reason for studying the landscape is to provide pupils with tools for understanding geography | 1 to 5
Item 51 | Item 20 | Augmented reality is very useful for teaching about the landscape | 1 to 5
Item 55 | Item 21 | The landscape is a didactic instrument | 1 to 5
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
