Article

A Methodological Approach to Assessing Constructability in Building Maintenance and Its Impact on University Quality

Postgraduate University School—EUPG UNFV, Federico Villarreal National University, Lima 15039, Peru
* Authors to whom correspondence should be addressed.
Buildings 2025, 15(17), 3164; https://doi.org/10.3390/buildings15173164
Submission received: 27 July 2025 / Revised: 17 August 2025 / Accepted: 20 August 2025 / Published: 3 September 2025
(This article belongs to the Special Issue A Circular Economy Paradigm for Construction Waste Management)

Abstract

This study introduces and evaluates an innovative methodology for assessing constructability in the maintenance of university buildings, aiming to improve the quality of academic infrastructure. The proposed approach is based on four key criteria: functionality, usage, investment, and curricular planning. These criteria are derived from the principles established by the Chilean Construction Industry Council (CCI Chile, 2024) and were applied in a case study at Ricardo Palma University. A quasi-experimental research design was implemented in two physical spaces within the Faculty of Architecture and Urbanism, one of which underwent a maintenance intervention while the other remained unaltered. Data were collected through expert-validated instruments, administered to senior students and technical staff before and after the intervention. The results revealed significant improvements, with satisfaction levels increasing from 44% to 56% among students and a 10% rise in positive technical evaluations (p < 0.005), which reflected an improvement in the perceived quality of the academic environment, especially in areas related to maintenance planning, execution, control, safety, and user comfort. This study concludes that integrating constructability criteria into the maintenance phase can optimize infrastructure management, enhancing sustainability, operational efficiency, and user satisfaction. The developed methodology offers a practical and replicable tool for other academic units and universities, supporting continuous improvement and promoting evidence-based decision-making in the management of educational facilities.

1. Introduction

The sustained growth of university enrollment worldwide has placed significant pressure on the physical and functional capacity of higher education institutions. According to UNESCO, by 2024, more than 256 million individuals were enrolled in tertiary education programs, representing an 80% increase compared to 2010 [1]. This unprecedented expansion reflects not only an increase in demand for higher education but also the urgent need for adequate physical infrastructure to support such growth. The global expansion of higher education has intensified pressure on university campuses, which now face the dual challenge of accommodating more students while ensuring spaces remain functional, safe, and aligned with evolving academic needs. In the case of Peru, this tension is aggravated by reactive maintenance practices and limited budgets, creating a structural gap between enrollment growth and the functional capacity of existing educational spaces [2,3].
In Latin America, these dynamics unfold within contexts marked by unequal investment and long-standing infrastructure deficits. While countries such as Chile and Brazil have made advances in educational infrastructure modernization, others, including Peru, continue to struggle with deficiencies in comfort, accessibility, and technological integration. At Ricardo Palma University, for example, a study revealed that 54% of surveyed students perceived no substantial improvements in renovated spaces between 2014 and 2018 [4,5].
As illustrated in Figure 1, global disparities in higher education enrollment rates highlight inequities not only in access but also in the quality of infrastructure that supports academic activities. For Peru, this global contrast underscores how deficiencies in planning, outdated construction systems, and underfunded maintenance programs exacerbate existing inequalities in higher education, directly affecting students’ learning conditions and overall university quality [1,6]. In other words, while countries with higher enrollment rates have managed to couple expansion with systematic investments in infrastructure, the Peruvian case reflects how rapid enrollment growth has not been matched by adequate maintenance and modernization policies, producing an increasing gap between demand and capacity.
One of the recurring problems in university infrastructure management in Peru is the absence of standardized instruments to evaluate academic spaces after maintenance or renovation interventions. While projects are executed with the intention of modernizing facilities, there is no systematic way to verify whether these efforts enhance durability, functionality, or curricular fitness [7,8]. In practice, this weakness translates into premature deterioration, fragmented interventions, and limited evidence to guide future investments. Without reliable methodological indicators, universities risk making decisions based on superficial aspects, neglecting the long-term impact of maintenance on academic performance [9,10].
Maintenance, although often overlooked, is one of the most critical phases in the life cycle of a building. It is not merely a technical afterthought but a stage that directly affects sustainability, cost-efficiency, and user experience. Studies indicate that maintenance can represent between 60% and 80% of a building’s total life cycle costs [10], a figure that highlights its strategic importance.
Figure 2 illustrates the four dimensions proposed for evaluating constructability in the maintenance of academic infrastructure: functionality, use, investment, and alignment with the curricular plan [11,12]. These criteria enable a more integrated perspective, combining technical evaluation with educational objectives. Despite this importance, decisions regarding infrastructure investment in Peruvian universities are frequently influenced by immediate needs and budgetary constraints rather than by systematic methodologies. Consequently, many interventions lack continuity, present deficiencies in ventilation or spatial distribution, or fail to incorporate emerging technologies. Institutional assessments have confirmed that even renovated environments continue to exhibit shortcomings in comfort, flexibility, and technological adequacy, which negatively impact both teaching and learning activities [13]. This reality demonstrates the necessity of adopting models that evaluate maintenance not as an isolated phase but as a multidimensional process with long-term implications.
Constructability provides a useful conceptual framework to address these shortcomings. Initially designed to improve construction efficiency during design and execution, it has gradually expanded to encompass other phases of the building life cycle [14].
As shown in Figure 3, the life cycle of a building consists of six key stages, namely, planning, design, construction, operation, maintenance, and decommissioning, with maintenance being the longest and most influential for ensuring safety, efficiency, and adaptability of academic spaces [11,12,15]. By applying constructability principles to maintenance, universities can anticipate potential deficiencies, optimize resources, and align technical decisions with educational goals.
The adaptation of constructability principles to the maintenance phase requires reinterpreting them in a way that reflects the specific context of higher education.
Table 1 summarizes the evaluation criteria and associated variables used in this study, which include functionality (circulation flow, spatial flexibility, equipment integration), use (compatibility with activities, accessibility, intensity of use), investment efficiency (resource optimization, maintenance costs, durability), and curricular alignment (fit with academic programs, space distribution, scheduling). This framework provides a structured approach to evaluating interventions and ensuring that resources are used effectively to enhance both infrastructure and learning conditions [16,17].
Another innovative aspect of this study is the inclusion of Post-Occupancy Evaluation (POE). POE is essential in educational contexts because it integrates user perceptions, comfort, accessibility, and adequacy of learning conditions into the evaluation process, complementing technical assessments that might otherwise overlook these dimensions [18,19]. In environments where adaptability and flexibility are critical, POE ensures that interventions are aligned not only with architectural standards but also with the day-to-day experiences of students and faculty. In Peru, however, the lack of articulation between physical investments and curricular requirements remains a persistent challenge. Renovations are often executed without considering academic intensity, compatibility with specific activities, or the flexibility needed to accommodate curriculum changes, producing underutilized environments and inefficient use of resources [20].
This article proposes and validates an innovative methodology that combines constructability criteria with post-occupancy evaluation. Using a quasi-experimental design, two comparable academic spaces within the Faculty of Architecture and Urbanism at Ricardo Palma University were analyzed. One of them, a computer laboratory, was renovated under the proposed criteria, while the other, a BIM laboratory, remained unaltered, serving as a control group [21]. This design enabled a systematic comparison of pre- and post-intervention conditions, providing robust evidence of the impact of constructability-based maintenance on both technical performance and user satisfaction [4,15,16].
The contributions of this study are threefold: first, it develops a methodological approach to assess constructability during the maintenance phase of university infrastructure; second, it empirically demonstrates the impact of this methodology on perceived quality and service performance using statistical tests; and third, it offers a replicable and context-sensitive framework that can guide decision-making in other higher education institutions. Section 2 presents methodological design, Section 3 outlines the results, Section 4 discusses the implications for practice and policy, and Section 5 concludes with recommendations for future research.
Operational Definitions and Scope. In this study, “a methodological approach” refers to a mixed-method, quasi-experimental design that integrates technical evaluation (checklists and normative contrast) with user-centered tools (Likert surveys and structured observation) to compare an intervened space with a non-intervened control. “Constructability in the maintenance phase” is understood as the systematic application of constructability principles, adapted from CCI Chile, to planning, execution, and control of maintenance activities to optimize functionality, use, investment, and curricular alignment during the longest stage of the building life cycle. “University quality,” for the purposes of this article, denotes the perceived quality and service performance of academic spaces by their users (students and technical staff), in coherence with educational-organization criteria such as fitness for purpose, safety/well-being, and alignment with curricular requirements [21,22].

2. Materials and Methods

This study adopts a mixed-methods approach to evaluate the impact of applying a tool based on constructability principles on the perceived quality of academic spaces in the context of Peruvian higher education. Based on an applied, explanatory, and quasi-experimental design (Figure 4), two comparable spaces were selected: one that underwent intervention and another that did not, both located within the Faculty of Architecture and Urbanism at Ricardo Palma University. This approach allowed for the identification of measurable differences between the two environments and provided insight into how technical criteria and user experience influence the evaluation of university infrastructure. Furthermore, the study is based on the hypothesis that integrating constructability criteria into academic maintenance processes can significantly improve users’ perception of quality in university environments.
As shown in Table 2, the methodological framework adopted in this study integrates the fundamental components needed to evaluate the influence of constructability criteria on the maintenance of educational buildings in a structured manner. It is an applied and comparative approach that combines both quantitative and qualitative tools, such as Likert-scale surveys, semi-structured interviews, and technical checklists, to gather data from multiple perspectives. The study population includes both direct users (students) and technical staff responsible for maintenance, allowing for robust methodological triangulation. This framework makes it possible not only to measure the perception of spatial quality but also to compare it against national and international technical standards such as the RNE and ISO 21001. Overall, the research design reflects an effort to align the planning, execution, and control of maintenance processes with the actual needs of the university environment.

2.1. Study Design

The study was conducted under a quasi-experimental design with non-randomized groups, allowing for a comparative analysis of the perceived quality of two equivalent academic spaces. One space, a computer laboratory, was intervened using an innovative methodological proposal, while the other, a BIM laboratory, retained its original configuration and served as the control group. This design enabled a more precise observation of the effects attributable to the architectural intervention based on the adapted constructability criteria [23].
Pretest and post-test evaluations were carried out from both a technical perspective and the users’ perception, with the aim of identifying significant differences in variables related to comfort, functional adequacy, operational efficiency, and curricular alignment. The implemented methodology was based on the adaptation of four key principles from the constructability model developed by CCI Chile: functionality, use, investment efficiency, and alignment with the curricular plan [24].
To ensure replicability, all steps of the quasi-experimental design were documented in a procedural protocol, including site selection, the timing of pre- and post-evaluations, and the use of identical instruments across groups.
Rationale for Site Selection. The computer laboratory (intervention) and the BIM laboratory (control) were selected due to comparable size, user load, and curricular centrality in digital design courses; both exhibit intensive technological use and similar scheduling, enabling a like-for-like contrast of maintenance decisions.

2.2. Participants and Context

The study population consisted of senior architecture students who used both evaluated spaces during the 2023-2 academic semester. A non-probabilistic, purposive sampling method was applied, selecting students who had direct experience with both environments. A minimum of 30 students were surveyed, meeting the threshold required to perform statistical analyses with acceptable significance levels [25].
In addition, the technical perspective of ten professionals and technicians from the university’s Maintenance Office was included. Their participation was essential to validate the feasibility of the methodology from both operational and regulatory perspectives. The intervened spaces were characterized by intensive use and were directly related to key academic activities such as computer-aided design, three-dimensional modeling, and BIM simulation. The sampling rationale was to capture both user perception and technical feasibility, thereby triangulating perspectives. The study period was one full academic semester (2023-2), ensuring repeated exposure of participants to both spaces.

2.3. Instruments and Data Collection

Three primary instruments were developed and specifically validated to assess the influence of the applied methodology on both technical and perceptual aspects:
  • User Perception Survey: Administered as both pretest and post-test to students using a five-point Likert scale. It assessed variables such as perceived spatial quality, thermal comfort, lighting, accessibility, furniture distribution, and functional adequacy.
  • Technical Evaluation Form: Designed for technical staff, this instrument evaluated the implementation of the four adapted constructability criteria across the three phases of maintenance: planning, execution, and control. It included indicators such as compliance with design requirements, operational sustainability, costs, and durability [26].
  • Structured Observation Guide: Used by the research team to document physical, technical, and spatial conditions of both environments. This tool supported the triangulation of user perceptions with observable evidence.
All instruments were validated by expert judgment, including architects, faculty members, and maintenance technicians. The content validity indices exceeded 0.80. The reliability of the user perception survey was verified using Cronbach’s alpha, with results ranging from 0.82 to 0.90 across dimensions, ensuring the internal consistency of the applied questionnaire [27].
Each dimension was measured with multiple items: for example, Planning and Design included questions on adequacy of spatial distribution, compliance with regulations, and furniture integration; Execution considered indicators such as ventilation, lighting, and safety; Control examined maintenance monitoring, durability, and compliance with scheduling; while Perceived Quality included user comfort, accessibility, and overall satisfaction.
All instruments were piloted with a small group (n = 8) to refine wording and scale reliability.
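As an illustration of the internal-consistency check described above, the standard Cronbach’s alpha formula can be sketched as follows. The response matrix is hypothetical (the actual survey data are not reproduced here); only the computation itself follows the usual definition.

```python
from statistics import variance

def cronbach_alpha(rows):
    """Cronbach's alpha for a list of per-respondent item-score lists."""
    k = len(rows[0])                                  # number of items in the dimension
    cols = list(zip(*rows))                           # per-item score columns
    item_var_sum = sum(variance(c) for c in cols)     # sample variance of each item
    total_var = variance([sum(r) for r in rows])      # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Hypothetical five-point Likert responses (6 respondents x 4 items)
scores = [
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
    [3, 4, 3, 3],
]
print(round(cronbach_alpha(scores), 2))  # 0.91
```

Values above roughly 0.80, as obtained for the study’s dimensions, are conventionally taken to indicate acceptable internal consistency.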

2.4. Data Analysis

The data were processed using a mixed-methods approach, combining quantitative and qualitative analyses. On the quantitative side, descriptive and inferential statistics were applied, including paired-sample Student’s t-tests and ANOVA, to compare results from the pretest and post-test stages, as well as between the control and experimental groups. This analysis enabled the identification of statistically significant differences attributable to the architectural intervention, particularly regarding thermal comfort, spatial flexibility, and alignment with the academic curriculum [28].
On the qualitative side, structured interviews with technical staff and observation records were analyzed through content analysis. This triangulation helped to validate users’ perceptions against actual conditions observed in the field.
Additionally, a comparative technical evaluation matrix was developed to assess both spaces. This matrix was based on regulatory dimensions from the Peruvian National Building Code (RNE) and the ISO 21001 [21], which targets educational organizations. The comparison enabled the identification of objective physical improvements associated with the implementation of the proposed methodology.
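The logic of such a comparative matrix can be sketched minimally as follows. The criterion names and threshold values below are hypothetical placeholders chosen for illustration, not actual RNE or ISO 21001 figures; the area-per-person values are those reported later for the two laboratories.

```python
# Hypothetical minimum thresholds (NOT actual RNE or ISO 21001 figures)
THRESHOLDS = {
    "area_per_person_m2": 1.5,
    "illuminance_lux": 300,
    "air_changes_per_hour": 4,
}

# Measured values per space (area figures from the study; others illustrative)
spaces = {
    "intervened":     {"area_per_person_m2": 1.90, "illuminance_lux": 350, "air_changes_per_hour": 6},
    "non_intervened": {"area_per_person_m2": 1.71, "illuminance_lux": 280, "air_changes_per_hour": 3},
}

def contrast(measured: dict, thresholds: dict) -> dict:
    """Flag each criterion as compliant (True) or non-compliant against its minimum."""
    return {k: measured[k] >= v for k, v in thresholds.items()}

for name, measured in spaces.items():
    print(name, contrast(measured, THRESHOLDS))
```

In the study, each row of the matrix maps a normative requirement to an observed condition in both spaces, allowing objective physical improvements to be identified independently of user perception.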
Given the ordinal nature of Likert responses and the non-normality expected in small educational samples, we prioritized non-parametric tests (Wilcoxon signed-rank and Mann–Whitney U) with a two-tailed α = 0.05; effect directions were interpreted alongside medians and percentage shifts to enhance practical significance. Quantitative analyses were carried out using SPSS v.27; qualitative data from interviews and observation guides were examined through content analysis. Two independent researchers coded the data manually to enhance reliability and reduce bias. All datasets were anonymized prior to analysis.
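The signed-rank procedure can be sketched with its normal-approximation Z statistic, the form in which the study reports its results. The paired Likert scores below are hypothetical; the function implements the textbook large-sample approximation, dropping zero differences and mid-ranking ties.

```python
import math

def wilcoxon_signed_rank_z(pre, post):
    """Normal-approximation Z for the Wilcoxon signed-rank test
    (zero differences dropped, tied |d| values given average ranks)."""
    diffs = [b - a for a, b in zip(pre, post) if b != a]
    n = len(diffs)
    order = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg_rank = (i + j) / 2 + 1          # average of 1-based ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)  # sum of positive ranks
    mean = n * (n + 1) / 4
    sd = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    return (w_plus - mean) / sd

# Hypothetical paired five-point Likert ratings (same respondents, pre vs. post)
pre  = [3, 3, 2, 4, 3, 2, 3, 4, 3, 2, 3, 3, 4, 2, 3]
post = [4, 4, 3, 4, 4, 3, 4, 5, 4, 3, 4, 4, 4, 3, 4]
z = wilcoxon_signed_rank_z(pre, post)
print(round(z, 2))  # |z| > 1.96 indicates significance at a two-tailed alpha of 0.05
```

In practice, statistical packages such as SPSS (used in this study) apply the same approximation, with additional corrections for ties and continuity.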

3. Results

The Results section is organized into five subsections corresponding to the methodological dimensions defined in Section 2: (i) overall influence of constructability on service quality, (ii) planning and design, (iii) execution, (iv) control, and (v) technical–normative contrast. This structure allows for a stepwise presentation from general findings to specific dimensions, ensuring logical coherence and alignment with the methodological framework.
This chapter presents the findings obtained after implementing the constructability evaluation methodology in the maintenance of university buildings. The main objective of the research was to analyze the impact of this methodology on the perceived quality of academic infrastructure by comparing two equivalent environments: one intervened using adapted constructability criteria, and the other maintained without modifications, serving as the control group. The results are structured according to the methodological dimensions evaluated (planning and design, execution, and maintenance control) and are aligned with the perceived service quality indicators: user satisfaction, environmental comfort, safety conditions, and alignment with the academic curriculum.

3.1. General Influence of Constructability Methodology on the Quality of University Infrastructure Services

The overall results show that the implementation of the methodology had a positive impact on both maintenance performance and the perceived quality of infrastructure services. Table 3 summarizes the comparison of both variables before and after the intervention.
As shown in Table 4, there was an increase in both variables. The level of maintenance performance improved from “effective” to “efficient,” while user satisfaction rose from “slightly satisfied” to “satisfied.” These changes reflect a significant improvement attributable to the structured application of constructability principles.
As shown in Figure 5, following the implementation of the methodology, 50% of the technical staff rated the intervened environment as either “efficient” or “highly efficient,” whereas only 40% gave this rating to the non-intervened environment, indicating a 10% improvement.
As shown in Figure 6, the intervened laboratory reached 56% of students reporting being “satisfied” or “very satisfied,” representing a 6% increase compared to the non-intervened environment.
To statistically verify the significance of these differences, the Wilcoxon signed-rank test (Table 5) and the Mann–Whitney U test (Table 6) were applied, with the results presented below.
The results of these tests confirm the existence of statistically significant differences between the compared groups (p < 0.005), validating the positive effect of the implemented methodology on the quality of service and the maintenance of academic infrastructure (Table 7).

3.2. Planning and Design of Maintenance and Its Impact on Perceived Quality of Academic Spaces

Following the general results, the analysis now turns to the planning and design stage, which represents the anticipatory phase of maintenance. Planning and designing maintenance activities represent the most decisive phases to ensure that academic spaces adequately meet pedagogical and operational needs. By applying constructability methodological criteria at this stage, the aim was to align technical processes with the user experience, adopting an integrated and anticipatory approach.
The comparison between the intervened and non-intervened laboratories revealed substantial improvements in perceived quality from both students and technical staff. Table 8 presents the results obtained for both key variables before and after the intervention:
As shown in Table 8, both variables exhibit a one-point improvement on the scale, indicating a shift from an “acceptable” level to an “optimal” one. This improvement is reinforced by interview testimonials, where users highlighted greater coherence between the functional requirements of the course and the physical conditions of the environment.
As shown in Figure 7, 50% of the maintenance personnel rated the intervened environment as efficient or notably efficient, compared to 40% in the non-intervened laboratory. This indicates a 10% increase in favorable technical perception following the implementation of the methodology.
According to Figure 8, student satisfaction with lighting and ventilation increased from 48% before the intervention to 72% afterwards, a statistically significant improvement.
To statistically verify the significance of these improvements, the Wilcoxon signed-rank test was applied, with the results presented for the planning and design stage (Table 9) and for the perception of academic space quality (Table 10).
The above results confirm that the implemented methodology had a significant effect on improving both technical planning and the perception of academic spaces (Table 11). Together, these findings demonstrate that anticipatory design based on constructability criteria enables better resource optimization and creates more functional and comfortable environments for university education.

3.3. Execution of Maintenance and Its Influence on Safety and Wellbeing in University Environments

Building on planning/design, the next focus is the execution stage, where planning decisions materialize into tangible interventions. The maintenance execution phase represents the point at which planning decisions materialize. An efficient execution aligned with constructability principles can generate direct impacts on physical safety, user wellbeing, and academic performance. This section evaluates the relationship between the application of such methodology and tangible improvements in university environmental conditions (Table 12).
The results show a consistent one-point improvement in both variables following the intervention, suggesting a positive impact on operational effectiveness and the environmental conditions of the academic setting. This improvement was supported by field observations and the technical responses from maintenance personnel. Specifically, the intervention included the replacement of conventional desks with modular ergonomic furniture, the installation of structured cabling to improve connectivity, and the application of technical porcelain flooring to enhance durability. These tangible changes were directly associated with improvements in safety and comfort.
As shown in Figure 9, 50% of the staff considered that the intervened environment reached a level of “efficient” or “efficient with merit,” compared to only 40% in the non-intervened environment, reflecting a 10% improvement in the perception of execution.
According to Figure 10, the intervened laboratory obtained 57% positive responses (“satisfied” and “very satisfied”) compared to 43% in the non-intervened environment, representing a 14% improvement in this dimension.
To statistically assess the intervention effects, the Wilcoxon signed-rank test was applied, with significant results obtained for maintenance execution (Table 13) and for the perception of safety and well-being (Table 14).
The results obtained reflect a significant influence of the implemented methodology on the improvement of maintenance execution, especially regarding environmental hygiene, safe access, adequate ventilation, and risk control (Table 15). These findings confirm that an execution phase based on structured technical criteria can positively impact the perception of well-being and safety among university users.

3.4. Maintenance Control and Its Relationship with Perceived Quality of Academic Space

After execution, attention shifts to the control stage, emphasizing supervision and monitoring to ensure long-term quality. Maintenance control represents the final stage of the technical process, during which the intervention is monitored, validated, and reviewed. At this stage, the rigorous application of a methodology based on constructability not only ensures the quality of execution but also helps maintain the long-term functionality of spaces, optimizing resources and enhancing user experience.
In this study, the influence of technical control, based on four key criteria (functionality, use, investment, and curricular alignment), was evaluated in relation to users’ perception of academic space quality. To this end, the conditions before and after the intervention were compared (Table 16).
The results show a uniform improvement of one point in both variables. This progression suggests that strengthening technical control supports not only compliance with operational standards but also enhances users’ perception of order, spatial coherence, and comfort.
As shown in Figure 11, 70% of the technical staff rated the intervened environment as efficient or highly efficient in terms of maintenance control, compared to only 50% in the non-intervened environment. This 20% difference represents a notable improvement in the consistency and quality of project monitoring.
As shown in Figure 12, the intervened environment reached 50% of responses in the “satisfied” and “very satisfied” categories, surpassing the non-intervened environment by 6%. This reflects a more favorable perception of the academic environment as a result of effective supervision.
To compare the control and intervention groups, the Wilcoxon signed-rank test was applied, showing significant differences in maintenance execution (Table 17) and in the perception of safety and well-being (Table 18).
The statistical analyses demonstrate that the constructability-based intervention had significant impacts on the maintenance control dimension, consolidating a more rigorous, transparent, and standards-aligned management model (Table 19). This model is consistent with regulatory frameworks such as the Peruvian National Building Regulations (RNE), ISO 21001, and BIM principles for higher education infrastructure.

3.5. Technical-Normative Contrast: Physical–Spatial Analysis of Academic Environments

Finally, a technical–normative contrast was conducted to validate whether perceived changes corresponded to objective improvements against standards. As a complement to the statistical findings, a comparative technical analysis was conducted between the intervened and non-intervened environments, considering physical–spatial, operational, and regulatory criteria. This phase is aimed at triangulating quantitative and qualitative data through an objective evaluation based on recognized standards, to verify whether the changes perceived by users and technical staff were reflected in tangible improvements in infrastructure.
The evaluation was guided by the methodological principles of the constructability model proposed by CCI Chile (2024), adapted to four key criteria: functionality, use, investment, and curricular alignment. In addition, national and international standards were incorporated, including the Peruvian National Building Regulations (RNE), ISO 21001, ISO 9241-6, ISO 11064-4, and the guidelines of the BIM Plan Peru.
As observed in Table 20, the intervened environment shows notable improvements across all evaluated criteria. Greater functional efficiency was recorded in terms of spatial distribution, accessibility, and ergonomic furniture design. Additionally, better environmental conditions (lighting and ventilation) were identified, along with a higher level of technological adequacy and curricular alignment. For instance, the increase in area per person (1.90 m² vs. 1.71 m²) improved circulation and reduced overcrowding, which students highlighted as enhancing comfort. Likewise, improved ventilation and optimized lighting were perceived by users as contributing to thermal stability and reduced visual fatigue. The technological upgrades (48 PCs with BIM software and interactive whiteboards) were explicitly mentioned by students as strengthening curricular alignment and academic performance.
This contrast confirms that the application of a technically and normatively driven methodology not only improves user perception but also ensures sustainable transformations aligned with the contemporary demands of higher education.

4. Discussion

The results of this research confirm that the implementation of a methodology grounded in the principles of constructability has a significant effect on improving the quality of university infrastructure. This conclusion is supported by a robust analysis that integrates quantitative, qualitative, and technical–normative approaches, offering clear and systematic evidence of the effectiveness of the applied intervention.
From a statistical perspective, the Wilcoxon test yielded Z = −2.705 with p = 0.007, below the α = 0.05 significance threshold, indicating significant differences in perceived quality between the control and experimental groups. This difference in favor of the intervened environment is also reflected in the technical analyses, which report objective improvements in key indicators such as accessibility, thermal comfort, lighting, and curricular alignment. The methodology not only articulated technical and operational dimensions but also incorporated functional and pedagogical aspects, in accordance with normative frameworks such as the Peruvian National Building Regulations (RNE), ISO 21001, and the Constructability Guide by CCI Chile (2024).
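For readers who wish to reproduce this type of analysis, the signed-rank procedure behind these Z values can be sketched as follows. The paired Likert scores below are hypothetical placeholders, not the study's data, and the normal approximation shown omits the tie-variance correction that statistical packages such as SPSS apply, so results will differ slightly from software output.

```python
from math import erf, sqrt

def wilcoxon_signed_rank(before, after):
    """Two-sided Wilcoxon signed-rank test via the normal approximation.

    Zero differences are dropped and tied absolute differences receive
    average ranks; no tie-variance correction is applied, so p-values
    are approximate.
    """
    diffs = [a - b for b, a in zip(before, after) if a != b]
    ordered = sorted(abs(d) for d in diffs)
    rank_of, i = {}, 0
    while i < len(ordered):
        j = i
        while j < len(ordered) and ordered[j] == ordered[i]:
            j += 1
        rank_of[ordered[i]] = (i + j + 1) / 2  # average rank of the tie group
        i = j
    w_plus = sum(rank_of[abs(d)] for d in diffs if d > 0)  # sum of positive ranks
    n = len(diffs)
    mu = n * (n + 1) / 4
    sigma = sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w_plus - mu) / sigma  # sign depends on the direction convention used
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided p from |z|
    return z, p

# Hypothetical paired Likert scores (1 = low, 4 = high) for ten respondents
before = [2, 2, 3, 2, 2, 3, 2, 2, 2, 3]
after = [3, 3, 3, 3, 3, 3, 3, 2, 3, 3]
z, p = wilcoxon_signed_rank(before, after)
```

Note that the sign of Z depends on whether differences are taken as "after minus before" or the reverse; SPSS reports it relative to the smaller rank sum, which is why published values such as Z = −2.705 are negative even when scores improved.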
The positive outcomes were evident both in user perceptions and in technical evaluations. The analyzed figures show notable increases: for instance, during the planning and design stage, student satisfaction rose by 6 percentage points and the technical evaluation improved by 10 percentage points relative to the non-intervened environment. This impact is directly associated with improved spatial adequacy for academic activities, in line with standards such as ISO 11064-4 and the BIM Plan Peru guidelines.
Likewise, the execution of maintenance activities showed significant improvements. Following the intervention, a 14-percentage-point increase in perceived efficiency was recorded, reflecting enhancements in response times, handling of critical access areas, and resource optimization. These findings demonstrate that a more technical and anticipatory maintenance management approach can directly influence user safety and well-being.
In the control phase, student satisfaction rose by 6 percentage points and the maintenance team's evaluation improved by 20 percentage points, further consolidating the positive influence of the applied methodology. These results are complemented by the physical–spatial comparative analysis, in which the intervened environment consistently outperformed the non-intervened one across the four evaluated criteria: functionality, use, investment, and curricular alignment. These technical improvements are in line with current regulations and meet the evolving academic demands of higher education.
In summary, this study demonstrates that applying constructability principles during the maintenance phase effectively addresses structural gaps that persist in many Latin American universities. This incorporation not only enhances comfort, safety, and functionality levels within educational spaces but also strengthens the alignment between infrastructure and institutional learning objectives. In this sense, constructability transcends its traditional association with the early phases of a building’s life cycle and reaffirms itself as an essential tool for the continuous management of educational quality.

5. Conclusions

The application of a methodology to evaluate constructability in the maintenance of university buildings proved to be effective, coherent, and replicable. The analysis confirmed its positive influence on the functional, technical, and perceived quality of academic environments, allowing for tangible improvements in the spatial and operational conditions of the intervened areas.
The proposed methodology, based on the criteria of functionality, use, investment, and curricular alignment, and guided by the twelve principles of CCI Chile (2024), successfully integrated technical considerations with the real needs of the educational community. This integration enabled more efficient and sustainable maintenance processes in accordance with regulatory standards such as the Peruvian National Building Regulations (RNE), ISO 21001, and the BIM Plan Peru guidelines.
Statistical results empirically validated both the general and specific hypotheses. The improvements observed across the planning, execution, and control stages of maintenance were significant, not only in terms of user perception but also from a technical and normative evaluation standpoint. Furthermore, this study demonstrated that constructability should not be restricted to the early phases of a building’s life cycle. Its application during the maintenance stage can optimize resources, minimize rework, and enhance the overall quality of the academic environment.
The evaluation of key indicators, such as compliance with quality requirements, fulfillment of initial design, cost and time efficiency, and social and environmental impact, provided a comprehensive perspective on university maintenance performance. These indicators, beyond offering technical evidence, also helped establish useful parameters for institutional decision-making.
Overall, the developed methodology represents a strategic management tool that is technically sound and scalable. Its implementation not only improves the physical quality of university facilities but also strengthens the relationship between infrastructure and curriculum. It fosters environments that are more functional, safe, and aligned with contemporary academic goals. This proposal can be adopted and adapted by other institutions seeking to transition toward more efficient, sustainable, and user-centered maintenance models.

Author Contributions

Conceptualization, D.E. and M.E.; methodology, M.E.; formal analysis, M.E.; investigation, M.E.; resources, D.E.; data curation, M.E.; writing—original draft preparation, M.E.; writing—review and editing, D.E.; visualization, M.E.; supervision, D.E.; project administration, M.E.; funding acquisition, D.E. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Ethical review and approval were waived for this study due to its non-interventional and observational nature. No sensitive or personally identifiable information was collected.

Informed Consent Statement

Informed consent was waived due to the anonymous nature of the survey and the minimal risk posed to participants.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available due to privacy restrictions.

Conflicts of Interest

The authors declare no conflicts of interest.

References

1. UNESCO. Global Education Monitoring Report 2024; United Nations Educational, Scientific and Cultural Organization: Paris, France, 2024.
2. Arredondo, J.; López, D. Calidad y sostenibilidad en infraestructuras educativas: Desafíos en Latinoamérica. Rev. Educ. Desarro. 2022, 45, 33–47.
3. Gómez, F. Espacios para el aprendizaje: Arquitectura educativa contemporánea. Rev. Arquit. 2021, 28, 55–69.
4. Escate, M. Post-Occupancy Evaluation in Academic Environments: The Case of Ricardo Palma University. Master's Thesis, Universidad Nacional Federico Villarreal, Lima, Peru, 2024.
5. López, R. Percepción de la calidad del espacio universitario. Rev. Innov. Educ. 2020, 22, 15–28. (In Spanish)
6. CEPAL. Panorama Social de América Latina 2023; Comisión Económica para América Latina y el Caribe (CEPAL): Santiago, Chile, 2023.
7. Soto, A.; García, P. Obsolescencia funcional en la infraestructura universitaria. Estudios Urbanos 2023, 36, 101–120. (In Spanish)
8. Durand, L. Estrategias de mantenimiento preventivo para entornos educativos. Rev. Ing. Soc. 2022, 39, 90–104. (In Spanish)
9. Muñoz, E. La gestión del mantenimiento como factor de calidad educativa. Educare 2021, 25, 67–84. (In Spanish)
10. Cardoso, J. El mantenimiento en el ciclo de vida del edificio. Rev. Ing. Civ. 2022, 56, 49–63. (In Spanish)
11. Sánchez, A.; Torres, M. Enfoque sistémico para el mantenimiento universitario. Rev. Gestión Educ. 2021, 29, 13–24. (In Spanish)
12. Salazar, P. Constructabilidad: Fundamentos y aplicaciones en arquitectura. Rev. Técnica Constr. 2022, 18, 77–89.
13. Ministerio de Educación del Perú. Lineamientos para el Diseño y Mantenimiento de Infraestructura Educativa Universitaria; Ministerio de Educación del Perú: Lima, Peru, 2023. (In Spanish)
14. Escate, M. Application of Constructability Principles in University Environments. Master's Thesis, Universidad Nacional Federico Villarreal, Lima, Peru, 2024. (In Spanish)
15. Martínez, V. Arquitectura sostenible y bienestar en entornos académicos. Rev. Arquit. Soc. 2023, 34, 58–70.
16. CCI Chile. Guía de Constructabilidad para Proyectos de Construcción; Consejo de Construcción Industrializada: Santiago, Chile, 2024.
17. CCI Chile. 12 Principios de la Constructabilidad Adaptada; Consejo de Construcción Industrializada: Santiago, Chile, 2024.
18. Fainstein, S.; Cedeño, J. Análisis post-ocupacional: Metodología centrada en el usuario. Rev. Eval. Espac. 2021, 12, 45–60.
19. Rojas, B. Flexibilidad espacial en contextos educativos. Rev. Arquit. Pedagógica 2022, 19, 34–49. (In Spanish)
20. MINEDU. Plan BIM Perú 2023–2027; Ministerio de Educación del Perú: Lima, Peru, 2023.
21. ISO 21001:2018; Management Systems for Educational Organizations—Requirements with Guidance for Use. International Organization for Standardization: Geneva, Switzerland, 2018.
22. ISO 9241-6:2001; Ergonomic Requirements for Office Work with Visual Display Terminals (VDTs)—Part 6: Guidance on the Work Environment. International Organization for Standardization: Geneva, Switzerland, 2001.
23. Hernández, R.; Fernández, C.; Baptista, P. Metodología de la Investigación; McGraw-Hill: Mexico City, Mexico, 2020.
24. Rubio, M. Adaptación de principios constructivos al contexto educativo. Rev. Edif. 2021, 15, 91–108.
25. Mertens, D.M. Research and Evaluation in Education and Psychology: Integrating Diversity with Quantitative, Qualitative, and Mixed Methods; SAGE Publications: Thousand Oaks, CA, USA, 2019.
26. ININSA. Normas Técnicas para Instalaciones Eléctricas en Edificaciones Públicas; Instituto Nacional de Infraestructura Sanitaria: Lima, Peru, 2023.
27. Tavakol, M.; Dennick, R. Making sense of Cronbach's alpha. Int. J. Med. Educ. 2011, 2, 53–55.
28. Field, A. Discovering Statistics Using IBM SPSS Statistics, 5th ed.; SAGE: London, UK, 2018.
29. ISO 11064-4:2013; Ergonomic Design of Control Centres—Part 4: Layout and Dimensions of Workstations. International Organization for Standardization: Geneva, Switzerland, 2013.
Figure 1. Global distribution of higher education enrollment rates (%).
Figure 2. Key dimensions for evaluating constructability during the maintenance phase of university infrastructure.
Figure 3. Building lifecycle diagram with emphasis on the maintenance phase.
Figure 4. Methodological framework for constructability evaluation in building maintenance.
Figure 5. Comparison of perceived performance by maintenance professionals between the intervened environment and the non-intervened environment.
Figure 6. Student satisfaction before (44%) and after maintenance intervention (56%) in the computer laboratory. This figure highlights a 12-percentage point increase in satisfaction (Wilcoxon signed-rank test, p < 0.005).
Figure 7. Perceived safety before and after the intervention in the computer laboratory compared with the control laboratory.
Figure 8. Student satisfaction with lighting and ventilation before and after the maintenance intervention.
Figure 9. Evaluation of maintenance execution according to technical staff.
Figure 10. Percentage of students satisfied with durability and materials before (46%) and after the maintenance intervention (70%) in the computer laboratory.
Figure 11. Perceived accessibility and comfort before and after the intervention in the computer laboratory compared with the control laboratory.
Figure 12. Perceived quality of academic space according to students.
Table 1. Evaluation criteria and variables applied in the constructability-based methodology.
Evaluation Criterion | Variables
Functionality | Circulation flow, spatial flexibility, equipment integration
Use | Compatibility with activities, accessibility, intensity of use
Investment Efficiency | Resource optimization, maintenance costs, durability
Curricular Plan | Alignment with academic program, space distribution, scheduling
Table 2. Dimensions and reliability of the user perception survey.
Evaluated Dimension | Cronbach's Alpha
Planning and Design (V1-D1) | 0.825
Execution (V1-D2) | 0.894
Control (V1-D3) | 0.903
Perceived Quality (V2-D1) | 0.889
Safety and Well-being (V2-D2) | 0.837
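The reliability coefficients reported for the survey dimensions follow the standard Cronbach's alpha formula, α = k/(k−1) · (1 − Σs²ᵢ/s²ₜ), where k is the number of items, s²ᵢ the per-item variances, and s²ₜ the variance of respondents' total scores. A minimal sketch, using hypothetical Likert responses rather than the study's survey data:

```python
from statistics import pvariance

def cronbach_alpha(item_columns):
    """Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance(totals)).

    `item_columns` is a list of columns, one per survey item, each holding
    the scores of all respondents (population variances are used).
    """
    k = len(item_columns)
    totals = [sum(row) for row in zip(*item_columns)]  # per-respondent totals
    item_var = sum(pvariance(col) for col in item_columns)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Hypothetical 4-point Likert responses: 3 items answered by 5 respondents
items = [
    [3, 4, 2, 4, 3],
    [3, 4, 2, 3, 3],
    [2, 4, 1, 4, 3],
]
alpha = cronbach_alpha(items)
```

Values above roughly 0.8, as in the table, indicate good internal consistency of a dimension's items (see Tavakol and Dennick [27]).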
Table 3. Comparative evaluation matrix: intervened vs. non-intervened environment.
Criterion | Intervened Environment | Non-Intervened Environment
Internal circulation | Adequate | Obstructed
Natural lighting | Optimized | Insufficient
Ventilation | Natural cross-ventilation | Poor
Thermal comfort | Stable | Unstable
Curricular alignment | High | Partial
Table 4. Comparison of maintenance performance and service satisfaction before and after the intervention.
Group/Dimension | Before (Median) | After (Median)
Variable 1: Maintenance performance level | 2 * | 3 *
Variable 2: Satisfaction level | 2 * | 3 *
* Likert scale from 1 (low) to 4 (high).
Table 5. Wilcoxon signed-rank test for the maintenance performance variable.
Statistic | Value
Z | −2.705
p-value | 0.007
Table 6. Wilcoxon signed-rank test for the service satisfaction variable.
Statistic | Value
Z | −3.028
p-value | 0.002
Table 7. Mann–Whitney U test for both variables post-intervention.
Statistic | Value
U | 0.000
Z | −4.692
p-value | <0.001
Table 8. Comparison of maintenance planning/design performance and perceived quality of academic spaces before and after the intervention.
Group/Dimension | Before (Median) | After (Median)
Variable 1: Performance in maintenance planning and design | 2 * | 3 *
Variable 2: Perceived quality of academic spaces | 2 * | 3 *
* Likert scale from 1 (low) to 4 (high).
Table 9. Wilcoxon test results for the planning and design stage.
Statistic | Value
Z | −2.677
p-value | 0.007
Table 10. Wilcoxon test for perception of academic space quality.
Statistic | Value
Z | −3.097
p-value | 0.002
Table 11. Post-test Mann–Whitney U (planning and perception).
Statistic | Value
U | 37.800
Z | −3.531
p-value | <0.001
Table 12. Comparison of maintenance performance and safety/wellbeing before and after the intervention.
Group/Dimension | Before (Median) | After (Median)
Variable 1: Performance in maintenance execution | 2 * | 3 *
Variable 2: Safety and wellbeing in university facilities | 2 * | 3 *
* Likert scale from 1 (low) to 4 (high).
Table 13. Wilcoxon Test Results: maintenance execution (Pre–Post Comparison).
Statistic | Value
Z | −2.705
p-value | 0.007
Table 14. Wilcoxon Test Results: perception of safety and well-being (Pre–Post Comparison).
Statistic | Value
Z | −2.799
p-value | 0.005
Table 15. Mann–Whitney U post-test: execution and well-being.
Statistic | Value
U | 0.000
Z | −4.745
p-value | <0.001
Table 16. Evaluation of maintenance control and perceived quality of academic spaces.
Group/Dimension | Before (Median) | After (Median)
Variable 1: Performance of maintenance control | 2 * | 3 *
Variable 2: Perceived quality of academic spaces | 2 * | 3 *
* Likert scale from 1 (low) to 4 (high).
Table 17. Wilcoxon Test Results: maintenance execution (Control vs. Intervention).
Statistic | Value
Z | −2.670
p-value | 0.008
Table 18. Wilcoxon Test Results: perception of safety and well-being (Control vs. Intervention).
Statistic | Value
Z | −2.799
p-value | 0.005
Table 19. Post-test Mann–Whitney U: execution and well-being.
Statistic | Value
U | 24.500
Z | −3.938
p-value | <0.001
Table 20. Technical–normative contrast between intervened and non-intervened environments.
Criteria | Technical Indicators | Intervened Environment | Non-Intervened Environment | Normative/Reference Framework
Functionality | Area per person, circulation space, furniture, visibility | 1.90 m²/person, circulation ≥ 1.50 m, modular furniture | 1.71 m²/person, circulation ≤ 1.20 m, linear furniture | RNE A-130, ISO 11064-4 [29]
Use | Natural lighting, ventilation, signage, evacuation | Utilization of natural light, visible signage | Partial lighting, basic signage | RNE A-010, ISO 9241-6 [22]
Investment | Finishes, electrical installations, furniture quality | Technical porcelain tile, structured cabling, new furniture | Basic ceramic tiles, exposed wiring, conventional furniture | RNE, National Electrical Code
Curricular Alignment | Installed technology, software, alignment with curriculum | 48 PCs with BIM software (Revit, S10), interactive whiteboards | 22 PCs with basic software (AutoCAD, Revit), no projectors | ISO 21001 [21], BIM Plan Peru, FAU Curriculum
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Escate, M.; Esenarro, D. A Methodological Approach to Assessing Constructability in Building Maintenance and Its Impact on University Quality. Buildings 2025, 15, 3164. https://doi.org/10.3390/buildings15173164
