
Effect Analysis of Online and Offline Cognitive Internships Based on the Background of Engineering Education Accreditation

School of Automobile and Traffic Engineering, Wuhan University of Science and Technology, Wuhan 430065, China
Hubei Provincial Key Laboratory of Mechanical Transmission and Manufacturing Engineering, Wuhan University of Science and Technology, Wuhan 430081, China
School of Computing, Engineering & Mathematics, University of Brighton, Brighton BN2 4GJ, UK
Author to whom correspondence should be addressed.
Sustainability 2022, 14(5), 2706;
Submission received: 28 January 2022 / Revised: 21 February 2022 / Accepted: 22 February 2022 / Published: 25 February 2022


The Engineering Education Accreditation (EEA) has been adopted in colleges and universities worldwide. As a foundational component of practical teaching, the cognitive internship is a focus of EEA; the offline mode is the common teaching method, while the online mode has been increasingly used in recent years. To meet the requirements of EEA, the effects of online and offline cognitive internships need to be analyzed and compared. To address this demand, this paper proposes a comprehensive method combining the Goal Achievement Degree (GAD) and the CIPP model. The majors of the School of Automotive Transport and Engineering at Wuhan University of Science and Technology (WUST), which include vehicle engineering, traffic engineering, traffic transport, and logistics engineering, are taken as the example for this research. The main findings are as follows: (1) Compared with the online cognitive internship, the offline internship better enhances students’ theoretical knowledge and environmental perception. (2) The online teaching method is helpful for cultivating students’ analytical skills. (3) From the students’ perspective, the lack of funding for both online and offline cognitive internships is the most serious problem. (4) Students perceive that the online cognitive internship helps them understand classroom knowledge and determine career planning, but not to the same degree as the offline cognitive internship.

1. Introduction

The EEA (Engineering Education Accreditation), as a quality assurance mechanism for engineering specialties in colleges and universities, is implemented by professional institutions based on established standards [1,2]. Its application has been reported to bring great benefits to colleges and universities in cultivating undergraduate students [3,4,5,6]. At present, four types of EEA are mainly used worldwide: the Washington Accord, the Sydney Accord, the Dublin Accord, and the EUR-ACE [7,8]. China joined the Washington Accord in 2016 [9] and implemented the “New Engineering Research and Practice” initiative to promote and deepen the application of EEA [10].
As an essential component of engineering education, internship teaching has been widely considered the bridge between theory and practice [11,12,13]. It includes cognitive, production, and graduation internships, among which the cognitive internship, arranged before professional courses, plays a vital role in promoting the transformation of students’ engineering knowledge from surface understanding to deeper awareness [14,15]. The cognitive internship is performed mainly offline, through activities such as fieldwork and enterprise visits. At the same time, online teaching that makes use of resources such as videos and pictures has been increasingly adopted in some universities and colleges due to the influence of the COVID-19 pandemic. To ensure the continuous improvement of teaching quality under the EEA’s requirements, it is imperative to analyze and compare the effects of online and offline cognitive internships.
Academic literature related to online and offline practice has gradually accumulated. For offline practice, Walo (2001) [16] designed a questionnaire completed by students to assess the contribution of internships in developing students’ management competencies based on the Competing Values Framework (CVF); Leydon (2013) [13] investigated the challenges of incorporating fieldwork into undergraduate courses using a questionnaire and an in-depth interview, and Barton (2017) [17] explored how fieldwork provides opportunities for students and reported its benefits in helping students internalize course content. For online practice, Romero Oliva (2019) [18] designed a survey for students and teachers using a questionnaire and an interview to explore the impact and value of Virtual Learning Environments on students’ oral skills; Oriji (2019) [19] reported the application and challenges of social media in teaching and learning via a questionnaire and an interview, and Darr (2019) [20] designed a virtual practice laboratory to evaluate the impact of an online internship on students’ perception and learning with the help of a questionnaire. Additionally, researchers have designed experiments to compare the effects of online and offline practice. Seifan (2019) [21] investigated the importance and usefulness of virtual vs. real field trips in promoting students’ knowledge and perceptions, and Seifan (2020) [22] investigated students’ perception of the effect of real and virtual practice on their essential skills and career aspirations using a questionnaire; Sobia (2020) [23] invited students trained in both virtual and traditional simulation environments to complete a questionnaire exploring their preference for training methods. Additionally, some researchers have adopted the EEA concept to evaluate internship teaching.
Jamil (2013) [24] adopted Outcome-Based Education (OBE) to conduct a survey with an internship host organization using a questionnaire to investigate the students’ performance in an industrial internship. Laingen (2015) [25] developed a questionnaire to explore the approach for continuous improvement practices (CIP), adopting the data obtained from students’ competency assessments.
Given the above, the studies referring to EEA concepts such as OBE and CIP place a particular emphasis on offline practice. As for the online mode, the above studies comparing the two methods focus on students’ perceptions, with the questionnaire and the interview as the common research methods. These studies neither cover cognitive internships nor examine the online mode from the EEA perspective. Additionally, none of them combined the EEA concept and students’ perceptions to compare the effects of the online and offline modes. Therefore, it is necessary to study online and offline cognitive internships using a method that involves both the EEA concept and students’ perceptions, and this study attempts to fill this gap. Specifically, this paper aims to compare the achievement of online and offline cognitive internships according to EEA standards and to compare their effects on students’ perceptions.
The remainder of the paper is organized as follows. Section 2 provides the methods and materials used in the study. The results, involving GADs and the CIPP-based evaluation, are presented in Section 3. Section 4 outlines the findings and discusses them in comparison with previous studies. Lastly, Section 5 summarizes the conclusions and limitations of the study, as well as future lines of research.

2. Methods and Materials

2.1. Models

In this study, the analysis is carried out using a mixed-methods approach involving a GAD Model, an evaluation method frequently used under EEA, and a CIPP Model, a method commonly used in the analysis of students’ perceptions. The models used in this study are shown in Figure 1.

2.1.1. GAD Model

A GAD model is developed and adopted to quantify the course goals and their index points by analyzing the degree to which the course requirements are achieved [10]. Generally, it is calculated at the end of the course through a two-phase process:
In the first phase, the GAD of the i-th sub-objective of the course is obtained as the ratio of the summed average scores to the summed total scores of the corresponding evaluation items of the sub-objective, as shown in Formula (1):
$$c_i = \frac{\sum_{j=1}^{n} s_j}{\sum_{j=1}^{n} o_j},$$
where $c_i$ is the GAD of the i-th sub-objective, $s_j$ is the average score of the j-th evaluation item, $o_j$ is the total score of the j-th evaluation item of the i-th sub-objective, and $n$ is the number of evaluation items of the i-th sub-objective.
In the second phase, the GAD of the course is calculated based on the GADs of the sub-objectives, and the specific calculation Formula is:
$$C = \sum_{i=1}^{m} w_i c_i,$$
where $w_i$ is the weight of the i-th sub-objective, $c_i$ is the GAD of the i-th sub-objective, $m$ is the number of sub-objectives, and
$$\sum_{i=1}^{m} w_i = 1.$$
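As a concrete illustration, the two-phase calculation can be sketched in a few lines of Python; the function names and sample numbers below are illustrative, not part of the paper's tooling:

```python
def gad_sub_objective(avg_scores, total_scores):
    """Formula (1): the GAD of one sub-objective, i.e. the summed average
    scores of its evaluation items divided by their summed total scores."""
    return sum(avg_scores) / sum(total_scores)

def gad_course(sub_gads, weights):
    """GAD of the whole course: the weighted sum of the sub-objective
    GADs, where the weights must sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9, "sub-objective weights must sum to 1"
    return sum(w * c for w, c in zip(weights, sub_gads))

# Hypothetical sub-objective with two evaluation items
c = gad_course([gad_sub_objective([16.0, 8.0], [20, 10]), 0.7], [0.6, 0.4])
```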

2.1.2. CIPP-Based Evaluation Model

The CIPP Model is an important approach for evaluation from the perspective of the students’ perception. As an evaluation model for education, it focuses on improvement [26], which corresponds with the objective of this research. The survey based on the CIPP Model brings the entire process of the teaching activities into the scope of the evaluation, from the beginning (objectives and program of the internship), through the implementation (internship progress), to the final evaluation of the learning outcomes (effectiveness of the internship).
The CIPP Model, proposed by Stufflebeam in 1965 [26,27,28], includes four stages, namely, context evaluation, input evaluation, process evaluation, and product evaluation. First, the context evaluation aims to assess the objectives of the activities. Then, the input evaluation is used to help assess the items, such as resources, funds, budgets, service strategies, etc. Next, the process evaluation is an ongoing check to see if the prepared plan is reasonable, and if it needs further improvement or innovation. Finally, the product evaluation seeks to assess, interpret, and judge an enterprise’s achievements [28,29].
The CIPP-based Evaluation Model applied in this study was determined according to the four stages of the CIPP Model, as well as professional features, as shown in Table 1.

2.2. Materials

2.2.1. Course Objectives

The cognitive internship is arranged during the first semester of the third year; it aims to shape the students’ system of knowledge about the major and lay the foundation for future courses in the major. Additionally, by placing students in several companies related to their major, it helps them understand each company’s operation process, post setting, and equipment, as well as the types of companies they may enter in the future. Locations around Wuhan, such as the Yangsigang Yangtze River Bridge, Yichang Bus Rapid Transit, and China Railway Bridge Bureau, are selected for the offline cognitive internship, while the videos and materials collected for the online cognitive internship are based on the same fields. Students spend one week visiting these companies, where observation and consultation are the primary forms of learning, with less of a focus on operation. During this period, teachers observing the visits score students according to their internship performance. Students are then asked to submit reports on their internship experience at the end of the course, which are also scored by teachers. Additionally, the internship defense is another critical section in examining the students’ perceptions of their major.
According to the EEA’s requirements, sub-objectives (index points) need to be identified to support the achievement of the cognitive internship’s course goal. Simultaneously, the achievement of the sub-objectives is evaluated on the basis of three evaluation items, namely the performance, reports, and defense of the internship mentioned above. Subsequently, we take the Traffic Engineering major of the School of Automotive Transport and Engineering as an example to analyze the relationship between the course goal and the sub-objectives by calculating the GADs. The cognitive internship’s sub-objectives, including sub-objective 1, sub-objective 2, and sub-objective 3, are described as follows.
Sub-objective 1: Understand the basic knowledge of traffic engineering and have a preliminary cognition of traffic engineering facilities, design methods, and relevant theories. Students are required to understand the main research goals, basic technical methods, and future development trends of traffic engineering by observing and analyzing the structure of the road traffic system, various traffic facilities and their functions, traffic phenomena, etc. They should also be able to summarize the content of the site visits and put forward relevant questions about traffic engineering.
Sub-objective 2: Collect and analyze the original data regarding road networks, intersection networks, signal controls, and others. Students should be able to analyze and evaluate the interaction between traffic facilities and factors such as traffic management, health, safety, and legal and cultural issues.
Sub-objective 3: Understand the concept of environmental protection and socially sustainable development related to transportation engineering facilities and management.

2.2.2. Participants

The participants in this study are the sophomore and junior undergraduate students of the School of Automotive Transport and Engineering who are majoring in vehicle engineering, traffic engineering, traffic transport, and logistics engineering. They enrolled in the cognitive internship course in the first semester of the third year and completed it according to the EEA’s requirements. The sophomores participated in the online cognitive internship, and the juniors participated in the offline cognitive internship.

2.2.3. Data and Tools

The GADs of the cognitive internship are calculated by the teachers at the end of the course based on the students’ performance during the course, the quality of their reports, and their performance in the defense. In this study, the GADs of the online cognitive internship were calculated based on the data from sophomore students, while those of the offline cognitive internship were based on the data from junior students. The detailed results are presented in Section 3.
A questionnaire survey was used to obtain the students’ feedback concerning the online and offline cognitive internships. The questionnaire was designed based on the CIPP-based Evaluation Model shown in Table 1, using a five-point Likert-type scale (1 = strongly disagree, 2 = disagree, 3 = unsure, 4 = agree, and 5 = strongly agree). The questionnaires were distributed to students via the corresponding grade managers during the one-month period from 11 October to 11 November 2020. A total of 340 questionnaires were sent out, 275 were returned, and 271 were valid. Of the valid sample (N = 271), 47.6% were participants in the online cognitive internship and 52.4% were offline-course participants. The collected data were first examined through reliability and validity analyses to ensure the credibility of the research, using the Statistical Package for the Social Sciences (SPSS 23.0) and Amos 26.0. An Independent Samples t-test was then performed with SPSS 23.0 to explore the differences in the effects of the online and offline cognitive internships, with p < 0.05 considered statistically significant [24].
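As an illustration of this comparison step, the independent-samples t-test can be sketched with the standard library alone. The Likert responses below are synthetic, not the paper's survey data, and SPSS or SciPy would normally report the exact p value rather than a critical-value comparison:

```python
import math
import statistics

# Synthetic five-point Likert responses to one questionnaire item
# (illustrative only; the paper's data came from 271 valid questionnaires)
online = [3, 2, 4, 3, 2, 3, 4, 2, 3, 3]
offline = [4, 5, 4, 3, 5, 4, 4, 5, 3, 4]

n1, n2 = len(online), len(offline)
# Pooled-variance (Student's) independent-samples t statistic
sp2 = ((n1 - 1) * statistics.variance(online)
       + (n2 - 1) * statistics.variance(offline)) / (n1 + n2 - 2)
t_stat = (statistics.mean(online) - statistics.mean(offline)) / math.sqrt(
    sp2 * (1 / n1 + 1 / n2))
# Two-sided 5% critical value of the t distribution with df = 18 is ~2.101
significant = abs(t_stat) > 2.101
```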

3. Results

3.1. Results Based on GAD

Taking the GADs of the online cognitive internship course as an example, the calculation is as follows:
The total score of the cognitive internship is 100, and the specific score distribution is shown in Table 2.
The score distribution for both the online and offline courses is the same. According to Formula (1) and the data listed in Table 2 and Table 3, c1, c2, and c3 are obtained as follows:
$$\begin{cases} c_1 = \dfrac{16.36 + 16.20 + 8.03}{20 + 20 + 10} = 0.812 \\[6pt] c_2 = \dfrac{15.92 + 7.44 + 3.96}{20 + 10 + 5} = 0.781 \\[6pt] c_3 = \dfrac{7.51 + 3.94}{10 + 5} = 0.763 \end{cases}$$
which means that the GAD of sub-objective 1 is 0.812, the GAD of sub-objective 2 is 0.781, and the GAD of sub-objective 3 is 0.763.
Based on the values of $c_i$ obtained above and the weights $w_i$ ($w_1 = 0.5$, $w_2 = 0.3$, and $w_3 = 0.2$), $C$ is calculated as follows:
$$C = c_1 w_1 + c_2 w_2 + c_3 w_3 = 0.812 \times 0.5 + 0.781 \times 0.3 + 0.763 \times 0.2 = 0.793,$$
which means that the GAD of the online cognitive internship course is 0.793.
Similarly, we conclude that the GADs of the sub-objectives of the offline cognitive internship are 0.815, 0.775 and 0.778, respectively; and the GAD of the offline cognitive internship course is 0.795. The specific results of this section are listed in Table 4:
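The arithmetic above is easy to reproduce programmatically. This sketch uses the online-course scores reported in Tables 2 and 3 and the sub-objective weights given in the text:

```python
# Average scores and total scores of the evaluation items, grouped by
# sub-objective, for the online cognitive internship (Tables 2 and 3)
sub_objectives = [
    ([16.36, 16.20, 8.03], [20, 20, 10]),  # sub-objective 1
    ([15.92, 7.44, 3.96], [20, 10, 5]),    # sub-objective 2
    ([7.51, 3.94], [10, 5]),               # sub-objective 3
]
weights = [0.5, 0.3, 0.2]  # w1, w2, w3

c = [sum(s) / sum(o) for s, o in sub_objectives]  # Formula (1)
C = sum(w * ci for w, ci in zip(weights, c))      # weighted course GAD
print([round(ci, 3) for ci in c], round(C, 3))    # [0.812, 0.781, 0.763] 0.793
```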

3.2. Results of CIPP-Based Evaluation

3.2.1. Results of the Reliability and Validity Analysis

Reliability Analysis

To verify the reliability of the scale, Cronbach’s alpha coefficient is calculated using SPSS and presented in Table 5. The larger the value of Cronbach’s alpha coefficient, the higher the reliability of the scale [30]. According to the SPSS output, Cronbach’s alpha is larger than 0.7 for all items, indicating that all items are highly reliable.
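For reference, Cronbach's alpha can be computed directly from a respondents-by-items score matrix. The responses below are synthetic, not the survey data:

```python
import statistics

def cronbach_alpha(scores):
    """Cronbach's alpha for a respondents-by-items matrix of Likert scores."""
    k = len(scores[0])  # number of items
    item_variances = [statistics.variance(col) for col in zip(*scores)]
    total_variance = statistics.variance([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_variances) / total_variance)

# Synthetic five-point Likert responses (5 respondents x 3 items);
# an alpha above 0.7 would count as reliable by the usual rule of thumb
responses = [[4, 5, 4], [3, 3, 3], [5, 5, 5], [2, 2, 3], [4, 4, 4]]
alpha = cronbach_alpha(responses)
```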

Validity Analysis

An analysis of construct validity is conducted to examine the validity of the scale, measured in terms of convergent validity using Amos. From the Amos results (see Table 6), the AVE values of all items are larger than 0.5, indicating that the convergent validity of all items is excellent.
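For reference, the AVE of a construct is the mean of its items' squared standardized factor loadings. The loadings below are illustrative, not the paper's Amos estimates:

```python
# Average Variance Extracted (AVE): the mean of the squared standardized
# factor loadings of a construct's items. Values above 0.5 indicate
# acceptable convergent validity. Loadings here are illustrative.
loadings = [0.82, 0.75, 0.71]
ave = sum(l ** 2 for l in loadings) / len(loadings)
acceptable = ave > 0.5
```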

3.2.2. Results of the Independent Samples t-Test

From the results obtained from the Independent Samples t-test (see Table 7), no significant differences are found between the online and offline cognitive internships in the students’ responses to a2, a8, a12, and a14, whereas the responses of students in the online cognitive internship to a1, a3, a4–a7, a9–a11, a13, and a15 are significantly different from those of the students in the offline cognitive internship.
For additional analysis of the differences between the online and offline cognitive internships, the percentages of students scoring above three on the items with statistically significant p values are calculated (see Figure 2). According to Figure 2, the students’ evaluations of the four stages of the offline cognitive internship are comparatively stable, whereas those of the online cognitive internship are less satisfactory in the context and program evaluation. The proportions of online and offline cognitive internship students with an evaluation score greater than three differ most on a4 (42.9%) and least on a13 (only 18.6%).

4. Discussion

Although many studies have examined both online [18,19,20] and offline [13,16,17] teaching methods, as well as compared their differences [21,22,23], most rely solely on questionnaire surveys, which suggests that challenges remain in the area of research methods. The integrated approach adopted in this study, involving a GAD Model and a CIPP-based Evaluation Model, provides a new perspective for evaluating the effectiveness of internship teaching.
Based on the GAD Model, this study shows that students trained through an offline cognitive internship have more advantages in theoretical knowledge and environmental perception than those trained through an online cognitive internship. The GADs of sub-objective 1 and sub-objective 3 for the offline cognitive internship are good indicators of these results. This finding supports the work of Eiris Pereira et al. [31], who argued that the offline internship mode, which requires real field visits, can promote students’ in-depth insight into theoretical concepts. As indicated in previous research [22], real field visits provide opportunities to view specialized equipment comprehensively, communicate with professional engineers, and gain authentic experiences. This may be why students with offline internships have a better perception of the environment. Consequently, more attention could be paid to strengthening the professional lectures in the online cognitive internships, through which students could enhance their theoretical knowledge and environmental perception via practical information and personal experience conveyed by specialists with real fieldwork experience.
The results of the GADs also reflect that the online cognitive internship can assist students in enhancing their analytical skills, as evidenced by its larger GAD for sub-objective 2 compared with that of the offline internship. The online cognitive internship has many advantages over the offline one, such as abundant online resources, teaching materials, and skills training organized by teachers. Moreover, the online teaching method can break the spatial-temporal limits of the classroom, enabling students to access internship opportunities at convenient times and locations [18,22]. Additionally, it allows students to view the materials repeatedly and provides plenty of time for reflection on the internship content, allowing them to revisit forgotten content and enhance their analytical skills. In the future, more resources and training that effectively stimulate the development of students’ analytical skills could be collected and presented.
The results of the students’ perceptions based on the CIPP-based Evaluation Model reflect that the students’ evaluation of the offline cognitive internship is higher. This finding aligns with previous studies [21,22], in which students agreed that the real field trip has clear advantages in enhancing students’ perceptions and developing their essential skills. Further, the results of this study show that the online cognitive internship can also help students improve their understanding of theoretical knowledge and clarify their career planning. Additionally, students perceive that the effects of the online and offline cognitive internships on these two aspects exhibit the smallest difference when all factors are considered. Indeed, the online cognitive internship provides a new way for students to acquire pre-knowledge of the major and to learn more about a future career. However, the differences reported by the results show that the information related to theoretical knowledge and career planning needs to be increased in future online cognitive internships to produce the same effects as the offline cognitive internships.
Apart from the usefulness of the two forms of teaching, this study also reports the limitations in implementing cognitive internships. Students participating in this study perceived that insufficient teaching funds for both online and offline cognitive internships is the biggest issue, and this lack of funding was perceived as more critical for the online than for the offline internships. This finding supports the work of Leydon et al. [13], in which the challenges and rewards of introducing a field trip into an introductory geography class were explored. Thus, to address this issue, teachers responsible for cognitive internships could consider corporate sponsors, as the schools cannot support larger financial budgets for these programs.

5. Conclusions

This research proposed a comprehensive method for analyzing the difference in effectiveness between online and offline cognitive internships. Our method consists of two major processes: a GAD-based evaluation and a CIPP-based evaluation. The data for the former are determined using students’ scores during their cognitive internships, including the performance, the report, and the defense of the internship. The results reveal that the offline cognitive internship is more effective at enhancing students’ theoretical knowledge and environmental perceptions, while the online cognitive internship has a better effect on strengthening students’ analytical skills. The data for the latter are collected using a questionnaire survey completed by the students. The results show significant differences between the online and offline cognitive internships in eleven aspects, with the largest difference observed in the evaluation of the funding for the cognitive internships and the smallest difference shown in the effectiveness of the cognitive internships in promoting students’ understanding of classroom knowledge and the clarification of career planning. Furthermore, suggestions for the improvement of both internships are identified on the basis of these differences. This study offers broad implications for the implementation and improvement of cognitive internships, such as adequate preparation, sustainable internship methods, and better internship effects. By providing effective cognitive internships, new opportunities could be developed to cultivate high-quality talents for the benefit of society.
Although the effects of the two forms of cognitive internship are compared from multiple perspectives, there are still some limitations that could be addressed in future studies. Since the online cognitive internship has only been implemented for one year, limited relevant data could be collected for this study. In addition, as described in the Methods and Materials section, all participants came from one school; data from students at a single school may not be representative of all cases, as the implementation of cognitive internships may differ between schools. Finally, the weights used to calculate the GADs were determined by the subjective judgment of the teachers, which may also affect the evaluation results.
Whereas the current study found that offline cognitive internships are more popular, a crucial goal for future research is to investigate whether the results differ by major, school, evaluation indicators, and other factors. A second goal for future research is to seek a suitable method for identifying the weights of sub-objectives, such as voting, consulting administrators of companies, expert scoring, etc. Moreover, methods used for evaluation could be continuously innovated. For example, further work could introduce several open-ended questions in future questionnaires. Additionally, since the existing literature promotes the notion that online teaching can be a powerful supplement to offline teaching, the effect of combining both online and offline internships should be of interest to educators.

Author Contributions

Conceptualization, X.Z. and R.L.; methodology, X.Z. and R.L.; software, R.L.; formal analysis, X.Z. and R.L.; data curation, X.Z.; writing—original draft preparation, R.L.; writing—review and editing, X.Z., R.L., W.Y., Y.W., Z.J. and Z.F.; supervision, X.Z.; funding acquisition, X.Z. All authors have read and agreed to the published version of the manuscript.


Funding

This research was funded by the open fund of the Hubei Key Laboratory of Mechanical Transmission and Manufacturing Engineering at Wuhan University of Science and Technology, Wuhan, China, grant number [MTMEOF2019B11]; The First Batch of 2018 Industry-University Collaborative Education Program of The Ministry of Education (201801287002); the Teaching Research Project of Wuhan University of Science and Technology: Research on the Construction of Higher Education Informatization Quality Evaluation System under the Concept of Internet plus [2017X048]; the Reform and Practice of “1 + 1” Talent Cultivation Model for Transportation Majors in the Context of New Engineering (Project No. 2020366); and the Reconstruction and Practice of Interdisciplinary Curriculum System Driven by Human-Vehicle-Road Synergy in the Context of Smart Transportation (2020Z004).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.


Acknowledgments

The authors would like to thank the students of the School of Automotive and Traffic Engineering at WUST for their active participation in this research work.

Conflicts of Interest

The authors declare no conflict of interest.


References

  1. Prados, J.W.; Peterson, G.D.; Lattuca, L.R. Quality assurance of engineering education through accreditation: The impact of Engineering Criteria 2000 and its global influence. J. Eng. Educ. 2005, 94, 165–184. [Google Scholar] [CrossRef]
  2. Wang, X.; Gao, X. Information modeling and system realization for supporting engineering education accreditation process based on polychromatic graph. Aslib J. Inf. Manag. 2021, 73, 921–945. [Google Scholar] [CrossRef]
  3. Harrison, I.; Bond, K. Transnational education and engineering accreditation. Eng. Educ. 2012, 7, 24–28. [Google Scholar] [CrossRef]
  4. He, L. Transnational higher education institutions in China: A comparison of policy orientation and reality. J. Stud. Int. Educ. 2016, 20, 79–95. [Google Scholar] [CrossRef]
  5. Kohli, N. Role of accreditation in engineering education. In Proceedings of the 2014 IEEE International Conference on MOOC, Innovation and Technology in Education (MITE), Patiala, India, 19–20 December 2014; IEEE: Piscataway, NJ, USA, 2014; pp. 157–159. [Google Scholar]
Figure 1. Models involved in this study.
Figure 2. The percentages (score > 3) of items with statistically significant p values.
Table 1. Student survey items.
Context Evaluation
- The online (offline) cognitive internship achieved the objectives as expected. (a1)
- The teaching objectives of the cognitive internship are aligned with future job requirements. (a2)
- I think that the teaching objectives of the online (offline) cognitive internship are feasible. (a3)

Input Evaluation
- The internship has sufficient funds for the construction of the infrastructure, teaching materials, transportation, meals, etc. (a4)
- The environment of the internship companies is sufficient for us to understand the basic information and situations of our major. (a5)
- The companies visited during the internship possess a variety of profession-related equipment. (a6)
- The current program of the online (offline) cognitive internship could be conducted continuously. (a7)

Process Evaluation
- The schedule of the online (offline) cognitive internship is reasonably arranged. (a8)
- The internship in which I was involved was completed perfectly, as planned. (a9)
- Each part of the internship is well organized. (a10)
- A variety of forms of visits is included in the online (offline) cognitive internship. (a11)

Product Evaluation
- The online (offline) cognitive internship arouses my interest in learning. (a12)
- The online (offline) cognitive internship promotes my understanding of theoretical knowledge. (a13)
- The online (offline) cognitive internship improves my understanding of the major. (a14)
- The online (offline) cognitive internship provides great help for clarifying my career planning. (a15)
Table 2. The score distribution of evaluation items.
Evaluation Items          Sub-Objective 1   Sub-Objective 2   Sub-Objective 3
Internship Performance    20                20                -
Internship Reports        20                10                10
Internship Defense        10                5                 5
Total Score               50                35                15
Table 3. The average score of evaluation items of online cognitive internship.
Evaluation Items          Sub-Objective 1   Sub-Objective 2   Sub-Objective 3
Internship Performance    16.36             15.92             -
Internship Report         16.20             7.44              3.96
Internship Defense        8.03              7.51              3.94
Total Score               32.56             30.87             7.90
Table 4. The GADs of the two forms of cognitive internship.
Table 5. The results of the reliability analysis.
Variable                   Item   Corrected Item-Total Correlation   Cronbach's Alpha, If Item Is Deleted   Cronbach's Alpha   Total Cronbach's Alpha
Objective Evaluation       a1     0.519                              0.917                                  0.769              0.919
Program Evaluation         a4     0.642                              0.914                                  0.851
Process Evaluation         a8     0.606                              0.915                                  0.824
Effectiveness Evaluation   a12    0.604                              0.915                                  0.811
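The reliability figures above are Cronbach's alpha values. For readers unfamiliar with the statistic, a minimal sketch of the standard computation is given below; the score matrix is hypothetical illustration data, not the paper's survey responses.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of respondents' totals
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical 5-point Likert responses of five students to four items
# (e.g., a1, a4, a8, a12) -- illustration only.
scores = np.array([
    [5, 4, 5, 4],
    [4, 4, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
])
print(round(cronbach_alpha(scores), 3))  # → 0.925
```

Values above roughly 0.7, as in Table 5, are conventionally taken to indicate acceptable internal consistency.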
Table 6. The result of the reliability analysis.
Table 7. The results of the reliability analysis.
Variable             Item   Value
Context Evaluation   a1     0.004
Input Evaluation     a4     0
Process Evaluation   a8     0.386
Product Evaluation   a12    0.185
Zhang, X.; Liu, R.; Yan, W.; Wang, Y.; Jiang, Z.; Feng, Z. Effect Analysis of Online and Offline Cognitive Internships Based on the Background of Engineering Education Accreditation. Sustainability 2022, 14, 2706.