Article

Effects of ICT Integration in Teaching Using Learning Activities

by Florentina Toma 1, Andreea Ardelean 2, Cătălin Grădinaru 2, Alexandru Nedelea 3 and Daniel Constantin Diaconu 4,*
1 Simion Mehedinți “Nature and Sustainable Development” Doctoral School, University of Bucharest, 010041 Bucharest, Romania
2 Faculty of Administration and Business, University of Bucharest, 030018 Bucharest, Romania
3 Department of Geomorphology, Faculty of Geography, University of Bucharest, 010041 Bucharest, Romania
4 Department of Meteorology and Hydrology, Faculty of Geography, University of Bucharest, 010041 Bucharest, Romania
* Author to whom correspondence should be addressed.
Sustainability 2023, 15(8), 6885; https://doi.org/10.3390/su15086885
Submission received: 13 March 2023 / Revised: 3 April 2023 / Accepted: 17 April 2023 / Published: 19 April 2023

Abstract

Progress in schooling through competence-based teaching is a priority in setting up a quality-centered educational process. Using ICT tools as teaching–learning techniques is therefore an important objective in achieving scholastic performance and in offering students varied learning experiences. Starting from an analysis of research on integrating ICT tools in the educational environment, this study presents teaching opportunities for all students, viewing education through the lens of instruments specific to the secondary school level, with reference to the subject of geography. The research design uses mixed methods applied to a total of 674 students, based on results obtained before and after a written evaluation, to determine the students' level of knowledge. The present study evaluates the learning environment in which a successful and practical integration of ICT tools is anticipated in the teaching and learning process, from the perspective of the comparative measurement of the impact on the tests resulting from the formative assessment. The conducted experiment and the Google Forms questionnaires suggest, through the degree of involvement of all students, how technology can facilitate the teaching–learning process. The results showed that there are statistically significant differences between the experimental and the control group, and that information and communication technologies (ICT) represent an important tool for improving performance and developing participatory skills, with the ability to provide students with positive results.

1. Introduction

Teaching is constantly changing and adapting to current requirements. The evolution of teaching resources and scientific materials leads to an increase in school performance [1].
The use of computer technology has generally led to an increase in students’ ability to analyze problems and identify solutions [2].
In addition to developing students’ programming skills, the correct mastery of information and communication technologies is very important. Much research has shown that ICT has been the most significant development in education over the last 20 years [3,4].
Globally, the development of this system has been hampered by the flexibility of school curricula and the specific material base required. However, the implementation process was suddenly accelerated with the global outbreak of COVID-19. Restrictions on travel and community activity led to the widespread use of ICT methods. However, few post-pandemic studies have reported on the results of almost 2 years of use of these modern technologies [5,6,7,8,9].
The emergence of new teaching–learning models (online and hybrid) at the international level during the COVID-19 pandemic, together with the influence of globalization on the educational process, has driven rapid progress in the field of technology. Hence, skills to respond to the challenges imposed by rapid change are important.
Thus, this study investigates the use of modern information and communication technologies (ICT) in didactic activity, as teaching–learning tools for all students in a class, which can ensure the quality of school learning, adapted to a competitive society based on information.
ICT tools have been subjects of interest for numerous specialized studies. Most of the studies present the advantages of using these applications for learning and the attitude of students from this perspective, but focus less on their disadvantages.
The computer-based learning environment can improve students’ learning, motivation, and confidence [10,11,12], having the capacity to lead towards an increase in educational inclusion [13,14].
However, many studies on this topic that refer to the applications used in the present research come from university education, where these applications served as a form of assessment.
Work that presents the Google Forms tool as enhancing learning and preparing students for future professional contexts is noteworthy in this regard [15]. Moreover, the student, through this application, becomes an active agent in the learning process [16,17] and can develop writing abilities and ICT capabilities [18]. Google Forms enables asynchronous communication [19] that can be advantageous in fulfilling certain tasks. At the same time, Google Docs has an essential role in improving student motivation and engagement [20] whilst stimulating collaborative writing as well as having a positive effect on the perceptions of learning various tasks [21], and Google apps are featured as a collaboration tool [22]. Studies have also investigated the use of Google Classroom, which uses the tools provided by the Google ecosystem such as Docs or Slides, showing that it supports the pedagogical process and “both in-class and out-of-class work” [23] (p. 7), and creates a common learning environment [24] that makes the process interesting, easy and useful for students [25,26].
JavaScript-based programs are considered interactive and attractive [27] and have been used as assessment tools [28].
Also, a case study analyzed the establishment of an interactive virtual learning platform that connects all students together—by blended learning—improving students’ vocabulary learning performance through the use of synchronous and asynchronous games and activities [29]. A downside of the online environment and, implicitly, of learning applications and platforms concerned security: during the pandemic period, secure evaluation platforms were not available, which constituted a major problem for teachers.
Likewise, another direction of research, studied intensively during the pandemic period, is learning and assessment through games in the educational process, although games were already used in many countries before the pandemic. Although there have been many research studies on learning through play, few researchers have addressed pre-university education and the discipline of geography.
Various academic studies have shown that game-based learning motivates [30] and can be beneficial for classroom activation [31]. Other research presents game models that enhance learning, including Virtual Age video games based on sound and design [32] and mixed methods exercise-based educational games [33]. At the same time, other research has examined the learning outcomes of students using a Serious Educational Game, showing that students tend to learn better if given virtual content [34]. Moreover, “new teaching methods that encourage students to be active participants in their own learning”, were designed in a study based on using mobile phone applications to enhance learning [35] (p. 1). Another example is provided by a study that investigated the effectiveness of a teacher-created educational game application, using mixed methods and collecting pre- and post-test results [36].
Studies have demonstrated the positive impact of student learning through ICT, even before the pandemic [37], but also during it [8] as it is focused on stimulating “cognitive, social, emotional, creative and physical skills” [38] (p. 3).
However, these types of educational games were integrated only for the assessment sequence, not for the teaching–learning directing sequence, which we considered necessary in the present study.
Also, regarding the use of ICT, a specialized study examined the importance of film in understanding geographic concepts through virtual courses in the isolation situation induced by COVID-19, i.e., how a team of teachers uses geographic media literacy as a pedagogic tool to improve their students’ knowledge in online courses [39].
Regarding game-based assessment, the use of ICT has been emphasized, especially the learning platform Kahoot!, which has demonstrated a positive influence among students, creating an optimal climate for learning [40]. Regarding the assessment of student results with the interactive exercise Kahoot!, it has been shown to stimulate grade increases and student motivation for learning [41], and this positive influence impacts learning performance, the dynamics within the classroom, as well as teachers’ and students’ attitudes and perceptions [42,43].
On the other hand, for a complete didactic activity, evaluation results become meaningful when the effect sizes d and r are calculated [44,45,46]; moreover, an increase in student performance is essential after improving teaching–learning–evaluation methods, ensuring visible learning [47].
At the same time, certain educational research has shown that students actively involved in the learning activity will learn more than passive students [48]. On the other hand, teachers tend to use ICT tools mainly to prepare study materials, rather than to work with students in the classroom [49].
Despite the integration of ICT in the educational process, significant gaps remain in school practice, depending on various factors that hinder or promote its use.
Research on teaching–learning through ICT in pre-university geography education is generally lacking. Also, didactic elements that ensure visible learning, as used in this study through the Google Forms tool and the JavaScript web game, were not found in the studies that included ICT tools.
The proposed research responds to this need for an approach to highlight the importance of the involvement of all students in dense teaching–learning activities, based on information and communication technology (ICT), but also in the assessment of geography, to keep up with the digital age. This aspect of the proposed approach is considered necessary as a means of facilitating the transition from a teacher-centered learning environment to a collaborative and modern one, with major implications for the educational purpose, but also to create—over time—didactic tools approved by students, seen through the teachers’ lens [47].
This study is aimed at comparatively evaluating (quantitatively and qualitatively) teaching–learning methods with the help of ICT.
This study investigates the school progress of students through the training of specific skills according to the school curriculum.
The study aims to train students in the specific competencies of the school curriculum, identifying teaching–learning ICT-based tools that respond to the challenges imposed by the rapid changes and the uncertainty and volatility in a world of fast-paced development.

2. Materials and Methods

The general objective of this paper is to assess the usefulness of ICT integration in the teaching–learning of a subject, which is addressed through the following five specific objectives.
Specific objective no. 1: Measuring the level of students’ previously acquired knowledge (pre-test 1) through a standardized formative assessment with the interactive exercise Kahoot!, during the online model and the traditional pandemic model, by using the test method, the measurement of the effect size of the formative tests on the Kahoot! learning platform, and their comparison.
Specific objective no. 2: Measuring the level of knowledge acquired by students through a standardized formative assessment with the interactive exercise Kahoot!, during the teaching–learning method implemented with the Google Forms tool (online model)/custom-created JavaScript web game (traditional model), associated with the Microsoft PowerPoint presentation (experimental group), and the teaching–learning method with the Microsoft PowerPoint presentation alone (control group), by using the test method, the measurement of the effect size in the formative tests (post-test 1/pre-test 2), and their comparison.
Specific objective no. 3: Measuring the level of knowledge subsequently acquired (post-test 2) through a standardized formative assessment with the interactive exercise Kahoot!, during the online model and the traditional pandemic model, by using the test method, the measurement of the effect size of the formative tests on the Kahoot! learning platform, and their comparison.
Specific objective no. 4: Identifying the opinion of students in the experimental group about the use of ICT tools in learning geography.
Specific objective no. 5: Identifying the teachers’ opinions about the use of ICT tools in the teaching–learning of geography, by conducting an interview (focus group).
The research was carried out at the Mihai Eminescu National College in Bucharest, Romania. The college has classes from grades 0 to 12, where 0 to 8 represent primary education and 9 to 12 secondary education [50]. The approval of the Ethics Committee for the research was obtained from the high school.
In carrying out the study, the authors went through three stages:
The experimental research, considered the first stage of the research, was carried out between September 2020 and May 2022, and consisted of the teaching–learning evaluation of some content in the field of geography, using the following research methods: observation, psychopedagogical experiment, survey, test method, comparative method and statistical methods of data interpretation, i.e., parametric and non-parametric statistical tests to compare samples (two-sided and one-sided t-test/Mann–Whitney U-test), to check assumptions (F-test, Shapiro–Wilk test) or to determine the effect size (Cohen’s d effect size index/Wilcoxon R), which were processed and validated using R software version 4.2.2, on an approximately equal sample of students [51,52,53].
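For reference only (these are the conventional textbook definitions, not formulas reproduced from the authors' materials), the two effect size indices mentioned above can be written as follows: Cohen's d compares the two group means relative to the pooled standard deviation, while the rank-based index r rescales the standardized Mann–Whitney statistic by the total sample size:
d = (x̄₁ − x̄₂) / s_p, with s_p = √(((n₁ − 1)s₁² + (n₂ − 1)s₂²) / (n₁ + n₂ − 2)), and r = |Z| / √N,
where x̄ᵢ, sᵢ and nᵢ denote the mean, standard deviation and size of each group, Z is the standardized test statistic and N = n₁ + n₂.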
ICT tools were applied to the experimental group in the sequence of directing teaching–learning across 12 lessons from the Hydrosphere contents, all personally created; the interventions were the Google Forms tool (online model, 168 students) and the custom-created JavaScript web tool (traditional model, 168 students), both associated with the Microsoft PowerPoint presentation. The experimental group thus totaled 336 students (in each of the two learning models, the sample comprised two 5th, two 6th, six 9th and two 10th grade classes), in which students responded in writing, by discovery, to 18 identical and standardized tasks in the Google Forms form/JavaScript game, based on the Microsoft PowerPoint presentation of the lesson delivered by the teacher. The teacher acted as moderator, and the students received a score upon submitting the Google Forms test/self-created web game, in order to be motivated. The students in the control group, respectively 168 students (online model) and 170 students (traditional model), benefited from the Microsoft PowerPoint presentation teaching–learning method, with the same contents and tasks. The treatment period was five weeks for 5th and 9th graders and one week for 6th and 10th graders. For all students, the Kahoot! game was used as a standardized assessment tool for the feedback sequence, as a type of formative assessment, i.e., the permanent assessment of the contents covered at the end of a course (Figure 1).
This first stage included three sub-stages, simultaneously using the following research methods: observation, psycho-pedagogical experiment, survey, test method, comparative method and statistical methods of data interpretation (parametric and non-parametric statistical tests):
  • The ascertainment substage (pre-test 1) for measuring the level of knowledge of students previously acquired through a standardized assessment with the interactive game of Kahoot!.
  • The experimental substage (post-test 1 and pre-test 2) consisted of implementing in the teaching–learning sequence the interventions specified above and measuring the results obtained through a standardized assessment with the Kahoot! game, to observe if there is a statistically significant influence on the level of knowledge assimilated by students.
  • Post-test 2 substage: measuring the level of knowledge of students acquired after applying the implemented methods, through a standardized assessment with the Kahoot! game.
All self-created lessons from the intervention period on the 12 Hydrosphere contents were translated as learning activities (18 per lesson) into Google Forms/JavaScript game and scored for enhancing student motivation.
The lessons in Google Forms (which contain the same didactic elements as the game) were created on the Google Drive of the institutional email address.
The creation of the online games involved the integration of the requirements into web pages using the following web languages: JavaScript, HTML and CSS. A sample sheet for the 12 lessons was introduced into this program and uploaded to an online server (www.hidrosfera-jocuri.com, accessed on 16 April 2023); the students only needed Internet access via any compatible device (laptop, desktop, mobile phone, etc.). With a single button in the internet browser, the games can be translated into any international language. The games created are unique because they include didactic elements: specific skills, grading performance standards and responsible attitudes; at the end, when the students’ scores are displayed, the games identify the skills in which students have gaps if they answered incorrectly. The researchers wanted an application focused on learning based on the students’ skills and on the connection of the lessons with real life, but also on developing the students’ self-evaluation capacity.
The data were obtained by storing the scores obtained from the learning activities during the teaching–learning management sequence, from the Google Forms survey (online model) or from the self-created JavaScript web game (traditional model), and from the Kahoot! formative assessment platform.
The obtained data were analyzed in the following manner: the individual grades of the students from the formative tests were processed using R, a programming language for statistical computing, with the “lsr”, “rcompanion”, “dplyr”, “car” and “ggpubr” packages, in order to compare and identify the magnitude/effect size of the students’ expected progress in school acquisitions, based on test grades during the intervention period. The following were calculated: descriptive indices such as the mean, quartile values (Q1, median, Q3), standard deviation, skewness and kurtosis, and the results of parametric and non-parametric statistical tests for all evaluations (Table 1), namely the two-sided and right-tailed t-test, F-test, Shapiro–Wilk test, Mann–Whitney U-test, Cohen’s d effect size index and r, for the three sub-stages; additional boxplot graphs were also created to translate the numerical data into a visual form and facilitate their understanding.
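As an illustration only, and not the authors’ original script, the pipeline described above can be sketched in R as follows; the data frame, the column names “group” and “grade”, and the toy values are assumptions introduced for the example.

library(dplyr)       # descriptive indices
library(lsr)         # cohensD()
library(rcompanion)  # wilcoxonR()
library(ggpubr)      # ggboxplot()

# Toy data: one row per formative test grade (illustrative, not the study data)
scores <- data.frame(
  group = rep(c("experimental", "control"), each = 5),
  grade = c(9, 8.5, 10, 9.5, 7, 6, 6.5, 7.5, 8, 5.5)
)

# Descriptive indices per group (mean, quartiles, standard deviation)
scores %>%
  group_by(group) %>%
  summarise(mean = mean(grade), q1 = quantile(grade, 0.25),
            median = median(grade), q3 = quantile(grade, 0.75), sd = sd(grade))

exp_grades <- scores$grade[scores$group == "experimental"]
ctl_grades <- scores$grade[scores$group == "control"]

# Assumption checks: normality (Shapiro-Wilk) and equality of variances (F-test)
shapiro.test(exp_grades)
shapiro.test(ctl_grades)
var.test(exp_grades, ctl_grades)

# If the assumptions hold: two-sample t-test and Cohen's d
t.test(exp_grades, ctl_grades)
cohensD(exp_grades, ctl_grades)

# Otherwise: Mann-Whitney U-test and the rank-based effect size r
wilcox.test(grade ~ group, data = scores)
wilcoxonR(x = scores$grade, g = factor(scores$group))

# Boxplots comparing the two groups, as in Figures 2-4
ggboxplot(scores, x = "group", y = "grade")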
The second stage consisted of the application of a questionnaire, created in the Google Forms application, regarding the perception of the students in the experimental group of the ICT tools used in learning. The data were stored on the institutional email account and were analyzed based on the responses and quantified as percentages.
The third stage consisted of an interview in the form of a focus group with teaching staff with experience in the field, regarding their opinion on the ICT tools used in the teaching–learning management sequence. The data obtained were stored and analyzed based on the responses and quantified as percentages.

3. Results

The analysis of the data collected through the experiment covered the first stage of the research, which involved three sub-stages:
In the first sub-stage (statistical, pre-intervention, pre-test 1), we analyzed the results of the previous formative assessments by comparing the scores obtained individually at the class level (in the 12 classes included in the research) during five weeks (considered pre-test 1), totaled per experimental group and control group in the two school years, to establish that there was no statistically significant difference between the two groups before the experiment was conducted. It can be observed (Table 1) that the differences between the groups are very small (in the Q1, Q3 and median quartile values), respectively 0.37 points in the online model and 0.051 points in the traditional model. The number of tests is unequal, although the number of students is equal, since during the pandemic the legislation was more permissive for students.
The graphic representations (Figure 2), made in order to ascertain whether this difference affects the homogeneity of the dispersion of the subjects in the experimental group, show that the distributions tend to be normal, with a small asymmetry in the control group in the 2021–2022 school year (pandemic traditional model). The vertical axis represents the grades obtained by the students and the horizontal axis the type of test for the two groups of students, specific to the teaching–learning model (in Figure 2, Figure 3 and Figure 4).
In the second sub-stage (experimental, during the intervention period; post-test 1/pre-test 2), after applying the intervention to the experimental group, the results of the formative evaluations of the two groups (considered post-test 1) were analyzed by comparing the scores obtained individually by the students per class level in the 12 classes included in the online model and the 12 classes in the traditional pandemic model. The intervention period is considered post-test 1 in comparison with the previous period (pre-test 1) and pre-test 2 in comparison with the later period (post-test 2).
In order to find out whether there are statistical differences at the group level between the experimental group and the control group, the authors checked, one by one, in R, whether the conditions for applying the t-test are met. After applying the Shapiro–Wilk test (Table 2), it is observed that, in both the experimental group and the control group, the p-values are lower than the 0.05 significance level (e.g., 9.205 × 10⁻⁵, converted from scientific notation to decimal form at https://calculator.name/scientific-notation-to-decimal/, accessed on 26 February 2023). The conditions for applying the t-test were therefore not met for the two compared groups, because they do not have a normal distribution; nor were they met following the application of the F-test, in which the obtained p-values are lower than the significance level alpha = 0.05. Since no normality trend was identified, the Mann–Whitney U-test was applied to see whether there were differences between the two groups. After applying the Mann–Whitney U-test, the p-values obtained were lower than 0.05, so it can be stated that the difference between the two groups is statistically significant. In the two cases, the obtained values of the effect size r range from 0.239 in the online model, during the intervention period with the Google Forms tool (indicating a small-to-moderate association), to 0.721 in the traditional model, during the treatment period with the custom-created JavaScript web game (indicating a very large statistical effect and a strong association).
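As a side note, and purely as an illustration rather than part of the authors’ workflow, the conversion of p-values from scientific notation to decimal form can also be done directly in R:

# Convert p-values reported in scientific notation to plain decimal form
format(9.205e-05, scientific = FALSE)   # "0.00009205"
format(2.2e-16, scientific = FALSE)     # "0.00000000000000022"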
In the experimental group (Table 3), with the Google Forms tool associated with the Microsoft PowerPoint presentation, the average per class was 1.297 points higher, and during the intervention period with the custom-created JavaScript web game, the average per class was 1.08 points higher; the upper quartile (Q3) of the students’ grades was above 9, indicating very good results.
The graphs (Figure 3) show that there are significant differences between the minimum, median and maximum values, in favor of the experimental group. In the experimental group, in the online model, although the minimum score was lower, the circles (outliers) in the graph indicate that this concerned only two students and that the scores generally start at a higher level; the students who used the JavaScript game scored above 5.
In the third sub-stage (post-intervention, post-test 2), the results of each class are analyzed by comparing the assessment results obtained during the intervention period and during the subsequent period, over five weeks in all classes, in order to observe the progress or regression of the students after the end of the treatment. Pre-test 2 includes the tests administered during the application of the intervention, and post-test 2 refers to the tests following the application of the treatment.
In the experimental group, in the online model, the post-test 2 class mean was 1.617 points lower than in pre-test 2, and in the traditional model, the post-test 2 class mean was 1.892 points lower than in pre-test 2. In the control group, the obtained results remained relatively constant. The values of quartile 1, the median and quartile 3 are also higher in favor of the experimental group and the application of the intervention (pre-test 2). At the same time, the obtained values of the standard deviation are in all cases lower in the experimental group and during the treatment period (pre-test 2). The results are presented in Table 4.
It can be seen from the graphs (Figure 4) that, in the experimental group, there are differences between the median, the minimum values and the maximum values; the distribution of the grades has a tendency towards normality, and the grades decreased significantly after the end of the intervention. In the control group, there are no significant differences in the median and maximum values, showing slight negative asymmetries, with lower values predominating.
The analysis of the data collected through the questionnaire and the interview covered the other two stages of the study.
By applying socio-metric methods, the authors also sought constructive feedback from the students and from other teaching staff on the ICT tools integrated in the teaching–learning of geography.
Regarding the Google Forms questionnaire applied to students about their opinion on the use of ICT tools during the intervention period, 80.3% of the students in the experimental group ticked total agreement regarding reuse (299 of the 336 students answered). Starting from the premise that the feedback provided by the students is constructive for the teacher, applying such a research tool was considered necessary, because it informs the teacher about the strengths and weaknesses of the implemented method and its eventual reuse in the future. The advantages and disadvantages of the ICT tools used, as presented by the students, correspond to those formulated by the teaching staff, which are presented below.
Then, an interview (a focus group guide applied to a sample of 29 teaching staff) was used to provide a clearer picture of the future use of the proposed ICT tools by other teaching staff. Thus, 75.9% of the teachers fully agreed with their use. They presented the advantages of the two ICT tools, summarized in Table 5.
They also presented the disadvantages of the two ICT tools, summarized in Table 6.
The conclusion is that the respondents were satisfied with all the discussions carried out during the focus group meeting, showing the usefulness of the presented tools in their future personal teaching activity.

4. Discussion

In this research, the authors investigated, by comparison, the efficiency of the integrated ICT tools based on the scores obtained in the students’ formative assessments, while addressing the research objectives. In interpreting the results, we kept in mind the purpose of the study: to validate that the school progress of students, based on the specific skills from the school curriculum, can be ensured by using ICT tools in teaching–learning for all students in a class.
The results obtained showed that the students who benefited from the Google Forms teaching–learning tools and the custom-created JavaScript game, associated with the Microsoft PowerPoint presentation, obtained significantly better scores, highlighting the higher level of acquired knowledge, improving the quality of the educational process, thus validating the general objective of the research.
This research demonstrated that through the active participation of all students in a class, the specific competences of the school program can be formed for all students and can ensure school progress.
At the same time, the ICT tools used can be considered participatory methods of learning through discovery, in which students develop a stimulating and competitive attitude towards learning and responsible attitudes towards real life, and learn the importance of checking the results, just as other studies have proven [54].
It was also noticed later that the students make analogies with the contents studied during the intervention period, which indicates that the information acquired through ICT tools is lasting. On the other hand, the ICT tools created represent open educational resources that can easily be used by other teachers who teach geography, according to their opinion. On this topic, there have been studies that examined the long-term effects of some ICT tools in the form of a game, but focusing less on students’ attitudes and more on knowledge [55].
The factor analysis of the results obtained for the experimental group reflects that the positive influencing factors were the learning activities for all students in a class incorporated in the ICT tools and their attractiveness for the students, especially in the traditional pandemic model, which also developed competitiveness. The lower results of the control group reflect work tasks that engaged only a part of the students. Also, during the pandemic period, the two learning models themselves may have acted as negative factors.
In addition to requiring the constant and increased attention of the students, the teaching–learning ICT tools integrated in the study require electronic devices and a permanent, good-quality Internet connection, which can also constitute negative factors for both groups; similarly, other researchers have identified financial issues [56]. The quality of school learning depends in particular on the IT equipment of the school and students, on the needs of teachers and students, but also on the level of professional training, as was also found in other studies [57,58].
The authors observed the need to analyze the results obtained from the continuous evaluation, so that the teaching staff can subsequently adequately integrate the didactic strategies, in order to ensure the quality of the educational process.
Although the form of formative assessment used, the interactive game Kahoot!, has demonstrated over time that it has a positive effect, being motivational and providing a sense of well-being to students [7], the competitiveness between students can lead them to tick answers far too quickly, so they may answer incorrectly.
In addition, these digital technologies give confidence in school success and energize the students and the lesson, representing motivational tools for teachers in didactic activity. At the same time, the use of digital technology in school learning lends itself to the measurement and acceptance of multimedia educational tools, as demonstrated by various other studies [59].
Pedagogical reasoning is represented by the ability of ICT to improve the teaching and learning process and to help students and teachers acquire and develop applied skills through the use of these tools.
The main contribution of this study is also to reveal optimal solutions for practice, such as the constant use of teaching–learning and formative assessment during a lesson with the help of ICT tools, in relation to the didactic elements presented within them. By providing scoring, not only the teacher but also the student notices the deficiencies in each lesson; there is an opportunity to remediate them in time, and the student develops a deeper understanding of the information, just as other studies have shown regarding the assessment sequence [41,60].
The disadvantages of the ICT tools used in this research are few compared to their advantages, as also shown by the opinions of other teaching staff and by the results obtained. Therefore, students have a better chance of improving their learning results if these tools are used by other teachers, either in the teaching–learning sequence or in the evaluation sequence. It is therefore important that integrated ICT tools are promoted and their use encouraged among education beneficiaries.
Although ICT tools have proven effective in school learning, they must not lead to a departure from the objectives of learning, an aim that this study sought to respect.
As hypothesized, students’ school learning was improved in the study; these results are in line with other research showing that ICT instructional aids (in the form of games) have positive influences on students’ knowledge acquisition [17,61,62,63].
The present study has certain limitations that must be considered. One of the main limitations is the size and origin of the sample (336 students from a single high school, although the intervention was later reapplied personally and by other teachers) and the limited window for observing the long-term effect of the game (the game created was implemented for a period of five weeks). In addition, not all schools currently have the ICT means necessary to conduct classes with sufficient didactic resources, so the teacher and the students would need to make special financial efforts to ensure the necessary minimum of technological means, and school learning cannot be achieved everywhere at maximum levels. Also, the subjects may have been biased, as the teacher was present in the classroom and the students were relatively young.
At the level of implementation, the results of the research indicated the need for two electronic devices (a video projector for the teaching staff and a telephone or other device for the students) in order to obtain better results, an aspect observed after their re-application this school year, when significantly better results were recorded.
The results of this study can contribute to the mapping of the main ways of use of technological tools by teachers for the purpose of teaching–learning–school evaluation.
The implications of this research for the field of knowledge are that teaching–learning models for all students in a class, supported by integrated ICT tools, can be extended for use at national and international levels, especially in special teaching situations, as was the case during COVID-19.
Also, the proposed methodological addition contributes to the diversification of didactic activities and the completion of practical approaches to assessment tools and effective teaching–learning methods, which can eventually lead to a unitary assessment in education.
The present study evaluated the effective teaching–learning relationship through ICT tools, with significant results in assessment, through the positive perception of students and teachers regarding the integrated ICT tools. Possible recommendations and further research directions in the pre-university education sector would be to expand to larger samples from several geographical areas that have the appropriate technology and to build other instruments of this type, with the aim of contributing to the specialized literature.
It is possible and desirable to compare the results of this research with the application of ICT by other teachers (and in other fields of activity by using other topics as well), contributing to the development of more general perspectives and conclusions and their improvement.
The results of this research bring an original approach to teaching–learning ICT tools (including the personal development of a web tool) and to learning activities for all students in a class, presented with didactic elements included; although the applied tools have been studied in the literature, this was only as evaluation tools and without didactic elements in their content. The results also provide an overview of how the integration and application of ICT in school learning should be achieved.
This model provides educational decision makers with a comprehensive knowledge base regarding the current state of ICT integration (in schools that are not equipped with the necessary technology) and can promote the overall goal by identifying the factors that hinder this process versus the factors that ensure the quality of the educational process.

5. Conclusions

The ICT tools presented, used as teaching–learning techniques, create effective opportunities for learning and assessment and can be recommended for ICT research, especially in pre-university education, in alignment with modern training that keeps pace with development. They represent an accessible and relevant way to approach an educational process centered on the student. Most of the students in the experimental group obtained better grades than the students in the control group. As a difference between the tools, the custom web tool created as a JavaScript game worked better in terms of the results obtained, because a game is generally more attractive and more enjoyable for students and does not become monotonous over time. ICT tools are also accessible to both students and teachers, requiring only an Internet connection.
At the same time, the creation of well-being in the learning and evaluation of the main beneficiary of education, the student, and the full, active participation of a class of students during a lesson, demonstrated by the results obtained, have major implications for ensuring the quality of the instructional–educational process.
Also, the adequate completion of the stages of the didactic activity, the density of the work tasks for the students, the awareness of the evaluation results and the creation of ICT tools approved by the students, in accordance with the teaching methodology of a subject, have special implications in ensuring the scholastic progress of the students.

Author Contributions

Conceptualization, F.T., A.A. and C.G.; methodology, F.T., A.N. and A.A.; software, F.T.; validation, F.T. and A.N.; formal analysis, F.T. and A.A.; investigation, F.T.; resources, F.T., A.A. and C.G.; data curation, F.T.; writing—original draft preparation, F.T.; writing—review and editing, F.T., A.A. and D.C.D.; visualization, F.T.; supervision, F.T. and D.C.D.; project administration, F.T. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study. The study was conducted in accordance with the Declaration of Helsinki and approved by the Ethics Committee of the “Mihai Eminescu” National College, Bucharest (no. 3065/27.10.2020), for conducting the study with the students of the college, for studies involving humans.

Data Availability Statement

The raw data supporting the conclusion of this article will be made available by the authors, without undue reservation.
  • Below, you have access to data and applications, by category:
  • Fifth grade:
1. Lesson 1 with Google Forms tool:
2. Lesson 1 with our JavaScript web tool:
3. Kahoot game lesson 1 feedback/formative assessment sequence:
4. Lesson 2 with Google Forms tool:
5. Lesson 2 with our JavaScript web tool:
6. Kahoot game lesson 2 feedback/formative assessment sequence:
7. Lesson 3 with Google Forms tool:
8. Lesson 3 with our JavaScript web tool:
9. Kahoot game lesson 3 feedback/formative assessment sequence:
10. Lesson 4 with Google Forms tool:
11. Lesson 4 with our JavaScript web tool:
12. Kahoot game lesson 4 feedback/formative assessment sequence:
13. Lesson 5 with Google Forms tool:
14. Lesson 5 with our JavaScript web tool:
15. Kahoot game lesson 5 feedback/formative assessment sequence:
  • Sixth grade:
1. Lesson 1 with Google Forms tool:
2. Lesson 1 with our JavaScript web tool:
3. Kahoot game lesson 1 feedback/formative assessment sequence:
  • Ninth grade:
1. Lesson 1 with Google Forms tool:
2. Lesson 1 with our JavaScript web tool:
3. Kahoot game lesson 1 feedback/formative assessment sequence:
4. Lesson 2 with Google Forms tool:
5. Lesson 2 with our JavaScript web tool:
6. Kahoot game lesson 2 feedback/formative assessment sequence:
7. Lesson 3 with Google Forms tool:
8. Lesson 3 with our JavaScript web tool:
9. Kahoot game lesson 3 feedback/formative assessment sequence:
10. Lesson 4 with Google Forms tool:
11. Lesson 4 with our JavaScript web tool:
12. Kahoot game lesson 4 feedback/formative assessment sequence:
13. Lesson 5 with Google Forms tool:
14. Lesson 5 with our JavaScript web tool:
15. Kahoot game lesson 5 feedback/formative assessment sequence:
  • Tenth grade:
1. Lesson 1 with Google Forms tool:
2. Lesson 1 with our JavaScript web tool:
3. Kahoot game lesson 1 feedback/formative assessment sequence:
  • Microsoft PowerPoint lessons Hydrosphere:
  • Results Kahoot! games from the feedback/formative assessment sequence:
  • Focus group guide on a sample of 29 teaching staff:

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Wang, Y.-H. The effectiveness of integrating teaching strategies into IRS activities to facilitate learning. J. Comput. Assist. Learn. 2017, 33, 35–50. [Google Scholar] [CrossRef]
  2. Sarpong, K.A.M.; Arthur, J.K.; Amoako, P.Y.O. Causes of failure of students in computer programming courses: The teacher-learner perspective. Int. J. Comput. Appl. 2013, 77, 27–32. [Google Scholar]
  3. Kennewell, S. Using affordances and constraints to evaluate the use of ICT in teaching and learning. J. IT Teach. Educ. 2001, 10, 101–116. [Google Scholar]
  4. Harrison, C.; Comber, C.; Fisher, T.; Haw, K.; Lewin, C.; Linzer, E.; McFarlane, A.; Mavers, D.; Scrimshaw, P.; Somekh, B.; et al. Impact2: The Impact of Information and Communication Technologies on Pupil Learning and Attainment; Becta: Coventry, UK, 2002. [Google Scholar]
  5. Kennewell, S.; Tanner, H.; Jones, S.; Beauchamp, G. Analysing the use of interactive technology to implement interactive teaching. J. Comput. Assist. Learn. 2007, 24, 61–73. [Google Scholar] [CrossRef]
  6. Toma, F.; Diaconu, D.C.; Popescu, C.M. The Use of the Kahoot! Learning Platform as a Type of Formative Assessment in the Context of Pre-University Education during the COVID-19 Pandemic Period. Educ. Sci. 2021, 11, 649. [Google Scholar] [CrossRef]
  7. Toma, F.; Diaconu, D.C. The Efficiency of Using the Google Forms Tool at the Stage of a Lesson Focusing on Directing the Teaching-Learning Process for Geography Discipline—An Online Model. Ann. Univ. Craiova 2022, 23, 101–124. [Google Scholar] [CrossRef]
  8. Chakma, U.; Li, B.; Kabuhung, G. Creating online metacognitive spaces: Graduate research writing during the COVID-19 pandemic. Issues Educ. Res. 2021, 31, 37–55. [Google Scholar]
  9. Rios-Campos, C.; Gutiérrez Valverde, K.; Vilchez de Tay, S.B.; Reto Gómez, J.; Agreda Cerna, H.W.; Lachos Dávila, A. Argentine Universities: Problems, COVID-19, ICT & Efforts. Cuest. Políticas 2022, 40, 880–884. [Google Scholar]
  10. Bhagwat, M.; Kulkarni, P. Impact of COVID-19 on the Indian ICT Industry. Cardiometry 2022, 23, 699–709. [Google Scholar]
  11. Li, B. Ready for Online? Exploring EFL Teachers’ ICT Acceptance and ICT Literacy During COVID-19 in Mainland China. J. Educ. Comput. Res. 2022, 60, 196–219. [Google Scholar] [CrossRef]
  12. Tomei, L.A. Taxonomy for the Technology Domain; Information Science Publishing, Robert Morris University: Moon Township, PA, USA, 2005. [Google Scholar]
  13. Zhang, P.; Aikman, S. Attitudes in ICT Acceptance and use. In Human-Computer Interaction; Jacko, J., Ed.; Springer: Berlin/Heidelberg, Germany, 2007. [Google Scholar]
  14. García-Perales, R.; Almeida, L. An enrichment program for students with high intellectual ability: Positive effects on school adaptation. Comunicar 2019, 60, 39–48. [Google Scholar] [CrossRef]
  15. Glover, M.J. Google Forms can stimulate conversations in discussion-based seminars? An activity theory perspective. South Afr. J. High. Educ. 2020, 34, 99–115. [Google Scholar] [CrossRef]
  16. Murphy, M.P.A. “Blending” Docent Learning: Using Google Forms Quizzes to Increase Efficiency in Interpreter Education at Fort Henry. J. Mus. Educ. 2018, 43, 47–54. [Google Scholar] [CrossRef]
  17. Rejón-Guardia, F.; Polo-Peña, A.I.; Maraver-Tarifa, G. The acceptance of a personal learning environment based on Google apps: The role of subjective norms and social image. J. Comput. High. Educ. 2019, 32, 203–233. [Google Scholar] [CrossRef]
  18. Basri, M.; Husain, B.; Modayama, W. University students’ perceptions in implementing asynchronous learning during Covid-19 era. Metathesis J. Engl. Lang. Lit. Teach. 2021, 4, 263–276. [Google Scholar] [CrossRef]
  19. Simamora, R.M. The Challenges of Online Learning during the COVID-19 Pandemic: An Essay Analysis of Performing Arts Education Students. Stud. Learn. Teach. 2020, 1, 86–103. [Google Scholar] [CrossRef]
  20. Liu, S.H.J.; Lan, Y.J. Social Constructivist Approach to Web-Based EFL Learning: Collaboration, Motivation, and Perception on the Use of Google Docs. Educ. Technol. Soc. 2016, 19, 171–186. [Google Scholar]
  21. Zhang, R.; Zou, D. Types, features, and effectiveness of technologies in collaborative writing for second language learning. Comput. Assist. Lang. Learn. 2022, 35, 2391–2422. [Google Scholar] [CrossRef]
  22. Andrew, M. Collaborating Online with Four Different Google Apps: Benefits to Learning And Usefulness for Future Work. J. Asia TEFL 2019, 16, 1268–1288. [Google Scholar] [CrossRef]
  23. Bondarenko, O.; Mantulenko, S.; Pikilnyak, A. Google Classroom as a tool of support of blended learning for geography students. arXiv 2019, arXiv:1902.00775. [Google Scholar]
  24. Basilaia, G.; Dgebuadze, M.; Kantaria, M.; Chokhonelidze, G. Replacing the classic learning form at universities as an immediate response to the COVID-19 virus infection in Georgia. Int. J. Res. Appl. Sci. Eng. Technol. 2020, 8, 101–108. [Google Scholar] [CrossRef]
  25. Albashtawi, A.; Al Bataineh, K. The Effectiveness of Google Classroom Among EFL Students in Jordan: An Innovative Teaching and Learning Online Platform. Int. J. Emerg. Technol. Learn. 2020, 15, 78–88. [Google Scholar] [CrossRef]
  26. Okmawati, M. The use of Google Classroom during pandemic. J. Engl. Lang. Teach. 2020, 9, 438–443. [Google Scholar] [CrossRef]
  27. Krumm, S.; Thum, I. Distance learning on the Web supported by JavaScript: A critical appraisal with examples from clay mineralogy and knowledge-based tests. Comput. Geosci. 1998, 24, 641–647. [Google Scholar] [CrossRef]
  28. Jaimez-González, C.R. Evaluation of online teaching resources to support the teaching-learning process of web programming with JavaScript and Java Server Pages. Dilemas Contemp. Educ. Politica Valores 2019, 6, 54. [Google Scholar]
  29. Karaaslan, H.; Kilic, N.; Guven-Yalcin, G.; Gullu, A. Students’ reflections on vocabulary learning through synchronous and asynchronous games and activities. Turk. Online J. Distance Educ. 2018, 19, 53–70. [Google Scholar] [CrossRef]
  30. Gee, J.P. What video games have to teach us about learning and literacy. Comput. Entertain. 2003, 1, 20. [Google Scholar] [CrossRef]
  31. Sharples, M. The design of personal mobile technologies for lifelong learning. Comput. Educ. 2000, 34, 177–193. [Google Scholar] [CrossRef]
  32. Cheng, M.T.; Lin, Y.W.; She, H.C. Learning through playing Virtual Age: Exploring the interactions among student concept learning, gaming performance, in-game behaviors, and the use of in-game characters. Comput. Educ. 2015, 86, 18–29. [Google Scholar] [CrossRef]
  33. Lin, Y.C.; Hsieh, Y.H.; Hou, H.T.; Wang, S.M. Exploring students’ learning and gaming performance as well as attention through a drill-based gaming experience for environmental education. J. Comput. Educ. 2019, 6, 315–334. [Google Scholar] [CrossRef]
  34. Cheng, M.T.; Annetta, L. Student’s learning outcomes and learning experiences through playing a Serious Educational Game. J. Biol. Educ. 2012, 46, 203–213. [Google Scholar] [CrossRef]
  35. Elsherbiny, M.M.K.; Al Maamari, R.H. Game-based learning through mobile phone apps: Effectively enhancing learning for social work students. Soc. Work. Educ. 2020, 40, 315–332. [Google Scholar] [CrossRef]
  36. Annetta, L.; Mangrum, J.; Holmes, S.; Collazo, K.; Cheng, M.T. Bridging Realty to Virtual Reality: Investigating gender effect and student engagement on learning through video game play in an elementary school classroom. Int. J. Sci. Educ. 2009, 31, 1091–1113. [Google Scholar] [CrossRef]
  37. Zaharias, P.; Chatzeparaskevaidou, I.; Karaoli, F. Learning Geography Through Serious Games: The Effects of 2-Dimensional and 3-Dimensional Games on Learning Effectiveness, Motivation to Learn and User Experience. Int. J. Gaming Comput. Mediat. Simul. 2017, 9, 28–44. [Google Scholar] [CrossRef]
  38. Parker, R.; Thomsen, B.S.; Berry, A. Learning Through Play at School—A Framework for Policy and Practice. Front. Educ. 2022, 7, 1–12. [Google Scholar] [CrossRef]
  39. Mullik, R.; Haque, S.S. Film as a pedagogical tool for geography during the pandemic induced virtual classes. GeoJournal 2022, 88, 465–477. [Google Scholar] [CrossRef]
  40. Iwamoto, D.H.; Hargis, J.; Taitano, E.J.; Vuong, K. Analyzing the efficacy of the testing effect using KahootTM on student performance. Turk. Online J. Distance Educ. 2017, 18, 80–93. [Google Scholar] [CrossRef]
  41. Dolezal, D.; Posekany, A.; Motschnig, R.; Kirchweger, T.; Pucher, R. Impact of game-based student response systems on factors of learning in a person-centered flipped classroom on C programming. In EdMedia+ Innovate Learning; Association for the Advancement of Computing in Education (AACE): Waynesville, NC, USA, 2018; pp. 1143–1153. [Google Scholar]
  42. Taylor, B.; Reynolds, E. Building vocabulary skills and classroom engagement with Kahoot. In Proceedings of the 26th Korea TESOL International Conference, Seoul, Republic of Korea, 13–14 October 2018; p. 89. [Google Scholar]
  43. Wang, A.I.; Tahir, R. The effect of using Kahoot! for learning—A literature review. Comput. Educ. 2020, 149, 103818. [Google Scholar] [CrossRef]
  44. Cohen, J. Statistical Power Analysis for the Behavioral Sciences, 2nd ed.; Routledge: New York, NY, USA, 1988; 567p. [Google Scholar] [CrossRef]
  45. Rosenthal, R.; Rosnow, R.L.; Rubin, D.B. Contrasts and Effect Sizes in Behavioral Research: A Correlational Approach; Cambridge University Press: Cambridge, UK, 2000. [Google Scholar]
  46. Schagen, L.; Hodgen, E. How Much Difference Does It Make? Notes on Understanding, Using, and Calculating Effect Sizes for Schools; Research Division, Ministry of Education and Edith Hodgen, New Zealand Council for Educational Research: Wellington, New Zealand, 2009. [Google Scholar]
  47. Hattie, J.A.C. Visible Learning for Teachers—Maximizing Impact on Learning; Routledge: New York, NY, USA, 2012. [Google Scholar] [CrossRef]
  48. Butler, J.A. Use of teaching methods within the lecture format. Med. Teach. 1992, 14, 11–25. [Google Scholar] [CrossRef]
  49. Wastiau, P.; Blamire, R.; Kearney, C.; Quittre, V.; Van de Gaer, E.; Monseur, C. The use of ICT in education: A survey of schools in Europe. Eur. J. Educ. 2013, 48, 11–27. [Google Scholar] [CrossRef]
  50. Apostu, O.; Balica, M.; Fartusnic, C.; Horga, I.; Novak, C.; Voinea, L. Analysis of the pre-university education system in Romania from the perspective of statistical indicators. In Data-Based Education Policy; Institute of Education Sciences: Bucharest, Romania, 2015. [Google Scholar]
  51. Fink, A. How to Conduct Surveys—A Step-by-Step Guide; SAGE Publications, Inc.: New York, NY, USA, 2015; ISBN 9781483378480. [Google Scholar]
  52. Frost, J. Hypothesis Testing: An Intuitive Guide for Making Data Driven Decisions; Jim Publishing: Costa Mesa, CA, USA, 2020. [Google Scholar]
  53. Pfefferman, D.; Rao, C.R. Sample Surveys Design, Methods and Applications; Elsevier: Amsterdam, The Netherlands; Boston, MA, USA, 2009. [Google Scholar]
  54. Martin, D.; Treves, R. Embedding e-learning in geographical practice. Br. J. Educ. Technol. 2007, 38, 773–783. [Google Scholar] [CrossRef]
  55. Mondozzi, M.A.; Harper, M.A. In search of effective education in burn and fire prevention. J. Burn. Care Rehabil. 2001, 22, 277–281. [Google Scholar] [CrossRef]
  56. Almaiah, M.A.; Al-Khasawneh, A.; Althunibat, A. Exploring the Critical Challenges and Factors Influencing the E-Learning System Usage during COVID-19 Pandemic. Educ. Inf. Technol. 2020, 25, 5261–5280. [Google Scholar] [CrossRef]
  57. Ana, A.; Minghat, A.D.; Purnawarman, P.; Saripudin, S.; Muktiarni, M.; Dwiyanti, V.; Mustakim, S.S. Students’ Perceptions of the Twists and Turns of E-learning in the Midst of the Covid 19 Outbreak. Rom. Mag. Multidimens. Educ. 2020, 12 (Suppl. 2), 15–26. [Google Scholar] [CrossRef]
  58. Gewin, V. Into the Digital Classroom. Five Tips for Moving Teaching Online as COVID-19 Takes Hold. Nature 2020, 580, 295–296. [Google Scholar] [CrossRef]
  59. Joo, Y.J.; Lee, H.W.; Ham, Y. Integrating user interface and personal innovativeness into the TAM for mobile learning in Cyber University. J. Comput. High. Educ. 2014, 26, 143–158. [Google Scholar] [CrossRef]
  60. Ruan, L.; Long, Y.; Zhang, L.; Lv, G.A. Platform and Its Applied Modes for Geography Fieldwork in Higher Education Based on Location Services. ISPRS Int. J. Geo-Inf. 2021, 10, 225. [Google Scholar] [CrossRef]
  61. Barab, S.; Thomas, M.; Dodge, T.; Carteaux, R.; Tuzun, H. Making learning fun: Quest Atlantis, a game without guns. Educ. Technol. Res. Dev. 2005, 53, 86–107. [Google Scholar] [CrossRef]
  62. Chirca, R. The Educational Potential of Video Games. Manag. Intercult. 2015, 34, 415–419. [Google Scholar]
  63. Toma, F.; Diaconu, C.D.; Dascălu, G.V.; Nedelea, A.; Peptenatu, D.; Pintilii, R.D.; Marian, M. Assessment of geography teaching-learning process through game, in pre-university education. Stud. UBB Geogr. 2022, 67, 93–121. [Google Scholar] [CrossRef]
Figure 1. Explanatory flowchart for the three stages and sub-stages.
Figure 2. Boxplot graphs specific to the (a) online model and (b) traditional model with the results of the parametric and non-parametric statistical tests obtained from the Kahoot! formative assessments (pre-test 1) prior to the intervention, showing individual grades per class level and group composition; data processed in the RStudio program. Source: Student sample data.
Figure 3. Boxplot graphs for the (a) online model and (b) traditional model, with the results of the parametric and non-parametric statistical tests obtained from the Kahoot! (post-test 1/pre-test 2) prior to the intervention, showing individual grades per class level; data processed in the RStudio program. Source: Student sample data.
Figure 4. Boxplot graphs for the (a) online model and (b) traditional model, with the results of the parametric and non-parametric statistical tests obtained from the Kahoot! (pre-test 2/post-test 2) during and after the intervention method, showing individual grades per class level; data processed in the RStudio program. Source: Student sample data.
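The RStudio scripts that generated Figures 2–4 are not published with the article. As a minimal, hypothetical sketch (not the authors' code), boxplots of this kind could be drawn in base R as follows; the data frame grades and its columns grade, group and model are simulated placeholders rather than the study data.

```r
# Hypothetical sketch only: simulated grades standing in for the study data.
set.seed(1)
grades <- data.frame(
  grade = pmin(10, pmax(1, round(rnorm(400, mean = 7, sd = 1.6)))),
  group = rep(c("Experimental", "Control"), each = 200),
  model = rep(c("Online", "Traditional"), times = 200)
)

# One panel per intervention model, one box per group, as in Figures 2-4.
par(mfrow = c(1, 2))
for (m in c("Online", "Traditional")) {
  boxplot(grade ~ group, data = subset(grades, model == m),
          main = paste(m, "model"), ylab = "Grade", ylim = c(1, 10))
}
```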
Table 1. The results of the parametric and non-parametric statistical tests obtained from the Kahoot! (pre-test 1) prior to the intervention method, showing individual grades per class level and group composition; data processed in the RStudio program.

| Group / Type of Test | Intervention Method | Term and Number of Tests | Number of Students | Minimum | Q1 | Median | Q3 | Maximum | Average | Standard Deviation |
|---|---|---|---|---|---|---|---|---|---|---|
| Experimental group, Pre-test 1 | Online model: Microsoft PowerPoint presentation | 5 weeks / 552 tests | 168 | 2 | 5 | 7 | 8 | 10 | 6.645 | 1.77087 |
| Control group, Pre-test 1 | Online model: Microsoft PowerPoint presentation | 5 weeks / 522 tests | 168 | 3 | 5 | 7 | 8 | 10 | 6.682 | 1.63409 |
| Experimental group, Pre-test 1 | Online model: Microsoft PowerPoint presentation | 5 weeks / 699 tests | 168 | 3 | 6 | 7 | 8 | 10 | 7.33 | 1.51520 |
| Control group, Pre-test 1 | Online model: Microsoft PowerPoint presentation | 5 weeks / 651 tests | 170 | 3 | 6 | 7 | 9 | 10 | 7.279 | 1.55354 |
Source: Student sample data.
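The authors' analysis scripts are not included, but the per-group descriptive statistics reported in Tables 1, 3 and 4 (minimum, quartiles, maximum, average and standard deviation) can be obtained with a few lines of base R. The sketch below is a hedged illustration on a simulated grade vector (grades_experimental), not the study data.

```r
# Hypothetical sketch only: simulated 1-10 grades for one group and one test.
set.seed(1)
grades_experimental <- pmin(10, pmax(1, round(rnorm(168, mean = 7.3, sd = 1.5))))

# Descriptive statistics of the kind reported in Tables 1, 3 and 4.
describe <- function(x) {
  q <- quantile(x, probs = c(0.25, 0.50, 0.75))
  round(c(n = length(x), minimum = min(x), Q1 = q[[1]], median = q[[2]],
          Q3 = q[[3]], maximum = max(x), average = mean(x), sd = sd(x)), 3)
}

describe(grades_experimental)
```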
Table 2. The results of the parametric and non-parametric statistical tests obtained from the Kahoot! (post-test 1/pre-test 2) during the intervention period, comparing individual grades per class level; data processed in the RStudio program.

| Group / Type of Test | Intervention Method | p-Value, Shapiro–Wilk Test | p-Value, F Test (Experimental/Control Group) | p-Value, Mann–Whitney U Test (Experimental/Control Group) | Effect Size r (Control/Experimental Group) |
|---|---|---|---|---|---|
| Experimental group, Post-test 1/Pre-test 2 | Online model: Google Forms tool and Microsoft PowerPoint presentation | <2.2 × 10⁻¹⁶ (reported as <0.0001) | 1.791 × 10⁻⁵ (reported as <0.0001) | <2.2 × 10⁻¹⁶ (reported as <0.0001) | 0.239, small effect size, indicating a weak association |
| Control group, Post-test 1/Pre-test 2 | Online model: Microsoft PowerPoint presentation | 2.224 × 10⁻¹³ (reported as <0.0001) | | | |
| Experimental group, Post-test 1/Pre-test 2 | Traditional model: Own web game created in JavaScript and Microsoft PowerPoint presentation | <2.2 × 10⁻¹⁶ (reported as <0.0001) | 3.426 × 10⁻¹⁶ (reported as <0.0001) | <2.2 × 10⁻¹⁶ (reported as <0.0001) | 0.745, very large effect size, indicating a very strong association |
| Control group, Post-test 1/Pre-test 2 | Traditional model: Microsoft PowerPoint presentation | 3.641 × 10⁻¹⁴ (reported as <0.0001) | 3.426 × 10⁻¹⁶ (reported as <0.0001) | <2.2 × 10⁻¹⁶ (reported as <0.0001) | 0.745, very large effect size, indicating a very strong association |
Source: Student sample data.
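The test sequence summarised in Table 2 (Shapiro–Wilk normality check, F test for equality of variances, Mann–Whitney U test, effect size r) maps onto standard base R functions. The sketch below is a hedged illustration on simulated vectors (exp_grades, ctl_grades), not the study data; the effect size is approximated as r = |z|/√N from the two-sided p-value, which may differ slightly from the computation used by the authors.

```r
# Hypothetical sketch only: simulated grades for one experimental/control pair.
set.seed(2)
exp_grades <- pmin(10, pmax(1, round(rnorm(168, mean = 8.1, sd = 1.5))))
ctl_grades <- pmin(10, pmax(1, round(rnorm(168, mean = 6.9, sd = 1.8))))

shapiro.test(exp_grades)$p.value          # Shapiro-Wilk normality check, per group
shapiro.test(ctl_grades)$p.value
var.test(exp_grades, ctl_grades)$p.value  # F test for equality of variances

# Grades are not normally distributed, so the two groups are compared with the
# Mann-Whitney U (Wilcoxon rank-sum) test.
mw <- wilcox.test(exp_grades, ctl_grades)
mw$p.value

# Effect size r approximated from the two-sided p-value: r = |z| / sqrt(N).
n_total <- length(exp_grades) + length(ctl_grades)
abs(qnorm(mw$p.value / 2)) / sqrt(n_total)
```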
Table 3. The results of the parametric and non-parametric statistical tests obtained from the Kahoot! (post-test 1/pre-test 2) prior to the intervention method, showing individual grades per class level; data processed in the RStudio program.

| Group / Type of Test | Intervention Method | Term and Number of Tests | Number of Students | Minimum | Q1 | Median | Q3 | Maximum | Average | Standard Deviation |
|---|---|---|---|---|---|---|---|---|---|---|
| Experimental group, Post-test 1/Pre-test 2 | Online model: Google Forms tool and Microsoft PowerPoint presentation | 5 weeks / 488 tests | 168 | 2 | 7 | 8 | 9 | 10 | 8.068 | 1.5185 |
| Control group, Post-test 1/Pre-test 2 | Online model: Microsoft PowerPoint presentation | 5 weeks / 487 tests | 168 | 2 | 6 | 7 | 8 | 10 | 6.876 | 1.8247 |
| Experimental group, Post-test 1/Pre-test 2 | Traditional model: Own web game created in JavaScript and Microsoft PowerPoint presentation | 5 weeks / 394 tests | 168 | 5 | 8 | 9 | 10 | 10 | 8.82 | 1.0984 |
| Control group, Post-test 1/Pre-test 2 | Traditional model: Microsoft PowerPoint presentation | 5 weeks / 372 tests | 170 | 2 | 7 | 8 | 9 | 10 | 7.74 | 1.6750 |
Source: Student sample data.
Table 4. The results of the parametric and non-parametric statistical tests obtained from the Kahoot! (pre-test 2/post-test 2) during and after the intervention method, showing individual grades per class level; data processed in the RStudio program.

| Group / Type of Test | Intervention Method | Term and Number of Tests | Number of Students | Minimum | Q1 | Median | Q3 | Maximum | Average | Standard Deviation |
|---|---|---|---|---|---|---|---|---|---|---|
| Experimental group, Pre-test 2 | Online model: Own web game created in JavaScript and Microsoft PowerPoint presentation | 5 weeks / 488 tests | 168 | 2 | 7 | 8 | 9 | 10 | 8.068 | 1.5375 |
| Experimental group, Post-test 2 | Online model: Microsoft PowerPoint presentation | 5 weeks / 596 tests | 168 | 2 | 6 | 7 | 8 | 10 | 6.788 | 1.6541 |
| Control group, Pre-test 2 | Online model: Microsoft PowerPoint presentation | 5 weeks / 487 tests | 168 | 2 | 5 | 7 | 8 | 10 | 6.771 | 1.8192 |
| Control group, Post-test 2 | Online model: Microsoft PowerPoint presentation | 5 weeks / 649 tests | 168 | 2 | 6 | 7 | 8 | 10 | 6.783 | 1.5685 |
| Experimental group, Pre-test 2 | Traditional model: Own web game created in JavaScript and Microsoft PowerPoint presentation | 5 weeks / 394 tests | 168 | 5 | 8 | 9 | 10 | 10 | 8.82 | 1.0984 |
| Experimental group, Post-test 2 | Traditional model: Microsoft PowerPoint presentation | 5 weeks / 372 tests | 168 | 3 | 6 | 8 | 8 | 10 | 7.203 | 1.5599 |
| Control group, Pre-test 2 | Traditional model: Microsoft PowerPoint presentation | 5 weeks / 633 tests | 170 | 2 | 7 | 8 | 9 | 10 | 7.74 | 1.6750 |
| Control group, Post-test 2 | Traditional model: Microsoft PowerPoint presentation | 5 weeks / 568 tests | 170 | 3 | 6 | 7 | 8 | 10 | 7.422 | 1.5051 |
Source: Student sample data.
Table 5. Teachers' analysis of the benefits of using the custom-created web-based JavaScript game tool associated with the Microsoft PowerPoint presentation for all students in the class during the teaching–learning lesson sequence.

Category: Perceptions of the benefits of integrating the JavaScript web game into learning
- The lesson is interactive and captures the students' attention, without the risk of them losing interest;
- Develops a wide range of skills: working with maps, using ICT tools, but also communication in the mother tongue;
- Stimulates learning, maintains motivation and provides immediate feedback;
- Ensures the correlation of learning activities with the specific competencies in the curriculum;
- Scientific content adapted to the students' level;
- Students learn without stress, and school progress becomes visible;
- The content of the game, with all its didactic elements, is new and original, being well thought out and realized;
- Learning activities are presented in an attractive and accessible form;
- Real-time correction by validating answers ("correct"; "you were wrong!");
- All students are active and quickly associate the learning content with the questions in the game;
- All students are evaluated in real time, with objectivity and speed;
- Motivational and fast scoring;
- The accuracy of the images and the wording of the items allow a quick and efficient assessment of all students in real time;
- It is an original game in which the specific skills and the performance standards of the targeted lesson are presented transparently to students, in order to achieve visible learning;
- Students prefer such games in learning because they stimulate motivation and competitiveness;
- The speed with which the information from the current lesson is consolidated, learning being encouraged by competition between students;
- Items formulated clearly and concisely and accompanied by relevant images and maps;
- The group of students is focused and motivated at the same time;
- Creates a state of well-being;
- A sense of belonging to the game and of competition emerges for all students;
- Students learn the lesson in the classroom, being in a certain way co-creators of the lesson;
- Develops digital competence for both students and teachers through the confident and critical use of information and communication technology for learning;
- The simultaneous involvement of a large number of students in learning activities;
- The use of images and maps, particularly important means in teaching–learning geography;
- Students know whether their answer is right or wrong and which question they are at;
- Presentation of solving time, score and detailed answers;
- The existence of progress in learning, proven by the presented experiment, for the content about the hydrosphere;
- Students can replay the game/test, allowing problems to be fixed "on the fly";
- Reduces the waiting time for results, which allows the student and the teacher to analyze their progress;
- Because the questions follow the sequence of the content, they provide progressive feedback and, therefore, remediation;
- Represents an attractive and assumed way to develop the "learning to learn" competence;
- The active involvement of students even in the design of such tools, thus demonstrating to them the links between the subjects studied;
- Web games can be applied to several classes, even by different teachers, becoming open educational resources;
- Learning and assimilating knowledge step by step;
- The possibility of being implemented in other schools and promoted on social media and in student groups;
- The use of modern tools in a period of technological learning;
- A different approach to learning, combining impactful methods in the formation of key skills;
- Learning through games will be increasingly appreciated by students in the context of digitization; it could be used as a standardized assessment method in the current context of changes in the education law;
- The possibility of carrying out the teaching–learning process in accordance with the requirements of the current socio-economic environment;
- Unitary teaching and standardized assessment;
- Allows students to get closer to understanding what assessment means, by seeing what the objectives and standards are; students can also become test creators.

Category: Perceptions of the benefits of Google Forms in learning
- In the case of multiple-choice or combination items, the evaluation is done automatically;
- Develops a wide range of skills: working with maps, using ICT tools;
- Uses time efficiently;
- Results and progress are provided immediately;
- Efficient and objective evaluation method;
- Adapts the content to the age particularities of the students, respecting the school curriculum;
- Interactive and attractive lessons for all students;
- The lessons offered as a model are very well built around specific skills and operational objectives;
- Provides data for rapid feedback for all students in the class;
- Based on the results, graphs and statistics can be created regarding how work tasks are performed against different skills and objectives;
- Activates students simultaneously;
- Various item types can be applied (objective, semi-objective and subjective);
- Develops students' attention, imagination, quickness and depth of thinking, memory, observational spirit and other intellectual traits that contribute to the correct acquisition of geography concepts;
- Prevents monotony and boredom;
- The presented concept of using the tool in the teaching–learning sequence is very interesting;
- The simultaneous involvement of a large number of students in learning activities;
- The use of images and maps, particularly important means in teaching–learning geography;
- Clear tasks, appropriate images, and learning progress proven by the presented experiment;
- No costs for using Google Forms;
- Reduces the time needed to evaluate the answers, compared with classic evaluation.
Source: Sample of answers from teachers in a custom-designed questionnaire.
Table 6. Teachers' analysis of the disadvantages of using the custom-created web-based JavaScript game tool associated with the Microsoft PowerPoint presentation for all students in the class during the teaching–learning lesson sequence.

Category: Perceptions of the disadvantages of integrating the JavaScript web game into learning
- Lack of the necessary electronic equipment in all classrooms or for certain students;
- It can become the students' favorite, and they will accept another ICT tool with difficulty;
- Games can cause addiction and eye problems;
- The location of the school unit in an area with a fluctuating Internet connection;
- The 30 min time limit may be insufficient for slower students.

Category: Perceptions of the disadvantages of Google Forms integration
- For questions with short answers, the evaluation cannot be done automatically, because the system recognizes as correct only the answer in the exact form typed by the creator of the instrument;
- Mainly objective item types are used;
- The exclusive use of the method can lead to low communication, and the ability to argue on a certain topic is lost;
- The effort required of the teaching staff to prepare the lessons and to interpret the results;
- Requires a set of digital devices in order to apply this online tool;
- The tendency to emphasize the verification of assimilated knowledge rather than the ability to analyze, synthesize, understand and operate with this information;
- There may be cases where the students in the last positions feel frustrated;
- The feedback received by the students is not as fast as in the case of the assessment made through the game (quiz) and is therefore less effective for formative assessment during the lesson.
Source: Sample of answers from teachers in a custom-designed questionnaire.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
