Article

Development of an Educational Application for Software Engineering Learning

by
Antonio Sarasa-Cabezuelo
1,* and
Covadonga Rodrigo
2
1
Department of Computer Systems and Computing, School of Computer Science, Complutense University of Madrid, 28040 Madrid, Spain
2
Department of Languages and Computer Systems, Computer Science Engineering Faculty, UNED, 28040 Madrid, Spain
*
Author to whom correspondence should be addressed.
Computers 2021, 10(9), 106; https://doi.org/10.3390/computers10090106
Submission received: 28 July 2021 / Revised: 15 August 2021 / Accepted: 17 August 2021 / Published: 25 August 2021
(This article belongs to the Special Issue Present and Future of E-Learning Technologies)

Abstract

Software engineering is a challenging subject for computer engineering students, since the knowledge and competencies it covers are closer to engineering as a general discipline than to programming. This article describes a software engineering learning application that aims to address this problem through two design ideas. First, to facilitate its use, the tool has been implemented as an Android app, so it can be used anywhere and at any time. Second, it incorporates a gamification system with different learning paths that adapt to the learning style of each student. In this way, students are motivated by competing with their classmates, while the application adapts to the way each of them learns.

1. Introduction

Software engineering is a subject that generally does not motivate computer science students [1]. The theoretical and conceptual nature of its contents is far removed from purely coding tasks [2] and closer to the engineering tasks applied when executing a project [3]. For this reason, students show little enthusiasm and often have difficulty [4] understanding the usefulness of these techniques and applying them in the development of a software project. However, the contents and tools of software engineering are key for any computer engineer [5] to be able to develop and execute a project in professional life. It is for this reason that teachers need tools to motivate students.
There are multiple ways to motivate students, such as actively monitoring less motivated students [6], carrying out internships [7] or adapting the curriculum. In the particular case of software engineering, a very widespread technique consists of simulating the realization and execution of a software project [8]. In this way, students can experience the same problems and difficulties that arise when working in a company. There are variants in the implementation of this simulation [9]. The most common is to create workgroups that specify and execute the same project. Another variant consists of rotating [10] the roles of the students throughout the simulation, so that each of them plays the role of analyst, designer and developer, or has to work with projects that have been specified by other partners.
In recent years, the usefulness of gamification as a motivational element in education has been demonstrated in different areas of knowledge [11]. Games promote competitiveness among students [12] and thus encourage them to become more involved in studying the content [13] covered by the game in order to obtain good results. Several studies [14,15] support the positive effect of its use in education and its influence on a better understanding of the contents [16] and on improved academic results [17]. In this sense, different computer tools have been developed that allow the creation of games whose purpose is to present and teach the contents of a subject [18,19]. The result can be a game with significant multimedia content [20], similar to a video game, or a simpler game in which the important element is the competition that arises in it.
On the other hand, in the last decade there has been a revolution in the way content is accessed [21], moving towards digital access based on mobile devices [22,23]. Most students use mobile devices as the main, and in many cases the only, way to access information and to interact with others [24]. In particular, they are used intensively to consume multimedia elements [25] such as movies, videos and photos. Mobile devices have important advantages in this regard, since they can be used at any time [26] and anywhere. This offers great flexibility, as there are no schedule restrictions [27], leaving users free to decide when to use them. Likewise, mobile devices have another advantage [28] with respect to the speed of access and updating of content: the content creator can keep the content updated in a very simple way [29] and report updates and news immediately.
This article presents a tool that aims to help software engineering teachers motivate students and complement their training. Two design principles have been considered. The first is to exploit the advantages offered by a game as a motivational element that favors the competitiveness of students and their involvement. The second concerns the format of the game: it has been implemented as an application for mobile devices [30], given their widespread use among students and the flexibility of being usable anywhere and at any time.
The structure of the paper is as follows. In Section 2, the architecture of the application, the data model used and the REST API implemented will be described. Next, in Section 3, the functionality of the application is presented according to the three types of users that can use it: administrator, student and teacher. Section 4 describes an evaluation of usability that has been carried out among a group of professors, students and non-university personnel. Finally, Section 5 presents a set of conclusions and lines of future work.

2. Architecture and Data Model

The application has been implemented using a client-server model. The client is an Android application that runs on the mobile device, makes requests to the server and waits for a response. The server implements a set of services used by clients through a REST API (application programming interface) that provides access to each specific service. When the client makes an HTTP request to one of the services exposed by the REST API, the server retrieves the necessary data from the database, processes it and returns it to the client with the required structure. To use a service, the client only needs to know the format and content of the response to the requested service.
The data model has been implemented using a MariaDB-type relational database consisting of nine tables:
  • User table. Stores profile information and manages the three types of users present in the application.
  • Topic table. Stores the information of the topics present in the application and their description.
  • Question table. Stores the information of the question repository present in the application.
  • NodoCA table. Represents a node in the tree that contains an exam question to be shown to the student. It contains the question data, a pointer to the node with the alternative question, a pointer to the topic that the question refers to, and a pointer to the teacher who created the question.
  • Alternative node table. Represents the alternative question shown in the exam if the student fails the main question. This entity is a node in the tree structure that forms the learning path.
  • Achievements table. Represents the level obtained by a student in a certain topic, so it is related to both the User and Topic tables.
  • Answer table. Represents the information of an answer to a specific question, so it contains a pointer to the question from which it came.
  • Statistics table. Contains the results a student has obtained in a certain exam, that is, the number of correct and failed answers and the average response time.
  • Question_Suspended table. Contains the ids of the questions that a student has failed in a certain exam.
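As an illustration of the tree-shaped relationship between NodoCA entries, their alternative questions and the learning path, the tables above can be mirrored with hypothetical Python classes. The class and field names below are illustrative only; the actual schema lives in the MariaDB tables just described.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical mirror of the Question and NodoCA tables; names are
# illustrative, not the application's actual column names.
@dataclass
class Question:
    question_id: int
    text: str

@dataclass
class NodoCA:
    main: Question                        # question shown first
    alternative: Question                 # shown only if the main question fails
    topic_id: int                         # pointer to the Topic table
    teacher_id: int                       # pointer to the teacher who created it
    next_node: Optional["NodoCA"] = None  # next step in the learning path

# A two-step learning path on the same topic, created by the same teacher:
n2 = NodoCA(Question(3, "What is coupling?"), Question(4, "Define coupling."), 1, 7)
n1 = NodoCA(Question(1, "What is UML?"), Question(2, "Name a UML diagram."), 1, 7, n2)
print(n1.next_node.main.question_id)  # 3
```

The `next_node` pointer chains the nodes into the tree traversed during an exam, while `alternative` holds the fallback question for each node.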
Finally, regarding the implemented REST API, services have been defined for the following modules:
  • User module. It contains all the services related to the user profile, login and registration, as well as the achievements, milestones and statistics of the students. These services are mounted on the base URL “/user”. The available endpoints are described in Table 1.
  • Question module. It contains all the services related to the questions of the common repository shared by all teachers. These services are mounted on the base URL “/question”. The available endpoints are described in Table 2.
  • Topic module. It contains all the services related to the topics of the common repository shared by all users. These services are mounted on the base URL “/topic”. The available endpoints are described in Table 3.
  • NodoCA module. It contains all the services related to the nodes that together represent a complete exam, that is, the tree corresponding to a learning path created by a teacher. These services are mounted on the base URL “/nodeca”. The available endpoints are described in Table 4.
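The grouping of services under four base URLs can be sketched as a minimal dispatch table. This is a pure-Python illustration of the routing idea, not the application's actual service layer; the real endpoint names are those listed in Tables 1–4, and the handler names below are hypothetical.

```python
# Hypothetical dispatch table mapping the four module base URLs to
# handler functions; handler and endpoint names are illustrative.
def user_handler(endpoint: str) -> str: return f"user module handles {endpoint}"
def question_handler(endpoint: str) -> str: return f"question module handles {endpoint}"
def topic_handler(endpoint: str) -> str: return f"topic module handles {endpoint}"
def nodeca_handler(endpoint: str) -> str: return f"nodeca module handles {endpoint}"

MODULES = {
    "/user": user_handler,
    "/question": question_handler,
    "/topic": topic_handler,
    "/nodeca": nodeca_handler,
}

def dispatch(path: str) -> str:
    """Route an HTTP path such as '/user/login' to its module handler."""
    base, _, rest = path.lstrip("/").partition("/")
    handler = MODULES.get("/" + base)
    if handler is None:
        raise ValueError(f"404: no module mounted at /{base}")
    return handler("/" + rest)

print(dispatch("/user/login"))  # user module handles /login
```

Because each module is independent of the others, a new module can be added by registering one more entry in the table, which reflects the loosely coupled design mentioned later in the conclusions.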

3. Functionality

The functionality of the Android application is explained below. First, the dynamics of the game are described, and then the functions of each type of user: administrator, student and teacher.

3.1. The Quiz

From the student’s perspective, the application implements a game structured in several phases or content modules. Each module contains different topics, each with an associated level of difficulty. The four levels of the game are initial, bronze, silver and gold. The objective of a student is to obtain the highest possible level in each module. To pass a level, the student must take an exam that consists (Figure 1) of 10 main questions and 10 alternative questions, one for each main question. There are 30 s to answer each question; if this time is exceeded, the system moves on to the next question and the unanswered one is counted as a failure.
Structurally, each exam is represented as a question tree in which each node has one child representing the main question and another child representing the alternative question. If the student answers the main question correctly, one point is added. If the student fails the main question, half a point is forfeited, a message is displayed on the screen, and the alternative question is shown. If the alternative question is answered correctly, 0.5 points are added; if it is also failed, 0 points are assigned for that question. This process continues (Figure 2) until Question 10 (in the worst case, 20 questions must be answered in total).
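The traversal above can be expressed as a small simulation, using the per-question values stated in the conclusions (one point for a correct main question, half a point for a correct alternative). `ExamNode` and the answer model are illustrative, not the application's actual classes.

```python
from dataclasses import dataclass

@dataclass
class ExamNode:
    main_q: str  # id of the main question
    alt_q: str   # id of the alternative question

def take_exam(nodes, knows):
    """Simulate one exam; `knows` is the set of question ids answered correctly."""
    score = 0.0
    asked = 0
    for node in nodes:
        asked += 1
        if node.main_q in knows:
            score += 1.0          # correct main question: 1 point
        else:
            asked += 1            # alternative question is shown
            if node.alt_q in knows:
                score += 0.5      # correct alternative: half a point
            # failed alternative: 0 points for this question
    return score, asked

exam = [ExamNode(f"m{i}", f"a{i}") for i in range(10)]
# All main questions correct: maximum score, 10 questions asked.
print(take_exam(exam, {f"m{i}" for i in range(10)}))  # (10.0, 10)
# Every main question failed, every alternative correct: worst-case 20 questions.
print(take_exam(exam, {f"a{i}" for i in range(10)}))  # (5.0, 20)
```

The maximum score of 10 is consistent with the pass thresholds of 5, 7 and 8 points listed below.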
When the exam ends, a sound informs the student whether or not they have passed, according to the score. If the student passes, the application increases their game-completion percentage. If the student abandons the exam, they drop one level with respect to the one they hold at that moment. Each exam contributes a percentage of the completed game depending on the level being attempted: completing the gold level adds 45% of the completed game, the silver level 35% and the bronze level 20%. Since there are 16 topics available, a user at the initial level who passes an exam on Topic 1 to move up to the bronze level increases their completed-game percentage by 20/16 = 1.25%. The percentages for the silver and gold levels are calculated in the same way.
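The per-topic contribution described above can be computed directly: the 16 topics share 20% (bronze), 35% (silver) and 45% (gold) of the completed game. This short sketch only restates that arithmetic; the names are illustrative.

```python
# Share of the completed game assigned to each level, divided among topics.
LEVEL_SHARE = {"bronze": 20.0, "silver": 35.0, "gold": 45.0}
NUM_TOPICS = 16

def topic_contribution(level: str) -> float:
    """Percentage of the game completed when one topic reaches the given level."""
    return LEVEL_SHARE[level] / NUM_TOPICS

print(topic_contribution("bronze"))  # 1.25
print(topic_contribution("silver"))  # 2.1875
print(topic_contribution("gold"))    # 2.8125
```

Reaching gold in all 16 topics sums the three shares for every topic, giving exactly 100% of the game.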
The criteria to pass to the next level are as follows:
  • If the user had an initial level and obtains a score ≥5 in the exam, they will pass to the bronze level;
  • If the user was bronze level and gets a score ≥7 in the exam, they will pass to silver level;
  • If the user had a silver level and gets a score ≥8 in the exam, they will pass to the gold level;
The game ends when the user reaches the gold level in all topics.
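The level-progression criteria above can be summarized as a lookup from the current level to the minimum passing score and the next level; the function name is illustrative.

```python
# Rules from Section 3.1: current level -> (minimum passing score, next level).
PROGRESSION = {
    "initial": (5.0, "bronze"),
    "bronze": (7.0, "silver"),
    "silver": (8.0, "gold"),
}

def next_level(current: str, score: float) -> str:
    """Return the level after an exam; gold is the final level for a topic."""
    if current == "gold":
        return "gold"
    threshold, nxt = PROGRESSION[current]
    return nxt if score >= threshold else current

print(next_level("initial", 5.0))  # bronze
print(next_level("bronze", 6.5))   # bronze
```

A student stays at their current level when the score falls below the threshold, matching the criteria listed above.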

3.2. Student

The main functions of the student user are:
  • Take an exam. Figure 3 shows the screen where the different content modules appear. When clicking on a module, the user is asked whether they want to take an exam to level up and, if they accept, the exam is shown. To do this, the student clicks on the “Game” (“Juego”) menu tab, and a screen is displayed with all the available topics and the level reached in each of them. By clicking on one of the topics and then on the “Play” (“Jugar”) button, the student can take an exam to obtain the next level in that particular topic. If there are published exams for that topic and level, the student can take the exam. Otherwise, the application returns to the initial screen and reports that there are no exams in the repository for the next level.
  • View statistics. Figure 4 shows the screen listing the modules a user has completed. Clicking on an exam shows the statistics of each module, including the incorrect answers of each exam taken. The user can see the statistics of the exams taken, the average response time and the level obtained in each topic. To do this, the user goes to the Ranking screen of the menu, where a list of all students is displayed, and clicks on their own profile or that of any other student to see the levels reached in each topic. Only the user’s own statistics are visible, not those of other students. By clicking on the “Show Fails” (“Mostrar Suspensos”) button, a screen is displayed summarizing the failed exams; clicking on a specific exam shows its content and the failed questions.
  • Reset state. The student can reset the game counter and clear all the statistics and exams taken by clicking on the “Reset Score and Achievements” (“Resetear Puntuación y Logros”) button on the main screen.

3.3. Teacher

The main functions of the teacher user are the following:
  • Check the status of a student. It is possible to view the ranking and the statistics associated with any student registered in the application.
  • Manage the question repository. Figure 5 shows the teacher’s screen with all the content modules. When the teacher clicks on a module, another screen is displayed where exams for that module can be created or edited. When a teacher accesses the “Game” (“Juego”) menu tab, an interface lists the course topics, in this case without levels associated with each topic. Clicking on a topic displays a screen with the following actions:
    • Add question. Figure 6b shows the teacher’s screen for creating a new question to include in the question repository. The teacher clicks on the “Create Question” (“Crear Pregunta”) button, which displays a new screen where the question data are entered, and then on the “Save” (“Guardar”) button, which stores the question in the repository common to all teachers. The questions in the repository can be modified, deleted or used to create an exam by any teacher.
    • Create a question repository. Figure 6a shows the teacher’s screen for creating a question repository. The teacher clicks on the “Create Repository” (“Crear Repositorio”) button, which displays a form where the number of questions in the repository must be indicated. The question-creation process is then repeated as many times as the number of questions indicated.
    • Delete a question. The teacher clicks on the “Delete Question” (“Borrar Pregunta”) button, which shows a screen with all the questions available in the common repository. Clicking on a question prompts for confirmation; if confirmed, the question is permanently deleted from the repository. If the question is used in an exam, it cannot be deleted and an informational message is displayed.
    • Modify a question. The teacher clicks on the “Modify Question” (“Modificar Pregunta”) button, which shows a screen with all the questions in the common repository. Clicking on a question opens the same screen used for “Add question”, but with the data filled in. The teacher modifies the data and then clicks on the “Save” (“Guardar”) button.
    • Create an exam. Figure 7a shows the list of exams that have not yet been published, and Figure 7b shows the screen for creating a new exam. The teacher clicks on the “Create CA” (“Crear CA”) button, indicates the level of the exam (bronze, silver or gold) and is then shown the question repository, where the following is repeated 10 times: choose a main question and choose an alternative question. The repository must contain at least 11 questions in order to create an exam.
    • Modify an exam. Figure 7c shows the screen for modifying an exam. The teacher clicks on the “Modify CA” (“Modificar CA”) button, which shows a summary of the exams that have not yet been published. Clicking on one of them displays all the information of the exam with the question repository below it. If one of the exam questions is selected, the application asks whether to replace the main question or the alternative; the teacher then selects a replacement question from the repository and finally clicks on the “Modify” (“Modificar”) button.
    • Delete an exam. Figure 7d shows the screen for deleting a specific exam. The teacher clicks on the “Delete CA” (“Borrar CA”) button, which shows a summary of all published and unpublished exams. Clicking on an exam displays all its information; clicking on the “Delete” (“Borrar”) button removes it from the system.
    • Publish an exam. The teacher clicks on the “Publish CA” (“Publicar CA”) button, which shows a summary of all unpublished exams. Clicking on an exam displays all its information; clicking on the “Publish” (“Publicar”) button publishes it in the system.

3.4. Administrator

The main functions of the administrator user are the following:
  • Deactivate a teacher. Figure 8a shows the screen where a teacher can be deactivated. The administrator searches for the user to be deactivated, selects them and clicks on the “Unsubscribe Users” (“Dar de baja usuarios”) button.
  • Activate a teacher. Figure 8b shows how to activate a teacher. If a teacher user is pending activation, a notification is displayed when the administrator logs in. To activate the user, the administrator clicks on the notification, and a screen appears where the activation must be confirmed by clicking “Accept” (“Aceptar”).

3.5. Other Common Functions

The application has a set of functions common to all users:
  • Login. Figure 9a shows the login screen. The user must be registered in the application; on the login screen, the username and password are entered. When the session starts, the user’s home page is displayed. It is always possible to log out and return to the login screen using the “Log out” (“Cerrar sesión”) button.
  • Register. Figure 9b shows the registration screen. The “Register” (“Registro”) button on the login screen must be clicked, and the user enters the requested data on the screen that appears. In particular, the user must indicate whether they are registering as a teacher or as a student.
  • Edit profile. Figure 9c shows the screen for editing a user’s profile. The user clicks on the edit button on their home page, and a screen is displayed with the data that can be modified: password, image and others. To confirm the changes, the “Update” (“Actualizar”) button must be clicked.
  • Unsubscribe. Figure 10c shows the screen for unsubscribing from the application. The “Unsubscribe” (“Dar de baja”) button on the user’s home page must be clicked and the action confirmed.
  • Check the ranking. Figure 10a shows the application options menu and Figure 10b shows the user profile edit screen. The user clicks on the menu located at the top right of the home page, and a list of students is shown; selecting one displays their name, nickname, email and percentage of the game completed.

4. Evaluation

An evaluation of the usability of the implemented application has been carried out. A study population of 43 people was considered, made up of 13 teachers, 17 students and 13 people with no relation to the university. The administrator, student and teacher roles were evaluated and, depending on the role, one set of questions or another was shown (half of the respondents tested the application in the role of student, and the other half in the roles of administrator and teacher). The questions asked in the evaluation are measured on a Likert scale between one and five, where one is the least satisfied and five the most. The evaluation was carried out using Google Forms. In each block of the survey, the steps to follow are shown before the user is asked the assessment question (the questions used in the assessment can be found in Appendix A). The question blocks are classified as follows:
  • The first block of questions shows general questions related to age, relationship with university, and gender.
  • The second block shows questions related to the user’s role. At this point, the user must follow the steps indicated before evaluating the questions.
  • The third block of questions deals with the global evaluation of the application and proposals for improvements and changes, where the user can freely give their opinion.
The results of the evaluation have been as follows. The usability of the profile-editing interface was assessed in all cases with values greater than three points (Figure 11a). Likewise, the game-screen interface showing the study topics was evaluated with four and five points (Figure 11b), while the process of uploading an image to the profile received a high percentage of two- and three-point ratings (Figure 11c). Finally, all the respondents rated the process of taking an exam with five points (Figure 11d).
The process of editing the profile and uploading an image in the teacher role is rated with two points in most cases and three points in the rest (Figure 12a,b). On the other hand, the game interface for the teacher role is rated in all cases with four and five points (Figure 12c). The process of creating an exam or updating a question from the repository is rated by participants above three points (Figure 12d). Finally, satisfaction with the exam publication and deletion screens among users taking the survey as teachers is rated above four points in all cases (Figure 12e,f).
The usability of the administrator-role interface has been evaluated by all participants with four points (Figure 13).
A total of 33% of those surveyed rated the ease of use of the application with four points and 66% with five points (Figure 14a). Likewise, most of the participants rated the colors used in the application and the usability of the interface with four and five points (Figure 14b), and only a few cases rated them with two and three points.
Block 3 of the evaluation contained free-response questions in which the respondents made some proposals for improvement, such as:
  • View the screen horizontally.
  • Eliminate background sound when pressing.
  • Improve the colors and design of the interface.
  • Improve the editing of an exam.
  • Improve the display of some texts in the application.
  • Improve the color and font of some texts.
  • Improve the identification of possible answers in question tests.

5. Conclusions and Future Work

This article has described an Android application that complements the training of university students in software engineering. The application implements a question game organized into 16 topics with different levels of difficulty. To finish the game, it is necessary to obtain the maximum level in each of the topics. The game consists of taking exams with questions on the corresponding topic at the difficulty appropriate to the level being played. Each exam consists of 10 main questions and 10 alternative questions for the cases in which a main question is failed. Each correct answer to a main question earns one point, and each correct answer to an alternative question earns half a point; in all other cases, zero points are obtained. The students who participate in the game are ranked according to the points obtained in the exams taken and can consult the points of the rest of the participants. In addition to the student user, an administrator role and a teacher role, in charge of preparing the questions and exams, have been defined.
The application has been evaluated among students, professors and people outside the university, obtaining good results and good acceptance.
The main advantage of the application is its use of gamification to introduce and complement a subject that is normally complex and not very motivating for computer science students. Through play and the promotion of competitiveness, students become motivated and receptive to the contents explained. Another advantage is the possibility of using the application from a mobile phone, making it easy to access anytime, anywhere; this aspect is key, given that current students make intensive use of mobile devices. It is also worth highlighting the intuitive and simple interfaces of the application, which facilitate its use. Finally, from the teacher’s point of view, the application is interesting because it constitutes an effective tool to complement regular face-to-face training. In addition, it serves as an instrument to get to know the students better: the concepts they do not understand, their level of knowledge, and their participation and motivation.
There is a wide variety of systems offering functionalities similar to those implemented in this work. However, the described tool presents some novelties and differences. First, the learning process is novel in that it allows the creation of different learning itineraries adapted to the levels of the students. For this, a tree-shaped structure of the exams is used, in which alternative questions to the main ones can be created, functioning as a Socratic tutor. Second, from a technological point of view, the application implements a layer of REST web services that allows services to be modified and added in a simple way, since all services are independent of each other, achieving a loosely coupled, consistent, easily extensible and maintainable system. Lastly, the tool implements a “learning by doing” strategy, given that learning is carried out through practical tests in which the teacher can delve into a topic through questions aimed at reinforcing learning.
However, the application has some limitations that represent future lines of work. First, with regard to the exams, it would be interesting to expand the types of questions that can be used, as well as to offer a certificate of completion of the course. Regarding the services offered to students, the application could be improved with functions such as creating groups of friends among users so that they can follow each other's activity, implementing the interface in languages other than Spanish, or providing new communication tools for teachers and students. In addition, the operation of the application could be improved by allowing competitions between the best players, extending the application to other areas of knowledge, or connecting it with the course management system used at the university to share information about the students, their activity and the grades obtained.
Finally, it should be noted that the evaluation carried out was an exploratory investigation with a non-representative sample. As an improvement, it is proposed to consider a more significant sample and to perform an evaluation in a real software engineering class, with the aim of assessing the tool's usefulness in improving student learning. The qualitative and quantitative data obtained would be analyzed using analysis tools such as SAS Enterprise Miner.

Author Contributions

Conceptualization, A.S.-C. and C.R.; methodology, A.S.-C.; software, A.S.-C. and C.R.; validation, A.S.-C. and C.R.; investigation, A.S.-C.; writing—original draft preparation, A.S.-C. and C.R.; writing—review and editing, A.S.-C. and C.R.; supervision, C.R. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

All data are presented in the main text.

Acknowledgments

The authors would like to thank Rubén García Mateos for implementing the system described in the article.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

The evaluation questions asked are shown below. There are three blocks of questions, Block 1 contained more generic questions about the respondents, Block 2 contained personal opinion about the different functionalities of the application depending on the role of the person who carried out the survey, and Block 3 contained changes proposed by the respondent and general assessment.
The first block of questions that are shown to the user are questions related to age, relationship with the university or teaching, and gender (Figure A1).
The second block shows questions related to the role that the user selected in the last question of Block 1:
(a) Assessment questions for registered student role (Figure A2, Figure A3, Figure A4 and Figure A5)
Figure A1. First block of questions.
Figure A2. Assessment questions for registered student role.
Figure A3. Assessment questions for registered student role.
Figure A4. Assessment questions for registered student role.
Figure A5. Assessment questions for registered student role.
(b) Assessment questions for a registered teacher role (Figure A6, Figure A7, Figure A8, Figure A9 and Figure A10)
Figure A6. Assessment questions for a registered teacher role.
Figure A7. Assessment questions for a registered teacher role.
Figure A8. Assessment questions for a registered teacher role.
Figure A9. Assessment questions for a registered teacher role.
Figure A10. Assessment questions for a registered teacher role.
(c) Assessment questions for an administrator role (Figure A11 and Figure A12).
Figure A11. Assessment questions for an administrator role.
Figure A12. Assessment questions for an administrator role.
The third block of questions consists of a series of questions on the global evaluation of the application and proposals for improvements and changes (Figure A13).
Figure A13. Third block of questions.

Figure 1. Tree structure containing the exam.
Figure 2. Game development sequence diagram.
Figure 3. Take an exam.
Figure 4. View statistics.
Figure 5. Manage question repository.
Figure 6. (a) Create a question repository; (b) add question.
Figure 7. (a) Unpublished exams repository; (b) create exam; (c) modify exam; (d) delete exam.
Figure 8. (a) Deactivate teacher; (b) activate teacher.
Figure 9. (a) Login; (b) edit profile; (c) register.
Figure 10. (a) Menu; (b) ranking; (c) unsubscribe process.
Figure 11. Student role evaluation. (a) Usability of the profile editing interface; (b) usability of the game screen interface; (c) the process of uploading an image; (d) the process of taking an exam.
Figure 12. Teacher role evaluation. (a) The process of editing the profile; (b) the process of uploading an image; (c) evaluation of the interface; (d) the process of creating exams; (e) satisfaction with the quiz publication; (f) satisfaction with the user deletion screens.
Figure 13. Administrator role evaluation.
Figure 14. Overall satisfaction evaluation. (a) Ease of use of the application; (b) usability of the interface.
Table 1. Endpoints available in the User module.

| Endpoint | HTTP Method | Answer | Description |
| --- | --- | --- | --- |
| / | POST | Error code (200 or 400) with msg | Inserts a user in the DB |
| /{nick} | GET | Error code (200 or 400) with user object in body | Gets a user from the DB |
| / | GET | Error code (200 or 400) with users | Gets the list of registered users |
| /{nick} | PUT | Error code (200 or 400) with msg | Updates a user in the DB |
| /level/{nick}/reset | PUT | Error code with msg | Updates user statistics |
| /{nick} | DELETE | Error code with msg | Deletes a user from the DB |
| /teachers | GET | Error code with msg and users (teachers) objects in JSON | Gets all teachers who have submitted a registration request not yet approved by the admin |
| /students | GET | Error msg and users (students) objects in JSON | Gets all student users sorted by percentage |
| /activate/{nick} | PUT | Msg with error code | Activates a teacher user |
| /porc/{nick}/{porcentaje} | PUT | Msg with error code | Updates a user's game completion percentage |
| /picture/{nick} | POST | Msg with error code | Updates a user's profile picture |
| /pictures | GET | User images | Gets the profile images of the students |
| /level/{user}/{IdTema} | GET | Error msg along with user level | Gets the level of a student user in a given topic |
| /levels/{user} | GET | Error msg along with array of levels per topic | Gets all levels of a user per topic |
| /level/increment/{nick}/{idTema} | PUT | Msg with response code | Increases the level of a user in a certain topic |
| /level/decrement/{nick}/{idTema} | PUT | Msg with response code | Decreases the level of a user in a certain topic |
| /stadistics/{nick}/{nodoCA} | PUT | Msg with response code | Updates a user's statistics based on their current level |
| /stadistics/{user} | GET | Msg with response code; if OK, array of user statistics | Gets all user statistics for all levels |
| /exam/suspended/{user} | GET | Msg with response code; if OK, exam object array | Gets the failed exams of a student |
| /{idUsuario}/{idNodoCa} | DELETE | Msg with response code | Removes a user's statistics |
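As an illustration of how a client might address the User module, the following sketch builds request paths for two of the endpoints in Table 1 and mimics the ranking behavior of /students. The base URL and the JSON field names (nick, porc) are assumptions for illustration, not the actual schema of the deployed service.

```python
# Hypothetical client-side sketch for the User module endpoints in Table 1.
# The base URL and the field names "nick" and "porc" are assumptions.
from urllib.parse import quote

BASE = "https://example.org/api/users"  # hypothetical deployment URL


def user_url(nick: str) -> str:
    """Path for GET /{nick}: fetch a single user."""
    return f"{BASE}/{quote(nick)}"


def set_completion_url(nick: str, percentage: int) -> str:
    """Path for PUT /porc/{nick}/{porcentaje}: update the game completion percentage."""
    return f"{BASE}/porc/{quote(nick)}/{percentage}"


def rank_students(students: list) -> list:
    """Mimic GET /students, which returns student users sorted by percentage."""
    return sorted(students, key=lambda s: s["porc"], reverse=True)
```

A call such as `set_completion_url("ana", 75)` yields the path used by the PUT request; the server then answers with a message and an error code, as described in the table.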
Table 2. Endpoints available in the Question module.

| Endpoint | HTTP Method | Answer | Description |
| --- | --- | --- | --- |
| / | POST | Error code (200 or 400) with msg | Inserts a question in the DB |
| /questions/{idTema} | GET | Error code (200 or 400) with question array object | Gets all questions for a given topic |
| /questions/{idTema}/{language} | GET | Error code (200 or 400) with question array object | Gets all questions for a given topic and language |
| /{id} | GET | Error code (200 or 400) with msg and, if OK, question object | Gets a question |
| / | PUT | Error code with msg | Updates all data for a question |
| /{id} | DELETE | Error code with msg | Deletes a question with its answers from the DB |
| /suspended/{userNick}/{nodoCa} | GET | Error code with msg and question object array | Gets the list of failed questions in an exam |
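The two /questions retrieval endpoints in Table 2 can be understood as a filter over the question repository. The sketch below reproduces that filtering logic locally; the record fields (idTema, language) are assumed for illustration and may differ from the stored schema.

```python
# Hypothetical sketch of the filtering behind GET /questions/{idTema}
# and GET /questions/{idTema}/{language}. Field names are assumptions.
def questions_for(repo, id_tema, language=None):
    """Return the questions of a topic, optionally restricted to one language."""
    hits = [q for q in repo if q["idTema"] == id_tema]
    if language is not None:
        hits = [q for q in hits if q["language"] == language]
    return hits
```

An empty result would correspond to the service answering with an empty question array rather than an error, under the response convention shown in the table.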
Table 3. Endpoints available in the Topic module.

| Endpoint | HTTP Method | Answer | Description |
| --- | --- | --- | --- |
| / | GET | Error code (200 or 400) with msg and, if OK, topic object array | Gets all topics |
| /{id} | GET | Error code (200 or 400) with msg and, if OK, topic object | Gets a topic |
Table 4. Endpoints available in the NodoCA module.

| Endpoint | HTTP Method | Answer | Description |
| --- | --- | --- | --- |
| / | POST | Error code with msg | Inserts a node in the DB |
| /saveCA | GET | Error code with msg | Adds a complete CA, that is, saves an exam in the DB |
| /getCA/{nick}/{tema}/{nivel}/{language} | GET | Error code (200 or 400) with question array object | Gets a complete random published CA for a given topic, level (initial, bronze, silver, gold) and language |
| /list/nopublished/{tema}/{teacher}/{language} | GET | Error code with msg and, if OK, full NodoCA object array (tree structure) | Gets all unpublished CAs (exams) of a teacher on a certain topic |
| /list/{tema}/{teacher}/{language} | GET | Error code with msg and, if OK, full NodoCA object array (tree structure) | Gets all the CAs of a teacher on a certain topic |
| /{id} | GET | Error code with msg and, if OK, CA node object without children | Gets a CA node |
| /public/{idParent} | PUT | Error code with msg | Publishes a CA or exam (parameter: id of the parent node) |
| /{idParent} | DELETE | Error code with msg | Deletes an exam from the DB |
| /original/{idParent}/{idNodoASust}/{idQuestionaAn} | PUT | Error code with msg | Updates a CA by replacing an original question; {idNodoASust} is the id of the question to replace and {idQuestionaAn} the id of the question to add |
| /alt/{idParent}/{idNodoASust}/{idQuestionaAn} | PUT | Error code with msg | Updates a CA by replacing an alternative question; {idNodoASust} is the id of the question to replace and {idQuestionaAn} the id of the question to add |
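To make the NodoCA operations concrete, the sketch below models an exam (CA) as a parent node whose children each pair an original question with an alternative, and reproduces the replacement performed by PUT /original/{idParent}/{idNodoASust}/{idQuestionaAn}. The dictionary layout is an assumption for illustration; the stored documents may differ.

```python
# Hypothetical in-memory model of the CA (exam) tree described in Table 4.
def replace_original(exam, node_id, new_question_id):
    """Mimic PUT /original/...: swap the original question held by one child node."""
    for child in exam["children"]:
        if child["id"] == node_id:
            child["original"] = new_question_id
            return True  # corresponds to a 200 answer with msg
    return False  # corresponds to a 400 answer with msg


def getca_path(nick, tema, nivel, language):
    """Build the path for GET /getCA/{nick}/{tema}/{nivel}/{language}."""
    assert nivel in ("initial", "bronze", "silver", "gold")
    return f"/getCA/{nick}/{tema}/{nivel}/{language}"
```

The /alt endpoint would behave analogously, replacing the "alternative" field of the matching child instead of "original".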
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Sarasa-Cabezuelo, A.; Rodrigo, C. Development of an Educational Application for Software Engineering Learning. Computers 2021, 10, 106. https://doi.org/10.3390/computers10090106


