Article

BIPMIN: A Gamified Framework for Process Modeling Education

Politecnico di Torino, Department of Control and Computer Engineering (DAUIN), SoftEng Group, 10129 Turin, Italy
* Author to whom correspondence should be addressed.
Information 2023, 14(1), 3; https://doi.org/10.3390/info14010003
Submission received: 29 September 2022 / Revised: 1 December 2022 / Accepted: 13 December 2022 / Published: 21 December 2022
(This article belongs to the Topic Software Engineering and Applications)

Abstract

Business Process Modeling is a skill that is increasingly sought after for computer engineers, with Business Process Modeling Notation (BPMN) being one example of the tools used in modeling activities. Students of the Master of Computer Engineering course at Politecnico di Torino learn about BPMN in dedicated courses but often underperform on BPMN-related exercises due to difficulties understanding how to model processes. In recent years, there has been a surge of studies that employ gamification (the use of game elements in non-recreational contexts to obtain benefits) as a tool in Computer Engineering education to increase students’ engagement with the learning process. This study aims to use the principles of gamification to design a supplementary learning tool for the teaching of information systems technology, and in particular to improve student understanding and use of BPMN diagrams. The study also analyzes the usability of different game elements and their effectiveness in increasing student motivation and performance. As part of the study, a prototype web application was developed, which implemented three different designs, each incorporating game elements related to progress, competition or rewards. An evaluation of the prototype was then conducted to assess the performance of practitioners in BPMN modeling tasks with the gamified tool, the usability of the proposed mechanics and the enjoyment of the individual game mechanics that were implemented. Using the gamified tool, the users of the experimental sample were able to complete BPMN modeling tasks with performances compatible with estimates made through expert judgement (i.e., gamification had no negative effect on performance), and they were motivated to check the correctness of their models many times during task execution. The system was evaluated as highly usable (System Usability Scale score of 85.8); the most enjoyed game elements were rewards, levels, progress bars and aesthetics.

1. Introduction

Computer engineering is consistently one of the most highly demanded fields in the workplace, with computer engineering-related jobs frequently appearing among the ten most in-demand jobs in Europe [1]. Recently, these skills have become even more sought after due to the digital workplace transformation accelerated by the pandemic. Part of becoming a computer engineer is understanding business processes and how they relate to information systems. If able to accurately model business processes, computer engineers can more effectively design, evaluate and implement information systems that support business needs.
At the Politecnico di Torino, these skills are taught as part of the Master’s Degree in Computer Engineering course. In particular, students are taught how to model and analyze organizational processes using Business Process Model and Notation (BPMN). BPMN is a standard for the graphical representation of business processes. It can be used both at a high level for process analysis and modeling and at a lower level for actual implementation. It describes the functional and organizational aspects, i.e., who does what and when.
Students at the Politecnico often underperform on BPMN-related exercises: the modeling of business processes and their implementation with BPMN is not well understood. BPMN is currently taught at the Politecnico using traditional teaching methods such as lectures and laboratories. However, these teaching methods are being reviewed in the hope of improving student performance, and teachers of the Information Systems course are considering augmenting their curriculum with new teaching methods such as gamification.
Gamification is a field of research aimed at applying game-like elements to non-game scenarios in order to increase interest, motivation and participation, in both industrial and academic contexts [2]; its application to the Software Engineering discipline is relatively new. The main goal of gamification is to increase users’ productivity by stimulating positive feelings through an experience capable of engaging them. Creating a gameful experience that properly suits the context is not trivial: game elements cannot simply be considered separately, since what actually affects the user is the result of their interaction. Gamification-based approaches offer some important advantages from the psychological and user-experience perspectives in non-ludic activities, such as increased motivation, focus and engagement, but also better performance and higher efficiency.
Recent studies on the effectiveness of gamification have reported success in using gamification to increase the motivation and participation of students [3,4,5,6]. Many of these studies also reported an increase in performance by students that participated in the gamified version of their experiment [4,5,7,8,9]. However, other studies have reported mixed results [10,11,12], deducing that the success of the gamified system is highly dependent on the design and implementation of the game elements. As there does not seem to be a unified consensus on the benefits of gamification, it is clear that further research on how gamification is suitable to different scenarios and how to design an effective gamified system is required.

1.1. Goal

The main goal of this study was to design a gamified tool for the teaching of BPMN and to analyze its effectiveness as a strategy to teach BPMN modeling. A gamified tool for BPMN modeling is a novel contribution in the literature, since no similar tools are available in related research.
This study also had a secondary goal of analyzing the effectiveness of different game elements in increasing student motivation, which was identified as a gap in the current literature. For this purpose, several gamification designs were analyzed, both in terms of their effectiveness in increasing student motivation and their potential for increasing student performance with regard to creating accurate BPMN diagrams. Several game elements were assessed and compared, and potential integrations were identified.
At the completion of the study, a recommendation was made on whether or not to incorporate gamification into the teaching of BPMN, as well as proposals for designing the gamified system.

1.2. Scope

This study involved the implementation of a prototype web application. The development part of this study was limited to the application of gamification to existing methods. Consequently, existing tools were incorporated into the application for managing the BPMN diagram creation and linting.
Three different versions of the web application were developed, each designed using different game elements. Since the number of game elements and their possible combinations is large, this study limited its assessment of game elements to three different designs, each pairing related game elements together. The three types of game elements assessed were those related to progress, competition and rewards.
The web application designs were assessed by conducting a trial on student volunteers from the Master of Computer Engineering course at the Politecnico di Torino.

1.3. Paper Structure

The remainder of this paper is organized as follows. Section 2 presents the current state of the art, examining the theory behind the BPMN notation and gamification, as well as analyzing studies that have similar goals. Section 3 describes the implementation of the tool, explaining which gamified mechanics were selected and how the various modules that make up the tool, which was named BIPMIN, were developed. Section 4 details the first preliminary evaluation and its findings. Section 5 discusses the issues that were found during the study. Finally, Section 6 outlines future plans for the tool, based on both the results of the preliminary evaluation and the issues that were found.

2. Background

This section presents an overview of the theoretical concepts behind this study, beginning with an explanation of the BPMN notation, followed by an analysis of gamification and its various uses in Software Engineering. It concludes with a review of previous works in the literature that have a similar focus.

2.1. Business Process Modeling Notation

The Business Process Modeling Notation (https://www.bpmn.org/, accessed on 28 September 2022) is a standard defined by the Business Process Management Initiative, a non-profit organization founded to promote the standardization of common business processes. The organization has since merged with the Object Management Group (https://www.omg.org/, accessed on 28 September 2022) to further the development of e-business and business-to-business practices. The main function of BPMN is to provide a graphical representation of the different elements that compose a business process, with the goal of capturing and defining the activities, rules, actors and responsibilities of the different processes. A BPMN diagram is composed of elements that belong to five basic categories [13]: tasks, flows, events, gateways and pools.

2.1.1. Tasks

Tasks, also called Activities, consist of different actions performed during a business process. The most important kinds of tasks are the following ones:
  • User Tasks: Activities that are performed by a human user with the assistance of a software application;
  • Manual Tasks: Tasks that are manually performed by users without any kind of software application;
  • Service Tasks: Tasks that are performed by software applications.
Activities may also represent Sub-Processes, which are useful for process decomposition and general organization. Tasks are represented as rectangles with rounded corners, as shown in Figure 1. Tasks are graphically distinguished by the icon in the upper-left corner (a human for user tasks, a hand for manual tasks and gears for service tasks), while sub-processes are identified by a + shape at the bottom of the rectangle.

2.1.2. Flows

Flows are arrow-shaped elements used to connect the different blocks of a BPMN diagram. The most relevant ones are called Sequence Flows, so named because they display the sequence in which the various activities that compose a process are executed. Other significant flows are Message Flows, which are used to show the flow of messages between two separate process participants.

2.1.3. Events

Event objects represent something that can happen during a process and that affects its execution. Events usually have a cause, named trigger, and an impact, named result; moreover, they are split into three different categories (Start, Intermediate, End) based on when they affect the flow of a process. How events are drawn depends on their role in relation to the process flow, with the Start events represented by a single circle, the Intermediate events by a double circle and the End events by a bold circle (Figure 2).
Events can be used to express situations that can happen during a process without being directly performed by actors, with a figure inside the circle shape defining the event; examples of such events (Figure 3) include:
  • Receiving a message, either during a process or as the starting action of a process; identified by the outline of an envelope.
  • Sending a message, either in the middle or at the end of a process; identified by a filled-in envelope.
  • Timer events signify waiting for a time condition to become true, after which it becomes possible to either start a Process, start a Sub-Process or wait in the middle of a process flow; identified by a clock.
  • Events used to catch and handle errors; identified by a distorted N-like shape.

2.1.4. Gateways

Gateway elements control the various sequence flows that compose Processes; they can thus allow branching, merging and joining of process paths. Gateways can be used for splitting and receiving flows, with the behavior changing based on the chosen purpose.
Gateways are represented graphically as diamond-shaped elements, as shown in Figure 4, with the caveat that there is no graphical difference between gateways used for splitting flows and those used for merging them.
Some of the most commonly used gateways include:
  • Exclusive gateways. When used for splitting, they allow only one flow to proceed, while, if used for receiving, they wait for one incoming branch to complete before continuing the flow. The flow that is allowed to proceed in case of splitting execution is the one where the activation condition connected to the gateway is verified. Exclusive gateways are identified by the X symbol.
  • Inclusive gateways. They are used to allow one or more outgoing flows to proceed, based on specified conditions, in case of splitting execution; instead, if they are used in reception, they wait for all active incoming flows to complete before continuing execution. These gateways are identified by the O (circle) symbol.
  • Parallel gateways. They split the incoming flow into all the outgoing branches (splitting) while also being used to wait for the completion of all the incoming branches before continuing (receiving). Parallel gateways are identified by the + symbol.

2.1.5. Pools

Pools are horizontal, rectangle-shaped containers used to represent the different participants in a Business Process. A Pool contains the Sequence Flow associated with a single participant, meaning that it is not possible to have Sequence Flows traverse different Pools (the only action allowed between different Pools is exchanging messages).

2.1.6. BPMN Diagram Evaluation

When analyzing BPMN diagrams, a way to determine whether they are correct according to various criteria is required. One example of metrics to evaluate BPMN diagrams comes from Dumas et al. [14], who define three main evaluation criteria, reported below.
Syntactic Quality relates to how closely a BPMN diagram follows the syntactical rules and guidelines defined by the process modeling language. This is assessed via a process called Verification, in which a diagram’s structural correctness (which types of elements are used and how they are connected) and behavioral correctness (the possible sequences of execution of the process) are both evaluated to ensure every part of the process follows the modeling language’s rules.
Semantic Quality concerns the ability of a BPMN diagram to make true statements about the domain it relates to. While there is no defined set of rules for checking semantic quality, a process named Validation can be performed, during which the diagram is compared with the real-world process it is trying to model, to assess both whether what is stated in the process model is coherent with real life and whether all the relevant parts of the real process are present in the model.
Pragmatic Quality relates to the goal of having a process model characterized by good usability. This usability is measured by the Certification process, in which a diagram is evaluated on how easy it is to understand, how easy it is to apply changes to it and how well it reveals how the process it models works in real life.
Section 3.2.2 explains in detail how BIPMIN’s evaluation engine works, with the caveat that the engine mostly checks a diagram’s syntactic quality, as the other two qualities are more subjective and comprise elements that cannot be evaluated as easily as a model’s conformance to modeling rules.

2.2. Gamification

Gamification is commonly defined as “the use of game design elements in non-game contexts” [10,15,16,17], as proposed by Deterding [18]. It involves the application of game principles and mechanics to enhance the user experience of tools designed for non-game purposes.
The main goal of gamification is to increase users’ motivation to interact with a system or use a given process [19,20]. It does so by addressing the three basic human needs from self-determination theory: the need for competence, the need for autonomy and the need for relatedness [20,21,22].
Gamification learns from the experiences and research developments of the game industry, identifying and extracting those features of games which motivate players and make the experience enjoyable and applying them to different industries and environments [16]. The game industry, over the years, has refined techniques for optimizing human motivation and engagement [19], resulting in a set of game elements tailored to human motivation. For example:
  • The need for competence has been addressed by providing the player with opportunities to earn points, awards and badges and compare their competence with other players through leaderboards [20];
  • The need for autonomy has been addressed by providing players with a choice of avatars, diverging story paths and choice of ways to play [23];
  • The need for relatedness has been addressed with meaningful storylines centered around the player, cooperative gameplay (with real or simulated players) [20] and through game-related discussion forums [23].
Over the last decade, these game elements have been applied to non-game contexts with increasing frequency.
Games have been around for centuries; however, gamification as a concept was only recently formally established, with the term’s first documented use in 2008 [18]. Gamification initially emerged from the marketing and digital media industry, then experienced widespread adoption in mid-2010 [15,18]. The idea that gamification could be used to improve the motivation of users was widely accepted [20]. However, conclusive evidence to support this hypothesis was lacking [16,21]. There are many examples of frameworks for gamified systems: the framework named Octalysis, theorized by Chou [24], was chosen as the basis for defining the gamified elements used in this study.
Octalysis defines eight main points, named Core Drives (see Figure 5), to evaluate and define a gamified system:
1. Epic Meaning and Calling: the feeling users have when they believe that they are doing something greater than themselves, or that they have been chosen for something. This drive is usually leveraged when users dedicate their time to creating something new for their community.
2. Accomplishment: the internal drive of making progress, developing skills and eventually overcoming challenges. It is considered a necessary drive, since rewards without a challenge behind them are viewed as meaningless, and it is also the easiest drive to design for.
3. Empowerment of Creativity and Feedback: the feeling of being part of a creative process, which encourages users to repeatedly figure things out and try new combinations of actions. It is important to allow users to express their creativity, show the results of that creativity, receive feedback and then respond in turn.
4. Ownership and Possession: this drive works by making users feel like they own something, innately guiding them toward actions that improve what they already own and that make them own new things. It leads users to spend more time customizing their avatars in gamified systems, as well as to hoard virtual currencies.
5. Social Influence and Relatedness: this drive incorporates the social elements that motivate people, such as acceptance, companionship, competition and envy. Seeing friends being particularly skilled at some activity or in possession of some rare or valuable items can be an excellent motivator to try to reach the same heights.
6. Scarcity and Impatience: the feeling of wanting something because it is not possible to have it yet; delaying the unlocking of something motivates people to think about it for a long time.
7. Unpredictability and Curiosity: the drive of wanting to find out what happens next, as the feeling of the unknown engages the brain and makes people think about the gamified system.
8. Loss and Avoidance: a Core Drive based on the avoidance of something negative happening. It is particularly strong with limited opportunities that can fade away, as it drives people to act immediately to avoid losing something forever.

2.2.1. Game Elements

There is no strict definition of what constitutes a game element compared to merely an element of digital applications, nor a strict boundary between what constitutes a game principle compared to a game element [18,20]. For this study, game elements are considered to be those components of games which can be extracted and applied to applications in non-game contexts and which are noticeably present in mainstream games, such as the existence of a point system or a reward scheme.
Many studies have characterized game elements in different ways, with various lists and descriptions of the elements [16,17,20,25]. An analysis of these studies reveals that a large number of game elements can prove to be effective and beneficial. The game elements presented below are a subset of the elements commonly seen in studies on gamification and its effectiveness: they can be grouped under the categories identified by Olgun et al. [26], are generally believed to be the most effective at increasing the motivation and interest of users of gamified systems, and were thus the ones considered for this study.
Points are a basic element of many games. They are, in essence, a numerical counter that grows as players complete tasks. Points can come in many formats, e.g., experience points (XP), skill points or reputation, and serve as an indicator of player progress through the game [20].
Badges, achievements and medals are different terms for a similar concept: a reward for achieving a specific goal. They are a visible indicator of the players’ accomplishments, commonly represented with a specific picture or icon relevant to the accomplished task. Badges are usually optional but encourage players to perform additional tasks or experiment with different ways of interacting with the system [17,20]. Badges can also be a form of social motivation by creating the essence of belonging to an exclusive club, especially if the badges are challenging to earn [20].
Prizes are items that users receive for achieving a task or goal. These items can be tangible, such as extra marks in a graded course for the student at the top of the leaderboard, or intangible, such as in-game items or currency. In this study, gamification was assessed as a tool for motivating user behaviors; therefore, only intangible prizes were considered (tangible prizes are considered a motivational tool separate from gamification).
Progress bars are one method of displaying user progress through a gamified system. Similar to those seen when loading a web page, progress bars usually take the shape of horizontal rectangles filled to a certain point to indicate how much of a task a user has completed.
Levels are attached to the user’s profile or avatar and are a method of indicating the user’s experience with using a system. Commonly, there are a fixed number of levels available, which could be either numbered sequentially or labeled according to experience, for example, from novice to master. As the user completes tasks, their experience improves, which is indicated by a transition to the next level. Levels are generally tied to experience points (XP), but any counter can be used to indicate progress to the next level.
Challenges are complex tasks that require persistence, dedication and the display of skill to overcome. In gamification, challenges can be used as a method for testing participants’ skills, providing them with an arena for proving their competence in a given activity. Essentially, challenges in gamified applications consist of completing a larger task than would be required during regular use of the application.
Leaderboards compare the performance of users against others. They are generally shown as a rankings table, using performance metrics (such as points or badges earned) to sort users from highest to lowest rank. Leaderboards leverage the competitive nature of humans, displaying users’ performance compared to their peers to motivate them to improve their status [20].
Competition and Cooperation involves users working directly either against one another or with each other to achieve a shared task. Separately from leaderboards, the Competition game element refers to direct competition with a peer in a one-versus-one or group-versus-group scenario. Cooperation refers to users working with others to achieve a common goal. Cooperation can either happen within members of an explicitly defined team, or as part of a fluid collective.
Quests are tasks to be completed by the user. They are similar to the challenges game element mentioned previously. However, quests expand on these by providing a narrative aspect to the challenge. These narrative aspects provide a fictional reason for users to complete the tasks. They can be either independent stories or part of the overarching narrative for a gamified system.
Avatars are visual representations of users displayed within a gamified system [20]. They can either be selected by the user from a set list of avatar choices or created and designed by the user. Avatars allow users to express their personality and character separate from their physical appearance. In this way, they can provide a means of social interaction with a potentially different experience compared to the real world.
Easter Eggs are hidden surprises that can be discovered by users of a system. They are generally camouflaged and require users to search deeply within a system to uncover them. Easter eggs are used to encourage exploration and reward dedicated users.
Aesthetics are related to the look and feel of a user interface. This includes, but is not limited to, the color scheme, layout of components, font choices and animations. The aesthetics of a user interface are of major importance when it comes to the effective use of gamification concepts [20].
Feedback covers a broad design aspect, which is particularly important when used in an educational environment. Feedback in a gamified computer application includes interactive tutorials, error warnings, correction suggestions and notifications for the correct completion of tasks. These types of immediate feedback, which can be provided by a computer application to many students at once, as opposed to students waiting their turn for individual feedback from a teacher, can enable an accelerated learning environment.

2.2.2. Gamification in Education

Gamification has been increasingly used in the education sector as a tool to motivate student learning [21]. In the classroom context, gamification has proved to be an effective tool for lowering the learning curve of complex topics [3,27], for increasing student motivation to complete tasks [6] and for countering boredom and feelings of loneliness among users of online learning platforms [10]. It is seen as a solution for meeting the needs of next-generation students [15].
Gamification and gamified learning is not to be confused with game-based learning. Game-based learning involves the specific development of a game that is designed for educational purposes, i.e., has the goal of teaching students about a particular subject [15]. Gamified learning, on the other hand, consists of making use of some game-like elements to enhance existing learning processes [21]; for example, an online learning tool with game-based elements rather than an actual game that aims to teach a specific subject.
Methods for applying gamification techniques to education practices are still being researched and developed. The process for their application can be time-consuming for educators, and their effectiveness varies, with some gamification studies showing a detrimental effect on learning outcomes [15]. The challenge of designing an effective gamified learning system is its complexity: it requires a deep understanding of motivational mechanisms, which are often not well understood or implemented [28]. In response to the need for a more robust gamification design process, and to ease the burden on educators, several gamification frameworks have been proposed. Dicheva et al. [28] developed OneUp, a customizable gamified platform for the definition of exercises by instructors, with a distinction between warm-up and serious exercises, allowing students to train and obtain rewards with easy questions before trying the graded, serious exercises; the OneUp tool supports, in addition to static exercises such as multiple choice questions, true/false statements and matching, dynamic exercises which do not have a fixed solution but are checked for correctness via a program supplied by the instructor. While results from the study described it as an effective platform for increasing student motivation, its focus on theoretical concepts made it incompatible with the BIPMIN tool’s development. A framework for gamification of Software Engineering courses has been proposed by Uskov and Sekar [29]: this framework performs a thorough mapping between the various categories of gamification techniques (identified by the authors as feedback, progress and behavior) and the actions to be performed during the entire course (assignments, deadlines, teamwork, course status, extra points for harder assignments and course grades, to name a few). An experiment performed by the authors during a Software Engineering course revealed a strong appreciation of the different gamified techniques by the students, and the authors recommend implementing these techniques in Computer Science courses. Another example of a framework used for gamifying a Software Engineering course comes from Dubois and Tamburrelli [30]: the study focuses on programming best practices, integrating the students’ development environment with the Sonar [31] platform, which enables students to view the quality scores of their developed code. The experiment performed by the authors made use of a competition mechanism for their gamified strategy, dividing the students into a group that could only see their own scores and a group that had access to a leaderboard ranking all the students in the group. Results of the experiment show that a few of the measured metrics had higher averages in the group with competition mechanics, leading the authors to theorize that gamification with competition may lead to students improving more than gamification without it. The latter two frameworks show that gamification can benefit student participation and motivation when applied to classroom environments; however, these studies focus on topics that cannot be directly applied to the BIPMIN tool, which is more effective as a simple exercise platform.

2.2.3. Gamification in Computer Engineering

The use of gamification in education is especially suited to material that is difficult, tedious or requires intensive collaboration [16]. Applying gamification techniques to repetitive or monotonous tasks can help make these tasks engaging to students [17,32]. For this reason, gamification has been considered particularly suitable for the education of Computer Engineering subjects [15]. Computer Engineering requires students to learn a number of different programming languages. For many students, learning computer programming is a difficult task [3,33]; moreover, they also lack motivation and interest in the subject due to its monotonous nature [4,32].
The utilization of gamification techniques in Computer Engineering education is still in its infancy [15]. However, over the last decade, studies have emerged regarding the effectiveness of gamification in the field, particularly for learning computer programming. For example, Marín et al. [3] applied gamification to the teaching of C programming at university and found that it had a statistically significant positive effect on student learning performance. Prabawa et al. [4] created a gamified media application to support the learning of basic programming concepts, finding that students were more engaged in the learning process and showed a better understanding of the concepts. Tasadduq et al. [7] explored the effects of gamification on students with a rote learning background and found no significant effect of gamification on students’ motivation, but noted that the students in the gamified track performed significantly better in the assessments.
Many literature reviews and mappings have been performed on the applications of gamification to Computer Engineering [15,34,35,36,37,38], exploring the methods and strategies adopted to find common elements and effective strategies. These reviews make it apparent that the application of gamification to Computer Engineering, and to Software Engineering more specifically, is becoming a widespread activity.
When considering Software Engineering and its various disciplines, gamified strategies are adopted at many different steps, such as: testing (e.g., Bell et al. present testing as quests to complete [39]; Fraser et al. present Code Defenders, a competitive game where one player plays as the hacker and the other has to find defects through tests [40]; Fulcini and Ardito [41] present a gamified plugin for Scout [42], a tool for automated exploratory testing of graphical user interfaces); requirements definition (e.g., Snipes et al. present a gamified environment where requirements elicitation is enhanced with voting, rankings and leaderboards [43]); and development (e.g., graphical achievements provided by the Visual Studio IDE). This growing trend of adoption is promising and encouraging, but the reviews also note that there is still a lack of strong empirical evidence for the benefits of gamification, meaning that further research is still required.
Finally, the literature has also provided frameworks to enable the gamification of Computer Engineering education activities. An example is the Framework for Gamified Programming Education (FGPE) [44]: this framework provides the specifications for the gamification scheme and the exercise definition, a collection of gamified exercises covering different popular programming languages, software for editing the exercises and an interactive learning environment for the students.

2.3. Related Work

There have been a few examples of tools and platforms developed to assist the teaching of modeling languages with the use of gamification. The examples in this subsection differ from those discussed previously in that they all share the underlying goal of teaching modeling languages: they are considered related work because they are studies about teaching the same specific field of Computer Engineering as BIPMIN. Many examples of tools that perform similar activities to BIPMIN are plugins developed for Papyrus (https://www.eclipse.org/papyrus/, accessed on 28 September 2022), a modeling tool that belongs to the Eclipse platform. One example of a plugin appears in the work of Cosentino et al. [45], where the authors define a gamified system for teaching the Unified Modeling Language (UML). The system is divided into three separate, interconnected metamodels:
  • Game. A metamodel composed of levels with increasing difficulty, with each level being formed by groups that represent specific domain areas (e.g., different kinds of UML diagrams). Groups have achievements associated with them, and achievements are earned after successfully completing a series of predefined tasks.
  • Project. A metamodel used to represent the project structure of the tool, listing all the various projects, folders and files.
  • Status. The final metamodel differs from the other two as it is not instantiated at design time, but it is only used during runtime to keep track of all game processes during a specific game instance. The structure of a Status metamodel mirrors that of a Game one.
The work is particularly notable because of its focus on aspects that are often overlooked in gamified tools, e.g., cheating prevention (game status is encrypted to avoid modifications) and user privacy (users can decide whether progress in the game can be collected by developers or not). Cosentino et al. [45] also mention in their study that their plugin can support other modeling languages such as SQL or the Entity-Relation model since these languages are present in the Papyrus tool.
Another study that improves the Papyrus tool with a gamified plugin has been conducted by Bucchiarone et al. [27]. The plugin, named Papygame, presents a dedicated game User Interface where users can keep track of their history, achievements and progress. The authors also present a separate gamification engine used for defining rules and keeping track of the player status, and a Game Master module used to set up the game, apply its mechanisms and rules and convey information about actions made in the game to the gamification engine. Papygame makes use of three different game mechanisms to gamify the learning experience. The first is based on classic game mechanisms such as the Hangman game. The second is based on obtaining experience points (XP) and gold coins, with each completed level unlocking the next exercise and a reward of XP and gold coins. Students can spend ten gold coins to remove one item to be completed in their assignment, and the accumulated XP can be used to buy additional gold coins. The third mechanism consists of a virtual teacher who explains the rules and exercises and congratulates students on their performance. In practice, students are encouraged by rewards (coins and experience points) and feedback, with the Hangman figure receiving new parts for each error acting as a negative source of motivation, since failing pushes students to improve in future attempts.
There are also studies that move away from pre-existing tools such as Papyrus and develop their own: such is the case of BPMS-Game (Mancebo et al. [46]), a tool that introduces badges, leaderboards and achievements as mechanisms to increase attention to the rules for defining business process models. BPMS-Game focuses mainly on the sustainability aspect of business models, an aspect that the authors view as often neglected and that they argue should be a main focus for everyone defining business process models. The tool works by allowing administrators to define sustainability rules that employees have to respect when defining their models, with correctly following the rules translating into new awards and badges. The authors claim that the gamified elements in their tool should be a significant asset in improving the quality of, and attention to, the sustainability of business modeling practices.
A different example of a tool used to facilitate the teaching of BPMN comes from the study of Kutun and Schmidt [47]: this study differentiates itself from the ones cited before as it consists of a game-based learning strategy rather than a gamified software tool. The study implements its teaching strategy via a board game composed of a BPMN wheel and a notation elements wheel. By spinning the first wheel, players (who are divided into teams that face one another in a match) can earn cards with theoretical details, or cards with questions that, if answered correctly, reward them with coins, the right to spin the second wheel or the right to model a process with their teammates. Spinning the second wheel can reward players with various possible components of a BPMN diagram (activities, events, gateways, pools). The game’s goal consists of modeling the described business process by obtaining the necessary elements with the two wheels, with the winning team being the one able to model the process with the fewest errors within the given time slot (seventy minutes). The evaluation of a modeling test conducted by the authors before and after an experiment with the game revealed an improvement in model quality for all the teams involved.
After analyzing these examples of pre-existing tools that aim to teach BPMN or other modeling languages with either gamification or game-based approaches, it appears that there are no examples of a gamified tool for teaching BPMN. The example that comes closest is BPMS-Game [46], which, however, focuses more on improving modeling practice with the goal of more sustainable models. To the best of our knowledge, no gamified tool in the state of the art is focused on teaching modeling to beginners and students.

3. The Proposed Framework

In this section, the framework proposed for the gamification of BPMN modeling learning is described. The respective subsections describe: (i) the game elements that were selected and the motivation behind the choice; (ii) the implementation of the tool and the software components that were developed.

3.1. Selected Game Elements

The goal of this study was to analyze the applicability of gamification as a strategy for the education of BPMN, with the secondary goal of analyzing the usability and motivational impact of different game elements. To achieve these goals, three different versions of the gamified web application were developed, each utilizing different gamification elements. The gamification elements were grouped according to motivational themes. The three versions were related to progress, competition and rewards, and were identified accordingly. These themes were chosen to better understand the importance and effectiveness of the different motivational concepts represented in the different game elements. Referring back to what was discussed in Section 2, Table 1 details all the game elements that are present in the BIPMIN tool, together with the corresponding Core Drive from the Octalysis framework, as well as a brief explanation of the reasoning for choosing the various elements.
The choice of game elements came after an analysis of the systematic literature review by Olgun et al. [26], which shows that the most common game elements appearing in studies about gamification and Software Engineering are rankings, badges, levels, quests and awards. The three different versions of the BIPMIN tool were developed starting from these elements: rankings inspired the competition version with its leaderboards, the concept of levels was used for the progress version, and awards were used as the base for the rewards version.
The three different versions of the BIPMIN tool are presented as follows.
  • Progress focuses on game elements that display the users’ progression through a set of tasks. In this study, progress bars were adopted, such as the one shown in Figure 6. Progress can also refer to an increasing level of competence for the user; in games, this competence is often represented by a skill level. A skill level was therefore also included in this version of the web application, with users able to progress through four different skill levels: from “Noob” to “Padawan” to “Genius” and finally to “Grandmaster” (a minimal sketch of such a level progression is given after this list). The labels of these skill levels were chosen with reference to popular culture in an effort to be more appealing to students of Computer Engineering.
  • Competition focuses on game elements that compare users to one another. This version of the web application made use of a leaderboard to display and rank the progress of all users of the application, encouraging users to seek out the top position. In conjunction with the leaderboard, it was also necessary to implement a point-based system, with users earning experience points upon completion of exercises. These experience points (XP) were then used as the metric for ranking users on the leaderboard. User avatars were also included in this version, providing users with a simple way to project their personality into the system, which was also shared with their peers in the leaderboard display. Users were provided with the option of selecting from a set of 12 different avatars, each of a caricatured animal. An example of these three game elements in use is shown in Figure 7.
  • Rewards focuses on awarding the user prizes for completing tasks. In this version of the web application, rewards are intangible and are represented by pieces of a jigsaw puzzle. Upon completion of exercises, users are rewarded with a number of jigsaw pieces proportional to the difficulty of the exercise. These jigsaw pieces come together to form a hidden image, with users only able to view the portions of the image shown on the pieces they have collected. This version also implements unlockables, another game element related to rewards: subsequent exercises are locked and cannot be attempted until a sufficient number of previous exercises have been successfully completed.
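As an illustration of how such progress mechanics could be driven by completed exercises, the short sketch below maps exercise counts to the four skill levels of the progress version; the thresholds and function names are assumptions of this example, not BIPMIN’s actual values or code.

// Minimal sketch of a level progression such as the one in the progress
// version; the thresholds are illustrative assumptions.
const LEVELS = [
  { name: "Noob", minCompleted: 0 },
  { name: "Padawan", minCompleted: 5 },
  { name: "Genius", minCompleted: 12 },
  { name: "Grandmaster", minCompleted: 20 },
];

function levelFor(completedExercises) {
  // Return the highest level whose threshold has been reached.
  return LEVELS.filter((l) => completedExercises >= l.minCompleted).pop().name;
}

console.log(levelFor(7)); // "Padawan"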

Shared Elements

Shared game elements spanning all three versions of the application include the aesthetic design and the use of immediate feedback. The aesthetic design of the application endeavored to be clear and pleasant to view, following the principles of user interface design proposed by Benyon [48]. Users should be able to quickly understand the layout of the application and easily navigate and interact with its components. Visual components were chosen based on their ease of use and consistency with expected norms. A color scheme using lime green as the primary color was chosen for the application: green was selected because it is a color often used to represent correctness and success, helping users feel more successful. Gold was chosen as the color representing rewards because of its association with treasure.
Immediate feedback was identified as a game element that would be beneficial for inclusion in a gamified education tool. With immediate feedback, users can quickly learn from their mistakes without having to wait for a traditional review by a teacher. Immediate feedback was implemented in the BIPMIN application through a “check solution” button, which, when pressed, analyzes the current diagram and displays any errors to the user. If no errors are found, the user is notified of the successful completion of the exercise.
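To illustrate how such immediate feedback can be wired together, a minimal sketch follows; the identifiers checkButton, evaluateExercise, showFeedback and currentExercise are hypothetical stand-ins for BIPMIN’s actual code, and modeler is assumed to be the embedded bpmn.js modeler described in Section 3.2.

// Hypothetical wiring of the "check solution" button.
checkButton.addEventListener("click", () => {
  // Query the current diagram's components via the bpmn.js element registry.
  const elementRegistry = modeler.get("elementRegistry");
  const errors = evaluateExercise(elementRegistry, currentExercise.rules);
  if (errors.length === 0) {
    showFeedback("Exercise completed successfully!");
  } else {
    showFeedback(errors); // list the unsatisfied criteria
  }
});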
Following the Octalysis guidelines and patterns defined by Chou [24], the tool and its gamified elements were also analyzed to obtain a visual representation of their strengths and weaknesses. Note that we used the Octalysis visualization tool after the gamification mechanics were selected, and in a quantitative way, i.e., for each mechanic implemented in the tool we assigned a point to the related core drive. The Octalysis tool is thus used only as a visualization of the implemented mechanics and not as a qualitative, subjective assessment and scoring of the eight drives as they are perceived by the final users of the gamified tool.
The resulting graph, shown in Figure 8, displays an imbalance, with a greater focus on the Accomplishment and Ownership Core Drives, while others such as Scarcity, Avoidance and Epic Meaning appear neglected. This imbalance will require future changes and improvements aimed at increasing the effects of the neglected core drives. Some examples of how the tool could be improved include:
  • Implementing a “story” as a background for the tool and linking progress with the exercises to progress in the story, giving players increased purpose.
  • Adding a mechanism for keeping track of the errors made by students, with a loss of points assigned to exercises in case of too many errors. The presence of negative consequences to avoid could prove to be an effective mechanic in increasing student interest.
The chart also shows, however, that the different versions of the tool, with their various gamified mechanics, positively influence many of the Core Drives defined by the framework, meaning that the existing mechanics are a good starting point for building a complete and effective tool for teaching BPMN modeling with the assistance of gamification.
Figure 8. Octalysis graph of BIPMIN’s gamified elements.

3.2. Implementation

A number of different software tools were utilized to develop the web application. bpmn.js (https://bpmn.io/toolkit/bpmn-js/, accessed on 28 September 2022) was chosen as the BPMN modeling tool. It provides the functionality to interactively create BPMN diagrams and can be embedded into existing web applications. bpmn.js is written in JavaScript and provides a version for embedding that allows access to the individual components of its library. As part of the core services, bpmn.js provides access to its ElementRegistry, which lists all of the components existing in the currently displayed diagram. The ElementRegistry also includes a number of APIs to retrieve elements based on different criteria, as well as providing access to their properties. This functionality was used in the BIPMIN application to create custom BPMN diagram evaluation rules, explained further in Section 3.2.2.
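For illustration, the following minimal sketch shows how the ElementRegistry can be queried from an embedded modeler; the calls reflect the public bpmn-js API, while the variable names are our own.

// Retrieve the ElementRegistry service from a bpmn-js modeler instance.
const elementRegistry = modeler.get("elementRegistry");

// All elements currently present in the displayed diagram.
const allElements = elementRegistry.getAll();

// Only the user tasks, selected by their bpmn.js type.
const userTasks = elementRegistry.filter((e) => e.type === "bpmn:UserTask");

console.log("User tasks in diagram:", userTasks.length);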
Being an open-source project, bpmn.js, a part of bpmn.io, has an online community [49] where developers share examples of their usage of bpmn.js and propose extensions to the library. One extension available in the community is the bpmn-js-bpmnlint extension [50], which adds linting functionality to the BPMN modeler. This linting functionality validates the currently displayed diagram against a set of standard BPMN diagram rules and displays errors on the diagram when any rules are broken. An example of the bpmn-js-bpmnlint extension incorporated into bpmn.js is shown in Figure 9. The bpmn-js-bpmnlint extension was also incorporated into the BIPMIN web application, where it was used to provide students with feedback on their diagrams and encourage good modeling practices.
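A minimal sketch of how the modeler and the linting extension can be combined, following the bpmn-js-bpmnlint documentation, is shown below; the container selector and the packed lint configuration path are assumptions of this example.

// Embed a bpmn-js modeler with the bpmn-js-bpmnlint extension enabled.
import BpmnModeler from "bpmn-js/lib/Modeler";
import lintModule from "bpmn-js-bpmnlint";
// Packed bpmnlint configuration (e.g., generated from a .bpmnlintrc file).
import bpmnlintConfig from "./packed-config";

const modeler = new BpmnModeler({
  container: "#canvas",            // DOM element hosting the diagram
  additionalModules: [lintModule],
  linting: { bpmnlint: bpmnlintConfig }
});

// Rule violations can then be toggled on the canvas via the linting service.
modeler.get("linting").toggle();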

3.2.1. Exercises

A set of exercises related to the creation of BPMN diagrams needed to be developed for use in testing the BIPMIN web application. A study of relevant literature included recommendations that exercises in gamified systems be designed with increasing difficulty levels and adapted to the skills of the students [5]. The difficulty should also be designed considering the time required to complete the exercises: if they take too long, students may not have sufficient time to complete them; on the other hand, if they are too short, students may find the exercises too trivial and meaningless [51]. For this study, the exercises were designed to be short, with experienced users able to complete each exercise in under 5 min. In this way, the prototype could be more efficiently evaluated, focusing on the motivational impact of the gamification components, with testers able to complete multiple exercises during the testing period. The exercises were also designed with increasing difficulty levels, introducing new BPMN concepts as users progressed through the system.

3.2.2. Evaluation Engine

BPMN diagrams have traditionally been evaluated manually by teachers of the Information Systems course at the Politecnico di Torino. This assessment process is time-consuming, and students receive feedback about their diagrams in a delayed manner. A goal in the development of the BIPMIN web application was to implement an immediate feedback mechanism to allow students to improve their skills more efficiently and increase engagement. This required the development of a BPMN diagram evaluation engine to allow the web application to programmatically assess the BPMN diagrams. This tool would not only assess the diagrams against the standard rule set (as implemented using the bpmn-js-bpmnlint extension) but also against a set of definable assessment criteria provided by the teacher for each exercise. These additional rules define the success criteria for completing the exercise. Upon evaluation of an exercise, the application will then provide students with feedback on criteria that are not satisfied, allowing them to review their solution and correct any mistakes.
A specific grammar was developed for the creation of the assessment criteria, which is then used by the web application to appropriately assess the diagrams. Together with the grammar, a list of six different criteria accepted by the application was also defined. This criteria list is not exhaustive but was developed to provide a breadth of assessment criteria sufficient to guide students to the right solution and improve their BPMN diagram creation practices; further programmatic assessment criteria could be developed in the future to improve the capabilities of the BIPMIN tool. When using the API to define new exercises, the rules must be defined using the appropriate grammar for the intended rule. The API accepts objects in the JSON (JavaScript Object Notation) format. Components listed in the grammar rules must be labeled according to their specific bpmn.js type, as used in the element registry of the BPMN Modeler.
The six criteria used by the evaluation engine, together with the corresponding grammar definition for each, are listed below (a combined example follows the list):
1. Specific BPMN components must be present in the diagram.
  • This rule defines the number of components that must be present in the completed diagram. For example:
                {"StartEvent": 1, "Task": 2, "EndEvent": 1}
    requires the diagram to contain exactly one starting event, two tasks and one ending event.
2. Connections are required between components of a specified type.
  • This rule defines which connections between components are required to be present in the completed diagram. The connection is defined by specifying a target of a given component type. For example:
                {"Target_ExclusiveGateway" : "Task"}
    requires an exclusive gateway to be connected to a task. These components are evaluated in order, meaning that in this example, the task must follow the exclusive gateway in the flow sequence.
3. A component with a specific type definition must be present.
  • This rule defines which type definitions must exist for specific components in the diagram. For example:
                {"Definition_EndEvent" : "TerminateEventDefinition"}
    requires an end event to be of type terminate.
4. A specific message flow from one element to another.
  • This rule defines the message flow connections required in diagrams with multiple pools. For example:
                {"MessageFlow_Task" : "StartEvent"}
    requires a message flow connection from a task in one pool to a start event in another pool.
5. Number of outgoing connections on a specific component type.
  • This rule defines the number of subsequent sequence flows coming from a component in the diagram. For example:
                {"Outgoing_ExclusiveGateway" : 2}
    means that an exclusive gateway must have exactly two outgoing connections to other components.
6. Number of incoming connections on a specific component type.
  • This rule defines the number of prior sequence flows coming into a component in the diagram. For example:
                {"Incoming_ExclusiveGateway" : 2}
    means that an exclusive gateway must have exactly two incoming connections from other components.
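To make the grammar concrete, a hypothetical exercise rule set combining several of the criteria above is sketched below; the grouping of the individual rule objects into one array is an assumption of this example, since only the individual rule formats are defined above.

// Hypothetical rules for a simple exercise: one start event, two tasks, an
// exclusive gateway with two outgoing branches and terminate end events.
// The surrounding array structure is illustrative, not BIPMIN's schema.
const exerciseRules = [
  { "StartEvent": 1, "Task": 2, "ExclusiveGateway": 1, "EndEvent": 2 },
  { "Target_ExclusiveGateway": "Task" },
  { "Definition_EndEvent": "TerminateEventDefinition" },
  { "Outgoing_ExclusiveGateway": 2 }
];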
The assessment criteria defined using the proposed grammar were then used by the evaluation engine of the BIPMIN web application. This engine works by loading the element registry provided by bpmn.js, which lists all the components present in the current diagram along with their properties. For this reason, the specific bpmn.js component types must be specified so that the evaluation engine can compare the assessment criteria against the components listed in the registry. For each criterion, the evaluation engine filters the registry for the relevant components and checks that the criterion is satisfied. For example, criterion five checks that at least one of the components has the correct number of outgoing connections. A snippet of the code implementing the check of criterion five is shown in Listing 1:
Listing 1. Code for checking criteria five.
```javascript
// Filter the element registry for all components of the required type
// (e.g., "bpmn:ExclusiveGateway").
const componentNode = elementRegistry.filter(
  (e) => e.type === "bpmn:" + component
);
// The criterion is satisfied if at least one matching component
// has the required number of outgoing sequence flows.
for (let comp of componentNode) {
  if (comp.outgoing.length === numberOut) {
    satisfied = true;
  }
}
```
Upon evaluation of an exercise, the application will then provide students with feedback on criteria that are not satisfied, allowing them to review their solution and correct any mistakes.
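The shape of this feedback loop can be sketched as follows. This is a minimal illustration, not the actual BIPMIN engine: the evaluateExercise function, the criteria object layout, and the message strings are assumptions; only the elementRegistry.filter call is the real bpmn.js API used in Listing 1.

```javascript
// Minimal sketch of aggregating per-criterion checks into user feedback.
// Only criterion one (component counts) is shown; the other five criteria
// would be checked by analogous loops.
function evaluateExercise(criteria, elementRegistry) {
  const feedback = [];
  for (const [type, expected] of Object.entries(criteria.components || {})) {
    const found = elementRegistry.filter((e) => e.type === "bpmn:" + type);
    if (found.length !== expected) {
      feedback.push(`Expected ${expected} ${type} component(s), found ${found.length}.`);
    }
  }
  // ...checks for connections, definitions, message flows, etc. go here...
  return { correct: feedback.length === 0, feedback };
}
```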
The evaluation engine was developed as a demonstration of the potential for automatic assessment of BPMN diagrams. It does not cover all errors that students could make while creating diagrams or interpreting the exercise description. BPMN diagrams are also inherently subjective, with multiple correct solutions being possible depending on the description of the task. In the BIPMIN application, the exercises were designed to make clear which components are required in the diagram, specifically listing each component to help students create diagrams that satisfy the evaluation criteria. The evaluation engine is, therefore, useful in simple exercises for teaching the basic concepts of BPMN modeling, but it does not scale well to complex solutions. In addition, the evaluation engine could not easily identify specific nodes of a diagram, because component IDs could not be guaranteed; it could only check for the existence of node types and their connections. It was, therefore, difficult to check the specific ordering of components in a diagram containing multiple nodes of the same type, a constraint that could lead to incorrect diagrams being assessed as correct. Further development of the evaluation engine's capabilities and the addition of more detailed criteria definitions are recommended before the application is deployed in a classroom environment.
Another limitation of the evaluation engine relates to the categories of quality defined by Dumas et al. [14]: from a development and engineering point of view, it is not trivial to implement checks on semantic and pragmatic quality, since these depend on parameters such as the coherence between a process diagram and the real world or the readability of the diagram. The criteria defined by the evaluation engine should be considered a starting set of rules, intended to be expanded in future studies aiming to develop extensive criteria for evaluating the syntactic quality of BPMN diagrams.

3.2.3. User Interface

The final design of the BIPMIN application was generated using components from the React-Bootstrap (https://react-bootstrap.github.io/, accessed on 28 September 2022) library to provide visual elements consistent with modern material design.
An example of the main page of the BIPMIN application is shown in Figure 10, which displays the progress version of the application. The main part of this page is allocated to the BPMN modeler, with the title of the currently selected exercise displayed at the top and buttons below the modeler to check the solution of the current exercise or navigate to the next or previous exercise. On the left is a side panel, which allows users to select directly which exercise they wish to complete and also shows the exercise instructions. This panel also includes progress bars indicating how many exercises the user has completed for each of the parts, as well as the total progress on all exercises displayed at the bottom. At the top of this side panel, the user’s current level is displayed, with guidance on how to progress to the next level.
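For illustration, the progress panel described above could be composed from React-Bootstrap primitives roughly as follows. ProgressBar and Badge are real React-Bootstrap components, but the surrounding structure, prop names, and labels are assumptions rather than the actual BIPMIN source.

```jsx
import ProgressBar from "react-bootstrap/ProgressBar";
import Badge from "react-bootstrap/Badge";

// Hypothetical side panel: current level on top, overall progress below.
function ExerciseSidePanel({ level, completed, total }) {
  const percent = Math.round((completed / total) * 100);
  return (
    <aside>
      <Badge bg="primary">Level {level}</Badge>
      <ProgressBar now={percent} label={`${completed}/${total}`} />
    </aside>
  );
}

export default ExerciseSidePanel;
```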
All three versions of the BIPMIN application share a main screen similar to that of Figure 10; however, each version has its own variant of the side panel. In the competition version, the side panel includes two tabs: one showing the list of available exercises and the other showing the leaderboard (Figure 11). The exercise list is similar to that of the progress version, but it also displays the number of experience points the user will earn upon completing each exercise. The leaderboard tab lists the rank of all users based on the experience points they have collected. At the top of the side panel, users can see their total experience points and their chosen avatar.
The rewards version side panel also includes two tabs: one showing the exercise list, similar to the competition version, and the other showing the user's reward collection (Figure 12). The exercise list in this version displays the number of jigsaw pieces the user will be rewarded with upon completing each exercise. This version also includes unlockables: subsequent parts in the exercise list remain locked until the previous exercises are completed. Locked parts are displayed with a padlock icon, and users cannot view their exercise descriptions until they are unlocked. The rewards tab lists the number of pieces the user has collected so far and shows the corresponding portion of the hidden jigsaw image.

4. Preliminary Evaluation

The evaluation of the BIPMIN tool consisted of two parts: one part assessed the usability of the tool, and another part evaluated the effectiveness of the different game elements represented in the three different versions of the application in motivating the participants. The assessment of both parts was conducted during a single trial program.

4.1. Design

The design of the trial program followed the Goal Question Metric (GQM) template [52] by determining first the goal of the trial, then the associated research questions to be answered by the trial, and finally the metrics with which to measure the outcomes of the trial and answer the corresponding research questions.

4.1.1. Goal

The trial program conducted for this study was a preliminary evaluation of the usability of the tool and of the enjoyment and motivation of participants to utilize the proposed game elements. The GQM template, shown in Table 2, was used to formulate the goal. The goal can be expressed as follows:
Analyze the gamified BPMN modeling application for the purpose of evaluating and improving it, with respect to its effectiveness at encouraging student interaction, from the point of view of developers, researchers, and teachers, in the context of an educational environment.
Gathering preliminary results on the usability and interest of the participants toward the various game elements will help inform future gamification studies on the importance of each of the assessed game elements, helping to focus the design of future systems on game elements and combinations that are more crucial to designing successful applications.
Table 2. GQM Template.

| GQM Dimension | Value |
|---|---|
| Object of study | Gamified BPMN modeling application |
| Purpose | Evaluate and improve |
| Focus | Effectiveness at encouraging student interaction |
| Context | Educational environment |
| Stakeholders | Developers, researchers, teachers |
The evaluation also included an analysis of the usability of the BIPMIN application, to enable improvements to the tool before a subsequent larger experiment. This larger experiment is planned to be conducted during the teaching of the Information Systems course at the Politecnico di Torino: the preliminary evaluation will be used to refine the tool so that it is ready for deployment in the course experiment, planned for the next semester.
This trial program also provided a proof of concept for the utility of the technology in Computer Engineering educational environments.

4.1.2. Research Questions and Metrics

Following the GQM technique, a set of research questions was developed to represent the intent of the evaluation. The trial was designed to answer the following three research questions:
  • RQ1: Is the system effective in motivating students to perform modeling tasks?
  • RQ2: What is the usability of the system?
  • RQ3: Which game elements are preferred by the users of the system?
The objective of RQ1 is to assess whether the tool enables reasonable performance in the tasks executed by the participants, and whether the adopted mechanics motivate participants to check their solutions' correctness and to perform additional tasks. Therefore, measurements were taken to determine whether students could:
  • Successfully log in to the application;
  • Easily navigate to their desired location;
  • Understand how to complete exercises within the application;
  • Understand the feedback provided by the application to easily recover from any errors; and
  • Interact positively with the game elements implemented.
To assess the performance and motivation provided by the BIPMIN application, we also measured the time participants took to complete tasks relating to the navigation of the web application. Each task listed in Table 3 (several of which were included to answer this research question) was assigned success criteria along with corresponding metrics to measure success; the success criteria for each task are listed in Table 4. The exercises designed for the tool are shared among the different tool versions, but their order differs between versions (e.g., the first exercise of the rewards version appears as the fourth exercise of the competition version); since the tasks required participants to perform the first exercise of each version, participants experienced different exercises in different tasks. To evaluate comprehension of the exercise error feedback, a metric was also included that counted the number of times a participant checked their solution while completing exercise-related tasks (this check also displayed the list of errors to the user). The time thresholds for successful task completion were set under the assumption that each task (with its corresponding sub-tasks) could be completed in the specified time (e.g., five minutes was deemed an acceptable time for completing task T3.1). These thresholds were hypothesized from the expert judgement and experience of the authors in previous editions of the course, where students solved equivalent exercises.
The objective of RQ2 is to assess the usability of the BIPMIN system. To that end, we asked participants to complete the System Usability Scale (SUS) survey [53] upon completion of the trial. The SUS is a common tool for assessing the usability of software applications. It consists of 10 standard statements about the user's experience with the application; users rate their agreement with each statement on a Likert scale from 1 (strongly disagree) to 5 (strongly agree). The phrasing of the statements alternates between positive and negative to counteract the bias introduced if a user simply ticked "strongly agree" for every statement without reading the survey.
The results from the SUS are combined to calculate a score ranging from 0 to 100. A score above 68 is considered above average. The score is calculated using the following equation:
\[ \mathrm{Score} = 2.5 \left( \sum_{i \in \mathrm{odd}} (Q_i - 1) + \sum_{i \in \mathrm{even}} (5 - Q_i) \right) \tag{1} \]
where $Q_i$ is the participant's response to statement $i$, and the sums run over the five odd-numbered and five even-numbered statements, respectively.
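As a sanity check on the scoring rule, a short sketch of the computation is shown below; the susScore function name and the example responses are illustrative, not part of the SUS instrument itself.

```javascript
// Compute a SUS score from ten Likert responses (1-5), ordered as
// statements 1..10. Odd-numbered statements (index 0, 2, ...) are
// positively phrased; even-numbered ones are negatively phrased.
function susScore(responses) {
  if (responses.length !== 10) throw new Error("SUS requires 10 responses");
  let sum = 0;
  responses.forEach((value, index) => {
    sum += index % 2 === 0 ? value - 1 : 5 - value;
  });
  return 2.5 * sum;
}

// A participant answering 4 to every positive statement and 2 to every
// negative one scores 2.5 * ((4 - 1) * 5 + (5 - 2) * 5) = 75.
console.log(susScore([4, 2, 4, 2, 4, 2, 4, 2, 4, 2])); // 75
```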
RQ3 relates instead to the secondary goal of this study: analyzing the preferences of the end users of gamified systems for specific game mechanics.
To answer RQ3, three different versions of BIPMIN were developed, each including different game elements but sharing similar aesthetics. The game elements studied and their use in the corresponding versions of the application are listed in Table 5.
Two sets of metrics were created to answer RQ3. The first related to directly observing student behaviour when faced with a choice between the different game elements. As part of the trial, participants were asked to complete one exercise in each of the three versions and afterwards to complete one exercise in the version of their choice (T6). This task was included to clearly show student preference for particular game elements, which might differ from their responses to a related survey question. A final optional task (T7) was also provided, encouraging participants to complete further exercises in their chosen or other versions. The results from the final task were used to assess whether the game elements provided enough motivation for participants to complete optional exercises.
The second set of metrics involved a post-trial questionnaire, which included questions on participant preferences between the three versions and between the eight game elements studied. The questions used a ranking system: participants were asked to rank the versions and game elements in terms of enjoyment and preference of gamified mechanics. These results serve as a preliminary investigation of which gamified mechanics users prefer and which may be worth deeper investigation in future experiments.

4.1.3. Participants

Participants in the trial program were recruited through convenience sampling from the cohort of Computer Engineering students at Politecnico di Torino. The participants were selected as representative master's-level students who would learn or use BPMN as part of their studies. Because this was a preliminary trial program, only a small number of participants was arranged; a larger experiment with many more participants is expected to be conducted during the Information Systems course in the next semester. A total of 12 volunteers participated in the trial.

4.1.4. Procedure

The trial required participants to complete a number of tasks using a web-based prototype of the BIPMIN application. The full list of tasks is shown in Table 3. The first task was to complete a tutorial on the use of the BPMN Modeler bpmn.io, with participants accessing the public website of bpmn.io [54]. This task represented the non-gamified version of the tool, reflecting how students would have created BPMN diagrams before the introduction of BIPMIN, and served as a baseline for comparison with the gamified versions.
Subsequent tasks involved the participant completing one exercise from each of the three versions of the BIPMIN application. During these exercises, participants were asked to identify and interact with the various gamification elements in order to familiarize themselves with the differences between the three versions. Once all three exercises were complete, the participant was asked to complete one further exercise from a version of their choice. Finally, participants were encouraged to complete further exercises from any version if desired, and then asked to fill out two post-test questionnaires.

4.1.5. Methods

A within-subjects experimental technique was employed for this trial program. In this way, each participant performed the trial for each version of the application. This method was preferred over a between-subjects technique (where each participant performs the trial on only one version of the application) due to the small number of participants available for the trial program. This technique will be reviewed for future experiments involving a larger number of participants.
To counter the order biases inherent in the within-subjects technique, a counter-balancing approach was adopted: the order of tasks was varied so that participants experienced the different versions of the application in different orders. In this way, any efficiencies learned by participants while completing earlier tasks in the trial were balanced across the three versions. The order was determined using a balanced Latin square [55], specifically to prescribe the order of tasks T3, T4, and T5 (as listed in Table 3), which corresponded to completing exercises in the progress, competition, and rewards versions, respectively. With three versions of the application to be tested, the balanced Latin square yielded 6 different task orders, listed in Table 6. With 12 volunteers participating in the trial program, each order was used exactly twice.
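For reference, the construction behind the cited generator [55] can be sketched in a few lines of JavaScript. This is an adaptation of Bradley's classic counterbalancing construction rather than BIPMIN code; for three conditions, iterating the participant index from 0 to 5 reproduces the six orders of Table 6.

```javascript
// Balanced Latin square row for a given participant index.
function balancedLatinSquareRow(conditions, participantId) {
  const n = conditions.length;
  let result = [];
  for (let i = 0, j = 0, h = 0; i < n; ++i) {
    // Alternate between the next unused low index and the next high index.
    const val = i < 2 || i % 2 !== 0 ? j++ : n - ++h;
    result.push(conditions[(val + participantId) % n]);
  }
  // With an odd number of conditions, odd-indexed participants get the
  // reversed row, so that first-order carryover effects stay balanced.
  if (n % 2 !== 0 && participantId % 2 !== 0) result = result.reverse();
  return result;
}

// Indices 0..5 yield the six orders of tasks T3, T4, T5 used in the trial.
for (let p = 0; p < 6; p++) {
  console.log(balancedLatinSquareRow(["T3", "T4", "T5"], p).join(" "));
}
```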

4.2. Results

The results of the evaluation trial program are reported herein, including the recorded metrics for the three research questions and general observations about the trial and the utility of the BIPMIN application.

4.2.1. Motivation (RQ1)

RQ1 relates to the motivation induced by BIPMIN in the participants of the experiment, and to the performance they obtained when using BIPMIN within the context of the trial.
The task metrics included the time it took participants to complete each task during the trial. The average time for the 12 participants to complete each task is given in Table 7, alongside the success criteria for completing the task. In all instances, the average time taken by participants was less than the specified success criterion, indicating that participants were able to use and understand the application successfully.
Tasks T3.3, T4.4 and T5.3 all relate to returning to the main screen. Since the order of these tasks varied with the participant task order (as defined in Table 6), they were combined, and the reported value represents the time taken to return to the main screen the first time the participant was asked to do so. In the subsequent related tasks, the participants already knew how to return to the main screen and generally took only 1–2 s to complete the task. Although the average for this task is well within the success criterion, there were three instances during the trial where the facilitator had to assist participants in returning to the main screen. This was not considered a significant issue, because the main screen was implemented for trial purposes only and is not planned to be present in the final application. Nonetheless, it was suggested to include a home icon in the navigation bar beside the logo to indicate more clearly that it acts as a button for returning to the main screen.
During the trial program there were, however, 13 instances where participants took longer than 5 min to complete an exercise-related task (T3.1, T4.1, T5.1 or T6). Six of these instances occurred during the two trials conducted online. The longer times are attributed to the lag experienced during the online administration of the experiment, since participants accessed the web prototype through remote desktop applications. Five of the instances also occurred during task T6 which, because exercises were designed with incremental difficulty, was always slightly more complicated than the exercises previously experienced. In the in-person trials, all times remained below 8 min per exercise.
A second task metric measured the number of times a participant clicked the "check solution" button when completing an exercise task. This metric was used to determine the usefulness of the evaluation engine and the users' ability to recover from errors in their submitted diagrams. The results are given in Table 8. Most participants checked their solutions between 1 and 3 times before the application considered the solution correct. In a few instances participants had to check more often, but never more than 6 times. This low number, paired with observations of the participants while they interacted with the evaluation engine, supports its usability for interactive BPMN exercises. The answer to RQ1 is reported in Box 1.
Box 1. Answer to RQ1.
Answer to RQ1: The task performance measurements indicate that the introduction of gamification has no negative effect on BPMN modeling activities. The gamified environment motivated and encouraged participants to check their solutions and results frequently.

4.2.2. Usability (RQ2)

Once the trial tasks were completed, participants were asked to fill out the SUS survey. The score for each participant was calculated using Equation (1), and the final score was calculated as the average across all participant responses. The results from the SUS survey are listed in Table 9. The average score for the BIPMIN application was 85.8, well above 68, the value considered the average usability of software applications. This indicates that the BIPMIN application exhibits good usability characteristics, scoring well above the average computed by Sauro [56] in the most extensive study of the SUS, which covers five hundred different evaluations made with the scale. For additional perspective, a study by Pekpazar et al. [57] had a sample of 222 participants rate four of the most commonly used mobile applications (YouTube, Facebook, WhatsApp, and a Mail app) with the SUS survey; the results show average scores of 84.02, 73.19, 87.32 and 76.72, respectively. The score obtained by the BIPMIN tool is comparable to those of the four applications, but the small participant sample must be considered: a large-scale experiment with more participants may yield a different score.
Concerning the usability and design of the BIPMIN application, many participants commented during the trial on how pleasant they found the design, how intuitive and easy to use it was, and how happy they would be to keep interacting with the application as a learning tool for creating BPMN diagrams. These results show that the tool can be easily understood and picked up by students for the purpose of learning modeling. It is worth noting that high usability does not necessarily translate into high effectiveness and efficiency in learning activities. However, high usability is still a valuable result, being a necessary prerequisite for the adoption and utilization of any effective gamified learning tool. Conclusions related to RQ2 are presented in Box 2.
Box 2. Answer to RQ2.
Answer to RQ2: The gamified system obtained a score of 85.8 on the System Usability Scale, above the average usability of software applications (68).

4.2.3. Preferred Gamification Elements (RQ3)

The questionnaire questions related to RQ3 aimed to identify the gamification elements preferred by the users in the experimental sample. To answer RQ3, we directly observed the participants' choices during the trial and analyzed their responses to a post-trial questionnaire.
During the trial, participants were given a task that required them to complete one extra exercise from the version of their choice (task T6). Figure 13 shows which version participants chose for this task. The rewards version of the application was overwhelmingly the most chosen, selected by 8 of the 12 participants; the progress and competition versions were each chosen by two participants. This is a particularly interesting result, because the rewards version did not contain any of the key game elements popularly associated with gamification, namely the PBL triad of points, badges and leaderboards. It instead focused on providing rewards for tasks and showing a collection of pieces to unlock. This version also provided an element of mystery, with users encouraged to discover the hidden image.
Upon trial completion, participants were also requested to complete a post-trial questionnaire. This questionnaire aimed to discover the effect of the different elements on participants’ feelings of enjoyment and motivation and also determine their preferences.
The questionnaire asked participants to rank the three versions in terms of their enjoyment of the presented mechanics, from the most enjoyed (awarded 2 points) to the least enjoyed (0 points), with the middle-ranked version awarded 1 point. The chart in Figure 14 reports the total enjoyment scores for the different versions of the gamified system. The results highlight that the rewards version was the most enjoyed by the participants. It is worth highlighting, however, that the limited number of participants and the limited length of the trials were a possible source of bias in the responses, since the competitive mechanics of the tool could not be fully appreciated.
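The aggregation of these rankings into scores is straightforward; the sketch below illustrates it on made-up rankings (the arrays are hypothetical, not the actual trial data).

```javascript
// Convert per-respondent rankings (most to least enjoyed) into total
// enjoyment scores: 2 points for first place, 1 for second, 0 for third.
const rankings = [
  ["rewards", "progress", "competition"], // respondent 1 (hypothetical)
  ["rewards", "competition", "progress"], // respondent 2 (hypothetical)
];
const scores = {};
for (const ranking of rankings) {
  ranking.forEach((version, position) => {
    scores[version] = (scores[version] || 0) + (2 - position);
  });
}
console.log(scores); // { rewards: 4, progress: 1, competition: 1 }
```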
Participants were also asked to rank the game elements according to which they enjoyed using the most. According to the rankings provided by the respondents, the game elements were given a score ranging from 1 (the least enjoyed) to 8 (the most enjoyed). Figure 15 reports the scores obtained by all the game elements over the whole set of respondents. The most enjoyable elements were found to be, in order: rewards, avatars, aesthetics and leaderboards.
A final question allowed participants to enter free text comments providing any feedback or suggestions relating to the BIPMIN application. The responses to this question included comments about how participants found the features in the tool useful and stimulating. They appreciated the visuals and novelty of the rewards collection provided in the rewards version. A participant also acknowledged that they were not competitive and did not care about the leaderboard but appreciated that other people might find it more motivating.
Several observations regarding the use of the different game elements were noted during the trial program. The competition version of the application was not assessed by the participants to be as enjoyable as the rewards version. However, participants acknowledged that the competition version might be more appealing when deployed in a classroom environment, where users compete against known colleagues. One participant admitted that they wanted to compete with their colleagues and envisaged that, if they used the application while attending the Information Systems course, they would use the competition version the most (even though they chose the rewards version in the experiment). The lower performance of the competition version could thus be due to the evaluation being conducted in an independent trial environment; this possibility warrants further investigation and should be considered in future experiments.
General comments from participants during their use of the tool also indicated that they were pleased with how the game elements had been incorporated into the application, enjoying the look and feel of the elements. Participants were particularly pleased with the choice of avatars and the display of the rewards collection. The final answer for RQ3 is reported in Box 3.
Box 3. Answer to RQ3.
Answer to RQ3: The gamification elements introduced in the tool were considered enjoyable by the participants, with a preference for rewards, avatars, aesthetics and leaderboards.

4.2.4. General Observations

Separate from the two formulated research questions, this study also aimed to design a tool that effectively improved students’ understanding of BPMN modeling. Observations relating to this goal were made during the evaluation session to assess the participants’ development of their BPMN modeling skills during the trial. These observations are summarised below:
  • The majority of participants improved their component labelling practices during their use of the tool. After a first error notification about a component missing a label, they remembered to label all their components in subsequent exercises. Some participants even labelled more components than the linting module deemed necessary. Other participants already had good labelling practices and never experienced errors relating to missing labels.
  • During the trial, some participants initially used the wrong type of component for an exercise. Upon being notified and correcting the error, these participants learnt to consider the types of components before submitting their solutions in subsequent exercises.
  • Generally, participants were observed to learn from previous errors and did not repeat mistakes.
These observations suggest that the BIPMIN application positively influenced the trial participants' BPMN modeling practices. Further research, planned for the following semester, is however required to quantitatively demonstrate its effectiveness.

4.3. Recommendations

This section provides recommendations for the future development of the BIPMIN application based on the outcomes of the evaluation.
This study analysed the effectiveness of eight different game elements in motivating students to learn BPMN modeling. With the plan to merge the three versions of the BIPMIN application into a single version to be used in future courses, the following recommendations are made regarding the choice and implementation of the gamification elements:
1. Results from this study found the most enjoyable game elements to be rewards, levels, progress bars, aesthetics, avatars and leaderboards. We therefore plan to keep these elements in the final version of the application. In particular, the rewards element was deemed both the most motivational and the most enjoyable by the participants in this study's evaluation.
2. Aesthetics was also found to be a key attribute for motivation and enjoyment. The general aesthetics of BIPMIN were found pleasing by participants of the trial program and should be retained in future versions of the application; any proposed changes to the aesthetic design should be carefully considered.
3. This study only considered eight game elements, but many more could be incorporated into the application. Of the elements analyzed in Section 2, the following could be suitably incorporated into BIPMIN:
   • Storytelling
   • Badges
   • Quests
   Future studies could measure which elements users interact with the most, to assess the effectiveness of the different elements in a combined application.
4. Observations from the trial program indicated that the competition version might perform better in a classroom environment. It is therefore recommended to keep the game elements from the competition version in the final version of the application and to design an experiment to further test their motivational potential.

5. Threats to Validity

The potential threats to the study validity are discussed according to the four categories reported by Wohlin et al. [58].
Threats to Internal Validity concern factors that may affect the results and were not considered in the study. In this study, internal validity threats mostly relate to the design of the gamified tool and, more specifically, to the selection of the gamified mechanics to implement. The selection was based on the potential benefits and drawbacks discussed in the related literature; however, it is not certain that they constitute the optimal set of mechanics.
Similarly, in the discussed evaluation, the BIPMIN mechanics were separated into different subsets. We did not verify whether this grouping influenced the results of the evaluation or whether better groupings were available.
Finally, there may also be concerns regarding the implementation of the evaluation engine, as it currently checks only whether a diagram respects syntactic quality, omitting evaluations of pragmatic and semantic quality. This limitation, together with the fact that the evaluation criteria defined by the engine are limited and do not cover every possible aspect of BPMN modeling, may affect the inferred results.
Threats to Conclusion Validity are factors that can lead the researcher to reach an incorrect conclusion about a relationship in their observations.
The evaluation of the tool described in this paper serves as a preliminary assessment of the tool's usability and of the feasibility of applying it to BPMN education in a learning context. The collected results might therefore be contradicted by more thorough future empirical evaluations of the tool.
Regarding the evaluation of the progress version, the short experimental phase, which involved a small set of tasks performed by each participant, limits the value of the gathered results. To better evaluate the progress features, experiments with a longer time span, including multiple tasks performed by the same participants, will be set up as future work.
Moreover, the results concerning the competition version are to be considered partial and preliminary, as the number of participants in the evaluation experiment is too small to evaluate competition mechanisms. The time constraints of the experiment may likewise have affected the participants' evaluation of competition mechanisms, since leaderboard and point mechanisms can hardly be appreciated over a short period.
Finally, no evaluation could be performed on the tool's effectiveness in teaching core BPMN concepts, since the tool has not yet been employed in a full course where the participants' competence in modeling BPMN diagrams is assessed before and after the use of the gamified learning tool.
Threats to Construct Validity concern the relationship between theory and observation. In this study, there was no direct mapping between the metrics measurable in the evaluation phase (i.e., enjoyment, time to complete tasks, preferred versions of the tool) and the effectiveness of the BIPMIN software as a learning tool. Longitudinal experiments will be necessary to seek possible correlations between the use of the tool and students' grades, in order to evaluate the tool's effectiveness as a learning instrument, as reported in the Future Work section.
Threats to External Validity concern whether the results can be generalized. This study reported a preliminary validation of a tool for a gamified educational context, with the objective of teaching process modeling with BPMN. The preliminary results cannot be generalized to other process modeling methodologies or to general educational settings. The selection of the gamification mechanics can also influence the results of evaluations performed on a gamified tool; therefore, the results of this preliminary validation cannot be compared with those of tools that employ different gamification mechanics and dimensions.

6. Conclusions and Future Work

This study applied gamification principles to Computer Engineering education by designing a gamified tool for the teaching of BPMN practices. The effectiveness of different game elements was then analyzed to determine which elements were the most important for improving student motivation and engagement with the tool and, by extension, the subject.
A prototype web application was developed named BIPMIN, which implemented three different designs, each incorporating different game elements relating to either progress, competition or rewards. The web application was designed following Software Engineering design principles. Additionally, for this study, an evaluation engine was developed to enable the automatic assessment of BPMN diagrams, and its suitability was analyzed.
A trial program was then conducted to evaluate the tool usability and the effectiveness of the various game elements in motivating the students. The program involved 12 volunteer participants from the cohort of Computer Engineering students, who were asked to complete several tasks using the web application. Participants’ choices and performance were recorded during the trial and used to assess the tool usability and effectiveness at increasing motivation.
The tool's effectiveness at improving students' understanding of BPMN modeling was also assessed. During the trial program, participants were observed to improve their component labelling practices, to consider the types of components in their diagrams, and to learn from previous errors when responding to notifications from the evaluation engine. These observations suggest that the BIPMIN application improved the trial participants' BPMN modeling practices, demonstrating the potential of gamification and its components in motivating students to engage with a BPMN teaching tool.
The application performed well on usability, scoring an average of 85.8 on the System Usability Scale. Participants were generally pleased with the aesthetics, layout and intuitiveness of the user interface. Despite this good usability performance, several recommendations were proposed to improve usability further, including providing users with additional feedback on errors found in their submitted exercise solutions and adapting the terminology used so that it is more broadly understandable by students. It was also suggested to implement additional reference material within the web application, including a tutorial introducing the BPMN components and their usage.
Of the three designs analyzed, the rewards version was found to be the most enjoyable for students, performing strongly on all criteria compared with the progress- and competition-based designs. However, the feedback received during the trial suggests that the competition version could perform better in a classroom environment, where students could compete directly against their classmates. Further analysis of these elements is therefore recommended to better understand the cohort's behaviour in a learning environment.
Given the limited number of participants who performed the evaluation of the gamified BPMN modeling learning platform, this study can serve only as a preliminary evaluation of the tool, mostly in terms of its usability and of the possibility of applying it in an educational context. As an immediate extension of the current work, we plan to apply the BIPMIN tool to all the practice classes of a BPMN modeling course, in order to collect a statistically significant sample of answers. A longer experiment with a higher number of participants will allow measurement of the quality of implementation and the benefits provided by specific characteristics of the tool, namely the progress mechanism (which requires multiple tasks to be executed) and the competition mechanisms (which require a high number of participants to be appreciated). Moreover, using the gamified learning tool in the context of a master's degree course will allow a quantitative evaluation of the effectiveness of gamified BPMN modeling education, by comparing students' performance in tasks (and possibly the related course grades) before and after the application of gamification elements and dynamics.

Author Contributions

Conceptualization, L.A. and M.M.; software, K.B.; writing—original draft preparation, K.B.; writing—review and editing, G.G. and R.C.; visualization, K.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. LinkedIn. Most In-Demand Jobs and Industries in Europe & Middle East and Latin America August 2020. Available online: https://business.linkedin.com/talent-solutions/recruiting-tips/thinkinsights-emea/most-in-demand-jobs-and-industries-in-europe-middle-east-and-latin-america/ (accessed on 1 August 2022).
2. Groh, F. Gamification: State of the art definition and utilization. Inst. Media Inform. Ulm Univ. 2012, 39, 31.
3. Marín, B.; Frez, J.; Cruz-Lemus, J.; Genero, M. An Empirical Investigation on the Benefits of Gamification in Programming Courses. ACM Trans. Comput. Educ. 2018, 19, 4.
4. Prabawa, H.W.; Sutarno, H.; Kusnendar, J.; Rahmah, F. Learning basic programming using CLIS through gamification. J. Phys. Conf. Ser. 2018, 1013, 12099.
5. Rojas-López, A.; Rincón-Flores, E.G. Gamification as Learning Scenario in Programming Course of Higher Education. In Proceedings of the International Conference on Learning and Collaboration Technologies: Learning and Teaching, Las Vegas, NV, USA, 15–20 July 2018; Volume 10925, pp. 200–210.
6. Ayub, M.; Toba, H.; Wijanto, M.C.; Yong, S.; Wijaya, B. Gamification for blended learning in higher education. World Trans. Eng. Technol. Educ. 2019, 17, 76–81.
7. Tasadduq, M.; Khan, M.S.; Nawab, R.M.A.; Jamal, M.H.; Chaudhry, M.T. Exploring the effects of gamification on students with rote learning background while learning computer programming. Comput. Appl. Eng. Educ. 2021, 29, 1871–1891.
8. Fraser, G. Gamification of Software Testing. In Proceedings of the 12th International Workshop of Automation of Software Testing, IEEE/ACM, Buenos Aires, Argentina, 20–21 May 2017; pp. 2–7.
9. Rojas, J.M.; White, T.D.; Clegg, B.S.; Fraser, G. Code Defenders: Crowdsourcing Effective Tests and Subtle Mutants with a Mutation Testing Game. In Proceedings of the IEEE/ACM 39th International Conference on Software Engineering, Buenos Aires, Argentina, 20–28 May 2017; pp. 677–688.
10. Olsson, M.; Mozelius, P.; Collin, J. Visualisation and Gamification of e-Learning and Programming Education. Electron. J. e-Learn. 2015, 13, 441–454.
11. Matsubara, P.; da Silva, C. Game elements in a software engineering study group: A case study. In Proceedings of the 39th International Conference on Software Engineering: Software Engineering Education and Training Track (ICSE-SEET), IEEE/ACM, Buenos Aires, Argentina, 20–28 May 2017; pp. 160–169.
12. Hanus, M.D.; Fox, J. Assessing the effects of gamification in the classroom: A longitudinal study on intrinsic motivation, social comparison, satisfaction, effort, and academic performance. Comput. Educ. 2015, 80, 152–161.
13. OMG. Business Process Model and Notation (BPMN), Version 2.0; Object Management Group: Needham, MA, USA, 2011.
14. Dumas, M.; Rosa, M.L.; Mendling, J.; Reijers, H.A. Fundamentals of Business Process Management, 2nd ed.; Springer: Berlin/Heidelberg, Germany, 2018.
15. Alhammad, M.M.; Moreno, A.M. Gamification in software engineering: A systematic mapping. J. Syst. Softw. 2018, 141, 131–150.
16. Pedreira, O.; García, F.; Brisaboa, N.; Piattini, M. Gamification in software engineering—A systematic mapping. Inf. Softw. Technol. 2015, 57, 157–168.
17. Basten, D. Gamification. IEEE Softw. 2017, 35, 76–81.
18. Deterding, S.; Dixon, D.; Khaled, R.; Nacke, L. From Game Design Elements to Gamefulness: Defining "Gamification". In Proceedings of the 15th International Academic MindTrek Conference, MindTrek'11, Tampere, Finland, 28–30 September 2011; pp. 9–15.
19. Chou, Y.K. The Octalysis Framework for Gamification & Behavioural Design. Available online: https://yukaichou.com/gamification-examples/octalysis-complete-gamification-framework/ (accessed on 1 August 2022).
20. Sailer, M.; Hense, J.U.; Mayr, S.K.; Mandl, H. How gamification motivates: An experimental study of the effects of specific game design elements on psychological need satisfaction. Comput. Hum. Behav. 2017, 69, 371–380.
21. Sailer, M.; Homner, L. The Gamification of Learning: A Meta-analysis. Educ. Psychol. Rev. 2020, 32, 77–112.
22. Shi, L.; Cristea, A.I. Motivational Gamification Strategies Rooted in Self-Determination Theory for Social Adaptive E-Learning. Intell. Tutoring Syst. 2016, 9684, 294–300.
23. Wee, S.C.; Choong, W.W. Gamification: Predicting the effectiveness of variety game design elements to intrinsically motivate users' energy conservation behaviour. J. Environ. Manag. 2019, 233, 97–106.
24. Chou, Y. Actionable Gamification: Beyond Points, Badges, and Leaderboards; Createspace Independent Publishing Platform: Scotts Valley, CA, USA, 2015.
25. Antonaci, A.; Klemke, R.; Stracke, C.M.; Specht, M. Towards Implementing Gamification in MOOCs. In Proceedings of the International Conference on Games and Learning Alliance: GALA 2017, Lisbon, Portugal, 5–7 December 2017; Volume 10653, pp. 115–125.
26. Olgun, S.; Yilmaz, M.; Clarke, P.; O'Connor, R. A Systematic Investigation into the Use of Game Elements in the Context of Software Business Landscapes: A Systematic Literature Review. In Proceedings of the 17th International Conference on Software Process Improvement and Capability Determination, Palma de Mallorca, Spain, 4–5 October 2017; pp. 384–398.
27. Bucchiarone, A.; Savary-Leblanc, M.; Pallec, X.L.; Bruel, J.M.; Cicchetti, A.; Cabot, J.; Gerard, S.; Aslam, H.; Marconi, A.; Perillo, M. Papyrus for gamers, let's play modeling. In Proceedings of the ACM/IEEE 23rd International Conference on Model Driven Engineering Languages and Systems, MODELS '20 Companion, Virtual Event, 16–23 October 2020; p. 21.
28. Dicheva, D.; Irwin, K.; Dichev, C. Exploring Learners Experience of Gamified Practicing: For Learning or for Fun? Int. J. Serious Games 2019, 6, 5–21.
29. Uskov, V.; Sekar, B. Gamification of Software Engineering Curriculum. In Proceedings of the 2014 IEEE Frontiers in Education Conference Proceedings, Madrid, Spain, 22–25 October 2014; pp. 1–8.
30. Dubois, D.J.; Tamburrelli, G. Understanding Gamification Mechanisms for Software Development. In Proceedings of the 2013 9th Joint Meeting on Foundations of Software Engineering, ACM, Saint Petersburg, Russia, 18–26 August 2013; pp. 659–662.
31. Sonarsource. Clean Code | Developer First | Sonar. Available online: https://www.sonarsource.com/ (accessed on 10 September 2022).
32. Hidayat, W.N.; Fitranti, A.; Firdaus, A.F.; Kartikasari, C.D.I.; Sutikno, T.A. Gamification based mobile application as learning media innovation for basic programming lessons. In IOP Conference Series: Materials Science and Engineering, Proceedings of the 1st Annual Technology, Applied Science and Engineering Conference, East Java, Indonesia, 29–30 August 2019; IOP Publishing: Bristol, UK, 2020; Volume 732, p. 12113.
33. Maiga, J.; Emanuel, A.W.R. Gamification for Teaching and Learning Java Programming for Beginner Students—A Review. J. Comput. 2019, 14, 590–595.
34. Porto, D.; Jesus, G.; Ferrari, F.; Fabbri, S. Initiatives and Challenges of Using Gamification in Software Engineering: A Systematic Mapping. arXiv 2020, arXiv:2011.07115.
35. Barreto, C.; França, C. Gamification in Software Engineering: A Literature Review. In Proceedings of the 2021 IEEE/ACM 13th International Workshop on Cooperative and Human Aspects of Software Engineering (CHASE), Madrid, Spain, 20–21 May 2021; pp. 105–108.
36. Cursino, R.; Ferreira, D.; Lencastre, M.; Fagundes, R.; Pimentel, J. Gamification in Requirements Engineering: A Systematic Review. In Proceedings of the 2018 11th International Conference on the Quality of Information and Communications Technology (QUATIC), Coimbra, Portugal, 4–7 September 2018; pp. 119–125.
37. Mäntylä, M.; Smolander, K. Gamification of Software Testing—An MLR. In Proceedings of the 17th International Conference on Product-Focused Software Process Improvement, Trondheim, Norway, 22–24 November 2016; pp. 611–614.
38. Vargas-Enriquez, J.; Garcia-Mundo, L.; Genero, M.; Piattini, M. A Systematic Mapping Study on Gamified Software Quality. In Proceedings of the 2015 7th International Conference on Games and Virtual Worlds for Serious Applications (VS-Games), Skovde, Sweden, 16–18 September 2015; pp. 1–8.
39. Bell, J.; Sheth, S.; Kaiser, G. Secret ninja testing with HALO software engineering. In Proceedings of the 4th International Workshop on Social Software Engineering, Szeged, Hungary, 5 September 2011; pp. 43–47.
40. Fraser, G.; Gambi, A.; Kreis, M.; Rojas, J.M. Gamifying a software testing course with code defenders. In Proceedings of the 50th ACM Technical Symposium on Computer Science Education, Minneapolis, MN, USA, 27 February–2 March 2019; pp. 571–577.
41. Fulcini, T.; Ardito, L. Gamified Exploratory GUI Testing of Web Applications: A Preliminary Evaluation. In Proceedings of the 2022 IEEE International Conference on Software Testing, Verification and Validation Workshops (ICSTW), Valencia, Spain, 4–13 April 2022; pp. 215–222.
42. EyeAutomate. EyeScout. Available online: https://eyeautomate.com/eyescout/ (accessed on 10 September 2022).
43. Snipes, W.; Augustine, V.; Nair, A.R.; Murphy-Hill, E. Towards recognizing and rewarding efficient developer work patterns. In Proceedings of the 2013 35th International Conference on Software Engineering (ICSE), San Francisco, CA, USA, 18–26 May 2013; pp. 1277–1280.
44. Swacha, J. Framework for Gamified Programming Education. Available online: https://fgpe.usz.edu.pl/ (accessed on 31 October 2022).
45. Cosentino, V.; Gérard, S.; Cabot, J. A Model-based Approach to Gamify the Learning of Modeling. In Proceedings of the 5th Symposium on Conceptual Modeling Education, Valencia, Spain, 6–9 November 2017.
46. Mancebo, J.; Garcia, F.; Pedreira, O.; Moraga, M. BPMS-Game: Tool for Business Process Gamification. In Proceedings of the International Conference on Business Process Management Forum, Barcelona, Spain, 10–15 September 2017; pp. 127–140.
47. Kutun, B.; Schmidt, W. BPMN Wheel: Board Game for Business Process Modelling. In Proceedings of the European Conference on Games Based Learning, Odense, Denmark, 3–4 October 2019.
48. Benyon, D. Designing Interactive Systems: A Comprehensive Guide to HCI, UX and Interaction Design; Pearson: London, UK, 2014.
49. Bpmn.io. Awesome-bpmn-io. Available online: https://github.com/bpmn-io/awesome-bpmn-io (accessed on 22 March 2022).
50. Philippfromme. bpmn-js-bpmnlint. Available online: https://github.com/bpmn-io/bpmn-js-bpmnlint (accessed on 22 March 2022).
51. Sheth, S.; Bell, J.; Kaiser, G. A competitive-collaborative approach for introducing software engineering in a CS2 class. In Proceedings of the 26th International Conference on Software Engineering Education and Training, IEEE, San Francisco, CA, USA, 19–21 May 2013; pp. 41–50.
52. van Solingen, R.; Basili, V.; Caldiera, G.; Rombach, H.D. Goal Question Metric (GQM) Approach. In Encyclopedia of Software Engineering; Wiley Online Library: Hoboken, NJ, USA, 2002.
53. Lewis, J.R. The System Usability Scale: Past, Present, and Future. Int. J. Hum. Comput. Interact. 2018, 34, 577–590.
54. Camunda. Web-Based Tooling for BPMN, DMN and Forms. Available online: https://bpmn.io/ (accessed on 22 March 2022).
55. Masson, D. Balanced Latin Square Generator. Available online: https://cs.uwaterloo.ca/~dmasson/tools/latin_square/ (accessed on 18 May 2022).
56. Sauro, J. A Practical Guide to the System Usability Scale: Background, Benchmarks & Best Practices; Measuring Usability LLC: Denver, CO, USA, 2011.
57. Pekpazar, A.; Öztürk, R.; Altin Gumussoy, C. Usability Measurement of Mobile Applications with System Usability Scale (SUS). In Selected Papers from the Global Joint Conference on Industrial Engineering and Its Application Areas, GJCIE 2018, Nevsehir, Turkey, 21–22 June 2018; Springer: Cham, Switzerland, 2019; pp. 389–400.
58. Wohlin, C.; Runeson, P.; Höst, M.; Ohlsson, M.C.; Regnell, B.; Wesslén, A. Experimentation in Software Engineering; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2012.
Figure 1. BPMN shapes for task objects.
Figure 2. BPMN shapes for Start, Intermediate and End events.
Figure 3. BPMN shapes for message, timer and error events.
Figure 4. BPMN shapes for gateways.
Figure 5. Octalysis diagram. Source: https://www.researchgate.net/figure/Octalysis-Gamification-framework-Chou-2015_fig2_336854026 (accessed on 6 September 2022).
Figure 6. Example of a progress bar.
Figure 7. Example of a leaderboard using points and avatars.
Figure 9. Bpmnlint incorporated into bpmn.js.
Figure 10. Progress version main page.
Figure 11. Competition version main page—exercise list view and leaderboard tabs. (a) Exercise list. (b) Leaderboard.
Figure 12. Rewards version main page—exercise list view and rewards tabs. (a) Exercise list. (b) Rewards collection.
Figure 13. Version chosen during task T6.
Figure 14. Enjoyment score for each version of BIPMIN.
Figure 15. Enjoyment score for each game element.
Table 1. Game elements present in BIPMIN.

| Element | Core Drive | Motivation |
|---|---|---|
| Progress Bars | Empowerment | Tying visual feedback to skill progression |
| Skill Levels | Accomplishment | Encouraging skill progression by increasing appeal |
| Leaderboards | Social Influence | Implementing competition between users |
| Experience Points | Accomplishment | Gauging user progress, ranking users |
| Avatars | Ownership | Allowing users to customise their experience |
| Prizes | Ownership | Giving users a reason to keep solving new exercises |
| Unlockable Exercises | Unpredictability | Giving users new challenges and new rewards |
Table 3. Participant task list.

| Task | Description |
|---|---|
| T1 | Complete the tutorial. Once you have finished, ask the facilitator to check your solution. |
| T2 | Log in to your account using the credentials provided. |
| T3 | 1. Enter the progress version and complete the first exercise. 2. Note your current skill level. 3. Return to the main screen. |
| T4 | 1. Enter the competition version and complete the first exercise. 2. Note your position on the leaderboard. 3. Choose your avatar. 4. Return to the main screen. |
| T5 | 1. Enter the rewards version and complete the first exercise. 2. Note your reward collection. 3. Return to the main screen. |
| T6 | Complete 1 extra exercise from the version of your choice. |
| T7 | Complete further exercises if desired, with the version of your choice, then fill out the post-experiment surveys. |
Table 4. Usability task success criteria.
Task | Success Criteria | Metrics
T1 | The participant follows the instructions provided and is able to complete the exercise (in 5 min) | Time to complete; number of requests to check the solution
T2 | The participant logs in to the tool successfully (in 30 s) | Time to complete
T3 | 1. The participant completes the first exercise (in 5 min). 2. The participant records their level (in 30 s). 3. The participant returns to the home screen (in 30 s). | Time to complete; number of clicks on the check solution button
T4 | 1. The participant completes the first exercise (in 5 min). 2. The participant records their rank (in 30 s). 3. The participant selects a new avatar (in 2 min). 4. The participant returns to the home screen (in 30 s). | Time to complete; number of clicks on the check solution button
T5 | 1. The participant completes the first exercise (in 5 min). 2. The participant records the number of pieces collected (in 1 min). 3. The participant returns to the home screen (in 30 s). | Time to complete; number of clicks on the check solution button
T6 | The participant completes an exercise (in 5 min) | Time to complete; number of clicks on the check solution button
T7 | N/A
Table 5. Game elements present in each version of BIPMIN.
Version | Game Elements
Progress | Progress bars, Levels
Competition | Leaderboard, Avatars, Points
Rewards | Rewards, Unlockables
All | Aesthetics
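Since each version of BIPMIN exposes a different subset of game elements (plus the shared aesthetics), one natural implementation is a feature-flag map keyed by version. The sketch below is purely illustrative; the identifiers are hypothetical and not taken from the BIPMIN codebase.

```javascript
// Hypothetical feature-flag map for the three BIPMIN versions (Table 5).
// Element names are illustrative placeholders, not the real implementation.
const GAME_ELEMENTS = {
  progress:    ['progressBars', 'levels', 'aesthetics'],
  competition: ['leaderboard', 'avatars', 'points', 'aesthetics'],
  rewards:     ['rewards', 'unlockables', 'aesthetics'],
};

// Returns true if the given element should be rendered in this version.
function isEnabled(version, element) {
  return (GAME_ELEMENTS[version] ?? []).includes(element);
}

console.log(isEnabled('competition', 'leaderboard')); // true
console.log(isEnabled('progress', 'leaderboard'));    // false
```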
Table 6. Task order based on a balanced Latin square.
Participant | First | Second | Third
1 | T3 | T4 | T5
2 | T4 | T5 | T3
3 | T5 | T3 | T4
4 | T3 | T5 | T4
5 | T4 | T3 | T5
6 | T5 | T4 | T3
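The ordering in Table 6 counterbalances learning effects: with three conditions (an odd number), a balanced Latin square needs 2n = 6 orders, so that every condition appears in every position, and every pairwise sequence, equally often. A minimal sketch of the standard construction (after Bradley, 1958) follows; it generates the same six orders as Table 6, although the assignment of orders to participant numbers may differ.

```javascript
// Balanced Latin square construction (after Bradley, 1958). For an odd
// number of conditions, even-indexed participants get a row of the base
// square and odd-indexed participants its reverse, yielding 2n = 6 orders.
function latinSquareRow(conditions, participant) {
  const n = conditions.length;
  const row = [];
  for (let i = 0, j = 0, h = 0; i < n; i++) {
    // Interleave indices from both ends: 1, 2, n, 3, n-1, ...
    const k = (i < 2 || i % 2 !== 0) ? j++ : n - (++h);
    row.push(conditions[(k + participant) % n]);
  }
  if (n % 2 !== 0 && participant % 2 !== 0) row.reverse();
  return row;
}

// The six task orders for participants 0..5 (same set as Table 6):
for (let p = 0; p < 6; p++) {
  console.log(p + 1, latinSquareRow(['T3', 'T4', 'T5'], p).join(' '));
}
```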
Table 7. Average time (m:ss) to complete tasks.
Task | Criterion (max time) | Result (average time)
T1 | 5:00 | 3:33
T2 | 0:30 | 0:14
T3.1 | 5:00 | 2:50
T3.2 | 0:30 | 0:07
T4.1 | 5:00 | 4:32
T4.2 | 0:30 | 0:08
T4.3 | 2:00 | 0:06
T5.1 | 5:00 | 4:58
T5.2 | 1:00 | 0:06
T6 | 5:00 | 4:50
T3.3/4.4/5.3 | 0:30 | 0:14
Table 8. Number of requests to check solution during exercise-related tasks.
Participant | T1 | T3 | T4 | T5 | T6
1 | 2 | 1 | 3 | 4 | 3
2 | 1 | 5 | 3 | 1 | 3
3 | 1 | 2 | 3 | 2 | 1
4 | 1 | 3 | 3 | 3 | 1
5 | 1 | 3 | 2 | 2 | 2
6 | 2 | 1 | 2 | 2 | 1
7 | 1 | 3 | 1 | 1 | 2
8 | 1 | 2 | 2 | 2 | 2
9 | 1 | 1 | 2 | 1 | 1
10 | 1 | 2 | 2 | 1 | 1
11 | 1 | 1 | 2 | 2 | 2
12 | 1 | 4 | 2 | 2 | 6
Table 9. System Usability Scale (SUS) scores.
Participant | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12
Score | 90 | 70 | 90 | 97.5 | 65 | 92.5 | 90 | 85 | 80 | 97.5 | 92.5 | 80
Average | 85.8
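The per-participant values in Table 9 follow the standard SUS scoring procedure (Brooke, 1996): each of the ten five-point items contributes 0 to 4 (odd-numbered items: response minus 1; even-numbered items: 5 minus response), and the total is multiplied by 2.5 to give a 0 to 100 score. A minimal sketch of the formula, with the Table 9 averaging shown for illustration:

```javascript
// Standard SUS scoring (Brooke, 1996): odd-numbered items contribute
// (response - 1), even-numbered items (5 - response); the sum of the ten
// contributions is multiplied by 2.5 to yield a 0-100 score.
function susScore(responses) {
  if (responses.length !== 10) throw new Error('SUS needs 10 responses');
  const sum = responses.reduce(
    (acc, r, i) => acc + (i % 2 === 0 ? r - 1 : 5 - r), // i is 0-based
    0
  );
  return sum * 2.5;
}

// Averaging the per-participant scores reported in Table 9:
const scores = [90, 70, 90, 97.5, 65, 92.5, 90, 85, 80, 97.5, 92.5, 80];
const average = scores.reduce((a, b) => a + b, 0) / scores.length;
console.log(average.toFixed(1)); // "85.8"
```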