Usefulness of Digital Serious Games in Engineering for Diverse Undergraduate Students

The use of educational digital games as supplemental tools to course instruction has increased over the last several decades, and especially since the COVID-19 pandemic. Though these types of instructional games have been employed in the majority of STEM disciplines, less is known about how diverse populations of students interpret and define the value of these games toward achieving academic and professional pursuits. A mixed-method sequential exploratory research design, framed on the Technology Acceptance Model, Game-Based Learning Theory, and Expectancy Value Theory, was used to examine how 201 students perceived the usefulness of an intuitive educational game designed to teach the engineering mechanics used in designing civil structures. We found that students had different expectations of educational digital games than of games designed for entertainment and played outside of classroom environments. Several students thought that the ability to design their own structures and observe structural failure in real time was a valuable asset in understanding how truss structures respond to physical loading conditions. However, fewer students thought the educational game would be useful for exam (14/26) or job interview (19/26) preparation. Students associated more value with engineering games that illustrate course content and the mathematical calculations used in STEM courses than with those that do not include these elements.


Introduction and Project Motivation
The use of online digital technologies increased during the COVID-19 pandemic as educators attempted to rapidly address the need for emergency remote instruction [1][2][3][4][5][6]. Along with this need to transition courses to online formats comes the parallel need for digital tools that can supplement course instruction. Digital learning and grading tools have served as a way for teachers to cover the burgeoning number of required topics related to science [7,8], engineering [9,10], technology [11,12], and math [13,14].
Serious games (a term coined by Clark Abt in the 1970s [15]) are games designed for an explicit and carefully crafted educational purpose that extends beyond amusement or entertainment [16]. Serious games can be either digital or tactile; digital serious games are considered in this project. Digital serious games can be housed in a plethora of online applications and software programs that can be accessed via the internet or downloaded onto a smart mobile device or computer. These types of games are premised on the principles of Constructivist Learning Theory [17,18], which postulates that learners actively cultivate and construct knowledge as they experience the world around them, reflect on their experiences, and subsequently incorporate new information into their pre-existing knowledge base [19]. The mechanisms by which people operate and interact with serious games are described by game-based learning theory.
Game-based learning describes how individuals connect with a game that has defined learning outcomes [20]. It also explains how a balance is struck between the necessity of covering subject matter and the entertainment of gameplay [21]. Of equal importance to game-based learning is the assessment of student outcomes such as academic performance [22][23][24], motivation [25,26], and engagement [27], though these outcomes may be personal, i.e., premised on an individual's preferences or perceptions [28,29].
Much of the scholarly work on serious games in higher education has focused on comparisons between groups of students who have or have not been exposed to a digital game intervention. This aggregation into groups can be problematic, as the goal of serious games is to help individuals meet milestones within a stated goal or timeframe. At the same time, without understanding how individuals attribute usefulness to a game, a true measure of student outcomes remains convoluted.
The purpose of this exploratory study was to better understand the perceived usefulness of an engineering education serious game in higher education. This study was bounded by a specific game, Civil-Build (a pseudonym for the online engineering game used in the study), and was contextualized to an engineering course at a Northeast institution in the United States. We believe our findings have the potential for transferability to other contexts. One unique element of this study is the focus on exploring usefulness from a disciplinary context (e.g., engineering) and with a racially and gender diverse population of students (e.g., women, Black/African American (AA), LatinX, Asian). We anticipate that the usefulness of engineering education serious games used as part of a university-level course can result in two potential outcomes. First, students will experience enhanced course performance when they connect the game tool to the course content. Second, students will assign value (usefulness) to a game that connects to real-world applications and will conclude that the game is useful in their preparation for professional and future academic settings.

Technology Acceptance Model (TAM): A Model to Explore Serious Game Usefulness
The Technology Acceptance Model [30,31] has been extensively used to examine individual learner differences and perceptions as mediated by technology (e.g., information technology, email, software, etc.). The original TAM, introduced in 1989, posits that five constructs are important to understanding how learners interact with technology: Perceived Usefulness, Perceived Ease of Use, Attitude Towards Using, Behavioral Intention, and Actual Use. These concepts were further explained by Davis [32] in 1993, who described Perceived Usefulness as the degree to which a person believes that using a particular system will enhance their job performance, and Perceived Ease of Use as the degree to which a person believes that using a particular system would be free of effort. These two distinct constructs center on people's subjective appraisals of performance and effort, and function as behavioral determinants that shape an individual's Attitude Towards Using a technological system. Venkatesh and Davis [33] argued that the degree to which one believes they will use a system in the future (Behavioral Intention) and Actual Use of a technological system are connected to beliefs about performance that can disagree with objective reality. Davis surmised that user acceptance is undesirable in cases where systems fail to provide true performance gains; gains in performance are uniquely connected to both the perceived and actual usefulness of a given technology. Many scholars have adapted and extended Davis's TAM to include effective teaching, intrinsic motivation [34], incentive and reward strategies [35], and trustworthiness [36] to better understand how other factors may influence the effectiveness of engineering games within a classroom environment.
The handful of researchers who have used TAM to assess engineering education serious games have detailed the meaning that participants attributed to the utility of these games as a supplemental course resource. This was evidenced recently [37] when researchers sought to extend the TAM to online learning environments (OLEs), such as virtual labs, simulators, videos, and interactive learning activities. These authors concluded that the original TAM did not address salient assessment issues, such as users' perceived efficiency, playfulness, and satisfaction. The same authors clarified that perceived usefulness may be interpreted as students' perceived advantage in using an educational system for self-study and hands-on exercises, which may be linked to students' attitude toward using the OLE and their intention to use OLEs [37]. Similarly, other scholars [35] extended the TAM to include considerations around user satisfaction and attitude toward game design elements, while defining perceived usefulness as a student's perception that playing the game, GamiCRS, would enhance their performance and help them achieve their academic goals.

Motivational Learning-Expectancy Value Theory
Many researchers have noted that the usefulness and perceived value of learning tools are linked to Expectancy Value Theory (EVT) [38], which postulates that a student's achievement-related choices, performance, and persistence are predicted and motivated by their expectations for success on those tasks. Furthermore, this theory posits that the subjective value attached to a task is related to a student's perception of the task's utility (usefulness). According to EVT, students' expectancies and values are shaped by other achievement-related beliefs such as achievement goals, self-schemata, and task-specific beliefs [38,39]. EVT identifies four major components of subjective task value: (1) attainment value (the importance of doing well on a task); (2) intrinsic value (enjoyment derived from the task); (3) utility value or usefulness (how the task fits within an individual's future plans); and (4) cost. EVT was extended by [40] to understand the motivational perspectives of African American adolescents in academic domains; it was concluded that African American students' ability self-concepts and academic performance were not as highly linked to either their academic achievement or their self-esteem as those of European American students. The model was also used to examine differences in self- and task perceptions of first-, second-, and fourth-grade children in the domains of math, reading, sports, and instrumental music [41]; students differed in competence beliefs and values as a function of gender and age, with boys more positive about sports and math and girls more positive about reading and music. Collectively, it is possible that the perceived usefulness of an educational game may be tied to how individuals dissociate performance indicators from the values they attribute to a game based upon their unique contexts and experiences.
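The relationship EVT describes is often summarized in a simplified multiplicative form. This summary is a common textbook rendering of the theory, not an equation stated explicitly in the sources cited above:

```latex
% Simplified expectancy-value summary (illustrative, not taken from [38,39]):
% achievement motivation M for a task is modeled as the product of the
% expectancy of success E and the subjective task value V, where V
% aggregates the four components listed above.
M = E \times V, \qquad
V = f\left(v_{\mathrm{attainment}},\, v_{\mathrm{intrinsic}},\, v_{\mathrm{utility}},\, \mathrm{cost}\right)
```

Under this reading, a serious game that scores low on utility value for a student (e.g., "this will not help me on the exam") can depress overall motivation even when intrinsic enjoyment is high.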

Game-Based Learning
Game-based learning can be rooted in behaviorist and/or constructivist learning theories, where the importance of play in cognitive development and learning has been established since Piaget [42] in 1962. Piaget postulated that play could become more abstract, symbolic, and social as the player develops with time. As serious games should (by definition) have an educational purpose, game-based learning emphasizes the balance between game play and coverage of content matter [43,44].
Though not a requirement, serious games that employ elements such as incentive systems (points, stars, badges, etc.) to motivate individuals to engage in learning content that they might otherwise find unattractive [45] are often digital. These types of games often have a structure that includes three key elements: a challenge, a response, and feedback. To assess and quantify the impact of serious games, identification of genre is needed, as games are not only used for learning in different fields but also cater to a variety of gaming preferences, e.g., casual games, first-person shooter environments, role-playing games, drill-and-practice games, massively multiplayer online (MMO) games, and intuitive games. Casual games can be played on most portable electronics, like smartphones and tablets, for easy access and are designed to help students learn quickly with little to no previous gaming skill, expertise, or regular time commitment [46,47]. Serious games classified as first-person shooter environments provide a first-person perspective of gameplay, wherein images are seen from the perspective of the player's eyes [48]. Role-playing games allow players to act out the structured decisions of characters or avatars in the game [49], while drill-and-practice games repeatedly quiz players on concepts and practice problems, allowing for immediate feedback like digital flashcards. Massively multiplayer online games engage large numbers of players in a virtual world, which enhances social interaction and competition. Intuitive learning games emphasize experiential learning theory practices [50], where gaming environments allow players to consider a range of possibilities and ideas to solve problems, explore practical outcomes, and foster abstraction, imagination, and prediction [51]. Due to the number of genres of serious games, conclusions from specific studies may not readily translate across all genres [45].
Game design elements allow for the realization of different genres, which can facilitate engagement on affective, behavioral, cognitive, and sociocultural levels; it has been noted that elements of challenge, curiosity, and fantasy are intrinsically motivating for players [52]. According to [21], the transfer of knowledge content is optimized with practice and reinforcement of existing knowledge and skill, which can be facilitated through game scaffolding. Though scaffolding is relatively successful and commercially established in entertainment games, strategies for scaffolding learning games are less well understood, even though these elements are connected to user perceptions of game usefulness.
How users perceive the ease of use and usefulness of a technology is also related to their prior experiences with other forms of digital technology. User experience (UX) with technology is part of the human-computer interaction framework that describes a "person's perceptions and responses resulting from the use and/or anticipated use of a product, system, or service" [53]. Few authors have related aspects of the TAM and UX models together, with the exception of [54,55], who connected user experiences while engaging with a technology, system, and/or device to predictions of the player's tendency toward playing the game in the future. Recently, some researchers have linked users' prior experiences with technology to the TAM [56,57]. Our findings help to bridge this gap in the literature and begin to explore emerging trends in student device usage and expected task value in educational engineering software.
There have been a number of studies in engineering education pertaining to game-based learning interventions in undergraduate and graduate classrooms across the majority of engineering disciplines, e.g., mechanical engineering [58][59][60][61][62][63][64][65], civil engineering [66][67][68][69][70][71][72], computer science and software engineering [73][74][75][76], chemical engineering [75][76][77][78][79][80], power engineering [81], industrial engineering [82][83][84][85], environmental engineering [86,87], biomedical engineering [88], and aerospace engineering [89]. Of the aforementioned studies, fewer than a handful explore how gender influences the perceived value of the game intervention. Since engineering mechanics is a subject taught by mechanical or civil engineering departments, examples of game-based learning studies in these disciplines are provided in Table 1. Few studies have examined the role that gender and engineering role identity may play in user acceptance, value, or perceived usefulness in mechanical engineering. According to the American Society for Engineering Education [90], mechanical engineering is the largest engineering discipline by BS degrees granted (24%, followed by computer science engineering at 14%), yet it is among the four lowest in the percentage of BS degrees granted to women (15.7%, compared with 51.7% in environmental engineering and 48.1% in biomedical engineering). Hence, according to some scholars, the introduction of digital learning games could in theory present an opportunity to enhance the population of women in the profession [91].
The studies detailed in Table 1 illustrate digital game-based learning research that emphasizes findings from mechanical or civil engineering participants and studies aiming to examine how participants related usefulness, engagement, or value to the digital learning tool. For example, ref. [58] studied the use of several different types of learning games in a second-year mechanical engineering graphical engineering course to understand students' satisfaction, usefulness, and enjoyment. The games incorporated in the course included the Championship of Limits and Fits, the Tournament of Video Search of Fabrication Processes, the Forum of Doubts, Proposals of Trivial Questions, and a Contest of 3D Modeling. Each game was evaluated by asking participants whether they thought the game was "more useful and enjoyable" as a teaching element than conventional instruction, using a Likert scale from 1 to 5, where 1 = Strongly Disagree, 2 = Disagree, 3 = Neither Agree nor Disagree, 4 = Agree, and 5 = Strongly Agree. It is difficult to ascertain how students defined the usefulness of the games, since each question coupled two descriptors ("useful" and "enjoyable") that could be mutually exclusive, complicating interpretation of the results. The researchers concluded that the main advantages of using the games within a class environment were increased attendance and enhanced interest in the course topics, in addition to bringing joy to the classroom. Interestingly, 9 of the 11 women in the study did not play video games, and the women rated the games 3.3 on average compared with 3.6 among their male counterparts.
While the researchers concluded that the differences in opinions between the men and women were not significant, no statistical analysis (e.g., standard deviations or p-values) was reported in the study.
Other researchers such as [62] conducted a study across all disciplines at a university in Germany to understand how students perceived gamification and mixed reality learning tools, and what types of study modules students would consider useful. They found that, of all the disciplines studied, mechanical engineering students saw the most potential in gamification in academic settings, and students from all disciplines expected games to teach in a practical, realistic, and fun way. In a graduate maintenance engineering course, ref. [63] examined how students engage, actively cooperate within a group, and perceive real-time feedback when playing an asset management serious engineering game. They concluded that the game allowed participants to apply different strategies to problems, forced students to make and learn from mistakes, and encouraged students to learn the reasons behind their mistakes. Student engagement was examined with the engineering game NIU-Torcs, designed by [27] for inclusion in a dynamic systems and control course. In this study of 51 undergraduate students, the authors concluded that students experienced higher intellectual intensity, intrinsic motivation, and overall engagement in comparison to traditional approaches to homework and coursework. Similar games focused on mechanical engineering dynamics using race car scenarios have been studied with emphasis on game design, student engagement, and perceived effectiveness [23,27,65]. Researchers have also compared the effectiveness of different games aimed at enhancing content from the same course, such as [92].
While the vast majority of studies of game-based learning in undergraduate and graduate engineering courses exclude differences and similarities as a function of gender, several studies of middle- and high-school-aged students in STEM fields have explored the role gender plays in the acceptance and effectiveness of game-based learning interventions [93][94][95], with varying conclusions regarding differences in perceptions and effectiveness. Several researchers have noted the importance of recognizing gender differences in game design, appeal, and efficacy [96][97][98][99], but studies reflecting this important consideration lag behind in engineering disciplines, such as mechanical engineering, that have lower concentrations of women.
It is important to note that this study does not negate or diminish the relevance or importance of work by scholars and game designers who have not explored the effectiveness of their games as a function of student role identity in terms of gender. Instead, this research fills a gap in the literature toward understanding how one's unique perspective and experiences influence how one associates usefulness and value with digital educational tools used to supplement traditional textbook and lecture course materials. This study also fills the space left by literature that does not report demographics in order to assure participant anonymity [65], or whose results were statistically inconclusive due to small percentages of women and under-represented minorities.

Research Design and Research Questions
The goal of this study is to explore the usefulness of an engineering educational serious game from the perspectives of a diverse population of undergraduate students in a course at a Northeast institution in the United States. The research questions for this study are:

1. How is usefulness described by engineering student users of the game?

2. How do perceptions of game elements vary or stay the same as a function of prior gaming experience?
The study was conducted at a Research-1 institution in the Northeastern region of the United States in general statics and dynamics courses, honors-level engineering statics courses, and engineering student organizations. Qualitative and quantitative data were collected via pre- and post-questionnaires, interviews, and focus groups. Data were analyzed in terms of ethnicity and gender to understand how this game connected with diverse populations of students. This work will help researchers understand which aspects of serious games can be leveraged to enable meaningful improvements in engineering education.

Methods
A Mixed-Method Sequential Exploratory Research Design [100] was proposed and approved by the primary Institutional Review Board (IRB) of the first author, with a ceded review from the IRB of the second author's institution (at the time of the study). The study took place at a Research-1 [101], research-intensive institution in the Northeastern region of the United States. The data described herein represent phases of a multi-year study. Preliminary results from this study, which included responses from thirty-three participants, were reported in conference papers [102][103][104]. In addition, correlation of post-survey responses with physiological eye-tracking measurements of eight participants was described in [105]. In this work, responses from 201 participants are described and discussed. This work differs from the previous work by including a larger data sample (33 participants previously versus 201 here) and by triangulating coded textbox and focus group responses with pre- and post-questionnaire responses. Questions included in the pre- and post-questionnaires and the focus group discussion are provided in Table 2. All 201 participants in the study were recruited from engineering classrooms and STEM and engineering student organizations. Students provided demographic information such as age range, gender, race/ethnicity, undergraduate major, and experience with online learning tools.
The research design is premised on the authors' positionalities as intersectional women in engineering and engineering education who have experienced or witnessed firsthand the role that educational materials in engineering can play in a woman's overall sense of belonging and formation as an engineer.

Data Collection Protocol
An overview of the data collection process is provided in Figure 1. The students first completed a pre-game questionnaire, played the engineering game for 20 min, completed a post-game questionnaire (questions provided in Table 2), and then participated in a focus group discussion for approximately 30-40 min. The works of [106][107][108] were used to support the 20-min exposure to game play; in these studies, participants were exposed to the game-based learning tool for 20 min followed by a post-test. This duration was also selected because researchers such as [109,110] indicated that students can achieve meaningful knowledge retention after 15-20 min of game play, and many students' attention spans are less than 20 min [111].
The questionnaire included 7-point Likert-scaled questions pertaining to the students' experiences with the game, demographic information, and previous experience with playing video games. Students were also asked to provide text responses with additional information pertaining to their Likert-scale answers. The learning theory or model associated with each questionnaire question is provided in the table, where questions were slightly modified to consider the context of the serious game selected for this study, Civil-Build. Prior learning experience questions were also modified per recommendations from other scholars [29] to understand the nature and environment in which students engage with video and entertainment games in their everyday lives. The Likert-scale options were: Strongly Agree (1), Agree (2), Somewhat Agree (3), Neither Agree nor Disagree (4), Somewhat Disagree (5), Disagree (6), and Strongly Disagree (7), where Strongly Agree and Strongly Disagree were ranked 1 and 7, respectively. The students played the game in a quiet computer laboratory with partitions around each player to limit interaction among participants during game play. Students wore noise-cancelling headsets attached to their computers that allowed them to hear the sounds of the game. The focus group was conducted in a conference room in a separate location from the computer room.
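The 7-point scale described above maps each response label to a numeric score, with Strongly Agree = 1 and Strongly Disagree = 7. The following sketch illustrates how such responses could be coded for analysis; the labels mirror the scale in the text, but the function and variable names are illustrative, not taken from the study's analysis scripts.

```python
# Hypothetical coding of the 7-point Likert scale described in the text.
# Lower scores indicate stronger agreement (Strongly Agree = 1).
LIKERT_SCORES = {
    "Strongly Agree": 1,
    "Agree": 2,
    "Somewhat Agree": 3,
    "Neither Agree nor Disagree": 4,
    "Somewhat Disagree": 5,
    "Disagree": 6,
    "Strongly Disagree": 7,
}

def mean_likert(responses):
    """Average numeric score for a list of Likert response labels."""
    scores = [LIKERT_SCORES[r] for r in responses]
    return sum(scores) / len(scores)

# Three example responses to one questionnaire item:
sample = ["Strongly Agree", "Agree", "Somewhat Disagree"]
print(round(mean_likert(sample), 2))  # (1 + 2 + 5) / 3 = 2.67
```

With this coding, a mean below 4 indicates net agreement with the item, which is how directionality is typically read on agree/disagree scales of this kind.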
During the focus group, participants discussed their perceptions of the game as an engineering educational learning and motivational tool. Selected questionnaire questions were repeated during the focus group along with several additional questions provided in Figure 1 and Table 2.
The focus group questions enabled a more in-depth discussion of the topics described in the Technology Acceptance Model (TAM) [30,32,112], i.e., the perceived usefulness and ease of use of the game. Several additional questions were included along with the questionnaire questions during the focus group discussion to facilitate the exploration of students' opinions regarding their prior experiences with video games and their enjoyment playing the Civil-Build game. Additional questions were asked to understand how participants defined "usefulness". Focus groups consisted of 3 to 6 participants. The data were collected during three semesters: Spring 2018, Spring 2019, and Fall 2019. A quantitative analysis of pre- and post-questionnaire responses was conducted, in addition to analysis of written responses to open-ended questions in the post-questionnaire. Twenty-six of the 201 participants in the study took part in the focus group discussions.

Student Population Demographics
Two hundred and one students participated in this study, which introduced the online engineering educational game, Civil-Build. A pseudonym is used for the game to protect the identities of the students and instructors who participated in the study. Students were recruited using flyers and email advertisements in engineering classrooms and engineering student organizations. Students whose schedule availability fit the time frame of the study were selected to participate. Students who were in statics courses received extra credit for participation in the study, while those who were not enrolled in a statics course at the time of the study received compensation for their participation.
Additional questionnaire and focus group items from Table 2 included the following (associated frameworks in parentheses):

Q10.post. The hints motivated me to want to advance to higher challenge levels. (Game-Based Learning [42,52]; Expectancy Value Theory [29,38])

Q11.post. I would improve this game by (complete the sentence; select a choice): adding a story line; making the images look more like real life (Game-Based Learning); adding opportunities to compete against other players while playing; the game is fine the way it is; adding more explanation to the challenges (EVT); changing the rewards from nuts to something else; give any other feedback (text).

Q12.post. The learning lessons or goals of each challenge are defined in enough detail to play the game.

Q13.post. This game is a good learning tool because (select appropriate responses): a. The game is easy to figure out. b. I can apply what I learn in my classes. c. The game taught me about stability of truss structures. d. It is a fun alternative to traditional learning. (Added to assess usefulness.)

Focus Group Questions
F1. Would you use this game to prepare for a job interview? Explain your answer. (Added to assess usefulness.)
F2. Would you use this game to prepare for an exam for a class?
The self-identified demographics of the students who participated in the study are provided in Table 3, where the percentages of men, women, non-binary, and other participants were 51%, 46%, 1%, and 1%, respectively. Eight of the 209 students declined to answer the question pertaining to gender. Students were also asked to indicate their race and/or ethnicity; participants self-identified as African American/Black (7%), Caucasian/White (32%), LatinX (8%), Asian (46%), Mixed Race (6%), and "other" (1%). The populations of students who reported their gender and race are provided in Table 3, where the data from the 8 students who did not disclose demographic information are excluded.
Table 3. Demographics of the participants according to race and gender, where the count (number of participants) and percentage of the total participants are presented *.

Online Engineering Game Description
The online engineering educational game used for this study was selected based on recommendations from instructors who have used it to supplement course lecture materials on the structural stability of truss structures, a topic covered in traditional undergraduate engineering mechanics (statics) courses. The game, Civil-Build, was also selected because it has been used as an educational tool in an existing engineering statics course at the university when the course curriculum and schedule allowed. This digital tool is considered a gold standard among online games for engineering mechanics because it was designed by a professional engineering educator to help engineering students build intuition about how truss structures behave in real life and how they fail, through physics-based simulations. The engineering instructors who opted to use this tool in the classroom believe that it supports student learning of engineering statics and have used it to supplement course materials such as the textbook and in-class lectures. These instructors, who have taught the course for at least 5 years in both honors and general population classes, encouraged and suggested the software for this study, having used it for three years in an honors engineering mechanics class.
The software focuses on truss structural stability, which is covered in all undergraduate engineering mechanics courses. It is used not only by instructors at this university, located in the eastern region of the United States, but also by instructors at institutions in the middle and western coastal states and at several universities in Europe. (The specific names of universities where this tool has been used have been intentionally excluded to protect the identity of the software creator and of the university professors and students who presently use the software.) Unlike other civil engineering apps that illustrate structural failure using computerized animations from artists' renderings, this app uses finite strain theory in computational models to produce scientifically accurate dynamic visual representations of how structures physically respond to applied mechanical loads in real time. The software has been praised in ASEE's Prism magazine because it builds upon the visuospatial capabilities of students in real time, can be downloaded onto a smartphone, and is freely accessible for use in classrooms. Software like this addresses the limitations of static drawings in textbooks, as noted by other scholars [24,27,113]. The learning tool also enables the player to visualize material and geometric nonlinearities in addition to the dynamic movement of failed/compromised structures.
The goal of the game Civil-Build is to assist students in developing engineering intuition about the behavior of truss structures subjected to loads. A representative rendering of the game interface is provided in Figure 2; the depiction was modified to keep the tool and its designer anonymous. The software is based on finite strain theory, which enables the user to visualize material and geometric nonlinearities and the dynamic movement of failed/compromised structures. Users play the game by positioning bars and joints on the screen to construct a truss structure that can support an external mass and the structure's own weight; the bars are connected via the joints. Players are rewarded with nut(s) and points based on their ability to create a structure of optimal structural stability, i.e., one that supports the given load while minimizing the support structure's overall weight. Participants move the bars and joints on the screen of the game interface while manipulating the weight of the truss and adjusting the thickness of the bars. Once the truss structure is completed, participants visualize their structure's success or failure in real time, as the structures visibly collapse or maintain their position. The collapse of a structure is punctuated with clanging sounds associated with its destruction. Bars subjected to loading change color (shades of blue and red) to illustrate compression and tension, respectively. The tool is designed to teach students intuition about the relationship between truss structural design, material and geometric nonlinearities, and dynamic failure.
The player progresses from Challenge 1 to 24, where each level increases in difficulty. Players cannot move to the next challenge without successfully completing the previous challenge level, and no gameplay instructions are furnished in the game interface. However, supplemental resources are available for download on the software website and via YouTube demonstrations. No supplemental resources or instruction were provided as part of this study, in order to maintain the game designer's intent to teach engineering design intuition, i.e., apprehension or direct knowledge about a subject without instruction pertaining to the science or engineering governing the game.
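The stability behavior the game teaches through trial and error can be connected to the member-count check taught in statics courses: for a planar truss with m members, r support reactions, and j joints, m + r = 2j is a necessary (though not sufficient) condition for static determinacy. The sketch below is purely illustrative and is not the game's actual engine, which, as noted above, uses finite strain simulations.

```python
# Minimal planar-truss classification via Maxwell's rule (m + r versus 2j).
# This is a classroom heuristic for illustration only; it is an assumption
# of this sketch, not code from the Civil-Build game.

def truss_classification(members: int, reactions: int, joints: int) -> str:
    """Classify a planar truss by comparing m + r with 2j.

    m + r <  2j -> mechanism (unstable: too few members or supports)
    m + r == 2j -> statically determinate (if the geometry is sound)
    m + r >  2j -> statically indeterminate (redundant members)
    """
    lhs, rhs = members + reactions, 2 * joints
    if lhs < rhs:
        return "unstable"
    if lhs == rhs:
        return "determinate"
    return "indeterminate"

# A simple triangular truss: 3 members, 3 joints, pin + roller (3 reactions).
print(truss_classification(3, 3, 3))   # determinate
# Removing one member turns the triangle into a mechanism.
print(truss_classification(2, 3, 3))   # unstable
```

Note that the count check alone cannot detect poorly arranged members, which is precisely the kind of geometric intuition the game's real-time collapse visualization targets.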

Statistical Analysis for Quantitative Data
The descriptive statistics (mean, standard deviation, frequency, and percentage) were collected for the entire group of participants as a whole. In addition, these variables were recorded as a function of self-identified identity so that they could be further correlated with the participants' perceptions of the engineering learning game in terms of EVT, TAM, and GBL. Questions pertaining to usefulness were added to understand the relationship between EVT, TAM, GBL, and usefulness for engineering serious games that are designed for intuitive learning.
One-way analyses of variance (ANOVA) were performed to ascertain whether there were statistical differences between students according to demographics, experience with playing games (on the phone or computer), perceptions of the game being easy to use or useful, and frustration while playing the game. Two-way multivariate analyses of variance (MANOVA) were conducted to understand whether there were statistically significant interactions between subgroupings. The quantitative data from the questionnaire responses were triangulated with open-ended responses from students on the questionnaire pertaining to frustration and with the focus group questions.
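For readers unfamiliar with the procedure, a one-way ANOVA reduces to comparing between-group variance with within-group variance. The sketch below computes the F statistic from scratch on made-up Likert-style ratings; the group data are hypothetical placeholders, not the study's data, and the actual analyses were presumably run in a standard statistics package.

```python
# Minimal one-way ANOVA F statistic, illustrated with made-up Likert-style
# responses (1 = strongly agree ... 7 = strongly disagree). The values below
# are hypothetical and do not come from the study.

def one_way_anova_f(groups: list[list[float]]) -> float:
    """Return the F statistic for a one-way ANOVA across the given groups."""
    all_vals = [v for g in groups for v in g]
    grand_mean = sum(all_vals) / len(all_vals)
    # Between-group sum of squares (df = k - 1).
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares (df = N - k).
    ss_within = sum((v - sum(g) / len(g)) ** 2 for g in groups for v in g)
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical "the game is easy to play" ratings for two groups.
men = [2, 3, 2, 4, 3, 2]
women = [4, 3, 5, 4, 3, 4]
print(round(one_way_anova_f([men, women]), 2))  # prints 6.62
```

The F value would then be compared against the F distribution with (k - 1, N - k) degrees of freedom to obtain the p values reported in Tables 8 through 10.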

First Cycle Coding for Qualitative Data-Elemental Coding Approach (Textbox and Focus Group Responses)
An open categorical first cycle elemental coding approach [114] was employed in this study to identify explicit words, phrases, opinions, and experiences discussed in the selected text portions of the post-questionnaire and in selected questions from the focus group. Two types of elemental coding approaches, structural and in vivo coding, were used for analyzing the focus group transcriptions and Questions 12 and 13. For this work, six focus group discussions were transcribed and coded, in which a total of 28 students participated. The students who participated in the focus groups also participated in the first phase of the study, i.e., they completed both a pre- and post-questionnaire, and were asked to participate in a focus group after they completed the post-questionnaire. Focus group size ranged from three to six participants due to ease of recruitment, hosting, and comfort for participants [115].

Elemental-Structural Coding for Qualitative Data (Textbox Responses)
The structural coding approach was applied in this exploratory investigation to ascertain the major themes or sentiments of the participants. This process facilitated the categorization of the data to identify comparable commonalities, differences, and possible relationships [114]. For this element of the qualitative analysis, data from all 201 respondents were analyzed. It is important to note, however, that providing text responses to Questions 12 and 13 was optional; not all of the 201 participants provided a text response to both questions, and 28 students participated in the focus group discussions. The number of responses to each question is reported in the results section.
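The tallying step of the structural coding described above, i.e., counting how many responses received each researcher-assigned code, can be sketched as follows. The codes and responses here are hypothetical placeholders, since the actual coding in the study was performed manually on the participants' text.

```python
# Sketch of tallying structural codes assigned to open-ended responses.
# Codes and response IDs below are invented for illustration only.
from collections import Counter

# Each response is tagged with one or more researcher-assigned codes.
coded_responses = [
    {"id": 1, "codes": ["tutorial", "hints"]},
    {"id": 2, "codes": ["tutorial"]},
    {"id": 3, "codes": ["scoring"]},
    {"id": 4, "codes": ["tutorial", "scoring"]},
]

code_counts = Counter(c for r in coded_responses for c in r["codes"])
total = len(coded_responses)
for code, n in code_counts.most_common():
    print(f"{code}: {n}/{total} ({100 * n / total:.0f}%)")
```

Counts of this form underlie the per-theme fractions (e.g., 10/19, 4/19) reported for Question 12 in the results section.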

Elemental-In Vivo Coding for Qualitative Data (Focus Group Responses)
An in vivo coding approach (a form of elemental coding), described as "literal coding" or "verbatim coding," was selected for this first cycle of qualitative analysis of the transcribed focus group discussions because it prioritizes and honors the participant's voice [116]. The in vivo coding approach is also rooted in the Initial Coding employed in grounded theory [117] and is useful in educational ethnographies.

Results and Discussion
Two hundred and one students participated in the study; however, only 174 of the 201 respondents provided information pertaining to the challenge level that they achieved when playing the game. The 27 responses that did not include a challenge level were treated as outliers in the challenge-level analysis because this group of participants answered all of the pre- and post-game questions except the challenge level question.
The descriptive frequency counts and percentages for the pre-game questions are provided in Table 4. The results indicate that the majority of the students who participated in this study played video games on their phone (70.1%) and were currently taking engineering statics or dynamics (71.6%). Also, slightly fewer students played video games on their computer (62.7%), and the majority of the students had never played the game Civil-Build prior to participating in this study (86.6%).

Descriptive Statistics for the Entire Population of Students
The statistical means and standard deviations for the post-game questionnaire for the entire population of student respondents are presented in Table 5. The responses to questions 2 through 11 were based on a 7-point Likert scale. There were 24 challenge levels that the participants could achieve, where a player could not proceed to a new challenge level without first successfully completing the prior one. On average, the participants reached the 7th of the 24 challenge levels while playing for 20 min. Respondents "agreed" (rating between 2.0 and 2.5) with Q8, i.e., that they enjoyed playing the game. Respondents also indicated that they "somewhat agreed" (rating between 2.6 and 3.5) with questions 2, 3, and 11, i.e., that the game was easy to play, that the ways to advance were easy to understand, and that they got frustrated playing the game. It has been found that frustration can foster motivation in cognitive and emotional states while learning complex materials, and that one's cognitive and emotional states are related to the duration and frequency with which participants experience frustration [118,119]. Game-based learning recognizes the balance between gamer interaction with the learning tool and adequate failure that fosters learning. On the other hand, players neither agreed nor disagreed with questions 6, 9, and 10, which means that they were not convinced that playing the game increased their confidence in their engineering skills, that the hints motivated them to advance to higher challenge levels, or that the learning lessons or goals were defined in enough detail to successfully play the game. These results are compelling because they illustrate the difficulties in establishing a balance between experiential intuitive-based learning and other schema for creating an interactive environment that fosters learning in engineering games.
Table 5. Descriptive statistics, i.e., means and standard deviations, for the post-game questionnaire questions.
Question 1 was answered by 174 participants and questions 2-11 were answered by 201 participants.

Question (Number of Responses) | Mean ± s
Q1. The Challenge Level I ended the game at was: (N = 174) | 7.36 ± 3.01
Q2. The game is easy to play. | 3.06 ± 1.43
Q3. The ways to advance to higher levels in the game are easy to understand. | 2.98 ± 1.52
Q4. I understood the engineering topics each level of the game was teaching me. | 3.08 ± 1.52
Q5. This game helped me to understand engineering truss structures. | 2.77 ± 1.22
Q6. Playing the game increased my confidence in my engineering skills. | 3.62 ± 1.47
Q7. Did you enjoy playing the engineering serious game? | 2.49 ± 1.40 (The engineering concepts presented in the game were intuitive to me. | 2.61 ± 1.13)
Q9. The hints (in the game) motivated me to want to advance to higher challenge levels. | 3.80 ± 1.64
Q10. The learning lessons or goals of each challenge are defined in enough detail to play the game. | 3.47 ± 1.65

The last two questions in the post-game questionnaire were added to explore how game players defined usefulness in educational games. The results for the aggregate responses are provided in Table 6. From this table, it can be seen that the majority of the game players thought that the serious game should have more explanations about the challenge levels (78.1%), and nearly half of the students indicated that there should be opportunities for them to play against other players interactively. These findings suggest that students may have different expectations of engineering education serious games compared to games designed solely for entertainment purposes. Students also indicated that these serious games might be more useful if the images were more realistic looking, suggesting a need for these types of games to connect to the real world. Participants were also asked to explain what made the app a good learning tool; the results from this open-ended question are provided in Table 7. Of the choices provided, players most often indicated that the game was a fun alternative to traditional learning (lectures). Although students indicated that elements from the game could be applied to their classes and taught them about truss stability, additional studies should inquire about the specifics of what they learned and how these elements of the game related to their course work. The results from Table 7 are further elaborated in the responses to the post-game questionnaire. Many students indicated that there was value in being able to see how trusses behaved physically in a dynamic game environment. For example, one student indicated that "it's not a complete alternative, but it helps create a visual understanding of trusses." Another student wrote, "With some more explanation in the game about the mathematical statics counterparts to the game experience, the game would provide knowledge, which could be applied to classes."
This sentiment is supported by another comment from a student who wrote, "The simulations could be played in slow motion," which may indicate that there is value to being able to visualize the physical reaction of structures to forces in real time.

Table 7 (excerpt). Choice | Count (%)
The game is easy to figure out. | 88
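The agreement bands used to interpret the Likert means in Table 5 can be expressed as a simple lookup. The "agreed" (between 2.0 and 2.5) and "somewhat agreed" (between 2.6 and 3.5) bands are taken from the text above; the remaining bands, and the helper function itself, are illustrative assumptions rather than the authors' instrument.

```python
# Helper mapping a 7-point Likert mean to an agreement band. The "agreed"
# and "somewhat agreed" thresholds follow the text; the other bands are
# assumptions added for illustration.

def agreement_band(mean: float) -> str:
    if mean < 2.0:
        return "strongly agreed"               # assumed band
    if mean <= 2.5:
        return "agreed"                        # band stated in the text
    if mean <= 3.5:
        return "somewhat agreed"               # band stated in the text
    if mean <= 4.5:
        return "neither agreed nor disagreed"  # assumed band
    return "disagreed"                         # assumed band

# Q8 (enjoyment) had mean 2.49; Q9 (hints motivated me) had mean 3.80.
print(agreement_band(2.49))  # agreed
print(agreement_band(3.80))  # neither agreed nor disagreed
```

A lookup of this kind makes the verbal interpretations of Table 5 reproducible from the reported means.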

Analysis of Variance (ANOVA)
A one-way ANOVA was performed to examine how the responses to the post-game questions (dependent variable) varied as a function of gender (independent variable), as shown in Table 8. The confidence interval used was 95%. It was found that there were significant differences between genders for post-game questions 1, 10, and 11, with p equal to 0.00, 0.00, and 0.001, respectively. Men reached higher challenge levels in the learning game than women students (8.47 ± 2.9 and 6.30 ± 2.72, respectively). The observations for non-binary students and students who identify as "other" are statistically inconclusive due to the small number of students representing these populations in this study. Men found the game easier to play (2.70 ± 1.26) than women (3.36 ± 1.48). Men also agreed more with the statement that the learning lessons or goals of each challenge were defined in enough detail to play the game, with scores equal to 3.30 ± 1.44 and 4.48 ± 1.67 for men and women, respectively. A one-way ANOVA was also performed to examine how participants' experiences with the serious game varied as a function of gaming experience on the phone, where 70.1% of the participants played games on their phone, as shown in Table 9. Students indicated that the games they played on their phones were entertainment games like Candy Crush, which require minimal transfer of learning and high hand-eye coordination. How students experienced elements of the serious game varied depending on whether they played entertainment games on their phone for post-game questions 5, 6, 8, 9, and 10, with p values equal to 0.001, 0.021, 0.00, 0.006, and 0.011, respectively. In particular, students who indicated that they played games on their phones thought that the educational game helped them understand engineering truss structures more than those who did not play entertainment video games on their phones, with means of 2.58 ± 1.13 and 3.20 ± 1.30, respectively.
Similarly, students who played games on their phones indicated that playing the game increased their confidence in their engineering skills, where the means were 3.46 ± 1.40 and 3.98 ± 1.50 for those who did and did not play games on their phones, respectively.
Table 9. One-way ANOVA: statistically significant post-game questions in terms of playing video games on one's phone.
In a similar way, a one-way ANOVA was performed to examine how participants experienced the serious game as a function of prior gaming experience on their computer. The questions that yielded statistically significant differences are provided in Table 10. Interestingly, the responses that were significant for phone gameplay were not the same as those for computer gameplay. Students indicated that the types of games they played on their phone (Candy Crush, solitaire, etc.) were entertainment and single-person games. On the other hand, students who played video games on their computer (Fortnite, Minecraft, Valorant, and Call of Duty) typically engaged with games that required more strategy and "gamer skills". While some of the games that the participants listed as computer-based games are available as phone apps, the phone versions do not usually offer all of the features of the computer-based games, such as third-party mods, connection to third-party servers, etc.

Post-Game Questions
In addition, many players of the computer-based games indicated that a different, more advanced set of skills was needed to excel at the computer-based games as opposed to the phone-based games, e.g., good communication skills (for multi-player platforms), reflex skills (hand-eye coordination), strategy, cooperation between teammates, etc. Hence, those who played games on their computer agreed to a higher degree that the game was easy to play (2.86 ± 1.30) than those who did not (3.41 ± 1.50, p = 0.007). Those who played games on their computer also reached higher and more complex game challenges than their peers who did not, i.e., 7.96 ± 3.41 and 6.43 ± 2.80, respectively. Also, those who played computer games responded more positively to the game's usefulness than those who did not, in terms of the game helping them understand engineering truss structures (2.60 ± 1.20 and 3.04 ± 1.30, p = 0.013), confidence in engineering skills (3.30 ± 1.40 and 4.00 ± 1.40, p = 0.002), and motivation to advance to higher challenge levels (3.60 ± 1.60 and 4.00, p = 0.047). The experienced computer gamers also indicated, more than students who did not play games on their computer, that the game goals were defined in enough detail to excel in the game (3.6 ± 1.6 and 4.4 ± 1.5, p = 0.001). Also, students who did not play games on their computer indicated that they were frustrated while playing to a higher degree than those who did (3.7 ± 1.7 and 3.0 ± 1.5, p = 0.013). The findings pertaining to prior use of games on smartphones or computers are similar to those of Abramson et al. [120], who tested specific factors of behavioral intention to use m-learning in a community college using an extended TAM and prior experiences with mobile phones. They concluded that there is a relationship between prior use of e-learning and behavioral intention to use m-learning.
Our results are also similar to those of McFarland and Hamilton [57], who examined the influence of contextual specificity on technology acceptance. They concluded that one's proclivity to use a computer system is strongly influenced by computer anxiety, prior experience, organizational support, and perceived usefulness. Our work also supports the conclusions of Venkatesh [56], who underscores the need for an increased focus on "individual difference variables" to foster acceptance and usage of information technologies, rather than "overemphasizing system-related perceptions and design characteristics as has been done".

Multivariate Analysis of Variance (MANOVA)
A multivariate analysis of variance was performed to understand the interrelationship between playing games on one's phone and gender with regard to the post-questionnaire responses. The results from this analysis are provided in Table 11. Though some researchers have concluded that those who have prior experience with playing video games accept and adapt to the use of games in the classroom environment, our results indicate that, among students who play phone apps, men achieved higher challenge levels than women; even men who do not play video games on their phone achieved higher challenge levels than women who do play games on their phones. In addition, men who do and do not play games on their phone found the engineering education serious game easier to play than women who either play or do not play games on their phone. Both women and men who played games on their phone agreed more with the statement that the engineering game helped them to understand engineering structures. This trend was the same for motivation and for clarity of the learning lessons of the game among those who play games on their phone.
The sample sizes of students who identify as non-binary (N = 3) or other (N = 1) are small in comparison to the groups that identify as men or women. However, both of these groups achieved lower challenge levels within the group of students who play video games on their phones. On the other hand, non-binary students who do not play video games on their phone achieved higher levels than the women in that group. Also, non-binary students indicated that the game helped them learn about truss structures more than any of the other gender groups, as shown in Table 11. These findings are interesting, but larger sample sizes are needed to better understand the sentiments and perceptions of these students.
Table 11. MANOVA illustrating the post-game questions that have statistically significant differences in terms of means as a function of both video game usage on one's phone and gender.
A multivariate analysis of variance was also performed to understand the interrelationship between playing games on one's computer and gender. The results from this analysis are provided in Table 12. Similar to the results shown in Table 11, men who did and did not play video games achieved higher challenge levels in the serious engineering game than women and non-binary students who did not play video games on their computer. Women who did not play video games disagreed more with the statement that the game was easy to play than those who did play video games. Interestingly, however, women who did have experience playing video games agreed more than men who played video games with the statement that the game helped them to learn about truss structures. Hence, this population of students seemed to value their exposure to the game to a higher degree than their men counterparts, which may indicate differences in expectations of serious engineering learning games.
Similarly, this population of women agreed more with the statement that playing the game increased their confidence in their engineering skills than women who did not play video games and men who played video games. This may indicate that women who enjoy playing video games for entertainment value the learning experience from engaging with a serious engineering game more than women who do not find such games entertaining in their spare time. Also, women participants who did and did not play video games disagreed more than their men counterparts with the statement that the goals of the engineering game were provided in enough detail to play the game. In a similar way, women indicated that the game was frustrating to a higher degree than their men counterparts did. Understanding the role of frustration and student perceptions of value is important, as it has been found that the negative path between learning anxiety and self-efficacy is stronger for women than for men; so, in the case of a man and a woman with equally high levels of learning anxiety, the woman's self-efficacy would be lower [121].
Text box responses to Questions 12 and 13 (Tables 2, 6 and 7) were recorded and analyzed using a structural coding approach, where responses were examined as a function of gender (men, women, and non-binary). Forty-five participants provided text responses to Question 12, which asked how they would improve the serious educational game.

Post-Game Question
Of the 45 participants who responded to Question 12, nineteen were women. The majority of the women (53%, 10/19) indicated that they would recommend including an instructional tutorial to explain how the game worked. This theme was linked to the sentiment of five participants (5/19) who stated that they would have liked the game designer to provide hints explaining why a structure failed and where it failed. The next two primary themes focused on providing an explanation of why the structure failed (4/19) and of how the game controls were operated (4/19). Two women indicated that they would have liked to understand how the game scoring worked and how they could increase the number of points for a better score. Quotes from the textbox responses of women and men are provided in Table 13. In general, women's comments primarily focused on the provision of instruction and hints to learn how to use and play the game.
Table 13. Examples of quotes from women/men/non-binary/other in response to Question 12 in the textbox.

Example Quotes to Question 12 from Women
Having an introductory tutorial that gives general explanations as to how to play the game and what each icon represents. An introduction would allow the gamer to have a better understanding about the game's goal and rules. Many games that I have seen have this sort of interactive tutorial. Additionally, an explanation of how your structure is scored would also be helpful to allow the gamer to know what actions are "worth more points" and would therefore maximize the gamer's score (while I was playing, I never got a reward greater than one nut, and I was not sure how to improve that).
Providing more incentive to get 3 bolts. A more guided tutorial or explanation would've been good as well. They just toss you into the game.
giving hints to explain how to get higher scores
I think having a brief demo at the beginning would be helpful. As I said earlier, I did not grow up playing video games so I don't always intuitively know how the controls work.

Example Quotes to Question 12 from Men
I feel like playing 3D games when I was younger helped me a lot in visualizing 3D objects in Engineering, so perhaps making the game 3D. Also, the placement of the structure on the grid was very awkward and made it frustrating to build symmetrical structures. I may be wrong about this, but structures that earned more than one bolt seemed very impractical to build in real life. While saving on material and money is important, not every type of beam (length, thickness, etc.) is always available.
Make the game more calculation based. But then that would make the game boring. But I guess that what it takes to make a game that is informative and useful.
Making it impossible to complete levels with nonsense structures. Also, adding maybe a tutorial that explains how some engineering concepts (moments, tensile/compressive strength, buckling) are relevant to the game, or making the challenges in a way such that they teach these concepts, one at a time.
Having a small tutorial section on how to use all the tools and improving the method of placing objects.
Fully explain about the game before so people know not to spend most of their time trying to get more than one nut.

Example Quotes to Question 12 from Non-Binary Respondents
explain the objective of the game within the game itself
Option to show the forces in the beams numerically

Slightly more men than women responded to Question 12, i.e., 22 out of the 45 who responded in the textbox. Though several men indicated that they would prefer an explanation of how the game worked, they did not directly state whether this was to be included within the game interface, as the women did. This is evidenced by how the men articulated the request for game instruction; examples are provided in Table 13. Only 2 of the 22 men asked for a "tutorial", and 3 asked for an explanation of the game and the game goals. While women primarily asked for a tutorial and better hints, men asked for a better user experience (7/22 men and 0/19 women), i.e., a "better UI (user interface)", easier ways to place components on the screen, etc. Men also asked for more technical content, i.e., numerical values for calculation (3/22), and more realistic structures (3/22 men and 0/19 women). These findings suggest that men and women appreciate and expect different things from a learning game.
All of the students that self-identified as non-binary provided a response to Question 12. One of the three respondents asked for a tutorial within the game interface, while another indicated that the game experience (screen appearance) could be improved (see Tables 13 and 14).

Table 14. Examples of quotes from women and men in response to Question 13 in the textbox.

Example Quotes to Question 13 from Women
"I liked how the simulation could be slowed down to really give the gamer an opportunity to see where the weaknesses in their structure lie. This allowed me to correct the issues and construct a better truss." "I liked the fact that some of the bars were highlighted blue or red, which I assume meant that they were on tension or compression, and that should be included in the instructions."

Example Quotes to Question 13 from Men
"With some more explanation in the game about the mathematical statics counterparts to the game experience, the game would provide knowledge which can be applied to classes." "I didn't like it. At all. There is nothing about this game that was fun nor educational, any "engineering" aspects were not explained, yet a 5-year-old could probably figure it out by trial and error. (Textbox Q13-men, Pos. 4)"

Question 13: This Game Is a Good Learning Tool Because: __________
Eighteen of the participants provided textbox responses to Question 13, which asked the game player to explain why they thought the serious engineering game was (or was not) a good learning tool. Four of the eighteen respondents were women, and all four of the women's responses were "positive", in that each respondent indicated a value associated with playing the game. In particular, women indicated that the tool was good because it extended their knowledge beyond equations and numbers and allowed them to view the physical responses of structures to loading conditions in real time, i.e., allowing them to slow down or speed up the visualization. In other words, all four women indicated that there is value in being able to visualize the response of structures, which supported content covered in their textbook and supplemented static drawings with dynamic representations of truss responses to various loading conditions and truss designs. Examples of this feedback are provided in Table 14.
Six men and no non-binary students provided text feedback to Question 13. Only one of the men indicated that he liked the visualization (simulation) of the dynamic events pertaining to the truss structure (see Table 14). Two other participants indicated that the game was not a learning resource because it did not provide meaningful explanations to support the simulated failures or successes of the structures. In particular, these students indicated that software incorporated as a supplemental learning tool should provide knowledge that can be applied to the content in the class materials. On the other hand, two men indicated that they liked the fact that the game encouraged them to try multiple design options, as "there was no punishment for failure" and it was "less stressful". Another man indicated that the game was reminiscent of games that he had played when younger. None of the women indicated in the textbox responses that they had played similar games in their childhood or linked the game to a positive prior learning or entertainment experience. These responses support the post-questionnaire feedback, where men who played video games on their phone did not agree as strongly as women who played video games with the statement that the game helped them to learn engineering concepts.

Focus Group Responses
Twenty-six participants engaged in the focus group discussion described in Table 2. The two focus group questions are: Would you use this game to prepare for a job interview? Would you use this game to prepare for an exam? The responses to these questions can be grouped into four primary themes that help to describe how students interpret and define usefulness in terms of engineering learning games.
Nineteen of the twenty-six participants indicated that they would not use the engineering software to prepare for an interview, while the remaining participants stated "maybe" or "not sure". The seven who were unsure indicated that they would consider using the tool for an interview if it were for a company that designed bridge or truss structures. Hence, games that are designed to address content-specific areas of expertise are deemed useful if they allow the student to obtain concrete skills applicable to an engineering career path. Tools that rely on the user to indirectly develop intuitive skills, without eventually leading to a direct explanation of results, are deemed less beneficial, as students' time is valuable.
Fourteen of the participants indicated that they would not use the game to prepare for an exam. The remaining students did not directly state that they would not use the game, but instead provided explanations of when it would be appropriate to use the game or how it could be used within a classroom setting. This feedback is summarized in the four themes provided below.

Theme 1: Serious Engineering Games Should Strengthen Knowledge of Core Course Content
The majority of statics coursework consisted of using equations of equilibrium and algebra to solve for the forces and moments acting on structures in two or three dimensions. Hence, five students indicated that this tool was not helpful because it did not relate to or support the methods (numerical calculations) used in the course to solve problems; for example, one student stated that in class "we are given predefined things". Another student stated, "I would not use it to study just because, like, tests are geared towards problems and these problems, like, the challenges in the game aren't really relative to the problems that we see on tests." Also, because the game did not include explanations for failed structures or hints at how to improve structures, eight students indicated that tools that do not explain why a structure fails are less effective at teaching complex topics. Those who stated that they would not use the tool for a job interview indicated that this was because it did not help them understand why their structures failed, and they would need to understand the science and engineering behind failed structures to perform their jobs. One woman from this group indicated that she would not play the game for an interview, even if the instructor recommended it as an extra resource, because she did not like video games. These sentiments pair with and support the post-questionnaire textbox responses.
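To illustrate the kind of calculation-based coursework that students contrasted with the game, the following is a minimal sketch (not taken from the course materials or the game's solver) of the method of joints that statics students use: writing the two equilibrium equations at a pin joint of a symmetric two-bar truss and solving for the member forces.

```python
# Hypothetical illustration of the method of joints for a single pin joint
# of a symmetric two-bar truss carrying a downward load W. This is NOT the
# game's internal solver; it only shows the classroom-style calculation.
import numpy as np

def joint_forces(w_load, theta_deg=45.0):
    """Solve sum(Fx) = 0 and sum(Fy) = 0 for the two member forces.

    Members are inclined at +/- theta from horizontal; a positive result
    means the member is in tension.
    """
    t = np.radians(theta_deg)
    # Unknowns: axial forces F1 (left member) and F2 (right member).
    #   sum(Fx): -F1*cos(t) + F2*cos(t) = 0
    #   sum(Fy):  F1*sin(t) + F2*sin(t) = w_load
    a = np.array([[-np.cos(t), np.cos(t)],
                  [ np.sin(t), np.sin(t)]])
    b = np.array([0.0, w_load])
    return np.linalg.solve(a, b)

f1, f2 = joint_forces(1000.0)  # a 1 kN load hung from the joint
print(f1, f2)  # by symmetry, each member carries ~707.1 N in tension
```

In a typical statics exam problem, students carry out this calculation by hand for every joint of the truss; the game instead conveyed the same tension/compression behavior visually, without exposing the underlying numbers.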

Theme 2: Serious Engineering Games That Do Not Pair with Math and Science Fundamentals Are for Novice Learners
Many students indicated that while the game did not advance their knowledge of truss structures enough to complement their learning, people new to statics (without prior engineering experiences/courses) may value learning the intuitive aspects of truss design. Several suggested that the game may be better incorporated in the statics course at the beginning of the term, prior to when the students learn about truss structures (9/26 participants).

Theme 3: Homework and Supplemental Materials Should Reflect Real-World Problems
Students also indicated that homework assignments, and by extension serious engineering games, should reflect real-world problems (4/26 participants). Several noted that while this software allowed for dynamic analysis of designs, it was not detailed enough to reflect the actual types of software used by industry that would relate to the hand calculations they perform in class (3/26) and allow for more detailed input of parameters and output of values (5/26). Hence, while the software that the students used actually did include calculations based on real engineering principles, students did not interpret it to be firmly based in theory because it did not allow them to input specific numerical values. In fact, some students indicated that they would rather use other bridge-building software packages, which (unbeknownst to them) did not adhere to all theoretical laws of physics or finite strain theory but included better graphics and 3-dimensional images that looked more representative of actual bridge and truss structures. These other software packages provided numerical feedback, though it was not always rooted in theoretically sound calculations, along with visual images of failed structures. Instead of weights used as loads, actual vehicles and material selections were available, though the material selections did not result in actual changes in structural outcome.
Of the 26 focus group members, only one man affirmatively indicated that he would use the tool because it helped him quickly assess aspects of a bridge or truss structure without having to do calculations during the interview.

Theme 4: Software Presented a New Way of Learning the Material
Four students stated that while the engineering software did not add to their knowledge of statics, playing the game revealed a gap in their coursework: the ability to design one's own structure and examine its outcome when subjected to a mechanical load. These students stated that their present homework did not reflect real-world problems as well as they had thought it would. Also, their homework primarily focused on examining pre-designed structures, and rarely on designing and engineering structures of their own. Hence, for these students, there was value associated with the process of designing structures on their own and testing them.

Conclusions and Future Work
A mixed-method sequential exploratory research design was used to examine how a diverse population of students perceives the usefulness of a casual, intuitive education game designed to teach players about the engineering mechanics and stability needed to build truss structures. The study also focused on elucidating how students with varying experience with gameplay on their phones and computers interpret the value of the serious intuitive engineering game. Elements from the Technology Acceptance Model, Game-Based Learning Theory, and Expectancy Value Theory were incorporated into the pre- and post-questionnaires for this study, in addition to several questions posed to garner student perceptions of game usefulness, which we assert is related to the enhancement of knowledge towards mastery of course content and preparation for a professional engineering job interview.
Our results indicate that students have higher expectations of casual intuitive engineering learning games than of games played for entertainment. In particular, students expect game interfaces and design schema to include explanations of failed or unsuccessful attempts that incorporate technical learning concepts. Game players who played games on their computer achieved higher game challenge levels than students who did not, and these students also indicated higher levels of pleasure while playing the engineering game. There were also differences in how students experienced the game in terms of gender. Men, whether or not they played games on their computer, indicated to a higher degree than women that the game was easy to play. Women indicated higher frustration while playing the game than men, but women who played games on their phone stated that they learned more about truss structures than men did. These findings were supported by focus group and textbox coding analysis, where more women than men indicated a desire for game explanations.
Though the sample sizes of students who did not identify as men or women were small, some preliminary trends were observed. Non-binary students thought that the engineering game helped them understand truss structures more than men and women did. These students also indicated that they liked playing the game, Civil-Build, more than their men and women counterparts.
Few students thought the game would be useful for exam preparation because the game did not directly align with the problems and educational philosophy of the course homework and lecture material. Hence, students expect direct ties between supplemental learning tools and course content, and want educational tools to reflect aspects of real-world engineering problems. The majority of students did not believe the tool would be useful in preparing for a job interview because it did not include tangible numbers and calculations, which were regularly needed for homework assignments. Also, because the game interface did not render real-world visualizations of engineering problems, most thought it was not a good tool for job preparation. Several students thought that the ability to design their own structures and observe their failure in real time was a valuable asset in understanding how truss structures fail.
There are several limitations to the findings of this research. For example, the number of students who identified as "other" or "non-binary" was minuscule; hence, no statistically significant conclusions regarding these groups can be made. The findings of this study would be enhanced with additional participants from diverse backgrounds. Also, the reliability of the method to objectively predict students' perceived ease of use and usefulness would be strengthened with a more comprehensive assessment of other engineering mechanics games. However, very few engineering mechanics games present physical structural responses that are rooted in authentic engineering principles and theory, and as such, this work serves as one of the early studies to explore these experiences.
The role of prior experience playing engineering games and the types of games played should be studied further to understand their relationship to student affinity or appreciation of engineering mechanics games. In addition, analysis of the relationship between students' perceived ease of use and usefulness to students' academic performance in the course is needed.

Institutional Review Board Statement:
The study was conducted in accordance with and approved by the Institutional Review Board of Rutgers, The State University of New Jersey, New Brunswick (protocol code: Pro2018000499, date approved: 8/9/2018).

Informed Consent Statement:
Informed consent was obtained from all subjects involved in the study.

Data Availability Statement:
Data are available on request, but demographic data will be redacted in some instances due to restrictions needed to maintain participant privacy and anonymity.

Conflicts of Interest:
The authors declare no conflict of interest.