Article

Do Teaching Media Matter? A Comparative Study of Finance Education via Classroom, Livestream, Video, and Educational Games

1 Faculty of Economics, University of Rome Tor Vergata, 00133 Rome, Italy
2 Faculty of Applied Social Sciences, Munich University of Applied Sciences, 81243 Munich, Germany
* Author to whom correspondence should be addressed.
Educ. Sci. 2025, 15(8), 1053; https://doi.org/10.3390/educsci15081053
Submission received: 12 July 2025 / Revised: 8 August 2025 / Accepted: 14 August 2025 / Published: 18 August 2025

Abstract

This study examines how different instructional media—face-to-face classes, live streaming, pre-recorded videos, and educational games—affect student learning outcomes in finance education. A sample of first-year economics students was assessed on their knowledge of basic financial principles before being randomly assigned to five groups. Four groups attended the same finance course delivered through different media formats, while a fifth group served as a control and received no instruction. After the course, all students completed a second (post-course) assessment. By comparing individual pre- and post-test results, as well as learning gains across the groups, we evaluated the effectiveness of each delivery method. The results show that all four instructional formats significantly improved financial knowledge compared to the control group. Among the media types, educational games proved to be an effective and reliable tool for delivering finance content. However, the differences in learning gains between face-to-face instruction, live streaming, and pre-recorded videos were not statistically significant. These findings indicate that a range of delivery models can be used effectively in finance education. The study contributes to current debates on cost-effective teaching strategies and supports evidence-based decisions on curriculum design in digitally transformed higher education environments after COVID-19.

1. Introduction

In recent years, interest has grown in the effectiveness of financial education in helping people increase their financial literacy. This interest flourished after several earlier studies demonstrated the positive effect of financial literacy on personal financial decisions. The more financially literate people are, the more likely they are to (1) participate in the stock market (Van Rooij et al., 2011; Almenberg & Dreber, 2015); (2) avoid financial difficulties (Gathergood, 2012; Lusardi & Tufano, 2015; French & McKillop, 2016); (3) prepare adequately for retirement (Bucher-Koenen & Lusardi, 2011; Lusardi & Mitchell, 2011; Sekita, 2011; Van Rooij et al., 2012); and (4) exhibit resilience in the face of financial shocks (De Bassa Scheresberg, 2013; A. Anderson et al., 2017).
Additional evidence from the literature is that the average level of financial literacy among the general population is low and often insufficient to enable individuals to achieve financial well-being (Nicolini & Haupt, 2019). Studies conducted by international organizations (Atkinson & Messy, 2013; OECD, 2020) have emphasized the importance of equipping individuals worldwide, particularly young people, with the knowledge and skills necessary to manage their personal finances effectively.
The combination of financial literacy's positive effect on people's financial decisions and the evidence that financial literacy is, on average, lacking makes financial education increasingly relevant. Numerous studies have demonstrated its potential to improve financial literacy, as evidenced by meta-analyses (Fernandes et al., 2014) and studies utilizing large datasets (Xiao & O'Neill, 2016; Lusardi, 2019). Some extant studies have evaluated financial education's impact in schools (Walstad et al., 2010; Kaiser & Menkhoff, 2020), while others have focused on how it affects adults making financial decisions (Ambuehl et al., 2014; DeHart et al., 2016). More recently, researchers have drawn more robust conclusions through meta-analyses that incorporate multiple studies (Kaiser & Menkhoff, 2020; Kaiser et al., 2022). This compelling evidence has prompted many countries to adopt national strategies to improve financial literacy, including the integration of financial education into school curricula.
This study does not aim to demonstrate that financial education improves financial literacy—an outcome already well-established by extant studies—but rather to assess how variations in the delivery media influence a financial education curriculum’s effectiveness.
The objective of this study is to evaluate the impact of different instructional media—face-to-face classes, live-streaming, pre-recorded videos, and educational games—on student learning outcomes in a financial education context. The central research question is: To what extent does the mode of content delivery affect the effectiveness of a standardized financial education curriculum? We test the following hypotheses:
H1: 
Participation in a financial education program—regardless of delivery medium—leads to significantly improved financial knowledge compared to a control group.
H2: 
There are no statistically significant differences in learning outcomes among students exposed to face-to-face, live-streamed, or video-based instruction.
H3: 
Educational games lead to higher learning gains compared to other delivery media.
Previous studies have reported or summarized findings on individual curricula’s effectiveness; however, curricula can differ in multiple aspects, including target groups (e.g., children, students, adults and workers), topics covered (e.g., basic principles or specific areas, such as investments, loans/debts and retirement planning), duration (e.g., short seminars or comprehensive courses) and delivery media (e.g., face-to-face meetings, online streaming and videos). These variations make it challenging to understand fully how specific factors impact a financial education program’s effectiveness.
By focusing on a single curriculum, it is possible to determine whether a specific combination of target audience, content, duration and delivery media is effective. However, isolating each dimension’s individual contributions to the overall program outcome remains a challenge. To address this issue, this study examines how different delivery media affect the same financial education curriculum’s effectiveness. Materials covering basic financial concepts were designed and delivered through four distinct methods: (1) face-to-face classes; (2) live-streaming classes; (3) pre-recorded videos; and (4) an educational game. Differences in learning outcomes across groups were evaluated and compared with a control group using a difference-in-differences methodology.
The financial education curriculum’s effectiveness was assessed by the increase in the number of correct responses to a set of financial knowledge questions before and after exposure to the curriculum (accounting for the control group’s response rates).
Understanding how different delivery methods impact financial education curricula’s effectiveness can help target specific groups. Furthermore, considering the varying costs associated with these delivery options can enhance efforts to maximize financial education investments’ value. If different delivery methods prove equally effective, this creates opportunities for scaling up programs using cost-effective, scalable solutions (e.g., online content) or blending delivery methods to expand accessibility (e.g., combining face-to-face meetings with online content). Such strategies may otherwise be unfeasible if they rely solely on non-scalable options. These findings also can help reach target groups with personal constraints that make traditional financial education programs impractical or inaccessible. Addressing this study’s research questions can guide financial education professionals in planning new curricula or optimizing existing ones. Furthermore, policymakers can use these insights to support implementation of the most effective and practical delivery options for financial education.
The remainder of the paper is organized as follows: The next section provides a brief literature review that highlights the need for more studies on financial education’s effectiveness while emphasizing the present study’s contribution. Subsequent sections describe the present study’s methodology and data, then gauge the reliability of the analysis. The results then are presented, followed by a final section discussing the study’s key findings and potential policy implications.

2. Literature Review

A significant number of extensive meta-analyses have examined financial education's causal effects on financial knowledge and downstream financial behaviors, such as credit use, budgeting, saving and investment, insurance and remittances. Early studies (Fernandes et al., 2014; Miller et al., 2015), which relied on a relatively small number of extant studies, generated mixed evidence regarding financial education programs' effectiveness. The limited evidence on their effectiveness was attributed partly to challenges that financial education providers faced in accurately evaluating their programs (Fox et al., 2005; Lyons et al., 2006). However, more recent studies (Kaiser & Menkhoff, 2017, 2020; Kaiser et al., 2022) have built on earlier ones by incorporating additional and more recent evidence. Through meta-analyses, these studies have reached a robust conclusion: Financial education positively impacts several desirable outcomes, such as increasing financial knowledge and encouraging positive financial behaviors.
Policy recommendations suggest that financial education programs should target specific audiences (e.g., low-income populations and high school students) and areas of financial activity (e.g., homeownership and credit card counseling) (Walstad et al., 2010; Kalmi & Rahko, 2022). Furthermore, it is recommended that such training occur immediately before a relevant financial event (e.g., purchasing a home or using a credit card). Multi-skill or broad-based programs appear to have certain disadvantages compared with highly targeted ones, as they often are perceived as less relevant, leading attendees to pay less attention and experience reduced motivation (Fernandes et al., 2014).
A recent systematic review and bibliometric analysis by Goyal and Kumar (2020) highlighted that financial education’s impact on improving financial literacy and behavior is one of the three major themes in current financial literacy research, with digital financial education identified as an emerging theme. Building on this, Kalmi and Rahko (2022) investigated the emerging theme of digital financial education, focusing on the effects of three game-based financial education approaches and their combinations, compared with a control group that received traditional teaching.
Recent studies have highlighted the engagement and cost-efficiency advantages of non-traditional teaching methods such as gamification and digital media. Kalmi and Rahko (2022), for example, showed that game-based approaches in school settings significantly improved financial knowledge. Similarly, Kaiser and Menkhoff (2020) emphasized the value of digital media in reaching broader populations without sacrificing instructional quality. These insights suggest that a shift toward flexible, scalable delivery options could enhance the reach and effectiveness of financial education programs.
In the context of financial education, it is important to distinguish between learning outcomes and curriculum (or program) effectiveness. Learning outcomes refer to the specific cognitive or behavioral changes observed in participants—typically measured through knowledge assessments or attitudinal shifts. In contrast, curriculum effectiveness reflects the broader success of a program in achieving its objectives, which includes but is not limited to improvements in knowledge. It also encompasses dimensions such as delivery feasibility, cost-effectiveness, participant engagement, retention rates, and transferability of skills. While most empirical studies focus on quantifying learning outcomes (e.g., knowledge scores), recent work emphasizes the need to evaluate the effectiveness of financial education curricula in light of resource constraints, scalability, and target audience characteristics (Kaiser & Menkhoff, 2020; Fernandes et al., 2014; Goyal & Kumar, 2020). This study contributes to that discussion by analyzing how different delivery media affect learning outcomes and, by extension, inform decisions about curriculum effectiveness.
Gamification and digital learning formats may affect financial education outcomes not only by increasing access and scalability but also through specific pedagogical mechanisms. Educational games often integrate immediate feedback, progressive difficulty levels, and repetition, all of which align with principles of active learning and reinforcement theory (De Freitas & Oliver, 2006; Plass et al., 2015). These features can enhance motivation and sustain attention, especially among younger learners. Similarly, digital platforms such as pre-recorded videos or live streams allow for flexible, self-paced learning and reduce logistical barriers (Goyal & Kumar, 2020). However, some studies highlight that digital learning environments may lack the social and interactive dynamics of face-to-face instruction unless intentionally designed to foster engagement and interaction (Means et al., 2013). Investigating these underlying mechanisms provides a foundation for understanding performance differences across instructional media.
Beyond the debate on digital vs. traditional teaching approaches, Goyal and Kumar (2020) also emphasized the need for policymakers to take action and implement effective financial education programs. Consequently, they outlined future research directions. Among these, they argued in favor of assessing financial education interventions’ appropriateness, with the goal of identifying the least costly programs, emphasizing the cost–benefit aspect of financial education.
This study seeks to contribute to the literature by examining the role of different media in delivering (financial) education and their influence on its overall effectiveness. Four delivery methods—face-to-face classes, live-streaming classes, pre-recorded videos and educational games—are tested to evaluate their impact on a financial education curriculum’s success and to identify differences arising from these various delivery methods.
While there is a growing body of literature on the impact of financial education on knowledge and behavior (e.g., Kaiser & Menkhoff, 2020; Fernandes et al., 2014), there is still limited comparative research that evaluates the same curriculum delivered through different teaching media. Most existing studies assess one delivery mode at a time or combine various factors such as target group, content, and duration, making it difficult to isolate the effect of the instructional format. This study aims to fill this gap by systematically comparing four distinct delivery methods—classroom, livestream, video, and educational game—within a controlled experimental setting using an identical curriculum.

3. Data and Methods

To assess how different media used to deliver content in financial education affect a specific financial education curriculum’s effectiveness, this study’s authors developed an ad hoc curriculum. The course, titled “Money and Its Use”, covers basic principles of money, its functions within a financial system, an analysis of various payment methods (e.g., cash, payment cards and online payments), managing exchange rates and different currencies, and identifying counterfeit banknotes.
The ad hoc curriculum “Money and Its Use” was designed to achieve the following learning outcomes:
- Students will be able to identify the functions of money and distinguish between various forms of payment (e.g., cash, cards, digital payments).
- Students will understand the basics of exchange rates and apply simple conversion principles.
- Students will recognize basic legal rules related to cash use in Italy and the Eurozone (e.g., withdrawal and payment limits).
- Students will be able to identify key institutions related to monetary policy (e.g., the European Central Bank).
- Students will demonstrate the ability to apply practical financial knowledge in decision-making contexts (e.g., currency exchange, counterfeit recognition).
These outcomes were aligned with the multiple-choice questions used in the pre- and post-intervention tests, particularly items 6–10, which were used to construct the financial knowledge index.
The choice of such fundamental topics was intended to maintain attendees’ interest, as money is a subject relevant to daily life. Materials on the same set of topics were prepared for delivery through four methods: (1) face-to-face classes (traditional educational sessions); (2) live-streaming classes; (3) pre-recorded videos; and (4) an educational game. The target audience comprised first-year students enrolled in bachelor’s programs offered by an economics department.
Data collection took place during two consecutive fall semesters, in October 2022 and October 2023. All participants were first-year students enrolled in bachelor’s programs at the Faculty of Economics. After a brief welcome and introduction, all students completed a pre-course questionnaire, administered in person using pencil-and-paper format. Students were then randomly assigned to one of five groups (four treatment groups and one control group). The four treatment groups participated in their assigned instructional activities (face-to-face class, live stream, video, or educational game), all of which lasted approximately three hours. The control group did not receive any instruction but returned for the final phase. After a standardized break, all participants reconvened in the same setting to complete the post-course questionnaire, which was identical in structure and format to the pretest but excluded demographic questions and additional baseline items unrelated to the curriculum (e.g., Lusardi–Mitchell questions on compound interest, inflation and diversification). Data were anonymized using participant codes and digitized for analysis.
Participation was free, with no monetary incentives provided. However, students who fully participated in all program activities were rewarded with credits for ‘extra activities’ included in their bachelor’s program. To minimize selection biases, the program was intentionally designed to be short, requiring only a single day of attendance to complete. Data were collected using pencil-and-paper questionnaires. The activities were first conducted in October 2022 and then repeated with a new cohort of first-year students in October 2023.
First-year students from the Faculty of Economics were selected as the study’s participants for several reasons. First, the research focused on young adults, a demographic that corresponds to first-year students’ typical age group. Second, as newly enrolled students, they had not yet been exposed to any economics or finance courses, minimizing potential bias from prior educational experiences. Finally, their choice to enroll in a faculty of economics indicates a preexisting interest in economics and finance. This reduces the likelihood that financial education’s effectiveness would be hindered by a lack of attention or motivation among participants.
Notably, data drawn from homogeneous groups, such as first-year students from a specific faculty at a single university, can limit the results’ external validity. However, a highly specific sample also can serve as an additional factor in achieving a ceteris paribus scenario, which is crucial for ensuring the findings’ robustness. Furthermore, the study’s primary aim was not to evaluate the financial education’s effectiveness in itself, but rather to assess differences in effectiveness resulting from the delivery of content through various media.
While this study’s participants may exhibit stronger motivation and greater interest in finance than other groups, this potential upward shift in motivation is expected to be evenly distributed across all participants. Consequently, it should not influence the differences between groups or compromise the results’ reliability.
The study’s methodology was based on a difference-in-differences (DiD) approach, in which participants are assigned randomly to two groups: a treatment group, which is exposed to the financial education curriculum, and a control group, which is not exposed to any treatment. Measurements of financial knowledge are taken for both groups before and after the treatment. These measurements are used to test how the treatment influences changes in financial knowledge, accounting for the natural changes that may occur within the control group, which is not exposed to the intervention.
The difference between the groups is determined by comparing the changes in financial knowledge (pre- and post-treatment) across the two groups, forming the basis for the DiD approach. The underlying assumption is that the treatment’s effect cannot simply be measured by the change in the variable for participants in the treatment group over time. Instead, the true impact of financial education must account for the ‘natural improvement’ observed in the control group due to exogenous effects. A visual representation of the DiD methodology is provided in Figure 1.
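In formal terms, the DiD estimate of the educational effect can be written as follows (a minimal restatement of the approach just described; the notation is ours and does not appear in Figure 1):

```latex
\widehat{\delta}_{\mathrm{DiD}}
  = \left(\bar{K}^{\mathrm{treated}}_{\mathrm{post}} - \bar{K}^{\mathrm{treated}}_{\mathrm{pre}}\right)
  - \left(\bar{K}^{\mathrm{control}}_{\mathrm{post}} - \bar{K}^{\mathrm{control}}_{\mathrm{pre}}\right)
```

where \(\bar{K}\) denotes a group's mean financial knowledge measure at a given assessment: the first parenthesis captures the treated group's raw improvement, and the second subtracts the 'natural improvement' observed in the control group.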
The DiD methodology has been applied widely to evaluate educational programs’ efficacy across various fields. H. M. Anderson et al. (2005) reviewed assessment practices in pharmacy education by analyzing papers published in the American Journal of Pharmaceutical Education. Their review, encompassing over 50 studies, highlighted the DiD approach as a standard evaluation methodology. More recently, Barteit et al. (2020) conducted a systematic review to assess e-learning’s effectiveness in medical education, reporting the common and reliable practice of administering pretests and post-tests to both a treated group and a control group. In the financial education context, the DiD methodology is also employed frequently. Borden et al. (2008) used it to evaluate financial education programs for college students, while Walstad et al. (2010) applied it to assess financial education’s effectiveness in high schools. Zhang and Xiong (2020) adopted this approach to study financial education in rural China, and Litterscheidt and Streich (2020) utilized it in their analysis of financial education and digital asset management. Furthermore, Kaiser and Menkhoff (2020), in their meta-analysis, identified 18 other studies that employed similar (quasi-)experimental designs to evaluate financial education’s effectiveness. This widespread adoption of the DiD methodology in the literature underscores its suitability for use in the present study.
The protocol adopted in the present study was as follows: After welcoming the students and providing a brief introduction, all participants completed a questionnaire comprising 10 multiple-choice questions about money (e.g., banknotes and coins, currency exchange and ATM cash withdrawals). Details on the questionnaire are provided later. These questions’ content is aligned with the financial education curriculum proposed in the study.
After the study’s initial phase, students were assigned randomly to five groups. Four groups were invited to engage with the financial education content as follows: (1) attending a face-to-face class on money and its use; (2) attending the same class via live-streaming on their own devices (e.g., laptops and tablets); (3) watching a web series of videos available on YouTube; and (4) playing an educational game. The game was available online, and participants in the fourth group were invited to access it and play for the same amount of time that the other groups spent attending their sessions.
The educational game was designed as a browser-based application that participants accessed on personal devices. It consisted of 15 multiple-choice questions increasing in difficulty. Each correct answer enabled the player to progress to the next level, while an incorrect answer reset the game to Level 1. Importantly, incorrect answers triggered immediate feedback indicating the correct response, accompanied by a concise explanatory note (1–2 sentences) related to the financial concept tested. To promote learning by repetition, questions were drawn from a randomized pool and changed with each new game session, ensuring exposure to varied content. Although the game did not use points, badges, or leaderboards, the challenge of advancing without restarting created an intrinsic reward mechanism through mastery and completion. This design mirrors elements of popular quiz-based formats and incorporates low-stakes repetition as a form of behavioral reinforcement.
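The game mechanics described above can be summarized in a minimal Python sketch (illustrative only, with hypothetical class and function names; this is not the authors' actual application):

```python
import random

class Question:
    """One multiple-choice item in the game's question pool."""
    def __init__(self, prompt, options, correct_index, note):
        self.prompt = prompt                # question text
        self.options = options              # list of answer options
        self.correct_index = correct_index  # index of the correct option
        self.note = note                    # 1-2 sentence explanation shown on errors

def play(question_pool, get_answer, n_levels=15):
    """Loop until the player clears all n_levels in a single run.
    A wrong answer shows the correct option plus the explanatory note and
    resets the player to level 1; each new attempt re-samples the (larger)
    question pool so repeated play exposes varied content."""
    while True:
        attempt = random.sample(question_pool, n_levels)   # fresh random draw per attempt
        for level, q in enumerate(attempt, start=1):
            if get_answer(q) == q.correct_index:
                continue                                   # advance to the next level
            print(f"Level {level}: wrong. Correct answer: "
                  f"{q.options[q.correct_index]}. {q.note}")
            break                                          # reset to level 1
        else:
            return True                                    # all levels cleared
```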
The fifth group served as the control group. While the other groups engaged in their respective activities (e.g., attending class and watching videos) in separate rooms, the control group was asked to leave the general session room and return at the end of the day. Members of the control group were excluded from participating in any of the activities in the (quasi-) experiment, such as attending classes and watching videos. The learning activities—including the face-to-face class, live-streaming class, videos and educational game—were conducted simultaneously as parallel sessions, each lasting approximately three hours. A break of about 90 min was scheduled between the conclusion of the educational activities and the final meeting. During the final meeting, all five groups reconvened to complete a second questionnaire identical to the one administered in the morning.
Regarding the pre- and post-curriculum questionnaires, the first questionnaire (pretest) was administered in the morning and comprised three sections: (1) participants’ sociodemographic characteristics, including age, gender, parental education and high school diploma final grade; (2) 10 multiple-choice questions on financial knowledge focusing on money and its use; and (3) three standard financial literacy questions on compound interest, inflation and risk diversification (commonly referred to as the Lusardi–Mitchell questions), along with two additional questions on mortgages and bond pricing. The second questionnaire (post-test) comprised just 10 multiple-choice questions and each respondent’s identification code.
Descriptive statistics for the sociodemographic variables are presented in Table 1.
The overall sample was balanced between males (49.86%) and females (50.14%), with no significant differences in gender distribution across the groups. Approximately one in four participants (23.86%) graduated from high school with a top grade. The ratio between participants from families in which at least one parent holds a degree and those who are first-generation college attendees was nearly one-to-one. Balance tests conducted on these variables revealed no statistically significant differences between groups, ensuring the DiD methodology’s validity. Variables related to employment were not included, as all participants were college students. Similarly, no data on age were considered due to the very limited variation (98% of the participants were born between 2002 and 2004).
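The balance checks reported in Table 1 can be reproduced with a routine along these lines (an illustrative sketch with hypothetical column names, not the authors' code; it uses Welch's two-sample t-test, while the exact test variant used in the paper is not specified):

```python
import pandas as pd
from scipy import stats

def balance_table(df, covariates=("male", "topgrade", "parentgrad"),
                  group_col="group", control="control"):
    """Compare each treatment group with the control group on each covariate."""
    ctrl = df[df[group_col] == control]
    rows = []
    for g in df[group_col].unique():
        if g == control:
            continue
        treat = df[df[group_col] == g]
        for var in covariates:
            _, p = stats.ttest_ind(treat[var], ctrl[var], equal_var=False)
            rows.append({"group": g, "variable": var,
                         "mean": treat[var].mean(), "obs": len(treat), "p_value": p})
    return pd.DataFrame(rows)
```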
The financial knowledge questions used in the study were aligned with the topics included in the educational materials (e.g., face-to-face classes, videos and the game). The list included very basic questions, such as identifying the color of a EUR 20 bill or naming countries that do not use the euro as their local currency, as well as more technical questions, such as those on bid–ask quotations for exchange rates and limits on coin-based payments. These questions’ perceived difficulty was reflected in correct-response rates, as presented in Table 2.
The Lusardi–Mitchell questions on compound interest, inflation and diversification (often referred to as the Big Three) (Lusardi & Mitchell, 2011), along with additional questions on mortgages and bond pricing, were included to provide a baseline for comparison with previous financial literacy studies, which have adopted these questions widely for assessment purposes. However, as the topics covered by these questions were not part of the educational materials in this study, they were excluded from the post-treatment questionnaire and were not used to evaluate the delivery media’s effectiveness in financial education. Correct response rates for these five items are presented in Table 3.
The average correct response rate (3.12 out of 5) was higher than that of the general population, which can be attributed to the participants’ young ages and their status as students in a faculty of economics. The 10 multiple-choice questions on financial knowledge were analyzed to construct a single scale. A preliminary analysis of the correlations between responses to the 10 items (see Table 4) suggests minimal overlap between the items in both the pretest and post-test.
The option to sum up the number of correct answers to all 10 items as a financial knowledge scale was rejected due to that index’s low reliability (Cronbach’s alpha = 0.6546). The first five questions exhibited very high correct response rates, making them ineffective at differentiating between individuals or between pretest and post-test results. Consequently, the remaining five items (items 6–10) were used to construct a financial knowledge index based on the number of correct answers to these questions. This scale’s reliability is supported by a Cronbach’s alpha of 0.7176 (see Table 5). Notably, all the topics addressed by the 10 items were covered in the educational materials used in the study’s treatments.
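The reliability figures quoted above follow the standard Cronbach's alpha formula, alpha = k/(k-1) * (1 - sum of item variances / variance of the summed scale). A compact sketch of the computation (assuming a hypothetical data frame with one 0/1 column per item) is:

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """items: one column per test item, one row per respondent (1 = correct, 0 = wrong)."""
    k = items.shape[1]                         # number of items
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# cronbach_alpha(all_ten_item_columns)    # about 0.65 for all 10 items in this study
# cronbach_alpha(items_six_to_ten_columns)  # about 0.72 for items 6-10
```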
The financial knowledge index, calculated as the sum of correct answers to five questions (questions 6–10), ranged from zero to five. The average pretest scores varied across groups: 1.22 for the live-streaming group; 1.24 for the game group; 1.29 for the control group; 1.41 for the video group; and 1.44 for the class group (see Table 6).
The financial knowledge index values in the post-test increased for all groups, including the control group (1.92). The game group achieved the highest average post-test index value (4.3), followed by the class group (3.95), the live-streaming group (3.86) and the video group (3.79). The difference between financial knowledge scores before and after exposure to the educational treatments was statistically significant across all groups. However, evaluating the financial education curriculum’s effectiveness required accounting for the natural improvement observed in the control group by applying the DiD methodology described earlier. Hence, what assesses the education’s effectiveness is not the pre- to post-curriculum change in a single group’s financial knowledge, but rather the change, between the pre- and post-assessment, in the gap between the financial knowledge of a group that attended the curriculum and that of the control group (which did not), as this isolates the educational effect on financial knowledge. Each of the four groups that received financial education was compared with the control group to test whether the curriculum works when its contents are delivered through a certain medium. Moreover, comparing groups that received the same education via different media allowed for an assessment of the differences between delivery options.
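As a back-of-the-envelope illustration using the raw group means just reported (the formal estimates in the next section come from an ordered logistic model rather than simple mean differences), the DiD learning gains relative to the control group are:

```latex
\begin{aligned}
\text{Game:}      &\quad (4.30 - 1.24) - (1.92 - 1.29) = 3.06 - 0.63 = 2.43 \\
\text{Class:}     &\quad (3.95 - 1.44) - (1.92 - 1.29) = 2.51 - 0.63 = 1.88 \\
\text{Streaming:} &\quad (3.86 - 1.22) - (1.92 - 1.29) = 2.64 - 0.63 = 2.01 \\
\text{Video:}     &\quad (3.79 - 1.41) - (1.92 - 1.29) = 2.38 - 0.63 = 1.75
\end{aligned}
```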

4. Results and Discussion

Financial education’s potential impact on participants’ financial knowledge was assessed using an ordered logistic regression model, with the financial knowledge index as the dependent variable. Independent variables included dummy variables for each participant group (class, live-streaming, video, game and control) and interaction terms that captured the marginal effects of receiving financial education via a specific medium compared with not receiving any education (control group). These interaction terms were defined by the interaction of (a) being part of a treatment group and (b) being in the post-treatment period, thereby identifying the DiD effect on financial knowledge.
Control variables included gender (male or female), parental education (whether at least one parent is a graduate, denoted as Parentgrad) and academic performance (whether the participant graduated high school with top grades, denoted as Topgrade). Results from this initial phase of the empirical analysis are presented in Table 7.
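The specification described above can be reproduced along the following lines (an illustrative sketch with hypothetical variable names, using statsmodels' ordered logit; the authors' exact estimation code is not reported):

```python
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

def fit_did_ordered_logit(df):
    """df: long format, one row per participant per assessment, with columns
    'fk_index' (0-5 knowledge index), 'group', 'post' (0 = pretest, 1 = post-test),
    and the controls 'male', 'parentgrad', 'topgrade' (all hypothetical names)."""
    treatments = ["class", "streaming", "video", "game"]      # control group is omitted
    X = pd.DataFrame(index=df.index)
    X["post"] = df["post"].astype(int)
    for g in treatments:
        X[g] = (df["group"] == g).astype(int)                 # group dummies
        X[g + "_x_post"] = X[g] * X["post"]                   # DiD interaction terms
    for c in ["male", "parentgrad", "topgrade"]:              # covariates from Table 1
        X[c] = df[c].astype(int)
    y = pd.Series(pd.Categorical(df["fk_index"], ordered=True), index=df.index)
    model = OrderedModel(y, X, distr="logit")                 # thresholds replace the intercept
    return model.fit(method="bfgs", disp=False)

# res = fit_did_ordered_logit(long_df)
# print(res.summary())   # interaction coefficients analogous to Table 7
```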
The positive coefficients for all the interaction terms (“medium × post-treatment time”) suggest that the financial education curriculum used in this study effectively increases participants’ financial knowledge. For each content delivery medium employed in the study, the positive sign of the coefficients and their statistical significance at the 1% level support the hypothesis that the financial knowledge index increases more during the post-treatment period than the pretreatment period for those exposed to the financial education curriculum compared with the control group. Furthermore, no statistically significant differences were found for the control variables (being male, having at least one parent with a degree and graduating high school with top grades).
The study aimed not only to demonstrate financial education’s effectiveness but also to evaluate its relative effectiveness across different content delivery media. A test of the statistical differences between the coefficients for each medium provides insights into the research question (see Table 8).
The lack of statistically significant differences between the coefficients for class vs. live-streaming, class vs. videos, and live-streaming vs. videos suggests that the educational program’s effectiveness does not differ materially when one of these delivery methods is chosen over another.
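The pairwise comparisons in Table 8 amount to Wald-type tests of equality between two interaction coefficients. Continuing the illustrative model sketched above (names hypothetical), such a test can be computed directly from the estimates and their covariance matrix:

```python
import numpy as np
from scipy import stats

def compare_coefficients(res, name_a, name_b):
    """Two-sided z-test of H0: beta_a = beta_b, using a fitted model's parameter
    estimates and covariance matrix (pandas-labelled results assumed)."""
    params, cov = res.params, res.cov_params()
    diff = params[name_a] - params[name_b]
    se = np.sqrt(cov.loc[name_a, name_a] + cov.loc[name_b, name_b]
                 - 2 * cov.loc[name_a, name_b])
    z = diff / se
    return z, 2 * (1 - stats.norm.cdf(abs(z)))   # z statistic and two-sided p-value

# z, p = compare_coefficients(res, "class_x_post", "streaming_x_post")
```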
An item-level review of learning outcomes reveals that certain topics posed more difficulty for participants, regardless of instructional format. For instance, the legal limit on cash payments by coins (item 10) and the handling of truncated banknotes (item 8) showed relatively low pre- and post-intervention scores in most groups. These findings suggest that legal–technical knowledge, which is often abstract or counterintuitive, may be harder to convey through brief interventions. Conversely, questions involving currency conversion (item 5) and bid–ask logic (item 7) saw greater improvement in groups using more interactive formats (particularly the game group), suggesting that immediate feedback and repetition may enhance the acquisition of applied procedural knowledge. This aligns with findings from Kalmi and Rahko (2022), who showed that game-based formats improved learning in rule-based financial contexts. Similarly, Goyal and Kumar (2020) emphasize that feedback-rich environments can improve retention in digital financial education.
These patterns point to the importance of matching delivery media to content type. While face-to-face and live-stream formats perform well for broad conceptual understanding, game-based formats may be particularly effective for reinforcing procedural or rule-based learning through repetition and challenge. Such differentiation supports the idea of blended approaches in curriculum design, as also discussed by Kaiser and Menkhoff (2020).
The study yields three key findings. First, changing the media used to deliver educational content does not jeopardize the program’s effectiveness. People learn from attending courses regardless of whether a program is based on a face-to-face meeting, a live-streaming class, or a set of pre-recorded videos; even playing an educational game helps people improve their financial knowledge. Second, a comparison of different content delivery options shows that, except for the educational game, the learning outcome does not change substantially across delivery options. From these two results it follows that more than one content delivery option is effective, and education does not have to remain face-to-face to be effective. The digitalization of education, as in the case of MOOCs (massive open online courses), requires considering new distribution channels for education, and this study supports the hypothesis that no real differences in a program’s educational outcomes can be attributed to the content delivery media used for its distribution. For example, online financial education can be used to provide content to target recipients in remote areas or those less likely to attend an on-site program. Furthermore, a blended curriculum combining on-site meetings and online content appears to be a viable option that does not diminish the program’s effectiveness.
The third piece of evidence from the study is the potential to stimulate a learning process by using educational games. The learning-by-doing hypothesis behind the use of educational games seems to be confirmed by the present study: participants in the game group increased their financial knowledge even more than the other groups. The chance that people are more likely to engage with a game-based educational experience than with a classic class-based experience makes the positive outcome for the educational game group even more interesting, because such games have the potential to reach a wider target audience than those willing to invest their time in attending classes (onsite or online). A key advantage of educational games, particularly when available online or downloadable on local devices, is their ability to utilize spare time, such as while waiting in line or traveling, for learning. This approach helps transform time that might otherwise be wasted into an opportunity for financial education, particularly for individuals unwilling or unable to attend classes.
These findings are in line with previous research demonstrating that digital delivery formats—such as pre-recorded video and live-streaming—can be as effective as traditional classroom instruction in transmitting financial knowledge (Kaiser & Menkhoff, 2020; Goyal & Kumar, 2020). The absence of significant differences between these delivery methods suggests that educational outcomes may depend more on the content and structure of the curriculum than on the medium of instruction itself.
Furthermore, the particularly strong performance of the educational game group aligns with emerging evidence that gamified learning environments can enhance motivation, engagement, and knowledge retention (Kalmi & Rahko, 2022). The immediate feedback and repetition structure built into our game likely contributed to the observed learning gains. These findings support the notion that interactive and self-paced formats may be especially effective in financial education, particularly for younger learners or those with limited time for formal instruction.
The superior performance of the game-based group may be partly explained by principles drawn from educational psychology. From a constructivist perspective, learners build knowledge actively through experiences, especially when they are personally meaningful or contextually engaging. The game environment provided learners with active tasks, goal-oriented challenges, and opportunities for self-correction, which align well with constructivist learning principles that emphasize the active role of the learner in building knowledge, often through interaction, exploration, and problem-solving (Piaget, 1962; Bruner, 1966). Additionally, cognitive load theory (Sweller, 1988) suggests that instructional design should avoid overwhelming the learner’s working memory. The game’s progressive difficulty and immediate feedback likely helped distribute cognitive load effectively by breaking down complex tasks into smaller, manageable chunks. Furthermore, the immersive and interactive nature of the game may have promoted deeper engagement and intrinsic motivation (Deci & Ryan, 1985), contributing to better knowledge retention.

5. Conclusions

The study aimed to evaluate the role of various media used to deliver content in financial education and their impact on a financial education curriculum’s effectiveness. The program content was delivered through four media: face-to-face classes; live-streaming classes; videos; and an educational game. Using a DiD approach, changes in participants’ financial knowledge before and after financial education were compared with changes observed in a control group. The results demonstrated that different media can be used in financial education without significant differences in the curriculum’s overall effectiveness. Notably, the educational game appeared to yield a greater increase in financial knowledge than the other treatments (class, live-streaming and video).
The study has limitations, such as the small sample size and the participants’ specific nature (first-year students from a faculty of economics), thereby restricting the results’ generalizability to the broader population. These limitations suggest the need for more investigation before definitive policy recommendations can be made. However, the authors believe the findings are encouraging and provide valuable insights for future studies on factors that contribute to financial education’s success.
If future studies support these results, the ability to deliver financial education effectively through various media, including remote-access options such as live-streaming and pre-recorded videos, could enhance access to financial education while reducing its costs compared with programs that rely solely on face-to-face meetings. This could benefit individuals, particularly in remote areas, who might otherwise lack access to such education. Furthermore, designing blended programs that combine multiple delivery media within the same curriculum or replicate entire curricula across different media presents promising opportunities.
The evidence that unconventional learning options, such as educational games, can produce outcomes comparable with—or even better than—traditional teaching methods opens the door to significantly different approaches in financial education. Targeted groups that may be reluctant to participate in traditional financial education programs or unwilling to commit substantial time to standard curricula might be more inclined toward engaging with apps available 24/7 on smartphones and tablets.
The findings of this study carry important and timely implications for various stakeholders in higher education. For educators and curriculum designers, the evidence suggests that financial education can be effectively delivered through multiple formats—including face-to-face, livestreaming, pre-recorded video, and game-based learning—enabling more flexible, inclusive, and scalable course planning. University administrators may draw on these insights when allocating resources for digital transformation, as the results show that digital formats can yield outcomes comparable to traditional classroom instruction. For policymakers and financial literacy advocates, the strong performance of the game-based group points to the value of investing in interactive, self-paced, and learner-driven tools that can engage broader segments of the population, including those less likely to attend conventional programs. Moreover, game-based formats show potential for boosting motivation and retention through immediate feedback and repetition, especially among younger learners. Blended approaches that combine structured instruction with technology-enhanced components offer promising pathways to increase engagement and expand access—particularly in remote or resource-constrained contexts. These insights are especially relevant in the context of evolving higher education models that emphasize personalization, flexibility, and lifelong learning. Overall, this research contributes to a broader understanding of how higher education can respond to future challenges by adopting empirically grounded, media-diverse, and cost-effective instructional designs.

Author Contributions

Conceptualization, G.N.; methodology, G.N. and M.H.; formal analysis, G.N.; data curation, G.N.; writing—original draft preparation, G.N. and M.H.; writing—review and editing, G.N. and M.H.; visualization, G.N. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

According to the national and institutional regulations, this type of study—based on anonymous, voluntary participation in an educational setting without the collection of sensitive personal data—does not require approval by an ethics committee.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study. Participants were informed prior to participation that the data would be collected anonymously and used exclusively for research purposes, in full compliance with applicable privacy regulations.

Data Availability Statement

The study described in the manuscript involved the anonymous collection of data from adult university students who voluntarily participated in a brief, non-invasive educational activity. The purpose was exclusively educational, with no collection of sensitive personal information or medical/biological data. The raw data supporting the conclusions of this article will be made available by the authors on request.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Almenberg, J., & Dreber, A. (2015). Gender, stock market participation and financial literacy. Economics Letters, 137, 140–142. [Google Scholar] [CrossRef]
  2. Ambuehl, S., Bernheim, B. D., & Lusardi, A. (2014). Effect of financial education on the quality of decision making (pp. 1–46). NBER Working Paper No. 20618. National Bureau of Economic Research. [Google Scholar] [CrossRef]
  3. Anderson, A., Baker, F., & Robinson, D. T. (2017). Precautionary savings, retirement planning and misperceptions of financial literacy. Journal of Financial Economics, 126(3), 383–398. [Google Scholar] [CrossRef]
  4. Anderson, H. M., Anaya, G., Bird, E., & Moore, D. (2005). A review of educational assessment. American Journal of Pharmaceutical Education, 69(1), 12. [Google Scholar] [CrossRef]
  5. Atkinson, A., & Messy, F. A. (2013). Promoting financial inclusion through financial education: OECD/INFE evidence, policies and practice (OECD Working Papers on Finance, Insurance and Private Pensions, No. 34). OECD Publishing. [Google Scholar] [CrossRef]
  6. Barteit, S., Guzek, D., Jahn, A., Bärnighausen, T., Mendes Jorge, M., & Neuhann, F. (2020). Evaluation of e-learning for medical education in low- and middle-income countries: A systematic review. Computers & Education, 145, 103726. [Google Scholar] [CrossRef]
  7. Borden, L. M., Lee, S. A., Serido, J., & Collins, D. (2008). Changing college students’ financial knowledge, attitudes, and behavior through seminar participation. Journal of Family and Economic Issues, 29(1), 23–40. [Google Scholar] [CrossRef]
  8. Bruner, J. (1966). Toward a theory of instruction. Harvard University Press. [Google Scholar]
  9. Bucher-Koenen, T., & Lusardi, A. (2011). Financial literacy and retirement planning in Germany. Journal of Pension Economics & Finance, 10(4), 565–584. [Google Scholar] [CrossRef]
  10. De Bassa Scheresberg, C. (2013). Financial literacy and financial behavior among young adults: Evidence and implications. Numeracy, 6(2), 5. [Google Scholar] [CrossRef]
  11. Deci, E. L., & Ryan, R. M. (1985). Intrinsic motivation and self-determination in human behavior. Plenum Press. [Google Scholar]
  12. De Freitas, S., & Oliver, M. (2006). How can exploratory learning with games and simulations within the curriculum be most effectively evaluated? Computers & Education, 46(3), 249–264. [Google Scholar] [CrossRef]
  13. DeHart, W. B., Friedel, J. E., Lown, J. M., & Odum, A. L. (2016). The effects of financial education on impulsive decision making. PLoS ONE, 11(7), e0159561. [Google Scholar] [CrossRef] [PubMed]
  14. Fernandes, D., Lynch, J. G., Jr., & Netemeyer, R. G. (2014). Financial literacy, financial education, and downstream financial behaviors. Management Science, 60(8), 1861–1883. [Google Scholar] [CrossRef]
  15. Fox, J., Bartholomae, S., & Lee, J. (2005). Building the case for financial education. The Journal of Consumer Affairs, 39(1), 195–214. [Google Scholar] [CrossRef]
  16. French, D., & McKillop, D. (2016). Financial literacy and over-indebtedness in low-income households. International Review of Financial Analysis, 48, 1–11. [Google Scholar] [CrossRef]
  17. Gathergood, J. (2012). Self-control, financial literacy and consumer over-indebtedness. Journal of Economic Psychology, 33(3), 590–602. [Google Scholar] [CrossRef]
  18. Goyal, K., & Kumar, S. (2020). Financial literacy: A systematic review and bibliometric analysis. International Journal of Consumer Studies, 45(1), 80–105. [Google Scholar] [CrossRef]
  19. Kaiser, T., Lusardi, A., Menkhoff, L., & Urban, C. (2022). Financial education affects financial knowledge and downstream behaviors. Journal of Financial Economics, 145(2), 255–272. [Google Scholar] [CrossRef]
  20. Kaiser, T., & Menkhoff, L. (2017). Does financial education impact financial literacy and financial behavior, and if so, when? The World Bank Economic Review, 31(3), 611–630. [Google Scholar] [CrossRef]
  21. Kaiser, T., & Menkhoff, L. (2020). Financial education in schools: A meta-analysis of experimental studies. Economics of Education Review, 78, 101930. [Google Scholar] [CrossRef]
  22. Kalmi, P., & Rahko, J. (2022). The effects of game-based financial education: New survey evidence from lower-secondary school students in Finland. The Journal of Economic Education, 53(2), 109–125. [Google Scholar] [CrossRef]
  23. Litterscheidt, R., & Streich, D. (2020). Financial education and digital asset management: What’s in the black box? Journal of Behavioral and Experimental Economics, 87, 101573. [Google Scholar] [CrossRef]
  24. Lusardi, A. (2019). Financial literacy and the need for financial education: Evidence and implications. Swiss Journal of Economics and Statistics, 155(1), 1–8. [Google Scholar] [CrossRef]
  25. Lusardi, A., & Mitchell, O. S. (2011). Financial literacy and retirement planning in the United States. Journal of Pension Economics & Finance, 10(4), 509–525. [Google Scholar] [CrossRef]
  26. Lusardi, A., & Tufano, P. (2015). Debt literacy, financial experiences, and overindebtedness. Journal of Pension Economics & Finance, 14(4), 332–368. [Google Scholar] [CrossRef]
  27. Lyons, A. C., Palmer, L., Jayaratne, K., & Scherpf, E. (2006). Are we making the grade? A national overview of financial education and program evaluation. The Journal of Consumer Affairs, 40(2), 208–235. [Google Scholar] [CrossRef]
  28. Means, B., Toyama, Y., Murphy, R., & Baki, M. (2013). The effectiveness of online and blended learning: A meta-analysis of the empirical literature. Teachers College Record, 115(3), 1–47. [Google Scholar] [CrossRef]
  29. Miller, M., Reichelstein, J., Salas, C., & Zia, B. (2015). Can you help someone become financially capable? A meta-analysis of the literature. The World Bank Research Observer, 30(2), 220–246. [Google Scholar] [CrossRef]
  30. Nicolini, G., & Haupt, M. (2019). The assessment of financial literacy: New evidence from Europe. International Journal of Financial Studies, 7(3), 54. [Google Scholar] [CrossRef]
  31. OECD. (2020). OECD/INFE 2020 international survey of adult financial literacy. Available online: https://web-archive.oecd.org/2022-08-09/555847-launchoftheoecdinfeglobalfinancialliteracysurveyreport.htm (accessed on 1 July 2024).
  32. Piaget, J. (1962). Play, dreams and imitation in childhood. W. W. Norton. [Google Scholar]
  33. Plass, J. L., Homer, B. D., & Kinzer, C. K. (2015). Foundations of game-based learning. Educational Psychologist, 50(4), 258–283. [Google Scholar] [CrossRef]
  34. Sekita, S. (2011). Financial literacy and retirement planning in Japan. Journal of Pension Economics & Finance, 10(4), 637–656. [Google Scholar]
  35. Sweller, J. (1988). Cognitive load during problem solving: Effects on learning. Cognitive Science, 12(2), 257–285. [Google Scholar] [CrossRef]
  36. Van Rooij, M., Lusardi, A., & Alessie, R. (2011). Financial literacy and stock market participation. Journal of Financial Economics, 101(2), 449–472. [Google Scholar] [CrossRef]
  37. Van Rooij, M., Lusardi, A., & Alessie, R. (2012). Financial literacy, retirement planning and household wealth. The Economic Journal, 122(560), 449–478. [Google Scholar] [CrossRef]
  38. Walstad, W. B., Rebeck, K., & MacDonald, R. A. (2010). The effects of financial education on the financial knowledge of high school students. The Journal of Consumer Affairs, 44(2), 336–357. [Google Scholar] [CrossRef]
  39. Xiao, J. J., & O’Neill, B. (2016). Consumer financial education and financial capability. International Journal of Consumer Studies, 40(6), 712–721. [Google Scholar] [CrossRef]
  40. Zhang, H., & Xiong, X. (2020). Is financial education an effective means to improve financial literacy? Evidence from rural China. Agricultural Finance Review, 80(3), 305–320. [Google Scholar] [CrossRef]
Figure 1. Difference-in-differences methodology.
Table 1. Socio-demographic characteristics of the sample.

Male (1 = Male)
| Group | Mean | St. Dev. | Obs. | p-value (t-test of difference vs. control group) |
| Class | 0.53217 | 0.47889 | 104 | 0.9537 |
| Streaming | 0.47170 | 0.50157 | 106 | 0.4141 |
| Video | 0.45509 | 0.49508 | 106 | 0.2878 |
| Game | 0.50820 | 0.50199 | 102 | 0.7742 |
| (Control) | 0.52824 | 0.49458 | 102 | |
| Total | 0.49866 | 0.49442 | 520 | |

Topgrade (1 = Participant completed high school with a top grade)
| Group | Mean | St. Dev. | Obs. | p-value (t-test of difference vs. control group) |
| Class | 0.27660 | 0.44971 | 104 | 0.3243 |
| Streaming | 0.28302 | 0.45261 | 106 | 0.2644 |
| Video | 0.25075 | 0.46898 | 106 | 0.5684 |
| Game | 0.26393 | 0.37174 | 102 | 0.3819 |
| (Control) | 0.21569 | 0.41333 | 102 | |
| Total | 0.25821 | 0.43180 | 520 | |

Parentgrad (1 = At least one parent graduated)
| Group | Mean | St. Dev. | Obs. | p-value (t-test of difference vs. control group) |
| Class | 0.43783 | 0.47889 | 104 | 0.6321 |
| Streaming | 0.47170 | 0.50157 | 106 | 0.9873 |
| Video | 0.50943 | 0.50229 | 106 | 0.5775 |
| Game | 0.55000 | 0.49195 | 102 | 0.2550 |
| (Control) | 0.47059 | 0.50160 | 102 | |
| Total | 0.48776 | 0.49530 | 520 | |
Table 2. Financial knowledge items. For each item, rows report the number of respondents choosing each answer option in the pretest and post-test of each group.

(1) Which background color is the EUR 20 bill?
| Answer | Class pre | Class post | Streaming pre | Streaming post | Video pre | Video post | Game pre | Game post | Control pre | Control post |
| Gray | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 |
| Pink/Red | 3 | 2 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 |
| Blue (correct) | 98 | 102 | 106 | 106 | 102 | 106 | 101 | 102 | 102 | 102 |
| Orange | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| (Do not know) | 3 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 |
| Obs. | 104 | 104 | 106 | 106 | 106 | 106 | 102 | 102 | 102 | 102 |
| % correct answers | 94.2% | 98.1% | 100.0% | 100.0% | 96.2% | 100.0% | 99.0% | 100.0% | 100.0% | 100.0% |

(2) Who is the issuer of euros (banknotes and coins)?
| Answer | Class pre | Class post | Streaming pre | Streaming post | Video pre | Video post | Game pre | Game post | Control pre | Control post |
| Minister of Economics and Finance | 5 | 2 | 4 | 4 | 2 | 0 | 6 | 6 | 4 | 10 |
| European Central Bank (ECB) (correct) | 90 | 102 | 98 | 102 | 102 | 106 | 94 | 94 | 98 | 90 |
| Parliament | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Government | 2 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| (Do not know) | 4 | 0 | 2 | 0 | 2 | 0 | 2 | 2 | 0 | 2 |
| Obs. | 104 | 104 | 106 | 106 | 106 | 106 | 102 | 102 | 102 | 102 |
| % correct answers | 86.5% | 98.1% | 92.5% | 96.2% | 96.2% | 100.0% | 92.2% | 92.2% | 96.1% | 88.2% |

(3) When did the switch from the Italian lira to the euro happen?
| Answer | Class pre | Class post | Streaming pre | Streaming post | Video pre | Video post | Game pre | Game post | Control pre | Control post |
| Around 5 years ago | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Around 10 years ago | 0 | 0 | 2 | 2 | 2 | 6 | 0 | 0 | 0 | 0 |
| More than 15 years ago (correct) | 97 | 100 | 104 | 102 | 102 | 100 | 100 | 100 | 100 | 102 |
| Less than 5 years ago | 3 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| (Do not know) | 4 | 0 | 0 | 2 | 2 | 0 | 2 | 2 | 2 | 0 |
| Obs. | 104 | 104 | 106 | 106 | 106 | 106 | 102 | 102 | 102 | 102 |
| % correct answers | 93.3% | 96.2% | 98.1% | 96.2% | 96.2% | 94.3% | 98.0% | 98.0% | 98.0% | 100.0% |

(4) Which of the following countries does NOT use the euro as their local currency?
| Answer | Class pre | Class post | Streaming pre | Streaming post | Video pre | Video post | Game pre | Game post | Control pre | Control post |
| France | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Germany | 5 | 2 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 |
| Spain | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| USA (correct) | 96 | 102 | 106 | 106 | 102 | 106 | 102 | 102 | 102 | 102 |
| (Do not know) | 3 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 |
| Obs. | 104 | 104 | 106 | 106 | 106 | 106 | 102 | 102 | 102 | 102 |
| % correct answers | 92.3% | 98.1% | 100.0% | 100.0% | 96.2% | 100.0% | 100.0% | 100.0% | 100.0% | 100.0% |

(5) If the euro–dollar exchange rate is 1.20, how many dollars do you take for exchanging EUR 100?
| Answer | Class pre | Class post | Streaming pre | Streaming post | Video pre | Video post | Game pre | Game post | Control pre | Control post |
| USD 80 | 5 | 5 | 2 | 2 | 4 | 2 | 4 | 0 | 2 | 4 |
| USD 120 (correct) | 67 | 83 | 82 | 88 | 66 | 86 | 74 | 92 | 82 | 74 |
| USD 83.33 | 19 | 16 | 18 | 14 | 20 | 18 | 24 | 8 | 14 | 22 |
| USD 1200 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| (Do not know) | 10 | 0 | 4 | 2 | 16 | 0 | 0 | 2 | 4 | 2 |
| Obs. | 104 | 104 | 106 | 106 | 106 | 106 | 102 | 102 | 102 | 102 |
| % correct answers | 64.4% | 79.8% | 77.4% | 83.0% | 62.3% | 81.1% | 72.5% | 90.2% | 80.4% | 72.5% |

(6) What is the maximum amount (by law) of cash you can withdraw from an ATM in a month?
| Answer | Class pre | Class post | Streaming pre | Streaming post | Video pre | Video post | Game pre | Game post | Control pre | Control post |
| EUR 10,000 | 27 | 24 | 28 | 38 | 22 | 34 | 25 | 6 | 22 | 38 |
| EUR 5000 | 29 | 12 | 32 | 4 | 22 | 6 | 27 | 3 | 18 | 12 |
| EUR 2500 | 8 | 2 | 10 | 4 | 16 | 0 | 15 | 6 | 16 | 12 |
| There is no limit by the law (correct) | 6 | 66 | 18 | 60 | 26 | 64 | 4 | 86 | 18 | 26 |
| (Do not know) | 34 | 0 | 18 | 0 | 20 | 2 | 31 | 1 | 28 | 14 |
| Obs. | 104 | 104 | 106 | 106 | 106 | 106 | 102 | 102 | 102 | 102 |
| % correct answers | 5.8% | 63.5% | 17.0% | 56.6% | 24.5% | 60.4% | 3.9% | 84.3% | 17.6% | 25.5% |

(7) You are coming back from the US. Shopping around in the airport, you decide to buy an item that you can pay for either by euro or US dollars. Suppose you still have dollars, and you can exchange them at Bid = 1.10 and Ask = 1.40 in a currency kiosk. Is it more convenient to pay in store with EUR 100 or USD 120?
| Answer | Class pre | Class post | Streaming pre | Streaming post | Video pre | Video post | Game pre | Game post | Control pre | Control post |
| It is better to pay in euros | 11 | 16 | 8 | 8 | 14 | 16 | 17 | 4 | 12 | 8 |
| It is better to pay in US dollars (correct) | 32 | 65 | 18 | 70 | 22 | 82 | 31 | 92 | 16 | 38 |
| Because the average between Bid and Ask is 1.20, paying in-store in euro or dollar is the same | 21 | 4 | 14 | 6 | 10 | 2 | 6 | 0 | 6 | 8 |
| There is not enough information to answer for sure | 2 | 2 | 4 | 0 | 6 | 0 | 0 | 0 | 12 | 10 |
| (Do not know) | 38 | 17 | 62 | 22 | 54 | 6 | 48 | 6 | 56 | 38 |
| Obs. | 104 | 104 | 106 | 106 | 106 | 106 | 102 | 102 | 102 | 102 |
| % correct answers | 30.8% | 62.5% | 17.0% | 66.0% | 20.8% | 77.4% | 30.4% | 90.2% | 15.7% | 37.3% |

(8) You have found banknotes that are “trunked” by the 60% (60% of the banknote is missing). If you go to the Bank of Italy (central bank)
| Answer | Class pre | Class post | Streaming pre | Streaming post | Video pre | Video post | Game pre | Game post | Control pre | Control post |
| You can still replace these banknotes with new ones | 17 | 7 | 14 | 2 | 6 | 8 | 11 | 0 | 10 | 8 |
| You will receive new banknotes equal to 40% of the original full value | 2 | 0 | 2 | 2 | 6 | 0 | 4 | 0 | 2 | 4 |
| You receive nothing because the truncation is beyond 50%, and you receive your banknotes back (correct) | 32 | 85 | 14 | 82 | 22 | 78 | 21 | 98 | 22 | 34 |
| Your banknotes will be retained by the central bank, and you will receive your banknotes back | 15 | 10 | 26 | 18 | 28 | 20 | 33 | 2 | 22 | 22 |
| (Do not know) | 38 | 2 | 50 | 2 | 44 | 0 | 33 | 2 | 46 | 34 |
| Obs. | 104 | 104 | 106 | 106 | 106 | 106 | 102 | 102 | 102 | 102 |
| % correct answers | 30.8% | 81.7% | 13.2% | 77.4% | 20.8% | 73.6% | 20.6% | 96.1% | 21.6% | 33.3% |

(9) If you find a suitcase full of Italian lira and you go to the Bank of Italy (issuer)
| Answer | Class pre | Class post | Streaming pre | Streaming post | Video pre | Video post | Game pre | Game post | Control pre | Control post |
| You realize these banknotes cannot be exchanged for euro anymore (correct) | 63 | 102 | 68 | 102 | 66 | 102 | 71 | 92 | 62 | 72 |
| Banknotes will be exchanged for euros at the 2001 official exchange rate (1936.27 lira for 1 euro) | 12 | 0 | 4 | 0 | 8 | 0 | 4 | 6 | 8 | 6 |
| You can exchange lira for euro only by proving the legal provenance of the banknotes | 8 | 0 | 10 | 2 | 6 | 0 | 9 | 0 | 2 | 6 |
| Banknotes will be retained and destroyed (without anything in exchange) | 4 | 2 | 6 | 2 | 4 | 2 | 6 | 0 | 10 | 2 |
| (Do not know) | 17 | 0 | 18 | 0 | 22 | 2 | 12 | 2 | 20 | 16 |
| Obs. | 104 | 104 | 106 | 106 | 106 | 106 | 102 | 100 | 102 | 102 |
| % correct answers | 60.6% | 98.1% | 64.2% | 96.2% | 62.3% | 96.2% | 69.6% | 92.0% | 60.8% | 70.6% |
(10) What is the limit by law for cash payment by coins in Italy?
ClassStreamingVideoGame(Control)
Pre-testPost-testPre-testPost-testPre-testPost-testPre-testPost-testPre-testPost-test
There is no limit because coins are euro as banknotes4413401050246523842
50 coins (regardless of their value) (correct)1289129614767991426
500 coins (regardless of their value)2040222002
Coins whose total value exceeds EUR 5004040444020
(Do not know)4224603602414832
Obs.104104106106106106102102102102
% correct answers11.5%85.6%11.3%90.6%13.2%71.7%6.9%97.1%13.7%25.5%
Table 3. Correct response rate to the Lusardi–Mitchell questions.

Question topic | Correct answers (%) | Obs.
Compound interest | 68.3% | 520
Inflation | 79.5% | 520
Bond | 20.9% | 520
Mortgage | 64.9% | 520
Diversification | 78.4% | 520
Note: The average number of correct answers in the entire pre-treatment sample is 3.12.
Table 4. Correlation analysis of the financial literacy items.

Pre-test
Item | item1 | item2 | item3 | item4 | item5 | item6 | item7 | item8 | item9 | item10
EUR 20 bill color (item1) | 1
Euro issuer (item2) | 0.16 | 1
Changeover to euro (item3) | 0.25 | 0.21 | 1
Country without the euro (item4) | 0.39 | 0.16 | 0.25 | 1
Exchange rate (item5) | 0.09 | 0.06 | 0.19 | 0.15 | 1
ATM withdrawal limit (item6) | 0.06 | −0.03 | −0.13 | −0.02 | −0.03 | 1
Bid–Ask quotation (item7) | 0.08 | −0.05 | −0.02 | 0.01 | 0.12 | −0.04 | 1
Trunked banknotes (item8) | 0.01 | 0.00 | −0.07 | 0.01 | −0.06 | −0.04 | −0.01 | 1
Banknotes out of circulation (item9) | 0.06 | 0.01 | 0.00 | 0.01 | 0.01 | −0.06 | 0.10 | 0.10 | 1
Coin payments’ limit (item10) | 0.05 | −0.04 | −0.20 | −0.03 | −0.18 | 0.00 | −0.03 | 0.17 | 0.02 | 1

Post-test
Item | item1 | item2 | item3 | item4 | item5 | item6 | item7 | item8 | item9 | item10
EUR 20 bill color (item1) | 1
Euro issuer (item2) | 0.42 | 1
Changeover to euro (item3) | 0.60 | 0.22 | 1
Country without the euro (item4) | 0.81 | 0.33 | 0.48 | 1
Exchange rate (item5) | 0.25 | 0.13 | 0.08 | 0.24 | 1
ATM withdrawal limit (item6) | 0.14 | 0.11 | 0.09 | 0.07 | 0.14 | 1
Bid–Ask quotation (item7) | 0.17 | 0.03 | 0.09 | 0.16 | 0.03 | 0.20 | 1
Trunked banknotes (item8) | 0.19 | 0.15 | 0.12 | 0.13 | 0.09 | 0.21 | 0.19 | 1
Banknotes out of circulation (item9) | 0.36 | 0.13 | 0.18 | 0.28 | 0.19 | 0.17 | 0.19 | 0.19 | 1
Coin payments’ limit (item10) | 0.20 | 0.17 | 0.09 | 0.14 | 0.14 | 0.17 | 0.26 | 0.27 | 0.34 | 1
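The coefficients in Table 4 are pairwise correlations between the ten binary item scores, computed separately on the pre-test and post-test responses. The type of correlation is not stated, so the sketch below simply uses Pearson correlations on 0/1 scores, with hypothetical DataFrames pre_items and post_items whose columns are item1 through item10.

import pandas as pd

def item_correlation_matrix(items: pd.DataFrame) -> pd.DataFrame:
    """Pairwise correlations between 0/1 item-score columns, rounded to two decimals as in Table 4."""
    return items.corr().round(2)

# corr_pre = item_correlation_matrix(pre_items)    # corresponds to the Pre-test panel
# corr_post = item_correlation_matrix(post_items)  # corresponds to the Post-test panel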
Table 5. Financial knowledge indices: Analysis of reliability.

Index | Range | Mean | St. Dev. | Obs. * | Cronbach’s Alpha
FinKnowledge (items 1–10) | 0–10 | 7.031 | 1.8626 | 1040 | 0.6546
FinKnowledge (items 1–5) | 0–5 | 4.593 | 0.7388 | 1040 | 0.4803
FinKnowledge (items 6–10) | 0–5 | 2.438 | 1.5999 | 1040 | 0.7176
* To assess the reliability of the indices, all the available observations (from both the pre-test and the post-test) were included.
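Cronbach’s alpha in Table 5 is the standard internal-consistency statistic, alpha = k/(k − 1) · (1 − sum of item variances / variance of the total score). The sketch below assumes scores is an (observations × 10) array of 0/1 item scores pooling pre- and post-test records, as the table note indicates; minor numerical differences from the published values may arise from variance conventions.

import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (observations x items) matrix of 0/1 scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    sum_item_var = items.var(axis=0, ddof=1).sum()   # sum of the single-item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of the summed score
    return (k / (k - 1)) * (1.0 - sum_item_var / total_var)

# cronbach_alpha(scores[:, 0:10])  # FinKnowledge, items 1-10
# cronbach_alpha(scores[:, 0:5])   # items 1-5
# cronbach_alpha(scores[:, 5:10])  # items 6-10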
Table 6. Financial knowledge index (sum of correct answers to the five financial knowledge questions, items 6–10).

Group | Mean (pre-test) | Mean (post-test) | St. Dev. (pre-test) | St. Dev. (post-test) | Obs. | p-value (Ha: post-test − pre-test > 0)
Class | 1.44681 | 3.95745 | 0.99046 | 0.77480 | 104 | 0.000
Streaming | 1.22642 | 3.86793 | 0.86501 | 0.99595 | 106 | 0.000
Video | 1.41509 | 3.79245 | 1.05891 | 0.81319 | 106 | 0.000
Game | 1.24590 | 4.36066 | 0.80609 | 1.19272 | 102 | 0.000
(Control) | 1.29412 | 1.92157 | 0.77827 | 0.95115 | 102 | 0.000
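The p-values in Table 6 test, within each group, whether the post-test index exceeds the pre-test index. Because the same students sit both assessments, a paired one-sided t-test is a natural reading, although the exact test used is not stated; the sketch below follows that reading with hypothetical arrays pre_index and post_index restricted to one group.

from scipy.stats import ttest_rel

def gain_p_value(pre_index, post_index) -> float:
    """One-sided paired t-test of Ha: post-test - pre-test > 0 (cf. Table 6)."""
    _, p_value = ttest_rel(post_index, pre_index, alternative="greater")
    return p_value

# Example (hypothetical data): gain_p_value(pre_index[group == "Game"], post_index[group == "Game"])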
Table 7. Results of an ordered logistic regression on financial knowledge.

Dependent variable: Financial Knowledge Index (sum of correct answers to the five financial knowledge questions, items 6–10; integer 0–5).

Ordered logistic regression: Number of obs. = 1040; LR chi2(11) = 984.93; Prob > chi2 = 0.000; Log likelihood = −1342.3237; Pseudo R2 = 0.2684.

Variable | Coef. | Std. Err. | z | P > |z| | [95% Conf. Interval]
Groups (reference category: Control, omitted)
Class | −0.22452 | 0.23147 | −0.97 | 0.332 | [−0.678192, 0.22916]
Live-Streaming | −0.77118 | 0.21806 | −3.54 | 0.000 | [−1.198579, −0.34378]
Video | −0.40996 | 0.22710 | −1.81 | 0.071 | [−0.855063, 0.03515]
Game | −0.63287 | 0.20914 | −3.03 | 0.002 | [−1.042785, −0.22296]
Class × Post-Test | 4.40669 | 0.315 | 13.99 | 0.000 | [3.789295, 5.02409]
Live-Streaming × Post-Test | 4.91393 | 0.30864 | 15.92 | 0.000 | [4.309019, 5.51885]
Video × Post-Test | 4.36065 | 0.30133 | 14.47 | 0.000 | [3.77006, 4.95124]
Game × Post-Test | 6.50956 | 0.33008 | 19.72 | 0.000 | [5.862613, 7.1565]
Male | −0.01044 | 0.11821 | −0.09 | 0.930 | [−0.2421176, 0.22124]
Parentsgrad | 0.07198 | 0.11669 | 0.62 | 0.537 | [−0.1567221, 0.30068]
Topgrade | 0.15679 | 0.13579 | 1.15 | 0.248 | [−0.1093623, 0.42293]
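Table 7 reports an ordered logistic regression of the 0–5 index on group dummies, Group × Post-Test interactions, and socio-demographic controls, with the control group as the omitted category. The output format suggests a standard econometrics package; the sketch below reproduces the structure of that specification in Python with statsmodels, under the assumption of a hypothetical long-format DataFrame long (one row per student per assessment). It is an illustration, not the authors' code.

import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

# `long` is a hypothetical long-format DataFrame with columns:
#   index_6_10 (0-5), group, post (0/1), male, parentsgrad, topgrade.
groups = ["Class", "Streaming", "Video", "Game"]          # Control is the omitted category
X = pd.get_dummies(long["group"])[groups].astype(int)
for g in groups:
    X[f"{g}_x_post"] = X[g] * long["post"]                # Group x Post-Test interactions
X[["male", "parentsgrad", "topgrade"]] = long[["male", "parentsgrad", "topgrade"]]

model = OrderedModel(long["index_6_10"], X, distr="logit")
result = model.fit(method="bfgs", disp=False)
print(result.summary())  # coefficient block comparable in structure to Table 7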
Table 8. Test of significance of the differences between regression coefficients (Diff-In-Diff) and financial knowledge.

Test of H0: Coef.1 − Coef.2 = 0 for each pair of Group × Post-Test coefficients:
Class × Post-Test vs. Live-Streaming × Post-Test: H0: 4.40669 − 4.91393 = 0; Chi-squared (1) = 1.86; Prob > Chi-squared = 0.1726
Class × Post-Test vs. Video × Post-Test: H0: 4.40669 − 4.36065 = 0; Chi-squared (1) = 0.02; Prob > Chi-squared = 0.9007
Live-Streaming × Post-Test vs. Video × Post-Test: H0: 4.91393 − 4.36065 = 0; Chi-squared (1) = 2.36; Prob > Chi-squared = 0.1242
Class × Post-Test vs. Game × Post-Test: H0: 4.40669 − 6.50956 = 0; Chi-squared (1) = 30.81; Prob > Chi-squared = 0.000
Live-Streaming × Post-Test vs. Game × Post-Test: H0: 4.91393 − 6.50956 = 0; Chi-squared (1) = 18.67; Prob > Chi-squared = 0.000
Video × Post-Test vs. Game × Post-Test: H0: 4.36065 − 6.50956 = 0; Chi-squared (1) = 34.05; Prob > Chi-squared = 0.000
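Each comparison in Table 8 tests whether two Group × Post-Test coefficients are equal, which amounts to a Wald chi-squared test with one degree of freedom on the difference of the two estimates. The sketch below computes that statistic directly from a fitted model's parameter vector and covariance matrix, continuing the hypothetical result object from the previous sketch; the index positions of the two coefficients would need to be looked up from the parameter names.

import numpy as np
from scipy.stats import chi2

def coef_difference_test(params, cov, i: int, j: int):
    """Wald chi-squared(1) test of H0: params[i] - params[j] = 0 (cf. Table 8)."""
    diff = params[i] - params[j]
    var_diff = cov[i, i] + cov[j, j] - 2.0 * cov[i, j]
    stat = float(diff ** 2 / var_diff)
    return stat, float(chi2.sf(stat, df=1))

# Example (hypothetical fitted model from the previous sketch):
# params = np.asarray(result.params)
# cov = np.asarray(result.cov_params())
# stat, p = coef_difference_test(params, cov, i_class_x_post, i_game_x_post)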
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
