Article

SmartRead: A Multimodal eReading Platform Integrating Computing and Gamification to Enhance Student Engagement and Knowledge Retention

School of Computer Science, University of Hull, Kingston upon Hull HU6 7RX, UK
*
Author to whom correspondence should be addressed.
Multimodal Technol. Interact. 2025, 9(10), 101; https://doi.org/10.3390/mti9100101
Submission received: 15 July 2025 / Revised: 12 September 2025 / Accepted: 16 September 2025 / Published: 23 September 2025

Abstract

This paper explores the integration of computing and multimodal technologies into personal reading practices to enhance student engagement and knowledge assimilation in higher education. In response to a documented decline in voluntary academic reading, we investigated how technology-enhanced reading environments can re-engage students through interactive and personalized experiences. Central to this research is SmartRead, a proposed multimodal eReading platform that incorporates gamification, adaptive content delivery, and real-time feedback mechanisms. Drawing on empirical data collected from students at a higher education institution, we examined how features such as progress tracking, motivational rewards, and interactive comprehension aids influence reading behavior, engagement levels, and information retention. Results indicate that such multimodal interventions can significantly improve learner outcomes and user satisfaction. This paper contributes actionable insights into the design of innovative, accessible, and pedagogically sound digital reading tools and proposes a framework for future eReading technologies that align with multimodal interaction principles.


1. Introduction

Personal reading extends beyond assessment; it underpins development in both cognitive and academic domains. In the current era of artificial intelligence, virtually every sector of life, including education, has evolved alongside technological advancement. Computing is a broad field that spans everything from elementary computer use to advanced artificial intelligence tools [1]. It cannot be overemphasized that integrating computing into education can improve student learning and engagement through online learning platforms, educational software, and digital content [2]. Techniques such as gamification can foster a more interesting and engaging learning environment in a tech-driven age. Furthermore, computing can address individual student needs and deliver personalized learning methods that aid assimilation [3].
Over the years, students have been exposed to digitally aided tools but often lack guidance on using them effectively. Research shows a marked decline in students’ personal reading, which has weakened their ability to retain lifelong learning skills, even as screen-based reading has become more common [4]. Technology-aided techniques have been identified as the most realistic way to improve student assimilation [5]: they not only keep students interested but also help them develop the skills required for professional achievement.
Various concerns have been raised about how technology can become a distraction when incorporated into students’ personal development [6], and such distraction is one obstacle to assimilation. There has also been debate over whether digital reading supports comprehension better than the paper-based methods widely used by students. However, research suggests that, when used appropriately, the benefits of digital tools outweigh the costs of ignoring them. Integrating computing into personal reading not only improves student engagement but also builds the digital fluency that supports students in their professional careers.
The purpose of this study was to evaluate the impact of integrating multimodal computing tools on student engagement and knowledge assimilation during personal reading. By analyzing student interactions with digital reading technologies, this research aimed to identify effective strategies for enhancing comprehension and retention in higher education.

2. Background and Literature Review

The integration of computing into reading practices has gained increasing attention, particularly as educational institutions seek innovative ways to improve student engagement and assimilation. This literature review explores foundational concepts such as digital literacy, student engagement, and the strategic use of technology to enhance independent reading in higher education.
Digital literacy is widely recognized as a critical competency in contemporary learning environments. It encompasses a range of skills, from basic digital navigation to complex problem-solving tasks enabled by technology. Ng [7] highlights the essential role digital literacy plays in improving learning outcomes, while Meridha [8] emphasizes its pedagogical significance in higher education. However, access-related challenges and limited training often hinder learners from fully leveraging technological opportunities, creating disparities in student outcomes.
Student engagement, defined through behavioral, emotional, and cognitive participation, is essential for deep learning. According to Wong et al. [9], students who are cognitively engaged tend to persist longer, comprehend difficult concepts, and display stronger academic commitment. The relationship between technology and student engagement has been explored in numerous studies, with evidence suggesting that digital tools, when aligned with learner needs, can foster higher levels of attentiveness and motivation.
Research on the intersection of digital literacy and learning underscores how educational institutions are adapting to new technological demands. Meridha [8] and Otto et al. [10] have noted that while digital tools provide opportunities for flexibility and interaction, barriers such as infrastructure gaps and insufficient pedagogical integration still exist. Audrin and Audrin [11], using text mining techniques, identified key factors for digital learning success, including 21st-century digital skills and proactive engagement practices.
The use of computing tools in personal reading has emerged as a key area of interest. Tools such as AI-supported reading apps, eBooks, and interactive platforms provide opportunities for personalized, engaging, and accessible learning experiences. Bajúzová and Hrmo [12] argue that these tools can make reading more responsive to individual needs, while Lin and Yu [13] suggest that perceived ease of use and usefulness significantly influence students’ willingness to adopt such technologies. Chavez and Palaoag [14] add that tailoring reading platforms to student preferences can increase motivation and sustained use.
A wide variety of computing tools have been explored in academic settings. E-books and digital textbooks allow for customizable reading experiences, offering features such as font adjustment, multimedia integration, and in-app dictionaries that aid in assimilation [15]. Reading apps like Kindle and Google Play Books support continuity across devices and include tools like annotations and bookmarks [7]. Platforms such as Epic! and Raz-Kids provide interactive experiences through read-aloud options, quizzes, and content recommendations suited for younger learners [16]. AI-powered tools, including Rewordify and Grammarly, offer real-time feedback on grammar and comprehension, simplifying complex texts and supporting writing development [17]. Digital annotation tools enhance peer collaboration and critical reading by allowing learners to highlight, comment, and share ideas across networks.
Successful integration of computing tools into personal reading requires thoughtful instructional strategies. Personalized learning paths adapt content and challenge levels to meet individual learner needs [7], while collaborative tools promote peer dialogue and co-construction of knowledge [18]. Gamification features, such as leaderboards and rewards, have been consistently linked to improved motivation and participation [16]. Aligning digital interventions with curriculum objectives, as noted by Măduţa [19], ensures that technology enhances rather than disrupts educational outcomes.
Overall, the literature provides robust evidence that computing tools, when thoughtfully implemented, can meaningfully enhance personal reading experiences by supporting motivation, comprehension, and retention.

Impact on Student Engagement and Assimilation

Digital tools such as AI-powered platforms and educational technologies have significantly influenced how students engage with personal reading tasks [20]. Given the role student engagement plays in shaping academic success, particularly in environments like academic libraries, several techniques have been employed to build more interactive and responsive learning spaces. Empirical studies suggest that these technologies support student assimilation through features that foster both active participation and comprehension.
For instance, interactive e-books and educational applications have been shown to improve student engagement, particularly where elements such as quizzes and multimedia content are embedded in the reading process [20]. In the context of comprehension, students who use e-book features such as audio narration or visual aids tend to score higher on reading tests than those who rely solely on static content. Evidence also points to improved retention levels, particularly vocabulary acquisition, among students who regularly use AI-supported tools during their reading practice [17]. Adaptive reading platforms further contribute by delivering personalized content that adjusts to learner progress, resulting in a more tailored and motivating reading experience [7].
Despite these benefits, the integration of computing tools into personal reading is not without significant challenges. Limited access to devices such as tablets, e-readers, and high-speed internet remains a key barrier, especially for students in under-resourced contexts [21]. Even where hardware is available, technical issues, such as system glitches or unreliable connectivity, can disrupt the reading process, highlighting the need for consistent technical support [22].
Educational barriers also persist. Teachers play a crucial role in facilitating digital learning, yet inadequate training or a lack of confidence in using digital platforms may hinder their ability to support students effectively. This can affect both the depth of engagement and the level of comprehension students achieve during personal reading [23]. Furthermore, when digital tools are not meaningfully aligned with curriculum objectives, they risk functioning as supplementary add-ons rather than integrative components of the learning process [7].
From the student perspective, digital reading environments also introduce cognitive challenges. The potential for distraction, through notifications, multitasking, or unrelated web browsing, can reduce focus and weaken assimilation [24]. Additionally, students bring diverse learning needs to the reading experience, and one-size-fits-all platforms may not adequately cater to varying preferences, cognitive abilities, or reading paces. This calls for adaptive technologies that are sensitive to learner diversity and inclusive by design [25].
While the use of computing tools in personal reading offers substantial pedagogical promise, it also presents infrastructural, instructional, and individual-level complexities that must be addressed to ensure equitable and effective outcomes.
Research Questions
This study sought to answer the following questions:
  • How do multimodal computing tools affect student engagement in personal reading?
  • What impact do digital reading tools have on knowledge assimilation?
  • Which features of digital reading platforms are most preferred by students?
  • What challenges do students face when using digital reading tools?
  • How can the design and technical features of a digital reading platform be optimized to improve usability, engagement, and comprehension?
Hypotheses and Rationale
The hypotheses developed in this study are grounded in the research questions and the overarching aim of evaluating how multimodal digital tools influence student engagement and knowledge assimilation. As outlined in the abstract, the study centers on the SmartRead platform, which integrates features such as gamification, adaptive content delivery, and AI-powered support to enhance personal reading experiences in higher education.
To empirically assess the impact of these tools, the study first considers the following:
H1. 
Students’ personal reading correlates with improved assimilation. It is hypothesized that students who read more frequently will demonstrate higher levels of comprehension and retention.
H2. 
Reading frequency will not significantly influence engagement levels, as motivation can be shaped by a variety of external and internal factors beyond reading habits alone.
H3. 
Student engagement levels will vary across academic faculties, reflecting disciplinary differences.
H4. 
Knowledge assimilation levels will differ by age group.
These hypotheses are intended to examine the influence of demographic variables on the effectiveness of digital reading tools.
H5. 
Students who report higher engagement will also report stronger assimilation outcomes, indicating the relationship between engagement and learning.
In addition to behavioral and demographic factors, the study evaluates the following:
H6. 
Student preferences for AI-powered reading assistants and features will vary significantly across age groups, reflecting generational differences in digital literacy and comfort with AI technologies.
H7. 
Faculty affiliation will not significantly affect preferences for AI-powered tools, as the utility of AI tools is likely to be perceived similarly across disciplines.
Collectively, these hypotheses are designed to provide a comprehensive understanding of how digital reading technologies function across diverse student populations and learning contexts. They also support the development of evidence-based recommendations for designing more effective and inclusive digital reading platforms.

3. Materials and Methods

3.1. Research Design

This study followed a quantitative research design to systematically analyze the integration of computing tools into personal reading and their impact on student engagement and assimilation. A structured approach was adopted to ensure methodological rigor in data collection, analysis, and interpretation.

3.2. Research Approach

A descriptive quantitative method was utilized to assess trends, correlations, and student experiences in digital learning. This method enables a systematic examination of student interaction with digital tools, allowing for measurable insights into engagement patterns and perceived benefits.

3.3. Data Collection Method

This study employed a JISC online survey to efficiently distribute questionnaires and gather responses from participants across various academic disciplines. The survey was designed to accommodate participation across the faculties of Business, Law and Politics; Criminology, Sociology and Policing; Humanities; Art; Education; Science; Engineering; Nursing and Midwifery; Psychology and Social Work; and Sport, Exercise and Rehabilitation Science. This approach ensured broad accessibility and participation, facilitating a diverse dataset for analysis.

3.4. Sample Selection

A total of 178 students from the University of Hull participated in the survey, which was distributed across multiple faculties using a combination of online and in-person outreach; convenience sampling was used. Due to time constraints and the administrative approval required to formally access departmental mailing lists, direct distribution was limited to the Science and Engineering department. To ensure broad participation, students in shared spaces such as the university library, cafeteria, and student lounges were invited to take part voluntarily via flyers with QR codes. Participation was entirely voluntary, and students were informed of the study’s purpose and assured anonymity. The final sample of 178 respondents reflects those who chose to participate within the data collection window.

3.5. Analytical Framework

Python (v3.12.7) was used for data analysis due to its robust libraries for statistical analysis and visualization; it enabled the processing of survey responses, the generation of visual insights, and the execution of statistical tests. Python served here as a programming language for implementing the analytical procedures rather than as a framework. The analysis used pandas for data manipulation, matplotlib and seaborn for visualization, and scipy.stats (v1.16.1) and statsmodels for statistical testing. Sentiment analysis was conducted using TextBlob (v0.19.0), and feature preference analysis was performed using keyword extraction and scoring models. The data analysis comprised six strands: descriptive statistics summarizing demographic characteristics, engagement trends, and preferred digital tools; correlation analysis identifying relationships between reading frequency, faculty inclination toward digital tools, and age-based preferences; sentiment analysis of textual responses to understand student perceptions of and experiences with digital learning tools; feature preference analysis evaluating favored digital reading functionalities such as audiobooks, AI assistants, and interactive tools; challenge identification covering common obstacles encountered by students, including cost, distraction, and privacy concerns; and assessment of student feedback to propose enhancements in digital tools for better engagement and assimilation.
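To illustrate the style of this pipeline, the sketch below runs the descriptive, ANOVA, and correlation steps with pandas and scipy.stats. The column names and response codings are hypothetical stand-ins; the actual survey instrument is not reproduced in this paper.

```python
import pandas as pd
from scipy import stats

# Synthetic stand-in for the coded survey responses (illustrative only)
df = pd.DataFrame({
    "reading_frequency": ["daily", "weekly", "several", "rarely"] * 10,
    "engagement":   [3, 2, 3, 1, 2, 2, 3, 1] * 5,   # 1 = low .. 3 = high
    "assimilation": [3, 2, 3, 1, 3, 2, 2, 1] * 5,
})

# Descriptive statistics: share of respondents per reading-frequency band
freq_pct = df["reading_frequency"].value_counts(normalize=True) * 100

# One-way ANOVA: does assimilation differ across reading-frequency groups?
groups = [g["assimilation"].to_numpy()
          for _, g in df.groupby("reading_frequency")]
f_stat, p_value = stats.f_oneway(*groups)

# Pearson correlation between engagement and assimilation
r, p_r = stats.pearsonr(df["engagement"], df["assimilation"])
print(f"F = {f_stat:.3f}, p = {p_value:.4g}; r = {r:.3f}")
```

The same pattern (group, test, report) extends directly to the faculty- and age-based comparisons reported in Section 4.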
Participants in this study engaged with a diverse range of digital reading tools, as identified through the JISC survey. The most widely used and valued features were AI-powered assistants such as ChatGPT, LearnGPT, and TutorAI, selected by 77% of respondents for their effectiveness in enhancing comprehension and retention. Interactive content and learning platforms, including quizzes and multimedia resources, were favored by 43% of students, while 38% found highlighting and note-taking tools beneficial for organizing and reinforcing their understanding. Similarly, summarization features were appreciated by 38% of respondents for their ability to condense complex information. Interactive quizzes and summaries were used by 36%, and audiobooks or text-to-speech tools were preferred by 27% of students for their accessibility and support for auditory learning. A smaller group, 9%, identified acronym checkers as helpful in navigating technical or unfamiliar terminology.
In terms of usage frequency, the majority of students reported engaging in personal reading for learning purposes on a regular basis. Specifically, 51% of respondents indicated that they read several times a week, while 28% engaged in daily reading. An additional 11% read once a week, and 10% reported reading rarely. These findings suggest that nearly 80% of students maintain a consistent reading habit, with most interacting with digital tools multiple times per week. This level of engagement underscores the importance of these tools in supporting academic reading and highlights the features that contribute most significantly to usability, comprehension, and retention.

4. Results

4.1. Impact of Computing on Engagement

Figure 1 shows how students perceive the impact of digital tools on their engagement with academic reading. The majority reported that digital tools somewhat (44.5%) or significantly (42.8%) increased their engagement. Only a small portion experienced no change (8.7%) or a decrease in engagement (4.0%), indicating a generally positive influence of technology on reading motivation.
In conclusion, the majority (~87.3%) of students feel that integrating computing tools increases their engagement with reading materials, with 42.8% reporting a significant boost. Tools like online learning platforms and educational software are cited as key contributors. However, ~8.7% of students reported no notable change, and a small percentage (4.0%) noted decreased engagement.

4.2. Top eReading Features

Figure 2 shows students’ most favored digital reading features. The top preferences include AI-powered assistants like ChatGPT, TutorAI, and LearnGPT (each with 16.8%), followed by interactive quizzes and summaries (10.6%) and note-taking tools (10.2%). These results highlight a strong interest in intelligent, interactive, and supportive technologies that enhance comprehension and engagement.
In conclusion, AI-powered assistants, such as ChatGPT and similar tools, emerge as the most effective for improving assimilation, favored by nearly half of respondents. These tools, along with interactive quizzes and platforms, cater to personalized learning styles, fostering better retention of reading materials.

4.3. Impact of eReading Tools on Assimilation

Figure 3 shows students’ perceptions of how digital tools affect their comprehension and retention. The results show that most students experienced either somewhat improved assimilation (46.8%) or greatly improved assimilation (44.5%), indicating a strong positive impact. Only a small fraction reported no difference (6.4%) or negative effects, such as reduced assimilation and disengagement (1.2% each).
In conclusion, approximately 91.3% of students report that computing tools enhance their comprehension and retention of information compared with traditional methods, with nearly half (44.5%) experiencing great improvement. Interactive features like AI-powered assistants, quizzes, and summarization tools contribute to the positive shift, while a small minority note no difference.

4.4. Desired Technical Features

Figure 4 shows the most desired improvements in digital tools as reported by students. The results show that better integration with other learning resources (31.6%) and enhanced personalization (31.1%) are the top priorities, followed by more interactive content (28.6%). Improved user interface was mentioned less frequently (8.8%), indicating a stronger demand for pedagogical and engagement-focused enhancements.
In conclusion, students prioritize personalization, interactivity, and seamless integration of digital tools into existing learning platforms. Improved user interfaces and offline capabilities are also crucial for enhancing usability. Addressing these preferences can significantly improve the user’s experience in digital reading platforms.

4.5. Challenges

Figure 5 shows the most common challenges students face when using digital tools for academic reading. It highlights key barriers such as distraction, cost constraints, and difficulty finding suitable tools. These are factors that must be addressed to improve engagement and assimilation.
The reported challenges were as follows:
  • Distraction from non-educational content: 24.2%;
  • Cost constraints (restricting free access to better features): 23.3%;
  • Difficulty finding the right digital tools: 13.6%;
  • Privacy concerns: 10.0%;
  • Difficulty with ease of use and navigation: 8.6%.
In conclusion, the findings emphasize the need for strategies that minimize digital distractions and make high-quality features more accessible to students. There is also an opportunity to improve tool discovery and address privacy concerns to enhance user confidence in digital learning environments. Addressing these challenges could lead to better engagement, improved comprehension, and increased adoption of digital tools.
Summary of Results
The dataset reveals:
  • Computing tools substantially increase student engagement and assimilation;
  • Tools like AI-powered assistants are most effective for improving comprehension and retention;
  • Compared with traditional methods, digital tools are overwhelmingly preferred, with limited concerns about distractions or lack of personalization;
  • Students value personalized experiences, interactivity, and better tool integration for a more user-friendly environment;
  • Despite their benefits, digital tools pose challenges, including difficulty in finding the right tools (13.6%), privacy concerns (10.0%), and ease-of-use/navigation issues (8.6%).

4.6. Hypothesis Testing

4.6.1. Assimilation by Reading Frequency

ANOVA results revealed a statistically significant difference in assimilation levels based on students’ reading frequency, F(3, 169) = 4.79, p = 0.0032, η2 = 0.092. This indicates a moderate effect size, suggesting that students who engage in personal reading more frequently tend to report higher levels of comprehension and retention. The 95% confidence interval for the mean difference is illustrated in the shaded region of Figure 6, reinforcing the observed variation in assimilation across different reading habits.
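The effect size reported here can be recomputed from group data as η² = SS_between/SS_total. A minimal sketch with hypothetical 1–5 assimilation scores (the published dataset is not reproduced here):

```python
import numpy as np
from scipy import stats

# Hypothetical assimilation scores for four reading-frequency groups
rarely  = np.array([2, 2, 3, 2, 3])
weekly  = np.array([3, 3, 2, 4, 3])
several = np.array([4, 3, 4, 4, 3])
daily   = np.array([4, 5, 4, 4, 5])
groups = [rarely, weekly, several, daily]

# One-way ANOVA across the four groups
f_stat, p_value = stats.f_oneway(*groups)

# Eta squared = between-group sum of squares / total sum of squares
all_scores = np.concatenate(groups)
grand_mean = all_scores.mean()
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
ss_total = ((all_scores - grand_mean) ** 2).sum()
eta_sq = ss_between / ss_total
print(f"F = {f_stat:.2f}, p = {p_value:.4f}, eta^2 = {eta_sq:.3f}")
```

With these illustrative scores the test is significant; the magnitudes differ from the study's own F(3, 169) = 4.79 because the data are synthetic.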

4.6.2. Engagement by Reading Frequency

ANOVA results showed no statistically significant difference in engagement levels based on reading frequency: F(3, 169) = 1.558, p = 0.2015, η2 = 0.027. This small effect size suggests that how often students engage in personal reading does not substantially influence their reported levels of engagement. The 95% confidence intervals for each group mean are illustrated in the shaded region of Figure 7, indicating consistency in engagement across different reading habits.

4.6.3. Engagement by Faculty

ANOVA results showed no statistically significant difference in engagement levels across faculties: F(11, 166) = 1.1982, p = 0.2995, η2 = 0.073. The small effect size suggests that students’ faculty affiliation does not substantially influence their reported engagement with digital reading tools. The 95% confidence intervals for each group mean are illustrated in the shaded region of Figure 8, indicating a consistent pattern of engagement across academic disciplines.

4.6.4. Assimilation by Age Group

ANOVA results showed no statistically significant difference in assimilation levels across age groups: F(5, 172) = 1.9436, p = 0.0897, η2 = 0.053. The small effect size suggests that students’ age does not substantially influence their reported levels of comprehension and retention. The overall mean assimilation score was 2.36, and the group means are visualized in the shaded region of Figure 9, indicating a consistent pattern of assimilation across age categories.

4.6.5. Correlation Between Engagement and Assimilation

A Pearson correlation analysis revealed a moderate positive relationship between engagement and assimilation (r = 0.5020, p < 0.0001). This association is statistically significant, indicating that students who report higher levels of engagement with digital tools also tend to demonstrate better comprehension and retention of reading materials. The scatter plot in Figure 10 illustrates this relationship, with engagement levels plotted on the x-axis and assimilation impact on the y-axis.
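The p-value implied by a Pearson correlation of a given size can be recovered from the t transform t = r·√((n − 2)/(1 − r²)). A minimal sketch, assuming all 178 respondents contributed paired responses:

```python
import math
from scipy import stats

# Significance check for a Pearson r via the t transform
# (n = 178 is the full sample size from Section 3.4; pairwise
# exclusions in the actual analysis could reduce it slightly).
n, r = 178, 0.5020
t = r * math.sqrt((n - 2) / (1 - r ** 2))
p_two_sided = 2 * stats.t.sf(abs(t), df=n - 2)
print(f"t = {t:.2f}, p = {p_two_sided:.2e}")
```

For samples of this size, a correlation near 0.5 yields a vanishingly small p-value.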

4.6.6. Preferences for AI-Powered Assistants by Age Group

ANOVA results revealed a statistically significant difference in preferences for AI-powered assistants across age groups: F(6, 165) = 3.1598, p = 0.0094, η2 = 0.103. This moderate effect size suggests that age plays a meaningful role in shaping students’ valuation of AI-powered features. Figure 11 illustrates the distribution of preferences among different age groups, highlighting variations in the selection of AI-powered assistants, interactive content, and note-taking tools.

4.6.7. Preferences for AI-Powered Assistants by Faculty

ANOVA results showed no significant difference in preferences for AI-powered assistants across faculties: F(12, 164) = 0.8892, p = 0.6604, η2 = 0.061. This suggests a small effect size, indicating that faculty affiliation does not substantially influence preference for AI-powered features. Figure 12 illustrates the proportion of participants in each faculty who valued AI-Powered Assistants, interactive content, and note-taking tools.

4.6.8. Overall Test Results and the Original Hypotheses

Based on the above outcomes, we can now summarize the analysis of the hypotheses.
H1 was supported: assimilation levels differ significantly based on reading frequency (p = 0.0032), with students who read more frequently reporting higher levels of comprehension and retention.
H2 was also supported: engagement levels do not differ significantly based on reading frequency (p = 0.2015), suggesting that other factors shape engagement.
H3 and H4 were not supported, as shown by the absence of significant differences in engagement across faculties (p = 0.2995) and in assimilation across age groups (p = 0.0897), respectively. These findings imply that digital reading tools have a similar impact across demographic categories.
H5 was moderately supported. It shows a moderate positive correlation between engagement and assimilation (r = 0.487, p < 0.0001). It reinforces the idea that a higher engagement level leads to better comprehension and retention.
H6 was supported, as confirmed by the significant variation in preferences for AI-powered assistants across age groups (p = 0.0094).
H7 was not supported: preferences for AI-powered assistants are not significantly influenced by faculty, suggesting that perceptions of these tools are consistent across disciplines.
Together, these results provide a robust foundation for the strategic design of SmartRead, ensuring that its features align with diverse student needs while maintaining consistency in engagement and assimilation outcomes.

5. Strategic Framework for SmartRead Development

This section presents the strategic rationale behind the development of SmartRead, a multimodal eReading platform designed to enhance student engagement and knowledge assimilation in higher education. The framework comprises two core design components: a navigation panel developed to guide users through the platform and improve engagement, and a reading panel designed to support comprehension through personalized, multimodal content delivery. These strategies were derived from both descriptive and inferential statistical findings, which revealed key patterns in student preferences, tool effectiveness, and engagement behaviors.

5.1. Strategic Navigation Design to Enhance Engagement

To address the challenge of low voluntary academic reading and improve student engagement, SmartRead incorporates a strategically designed navigation panel. The flowchart-based interface (Figure 13) guides users through a structured learning path, ensuring clarity, interactivity, and progressive comprehension.
The navigation begins with a decision point: users indicate whether they are familiar with the platform. If unfamiliar, they are directed to a list of stages that outline the platform’s features. Each stage is accompanied by a short instructional video, which introduces the functionality and purpose of that section. Following the video, users engage with a game-based activity designed to test their understanding of the stage content. If the user fails the game, they are prompted to rewatch the video and retry the activity. Upon successful completion, they proceed to the reading interface.
This design ensures that users are not only oriented before engaging with content but also actively involved in understanding how to use the platform effectively. By embedding comprehension checkpoints and interactive tutorials, the navigation panel reduces cognitive overload and supports sustained engagement.
The rationale for this design is supported by the study’s findings, which revealed that although engagement levels did not significantly differ across faculties or age groups, higher engagement was strongly associated with improved assimilation. Furthermore, students expressed a preference for platforms that offer better tool integration and greater ease of use. The navigation panel directly responds to these preferences by simplifying the user journey and embedding interactive learning elements that reinforce platform familiarity and content readiness.

Strategic Reading Panel Design to Improve Comprehension

To address the need for improved student comprehension during personal reading, SmartRead incorporates a strategically designed reading panel that supports multimodal learning and adaptive feedback. One of the most frequently reported challenges in this study was distraction from non-educational content, cited by 24.2% of students. This finding highlights a critical barrier to effective engagement and comprehension in digital reading environments. In response, SmartRead integrates a Focus Mode Activation feature within its reading workflow (Figure 14) to create a distraction-free, immersive learning experience. The reading panel allows students to upload a book or document and set their preferred reading time and page range. Once these parameters are defined, Focus Mode is automatically activated. This mode minimizes external interruptions by disabling non-essential notifications and locking the user into a dedicated reading interface. From this point, students are prompted to select a reading mode that aligns with their cognitive style. Available modes include Text, Audio, Video Illustration, and Game.
Each mode is equipped with features that enhance comprehension. The Text mode includes summarization, note-taking, and highlighting tools to support active reading. The Audio mode offers a read-aloud function for auditory learners and accessibility support. The Video Illustration mode links to relevant visual content that reinforces understanding through storytelling. The Game mode transforms reading material into interactive challenges, promoting engagement and retention through gamification.
To further support comprehension, each reading mode is enhanced by an AI-powered assistant, introduced as a direct response to the study’s finding that AI tools were the most effective for improving comprehension and retention, cited by 18.4% of participants. The AI assistant performs distinct functions across modes:
In Text Mode, it provides summarization, note-taking, and highlighting tools to help students extract and organize key information. In Audio Mode, it reads content aloud and responds to voice or typed queries, offering definitions and clarifications in real time. In Video Mode, it links to topic-relevant visual content and answers questions about concepts presented in the video, reinforcing understanding through multimedia. In Game Mode, it transforms reading material into interactive challenges and provides hints, feedback, and adaptive difficulty adjustments based on user performance.
After engaging with the content, students complete a comprehension test designed to assess their understanding. If they pass, they are rewarded through a Digital Tracker, which displays progress and awards points. If they fail, the system prompts them to re-engage with the content, either by revisiting the same reading mode or selecting an alternative mode that may better suit their learning style. This adaptive re-engagement strategy ensures that students are not penalized for initial misunderstanding but are instead guided toward deeper comprehension through personalized support.
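The read–test–reward cycle just described can be summarized in a few lines. This sketch is hypothetical: `choose_mode`, `take_test`, and `award_points` are stand-ins for platform services, and a real implementation would cap retries and persist state.

```python
def reading_session(content, choose_mode, take_test, award_points):
    """Adaptive re-engagement loop (illustrative sketch).

    The student reads the content in a chosen mode and then takes a
    comprehension test; passing awards Digital Tracker points, while
    failing routes them back to the same or an alternative mode for
    another attempt.
    """
    attempts = 0
    mode = choose_mode()               # Text, Audio, Video Illustration, or Game
    while True:
        mode(content)                  # present content in the selected mode
        attempts += 1
        if take_test(content):
            award_points(content)      # Digital Tracker: progress + points
            return attempts
        mode = choose_mode()           # revisit or switch mode after a fail
```

The key design property is that failure never ends the session; it only changes the presentation path, matching the adaptive re-engagement strategy described above.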
The rationale for this design is supported by the study’s findings, which revealed that approximately 91.3% of students reported improved assimilation when using computing tools compared with traditional methods. AI-powered assistants and interactive features were identified as the most effective for enhancing comprehension and retention. Furthermore, students expressed a strong preference for personalized and multimodal reading experiences, with many favoring tools that adapt to their learning style and provide real-time feedback.
This design reflects SmartRead’s commitment to inclusive and responsive learning. By combining distraction-reduction mechanisms, multimodal content delivery, and intelligent feedback loops, the platform promotes sustained engagement and improved academic outcomes, while the digital tracker’s reward system encourages consistent reading behavior and reinforces learning outcomes.

5.2. Benefits of SmartRead

SmartRead offers a structured and adaptive approach to enhancing student engagement and knowledge assimilation through multimodal reading experiences. While the platform is still in its prototype phase, its design is informed by empirical data collected from 178 students across multiple faculties. The following benefits are grounded in both descriptive statistics and inferential analysis.

5.2.1. Benefits for the Student

SmartRead supports improved learning experiences by offering customizable reading modes such as Text, Audio, Video, and Game that align with individual preferences. According to the survey, 91.3% of students reported improved assimilation when using computing tools compared with traditional methods, with 44.5% indicating significant improvement. The inclusion of AI-powered reading schedules and summarization tools was informed by the finding that AI assistants were the most preferred feature, cited by 18.4% of respondents, for enhancing comprehension and retention.
Gamification elements such as quizzes and challenges were added in response to the observed correlation between engagement and assimilation. A Pearson correlation analysis revealed a moderate positive relationship (r = 0.487, p < 0.0001), suggesting that students who reported higher engagement also demonstrated better assimilation. Leaderboards and reward systems were introduced to reinforce consistent reading behavior and motivation.
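The relationship reported above is a standard Pearson product-moment correlation. As a reminder of the computation, it can be implemented directly; the scores below are made-up illustrations, not the study data.

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = math.sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = math.sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

# Illustrative engagement vs. assimilation ratings (not the study data):
engagement = [3, 4, 2, 5, 4, 3]
assimilation = [3, 5, 2, 4, 4, 2]
r = pearson_r(engagement, assimilation)
```

In practice a statistics library (e.g., SciPy's `pearsonr`) would also return the associated p-value.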
Note-taking and highlighting tools support active learning strategies, which were among the top features preferred by students (9.2%). These tools are embedded across reading modes to facilitate deeper interaction with content.

5.2.2. Benefits for Institutions

Institutions benefit from SmartRead through increased student engagement and retention. While engagement levels did not differ significantly across faculties (F(11, 166) = 1.1982, p = 0.2995), the overall sentiment analysis indicated that students felt more immersed and motivated when using multimodal tools. The platform’s reward-based access model addresses cost constraints, which were reported by 23.3% of students, by allowing institutions to incentivize usage without direct payment.
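The faculty comparison above uses a one-way ANOVA, whose F statistic is the ratio of between-group to within-group mean squares. A minimal sketch, run here on illustrative groups rather than the study's faculty data:

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA over a list of sample groups,
    e.g. engagement scores grouped by faculty (illustrative sketch)."""
    scores = [s for g in groups for s in g]
    n, k = len(scores), len(groups)
    grand_mean = sum(scores) / n
    # Between-group sum of squares: each group's mean vs. the grand mean.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: each score vs. its own group mean.
    ss_within = sum((s - sum(g) / len(g)) ** 2 for g in groups for s in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

With k = 12 faculties and n = 178 respondents, this yields the reported degrees of freedom (11, 166).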
SmartRead also provides data-driven insights through its digital tracker and analytics dashboard. These features enable institutions to monitor reading progress and assimilation trends, supporting evidence-based refinement of learning strategies.

5.2.3. Benefits for Educators

For educators, SmartRead offers enhanced teaching strategies through AI-powered adaptive quizzes and interactive learning tools. These features complement traditional coursework and foster engagement beyond static textbooks. Real-time feedback mechanisms allow educators to personalize interventions based on student performance.
Collaborative learning is supported through shared annotations and visual tools such as mind maps, which help break down complex topics. These features align with the study’s findings that students value interactivity and better tool integration, with 36.0% prioritizing seamless integration and 28.6% requesting more interactive content.

6. Discussion

This study evaluated the potential of SmartRead, a prototype multimodal eReading framework, to enhance student engagement and knowledge assimilation in higher education. Drawing on responses from 178 students across multiple faculties, the analysis revealed that integrating gamification, adaptive content delivery, and real-time feedback can positively influence reading behavior, learner motivation, and retention.
Findings from descriptive statistics and correlation analyses indicated a strong preference for features such as progress tracking, comprehension prompts, and gamified rewards. These trends support earlier studies emphasizing the motivational role of structured feedback and adaptive scaffolds in digital learning [26,27]. Adaptive personalization also emerged as a strong influence on learner autonomy and attention, echoing findings in recent AI-focused educational studies [28,29].
Sentiment analysis further revealed favorable perceptions of multimodal interactivity: students reported feeling more immersed, supported, and motivated than with traditional static text. These perceptions echo the work of Mills et al. [29], suggesting that interactive and multimodal reading reduces cognitive overload and fosters deeper comprehension. Importantly, SmartRead’s simulated features addressed common barriers to engagement, such as distraction from non-educational content, data privacy concerns, and cost, which have been noted by students across faculties, offering an experience that felt more responsive and personalized.
From a strategic perspective, SmartRead contributes to broader goals in equitable and inclusive digital learning. Echoing Salaberri et al. [30], this framework suggests that technology designed around learner preferences can foster engagement among traditionally disengaged student segments. Its potential applications extend to context-specific domains like healthcare education, where adaptive reading tools may support skill acquisition and critical thinking.
Future research should expand on these findings by validating SmartRead’s framework in live software environments, exploring its potential across cultural contexts and disciplines, and incorporating extended analytics such as behavioral heatmaps or biometric feedback. The addition of voice interfaces or XR-enhanced comprehension features [29,31] may further deepen its impact on diverse learners.
Although this study contributes to the growing body of evidence supporting the role of digital tools in enhancing student engagement and retention during personal reading, limitations should be acknowledged. Notably, while prior meta-analyses have established a positive relationship between digital reading platforms and improved retention outcomes, the current study does not experimentally validate this effect within its own context.
A meta-analysis of fifteen studies that conducted live tests on digital reading platforms revealed that thirteen reported improved retention outcomes, with only one showing no difference and one indicating mixed results. These studies span a range of educational levels and technologies, including AI-powered assistants [32,33], collaborative annotation systems [34], and eye-tracking tools [35]. Unlike these studies, which employed controlled interventions and objective measures such as DRP scores [36] or comprehension tests [37], the present study relies on self-reported perceptions of engagement and assimilation. This introduces potential bias and limits the ability to quantify actual learning gains.
Additionally, while the participant pool spanned various faculties, it was limited to a single institution, which may limit generalizability. In contrast, larger-scale studies, such as Asif et al. [38], which involved 525 students, offer broader representation. Furthermore, the SmartRead framework, while conceptually robust, has not yet been deployed or tested in a live educational setting. This contrasts with studies like Chen and Leitch [32], which evaluated a functioning LLM-based assistant. The absence of technical implementation details and longitudinal tracking in this study limits the assessment of feasibility and sustained impact.
Finally, the literature also suggests that digital tools may not universally benefit all learners. For example, Miles and Ari [39] found no improvement in retention among first-grade students using personalized e-books, highlighting the need for age- and context-sensitive design considerations in future iterations of SmartRead.

7. Conclusions

SmartRead serves as a transformative strategy for enhancing student assimilation and engagement by integrating AI-powered personalization, gamification, and institution-backed features into personal reading. By addressing key challenges such as distraction, cost constraints, and privacy concerns, SmartRead effectively bridges the gap between traditional reading and modern, interactive learning experiences. Through adaptive reading schedules, interactive quizzes, and reward-based access, students can retain knowledge more effectively while staying motivated. The institutional endorsement of SmartRead not only builds student confidence in using the platform but also ensures sustainability and security.
Overall, SmartRead enhances comprehension, maintains engagement, and drives a more immersive learning experience, making it a scalable solution for educational institutions aiming to improve student outcomes in digital reading environments.

8. Future Work

Expanding the sample to include more diverse institutions, cultural settings, and user profiles will enhance the generalizability of findings. Further studies should also evaluate a fully developed version of SmartRead in real-world academic settings over extended periods. This would help track actual engagement trends, knowledge retention, and academic performance, moving beyond self-reported perceptions.

Author Contributions

Conceptualization, I.P.; methodology, I.P. and N.G.; software, I.P.; validation, I.P.; formal analysis, I.P.; investigation, I.P.; resources, I.P.; data curation, I.P.; writing—original draft preparation, I.P.; writing—review and editing, N.G.; visualization, I.P.; supervision, N.G. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the University of Hull’s ethical research policy, approved by the Faculty of Science and Engineering Ethics Committee. (Approval Code: FOSE-24-25-032, Approval Date: 11 February 2025).

Informed Consent Statement

Informed consent was obtained from all students who participated in the research survey; all understood the purpose of the research and agreed to participate. No personal information was requested from participants, as the survey was completely anonymous.

Data Availability Statement

The data are available from the authors upon request.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
AI: Artificial Intelligence

References

  1. Almasri, F. Exploring the Impact of Artificial Intelligence in Teaching and Learning of Science: A Systematic Review of Empirical Research. Res. Sci. Educ. 2024, 54, 977–997. [Google Scholar] [CrossRef]
  2. Walter, Y. Embracing the future of Artificial Intelligence in the classroom: The relevance of AI literacy, prompt engineering, and critical thinking in modern education. Int. J. Educ. Technol. High. Educ. 2024, 21, 15. [Google Scholar] [CrossRef]
  3. Hong, H.Y.; Lee, Y.H. Computer-supported knowledge building to enhance reading motivation and comprehension. Br. J. Educ. Technol. 2023, 54, 375–393. [Google Scholar] [CrossRef]
  4. Besele, M. Challenges of Online Learning to University Students; Education Resources Information Center: Washington, DC, USA, 2021. [Google Scholar]
  5. Gaddis, M.L. Faculty and student technology use to enhance student learning. Int. Rev. Res. Open Distrib. Learn. 2020, 21, 39–60. [Google Scholar] [CrossRef]
  6. Dontre, A.J. The influence of technology on academic distraction: A review. Hum. Behav. Emerg. Technol. 2020, 3, 379–390. [Google Scholar] [CrossRef]
  7. Ng, W. Personalized learning paths: Impact on student engagement and learning outcomes. Int. J. Educ. Technol. High. Educ. 2022, 19, 207–221. [Google Scholar]
  8. Meridha, J.M. The causes of poor digital literacy in educational practice, and possible solutions among the stakeholders: A systematic literature review. SN Soc. Sci. 2024, 4, 210. [Google Scholar] [CrossRef]
  9. Wong, Z.Y.; Liem, G.A.D.; Chan, M.; Datu, J.A.D. Student engagement and its association with academic achievement and subjective well-being: A systematic review and meta-analysis. J. Educ. Psychol. 2024, 116, 48–75. [Google Scholar] [CrossRef]
  10. Otto, S.; Bertel, L.B.; Lyngdorf, N.E.R.; Markman, A.O.; Andersen, T.; Ryberg, T. Emerging Digital Practices Supporting Student-Centered Learning Environments in Higher Education: A Review of Literature and Lessons Learned from the COVID-19 Pandemic. Educ. Inf. Technol. 2024, 29, 1673–1696. [Google Scholar] [CrossRef] [PubMed]
  11. Audrin, C.; Audrin, B. Key factors in digital literacy in learning and education: A systematic literature review using text mining. Educ. Inf. Technol. 2022, 27, 7395–7419. [Google Scholar] [CrossRef]
  12. Bajúzová, M.; Hrmo, R. The Impact of Digital Tools in Education on Students’ Creativity. R&E Source 2024, 11, 4–18. [Google Scholar] [CrossRef]
  13. Lin, Y.; Yu, Z. Extending Technology Acceptance Model to higher-education students’ use of digital academic reading tools on computers. Int. J. Educ. Technol. High. Educ. 2023, 20, 34. [Google Scholar] [CrossRef]
  14. Chavez, O.J.; Palaoag, T. AI-driven mobile application: Unraveling students’ motivational feature preferences for reading comprehension. J. Res. Innov. Teach. Learn. 2024, 17, 226–242. [Google Scholar] [CrossRef]
  15. Lim, B.C.Y.; Liu, L.W.L.; Hou, C.C. Investigating the effects of interactive e-book towards academic achievement. Asian J. Univ. Educ. 2020, 16, 78–88. [Google Scholar] [CrossRef]
  16. Clark, C.; Rumbold, K. Reading for Pleasure: A Research Overview; National Literacy Trust: London, UK, 2023. [Google Scholar]
  17. Bacak, J.; Wagner, J.; Martin, F.; Byker, E.; Wang, W.; Alhgrim-Delzell, L. Examining technologies used in K-12 school districts: A proposed framework for classifying educational technologies. J. Educ. Technol. Syst. 2023, 51, 282–302. [Google Scholar] [CrossRef]
  18. Xu, B.; Stephens, J.M.; Lee, K. Assessing Student Engagement in Collaborative Learning: Development and Validation of New Measure in China. Asia-Pac. Educ. Res. 2024, 33, 395–405. [Google Scholar] [CrossRef]
  19. Măduţa, G. Digital Tools Adoption in Pedagogy Approaches in Language Learning for Economics Students. Rom. Econ. Bus. Rev. 2023, 18, 21–31. [Google Scholar]
  20. Nkomo, L.M.; Daniel, B.K.; Butson, R.J. Synthesis of student engagement with digital technologies: A systematic review of the literature. Int. J. Educ. Technol. High. Educ. 2021, 18, 34. [Google Scholar] [PubMed]
  21. Mustafa, A. Does the digital divide matter? Factors and conditions that promote ICT literacy. Telemat. Inform. 2021, 58, 1–9. [Google Scholar] [CrossRef]
  22. Selwyn, N. Education in a Digital World: Global Perspectives on Technology and Education; Routledge: Abington-on-Thames, UK, 2022. [Google Scholar]
  23. U.S. Department of Education. Professional Development for Teachers in the Digital Age. Educ. Technol. Res. Dev. 2023, 45, 123–134. [Google Scholar]
  24. Pérez-Juárez, M.Á.; González-Ortega, D.; Aguiar-Pérez, J.M. Digital Distractions from the Point of View of Higher Education Students. Sustainability 2023, 15, 6044. [Google Scholar] [CrossRef]
  25. Huang, S.; Spector, J.M.; Yang, J. The impact of interactive e-books on secondary school students’ reading comprehension. Comput. Educ. 2022, 150, 103842. [Google Scholar]
  26. Jaramillo-Mediavilla, L.; Basantes-Andrade, A.; Cabezas-González, M.; Casillas-Martín, S. Impact of gamification on motivation and academic performance: A systematic review. Educ. Sci. 2024, 14, 639. [Google Scholar] [CrossRef]
  27. Sahli, S.; Spriet, T. Gamification Based Collaborative Learning: The Impact of Rewards on Student Motivation. In Proceedings of the 2023 International Conference on Interactive Collaborative Learning, Paris, France, 27 August–1 September 2023; Springer: Cham, Switzerland, 2024. [Google Scholar]
  28. Shafiee Rad, H. Reinforcing L2 reading comprehension through artificial intelligence intervention: Refining engagement to foster self-regulated learning. Smart Learn. Environ. 2025, 12, 23. [Google Scholar] [CrossRef]
  29. Mills, K.A.; Brown, A.; Moro, C. Virtual and augmented reality text environments support self-directed multimodal reading. Interact. Learn. Environ. 2025, 1–20. [Google Scholar] [CrossRef]
  30. Salaberri, M.; Gil, M.; Sylla, C. GamAll: Playing Beyond Boundaries—Gamification and Multimodal Literacy. In Proceedings of the 5th EAI International Conference on Design, Learning, and Innovation, Virtual Event, 10–11 December 2020; Springer: Cham, Switzerland, 2021. [Google Scholar]
  31. Mao, M.; Perez-Cabarcas, M.M.; Kallakuri, U.; Waytowich, N.R.; Lin, X.; Mohsenin, T. Multi-RAG: A Multimodal Retrieval-Augmented Generation System for Adaptive Video Understanding. arXiv 2025, arXiv:2505.23990. [Google Scholar]
  32. Chen, C.; Leitch, A. LLMs as Academic Reading Companions: Extending HCI Through Synthetic Personae. arXiv 2024, arXiv:2403.19506. [Google Scholar] [CrossRef]
  33. Zhang, P. Effects of highlights and annotations on EFL learners’ Reading comprehension: An application of computer-assisted interactive reading model. Comput. Assist. Lang. Learn. 2024, 1–33. [Google Scholar] [CrossRef]
  34. Chen, C.-M.; Li, M.-C.; Chen, T.-C. A web-based collaborative reading annotation system with gamification mechanisms to improve reading performance. Comput. Educ. 2020, 144, 103697. [Google Scholar] [CrossRef]
  35. Turčáni, M.; Balogh, Z.; Kohútek, M. Evaluating computer science students reading comprehension of educational multimedia-enhanced text using scalable eye-tracking methodology. Smart Learn. Environ. 2024, 11, 29. [Google Scholar] [CrossRef]
  36. Hidayat, M.T. Effectiveness of AI-Based Personalised Reading Platforms in Enhancing Reading Comprehension. J. Learn. Dev. 2024, 11, 115–125. [Google Scholar] [CrossRef]
  37. Keelor, J.L.; Creaghead, N.; Silbert, N.; Horowitz-Kraus, T. Text-to-speech technology: Enhancing reading comprehension for students with reading difficulty. Assist. Technol. Outcomes Benefits 2020, 14, 19–35. [Google Scholar]
  38. Asif, M.; Sheeraz, M.; Sacco, S.J. Evaluating the Impact of Technological Tools on the Academic Performance of English Language Learners at Tertiary Level: A Pilot Investigation. Pegem J. Educ. Instr. 2022, 12, 272–282. [Google Scholar]
  39. Miles, H.; Ari, F. Implementing Personalized Reading Plans with an E-Book Library to Support First-Grade Students’ Reading Engagement and Comprehension. TechTrends 2022, 66, 914–922. [Google Scholar] [CrossRef]
Figure 1. Impact of digital tools on engagement.
Figure 2. Top eReading features.
Figure 3. Impact of eReading tools on assimilation (comprehension and retention).
Figure 4. Desired technical features.
Figure 5. Challenges.
Figure 6. Assimilation by reading frequency.
Figure 7. Engagement by reading frequency.
Figure 8. Engagement by faculty.
Figure 9. Test: assimilation by age group.
Figure 10. Correlation between engagement and assimilation.
Figure 11. Preferences for AI-powered assistants by age group.
Figure 12. Preferences for AI-powered assistants by faculty.
Figure 13. Flowchart illustrating the navigation in the platform.
Figure 14. Flowchart illustrating the engagement and interaction features.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Pelumi, I.; Gordon, N. SmartRead: A Multimodal eReading Platform Integrating Computing and Gamification to Enhance Student Engagement and Knowledge Retention. Multimodal Technol. Interact. 2025, 9, 101. https://doi.org/10.3390/mti9100101
