Article

The MUG-10 Framework for Preventing Usability Issues in Mobile Application Development

1 Department of Software Engineering, Faculty of Electronics, Telecommunications and Informatics, Gdansk University of Technology, Narutowicza 11/12, 80-233 Gdansk, Pomorskie, Poland
2 Faculty of Physical Culture, Gdansk University of Physical Education and Sport, Kazimierza Gorskiego 1, 80-336 Gdansk, Pomorskie, Poland
* Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(22), 11995; https://doi.org/10.3390/app152211995
Submission received: 6 October 2025 / Revised: 30 October 2025 / Accepted: 10 November 2025 / Published: 12 November 2025
(This article belongs to the Special Issue Cyber Security and Software Engineering)

Abstract

Nowadays, mobile applications are essential tools for everyday life, providing users with anytime, anywhere access to up-to-date information, communication, and entertainment. At the same time, hardware limitations and the diverse needs of different user groups pose a number of design and development challenges. According to recent studies, usability is one of the most prominent among them. However, few studies have directly addressed the countermeasures that can be applied to avoid usability issues in mobile application development. Through a survey of 20 mobile software design and development practitioners, this study aims to fill this research gap. Given the qualitative nature of the data collected, and with the goal of capturing and preserving the intrinsic meanings embedded in the experts’ statements, we adopted in vivo coding. The analysis of the collected material enabled us to develop a novel framework consisting of ten guidelines and three activities with general applicability. In addition, active collaboration with users in testing and collecting feedback was often emphasized at each stage of mobile application development. Future research should consider focused action research that evaluates the effectiveness of our recommendations and validates them across different stakeholder groups. In this regard, the development of automated tools to support early detection and mitigation of usability issues during mobile application development could also be considered.

1. Introduction

In the realm of ubiquitous wireless networking and mobile devices, mobile application development is constantly evolving to meet the growing requirements of users [1]. At the same time, the mobile app market is considered both attractive and profitable [2], with a global valuation of USD 252.89 billion in 2023 and over 6.3 billion users worldwide [3]. As mobile applications have penetrated almost every area of human activity, from health to education [4], finance [5], communication [6], and entertainment [7], one of the main areas of ongoing research concerns usability [8,9].
In the realm of mobile applications, usability is typically understood in terms of the ISO 9241-11 standard [10], which defines usability as the extent to which a product (system, software, or service) can be used by specified users to achieve their goals with effectiveness, efficiency, and satisfaction in a specified context of use [11]. However, over the years, usability has also been evaluated based on various other attributes that reflect the specific needs and requirements of different user groups [12,13,14,15]. Although a customized app can better align with users’ needs and objectives, it can also introduce new usability challenges.
Undeniably, both the academic and software development communities are still developing and testing different frameworks [16,17], methods [18,19] and tools [20,21] to address a variety of mobile usability issues in an effective way [22,23]. Although recent studies have reported the results of usability testing across various applications [24,25,26], demonstrating the use of diverse measures, to the best of our knowledge, a general framework for preventing usability issues in mobile application development has not yet been introduced. The present study attempts to fill this research gap. To achieve this goal, we interviewed twenty human-computer interaction experts and carried out a targeted literature review.
The lack of a comparable solution emphasizes the novelty of the introduced MUG-10 framework. Our framework is generic by design, meaning it can be applied in different functional and non-functional requirement configurations and settings. More specifically, the novelty of the framework lies in its comprehensive and systematic definition of ten general guidelines that promote mobile application development with a strong focus on usability. Additionally, the analysis reported in the form of an impact-effort matrix supports more informed decision-making regarding the expected impact on perceived usability and the amount of human, time, and work resources required.
The rest of the paper is organized as follows. Section 2 provides the theoretical background. Section 3 outlines the research methodology. Section 4 presents the MUG-10 framework. Section 5 evaluates the introduced MUG-10 framework. Section 6 discusses the limitations and contributions of the study. Section 7 concludes the paper and suggests future research directions.

2. Background

By its very nature, usability does not exist on its own [27]; it arises from the user’s conscious experience with a mobile application [28]. Consequently, any imperfection in that experience constitutes an issue. In this view, the definition of usability provides three attributes to be considered. First, effectiveness refers to a user’s ability to successfully complete a task within a given context [29]. Any obstacles hindering task completion are considered usability issues. Second, efficiency relates to how quickly and accurately a user can complete a task [30]. From this perspective, any deficiencies in the application’s processing capacity are considered issues as well [31]. Third, satisfaction refers to a user’s perceived level of comfort, pleasure, or fulfillment of expectations and needs [32]. From this standpoint, an issue refers to any negative emotional or subjective response a user experiences as a result of interacting with the application [33].
That being said, usability issues with mobile applications can affect every aspect of the application in practice. Furthermore, given the diverse and changing contexts in which users interact with applications, perceived usability falls under a much broader umbrella. Digital distribution services such as Google Play or the App Store have opened up new possibilities for software vendors to collect user feedback on their published mobile applications, as users can freely express their opinions on various topics, including usability [34]. Indeed, many studies have analyzed online user opinions about mobile applications, providing valuable insights and enriching the existing literature on firsthand mobile usability issues.
For example, Iacob et al. [35] found that users had difficulty configuring, using, and learning the apps. Ismail et al. [36] identified various issues, including accessibility, accuracy, conciseness, ease of use, convenience, learnability, user satisfaction, and task-technology fit, among many others. Weichbroth and Baj-Rogowska [37] synthesized the opinions and further classified the extracted keywords into seven attributes, namely: efficiency, satisfaction, effectiveness, errors, ease of use, cognitive load, and operability. Keeping in mind that over 2.08 billion users play games on their mobile devices [38], Ho and Tu [39] found that users paid attention to stability, ease of use, and accuracy, among other factors.
In light of recent research conducted through expert interviews [40], ten categories related to the usability of mobile applications were identified, including: Information Architecture (27.36%), User Interface (20.75%), Performance (16.04%), Interaction Patterns (10.38%), Aesthetics, Errors, and Hardware (4.72% each), as well as Advertising, Responsiveness, and Security (3.77% each). In this regard, Figure 1 provides a graphical summary.
In response to these issues, researchers and practitioners have developed new patterns to overcome hardware limitations, especially smaller screens that now also serve as both a keyboard and a mouse. Neil [41] introduced a comprehensive guide to designing intuitive and effective mobile user interfaces. The guide categorizes 70 mobile patterns related to navigation, forms, tables, search, and data visualization. For each pattern, Neil offers practical solutions and best practices, emphasizing the importance of consistency and responsiveness across different platforms, supported by real-world examples from successful apps. On the other hand, Neil also discusses the anti-patterns of mobile design, which can be considered counterproductive practices from a user’s point of view.
Grill et al. [42] presented an iterative, user-centered development process that integrates an interaction pattern library into the design process. Adopting a pattern-based approach takes into account not only functional and interaction requirements, but also architectural aspects. The authors elaborated on the key parameters influencing the design of mobile applications and discussed integrating these parameters into a user-centered development process focused on design and evaluation. The proposed mobile interaction design patterns approach addresses the specific requirements of mobile applications by demonstrating how designers can apply proven solutions and supporting tools to enhance the overall quality and usability of their designs.
Punchoojit and Hongwarittorrn [43] provide a systematic literature review of the existing studies on mobile UI design patterns, giving an interesting overview of recent research on mobile design. The results show that the touchscreen is the primary factor influencing the research direction of mobile user interfaces. Important qualities of touchscreens that influence these directions include limited screen size, the lack of physical response and tactile feedback, invisible gestures, the ubiquity of mobile devices, and the high demand for visual attention.
A more recent study by da Silva et al. [44] introduces a systematic mapping of the latest research on interaction design patterns for mobile devices. The study identified 18 categories of patterns that cover different aspects of interaction. The authors also highlighted the lack of accessibility-specific design patterns and the need to broaden their scope in design research.
However, despite the maturity of usability guidelines, well-established best practices, and advanced software tools, numerous challenges and unresolved issues still need to be addressed to meet the diverse requirements of mobile users [45]. Since, to the best of our knowledge, the current literature lacks a comprehensive perspective on this matter, we aim to identify existing countermeasures that can be applied to eliminate or mitigate usability issues in mobile application development and maintenance.

3. Methodology

In our study we put forward the following research question: What countermeasures can be taken to avoid usability problems in mobile application development? Given the qualitative nature of the study and prior research in this area, we decided to conduct expert interviews, which allowed us to gather insights from individuals with specialized knowledge and experience.

3.1. Identify and Recruit Experts

Experts were identified and recruited by both authors using convenience sampling, a non-probability method based on accessibility and willingness to participate. This approach allowed us to efficiently engage individuals with relevant knowledge and experience in mobile application usability. In total, 21 experts were recruited.

3.2. Interview Protocol

Since our study is organized around an explicit theme, we adopted a fully structured protocol consisting of three parts. The first part contains a brief introduction. The second part includes an open-ended research question. The third part asks for demographic information.

3.3. Interview Organization

We conducted one round of data collection, which lasted from May 2024 to February 2025. In this regard, a request with a brief description of the study (including the research question), along with the attached file, was sent to each respondent who agreed to participate. Therefore, a participant could respond at any time and from any place. Note that respondents did not receive any compensation.

3.4. Data Sample

All data was collected in a single online spreadsheet using a popular and free tool from one of the world’s leading software and services providers. Based on an initial assessment of the data collected, Expert #4 was excluded from the analysis due to the cursory and generic nature of the responses submitted. The data was collected in Polish and then translated into English. Eventually, the data sample included statements provided by 20 experts, with a total of 2021 words (11,151 characters without spaces). Table 1 presents the experts’ profiles along with the length of the answers provided.
In summary, 20 experts participated in our study, including 10 men and 10 women. The average age of the men and women was 31.9 and 38.6 years, respectively. Of the 20 experts, all women and eight men had higher education, while two men had secondary education. In terms of professional experience, all positions held have involved mobile application development at various stages. In our view, such diversity should be seen as beneficial, as it provides a broader perspective on the research problem. On average, women (13 years) had more work experience than men (9 years). Similarly, the average number of declared projects was higher for women (7.6) than for men (5.1).
The length of the experts’ responses varies significantly, ranging from 12 to 322 words, with an average of 101.05 words per answer. Interestingly, on average, men (119.7 words) provided longer answers than women (82.4 words). However, note that the length of a response does not always reflect its information value.

3.5. Data Analysis

After preparing the data set, we carried out an initial reading to get a general sense of the content. In particular, we focused on the participants’ language, especially in the context of the research problem. The in vivo codes were extracted in two rounds. In the first (extraction) round, we highlighted exact words or phrases used by the participants that stood out from common language, extracting a total of 212 codes. In the second (pruning) round, we removed all words and phrases that were irrelevant to the research question, removing 41 codes.

3.6. Data Synthesis

Next, the remaining 171 codes were manually stemmed, meaning that prefixes and suffixes were removed from words to convert them to their root form. Grouping and categorizing the codes was done in two rounds. In the first, we examined the codes, looking for similarities, patterns, or contradictions among them. In the second, we grouped the codes into broader themes, while still preserving the participants’ voices, and counted their frequencies. Finally, we divided the codes into two categories: guideline (an indication of what or how something should be done) and activity (things people do, especially to achieve a particular goal). Our findings include ten guidelines and three activities, which are discussed in more detail in the next section.
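To make the bookkeeping behind this step more tangible, the following minimal Kotlin sketch groups already-stemmed codes into themes and counts their frequencies. The sample codes, the theme keyword map, and the themeOf() helper are illustrative placeholders, not the study’s actual data; in our study, the grouping itself was performed manually by the researchers, and a script of this kind can at most serve as a cross-check of the manual counts.

```kotlin
// A minimal sketch of the grouping-and-counting bookkeeping described above, assuming
// the pruned in vivo codes have already been reduced to their root forms. All names
// and sample values are illustrative assumptions.
fun main() {
    val stemmedCodes = listOf(
        "test with user", "test early", "involve user", "prototype", "prototype early"
    )

    // Hypothetical theme dictionary: a code is assigned to the first theme whose
    // keyword it contains; unmatched codes fall into "other" for another review pass.
    val themes = mapOf(
        "user testing" to listOf("test", "user"),
        "prototyping" to listOf("prototype", "mock-up")
    )
    fun themeOf(code: String): String =
        themes.entries.firstOrNull { (_, keywords) -> keywords.any { code.contains(it) } }
            ?.key ?: "other"

    // Frequency of each theme, preserving the participants' wording inside each group.
    stemmedCodes.groupBy { themeOf(it) }
        .forEach { (theme, codes) -> println("$theme (${codes.size}): $codes") }
}
```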

4. Results

In this section, we first present and discuss a structured framework, consisting of ten guidelines, for preventing usability issues in mobile application development. Then, we outline three activities that can be undertaken to effectively detect and eliminate these issues.
Figure 2 presents the ten mobile usability guidelines, hereafter referred to as the MUG-10 framework.
G1: 
Prepare documentation. Maintain technical, accurate, and up-to-date documentation to ensure alignment across teams, and engage stakeholders in the documentation process to ensure their perspectives are captured and reflected in product development [46,47]. In a broader sense, well-prepared documentation supports consistent decision-making and minimizes the risk of miscommunication throughout the project lifecycle [48].
G2: 
Use storytelling. Through scenarios and personas, storytelling techniques can be used to better understand and address user needs [49,50]. Storytelling builds connections, inspires action, and facilitates learning more effectively than presenting raw data alone by tapping into human emotions and shared experiences. Besides, this practice also promotes more effective communication among people by translating abstract usability narratives into specific objectives [51].
G3: 
Apply User-Centered Design (UCD) principles. By definition, UCD is an iterative design process that emphasizes the active engagement of users throughout the mobile application development process [52,53]. By involving users in research, testing, and continuous feedback, development teams can identify and resolve usability issues effectively. Early and frequent testing, in particular, helps ensure that the final application closely aligns with users’ expectations [54].
G4: 
Prioritize core functionality. Mobile users typically engage with apps to quickly and efficiently perform specific tasks. Prioritizing core functionality eliminates distractions, reduces interface complexity, and improves the overall user experience [55,56]. Besides, by focusing development efforts on the most critical features, before adding additional functions or fine-tuning visuals, a minimum viable product (MVP) can be delivered without overwhelming users [57].
G5: 
Use design patterns and templates. Established design patterns and templates offer reliable solutions to common UI issues, reducing cognitive load by providing well-known icons, symbols, and interface structures [44]. These solutions can speed up development by giving teams reusable components that are easier to implement, maintain, and test. Using familiar patterns also improves learnability [58], as users can take advantage of their prior knowledge.
G6: 
Develop mock-ups and prototypes. Use static visuals, both low- and high-fidelity, to illustrate the application layout [59]. Then follow up with interactive prototypes that enable early user testing, help validate design concepts, and gather first-hand user feedback [60,61]. Graphical artifacts also facilitate better communication among designers, developers, and other stakeholders by providing a shared reference point [62].
G7: 
Follow accessibility guidelines. Design inclusive interfaces that follow the POUR (Perceivable, Operable, Understandable, Robust) principles [63,64] to create truly inclusive content [65]. Adhering to POUR guidelines ensures that a mobile application is usable by people with diverse abilities, including those with visual, auditory, motor, or cognitive impairments. Additionally, following these standards supports legal compliance and shields organizations from reputational harm (a simple pre-release check over these principles is sketched after the framework summary below).
G8: 
Ensure responsive design across devices. Design and test for multiple screen sizes to provide a smooth user experience [66]; key features include flexible grids, layouts, and images that automatically adjust to different resolutions and device types [67] (see the breakpoint sketch after G10 below). Additionally, a well-implemented responsive layout reduces the need for redundant code across separate app versions, streamlining both current maintenance and future development.
G9: 
Assure user privacy. Design and communicate transparent authorization patterns that address the permissions granted to an application to access specific data and functions on a user’s device, and the means to protect the user’s personal information [68], including the right to control how such information is used, stored, and shared, while preserving the user’s anonymity where possible [69].
G10: 
Conduct A/B testing. Design, implement, and test two product versions to collect data-driven insights and make informed decisions based on user behavior [70,71]. By making decisions based on factual user feedback rather than arbitrary assumptions, the A/B testing approach helps determine the best solution [72].
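To make one of the more technical guidelines concrete, the minimal Kotlin sketch below illustrates the breakpoint logic behind G8. The LayoutClass names and the layoutClassFor() helper are hypothetical, and the 600/840 dp thresholds merely reflect commonly used compact/medium/expanded window breakpoints; the sketch is an assumption for illustration, not a prescribed implementation.

```kotlin
// A simplified sketch of responsive layout selection (G8): one screen definition adapts
// to the available width instead of shipping separate app versions.
enum class LayoutClass { COMPACT, MEDIUM, EXPANDED }

fun layoutClassFor(widthDp: Int): LayoutClass = when {
    widthDp < 600 -> LayoutClass.COMPACT   // phones in portrait: single-column layout
    widthDp < 840 -> LayoutClass.MEDIUM    // large phones and small tablets: two panes
    else          -> LayoutClass.EXPANDED  // tablets and foldables: multi-pane grid
}

fun main() {
    listOf(360, 700, 1024).forEach { width ->
        println("$width dp -> ${layoutClassFor(width)}")
    }
}
```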
In summary, our MUG-10 framework offers a comprehensive approach applicable to any mobile application development project. As can be observed, it emphasizes early planning and communication through documentation, storytelling, and mock-ups, keeping user preferences at the forefront. Prioritizing core functionality, applying User-Centered Design (UCD) principles, and using proven design patterns leads to user-friendly interfaces. On top of that, technical practices such as responsive design, accessibility compliance, and A/B testing are also key to achieving greater usability and inclusivity. Lastly, safeguarding user privacy reinforces trust, which is essential for the acceptance of technology and, ultimately, long-term user engagement.
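Accessibility (G7) can be supported in a similarly concrete way. The sketch below runs a few POUR-oriented checks over a hypothetical screen model; UiElement, pourIssues(), and the thresholds used (48 dp touch targets, 4.5:1 text contrast) reflect commonly recommended values but are assumptions, not a complete accessibility audit.

```kotlin
// A minimal pre-release pass over POUR-oriented checks (G7). All names and thresholds
// here are illustrative assumptions; a real audit covers far more criteria.
data class UiElement(
    val id: String,
    val contentDescription: String?,  // text read aloud by screen readers
    val touchTargetDp: Int,           // smallest tappable dimension
    val contrastRatio: Double         // text-to-background contrast
)

fun pourIssues(element: UiElement): List<String> = buildList {
    if (element.contentDescription.isNullOrBlank())
        add("${element.id}: not perceivable by screen readers (missing content description)")
    if (element.touchTargetDp < 48)
        add("${element.id}: hard to operate (touch target below 48 dp)")
    if (element.contrastRatio < 4.5)
        add("${element.id}: low text contrast (below 4.5:1)")
}

fun main() {
    val screen = listOf(
        UiElement("submitButton", contentDescription = "Submit order", touchTargetDp = 48, contrastRatio = 7.1),
        UiElement("closeIcon", contentDescription = null, touchTargetDp = 32, contrastRatio = 3.2)
    )
    screen.flatMap { pourIssues(it) }.forEach { println(it) }
}
```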
From a practical point of view, there are three activities which can be undertaken to detect and eliminate usability issues in mobile application development.
  • User testing. Experts strongly emphasized involving at least five users in usability testing [73] through live and moderated sessions on a regular basis, at each stage of app development [74]. One can also consider A/B testing, with the aim of comparing two versions of a design to determine which one leads to better outcomes based on actual user behavior [75] (a sketch of the underlying mechanics follows this list). Such an approach helps reduce opinion-based changes [76], allows testing of specific variables (e.g., layout, color, wording) [77], and supports data-driven decision making to ultimately select the optimal solution [78]. Notably, low-fidelity prototypes should be used in the early stages of application design, even though they tend to be visually unappealing and lack detail [79]. On the other hand, high-fidelity models should accompany feature implementation, allowing users to interact with the solution as if it were fully developed [80].
  • Heuristics testing. In this approach, instead of users, experts assess an interface against established usability principles, guidelines, or rules of thumb, known as heuristics. Heuristics can be understood as a practical means of quickly identifying issues, but they are not fixed solutions. In this view, heuristics guide the evaluator in testing various scenarios, for example by applying Nielsen’s 10 usability principles [81].
  • Smoke testing. Depending on the stage of the development process and the team’s workflow, smoke testing is usually performed by quality assurance testers or developers [82]. Smoke testing is a quick way to detect critical or major defects early [83], helping to avoid spending time on detailed testing when the app build is seriously flawed [84]. In this sense, smoke testing can be seen as build verification testing that precedes user testing.
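As a complement to the user-testing item above, the Kotlin sketch below shows the basic mechanics behind an A/B comparison: deterministic variant assignment from a stable user identifier and a two-proportion z-test on conversion counts. The helper names, the 50/50 split, and the numbers are illustrative assumptions only.

```kotlin
import kotlin.math.abs
import kotlin.math.sqrt

// Assign each user to variant "A" or "B" from a stable identifier so that the same
// user always sees the same version of the screen under test (hypothetical helper).
fun variantFor(userId: String): String =
    if ((userId.hashCode() and 1) == 0) "A" else "B"

// Two-proportion z statistic for conversion rates; |z| > 1.96 roughly corresponds to
// statistical significance at the 5% level.
fun zScore(convA: Int, nA: Int, convB: Int, nB: Int): Double {
    val pA = convA.toDouble() / nA
    val pB = convB.toDouble() / nB
    val pooled = (convA + convB).toDouble() / (nA + nB)
    val standardError = sqrt(pooled * (1 - pooled) * (1.0 / nA + 1.0 / nB))
    return (pA - pB) / standardError
}

fun main() {
    println("user-42 sees variant ${variantFor("user-42")}")
    val z = zScore(convA = 120, nA = 1000, convB = 150, nB = 1000)  // made-up counts
    val verdict = if (abs(z) > 1.96) "difference is statistically significant" else "no clear winner yet"
    println("z = %.2f -> %s".format(z, verdict))
}
```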
In summary, involving multiple perspectives in comprehensive usability testing throughout the design and development process makes an application usable and user-friendly. Engaging major stakeholders is not merely a recommended practice but an imperative. Ultimately, users’ intention to use and adopt the mobile application in the long term undeniably confirms its quality and value.

5. Framework Evaluation

Analyzing the MUG-10 framework guidelines individually raises two questions. First, how much effort is required to determine and implement a course of action? Second, what is the expected impact on user value? To answer these two questions, each guideline was independently evaluated based on available published research on mobile application usability.
In order to structure and map the relationships between these two factors, we used the impact-effort matrix tool [85]. Note that, in our case, effort refers to the amount of time, resources, and work required to implement the guideline, whereas impact refers to the potential positive change in the perceived usability of the mobile application by its users.
Figure 3 shows the detailed impact-effort matrix, which categorizes each guideline into one of four quadrants [86] (a minimal categorization sketch follows the list):
  • Quick Win: High Impact, Low Effort (upper left square).
  • Strategic Investment: High Impact, High Effort (upper right square).
  • Low Priority: Low Impact, Low Effort (lower left square).
  • Reconsider: Low Impact, High Effort (lower right square).
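The quadrant assignment itself can be expressed compactly, as in the Kotlin sketch below. The Rating scores shown are illustrative placeholders chosen only to mirror the qualitative placement argued in the following paragraphs; they were not elicited in the study.

```kotlin
// A minimal sketch of the impact-effort categorization, assuming each guideline is
// rated on two ordinal 1-5 scales. Quadrant, Rating, and the sample scores are
// illustrative assumptions only.
enum class Quadrant { QUICK_WIN, STRATEGIC_INVESTMENT, LOW_PRIORITY, RECONSIDER }

data class Rating(val guideline: String, val impact: Int, val effort: Int)

fun quadrantOf(rating: Rating, threshold: Int = 3): Quadrant = when {
    rating.impact >= threshold && rating.effort < threshold -> Quadrant.QUICK_WIN
    rating.impact >= threshold                              -> Quadrant.STRATEGIC_INVESTMENT
    rating.effort < threshold                               -> Quadrant.LOW_PRIORITY
    else                                                    -> Quadrant.RECONSIDER
}

fun main() {
    val ratings = listOf(
        Rating("G4 Prioritize core functionality", impact = 5, effort = 2),
        Rating("G7 Follow accessibility guidelines", impact = 4, effort = 4),
        Rating("G2 Use storytelling", impact = 2, effort = 2),
        Rating("G1 Prepare documentation", impact = 2, effort = 4)
    )
    ratings.groupBy { quadrantOf(it) }.forEach { (quadrant, items) ->
        println("$quadrant: ${items.map { it.guideline }}")
    }
}
```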
The Quick Wins group is the highest priority and includes three guidelines that provide significant user benefits with minimal effort. First, prioritizing core functionality (G4) requires early decision-making and trade-offs but results in a user-focused list of essential app features [87], thereby reducing overall development effort [88]. Second, using design patterns and templates (G5) reduces the work required for user interface design and interaction patterns by relying on tested solutions [89]. Third, although ensuring responsive design across devices (G8) involves testing across various screen sizes and platforms [90], it is essential for delivering a consistent user interface layout and appearance [91,92].
Second in line are Strategic Investments, guidelines that have a significant impact but require a lot of time and resources to implement. Four items fall into this category. First, applying UCD principles (G3) involves iterative testing and active user research [93], leading to strong alignment with user needs and expectations [94,95]. Second, developing mock-ups and prototypes (G6) is time-consuming [96], especially in the early stages [97]. However, it enables early user feedback and testing [98], which significantly reduces the risk of costly interface redesigns in the future [99]. Third, following accessibility guidelines (G7) requires extra effort [100], but it ensures inclusivity and digital equality. Moreover, accessibility is a legal requirement in many regions, such as the European Union (European Accessibility Act) or the United States (Americans with Disabilities Act). With over 1.3 billion people (16% of the world’s population) living with some form of disability [101], mobile applications that include accessible features can reach a broader user base over time. Fourth, assuring user privacy (G9) requires a combination of technical safeguards [102], legal compliance [103], and transparent communication [104]. Despite requiring considerable effort, it is essential for building user trust [105] and meeting ethical [106] and legal standards [107].
In our framework, Low Priority includes low-impact and low-effort guidelines. Only one item falls into this category: the use of storytelling (G2). It is easy to implement in presentations or personas [108] and helps enhance empathy [109] and user understanding. However, its direct impact on usability is limited, since storytelling primarily aims to create emotional ties [110], foster trust [111] and loyalty [112], and motivate users to engage more deeply with the product [113].
At the bottom of the priority list are the Reconsider guidelines, which offer limited benefit to usability while consuming substantial resources. First, preparing documentation (G1) requires considerable time and resources [114], often involving multiple stakeholders across teams [115]. However, its direct impact on usability has not yet been clearly demonstrated. Second, conducting A/B testing (G10) involves preparing two or more versions of an app [116] and carrying out empirical testing with both internal and external users [117]. While A/B testing is often promoted as an effective method for testing a hypothesis [118], in many cases it is simply unnecessary, as both UI design and interaction patterns rely on proven, established solutions that may be sufficient.

6. Discussion

6.1. Contributions

In light of the results discussed above, we argue that our study contributes to the current body of literature in three ways. First, our framework covers a generic and wide range of mobile usability issues, from design and functionality to testing, inclusivity, and privacy. The guidelines are well aligned with recognized best practices and, when applied systematically and in accordance with the experience and knowledge of experts, can significantly reduce usability obstacles and burdens in mobile application development. Second, based on a targeted literature review, we confirm the validity and usefulness of the three usability testing methods currently employed by practitioners in real-world mobile software development environments. Third, we present the results of the guideline evaluation and support our judgments and conclusions with references to related research. From a practical point of view, this analysis can inform stakeholders and enrich discussions about applying specific guidelines to ongoing and future projects.

6.2. Implications

The formulated guidelines for mobile usability are undeniably well known. However, these guidelines are scattered across numerous studies and are sometimes tailored to specific applications. The proposed MUG-10 framework is unified and universal, meaning it can be applied to any mobile application development project regardless of domain, configuration of functional or non-functional requirements, or target user group. Therefore, the MUG-10 framework contributes to a better theoretical understanding of the existing countermeasures available to eliminate or reduce mobile usability issues.
As practitioners are still searching for effective strategies, methods, and tools for developing and maintaining highly appreciated mobile applications, we believe the MUG-10 framework can serve as a valuable source of information. In addition, due to the generic nature of the MUG-10 framework, it can be integrated into agile or user-centered design workflows, serving as a reference point for sprint reviews, design evaluations, and acceptance tests.

6.3. Limitations

Nevertheless, our study, like others of a similar nature, suffers from inherent limitations. First, an interview-based study may reflect biases rooted in the specific assumptions or requirements of the projects involved. In other words, such bias occurs when respondents rely heavily on their own experience and knowledge, which may inadvertently skew their judgments or recommendations. Second, our study relies on a sample size of 20 respondents, which may be considered relatively small. A small sample size may limit the reliability and generalizability of the results, meaning that they may not accurately represent the opinions of the broader population.

7. Conclusions

Undeniably, usability is a key non-functional software quality attribute that directly affects user acceptance. Conversely, poor usability can lead to user frustration and, ultimately, application rejection. While usability issues can arise at any stage of mobile application development, it is important to identify and understand countermeasures that can be applied effectively. To this end, we provide general guidelines and practical activities recognized by the surveyed experts as working countermeasures. Future research could explore how the expert-identified countermeasures perform in real-world mobile application projects through longitudinal case studies. It would also be valuable to conduct targeted action research that evaluates the effectiveness of these recommendations, validating them across different stakeholder groups. In addition, future studies could focus on developing automated tools or frameworks to support the early detection and mitigation of usability issues during mobile application development.

Author Contributions

Conceptualization, P.W.; methodology, P.W.; software, P.W.; validation, P.W. and T.S.; formal analysis, P.W.; investigation, P.W.; resources, P.W.; data curation, P.W.; writing—original draft preparation, P.W.; writing—review and editing, P.W.; visualization, P.W. and T.S.; supervision, P.W.; project administration, P.W.; funding acquisition, P.W. All authors have read and agreed to the published version of the manuscript.

Funding

The first author covered the APC fee with vouchers and statutory funds obtained from the Department of Software Engineering.

Institutional Review Board Statement

In Poland, non-interventional studies (e.g., surveys, questionnaires, social media research) are not subject to mandatory ethical approval under applicable legal regulations, nor are there analogous requirements imposed by Polish universities. Therefore, no formal approval was required for this study. Each participant was verbally informed that their personal data would remain confidential and would not be disclosed at any stage of the research process. Furthermore, we affirm that the findings of this study do not harm the reputation of any participant nor infringe upon their personal rights. Additionally, the research results do not include any content of a racist, political, or regional nature and, as such, do not violate the personal rights of third parties.

Data Availability Statement

The data that support the findings of this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Statista. Mobile App Usage —Statistics & Facts. 2025. Available online: https://www.statista.com/topics/1002/mobile-app-usage/ (accessed on 9 August 2025).
  2. Wagner, C. How Companies Can Do More with Mobile Apps. 2023. Available online: https://www.forbes.com/councils/forbestechcouncil/2023/12/08/how-companies-can-do-more-with-mobile-apps/ (accessed on 26 April 2025).
  3. Grand View Research. Mobile Application Market Size, Share & Trends Analysis Report By Store (Google Store, Apple Store, Others), by Application (Gaming, Music & Entertainment, Health & Fitness, Social Networking), and Region, Segment Forecasts, 2024–2030. 2025. Available online: https://www.grandviewresearch.com/industry-analysis/mobile-application-market (accessed on 23 April 2025).
  4. Gawlik-Kobylińska, M.; Kabashkin, I.; Misnevs, B.; Maciejewski, P. Education mobility as a service: A study of the features of a novel mobility platform. Appl. Sci. 2023, 13, 5245. [Google Scholar] [CrossRef]
  5. García-Méndez, S.; de Arriba-Pérez, F.; González-González, J.; González-Castaño, F.J. Explainable assessment of financial experts’ credibility by classifying social media forecasts and checking the predictions with actual market data. Expert Syst. Appl. 2024, 255, 124515. [Google Scholar] [CrossRef]
  6. Wang, Z.; Du, Y.; Wei, K.; Han, K.; Xu, X.; Wei, G.; Tong, W.; Zhu, P.; Ma, J.; Wang, J.; et al. Vision, application scenarios, and key technology trends for 6G mobile communications. Sci. China Inf. Sci. 2022, 65, 151301. [Google Scholar] [CrossRef]
  7. Falkowski-Gilski, P.; Uhl, T. Current trends in consumption of multimedia content using online streaming platforms: A user-centric survey. Comput. Sci. Rev. 2020, 37, 100268. [Google Scholar] [CrossRef]
  8. Ali, W.; Riaz, O.; Mumtaz, S.; Khan, A.R.; Saba, T.; Bahaj, S.A. Mobile application usability evaluation: A study based on demography. IEEE Access 2022, 10, 41512–41524. [Google Scholar] [CrossRef]
  9. Salman, H.M.; Ahmad, W.F.W.; Sulaiman, S. Usability evaluation of the smartphone user interface in supporting elderly users from experts’ perspective. IEEE Access 2018, 6, 22578–22591. [Google Scholar] [CrossRef]
  10. Weichbroth, P. Usability of mobile applications: A systematic literature study. IEEE Access 2020, 8, 55563–55577. [Google Scholar] [CrossRef]
  11. ISO 9241-11:1998(en); Ergonomic Requirements for Office Work with Visual Display Terminals (VDTs)—Part 11: Guidance on Usability. ISO: Geneva, Switzerland, 1998. Available online: https://www.iso.org/obp/ui/#iso:std:iso:9241:-11:ed-1:v1:en (accessed on 9 August 2025).
  12. Daif, A.; Dahroug, A.T.; López-Nores, M.; González-Soutelo, S.; Bassani, M.; Antoniou, A.; Gil-Solla, A.; Ramos-Cabrer, M.; Pazos-Arias, J.J. A mobile app to learn about cultural and historical associations in a closed loop with humanities experts. Appl. Sci. 2018, 9, 9. [Google Scholar] [CrossRef]
  13. Weichbroth, P. Usability attributes revisited: A time-framed knowledge map. In Proceedings of the IEEE 2018 Federated Conference on Computer Science and Information Systems (FedCSIS), Poznań, Poland, 9–12 September 2018; pp. 1005–1008. [Google Scholar]
  14. Acosta-Vargas, P.; Salvador-Acosta, B.; Salvador-Ullauri, L.; Villegas-Ch, W.; Gonzalez, M. Accessibility in native mobile applications for users with disabilities: A scoping review. Appl. Sci. 2021, 11, 5707. [Google Scholar] [CrossRef]
  15. Iakovets, A.; Balog, M.; Židek, K. The use of mobile applications for sustainable development of SMEs in the context of Industry 4.0. Appl. Sci. 2022, 13, 429. [Google Scholar] [CrossRef]
  16. Lee, H.J.; Lee, J.S.; Jee, E.; Bae, D.H. A user experience evaluation framework for mobile usability. Int. J. Softw. Eng. Knowl. Eng. 2017, 27, 235–279. [Google Scholar] [CrossRef]
  17. Au, F.T.; Baker, S.; Warren, I.; Dobbie, G. Automated usability testing framework. In Proceedings of the Ninth Conference on Australasian User Interface, Wollongong, Australia, 1 January 2008; Volume 76, pp. 55–64. [Google Scholar]
  18. Lee, K.B.; Grice, R.A. Developing a new usability testing method for mobile devices. In Proceedings of the IEEE International Professional Communication Conference, IPCC 2004, Minneapolis, MN, USA, 29 September–2 October 2004; pp. 115–127. [Google Scholar]
  19. Kluth, W.; Krempels, K.H.; Samsel, C. Automated usability testing for mobile applications. In Proceedings of the International Conference on Web Information Systems and Technologies, SCITEPRESS, Barcelona, Spain, 3–5 April 2014; Volume 2, pp. 149–156. [Google Scholar]
  20. Ma, X.; Yan, B.; Chen, G.; Zhang, C.; Huang, K.; Drury, J.; Wang, L. Design and implementation of a toolkit for usability testing of mobile apps. Mob. Netw. Appl. 2013, 18, 81–97. [Google Scholar] [CrossRef]
  21. Arif, K.S.; Ali, U. Mobile Application testing tools and their challenges: A comparative study. In Proceedings of the IEEE 2019 2nd International Conference on Computing, Mathematics and Engineering Technologies (iCoMET), Sukkur, Pakistan, 30–31 January 2019; pp. 1–6. [Google Scholar]
  22. Huang, Z.; Benyoucef, M. A systematic literature review of mobile application usability: Addressing the design perspective. Univers. Access Inf. Soc. 2023, 22, 715–735. [Google Scholar] [CrossRef]
  23. Kumar, B.A.; Chand, S.S.; Goundar, M.S. Usability testing of mobile learning applications: A systematic mapping study. Int. J. Inf. Learn. Technol. 2024, 41, 113–129. [Google Scholar] [CrossRef]
  24. Akmal Muhamat, N.; Hasan, R.; Saddki, N.; Mohd Arshad, M.R.; Ahmad, M. Development and usability testing of mobile application on diet and oral health. PLoS ONE 2021, 16, e0257035. [Google Scholar] [CrossRef]
  25. Storm, M.; Fjellså, H.M.H.; Skjærpe, J.N.; Myers, A.L.; Bartels, S.J.; Fortuna, K.L. Usability testing of a mobile health application for self-management of serious mental illness in a Norwegian community mental health setting. Int. J. Environ. Res. Public Health 2021, 18, 8667. [Google Scholar] [CrossRef] [PubMed]
  26. Yánez-Pérez, I.; Toma, R.B.; Meneses-Villagrá, J.Á. Design and usability evaluation of a mobile app for elementary school inquiry-based science learning. Sch. Sci. Math. 2025, 125, 247–258. [Google Scholar] [CrossRef]
  27. Weichbroth, P. Usability testing of mobile applications: A methodological framework. Appl. Sci. 2024, 14, 1792. [Google Scholar] [CrossRef]
  28. Ballard, B. Designing the Mobile User Experience; John Wiley & Sons: Hoboken, NJ, USA, 2007. [Google Scholar]
  29. Xiong, J.; Acemyan, C.Z.; Kortum, P. SUSapp: A Free Mobile Application That Makes the System Usability Scale (SUS) Easier to Administer. J. Usability Stud. 2020, 15, 135–144. [Google Scholar]
  30. La, H.J.; Lee, H.J.; Kim, S.D. An efficiency-centric design methodology for mobile application architectures. In Proceedings of the 2011 IEEE 7th International Conference on Wireless and Mobile Computing, Networking and Communications (WiMob), Shanghai, China, 10–12 October 2011; pp. 272–279. [Google Scholar]
  31. Al-Sakran, H.O.; Alsudairi, M.A. Usability and accessibility assessment of Saudi Arabia mobile e-government websites. IEEE Access 2021, 9, 48254–48275. [Google Scholar] [CrossRef]
  32. Palyama, D.G.; Tomasila, G. An Important Aspect of Satisfaction on Mobile Apps: An Usability Evaluation Based on Gender. JINAV J. Inf. Vis. 2022, 3, 93–98. [Google Scholar] [CrossRef]
  33. Hajesmaeel-Gohari, S.; Khordastan, F.; Fatehi, F.; Samzadeh, H.; Bahaadinbeigy, K. The most used questionnaires for evaluating satisfaction, usability, acceptance, and quality outcomes of mobile health. BMC Med. Inform. Decis. Mak. 2022, 22, 22. [Google Scholar] [CrossRef]
  34. Iacob, C.; Harrison, R. Retrieving and analyzing mobile apps feature requests from online reviews. In Proceedings of the IEEE 2013 10th Working Conference on Mining Software Repositories (MSR), San Francisco, CA, USA, 18–19 May 2013; pp. 41–44. [Google Scholar]
  35. Iacob, C.; Veerappa, V.; Harrison, R. What are you complaining about?: A study of online reviews of mobile applications. In Proceedings of the 27th International BCS Human Computer Interaction Conference (HCI 2013), BCS Learning & Development, London, UK, 9–13 September 2013. [Google Scholar]
  36. Ismail, N.; Ahmad, F.; Kamaruddin, N.; Ibrahim, R. A review on usability issues in mobile applications. IOSR J. Mob. Comput. Appl. 2016, 3, 47–52. [Google Scholar]
  37. Weichbroth, P.; Baj-Rogowska, A. Do online reviews reveal mobile application usability and user experience? The case of WhatsApp. In Proceedings of the IEEE 2019 Federated Conference on Computer Science and Information Systems (FedCSIS), Leipzig, Germany, 1–4 September 2019; pp. 747–754. [Google Scholar]
  38. Statista. Number of Mobile Game Users Worldwide from 2017 to 2030 (in Billions). 2025. Available online: https://www.statista.com/forecasts/667694/number-mobile-gamers-worldwide (accessed on 28 August 2025).
  39. Ho, S.C.; Tu, Y.C. The investigation of online reviews of mobile games. In Proceedings of the Workshop on E-Business, Shanghai, China, 4 December 2011; Springer: Berlin/Heidelberg, Germany, 2011; pp. 130–139. [Google Scholar]
  40. Weichbroth, P. Usability issues with mobile applications: Insights from practitioners and future research directions. IEEE Access 2025, 13, 91301–91311. [Google Scholar] [CrossRef]
  41. Neil, T. Mobile Design Pattern Gallery: UI Patterns for Smartphone Apps; O’Reilly Media, Inc.: Cambridge, MA, USA, 2014. [Google Scholar]
  42. Grill, T.; Biel, B.; Gruhn, V. A pattern approach to mobile interaction design. Inf. Technol. 2009, 51, 93–101. [Google Scholar]
  43. Punchoojit, L.; Hongwarittorrn, N. Usability studies on mobile user interface design patterns: A systematic literature review. Adv. Hum.-Comput. Interact. 2017, 2017, 6787504. [Google Scholar] [CrossRef]
  44. da Silva, L.F.; Parreira Junior, P.A.; Freire, A.P. Mobile user interaction design patterns: A systematic mapping study. Information 2022, 13, 236. [Google Scholar] [CrossRef]
  45. Genc-Nayebi, N.; Abran, A. A systematic literature review: Opinion mining studies from mobile app store user reviews. J. Syst. Softw. 2017, 125, 207–219. [Google Scholar] [CrossRef]
  46. Aggarwal, C. Documenting the Right Things: Writing Documentation for Mobile App Development. 2024. Available online: https://chetan-aggarwal.medium.com/documenting-the-right-things-writing-documentation-for-mobile-app-development-5f0f024fa5ef (accessed on 9 July 2025).
  47. Mykhailov, M. How to Write a Proper Mobile App Requirements Document in 5 Steps. 2025. Available online: https://nix-united.com/blog/how-to-write-a-proper-mobile-app-requirements-document-in-5-steps/ (accessed on 9 July 2025).
  48. Kortum, F.; Klünder, J.; Schneider, K. Miscommunication in software projects: Early recognition through tendency forecasts. In Proceedings of the International Conference on Product-Focused Software Process Improvement, Trondheim, Norway, 22–24 November 2016; Springer: Cham, Switzerland, 2016; pp. 731–738. [Google Scholar]
  49. De Paolis, L.T.; Gatto, C.; Corchia, L.; De Luca, V. Usability, user experience and mental workload in a mobile Augmented Reality application for digital storytelling in cultural heritage. Virtual Real. 2023, 27, 1117–1143. [Google Scholar] [CrossRef]
  50. Chan, M. Storytelling in UX Work: Study Guide. 2024. Available online: https://www.nngroup.com/articles/storytelling-study-guide/ (accessed on 8 August 2025).
  51. Velhinho, A.; Quintero, E.; Pedro, L.; Almeida, P. Usability Study of a Mobile Platform for Community Memory Sharing and Audiovisual Storytelling. In Proceedings of the Iberoamerican Conference on Applications and Usability of Interactive TV, Santo Domingo, Dominican Republic, 13–15 November 2024; Springer: Cham, Switzerland, 2024; pp. 122–138. [Google Scholar]
  52. Mao, J.Y.; Vredenburg, K.; Smith, P.W.; Carey, T. The state of user-centered design practice. Commun. ACM 2005, 48, 105–109. [Google Scholar] [CrossRef]
  53. Trujillo-Lopez, L.A.; Raymundo-Guevara, R.A.; Morales-Arevalo, J.C. User-Centered Design of a Computer Vision System for Monitoring PPE Compliance in Manufacturing. Computers 2025, 14, 312. [Google Scholar] [CrossRef]
  54. Lopes, A.; Valentim, N.; Moraes, B.; Zilse, R.; Conte, T. Applying user-centered techniques to analyze and design a mobile application. J. Softw. Eng. Res. Dev. 2018, 6, 5. [Google Scholar] [CrossRef]
  55. Singh, V.P. Golden Rules for Mobile System Design Interviews. Mobile System Design Explained in Detail with Examples. 2024. Available online: https://medium.com/@engineervishvnath/golden-rules-for-mobile-system-design-interviews-bd7b71e4f454 (accessed on 9 August 2025).
  56. Sky Web Design Technologies. How to Balance Performance and Design in Mobile App Development. 2024. Available online: https://www.linkedin.com/pulse/how-balance-performance-design-mobile-app-gyzic/ (accessed on 9 July 2025).
  57. Cheng, L.C. The mobile app usability inspection (MAUi) framework as a guide for minimal viable product (MVP) testing in lean development cycle. In Proceedings of the 2nd International Conference in HCI and UX Indonesia 2016, Jakarta, Indonesia, 13–15 April 2016; pp. 1–11. [Google Scholar]
  58. Yang, M.; Gao, Q.; Wang, X. A comprehensive learnability framework for mobile application design for older adults. Univers. Access Inf. Soc. 2024, 24, 1393–1423. [Google Scholar] [CrossRef]
  59. Zhang, T.; Rau, P.L.P.; Salvendy, G.; Zhou, J. Comparing low and high-fidelity prototypes in mobile phone evaluation. Int. J. Technol. Diffus. (IJTD) 2012, 3, 1–19. [Google Scholar] [CrossRef]
  60. Kanai, S.; Horiuchi, S.; Kikuta, Y.; Yokoyama, A.; Shiroma, Y. An integrated environment for testing and assessing the usability of information appliances using digital and physical mock-ups. In Proceedings of the International Conference on Virtual Reality, Beijing, China, 22–27 July 2007; Springer: Berlin/Heidelberg, Germany, 2007; pp. 478–487. [Google Scholar]
  61. Radziszewski, K.; Anacka, H.; Obracht-Prondzyńska, H.; Tomczak, D.; Wereszko, K.; Weichbroth, P. Greencoin: Prototype of a mobile application facilitating and evidencing pro-environmental behavior of citizens. Procedia Comput. Sci. 2021, 192, 2668–2677. [Google Scholar] [CrossRef]
  62. Wirtz, S.; Jakobs, E.M. Improving user experience for passenger information systems. Prototypes and reference objects. IEEE Trans. Prof. Commun. 2013, 56, 120–137. [Google Scholar] [CrossRef]
  63. The World Wide Web Consortium. Web Content Accessibility Guidelines (WCAG) 2.1. 2025. Available online: https://www.w3.org/TR/WCAG21/ (accessed on 8 August 2025).
  64. Halpin, M. Understanding the POUR Accessibility Principles. 2025. Available online: https://reciteme.com/news/pour-accessibility-principles/ (accessed on 8 August 2025).
  65. Ballantyne, M.; Jha, A.; Jacobsen, A.; Hawker, J.S.; El-Glaly, Y.N. Study of accessibility guidelines of mobile applications. In Proceedings of the 17th International Conference on Mobile and Ubiquitous Multimedia, Cairo, Egypt, 25–28 November 2018; pp. 305–315. [Google Scholar]
  66. Design Develop Now. Stop Losing Users: How Mobile App Responsive Design Saves You. 2025. Available online: https://designdevelopnow.com/blog/mobile-app-responsive-design/ (accessed on 9 August 2025).
  67. Almeida, F.; Monteiro, J. The Role of Responsive Design in Web Development. Webology 2017, 14, 48–65. [Google Scholar]
  68. Martin, K.; Shilton, K. Putting mobile application privacy in context: An empirical study of user privacy expectations for mobile devices. Inf. Soc. 2016, 32, 200–216. [Google Scholar] [CrossRef]
  69. Alkindi, Z.R.; Sarrab, M.; Alzeidi, N. User privacy and data flow control for android apps: Systematic literature review. J. Cyber Secur. Mobil. 2021, 10, 261–304. [Google Scholar] [CrossRef]
  70. Lee, M.; Kim, G.J. On applying experience sampling method to a/b testing of mobile applications: A case study. In Proceedings of the IFIP Conference on Human-Computer Interaction, Bamberg, Germany, 14–18 September 2015; Springer: Cham, Switzerland, 2015; pp. 203–210. [Google Scholar]
  71. Samuel, T.; Pfahl, D. Problems and solutions in mobile application testing. In Proceedings of the International Conference on Product-Focused Software Process Improvement, Trondheim, Norway, 22–24 November 2016; Springer: Cham, Switzerland, 2016; pp. 249–267. [Google Scholar]
  72. Neusesser, T. A/B Testing 101. 2024. Available online: https://www.nngroup.com/articles/ab-testing/ (accessed on 13 September 2025).
  73. Glance. How Many Users Do I Need For Effective App Testing? 2024. Available online: https://thisisglance.com/learning-centre/how-many-users-do-i-need-for-effective-app-testing (accessed on 14 September 2025).
  74. Samrgandi, N. User interface design & evaluation of mobile applications. Int. J. Comput. Sci. Netw. Secur. 2021, 21, 55–63. [Google Scholar]
  75. Adinata, M.; Liem, I. A/B test tools of native mobile application. In Proceedings of the IEEE 2014 International Conference on Data and Software Engineering (ICODSE), Bandung, Indonesia, 26–27 November 2014; pp. 1–6. [Google Scholar]
  76. Krüger, J.D. Optimizing e-commerce relaunches: A rigorous economic framework employing a/b testing for enhanced conversion rates and risk mitigation. Preprint 2025. [Google Scholar] [CrossRef]
  77. King, R.; Churchill, E.F.; Tan, C. Designing with Data: Improving the User Experience with A/B Testing; O’Reilly Media, Inc.: Cambridge, MA, USA, 2017. [Google Scholar]
  78. Goswami, A.; Han, W.; Wang, Z.; Jiang, A. Controlled experiments for decision-making in e-Commerce search. In Proceedings of the 2015 IEEE International Conference on Big Data (Big Data), Santa Clara, CA, USA, 29 October–1 November 2015; pp. 1094–1102. [Google Scholar]
  79. Saleh, A.M.; Ismail, R.B. The Impact of Dynamic Prototype on Usability Testing and User Satisfaction: Fidelity and In-situ Prototyping. In Proceedings of the 3rd Global Summit on Education GSE 2015, Kuala Lumpur, Malaysia, 9–10 March 2015. [Google Scholar]
  80. Kim, S.Y.; Lee, Y. Using High Fidelity Interactive Prototypes for Effective Communication to Create an Enterprise Mobile Application. In Proceedings of the International Conference on Applied Human Factors and Ergonomics, Virtual, 16–20 July 2020; Springer: Cham, Switzerland, 2020; pp. 173–178. [Google Scholar]
  81. Nielsen, J. 10 Usability Heuristics for User Interface Design. 1994. Available online: https://www.nngroup.com/articles/ten-usability-heuristics/ (accessed on 8 August 2025).
  82. Global App Testing. The Ultimate Guide to Smoke Testing. 2024. Available online: https://www.globalapptesting.com/blog/the-ultimate-guide-to-smoke-testing (accessed on 20 September 2025).
  83. Herbold, S.; Haar, T. Smoke testing for machine learning: Simple tests to discover severe bugs. Empir. Softw. Eng. 2022, 27, 45. [Google Scholar] [CrossRef]
  84. Reshma, S.; Mohan Kumar, H.; Manu, A. Smoke Test Execution in Software Application Testing. In Proceedings of the 4th International Conference on Emerging Research in Electronics, Computer Science and Technology, ICERECT, Mandya, India, 26–27 December 2022; pp. 26–27. [Google Scholar]
  85. Pérez-Fernández, L.; Sebastián, M.A.; González-Gaya, C. Methodology to optimize quality costs in manufacturing based on multi-criteria analysis and lean strategies. Appl. Sci. 2022, 12, 3295. [Google Scholar] [CrossRef]
  86. Six Sigma Development Solution. Impact Effort Matrix: Prioritize Projects with Our Simple Guide. 2025. Available online: https://sixsigmadsi.com/what-is-an-impact-effort-matrix-how-does-it-work/ (accessed on 16 September 2025).
  87. Gök, S. Design the Best UX with These 7 Guidelines. 2022. Available online: https://www.iienstitu.com/en/blog/design-the-best-ux-with-these-7-guidelines (accessed on 9 July 2025).
  88. Teka, D.; Dittrich, Y.; Kifle, M.; Ardito, C.; Lanzilotti, R. User involvement and usability evaluation in Ethiopian software organizations. Electron. J. Inf. Syst. Dev. Ctries. 2017, 83, 1–19. [Google Scholar] [CrossRef]
  89. Nilsson, E.G. Design patterns for user interface for mobile applications. Adv. Eng. Softw. 2009, 40, 1318–1328. [Google Scholar] [CrossRef]
  90. Dutson, P. Responsive Mobile Design: Designing for Every Device; Addison-Wesley Professional: Boston, MA, USA, 2014. [Google Scholar]
  91. Patel, J.; Gershoni, G.; Krishnan, S.; Nelimarkka, M.; Nonnecke, B.; Goldberg, K. A case study in mobile-optimized vs. responsive web application design. In Proceedings of the 17th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct, Copenhagen, Denmark, 24–27 August 2015; pp. 576–581. [Google Scholar]
  92. Parlakkiliç, A. Evaluating the effects of responsive design on the usability of academic websites in the pandemic. Educ. Inf. Technol. 2022, 27, 1307–1322. [Google Scholar] [CrossRef] [PubMed]
  93. de Paula, D.F.; Menezes, B.H.; Araújo, C.C. Building a quality mobile application: A user-centered study focusing on design thinking, user experience and usability. In Proceedings of the International Conference of Design, User Experience, and Usability, Heraklion, Greece, 22–27 June 2014; Springer: Cham, Switzerland, 2014; pp. 313–322. [Google Scholar]
  94. Sedlmayr, B.; Schöffler, J.; Prokosch, H.U.; Sedlmayr, M. User-centered design of a mobile medication management. Inform. Health Soc. Care 2019, 44, 152–163. [Google Scholar] [CrossRef] [PubMed]
  95. Quezada, P.; Cueva, R.; Paz, F. A systematic review of user-centered design techniques applied to the design of mobile application user interfaces. In Proceedings of the International Conference on Human-Computer Interaction, Virtual, 24–29 July 2021; Springer: Cham, Switzerland, 2021; pp. 100–114. [Google Scholar]
  96. Jurgensen, C. Motivation in lifestyle changes: Using mock-ups as a tool for exploring the design space. In Proceedings of the IEEE 2011 5th International Conference on Pervasive Computing Technologies for Healthcare (PervasiveHealth) and Workshops, Dublin, Ireland, 23–26 May 2011; pp. 572–576. [Google Scholar]
  97. Jouppila, T.; Tiainen, T. Nurses’ Participation in the design of an intensive care unit: The use of virtual mock-ups. HERD Health Environ. Res. Des. J. 2021, 14, 301–312. [Google Scholar] [CrossRef]
  98. Duan, P.; Warner, J.; Li, Y.; Hartmann, B. Generating automatic feedback on ui mockups with large language models. In Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 11–16 May 2024; pp. 1–20. [Google Scholar]
  99. Kryszajtys, D.T.; Rudzinski, K.; Chan Carusone, S.; Guta, A.; King, K.; Strike, C. Do mock-ups, presentations of evidence, and Q&As help participants voice their opinions during focus groups and interviews about supervised injection services? Int. J. Qual. Methods 2021, 20, 16094069211033439. [Google Scholar]
  100. Huq, S.F.; Tafreshipour, M.; Kalcevich, K.; Malek, S. Automated Generation of Accessibility Test Reports from Recorded User Transcripts. In Proceedings of the 2025 IEEE/ACM 47th International Conference on Software Engineering (ICSE), Ottawa, ON, Canada, 27 April–3 May 2025; IEEE Computer Society: Los Alamitos, CA, USA, 2025; pp. 534–546. [Google Scholar]
  101. World Health Organization. Disability. 2023. Available online: https://www.who.int/news-room/fact-sheets/detail/disability-and-health (accessed on 17 September 2025).
  102. Assal, H.; Hurtado, S.; Imran, A.; Chiasson, S. What’s the deal with privacy apps? A comprehensive exploration of user perception and usability. In Proceedings of the 14th International Conference on Mobile and Ubiquitous Multimedia, Linz, Austria, 30 November–2 December 2015; pp. 25–36. [Google Scholar]
  103. Amaral Cejas, O.; Abualhaija, S.; Sannier, N.; Ceci, M.; Bianculli, D. GDPR Compliance in Privacy Policies of Mobile Apps: An Overview of the State-of-Practice. In Proceedings of the 33rd IEEE International Requirements Engineering Conference, Valencia, Spain, 1–5 September 2025. [Google Scholar]
  104. Asuquo, P.; Cruickshank, H.; Morley, J.; Ogah, C.P.A.; Lei, A.; Hathal, W.; Bao, S.; Sun, Z. Security and privacy in location-based services for vehicular and mobile communications: An overview, challenges, and countermeasures. IEEE Internet Things J. 2018, 5, 4778–4802. [Google Scholar] [CrossRef]
  105. Aïmeur, E.; Lawani, O.; Dalkir, K. When changing the look of privacy policies affects user trust: An experimental study. Comput. Hum. Behav. 2016, 58, 368–379. [Google Scholar] [CrossRef]
  106. Svanæs, D.; Alsos, O.A.; Dahl, Y. Usability testing of mobile ICT for clinical settings: Methodological and practical challenges. Int. J. Med. Inform. 2010, 79, e24–e34. [Google Scholar] [CrossRef]
  107. Hassanaly, P.; Dufour, J.C. Analysis of the regulatory, legal, and medical conditions for the prescription of mobile health applications in the United States, the European Union, and France. Med. Devices Evid. Res. 2021, 14, 389–409. [Google Scholar] [CrossRef] [PubMed]
  108. Schleser, M. Mobile Storytelling: Changes, Challenges and Chances. In Mobile Storytelling in an Age of Smartphones; Springer: Cham, Switzerland, 2022; pp. 9–24. [Google Scholar]
  109. Zhu, D.; Al Mahmud, A.; Liu, W.; Wang, D. Digital storytelling for people with cognitive impairment using available mobile apps: Systematic search in app stores and content analysis. JMIR Aging 2024, 7, e64525. [Google Scholar] [CrossRef] [PubMed]
  110. Yoo, H.K.; Kim, E.J.; Yoon, Y.S.; Lee, H.R.; Kwon, S.W. Factors of Storytelling that Effect on Visitors’ Satisfaction and Loyalty. In Proceedings of the 5th Advances in Hospitality & Tourism Marketing and Management (AHTMM) Conference, Beppu, Japan, 18–21 June 2015; pp. 543–548. [Google Scholar]
  111. Ojonugwa, B.M.; Ogunwale, B.; Adanigbo, O.S.; Ochefu, A. Media Production in Fintech: Leveraging Visual Storytelling to Enhance Consumer Trust and Engagement. J. Front. Multidiscip. Res. 2022, 3, 425–431. [Google Scholar] [CrossRef]
  112. Salvietti, G. Fostering loyalty through engagement and gamification. In Loyalty Management; Routledge: Oxfordshire, UK, 2025; pp. 163–181. [Google Scholar]
  113. Thefinchdesignagency. The Role of Storytelling in UX Design: How to Create a Narrative. 2024. Available online: https://medium.com/@thefinchdesignagency/the-role-of-storytelling-in-ux-design-how-to-create-a-narrative-1c5809889a34 (accessed on 17 September 2025).
  114. Redlarski, K.; Weichbroth, P. Hard lessons learned: Delivering usability in IT projects. In Proceedings of the IEEE 2016 Federated Conference on Computer Science and Information Systems (FedCSIS), Gdansk, Poland, 11–14 September 2016; pp. 1379–1382. [Google Scholar]
  115. Balagtas-Fernandez, F.; Hussmann, H. A methodology and framework to simplify usability analysis of mobile applications. In Proceedings of the 2009 IEEE/ACM International Conference on Automated Software Engineering, Auckland, New Zealand, 16–20 November 2009; pp. 520–524. [Google Scholar]
  116. Zou, T.; Ertug, G. Unwitting participants at our expense: A/B testing and digital exploitation. Bus. Soc. 2024, 63, 1513–1517. [Google Scholar] [CrossRef]
  117. Lettner, F.; Holzmann, C.; Hutflesz, P. Enabling A/B testing of native mobile applications by remote user interface exchange. In Proceedings of the International Conference on Computer Aided Systems Theory, Las Palmas de Gran Canaria, Spain, 10–15 February 2013; Springer: Berlin/Heidelberg, Germany, 2013; pp. 458–466. [Google Scholar]
  118. Zarzosa, J. Adopting a design-thinking multidisciplinary learning approach: Integrating mobile applications into a marketing research course. Mark. Educ. Rev. 2018, 28, 120–125. [Google Scholar] [CrossRef]
Figure 1. The ten most prevalent categories of mobile usability issues. Source: [40].
Figure 2. The MUG-10 framework for preventing usability issues in mobile application development.
Figure 3. Impact-effort matrix of the MUG-10 framework guidelines for preventing usability issues in mobile application development.
Table 1. The demographic profiles of the experts and the length of their responses.
Sex   | Age | Education | Current Occupation     | Experience (years) | Projects | Words
Man   | 39  | Higher    | Front-End Developer    | 16 | 4  | 322
Man   | 26  | Higher    | Product Designer       | 5  | 2  | 200
Man   | 42  | Higher    | UX Designer            | 17 | 25 | 172
Woman | 30  | Higher    | UX Designer            | 8  | 9  | 163
Woman | 26  | Higher    | UX Researcher          | 4  | 3  | 160
Man   | 25  | Higher    | Senior Data Engineer   | 6  | 3  | 107
Man   | 29  | Higher    | IT Team Leader         | 3  | 2  | 103
Woman | 46  | Higher    | Mobile Front-End Dev.  | 20 | 15 | 97
Woman | 33  | Higher    | UX Designer            | 3  | 3  | 92
Man   | 26  | Secondary | Front-End Developer    | 4  | 3  | 85
Man   | 37  | Higher    | IT Consultant          | 20 | 5  | 81
Woman | 41  | Higher    | UX Writer              | 20 | 3  | 73
Woman | 45  | Higher    | UX Designer            | 20 | 19 | 67
Man   | 22  | Higher    | Mobile Software Dev.   | 2  | 2  | 67
Woman | 42  | Higher    | UX Writer              | 16 | 2  | 62
Woman | 26  | Higher    | UX Designer            | 3  | 5  | 50
Man   | 31  | Secondary | UX Designer            | 7  | 3  | 48
Woman | 56  | Higher    | Software Developer     | 20 | 15 | 45
Woman | 41  | Higher    | Technical Writer       | 16 | 2  | 15
Man   | 42  | Higher    | CPO                    | 10 | 2  | 12
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

