Cumulative Impact of Testing Factors in Usability Tests for Human-Centered Web Design †
Abstract
1. Introduction
2. Literature Review: Contextual Fidelity Factors and the Lack of Complexity in Today’s Research on Them
- most authors focus on only one factor and its impact upon user experience, so the cumulative impact of several factors slips away from the researchers’ attention;
- authors rarely use social theories beyond pure perception and/or cognitive studies to assess the contextual fidelity factors, even though user experience has social aspects and there are theories that may help better understand user states during task performance;
- the proxies that scholars define for researching contextual fidelity factors are usually very narrow. For example, web design quality is assessed by comparing two types of menus, without other aspects being taken into account.
- instead of focusing on one factor, we assess the cumulative effects of them all;
- we see these factors, as well as user states, as multi-faceted and complex;
- this is why we apply theories and methods that allow for a significantly more complex assessment of both the contextual fidelity factors and user states.
2.1. User Traits in Usability Studies
2.2. Product Features: Complex Assessment of Web Design Quality
- On the macro-level: overall type of layout; layout module structure; vertical spacing; page zonation; creolization of the layout;
- On the micro-level:
  - syntagma: line length, line length in the title block, leading (interlinear spacing);
  - typography: contour contrast, tone and color contrast with the background, font adaptivity, x-height, combination of font and line length (a schematic encoding of these 13 parameters is sketched after this list).
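As a minimal sketch only, here is one way the 13-parameter checklist above could be encoded for scoring a page layout. The parameter names follow the macro-/micro-level grouping in this section, but the unweighted pass/fail aggregation is an illustrative assumption, not the published U-index formula.

```python
# Hypothetical encoding of a 13-parameter design-quality checklist.
# The aggregation below (share of satisfied parameters) is an assumption
# for illustration; it is not the authors' published U-index computation.

MACRO_PARAMS = [
    "layout_type", "module_structure", "vertical_spacing",
    "page_zonation", "layout_creolization",
]
MICRO_SYNTAGMA_PARAMS = ["line_length", "title_line_length", "leading"]
MICRO_TYPOGRAPHY_PARAMS = [
    "contour_contrast", "background_contrast", "font_adaptivity",
    "x_height", "font_line_length_combination",
]
ALL_PARAMS = MACRO_PARAMS + MICRO_SYNTAGMA_PARAMS + MICRO_TYPOGRAPHY_PARAMS  # 13 parameters


def design_quality_score(assessment: dict[str, bool]) -> float:
    """Return the share of satisfied parameters (0.0-1.0) for one web page."""
    return sum(assessment[p] for p in ALL_PARAMS) / len(ALL_PARAMS)


if __name__ == "__main__":
    # Example: a page that passes all macro-level checks but fails two micro-level ones.
    page = {p: True for p in ALL_PARAMS}
    page["leading"] = False
    page["x_height"] = False
    print(f"checklist score: {design_quality_score(page):.2f}")  # 0.85
```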
2.3. The Testing Environment: Individual vs. Group Task Performance
3. Literature Review: Towards the Assessment of Cumulative Impact of Contextual Fidelity Factors and Complexity of User States in Usability Testing
3.1. User Traits vs. Task Features/Product Features, and Their Crossroads in Aesthetic Perception
3.2. The Research on Cumulative Nature of User Experience
3.3. User Functionality as the Target Category in User Experience Studies: (Dys)functional User States
4. The Research Hypotheses
5. Methods and Assessor Groups
5.1. The Research Design
- Contextual fidelity factors (independent variables):
  (1) product features → web aesthetics quality → U-index (13 parameters);
  (2) task complexity → monotony-inducing and anxiety-inducing tasks elaborated specifically for the experiment;
  (3) user traits → cultural belonging of the respondents, as cultural belonging implies many individual factors of visual perception, from traditions of color perception to left–right vs. up–down reading;
  (4) experiment conditions → group vs. individual performance, as this division, according to previous research, may affect the results to the greatest extent (the resulting 2 × 2 × 2 × 2 structure is sketched after this list).
- User states (dependent variables): the states of monotony and anxiety as complex dysfunctional states, rather than particular eye reactions or time of performance. They are measured by both rational and emotional parameters, that is, by intellectual lability and emotional stress (with four scales for measuring individual emotions).
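Since each of the four factors is dichotomized, they combine into 16 experimental cells, which matches the sub-group allocation table reproduced at the end of this document. A minimal enumeration sketch follows; the condition labels are illustrative, and the printed order does not reproduce the paper’s own sub-group numbering.

```python
from itertools import product

# Illustrative enumeration of the 2x2x2x2 between-subjects structure implied
# by the four dichotomized contextual fidelity factors (labels are assumptions).
designs = ["aesthetic", "non-aesthetic"]            # product features (high vs. low U-index)
tasks = ["monotony-inducing", "anxiety-inducing"]   # task complexity
cultures = ["East", "West"]                         # user traits (cultural belonging)
settings = ["individual", "group"]                  # experiment conditions

cells = list(product(cultures, settings, tasks, designs))
for i, cell in enumerate(cells, start=1):
    # NOTE: this index is just an enumeration order, not the study's sub-group IDs.
    print(f"cell {i:2d}: " + " / ".join(cell))

print(f"total experimental cells: {len(cells)}")  # 16
```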
5.2. The Assessor Groups
5.3. Elaboration of Test Tasks
5.4. Measurement and Scales
- anxiety (the ‘calmness/anxiety’ scale);
- fatigue (the ‘energy/fatigue’ scale);
- arousal (the ‘excitement/depression’ scale);
- confidence (the ‘self-confidence/helplessness’ scale) (a sketch of recording these scales and computing the pre-/post-task deltas follows this list).
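As a minimal sketch, assuming data structures of our own choosing (the actual questionnaire wording and scoring sheets are not reproduced here), the four bipolar emotional-stress scales, scored from 10 down to 1, can be recorded before and after a task and reduced to the per-assessor deltas reported in the result tables below.

```python
from dataclasses import dataclass

# The four bipolar scales named in this section (10 = positive pole, 1 = negative pole).
SCALES = (
    "calmness/anxiety",
    "energy/fatigue",
    "excitement/depression",
    "self-confidence/helplessness",
)


@dataclass
class EmotionalStressMeasurement:
    """One assessor's scores on the four emotional-stress scales."""
    scores: dict[str, float]


def deltas(before: EmotionalStressMeasurement,
           after: EmotionalStressMeasurement) -> dict[str, float]:
    """Post-task minus pre-task score per scale; negative values mean deterioration."""
    return {s: after.scores[s] - before.scores[s] for s in SCALES}


if __name__ == "__main__":
    # Illustrative values only, not the study's data.
    pre = EmotionalStressMeasurement({s: 7.0 for s in SCALES})
    post = EmotionalStressMeasurement({
        "calmness/anxiety": 6.0,
        "energy/fatigue": 5.5,
        "excitement/depression": 7.0,
        "self-confidence/helplessness": 7.0,
    })
    print(deltas(pre, post))
```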
5.5. Data Analysis
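As the result tables below indicate, the analysis compares pre-/post-task deltas between independent sub-groups with the Mann–Whitney U test. A minimal sketch of such a comparison, using SciPy and illustrative numbers rather than the study’s raw data, is given here; the variable names and sample values are assumptions.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Illustrative pre/post intellectual-lability scores for two independent
# sub-groups (e.g., aesthetic vs. non-aesthetic design); not the study's data.
aesthetic_pre = np.array([187.1, 186.8, 187.3, 186.9, 187.0])
aesthetic_post = np.array([183.0, 182.5, 183.4, 182.7, 183.1])
non_aesthetic_pre = np.array([186.9, 186.5, 187.1, 186.6, 186.8])
non_aesthetic_post = np.array([179.2, 178.4, 179.0, 178.6, 179.3])

# The comparison is run on per-assessor deltas (post minus pre), so that
# individual baseline differences are factored out.
delta_aesthetic = aesthetic_post - aesthetic_pre
delta_non_aesthetic = non_aesthetic_post - non_aesthetic_pre

stat, p_value = mannwhitneyu(delta_aesthetic, delta_non_aesthetic,
                             alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.3f}")
```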
6. Results
6.1. Testing the Cultural Differences
| User State | Metric | Aesthetic Design | Non-Aesthetic Design | Individual Work | Group Work |
|---|---|---|---|---|---|
| Monotony | Intellectual lability | 0.739 | 0.315 | 0.218 | 0.912 |
| Monotony | Emotional stress: calmness/anxiety | 0.353 | 0.796 | 0.280 | 0.971 |
| Monotony | Emotional stress: energy/fatigue | 0.481 | 0.684 | 0.912 | 0.912 |
| Monotony | Emotional stress: excitement/depression | 0.052 † | 0.218 | 0.247 | 0.055 † |
| Monotony | Emotional stress: self-confidence/helplessness | 0.853 | 1.000 | 0.280 | 0.315 |
| Anxiety | Intellectual lability | 0.853 | 0.579 | 0.912 | 0.853 |
| Anxiety | Emotional stress: calmness/anxiety | 0.353 | 0.280 | 0.739 | 1.000 |
| Anxiety | Emotional stress: energy/fatigue | 0.218 | 0.436 | 0.739 | 0.912 |
| Anxiety | Emotional stress: excitement/depression | 0.052 † | 0.529 | 0.529 | 0.529 |
| Anxiety | Emotional stress: self-confidence/helplessness | 0.796 | 0.315 | 0.123 | 0.796 |
6.2. The Results of the Experiment and Their Interpretation
Individual test:

| User State | Metric | Aesthetic: Before Task | Aesthetic: After Task | Aesthetic: Δ | Non-Aesthetic: Before Task | Non-Aesthetic: After Task | Non-Aesthetic: Δ | Mann–Whitney p-value (Δ) |
|---|---|---|---|---|---|---|---|---|
| Monotony | Intellectual lability (Amean) | 187.05 (1.057) | 182.93 (1.128) | −4.12 (1.504) | 186.74 (1.266) | 178.91 (2.166) | −7.83 (3.019) | 0.004 |
| Monotony | Emotional stress (10 to 1): calmness/anxiety | 7 (0.471) | 7 (0.480) | 0 (0.667) | 7 (1.490) | 6.9 (1.449) | −0.1 (1.853) | 0.912 |
| Monotony | Emotional stress (10 to 1): energy/fatigue | 6.9 (1.197) | 5.5 (0.527) | −1.4 (1.505) | 6.9 (1.449) | 5.5 (1.649) | −1.4 (2.547) | 1.000 |
| Monotony | Emotional stress (10 to 1): excitement/depression | 6.9 (0.567) | 6.9 (0.316) | 0 (0.817) | 7.1 (1.728) | 7 (1.247) | −0.1 (0.876) | 0.684 |
| Monotony | Emotional stress (10 to 1): self-conf./helplessness | 6.85 (1.475) | 7.1 (1.100) | 0.3 (1.767) | 7.1 (1.370) | 7 (0.816) | −0.1 (0.876) | 0.353 |
| Anxiety | Intellectual lability (Amean) | 187.15 (1.285) | 171.13 (3.067) | −16.02 (2.791) | 186.94 (0.751) | 187.04 (4.247) | 0.1 (4.423) | 0.000 |
| Anxiety | Emotional stress (10 to 1): calmness/anxiety | 7.1 (0.875) | 6.1 (0.800) | −1 (0.471) | 6.8 (1.135) | 5.7 (0.823) | −1.1 (0.568) | 0.000 |
| Anxiety | Emotional stress (10 to 1): energy/fatigue | 6 (0.670) | 5.9 (0.737) | −0.1 (0.568) | 6.1 (0.567) | 5.9 (0.737) | −0.2 (0.632) | 0.739 |
| Anxiety | Emotional stress (10 to 1): excitement/depression | 6.9 (0.567) | 7 (0.471) | 0.1 (0.567) | 7.1 (0.994) | 7.2 (1.135) | 0.1 (0.568) | 1.000 |
| Anxiety | Emotional stress (10 to 1): self-conf./helplessness | 7.1 (1.370) | 7.3 (1.251) | 0.2 (0.789) | 7 (1.247) | 6.9 (0.875) | −0.1 (0.994) | 0.436 |
Group test:

| User State | Metric | Aesthetic: Before Task | Aesthetic: After Task | Aesthetic: Δ | Non-Aesthetic: Before Task | Non-Aesthetic: After Task | Non-Aesthetic: Δ | Mann–Whitney p-value (Δ) |
|---|---|---|---|---|---|---|---|---|
| Monotony | Intellectual lability (Amean) | 186.9 (3.478) | 185.14 (8.230) | −1.76 (6.401) | 186.92 (3.030) | 175.94 (3.815) | −10.98 (5.622) | 0.003 |
| Monotony | Emotional stress (10 to 1): calmness/anxiety | 6.9 (1.286) | 6.8 (1.135) | −0.1 (0.316) | 7.1 (0.875) | 6.9 (1.286) | −0.2 (0.919) | 0.436 |
| Monotony | Emotional stress (10 to 1): energy/fatigue | 5.9 (0.737) | 5.8 (0.788) | −0.1 (0.568) | 6.9 (1.197) | 5.3 (1.059) | −1.6 (1.579) | 0.015 |
| Monotony | Emotional stress (10 to 1): excitement/depression | 7.2 (0.632) | 6.9 (0.567) | −0.3 (0.483) | 7.1 (1.286) | 6.8 (0.918) | −0.3 (0.949) | 0.684 |
| Monotony | Emotional stress (10 to 1): self-conf./helplessness | 6.5 (1.080) | 6.8 (0.918) | 0.3 (1.059) | 6.7 (2.110) | 6.9 (2.020) | 0.2 (0.632) | 0.853 |
| Anxiety | Intellectual lability (Amean) | 186.96 (1.028) | 159.2 (5.860) | −27.76 (5.977) | 186.99 (0.741) | 173.94 (3.048) | −13.05 (3.497) | 0.000 |
| Anxiety | Emotional stress (10 to 1): calmness/anxiety | 6.7 (1.330) | 5.1 (1.286) | −1.6 (0.699) | 7.1 (0.875) | 5.7 (0.823) | −1.4 (0.966) | 0.529 |
| Anxiety | Emotional stress (10 to 1): energy/fatigue | 5.9 (0.737) | 5.8 (0.788) | −0.1 (0.568) | 6.1 (0.875) | 5.9 (1.449) | −0.2 (1.549) | 0.853 |
| Anxiety | Emotional stress (10 to 1): excitement/depression | 6.9 (0.567) | 7 (0.471) | 0.1 (0.568) | 6.9 (0.737) | 7 (0.816) | 0.1 (0.316) | 0.971 |
| Anxiety | Emotional stress (10 to 1): self-conf./helplessness | 6.9 (0.500) | 7 (1.054) | 0.1 (0.738) | 6.1 (1.663) | 5.9 (1.286) | −0.2 (0.422) | 0.393 |
(1) Monotony-related effects:
- Induction of monotony (a drop in intellectual lability + growth of fatigue) in individual testing, with a slight difference in fatigue dispersion between aesthetic and non-aesthetic design;
- Induction of monotony (a drop in intellectual lability + growth of fatigue) in group performance on non-aesthetic design;
- Preservation of the user state in monotony-inducing tasks in group performance, if mediated by aesthetic design.
(2) Anxiety-related effects:
- Induction of the state of anxiety (a drop in intellectual lability + growth of anxiety) in group testing on both design types, and in individual testing mediated by efficient design;
- Preservation of intellectual lability in individual testing with non-aesthetic design.
Aesthetic design:

| User State | Metric | Individual: Before Task | Individual: After Task | Individual: Δ | Group: Before Task | Group: After Task | Group: Δ | Mann–Whitney p-value (Δ) |
|---|---|---|---|---|---|---|---|---|
| Monotony | Intellectual lability (Amean) | 187.05 (1.057) | 182.93 (1.128) | −4.12 (1.504) | 186.9 (3.478) | 185.14 (8.230) | −1.76 (6.401) | 0.218 |
| Monotony | Emotional stress (10 to 1): calmness/anxiety | 7 (0.471) | 7 (0.480) | 0 (0.667) | 6.9 (1.286) | 6.8 (1.135) | −0.1 (0.316) | 0.796 |
| Monotony | Emotional stress (10 to 1): energy/fatigue | 6.9 (1.197) | 5.5 (0.527) | −1.4 (1.505) | 5.9 (0.737) | 5.8 (0.788) | −0.1 (0.568) | 0.035 |
| Monotony | Emotional stress (10 to 1): excitement/depression | 6.9 (0.567) | 6.9 (0.316) | 0 (0.817) | 7.2 (0.632) | 6.9 (0.567) | −0.3 (0.483) | 0.247 |
| Monotony | Emotional stress (10 to 1): self-conf./helplessness | 6.85 (1.475) | 7.1 (1.100) | 0.3 (1.767) | 6.5 (1.080) | 6.8 (0.918) | 0.3 (1.059) | 0.796 |
| Anxiety | Intellectual lability (Amean) | 187.15 (1.285) | 171.13 (3.067) | −16.02 (2.791) | 186.96 (1.028) | 159.2 (5.860) | −27.76 (5.977) | 0.000 |
| Anxiety | Emotional stress (10 to 1): calmness/anxiety | 7.1 (0.875) | 6.1 (0.800) | −1 (0.471) | 6.7 (1.330) | 5.1 (1.286) | −1.6 (0.699) | 0.000 |
| Anxiety | Emotional stress (10 to 1): energy/fatigue | 6 (0.670) | 5.9 (0.737) | −0.1 (0.568) | 5.9 (0.737) | 5.8 (0.788) | −0.1 (0.568) | 1.000 |
| Anxiety | Emotional stress (10 to 1): excitement/depression | 6.9 (0.567) | 7 (0.471) | 0.1 (0.567) | 6.9 (0.567) | 7 (0.471) | 0.1 (0.568) | 1.000 |
| Anxiety | Emotional stress (10 to 1): self-conf./helplessness | 7.1 (1.370) | 7.3 (1.251) | 0.2 (0.789) | 6.9 (0.500) | 7 (1.054) | 0.1 (0.738) | 0.796 |
Non-aesthetic design:

| User State | Metric | Individual: Before Task | Individual: After Task | Individual: Δ | Group: Before Task | Group: After Task | Group: Δ | Mann–Whitney p-value (Δ) |
|---|---|---|---|---|---|---|---|---|
| Monotony | Intellectual lability (Amean) | 186.74 (1.266) | 178.91 (2.166) | −7.83 (3.019) | 186.92 (3.030) | 175.94 (3.815) | −10.98 (5.622) | 0.143 |
| Monotony | Emotional stress (10 to 1): calmness/anxiety | 7 (1.490) | 6.9 (1.449) | −0.1 (1.853) | 7.1 (0.875) | 6.9 (1.286) | −0.2 (0.919) | 0.971 |
| Monotony | Emotional stress (10 to 1): energy/fatigue | 6.9 (1.449) | 5.5 (1.649) | −1.4 (2.547) | 6.9 (1.197) | 5.3 (1.059) | −1.6 (1.579) | 0.912 |
| Monotony | Emotional stress (10 to 1): excitement/depression | 7.1 (1.728) | 7 (1.247) | −0.1 (0.876) | 7.1 (1.286) | 6.8 (0.918) | −0.3 (0.949) | 0.853 |
| Monotony | Emotional stress (10 to 1): self-conf./helplessness | 7.1 (1.370) | 7 (0.816) | −0.1 (0.876) | 6.7 (2.110) | 6.9 (2.020) | 0.2 (0.632) | 0.280 |
| Anxiety | Intellectual lability (Amean) | 186.94 (0.751) | 187.04 (4.247) | 0.1 (4.423) | 186.99 (0.741) | 173.94 (3.048) | −13.05 (3.497) | 0.000 |
| Anxiety | Emotional stress (10 to 1): calmness/anxiety | 6.8 (1.135) | 5.7 (0.823) | −1.1 (0.568) | 7.1 (0.875) | 5.7 (0.823) | −1.4 (0.966) | 0.631 |
| Anxiety | Emotional stress (10 to 1): energy/fatigue | 6.1 (0.567) | 5.9 (0.737) | −0.2 (0.632) | 6.1 (0.875) | 5.9 (1.449) | −0.2 (1.549) | 0.971 |
| Anxiety | Emotional stress (10 to 1): excitement/depression | 7.1 (0.994) | 7.2 (1.135) | 0.1 (0.568) | 6.9 (0.737) | 7 (0.816) | 0.1 (0.316) | 0.971 |
| Anxiety | Emotional stress (10 to 1): self-conf./helplessness | 7 (1.247) | 6.9 (0.875) | −0.1 (0.994) | 6.1 (1.663) | 5.9 (1.286) | −0.2 (0.422) | 0.853 |
7. Conclusions
7.1. Further Discussion of the Results
7.2. The Limitations of the Study
- Our research design implies that the smallest-level sub-group has only 5 assessors. This is not many, although the overall number of assessors is three to five times larger than in most DUXU studies on individual contextual fidelity factors. We have tried to overcome this limitation by enlarging the sub-groups for the first and second stages of data analysis: in this research design, the quality dichotomies for each contextual fidelity factor are actually tested on 80 participants.
- Such a research design implies a general limitation: metrics of validity and significance, including standard deviations and confidence intervals, may play smaller roles than in larger-sample studies. This is well recognized as a basic limitation of quantitative DUXU research; this is why we speak of effects but also call them trends or tendencies. Despite that, the results of studies that do not employ validity and/or significance testing are also recognized as important for user experience studies.
- The theory of dysfunctional states is new to English-language DUXU studies, while in the psychology of work there are important empirical studies based on it. Its unrelatedness to user experience studies may be seen as a limitation of our study, but we would insist on introducing it into a research area that highly depends on the rise of dysfunctionality and has, in fact, long tried to detect dysfunctional states without naming them or seeing them as complex user states.
- Two more limitations are linked to the scaling method via which the emotional stress indicators were measured. First, the method implies associating one’s state with a rather small number of fixed phrases, rather than more flexible measurement scales such as percentages or open questions that would allow self-interpretation of user states. Having a set of ranked phrases may shift choices towards either the middle of the scale or towards psychologically more acceptable options. Our results suggest that choices of the middle option were rare, but the choice of the value slightly above the center was frequent, which deserves a separate methodological enquiry, despite the wide use of such question-based Likert scales. Second, the questions could be understood differently in the two cultural groups. We have tried to overcome this difference by providing the questionnaires in English to both groups, who were non-native English speakers, but this does not fully guarantee equal understanding.
- While working on the data, we discussed in the working group that, despite the instructions given to the assessors, their initial functional state, including mood and mental readiness for task completion, is not fully captured by the pre-task tests and remains elusive but potentially influential. In other words, factors beyond the immediate setting of the experiment may affect user performance in ways not captured by the current testing instruments. This is why we have focused on the deltas, which show the shifts in performance quality regardless of the assessors’ underlying better or worse moods and their individual cognitive differences.
- In assessing web aesthetics, we have followed the logic of maximum divergence of the attribute and have thus focused on web pages with either a high or a low U-index, while pages with middle-range U-index values have remained untested. However, given the antagonistic effects discovered and the compensatory effects of non-aesthetic design in the case of anxiety-inducing tasks, web pages of middle quality may turn out to have optimal efficiency if both types of tasks are taken into account. This deserves further study.
8. Conclusions: Questions and Prospects for Future Research
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
Appendix A. Examples of Web Pages Used for the Experiment
References
Distribution of assessor sub-groups across the experimental conditions:

| Test Format | Dysfunctional State | ‘East’: Aesthetic Design | ‘East’: Non-Aesthetic Design | ‘West’: Aesthetic Design | ‘West’: Non-Aesthetic Design |
|---|---|---|---|---|---|
| Individual test | Monotony | sub-group 1 | sub-group 5 | sub-group 9 | sub-group 13 |
| Individual test | Anxiety | sub-group 2 | sub-group 6 | sub-group 10 | sub-group 14 |
| Group test | Monotony | sub-group 3 | sub-group 7 | sub-group 11 | sub-group 15 |
| Group test | Anxiety | sub-group 4 | sub-group 8 | sub-group 12 | sub-group 16 |