Designs and Interactions for Near-Field Augmented Reality: A Scoping Review
Abstract
1. Introduction
2. Background
2.1. Augmented Reality
2.2. The Near Field and Perception
2.2.1. Perceptual Issues in the Near Field
2.2.2. The Challenge of Researching Perception
2.2.3. Design
2.2.4. Interaction
2.3. The Rationale, Research Gap, and Need for This Review
3. Methodology
3.1. Research Questions and Aim
3.2. Search Strategy
3.2.1. Definition of Keywords
2016 to present (Percept* OR visual OR vision) AND (design OR interaction) AND (“mixed reality” OR MR OR “augmented reality” OR AR) AND (“near field” OR “near-field” OR “close up” OR “close range” OR “peripersonal” OR “near range” OR “near by” OR “near-by”)
2016 to present (“Percept*” OR “visual” OR “vision”) AND (“design” OR “interaction”) AND (“mixed reality” OR “MR” OR “augmented reality” OR “AR”) AND (“near field” OR “near-field” OR “close up” OR “close range” OR “close-range” OR “peripersonal” OR “near range” OR “near by” OR “near-by”)
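The two query variants above differ mainly in quoting (the second also adds "close-range"). As an illustration of how the clause structure can be kept consistent across database syntaxes, the sketch below (not part of the review's method; helper names are hypothetical) assembles the boolean string from keyword groups:

```python
# Illustrative sketch: build the boolean search string from keyword groups
# so both database-specific variants stay in sync. Not the review's tooling.

CONCEPT_GROUPS = [
    ["Percept*", "visual", "vision"],
    ["design", "interaction"],
    ["mixed reality", "MR", "augmented reality", "AR"],
    ["near field", "near-field", "close up", "close range",
     "peripersonal", "near range", "near by", "near-by"],
]

def build_query(groups, quote_all=False):
    """Join keyword groups into (a OR b) AND (c OR d) form.

    quote_all=True wraps every term in quotes (second variant above);
    otherwise only multi-word terms are quoted (first variant).
    """
    clauses = []
    for group in groups:
        terms = []
        for term in group:
            if quote_all or " " in term:
                terms.append(f'"{term}"')
            else:
                terms.append(term)
        clauses.append("(" + " OR ".join(terms) + ")")
    return " AND ".join(clauses)

print(build_query(CONCEPT_GROUPS))
```

Keeping the groups in one place means a change to the concept terms propagates to every database's query string.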
3.2.2. Search Databases and Date Range
3.3. Evidence Screening and Filtering
3.3.1. Inclusion and Exclusion Criteria
3.3.2. Running the Query
3.4. Data Extraction
3.4.1. Stage One—Deductive
3.4.2. Stage Two—Inductive Content Analysis
3.4.3. Critical Appraisal, Limitations, and Potential Bias
3.4.4. Synthesis
4. Results
4.1. Deductive Codes
4.2. Inductive Analysis
4.2.1. Hardware Effects or Limitations on Near-Field Perception
4.2.2. Real Interactions Are Multi-Modal
4.2.3. Despite More Accurate Options, Hand Gestures Are Preferred for Near-Field Interactions
Avatarisation, Perception, and Embodiment
4.2.4. Depth Is the Main Contributor to Perceptual Inaccuracy, Particularly in the Near Field
Design Techniques Used to Alleviate Depth Estimation Errors in the Near Field
- Correct estimation of depth is possible without occlusion (although with lower confidence) if all other depth cues are preserved.
- Virtual objects that are less opaque and have a higher contrast are easier to align with physical counterparts.
- Virtual object size and brightness can affect depth estimation.
- Sharpening algorithms can aid the perception of virtual objects that are much closer than the fixed focal depth.
- Complex occluding surfaces negatively impact the perception of objects beneath, but virtual holes in the surface can help.
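The sharpening point above can be made concrete with a minimal 1D unsharp-mask sketch in pure Python. This illustrates the general principle (adding back high-frequency detail lost to blur), not the SharpView algorithm itself; the signal, radius, and amount are arbitrary:

```python
# Illustrative 1D unsharp mask (not the reviewed SharpView algorithm):
# sharpened = original + amount * (original - blurred).

def box_blur(signal, radius=1):
    """Simple box blur; the window is clamped at the signal bounds."""
    n = len(signal)
    out = []
    for i in range(n):
        lo = max(0, i - radius)
        hi = min(n, i + radius + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def unsharp_mask(signal, radius=1, amount=1.0):
    """Boost the detail that blurring removes."""
    blurred = box_blur(signal, radius)
    return [s + amount * (s - b) for s, b in zip(signal, blurred)]

edge = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]  # a step edge
sharpened = unsharp_mask(edge)
```

The characteristic overshoot on either side of the step is the added high-frequency detail that pre-compensates for subsequent optical blur.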
4.2.5. Perception Is Personal and Made Up of a Multitude of Factors
5. Discussion
5.1. Summary of Results and Contributions
- Hardware effects or limitations on near-field perception.
- Real interactions are multi-modal.
- Despite more accurate options, hand gestures are preferred for near-field interactions.
- Avatarisation, perception, and embodiment.
- Depth is the main contributor to perceptual inaccuracy, particularly in the near field.
- Design techniques used to alleviate depth estimation errors in the near field.
- Perception is personal and made up of a multitude of factors.
5.2. Research Opportunities
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Syed, T.A.; Siddiqui, M.S.; Abdullah, H.B.; Jan, S.; Namoun, A.; Alzahrani, A.; Nadeem, A.; Alkhodre, A.B. In-Depth Review of Augmented Reality: Tracking Technologies, Development Tools, AR Displays, Collaborative AR, and Security Concerns. Sensors 2023, 23, 146. [Google Scholar] [CrossRef]
- Arena, F.; Collotta, M.; Pau, G.; Termine, F. An Overview of Augmented Reality. Computers 2022, 11, 28. [Google Scholar] [CrossRef]
- Reljić, V.; Milenković, I.; Dudić, S.; Šulc, J.; Bajči, B. Augmented Reality Applications in Industry 4.0 Environment. Appl. Sci. 2021, 11, 5592. [Google Scholar] [CrossRef]
- Mendoza-Ramírez, C.E.; Tudon-Martinez, J.C.; Félix-Herrán, L.C.; Lozoya-Santos, J.d.J.; Vargas-Martínez, A. Augmented Reality: Survey. Appl. Sci. 2023, 13, 10491. [Google Scholar] [CrossRef]
- Papadopoulos, T.; Evangelidis, K.; Kaskalis, T.H.; Evangelidis, G.; Sylaiou, S. Interactions in Augmented and Mixed Reality: An Overview. Appl. Sci. 2021, 11, 8752. [Google Scholar] [CrossRef]
- Cooper, E.A. The Perceptual Science of Augmented Reality. Annu. Rev. Vis. Sci. 2023, 9, 455–478. [Google Scholar] [CrossRef] [PubMed]
- Bremers, A.W.D.; Yöntem, A.Ö.; Li, K.; Chu, D.; Meijering, V.; Janssen, C.P. Perception of perspective in augmented reality head-up displays. Int. J. Hum.-Comput. Stud. 2021, 155, 102693. [Google Scholar] [CrossRef]
- Bhowmik, A.K. Virtual and augmented reality: Human sensory-perceptual requirements and trends for immersive spatial computing experiences. J. Soc. Inf. Disp. 2024, 32, 605–646. [Google Scholar] [CrossRef]
- Wang, Y.J.; Lin, Y.H. Liquid crystal technology for vergence-accommodation conflicts in augmented reality and virtual reality systems: A review. Liq. Cryst. Rev. 2021, 9, 35–64. [Google Scholar] [CrossRef]
- Itoh, Y.; Langlotz, T.; Sutton, J.; Plopski, A. Towards Indistinguishable Augmented Reality: A Survey on Optical See-through Head-mounted Displays. ACM Comput. Surv. 2021, 54, 120:1–120:36. [Google Scholar] [CrossRef]
- Diaz, C.; Walker, M.; Szafir, D.A.; Szafir, D. Designing for Depth Perceptions in Augmented Reality. In Proceedings of the 2017 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Nantes, France, 9–13 October 2017; pp. 111–122. [Google Scholar] [CrossRef]
- Tricco, A.C.; Lillie, E.; Zarin, W.; O’Brien, K.K.; Colquhoun, H.; Levac, D.; Moher, D.; Peters, M.D.; Horsley, T.; Weeks, L.; et al. PRISMA Extension for Scoping Reviews (PRISMA-ScR): Checklist and Explanation. Ann. Intern. Med. 2018, 169, 467–473. [Google Scholar] [CrossRef]
- PRISMA. PRISMA for Scoping Reviews (PRISMA-ScR). Available online: https://www.prisma-statement.org/scoping (accessed on 9 June 2025).
- Cooper, S.; Cant, R.; Kelly, M.; Levett-Jones, T.; McKenna, L.; Seaton, P.; Bogossian, F. An Evidence-Based Checklist for Improving Scoping Review Quality. Clin. Nurs. Res. 2021, 30, 230–240. [Google Scholar] [CrossRef]
- Joanna Briggs Institute. JBI Manual for Evidence Synthesis—JBI Global Wiki. Available online: https://jbi-global-wiki.refined.site/space/MANUAL (accessed on 9 June 2025).
- Sutherland, I. The Ultimate Display. In Proceedings of the Congress of the International Federation of Information Processing (IFIP), New York, NY, USA, 24–29 May 1965. [Google Scholar]
- McCarthy, C.J.; Uppot, R.N. Advances in Virtual and Augmented Reality—Exploring the Role in Health-care Education. J. Radiol. Nurs. 2019, 38, 104–105. [Google Scholar] [CrossRef]
- Morimoto, T.; Kobayashi, T.; Hirata, H.; Otani, K.; Sugimoto, M.; Tsukamoto, M.; Yoshihara, T.; Ueno, M.; Mawatari, M. XR (Extended Reality: Virtual Reality, Augmented Reality, Mixed Reality) Technology in Spine Medicine: Status Quo and Quo Vadis. J. Clin. Med. 2022, 11, 470. [Google Scholar] [CrossRef]
- Speicher, M.; Hall, B.D.; Nebeling, M. What is Mixed Reality? In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; CHI ’19. pp. 1–15. [Google Scholar] [CrossRef]
- Vinci, C.; Brandon, K.O.; Kleinjan, M.; Brandon, T.H. The clinical potential of augmented reality. Clin. Psychol. Sci. Pract. 2020, 27, e12357. [Google Scholar] [CrossRef]
- Caudell, T.; Mizell, D. Augmented reality: An application of heads-up display technology to manual manufacturing processes. In Proceedings of the Twenty-Fifth Hawaii International Conference on System Sciences, Kauai, HI, USA, 7–10 January 1992; Volume 2, pp. 659–669. [Google Scholar] [CrossRef]
- Javornik, A. The Mainstreaming of Augmented Reality: A Brief History. Available online: https://hbr.org/2016/10/the-mainstreaming-of-augmented-reality-a-brief-history (accessed on 9 June 2025).
- Vertucci, R.; D’Onofrio, S.; Ricciardi, S.; De Nino, M. History of Augmented Reality. In Springer Handbook of Augmented Reality; Nee, A.Y.C., Ong, S.K., Eds.; Springer International Publishing: Cham, Switzerland, 2023; pp. 35–50. [Google Scholar] [CrossRef]
- Microsoft. HoloLens (1st gen) Hardware. Available online: https://learn.microsoft.com/en-us/previous-versions/mixed-reality/hololens-1/hololens1-hardware (accessed on 9 June 2025).
- Google. Google Trends: Understanding the Data—Google News Initiative. Available online: https://newsinitiative.withgoogle.com/en-gb/resources/trainings/google-trends-understanding-the-data/ (accessed on 13 June 2025).
- VR Compare. HTC Vive: Full Specification. Available online: https://vr-compare.com/headset/htcvive (accessed on 9 June 2025).
- Pokémon. Pokémon GO. Available online: https://pokemongo.com/ (accessed on 9 June 2025).
- IKEA Global. Launch of New IKEA Place App. Available online: https://www.ikea.com/global/en/newsroom/innovation/ikea-launches-ikea-place-a-new-app-that-allows-people-to-virtually-place-furniture-in-their-home-170912/ (accessed on 9 June 2025).
- BBC. Google Glass Smart Eyewear Returns. Available online: https://www.bbc.com/news/technology-40644195 (accessed on 9 June 2025).
- Dargan, S.; Bansal, S.; Kumar, M.; Mittal, A.; Kumar, K. Augmented Reality: A Comprehensive Review. Arch. Comput. Methods Eng. 2023, 30, 1057–1080. [Google Scholar] [CrossRef]
- Babu, S.V.; Huang, H.C.; Teather, R.J.; Chuang, J.H. Comparing the Fidelity of Contemporary Pointing with Controller Interactions on Performance of Personal Space Target Selection. In Proceedings of the 2022 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Singapore, 17–21 October 2022; pp. 404–413. [Google Scholar] [CrossRef]
- Babu, S.; Tsai, M.H.; Hsu, T.W.; Chuang, J.H. An Evaluation of the Efficiency of Popular Personal Space Pointing versus Controller based Spatial Selection in VR. In Proceedings of the ACM Symposium on Applied Perception 2020, Virtual, 12–13 September 2020. SAP ’20. [Google Scholar] [CrossRef]
- Cutting, J.E.; Vishton, P.M. Chapter 3—Perceiving Layout and Knowing Distances: The Integration, Relative Potency, and Contextual Use of Different Information about Depth. In Perception of Space and Motion; Epstein, W., Rogers, S., Eds.; Handbook of Perception and Cognition; Academic Press: San Diego, CA, USA, 1995; pp. 69–117. [Google Scholar] [CrossRef]
- Xu, J.; Doyle, D.; Moreu, F. State of the art of augmented reality capabilities for civil infrastructure applications. Eng. Rep. 2023, 5, e12602. [Google Scholar] [CrossRef]
- Mutis, I.; Ambekar, A. Challenges and enablers of augmented reality technology for in situ walkthrough applications. J. Inf. Technol. Constr. (ITcon) 2020, 25, 55–71. [Google Scholar] [CrossRef]
- Cutolo, F.; Fida, B.; Cattari, N.; Ferrari, V. Software Framework for Customized Augmented Reality Headsets in Medicine. IEEE Access 2020, 8, 706–720. [Google Scholar] [CrossRef]
- Yin, K.; He, Z.; Xiong, J.; Zou, J.; Li, K.; Wu, S.T. Virtual reality and augmented reality displays: Advances and future perspectives. J. Phys. Photonics 2021, 3, 022010. [Google Scholar] [CrossRef]
- Erkelens, I.M.; MacKenzie, K.J. 19-2: Vergence-Accommodation Conflicts in Augmented Reality: Impacts on Perceived Image Quality. SID Symp. Dig. Tech. Pap. 2020, 51, 265–268. [Google Scholar] [CrossRef]
- Blignaut, J.; Venter, M.; van den Heever, D.; Solms, M.; Crockart, I. Inducing Perceptual Dominance with Binocular Rivalry in a Virtual Reality Head-Mounted Display. Math. Comput. Appl. 2023, 28, 77. [Google Scholar] [CrossRef]
- Argelaguet, F.; Andujar, C. A survey of 3D object selection techniques for virtual environments. Comput. Graph. 2013, 37, 121–136. [Google Scholar] [CrossRef]
- Edwards, P.J.E.; Chand, M.; Birlo, M.; Stoyanov, D. The Challenge of Augmented Reality in Surgery. In Digital Surgery; Atallah, S., Ed.; Springer International Publishing: Cham, Switzerland, 2021; pp. 121–135. [Google Scholar] [CrossRef]
- Reichelt, S.; Häussler, R.; Fütterer, G.; Leister, N. Depth cues in human visual perception and their realization in 3D displays. In Proceedings of the Three-Dimensional Imaging, Visualization, and Display 2010 and Display Technologies and Applications for Defense, Security, and Avionics IV, Orlando, FL, USA, 5–9 April 2010; SPIE: Bellingham, WA, USA, 2010; Volume 7690, pp. 92–103. [Google Scholar] [CrossRef]
- Dror, I.E.; Schreiner, C.S. Chapter 4—Neural Networks and Perception. In Advances in Psychology; Jordan, J.S., Ed.; North-Holland: Amsterdam, The Netherlands, 1998; Volume 126, pp. 77–85. [Google Scholar] [CrossRef]
- Marcos, S.; Moreno, E.; Navarro, R. The depth-of-field of the human eye from objective and subjective measurements. Vis. Res. 1999, 39, 2039–2049. [Google Scholar] [CrossRef] [PubMed]
- Ellis, S.R.; Menges, B.M. Localization of Virtual Objects in the Near Visual Field. Hum. Factors 1998, 40, 415–431. [Google Scholar] [CrossRef] [PubMed]
- Sielhorst, T.; Bichlmeier, C.; Heining, S.M.; Navab, N. Depth Perception—A Major Issue in Medical AR: Evaluation Study by Twenty Surgeons. In Medical Image Computing and Computer-Assisted Intervention—MICCAI 2006, Proceedings of the 9th International Conference, Copenhagen, Denmark, 1–6 October 2006; Larsen, R., Nielsen, M., Sporring, J., Eds.; Springer: Berlin/Heidelberg, Germany, 2006; pp. 364–372. [Google Scholar] [CrossRef]
- Ballestin, G.; Solari, F.; Chessa, M. Perception and Action in Peripersonal Space: A Comparison Between Video and Optical See-Through Augmented Reality Devices. In Proceedings of the 2018 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), Munich, Germany, 16–20 October 2018; pp. 184–189. [Google Scholar] [CrossRef]
- Clarke, T.J.; Mayer, W.; Zucco, J.E.; Matthews, B.J.; Smith, R.T. Adapting VST AR X-Ray Vision Techniques to OST AR. In Proceedings of the 2022 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), Singapore, 17–21 October 2022; pp. 495–500. [Google Scholar] [CrossRef]
- Pham, D.M.; Stuerzlinger, W. Is the Pen Mightier than the Controller? A Comparison of Input Devices for Selection in Virtual and Augmented Reality. In Proceedings of the 25th ACM Symposium on Virtual Reality Software and Technology, Parramatta, Australia, 12–15 November 2019; VRST ’19. pp. 1–11. [Google Scholar] [CrossRef]
- Kim, H.; Kwon, Y.T.; Lim, H.R.; Kim, J.H.; Kim, Y.S.; Yeo, W.H. Recent Advances in Wearable Sensors and Integrated Functional Devices for Virtual and Augmented Reality Applications. Adv. Funct. Mater. 2021, 31, 2005692. [Google Scholar] [CrossRef]
- Kang, H.J.; Shin, J.h.; Ponto, K. A Comparative Analysis of 3D User Interaction: How to Move Virtual Objects in Mixed Reality. In Proceedings of the 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Atlanta, GA, USA, 22–26 March 2020; pp. 275–284. [Google Scholar] [CrossRef]
- Prilla, M.; Janssen, M.; Kunzendorff, T. How to Interact with Augmented Reality Head Mounted Devices in Care Work? A Study Comparing Handheld Touch (Hands-on) and Gesture (Hands-free) Interaction. AIS Trans. Hum.-Comput. Interact. 2019, 11, 157–178. [Google Scholar] [CrossRef]
- Seinfeld, S.; Feuchtner, T.; Pinzek, J.; Müller, J. Impact of Information Placement and User Representations in VR on Performance and Embodiment. IEEE Trans. Vis. Comput. Graph. 2022, 28, 1545–1556. [Google Scholar] [CrossRef] [PubMed]
- Genay, A.; Lécuyer, A.; Hachet, M. Being an Avatar “for Real”: A Survey on Virtual Embodiment in Augmented Reality. IEEE Trans. Vis. Comput. Graph. 2022, 28, 5071–5090. [Google Scholar] [CrossRef]
- Hammady, R.; Ma, M.; Strathearn, C. User experience design for mixed reality: A case study of HoloLens in museum. Int. J. Technol. Mark. 2019, 13, 354–375. [Google Scholar] [CrossRef]
- Sutton, A.; Clowes, M.; Preston, L.; Booth, A. Meeting the review family: Exploring review types and associated information retrieval requirements. Health Inf. Libr. J. 2019, 36, 202–222. [Google Scholar] [CrossRef]
- Grave, R.B.d.; Bull, C.N.; Monteiro, D.M.d.S.; Margariti, E.; McMurchy, G.; Hutchinson, J.W.; Smeddinck, J.D. Smartphone Apps for Food Purchase Choices: Scoping Review of Designs, Opportunities, and Challenges. J. Med. Internet Res. 2024, 26, e45904. [Google Scholar] [CrossRef]
- Hirzle, T.; Müller, F.; Draxler, F.; Schmitz, M.; Knierim, P.; Hornbæk, K. When XR and AI Meet—A Scoping Review on Extended Reality and Artificial Intelligence. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, Hamburg, Germany, 23–28 April 2023. CHI ’23. [Google Scholar] [CrossRef]
- Oun, A.; Hagerdorn, N.; Scheideger, C.; Cheng, X. Mobile Devices or Head-Mounted Displays: A Comparative Review and Analysis of Augmented Reality in Healthcare. IEEE Access 2024, 12, 21825–21839. [Google Scholar] [CrossRef]
- Pollock, D.; Peters, M.D.J.; Khalil, H.; McInerney, P.; Alexander, L.; Tricco, A.C.; Evans, C.; de Moraes, É.B.; Godfrey, C.M.; Pieper, D.; et al. Recommendations for the extraction, analysis, and presentation of results in scoping reviews. JBI Evid. Synth. 2023, 21, 520–532. [Google Scholar] [CrossRef]
- Kleinheksel, A.J.; Rockich-Winston, N.; Tawfik, H.; Wyatt, T.R. Demystifying Content Analysis. Am. J. Pharm. Educ. 2020, 84, 7113. [Google Scholar] [CrossRef]
- Fischer, M.; Leuze, C.; Perkins, S.; Rosenberg, J.; Daniel, B.; Martin-Gomez, A. Evaluation of Different Visualization Techniques for Perception-Based Alignment in Medical AR. In Proceedings of the 2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), Recife, Brazil, 9–13 November 2020; pp. 45–50. [Google Scholar] [CrossRef]
- Wagner, U.; Lystbæk, M.N.; Manakhov, P.; Grønbæk, J.E.S.; Pfeuffer, K.; Gellersen, H. A Fitts’ Law Study of Gaze-Hand Alignment for Selection in 3D User Interfaces. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, Hamburg, Germany, 23–28 April 2023. CHI ’23. [Google Scholar] [CrossRef]
- Fischer, M.; Rosenberg, J.; Leuze, C.; Hargreaves, B.; Daniel, B. The Impact of Occlusion on Depth Perception at Arm’s Length. IEEE Trans. Vis. Comput. Graph. 2023, 29, 4494–4502. [Google Scholar] [CrossRef]
- Oshima, K.; Moser, K.R.; Rompapas, D.C.; Swan, J.E.; Ikeda, S.; Yamamoto, G.; Taketomi, T.; Sandor, C.; Kato, H. SharpView: Improved clarity of defocused content on optical see-through head-mounted displays. In Proceedings of the 2016 IEEE Symposium on 3D User Interfaces (3DUI), Greenville, SC, USA, 19–20 March 2016; pp. 173–181. [Google Scholar] [CrossRef]
- Rosa, N.; Hürst, W.; Werkhoven, P.; Veltkamp, R. Visuotactile integration for depth perception in augmented reality. In Proceedings of the 18th ACM International Conference on Multimodal Interaction, Tokyo, Japan, 12–16 November 2016; ICMI ’16. pp. 45–52. [Google Scholar] [CrossRef]
- Venkatakrishnan, R.; Venkatakrishnan, R.; Canales, R.; Raveendranath, B.; Pagano, C.C.; Robb, A.C.; Lin, W.C.; Babu, S.V. Investigating the Effects of Avatarization and Interaction Techniques on Near-field Mixed Reality Interactions with Physical Components. IEEE Trans. Vis. Comput. Graph. 2024, 30, 2756–2766. [Google Scholar] [CrossRef]
- Venkatakrishnan, R.; Venkatakrishnan, R.; Raveendranath, B.; Pagano, C.C.; Robb, A.C.; Lin, W.C.; Babu, S.V. Give Me a Hand: Improving the Effectiveness of Near-field Augmented Reality Interactions By Avatarizing Users’ End Effectors. IEEE Trans. Vis. Comput. Graph. 2023, 29, 2412–2422. [Google Scholar] [CrossRef] [PubMed]
- Weast, R.A.T.; Proffitt, D.R. Can I reach that? Blind reaching as an accurate measure of estimated reachable distance. Conscious. Cogn. 2018, 64, 121–134. [Google Scholar] [CrossRef]
- Hart, S.G. Nasa-Task Load Index (NASA-TLX); 20 Years Later. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2006, 50, 904–908. [Google Scholar] [CrossRef]
- Quek, F.; McNeill, D.; Bryll, R.; Duncan, S.; Ma, X.F.; Kirbas, C.; McCullough, K.E.; Ansari, R. Multimodal human discourse: Gesture and speech. ACM Trans. Comput.-Hum. Interact. 2002, 9, 171–193. [Google Scholar] [CrossRef]
- Katz, M. Convergence Demands by Spectacle Magnifiers. Optom. Vis. Sci. 1996, 73, 540. [Google Scholar] [CrossRef]
- Microsoft. Comfort—Mixed Reality. Available online: https://learn.microsoft.com/en-us/windows/mixed-reality/design/comfort (accessed on 13 June 2025).
- Feld, N.; Pointecker, F.; Anthes, C.; Zielasko, D. Perceptual Issues in Mixed Reality: A Developer-oriented Perspective on Video See-Through Head-Mounted Displays. In Proceedings of the 2024 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), Bellevue, WA, USA, 21–25 October 2024; pp. 170–175. [Google Scholar] [CrossRef]
| | Main Concept | Alternative Keywords |
|---|---|---|
| Participants | Anyone | x |
| Concept | Perceptual Challenges; Design Techniques; Interaction Techniques | Visual challenges/issues; Perceptual accuracy; Perceptual efficiency; Design methods; Interaction methods |
| Context | Near-Field Augmented Reality | Mixed reality; Augmented reality; AR; MR; Close up; Close range; Peripersonal; Near range; Nearby |
| ID | Description |
|---|---|
| IC1 | Requires further reading |
| IC2 | Near-field perceptual problem investigated in a design or interaction context |
| IC3 | Interaction technique(s) applied to a perceptual problem in the near field |
| IC4 | Design technique(s) applied to a perceptual problem in the near field |
| EC1 | Mobile AR (including headset with phone in) |
| EC2 | Virtual reality |
| EC3 | Hardware suggestion, e.g., new lens tech |
| EC4 | Not in main proceedings |
| EC5 | Not in English |
| EC6 | Missing relevance to AR |
| EC7 | Not human perception |
| EC8 | Missing relevance to design/interaction technique(s) |
| EC9 | Not published in 2016–2025 |
| EC10 | False positive, i.e., keyword used in a different context |
| EC11 | Missing relevance to the near field |
| EC12 | Review paper |
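As an illustration of how the exclusion criteria act as a screening filter, the sketch below applies a machine-checkable subset (EC2, EC5, EC9, EC12) to hypothetical records. The record field names are invented for the example; the review's actual screening was performed manually by the authors:

```python
# Illustrative screening filter (the review's screening was manual);
# record field names here are hypothetical.

def passes_screening(record):
    """Apply a subset of the exclusion criteria (EC2, EC5, EC9, EC12)."""
    if record.get("is_vr_only"):                   # EC2: virtual reality
        return False
    if record.get("language") != "en":             # EC5: not in English
        return False
    if not 2016 <= record.get("year", 0) <= 2025:  # EC9: outside date range
        return False
    if record.get("is_review"):                    # EC12: review paper
        return False
    return True

candidates = [
    {"title": "Near-field OST AR study", "language": "en", "year": 2021},
    {"title": "A VR-only study", "language": "en", "year": 2020,
     "is_vr_only": True},
    {"title": "Older AR work", "language": "en", "year": 2010},
]
included = [r for r in candidates if passes_screening(r)]
```

The remaining criteria (e.g., EC6 "Missing relevance to AR") require human judgement and cannot be reduced to record metadata, which is why screening of this kind is only a pre-filter.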
| Code | Description | Values |
|---|---|---|
| C1 | Category | Interaction technique investigated in the context of perception; Design technique investigated in the context of perception; Design technique applied to perceptual issue; Interaction technique applied to perceptual issue |
| C2 | Research Question/Objective | […] |
| C3 | Contribution | […] |
| C4 | Contribution Type | Empirical; Applications; Methodological; System/Artefact; Theoretical |
| C5 | Limitations | […] |
| C6 | Type of User Study | Yes Empirical; Yes Expert Evaluation; Yes Field study; Yes Workshop; No; Other |
| C7 | Purpose of User Study | […] |
| C8 | Metric for Evaluation | […] |
| C9 | Field Study Task Type | […] |
| C10 | Study Details (e.g., Participant Demographics, Target Users, Sample Size) | […] |
| C11 | User Task | […] |
| C12 | Type of AR | OST AR; VST AR |
| C13 | Device | […] |
| C14 | Interaction Technique | […; n/a] |
| C15 | Design Technique | […; n/a] |
| C16 | Perceptual Issue Addressed/Investigated | […] |
| C17 | Performance Improvement | […; Area Investigated] |
| C18 | Definition of Near Field | […] |
| Title | Paper Number | Metrics of Evaluation | Task Type | Device | Interaction Technique | Design Technique |
|---|---|---|---|---|---|---|
| “Evaluation of Different Visualization Techniques for Perception-Based Alignment in Medical AR” [62] | 1 | Orientation error; distance error; alignment time; user feedback | Perceptual mapping | Microsoft HoloLens 1 | – | Outline; semi-transparent; wireframe; replicate |
| “The Impact of Occlusion on Depth Perception at Arm’s Length” [64] | 2 | Alignment accuracy; confidence in choice | Depth estimation | Custom/HoloLens 2 | – | Occluding surface and illumination variables: none; gray and bright; gray with bright and hole; golf ball pattern and bright; golf ball pattern and bright with hole |
| “SharpView: Improved clarity of defocused content on optical see-through head-mounted displays” [65] | 3 | Pupil diameter; focal blur size | “Adjustment” | Epson Moverio BT-200 OST HMD | – | Sharpness/blur |
| “Visuotactile integration for depth perception in augmented reality” [66] | 4 | Final depth estimation bias (mean and standard deviation); task duration (mean) | Depth estimation | Meta DK1 | Visuotactile—haptics integrated for depth perception | – |
| “Investigating the Effects of Avatarization and Interaction Techniques on Near-field Mixed Reality Interactions with Physical Components” [67] | 5 | Accuracy; efficiency; perceived usability | Hoop on peg target task | Microsoft HoloLens 2 | Pinch-to-grab; stick-on-touch | No Augmented Avatar (No-Av); Augmented Avatar (Av); Augmented Avatar with Translational Gain (AvG) |
| “Give Me a Hand: Improving the Effectiveness of Near-field Augmented Reality Interactions By Avatarizing Users’ End Effectors” [68] | 6 | Performance; perceived real hand visibility; perceived usability | Collision avoidance-based object retrieval task | Microsoft HoloLens 2 | Avatarisation of end effectors (hands): no avatarisation; Iconic Augmented Avatar; Realistic Augmented Avatar | Virtual light intensity, high and low |
| “A Fitts’ Law Study of Gaze-Hand Alignment for Selection in 3D User Interfaces” [63] | 7 | Task completion time; throughput; effective width; error rate; hand movement; usability questionnaire | Point and select/depth estimation | Microsoft HoloLens 2 | 3D selection techniques: Gaze & Handray, Gaze & Finger vs. baseline Gaze & Pinch, Handray, HeadCrusher | – |
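Paper 7 reports standard Fitts'-law metrics (throughput, effective width). For readers unfamiliar with them, the textbook Shannon formulation can be sketched as follows; this is the general computation, not that study's exact analysis pipeline, and the example numbers are invented:

```python
import math
from statistics import stdev

# Standard Fitts'-law quantities used in selection studies
# (the textbook Shannon formulation, not any one study's pipeline).

def index_of_difficulty(distance, width):
    """Shannon formulation: ID = log2(D / W + 1), in bits."""
    return math.log2(distance / width + 1)

def throughput(distance, width, movement_time):
    """Throughput in bits per second: ID / mean movement time."""
    return index_of_difficulty(distance, width) / movement_time

def effective_width(endpoint_deviations):
    """We = 4.133 * SD of endpoint deviations from the target centre."""
    return 4.133 * stdev(endpoint_deviations)

# Invented example: 0.30 m target distance, 0.10 m wide target, 0.8 s mean time.
ID = index_of_difficulty(0.30, 0.10)   # log2(4) = 2 bits
tp = throughput(0.30, 0.10, 0.8)       # 2 / 0.8 = 2.5 bits/s
```

Substituting the effective width for the nominal width yields the effective throughput, which corrects for participants under- or over-utilising the target.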
| Paper Number | Tactile Feedback | Depth | Multi-Modal Interactions | Occlusion | Light Conditions | Avatarisation | Focus Blur | Parallax | VAC | Perception Action Coordination |
|---|---|---|---|---|---|---|---|---|---|---|
| 1 | | | | | | | | | | |
| 2 | | | | | | | | | | |
| 3 | | | | | | | | | | |
| 4 | | | | | | | | | | |
| 5 | | | | | | | | | | |
| 6 | | | | | | | | | | |
| 7 | | | | | | | | | | |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Hobbs, J.; Bull, C. Designs and Interactions for Near-Field Augmented Reality: A Scoping Review. Informatics 2025, 12, 77. https://doi.org/10.3390/informatics12030077