Development and Evaluation of a Data Glove-Based System for Assisting Puzzle Solving
Abstract
1. Introduction
2. Related Work
2.1. Data Gloves and Tactile Sensing
2.2. Operation Support Using Smart Glasses and XR
2.3. Comparison of Representative Approaches
2.4. Summary of Gaps and Design Implications
- Recognition vs. guidance: Many data-glove studies focus on gesture/object recognition, but do not couple recognition with action verification and step-by-step procedural support.
- Reliance on vision: Smart glasses and XR systems provide rich visual guidance but can be affected by occlusion and lighting, particularly when hand–object contact is difficult to observe.
- Limited closed-loop verification: Few systems explicitly verify whether a user executed the correct step using multimodal contact information (e.g., grasp state plus placement confirmation).
2.5. Positioning of This Work
3. System Description
3.1. Tactile Sensor and Data Glove

| Region | Sensor Count | Rationale (Disc Interaction) |
|---|---|---|
| Fingers (combined) | 8 | Detect fingertip/finger contact cues during grasp initiation |
| Palm (combined) | 80 | Capture pressure distribution during disc support and transport |
| Total | 88 | — |
3.2. Tower of Hanoi Task
3.3. System Architecture and Graphical User Interface
4. Methods
4.1. Dataset
4.1.1. Data Collection
4.1.2. Data Structure
4.2. Data Processing
4.3. Labeling and Quality Control
4.4. Sample Preparation for Model Training
- Frame selection: For each annotated grasping interval, frames corresponding to the stable grasp phase are selected while transient frames at the start and end of the motion are discarded.
- Normalization: Pressure values are normalized on a per-sensor basis to the range [0, 1] to reduce the impact of inter-sensor gain differences.
- Reshaping: The normalized vector is reshaped into a 2D grid consistent with the physical layout of the 88 tactile elements.
- Dataset split: Labeled samples are split into training and validation sets. The test set described in Section 4.1.1 is reserved exclusively for final evaluation.
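The normalization, reshaping, and split steps above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the 8 × 11 grid shape and the `prepare_samples` helper are assumptions (the paper states only that the 88 elements are reshaped into a 2D grid matching the physical layout).

```python
import numpy as np

def prepare_samples(frames, grid_shape=(8, 11), val_fraction=0.2, seed=0):
    """Sketch of the sample-preparation steps: per-sensor normalization,
    reshaping to the sensor grid, and a training/validation split.

    frames: (N, 88) array of raw readings from stable-grasp frames.
    """
    x = np.asarray(frames, dtype=np.float64)
    # Per-sensor min-max normalization to [0, 1] to reduce gain differences.
    lo = x.min(axis=0, keepdims=True)
    hi = x.max(axis=0, keepdims=True)
    x = (x - lo) / np.maximum(hi - lo, 1e-8)
    # Reshape each 88-element vector into the assumed 2D sensor grid.
    grids = x.reshape(-1, *grid_shape)
    # Shuffled training/validation split; the separate-day test set of
    # Section 4.1.1 is held out entirely and never enters this split.
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(grids))
    n_val = int(len(grids) * val_fraction)
    return grids[idx[n_val:]], grids[idx[:n_val]]
```

Normalizing per sensor (rather than per frame) is what addresses inter-sensor gain differences; a single global min/max would preserve them.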
5. CNN-Based Tactile Object Recognition
5.1. Input Representation
5.2. Network Architecture
- Conv Block 1: 32 filters of size 3 × 3, stride 1, padding 1; batch normalization; ReLU; 2 × 2 max pooling.
- Conv Block 2: 64 filters of size 3 × 3, stride 1, padding 1; batch normalization; ReLU; 2 × 2 max pooling.
- Conv Block 3: 128 filters of size 3 × 3, stride 1, padding 1; batch normalization; ReLU; 2 × 2 max pooling.
- Fully Connected 1: 256 units with ReLU and dropout.
- Fully Connected 2: 5 output units corresponding to the five classes, followed by a softmax layer.
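Because each 3 × 3 convolution uses stride 1 and padding 1, it preserves spatial size, so only the 2 × 2 max-pooling layers shrink the feature maps. Assuming the 88 tactile elements form an 8 × 11 input grid (the paper specifies only a 2D layout), the feature-map sizes through the three blocks can be traced with a short sketch:

```python
def conv_block_out(h, w):
    """One block from the architecture above: the 3x3 conv (stride 1,
    padding 1) preserves (h, w); 2x2 max pooling with stride 2 halves
    each dimension, flooring."""
    return h // 2, w // 2

def trace_shapes(h, w):
    """Trace spatial sizes through the three conv blocks for an
    (h, w) input grid."""
    shapes = [(h, w)]
    for _ in range(3):
        h, w = conv_block_out(h, w)
        shapes.append((h, w))
    return shapes
```

Under the assumed 8 × 11 grid this gives 8 × 11 → 4 × 5 → 2 × 2 → 1 × 1, so Block 3's 128 channels flatten to a 128-dimensional vector feeding Fully Connected 1; the exact flattened size depends on the true grid shape.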
5.3. Training Procedure
5.4. Evaluation Metrics
5.5. Baseline Models (For Comparison)
6. Experiments and Results
6.1. Participants and Experimental Design
- Baseline (no support): Participants solved the Tower of Hanoi puzzle without assistance.
- Support (proposed system): Participants solved the puzzle using the tactile-glove-based guidance system.
6.2. Task and Procedure
6.3. Recognition Performance
6.4. Task Performance: Solving Time
6.5. Task Performance: Number of Disc Movements
6.6. Subjective Workload (NASA-TLX)
6.7. Usability Evaluation (SUS)
6.8. Summary of Experimental Findings
- 93.3% recognition accuracy on the separate-day test set.
- 51.5% reduction in mean solving time (242.6 s → 117.8 s).
- 57.6% reduction in disc movements (35.4 → 15), i.e., about 20 fewer moves on average.
- 53.1% reduction in NASA-TLX (68.5 → 32.1), interpreted as a trend given n = 5.
- SUS = 75, indicating good usability.
7. Discussion
7.1. Effectiveness of Tactile Sensing for Task Guidance
7.2. Impact on Task Performance
7.3. Subjective Workload and User Perception
7.4. Potential Applications and Target Users
7.5. Limitations
- Limited participant pool and statistical power: Only five individuals participated in the evaluation. While improvements were consistent in this sample, larger and more diverse participant groups are required to quantify variability and draw stronger statistical conclusions.
- Learning and within-subject effects: Each participant completed both conditions (counterbalanced order). Although counterbalancing reduces systematic learning bias, practice effects and familiarity with the puzzle may still influence performance.
- Generalization of recognition: The independent test set was collected from a participant who was also included in the training data, which primarily evaluates day-to-day repeatability for a known user rather than cross-user generalization. Future studies should include leave-one-subject-out evaluation and testing with unseen participants.
- Similar disc classes: The middle-top and middle-bottom discs produced overlapping tactile signatures, occasionally leading to misclassification. Additional sensing modalities (e.g., IMUs) or redesigned sensing layouts may reduce ambiguity.
- Task domain: The Tower of Hanoi provides a controlled proxy for multi-step manipulation with a deterministic optimal solution. It does not fully represent the variability and constraints of real operational tasks, so domain-specific validation is required before practical deployment claims can be made.
- Hardware robustness: The glove requires careful fabrication and calibration. Long-term durability, comfort over extended use, and robustness to sensor drift and wear were not evaluated in this study.
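The "deterministic optimal solution" noted above is what makes the Tower of Hanoi a clean proxy: the minimum move count for n discs is 2^n − 1, and the 15-move mean in the supported condition matches the optimum for a four-disc configuration. A minimal recursive sketch of the optimal move sequence:

```python
def hanoi(n, src="A", aux="B", dst="C", moves=None):
    """Generate the optimal (2^n - 1)-move sequence for n discs,
    as (source_peg, destination_peg) pairs."""
    if moves is None:
        moves = []
    if n > 0:
        hanoi(n - 1, src, dst, aux, moves)   # move n-1 discs out of the way
        moves.append((src, dst))             # move the largest disc
        hanoi(n - 1, aux, src, dst, moves)   # re-stack n-1 discs on top
    return moves
```

This determinism is what lets the guidance system verify each step against a known-correct next move rather than merely recognizing the grasped object.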
7.6. Future Work
- Broader evaluation: Increase participant diversity and evaluate cross-user generalization using subject-independent splits (e.g., leave-one-subject-out).
- Multi-modal fusion: Combine tactile sensing with IMUs or other modalities to improve robustness for ambiguous grasps and complex interactions.
- Task-level transfer: Apply the framework to domain-specific multi-step procedures (e.g., assembly or inspection) to evaluate scalability and practical relevance.
- Improved glove design: Refine the tactile array layout and ergonomics, and evaluate long-term stability under repeated use.
8. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A. NASA-TLX Questionnaire




Appendix B. System Usability Scale Questionnaire


References
- Rathore, B. Digital Transformation 4.0: Integration of Artificial Intelligence and Metaverse in Marketing. Eduzone Int. Peer Rev. Multidiscip. J. 2023, 12, 42–48. [Google Scholar] [CrossRef]
- Hozdić, E. Smart Factory for Industry 4.0: A Review. Int. J. Mod. Manuf. Technol. 2015, 7, 28–35. [Google Scholar]
- Badue, C.; Guidolini, R.; Carneiro, R.V.; Azevedo, P.; Cardoso, V.B.; Forechi, A.; Jesus, L.; Berriel, R.; Paixão, T.; Mutz, F.; et al. Self-Driving Cars: A Survey. Expert Syst. Appl. 2021, 165, 113816. [Google Scholar] [CrossRef]
- Wang, H.; Liu, Z.; Peng, D.; Qin, Y. Understanding and Learning Discriminant Features Based on Multi-Attention 1D-CNN for Wheelset Bearing Fault Diagnosis. IEEE Trans. Ind. Inform. 2019, 16, 5735–5745. [Google Scholar] [CrossRef]
- Chalapathy, R.; Chawla, S. Deep Learning for Anomaly Detection: A Survey. arXiv 2019, arXiv:1901.03407. [Google Scholar] [CrossRef]
- Kim, H.-K.; Lee, S.; Yun, K.-S. Capacitive Tactile Sensor Array for Touch Screen Application. Sens. Actuators A Phys. 2011, 165, 2–7. [Google Scholar] [CrossRef]
- Luo, Y.; Li, Y.; Foshey, M.; Shou, W.; Sharma, P.; Palacios, T.; Torralba, A.; Matusik, W. Intelligent Carpet: Inferring 3D Human Pose from Tactile Signals. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Nashville, TN, USA, 20–25 June 2021; pp. 11255–11265. [Google Scholar]
- Sundaram, S.; Kellnhofer, P.; Li, Y.; Zhu, J.-Y.; Torralba, A.; Matusik, W. Learning the Signatures of the Human Grasp Using a Scalable Tactile Glove. Nature 2019, 569, 698–702. [Google Scholar] [CrossRef] [PubMed]
- Liu, H.; Xie, X.; Millar, M.; Edmonds, M.; Gao, F.; Zhu, Y.; Santos, V.J.; Rothrock, B.; Zhu, S.-C. A Glove-Based System for Studying Hand-Object Manipulation via Joint Pose and Force Sensing. In Proceedings of the 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, BC, Canada, 24–28 September 2017; pp. 6617–6624. [Google Scholar]
- Zhang, Q.; Li, Y.; Luo, Y.; Shou, W.; Foshey, M.; Yan, J.; Tenenbaum, J.B.; Matusik, W.; Torralba, A. Dynamic Modeling of Hand-Object Interactions via Tactile Sensing. In Proceedings of the 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Prague, Czech Republic, 27 September–1 October 2021; pp. 2874–2881. [Google Scholar]
- Pohtongkam, S.; Srinonchat, J. Object Recognition Using Glove Tactile Sensor. In Proceedings of the 2022 International Electrical Engineering Congress (iEECON), Khon Kaen, Thailand, 9–11 March 2022; pp. 1–4. [Google Scholar]
- Makaussov, O.; Krassavin, M.; Zhabinets, M.; Fazli, S. A Low-Cost, IMU-Based Real-Time On Device Gesture Recognition Glove. In Proceedings of the 2020 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Toronto, ON, Canada, 11–14 October 2020; pp. 3346–3351. [Google Scholar]
- Tashakori, A.; Jiang, Z.; Servati, A.; Soltanian, S.; Narayana, H.; Le, K.; Nakayama, C.; Yang, C.-L.; Wang, Z.J.; Eng, J.J.; et al. Capturing Complex Hand Movements and Object Interactions Using Machine Learning-Powered Stretchable Smart Textile Gloves. Nat. Mach. Intell. 2024, 6, 106–118. [Google Scholar] [CrossRef]
- Mukhiddinov, M.; Cho, J. Smart Glass System Using Deep Learning for the Blind and Visually Impaired. Electronics 2021, 10, 2756. [Google Scholar] [CrossRef]
- Silva, J.; Coelho, P.; Saraiva, L.; Vaz, P.; Martins, P.; López-Rivero, A. Validating the Use of Smart Glasses in Industrial Quality Control: A Case Study. Appl. Sci. 2024, 14, 1850. [Google Scholar] [CrossRef]
- Karagiannis, P.; Togias, T.; Michalos, G.; Makris, S. Operators Training Using Simulation and VR Technology. Procedia CIRP 2021, 96, 290–294. [Google Scholar] [CrossRef]
- Kaller, C.P.; Rahm, B.; Spreer, J.; Mader, I.; Unterrainer, J.M. Thinking Around the Corner: The Development of Planning Abilities. Brain Cogn. 2008, 67, 360–370. [Google Scholar] [CrossRef]
- Takagi, M.; Katayama, S.; Kojima, K. Examining the Problem-Solving Skills of Schizophrenic Patients Using the Hanoi Tower Task. J. Yonago Med. Assoc. 2005, 56, 61–71. [Google Scholar]
- Shen, Z.; Yi, J.; Li, X.; Lo, M.H.P.; Chen, M.Z.; Hu, Y.; Wang, Z. A Soft Stretchable Bending Sensor and Data Glove Applications. Robot. Biomim. 2016, 3, 22. [Google Scholar] [CrossRef] [PubMed]
- Shukor, A.Z.; Miskon, M.F.; Jamaluddin, M.H.; Ali@Ibrahim, F.; Asyraf, M.F.; Bahar, M.B. A New Data Glove Approach for Malaysian Sign Language Detection. Procedia Comput. Sci. 2015, 76, 60–67. [Google Scholar] [CrossRef]
- Henderson, J.; Condell, J.; Connolly, J.; Kelly, D.; Curran, K. Review of Wearable Sensor-Based Health Monitoring Glove Devices for Rheumatoid Arthritis. Sensors 2021, 21, 1576. [Google Scholar] [CrossRef] [PubMed]
- Luo, Y.; Li, Y.; Sharma, P.; Shou, W.; Wu, K.; Foshey, M.; Li, B.; Palacios, T.; Torralba, A.; Matusik, W. Learning Human–Environment Interactions Using Conformal Tactile Textiles. Nat. Electron. 2021, 4, 193–201. [Google Scholar] [CrossRef]
- Murphy, D.; Zhu, J.; Liang, P.; Matusik, W.; Luo, Y. WireSens Toolkit: An Open-Source Platform Towards Accessible Wireless Tactile Sensing. arXiv 2024, arXiv:2412.00247. [Google Scholar]
- Hart, S. Development of NASA-TLX (Task Load Index): Results of Empirical and Theoretical Research. In Human Mental Workload; Elsevier: Amsterdam, The Netherlands, 1988. [Google Scholar]
- Brooke, J. SUS: A Quick and Dirty Usability Scale. In Usability Evaluation in Industry; Taylor & Francis: London, UK, 1996; pp. 189–194. [Google Scholar]
Comparison of representative approaches (Section 2.3):
| Study/Category | Sensing Modality | Task Type | Model/Method | Output Focus |
|---|---|---|---|---|
| Sundaram et al. [8] | Dense tactile glove (>500) | Everyday objects | CNN | Object recognition; weight estimation |
| Pohtongkam and Srinonchat [11] | Tactile glove | Objects | CNN vs. BoW | Accuracy vs. speed |
| Liu et al. [9] | IMUs + tactile | Manipulation | Visualization/analysis | Pose + pressure visualization |
| Smart glasses/XR [14,15,16] | Camera-based | Guidance/training | Vision + overlays | Visual guidance; detection |
| Luo et al. [22] | Conformal tactile textiles | Interaction sensing | Learning-based | Tactile interaction modeling |
| Murphy et al. [23] | Wireless tactile toolkit | Tactile sensing platform | Toolkit/framework | Data acquisition; prototyping |
| This work | Tactile glove (88) + pressure sheet | Multi-step manipulation (proxy) | CNN + verification logic | Step verification + procedural guidance |
Share and Cite
Bharadwaj, S.S.; Sato, K.; Jing, L. Development and Evaluation of a Data Glove-Based System for Assisting Puzzle Solving. Sensors 2026, 26, 2341. https://doi.org/10.3390/s26082341