Exploring Architectural Details Through a Wearable Egocentric Vision Device
Abstract: Augmented user experiences in the cultural heritage domain are in increasing demand from the digital-native tourists of the 21st century. In this paper, we propose a novel solution that assists visitors during an outdoor tour of a cultural site using the unique first-person perspective of wearable cameras. In particular, the approach exploits computer vision techniques to retrieve architectural details by means of a robust descriptor based on the covariance of local features. Using a lightweight wearable board, the solution localizes the user with respect to the 3D point cloud of the historical landmark and provides them with information about the detail they are currently looking at. Experimental results validate the method in terms of both accuracy and computational effort. Furthermore, a user evaluation based on real-world experiments shows that the proposal is effective in enriching a cultural experience.
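The paper's exact feature set and matching pipeline are not detailed in the abstract, but the general idea of a covariance-of-local-features descriptor can be sketched as follows: stack the local feature vectors extracted from an image region into a matrix, summarize the region by their covariance, and compare two regions with a distance between covariance matrices (here a log-Euclidean distance, a common choice for symmetric positive-definite matrices). The function names and the 5-dimensional toy features below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def covariance_descriptor(features):
    """Region covariance descriptor: the d x d covariance of n local
    feature vectors (one row per sample, one column per feature
    dimension, e.g. intensity, gradient magnitude, position).
    Hypothetical sketch; the paper's actual features may differ."""
    features = np.asarray(features, dtype=float)
    return np.cov(features, rowvar=False)

def _logm_spd(C):
    """Matrix logarithm of a symmetric positive (semi-)definite matrix
    via eigendecomposition, with eigenvalues clipped for stability."""
    w, V = np.linalg.eigh(C)
    return V @ np.diag(np.log(np.maximum(w, 1e-12))) @ V.T

def covariance_distance(C1, C2):
    """Log-Euclidean distance between two covariance descriptors:
    the Frobenius norm of the difference of their matrix logarithms."""
    return np.linalg.norm(_logm_spd(C1) - _logm_spd(C2), ord="fro")

# Toy example: two regions, each with 100 local 5-dimensional features.
rng = np.random.default_rng(0)
region_a = rng.normal(size=(100, 5))
region_b = rng.normal(size=(100, 5))
C_a = covariance_descriptor(region_a)  # 5 x 5 symmetric matrix
C_b = covariance_descriptor(region_b)
print(C_a.shape)
print(covariance_distance(C_a, C_b))
```

In a retrieval setting, each catalogued architectural detail would be summarized by such a descriptor offline, and a query region from the wearable camera would be matched to the detail with the smallest distance.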
Alletto, S.; Abati, D.; Serra, G.; Cucchiara, R. Exploring Architectural Details Through a Wearable Egocentric Vision Device. Sensors 2016, 16, 237.
Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.