Article
Peer-Review Record

Design and Implementation of a Connection between Augmented Reality and Sensors

by Marlon Aguero 1, Dilendra Maharjan 1, Maria del Pilar Rodriguez 1, David Dennis Lee Mascarenas 2 and Fernando Moreu 1,*
Reviewer 1: Anonymous
Reviewer 2: Anonymous
Submission received: 18 November 2019 / Revised: 14 December 2019 / Accepted: 19 December 2019 / Published: 1 January 2020
(This article belongs to the Special Issue Advances in Inspection Robotic Systems)

Round 1

Reviewer 1 Report

The definition of remote sensing from the Oxford dictionary is "the scanning of the earth by satellite or high-flying aircraft in order to obtain information about it," so the title is misleading and the authors are encouraged to change it.

Awkward sentence (lines 23–25): "The presented experiment demonstrates that actionable data visualization can be significantly improved through the application of AR tools such as an AR headset, AR is a tool which overlays the known attributes of an object with the corresponding position on the headset screen."

The strain gauge images in Table 2 are too blurry.

In Section 2 you discuss the price differences between the two setups. Can you discuss whether there are any limitations to each of them? Are there latency issues with your approach that more costly sensors do not have? Why is your approach better?

Error bars are needed on Figure 8.

Figure 9 needs to be less blurry, or a separate side image should be added that shows what the reader should be seeing.

A literature review of existing papers that use augmented reality to visualize structural health monitoring data is missing and critical. The authors need to establish exactly how their work differs from previous literature. Was it just the new sensor development and the interaction with the database? Was it the workflow with the AR? You state that this will enable live data visualization through AR, but again, this is something that has already been done, so please clarify how your approach is better, faster, etc.

What is the program visible in Figure 9 in AR? Is that plot automatically generated?

More information is needed about the system: What is the latency? How much data can be passed at once? Does it only work with one sensor at a time? Even if only a discussion of how this could be done is added, it would suffice. Additionally, the authors should discuss whether the data stays stored at that point long term. If a user comes back tomorrow, can they rewind to see yesterday's data, or is it just a live stream?

A section about this method's limitations needs to be added.

A section is needed about the human-machine interface. What are the design principles here? What were you thinking about in terms of human factors? What is the drift on your GPS points with the AR? Where does the graph pop up? Can it be moved by a user? How does a user open and close this interface? Can they? A flow chart of user interactions would be helpful.

Author Response

Responses are attached in the Word document for your convenience. Thank you for your consideration.

The authors

Author Response File: Author Response.pdf

Reviewer 2 Report

The quality of the figures is generally low. Figures must be understandable in black and white (see Figure 8). Many of the references (articles) are quite old (5+ years). The AR technology is an important part of your article, but the text focusing on AR is short (Section 3). Much of the text lists technology details (Sections 2 and 4). A "hologram effect" is mentioned in connection with AR, yet the AR shown (Figure 9) does not have any 3D effect. How does the system know which sensor value to display? In this case only one sensor is used, but in reality more sensors are available. How does the user choose between sensor values, and how are they displayed? This is lacking in the paper. The originality of the research is unclear. How can this new sensor system be used in reality? How can it be implemented in a structure, house, or mine?

Author Response

Responses are added in the Word document file. Thank you for your consideration.

Author Response File: Author Response.pdf

Round 2

Reviewer 1 Report

The comments have been addressed; the paper is ready for publication.
