Article
Peer-Review Record

Laban-Inspired Task-Constrained Variable Motion Generation on Expressive Aerial Robots

by Hang Cui 1, Catherine Maguire 2,3,4 and Amy LaViers 1,*
Reviewer 1: Anonymous
Reviewer 2: Anonymous
Submission received: 13 December 2018 / Revised: 12 March 2019 / Accepted: 18 March 2019 / Published: 27 March 2019

Round 1

Reviewer 1 Report

The paper is clearly explained and well structured, and thus easily readable.

 

I found that the paper takes a very similar approach to applying Effort to UAV motion as the paper of Sharma, M.; Hildebrandt, D.; Newman, G.; Young, J.E.; Eskicioglu, R. Communicating affect via flight path: Exploring use of the Laban Effort System for designing affective locomotion paths. 2013 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI), 2013, pp. 293–300 (which is cited by the authors).

I would like more explanation of what is really new in this research and how its approach differs from that previous study.

 

In Section 3, the authors explain that one task has been generated in different Effort configurations (Table 1). The paper then says that a CMA observed each simulated motion and evaluated and notated each of them in Motif notation. In Section 3.1, the paper mentions that the CMA's evaluation and notation in Motifs were much more detailed than the researchers expected, as I understood it. However, I do not understand the purpose of this process in which the CMA observes and notates robot motions that have already been classified by Effort criteria. Is it a kind of verification process to assess whether the simulated motions with Effort configurations truly correspond to the CMA's observations?

 

Concerning the description of Motif criteria using words so that lay individuals can assess the robotic motion, I would like more explanation about who chose these words and how. I think that such words (hurried, puffy, chaotic) might result from a subjective interpretation of the robotic motion by those who chose them. Effort enables classification of the quality of motion, but attaching such word labels to a motion warrants more investigation. For instance, a motion classified as Time-sudden and Space-direct may give us an impression of being 'hurried', but that is only an impression and an interpretation. Moreover, some words are confusing, like 'clear'. What does it mean for a motion to be 'clear'? Lay observers may be confused by such terminology if they are asked to assess the robotic motion, or the robot's intention as expressed by its motion.

 

In the introduction, the authors evoke the potential adoption of the expressive UAV in in-home situations for the elderly population. I find it hard to imagine the use of this kind of robot in an in-home context and for the elderly population, even though I fully agree with the need to generate appropriate and safe robotic motion. In the conclusion, there is no discussion of the potential adoption of the expressive UAV. It is advisable to develop further the usefulness of this kind of robot in an in-home context.

 

Here are a few minor suggestions.

 

-      It is preferable to write 'Unmanned Aerial Vehicle (UAV)' at first mention instead of directly using the abbreviation.

-      Please add 'UGV', Unmanned Ground Vehicle, to the Abbreviations.


Author Response

See attached file.

Author Response File: Author Response.pdf

Reviewer 2 Report

This work presents a method for making UAVs expressive through an algorithmic procedure for generating variable motion. There are several ambiguous parts throughout the paper, as follows.


1. I recommend inserting an overview diagram in Section 1 to help readers understand what the authors are trying to deliver. The paper is very verbose in explaining the method, which makes it hard to follow the authors' intent.

2. Please convert the verbose description of the methods into equations to make it more concise.

3. Please provide a sufficient comparison between the presented method and a general path-planning method. With this comparison result, the advantage of the presented method can be evaluated.

4. In Section 2, the four motion factors of time, weight, space, and flow in the LMA method are concepts created by Laban [22-24], and the authors adapted the LMA method for their purpose. To show what is indeed improved over the original method, please provide a comparison between the two methods.

 

This reviewer would be happy to review a revised version of this paper.


Author Response

See attached.

Author Response File: Author Response.pdf

Round 2

Reviewer 2 Report

This work presents a method for making UAVs expressive through an algorithmic procedure for generating variable motion. There are several ambiguous parts throughout the paper, as follows.


1. I recommend inserting an overview diagram in Section 1 to help readers understand what the authors are trying to deliver. The paper is very verbose in explaining the method, which makes it hard to follow the authors' intent.

>> It would be better to enlarge the plots shown in the upper part of the “Variable Motion Generation” and “Expert Observation” blocks in Figure 2.

 

2. Please convert the verbose description of the methods into equations to make it more concise.

>> Satisfied.

 

3. Please provide a sufficient comparison between the presented method and a general path-planning method. With this comparison result, the advantage of the presented method can be evaluated.

>> Satisfied.

 

4. In Section 2, the four motion factors of time, weight, space, and flow in the LMA method are concepts created by Laban [22-24], and the authors adapted the LMA method for their purpose. To show what is indeed improved over the original method, please provide a comparison between the two methods.

>> Satisfied.

 

This reviewer would be happy to review a revised version of this paper.
