Peer-Review Record

Approximation Algorithm for Shortest Path in Large Social Networks

Algorithms 2020, 13(2), 36; https://doi.org/10.3390/a13020036
by Dennis Nii Ayeh Mensah *, Hui Gao and Liang Wei Yang
Reviewer 1: Anonymous
Reviewer 2: Anonymous
Submission received: 31 December 2019 / Revised: 28 January 2020 / Accepted: 31 January 2020 / Published: 6 February 2020

Round 1

Reviewer 1 Report

This paper proposes an efficient and scalable approximation algorithm for the shortest-path-finding (SPF) task. The proposed algorithm iteratively constructs levels of hierarchical networks, so that an approximate path can be found in the higher-level networks. The authors also employ parallel computing to speed up the proposed framework, which shows benefits compared to baselines such as CDZ, LCA, and LBFS.

 

The reviewer has the following concerns:

 

- The literature study is not very complete. The hierarchical idea for SPF used in the paper has popular prior art, such as Contraction Hierarchies, covered in many textbooks:

 

https://www.mjt.me.uk/posts/contraction-hierarchies/

 

or Geisberger, R., Sanders, P., Schultes, D., & Delling, D. (2008, May). Contraction hierarchies: Faster and simpler hierarchical routing in road networks. In International Workshop on Experimental and Efficient Algorithms (pp. 319-333). Springer, Berlin, Heidelberg

 

but it is missing from the paper. The authors should clarify their novelty relative to this algorithm and cite the original paper. Moreover, A* is commonly used as the baseline for SPF instead of Dijkstra or Floyd-Warshall; these two algorithms are too classic.

 

- When proposing new SPF algorithms, academic papers should discuss (or prove) their optimality and completeness for scientific soundness. These two properties are missing from this paper. If the algorithm is suboptimal, the authors should quantify the sub-optimality relative to the baselines.

 

- The section numbering has too many 4.0's; please correct this mistake.

Author Response

Please see the attachment

Author Response File: Author Response.pdf

Reviewer 2 Report

The topic and the results look fine.

The acronyms used throughout the paper have to be defined. Some explanation of how the authors arrived at the complexity of their algorithms needs to be given, or a reference provided.

Please provide a clear distinction among normal nodes, super nodes, central nodes, and sub nodes.

Line 132: Explain how  "... the radiuses their corresponding super nodes will range from r to 3r+1".

From Equation (2), line 141, it seems r_0 = r_1 = r_2 = ... = r_{k-1} = 0. How did the authors get r_i = 2^i - 1 on line 143?

Page 4 is very hard to read and needs to be improved.

I assume the original network at level zero (0) is undirected and unweighted, and that at the higher levels the network is undirected and weighted. This needs to be consistent throughout the paper.

 

Author Response

Please see the attachment

Author Response File: Author Response.docx

Round 2

Reviewer 1 Report

A* and Contraction Hierarchies are added in the revised version.

The reviewer does not think that the second concern has been fully addressed. Please refer to "Wagner, Glenn, and Howie Choset. Subdimensional expansion for multirobot path planning. Artificial Intelligence 219 (2015): 1-24" for the definitions of 1) completeness and 2) optimality of a path-finding algorithm. The reviewer suggests that a separate section discussing the two properties be added for scientific soundness.

The reviewer has a question for the authors: It is commonly agreed in the path-planning field that A* is faster than Dijkstra, even though it requires more memory and more operations per node, since it explores far fewer nodes and the gain is good in any case. So, if you design proper heuristics and employ A* in your Algorithm 2 instead of Dijkstra's, would the run time be faster?
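For illustration, the trade-off the reviewer describes can be sketched on a small unweighted grid (a hypothetical example; the `grid_search` function, the grid size, and the Manhattan-distance heuristic are assumptions for this sketch, not taken from the paper under review):

```python
import heapq

def grid_search(n, start, goal, heuristic=None):
    """Best-first search on an unweighted n-by-n 4-connected grid.
    With no heuristic this is Dijkstra; with an admissible heuristic it is A*.
    Returns (path cost, number of nodes expanded)."""
    h = heuristic or (lambda node: 0)          # zero heuristic -> plain Dijkstra
    dist = {start: 0}
    frontier = [(h(start), start)]             # priority queue ordered by f = g + h
    closed = set()                             # nodes already expanded
    while frontier:
        _, node = heapq.heappop(frontier)
        if node in closed:
            continue                           # skip stale heap entries
        if node == goal:
            return dist[node], len(closed)
        closed.add(node)
        x, y = node
        for nb in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nb[0] < n and 0 <= nb[1] < n:
                nd = dist[node] + 1            # all edges have cost 1
                if nd < dist.get(nb, float("inf")):
                    dist[nb] = nd
                    heapq.heappush(frontier, (nd + h(nb), nb))
    return float("inf"), len(closed)

goal = (9, 0)
manhattan = lambda node: abs(node[0] - goal[0]) + abs(node[1] - goal[1])
cost_d, exp_d = grid_search(10, (0, 0), goal)             # Dijkstra
cost_a, exp_a = grid_search(10, (0, 0), goal, manhattan)  # A*
```

Both runs return the same optimal path cost, but the A* run expands far fewer nodes than the Dijkstra run, which is the source of the speed-up the reviewer asks about.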

Author Response

It would probably be faster, especially considering the size of the graph A* would perform the search on after our algorithm compresses the larger graph. We will conduct experiments to investigate whether there could be a trade-off between cost and time. We appreciate your suggestions. Thank you.

Round 3

Reviewer 1 Report

My concerns are addressed.
