by
  • Scott A. Cameron,
  • Hans C. Eggers and
  • Steve Kroon

Reviewer 1: Anonymous
Reviewer 2: Anonymous

Round 1

Reviewer 1 Report

This paper presents an interesting analysis of a stochastic gradient annealed importance sampling (SGAIS) technique for estimating the marginal likelihood (ML) with improved estimation speed compared to previous approaches. The paper is written with detail and clarity. The main issues are that it is too lengthy and that it does not clearly distinguish its unique contribution from previous work. I would therefore recommend it for publication after the authors have carefully addressed my comments listed below.

  • Please make the paper more concise. For example, many of the detailed equations seem redundant because most of them are not new; please integrate or remove some of them.
  • Please clearly point out the original contributions and novelty of this paper that distinguish it from previous work. These could be listed in a table or described with bullet points. After reading through the paper, I failed to identify this important information.
  • Is the data set used in the experiments specific or general? Please explain how representative the data set is.
  • What are the potential drawbacks or risks of the proposed technique?

Author Response

Please see the attachment

Author Response File: Author Response.pdf

Reviewer 2 Report

This is a very interesting work and it is presented nicely. I would like to recommend it for publication.

Author Response

Dear Reviewer 2,

Thank you for taking the time to review our work. We have made some small changes in accordance with suggestions by Reviewer 1.

Regards,

Scott Cameron

on behalf of all authors