Pruning Adapters with Lottery Ticket
Round 1
Reviewer 1 Report
The authors' paper on adapter pruning is well written. The literature review is comprehensive and takes the latest research into account. In addition, the research conducted corresponds to the topic of the journal. I suggest accepting the paper in its current form.
There are a couple of typos, highlighted in the attached PDF.
Comments for author File:
Comments.pdf
Author Response
Thank you for finding the typos for us. We have corrected them in the paper.
Reviewer 2 Report
The paper proposes novel ways of pruning redundant parameters in adapters, relying on the Lottery Ticket Hypothesis.
Pruning is done on three levels: weights, neurons, and adapter layers. Adapters are pruned iteratively, and with each iteration the surviving weights are reset to their initial values. Evaluation is performed on the GLUE datasets, and subnetworks are found successfully. The results show a significant decrease in size with no performance drop; on some datasets the pruned subnetworks even outperform the original adapters.
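The iterative procedure described above (train, prune the smallest-magnitude weights, rewind survivors to their initial values) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function name `lottery_ticket_prune`, the toy `train_step` stub, and the per-round pruning fraction are all assumptions for the example.

```python
import numpy as np

def lottery_ticket_prune(w_init, train_step, rounds=3, prune_frac=0.2):
    """Hypothetical sketch of iterative magnitude pruning with rewinding.

    w_init:     initial weight vector (the values we rewind to)
    train_step: callable mapping weights -> trained weights (stubbed below)
    Each round: train the masked subnetwork, prune the prune_frac smallest
    surviving weights by magnitude, then rewind survivors to w_init.
    """
    mask = np.ones_like(w_init, dtype=bool)
    w = w_init.copy()
    for _ in range(rounds):
        w = train_step(w * mask)             # train only the surviving weights
        alive = np.abs(w[mask])
        k = int(prune_frac * alive.size)     # number of weights to prune this round
        if k > 0:
            thresh = np.sort(alive)[k - 1]
            mask &= np.abs(w) > thresh       # drop the k smallest-magnitude weights
        w = w_init.copy()                    # rewind survivors to initial values
    return mask, w * mask

# Toy usage: "training" is just a scaling stub for illustration.
rng = np.random.default_rng(0)
w0 = rng.normal(size=10)
mask, w = lottery_ticket_prune(w0, train_step=lambda w: 2.0 * w, rounds=2)
```

The same mask-and-rewind loop applies at coarser granularities (whole neurons or adapter layers) by scoring groups of weights instead of individual entries.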
Suggestions for improvements:
- the paper should be reorganized, as some sections are missing
- rework the Introduction section (give a general introduction, discuss open questions, state what issues need to be solved, explain your motivation, etc.)
- a Related Work section is missing (provide an overview of existing research in comparison with your approach)
- state your motivation for choosing Adam (e.g., over SGD)
- how were the hyperparameters chosen: arbitrarily, or motivated by some underlying theory?
- how long did it take to train the models with the proposed architecture?
- explain how the values used in the pruning strategies were chosen
- a Conclusion section is missing (rework the Discussion section so that final conclusions are separated from the discussion)
- state the possible limitations of your work (e.g., how well your findings generalize to iterative pruning on deeper networks, or the effects of model sparseness on different types of datasets)
Author Response
Please see the attachment
Author Response File:
Author Response.docx
