Integrated Analysis of Malicious Software: Insights from Static and Dynamic Perspectives
Round 1
Reviewer 1 Report
This manuscript presents a comprehensive analysis of diverse malware samples utilizing a hybrid methodology that integrates both static and dynamic analysis techniques. By systematically evaluating a wide range of malicious software—including adware, backdoors, keyloggers, ransomware, spyware, and trojans—across controlled sandbox environments, the study identifies key behavioral characteristics and operational signatures. The proposed framework leverages multiple established tools to profile malware based on file structure, entropy, API calls, network activity, and registry modifications, culminating in a comparative behavioral database. While the work provides a practical and valuable contribution to the field of malware analysis and intrusion forensics, the following revisions are suggested to further strengthen the validity, clarity, and scholarly impact of the manuscript (see detailed comments below):
- It is advisable to have the manuscript undergo professional technical English editing to refine grammar, spelling, and sentence structure, ensuring clearer conveyance of research objectives and results.
- The methodology section currently lacks a rigorous justification for the experimental design, which affects the validity and reproducibility of the study. To address this, it is recommended to provide a clear rationale for the malware sample selection criteria and a more detailed description, including software versions and network settings.
- The presentation of the results relies heavily on tables and lacks effective data visualization, making it difficult to discern patterns and compare behaviors across different malware categories. This could be significantly improved by incorporating charts, such as box plots for entropy distribution, and by performing basic statistical analysis to robustly support the observed trends.
- The discussion does not sufficiently articulate the novel contributions of this work in direct comparison to prior research, leaving the reader uncertain of its specific advancement in the field. A more explicit and systematic comparison with cited works is needed to clearly highlight the advantages of the integrated analysis approach and the diverse malware dataset used in this study.
- The conclusion outlines future work in broad and general terms, lacking technical specificity. To be more impactful, it should propose concrete and technically focused research pathways, such as the integration of extracted features into machine learning models for automated classification or a deeper investigation into cross-platform malware behavior.
- To enrich the research background, the following relevant paper is worth referencing:
[a] A Resilience Recovery Method for Complex Traffic Network Security Based on Trend Forecasting[J]. International Journal of Intelligent Systems, 2025, 2025(1): 3715086.
In summary, this study offers a meaningful contribution to the field of malware forensics by delivering a practical framework for the comparative analysis of malicious software through hybrid methodologies. However, improvements in methodological transparency, data presentation, and contextualization of the work's novelty are necessary. The manuscript would benefit from the aforementioned modifications prior to its acceptance for publication.
Author Response
Please see the attachment.
Author Response File:
Author Response.pdf
Reviewer 2 Report
The presented manuscript appears to be a practical work (or a portion of it) from an MS-level student, rather than a research paper, in my opinion.
It's not yet a scientific contribution because it lacks rigorous evaluation, comparison, reproducibility artefacts, and analytical positioning.
To publish this paper, the authors must clearly articulate the basis on which their solution outperforms existing ones, providing evaluation baselines, comparisons, and validation of the obtained results. And rewrite most of the paper in an academic tone, to be honest.
I'm sorry, but at this moment, the manuscript is far from being a good research paper to be published in a Q1 journal.
You can refer to the previous comments, but here I'll provide some comments on the English language:
- Informal, repetitive phrasing in the Introduction ("Cybersecurity is perhaps the most interesting topic...") // of course, it is, but you need to rephrase it;
-"Basically, if you have not been the victim…" // this is not academic writing.
And many more.
Additionally, you need to revise the paper and provide proper references for all figures and tables (such as referencing Figure 5.11, which I couldn't find in the paper). This fact reinforces my concerns that the paper's roots may be from the students' tutorial.
The following recommendations can be used as a plan for the improvements:
1) Define a measurable objective and build a paper and test plan for verification around it;
2) The results obtained shall be presented in a form that can be externally validated;
3) Strengthen dynamic analysis rigour;
4) Strengthen the "Related works" section via an analytical comparison matrix (criteria: sample scale, OS focus, static/dynamic scope, metrics reported, dataset availability, etc.) and clearly state what this paper adds.
Author Response
Please see the attachment.
Author Response File:
Author Response.pdf
Reviewer 3 Report
Refer to the detailed comments below.
This work uses both static and dynamic analysis techniques in identifying malicious files. Some suggestions are as follows:
The research focus and research questions are not clearly explained. It is mentioned that “The ultimate goal is to create a more robust automatic anomaly detection system” (line 58), which is not really achieved by this work.
There is not a single citation in the introduction section.
The literature review section might need a major rework in introducing the scope, the development of related technologies and the critical assessment of the research gap.
The methods section is not clear or theoretically sound. Is “identifying the most suitable analysis features” (Figure 1) actually achieved?
What is the purpose of Figure 2 in relation to the research design?
Table 4 is first referenced at line 349 but does not appear until line 559.
Lines 424–439 introduce and apply different existing solutions; how are they related to the research questions?
It is strongly recommended to divide Sections 3 and 4 into subsections and introduce them with proper meta-text.
The discussion barely connects with the existing literature, making it hard to establish the contributions of this work.
The conclusions are supposed to highlight the key findings. Still, there are claims that are not fully addressed in this work, e.g., “each OS (Operating System) has, natively, several weaknesses that can be exploited” (line 667); is this sufficiently addressed in the current manuscript?
Author Response
Please see the attachment.
Author Response File:
Author Response.pdf
Round 2
Reviewer 1 Report
All the problems have been solved. The manuscript should be accepted.
I have no further comments.
Author Response
We sincerely thank the reviewer for their positive evaluation and for acknowledging that all major issues have been addressed. We appreciate the time and effort taken to review our manuscript and are grateful for the constructive feedback provided throughout the process.
Reviewer 2 Report
The authors did implement many editorial and structural revisions, making the paper appear more academic. It's much better now, and the chances of publishing the paper have increased.
However, and I can't avoid saying this, the substantial scientific improvements still remain limited. The manuscript is still somewhat methodologically descriptive and lacks quantitative validation. The title should indeed be changed, because the current one implies coverage and depth that the content of the manuscript does not achieve.
I'll provide some major comments on the cover letter that you sent, stating to what extent my comments have been addressed in the updated version of the manuscript:
1) Introduction and related work: indeed rewritten in more formal language, but the "analytical comparison matrix" promised in the cover letter is absent, i.e. there is no visible table summarizing comparative features, only descriptive text paragraphs. Whilst you enhanced the reference analysis, I could say that this comment is partially covered, yet I expected more;
2) Methodology and research design: Section 3 updates describe a structured experimental design, the dynamic environment now lists SW/HW set, but the following critical issues are still to be answered:
a) No quantitative performance metrics beyond entropy and "yes/no" outcomes;
b) No validation method (e.g., cross-run consistency, ground-truth comparison, or baseline tools) is added.
So, statistical rigour or reproducibility evidence still needs to be included.
3) Conclusions: longer and more structured, but the critical issue is that there are no quantitative links between results and conclusions (no baselines, metrics, or comparative performance). These must be added.
As you can see, among the major concerns, the biggest are related to comparative baselines and reproducibility, in my opinion.
You have to enhance the paper by addressing these comments, and they will be further evaluated in the next review round on my end.
Author Response
Response to Reviewer Comment:
We sincerely thank the reviewer for their detailed and constructive feedback, which has been invaluable in improving the rigor and clarity of our manuscript. We have carefully addressed each of the raised concerns as follows:
Linking research questions and conclusions: The discussion section has been revised to explicitly reference and answer the three research questions introduced at the beginning of the paper. Each question is now addressed in a dedicated paragraph within the conclusion section, ensuring coherence between the study objectives and the findings.
Analytical comparison matrix: We acknowledge the previous omission and have now included a dedicated analytical comparison table in the Related Work section (Section 2). This matrix summarizes key attributes from existing studies and contrasts them with our proposed approach.
Quantitative performance metrics and validation: To strengthen the methodological rigor, we have introduced a validation subsection to describe reproducibility protocols.
Reproducibility and baselines: We now describe a baseline comparison approach using publicly available malware datasets and standard analytical tools (VirusTotal) to validate the obtained results. We also included a paragraph explaining the reasons for the yes/no values in the table.
Title revision: The title has been revised to better reflect the scope and methodological depth of the manuscript.
Overall, we have reinforced the paper’s analytical depth and methodological clarity, added quantitative indicators, and improved the connection between the research objectives, results, and conclusions as suggested.
Reviewer 3 Report
Thank you for taking my comments into consideration. The revised manuscript has improved in language, justification of methods, and structure. Still, I would like to follow up on my previous comments as follows:
Is the research gap in lines 36–37 really the one this work is trying to mitigate?
Regarding the research questions, are RQ3 and RQ4 really being addressed? In fact, the latter part of this work analyses the characteristics of different types of malware rather than the OS.
Perhaps Section 2.1 could come before Section 2.4.
Figure 1 does not reflect the hybrid analysis of this work. Technically, should creating a sandbox environment precede identifying the analysis tools and features?
Figure 2 could have illustrated the real testing conditions rather than a general network diagram.
It is observed that this revised manuscript incorporates considerably more relevant literature. Still, the review remains at a "who is saying what" level.
Minor:
It is understood that a bracketed reference can serve as a noun; still, it is a bit odd to use it as the subject of a sentence, e.g., "[6] highlights".
Refer to the major comments.
Author Response
We sincerely thank the reviewer for their insightful feedback and for acknowledging the improvements made in language, structure, and methodological justification. We have carefully addressed each of the raised concerns as follows:
- Research gap clarification (lines 36–37): The research gap section has been revised to clearly align with the scope and objectives of this study. The revised version now emphasizes the lack of comparative hybrid analyses (static and dynamic) focused on feature-based characterization of malware samples rather than OS-dependent behavior. This adjustment ensures that the gap precisely reflects the work’s focus.
- Research Questions 3 and 4: The research questions have been refined for better alignment with the study’s goals. Specifically, RQ3 now focuses on how consistent the static and dynamic characteristics are across multiple samples within the same malware category, rather than emphasizing operating system differences.
- Section order (2.1 and 2.4): The structure of Section 2 has been adjusted so that Section 2.1 now precedes Section 2.4, ensuring a smoother logical flow from theoretical foundations to applied methods.
- Figure 1 (workflow representation): Figure 1 has been updated to better illustrate the hybrid analysis workflow. The new version shows the correct sequence of activities (identification of analysis tools and feature selection preceding sandbox environment creation), accurately reflecting the methodology followed in this study.
- Figure 2 (testing environment): Figure 2 has been removed.
- Depth of literature review: The Related Work section has been revised to move beyond descriptive summaries (“who said what”) toward an analytical synthesis. A comparison table has been added for each research cluster, identifying methodological similarities, differences, and research contributions relative to our study.
- Minor stylistic issue (“[6] highlights”): References used as sentence subjects have been rephrased for stylistic clarity (for instance, “As highlighted in [6]…”), in accordance with standard academic writing conventions.
We believe these revisions significantly improve conceptual alignment, methodological transparency, and academic presentation.
Round 3
Reviewer 2 Report
The refined version of the manuscript, along with all improvements introduced by the authors in response to reviewers' comments, made it a good example of engineering-based experimental work with demonstrable methodological rigour and strong linkage to the research domain, the stated questions, and the outcomes obtained.
No more comments from my side.
Reviewer 3 Report
Thank you for addressing all my concerns. I shall have no further comments.
Nil.

