Article
Peer-Review Record

Threat Classification and Vulnerability Analysis on 5G Firmware Over-the-Air Updates for Mobile and Automotive Platforms

Electronics 2025, 14(10), 2034; https://doi.org/10.3390/electronics14102034
by Insu Oh, Mahdi Sahlabadi, Kangbin Yim and Sunyoung Lee *
Reviewer 1: Anonymous
Reviewer 2: Anonymous
Submission received: 24 March 2025 / Revised: 30 April 2025 / Accepted: 6 May 2025 / Published: 16 May 2025

Round 1

Reviewer 1 Report

Comments and Suggestions for Authors
  1. The authors need to define the research void in the introduction part of their manuscript. Research authors need to define the specific sections of Firmware Over The Air (FOTA) security that prior research has omitted. The main contributions of the research need to be explicitly stated before the end of the introduction section.
  2. The paper must analyze a broad industry problem while shifting away from the direct investigation of Samsung, Motorola, Huawei, KIA, and BMW manufacturers.
  3. The paper needs to examine previous research about FOTA protocol encryption vulnerabilities, Man-in-the-Middle attacks, and their combination with firmware tampering issues. The review section of the paper requires a thorough evaluation of current threat categorization frameworks, which include mobile security and automotive protection system elements.
  4. The authors need to describe the distinctive elements of their research connected to previously released studies. Additional information regarding security testing procedures must be inserted into the methodology section. Security testing tools, along with their Wireshark, Burp Suite, and MITMproxy implementations, should be defined in the paper, and it should specify the sample group selection and threat classification tools, which are combined with encryption standard information and test environment details.
  5. The study requires research documentation on which real-world conditions and theoretical tests were used as foundational elements.
  6. The section must present numeric findings. A research presentation should display both the frequency of vulnerability targeting along with detailed descriptions of security problems. The research findings of this study must be compared to prior work to establish if similar weaknesses appear. The implementation of a risk assessment framework should help evaluate the impact that each vulnerability generates.

Author Response

Comments 1: The authors need to define the research void in the introduction part of their manuscript. Research authors need to define the specific sections of Firmware Over The Air (FOTA) security that prior research has omitted. The main contributions of the research need to be explicitly stated before the end of the introduction section.

Response 1: In the Introduction, on line 44, we now describe the specific aspects of FOTA security that prior research has omitted, which constitutes the research gap.

Comments 2: The paper must analyze a broad industry problem while shifting away from the direct investigation of Samsung, Motorola, Huawei, KIA, and BMW manufacturers.

Response 2: In line 17 of the Introduction, the paper raises security concerns about the FOTA update process in general, not only for the specific manufacturers investigated.

Comments 3: The paper needs to examine previous research about FOTA protocol encryption vulnerabilities, Man-in-the-Middle attacks, and their combination with firmware tampering issues. The review section of the paper requires a thorough evaluation of current threat categorization frameworks, which include mobile security and automotive protection system elements.

Response 3: In the Related Work section, line 115, we further analyze three representative FOTA security threats (MitM, authentication bypass, and downgrading), including a review of previous work and an evaluation of the overall security verification framework.

Comments 4: The authors need to describe the distinctive elements of their research connected to previously released studies. Additional information regarding security testing procedures must be inserted into the methodology section. Security testing tools, along with their Wireshark, Burp Suite, and MITMproxy implementations, should be defined in the paper, and it should specify the sample group selection and threat classification tools, which are combined with encryption standard information and test environment details.

Response 4: In the related research section, on line 145, we divided the discussion into three parts (mobile, IoT, and automotive) and summarized the distinctive elements of the related studies, their analysis methods, and other details.

Comments 5: The study requires research documentation on which real-world conditions and theoretical tests were used as foundational elements.

Response 5: Chapter 5 describes a sample of the research process based on real-world conditions and theoretical testing, with one manufacturer example for validating each threat.

Comments 6: The section must present numeric findings. A research presentation should display both the frequency of vulnerability targeting along with detailed descriptions of security problems. The research findings of this study must be compared to prior work to establish if similar weaknesses appear. The implementation of a risk assessment framework should help evaluate the impact that each vulnerability generates.

Response 6: It is difficult to quantify the results for each vulnerability because manufacturers use different communication protocols and integrity verification algorithms for FOTA updates; instead, we referenced existing papers (refs. 50-54) that compare the security strength of these communication protocols and integrity verification algorithms.

 

Reviewer 2 Report

Comments and Suggestions for Authors

The researchers provide a practically valuable and timely analysis of FOTA security vulnerabilities across multiple vendors. They propose a security testing framework based on real-world threat scenarios and validate it through cross-manufacturer experiments. However, I have the following comments:

  1. The paper has many claims in many different places that require confirmations from other works (citations), for example and not limited to: in line 202, 2013, 215, 320, 333, etc.

  2. In many places, the phrasing is unclear or technically inaccurate. For instance, line 179 states, “The attack scenarios consist of security threats that attackers can use,” which is conceptually incorrect. Attackers exploit vulnerabilities, not threats, to launch attacks. The authors must ensure conceptual clarity and accurate use of security terminology.
  3. The security testing process diagram in Figure 4 is logically incorrect. It implies that attack scenarios lead to vulnerabilities, whereas the relationship is the opposite; vulnerabilities can be exploited via different attack scenarios. The figure requires both revision and a more thorough explanation in the text.
  4. In table 1, not all of the vulnerabilities are actually vulnerabilities; they’re instead threats. The authors need to differentiate between these security concepts. 

    “Vulnerabilities are weak points in a system that can be exploited to perform attacks, they can be code, design, or network management errors. Threats are possible ways to take advantage of vulnerabilities to perform attacks. Several threats may use the same vulnerability. Attacks are the realization of threats, which lead to some misuse of information. A misuse of information (impact of attack) is the result of an attack and is a goal of the attacker, e.g., a denial of service or a data leakage.” 

    Source: Alnaim, A.K. Securing 5G virtual networks: a critical analysis of SDN, NFV, and network slicing security. Int. J. Inf. Secur. 23, 3569–3589 (2024). https://doi.org/10.1007/s10207-024-00900-5

    5. Section 4.5 is very weak and requires more development, and shows no results.

    6. How can we make sure the validity of the scenarios mentioned in Section 5? The researchers need to validate the possibility of these security attack scenarios.

    7. The framework the researchers propose is little more than a checklist of existing attack vectors with superficial verification steps. There is no new algorithm, protocol enhancement, or significant framework presented that justifies publication in a reputable journal.

    8. The authors mentioned that they have tested seven different manufacturers but provide no evidence, no sample logs, no screenshots, and no detailed procedural breakdown for their experiment. 

    9. The analysis is entirely qualitative. There is no statistical evidence, no attack success rates, no timing metrics, and no quantitative comparison between methods.

    10. The researchers said, “Our proposed framework identifies critical security flaws…”. Yet, there is no actual framework, just an enumeration of common threats. They offer no real mitigation techniques.

    I recommend that the researchers first conduct a comparative analysis with recent FOTA security frameworks to strengthen their proposed security testing process.

    11. The conclusion section needs to be rewritten; it is more like an introduction, however, it should summarize the results generated from the research and some future works. 

    12. The manuscript requires extensive English reviewing as there are many grammar and structure mistakes, for example: “MitM aand certificate bypass”, “firwmare update precess”, etc. 

Comments on the Quality of English Language

The manuscript requires extensive English reviewing as there are many grammar and structure mistakes, for example: “MitM aand certificate bypass”, “firwmare update precess”, etc. 

Author Response

Comment 1: The paper has many claims in many different places that require confirmations from other works (citations), for example and not limited to: in line 202, 2013, 215, 320, 333, etc.

Response 1: We've added citations to Chapters 4 and 5 and backed up our claims with analysis from real-world experiments.

Comment 2: In many places, the phrasing is unclear or technically inaccurate. For instance, line 179 states, “The attack scenarios consist of security threats that attackers can use,” which is conceptually incorrect. Attackers exploit vulnerabilities, not threats, to launch attacks. The authors must ensure conceptual clarity and accurate use of security terminology.

Response 2: We revised the framework analysis procedure so that attack scenarios are derived from security threats, as shown in Figure 4 in Chapter 4.

Comment 3: The security testing process diagram in Figure 4 is logically incorrect. It implies that attack scenarios lead to vulnerabilities, whereas the relationship is the opposite; vulnerabilities can be exploited via different attack scenarios. The figure requires both revision and a more thorough explanation in the text.

Response 3: Modified Figure 4 to follow the security testing process suggested in Chapter 4.

Comments 4: In table 1, not all of the vulnerabilities are actually vulnerabilities; they’re instead threats. The authors need to differentiate between these security concepts.

Response 4: Revised Table 1 in Chapter 4, replacing the vulnerabilities with threats, to better reflect the distinction between vulnerabilities and threats.

Comments 5: Section 4.5 is very weak and requires more development, and shows no results.

Response 5: The results for Section 4.5 are presented in Chapter 6, where we evaluated the threats by manufacturer, and Section 4.4 was revised to cover errors in the manufacturers' implementations.

Comments 6: How can we make sure the validity of the scenarios mentioned in Section 5? The researchers need to validate the possibility of these security attack scenarios.

Response 6: Chapter 5 describes a sample of the research process based on real-world conditions and theoretical testing, with one manufacturer example for validating each threat.

Comments 7: The framework the researchers propose is little more than a checklist of existing attack vectors with superficial verification steps. There is no new algorithm, protocol enhancement, or significant framework presented that justifies publication in a reputable journal.

Response 7: In Chapters 4 and 5, we presented a validation environment and analysis framework for verifying the safety of the existing FOTA update process against the security threats (MitM, certificate bypass, and downgrading), supported by real-world experiments.
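
As an illustration only (not the authors' actual test setup), a MitM check of this kind is often scripted as a proxy addon that flags firmware package downloads passing through the intercepted link. The sketch below assumes mitmproxy is used, as suggested in the reviewer comments; the URL markers and output file name are hypothetical.

    # mitm_fota_check.py -- minimal sketch of a mitmproxy addon for FOTA MitM testing.
    # Run with: mitmdump -s mitm_fota_check.py  (device traffic proxied through this host)
    from mitmproxy import http

    FIRMWARE_HINTS = (".zip", ".bin", "firmware", "fota")  # hypothetical URL markers

    def response(flow: http.HTTPFlow) -> None:
        url = flow.request.pretty_url.lower()
        if flow.response is not None and any(h in url for h in FIRMWARE_HINTS):
            size = len(flow.response.content or b"")
            print(f"[FOTA] possible firmware download: {url} ({size} bytes)")
            # Saving the payload lets the tester re-serve a modified package later
            # to check whether the device's integrity verification rejects it.
            with open("captured_firmware.bin", "wb") as f:
                f.write(flow.response.content or b"")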

Comments 8: The authors mentioned that they have tested seven different manufacturers but provide no evidence, no sample logs, no screenshots, and no detailed procedural breakdown for their experiment.

Response 8: Added evidence, sample logs, and screenshots for each security threat on lines 330, 364, and 386 in Chapter 5 and added detailed procedural analysis.

Comments 9: The analysis is entirely qualitative. There is no statistical evidence, no attack success rates, no timing metrics, and no quantitative comparison between methods.

Response 9: In Chapter 6, we presented the security threats that are possible for each of the devices from the seven manufacturers; the framework can also be used in future research to verify the safety of newly developed devices that receive FOTA updates.

Comments 10: The researchers said, “Our proposed framework identifies critical security flaws…”. Yet, there is no actual framework, just an enumeration of common threats. They offer no real mitigation techniques. I recommend that the researchers first conduct a comparative analysis with recent FOTA security frameworks to strengthen their proposed security testing process.

Response 10: We analyzed the safety against three representative FOTA security threats (MitM, certificate bypass, and downgrading) by configuring a real-world experimental environment. As there is no similar framework, we added similar papers on security verification as references.

Comments 11: The conclusion section needs to be rewritten; it is more like an introduction, however, it should summarize the results generated from the research and some future works. 

Response 11: We have rewritten the results part in Chapter 7 to summarize our findings and future work.

Comments 12: The manuscript requires extensive English reviewing as there are many grammar and structure mistakes, for example: “MitM aand certificate bypass”, “firwmare update precess”, etc.

Response 12: Reviewed and corrected English grammar and typos throughout the paper.

 

Round 2

Reviewer 1 Report

Comments and Suggestions for Authors
  1. In the current manuscript, there are some repeats in the descriptions of two specific attack types that combine a Man-in-the-Middle (MITM) attack with certificate bypass vulnerabilities. The text maintains repetitive mentions of those concepts in various sections without showing any modification. A single subsection containing this information must be created because it can strengthen the manuscript's clarity and coherence. The manuscript contains a single subsection with clear definitions that readers can understand by using consistent cross-section referencing within other parts of the text.
  2. Sections 4.3 and 5.2 present a large number of complex textual blocks which creates obstacles for readers to easily understand the content. The sections should be split into compact subsections to enable better readability and faster comprehension of main ideas by readers. Short bullet points organized through text will improve both the speed of comprehension and the identification of essential information.
  3. Table and figure references show inconsistency because some cases lack full citations both in the text and within the references. Every reference needs to follow the reference guidelines defined by the journal which accepts either APA or IEEE or any other academic format. The manuscript achieves greater academic strength when every reference follows one standard format.
  4. The manuscript needs stronger clarification about new contributions from implementing the Funambol testbed and its implementation steps. The manuscript will benefit from answering these questions either in the introduction section or the conclusion.
  5. Does the deployment of the Funambol environment represent an innovation in its respective field? The comparable results involving seven manufacturers analysis add new significant knowledge to current scholarly understanding.
  6. The research paper demonstrates consistent relevance to the main topics appearing in journals specializing in cybersecurity, telecommunications, and IoT studies. The research provides important solutions to security matters involving mobile platforms, OTA standards, and embedded systems, enhancing contemporary academic dialog.
  7. The manuscript should incorporate a performance comparison table that evaluates the testing outcomes between MD5 and HMAC protocols. A performance assessment table contains vital measurements, CPU utilization statistics, resistance to collisions, and successful MITM attack performance evaluation. The review should include recent OTA environment benchmark studies that analyze these algorithms, such as Gupta et al. (2016) and Kamra et al. (2019). Your research base will benefit from this inclusion which establishes links to the current scientific literature.
  8. The manuscript displays a well-constructed threat simulation but fails to provide comprehensive details about how these findings would benefit original equipment manufacturers (OEMs), telecommunications providers and regulatory bodies at an industrial level. Including a one-paragraph subsection on practical implications would enhance the manuscript in both the conclusion and discussion sections. In this section, clearly state:
  9. The paper demonstrates how industry manufacturers should apply discovered findings to improve their security procedures. The article examines the consequences that regulatory security audits conducted by NIST, ISO and IEC face regarding standards for automotive technology or IoT devices. The security measure recommendations for Android firmware providers should be developed as a result of your findings.
  10. Considerations on using the Funambol OMA-DM testbed as a research instrument need mention since its results apply specifically to Android devices. You should discuss the built-in limitations of your testbed through a quick paragraph to resolve this question. The simulation includes conditions of unencrypted firmware package access and configurable OMA-DM setup because it works under such assumptions. This method would miss critical limitations that exist in secure systems such as the ones found in iOS platforms. The inclusion of detailed explanations will make the study findings more trustworthy and reliable when applied to your work.
  11. The manuscript conducts essential and practical research while using test methodologies to tackle a critical safety problem within 5G and Internet of Things security domains. The paper presents good potential for academic publication when authors make a few changes to structure and normalize citations while minimizing repetitive content.

Author Response

Comments 1: In the current manuscript, there are some repeats in the descriptions of two specific attack types that combine a Man-in-the-Middle (MITM) attack with certificate bypass vulnerabilities. The text maintains repetitive mentions of those concepts in various sections without showing any modification. A single subsection containing this information must be created because it can strengthen the manuscript's clarity and coherence. The manuscript contains a single subsection with clear definitions that readers can understand by using consistent cross-section referencing within other parts of the text.

Response 1: Removed Table 1 in the Background section and created subsections 2.2 Man-in-the-Middle Attacks, 2.3 Certificate Bypass Vulnerabilities, and 2.4 Downgrade Attacks on line 120, adding consistent descriptions for easier understanding.

Comments 2: Sections 4.3 and 5.2 present a large number of complex textual blocks which creates obstacles for readers to easily understand the content. The sections should be split into compact subsections to enable better readability and faster comprehension of main ideas by readers. Short bullet points organized through text will improve both the speed of comprehension and the identification of essential information.

Response 2: In Sections 4.3 and 5.2, we rewrote the text to make it easier to understand where it was unorganized, confusing, or hard to follow.

Comments 3: Table and figure references show inconsistency because some cases lack full citations both in the text and within the references. Every reference needs to follow the reference guidelines defined by the journal which accepts either APA or IEEE or any other academic format. The manuscript achieves greater academic strength when every reference follows one standard format.

Response 3: Added references that were missing citations and reformatted them all to follow a single standardized format based on APA formatting.

Comments 4: The manuscript needs stronger clarification about new contributions from implementing the Funambol testbed and its implementation steps. The manuscript will benefit from answering these questions either in the introduction section or the conclusion.

Response 4: In the conclusion in Section 7, we add that the introduction of a Funambol testbed for FOTA safety verification is a new contribution of the paper.

Comments 5: Does the deployment of the Funambol environment represent an innovation in its respective field? The comparable results involving seven manufacturers analysis add new significant knowledge to current scholarly understanding.

Response 5: Building a Funambol environment is an innovative way to test the security of the FOTA update process. The results of the comparative analysis of the seven manufacturers provide a sample from which to gauge the current state of the security measures in manufacturers' FOTA updates.

Comments 6: The research paper demonstrates consistent relevance to the main topics appearing in journals specializing in cybersecurity, telecommunications, and IoT studies. The research provides important solutions to security matters involving mobile platforms, OTA standards, and embedded systems, enhancing contemporary academic dialog.

Response 6: The paper addresses the security concerns of implementing a FOTA update environment for manufacturers and presents solutions.

Comments 7: The manuscript should incorporate a performance comparison table that evaluates the testing outcomes between MD5 and HMAC protocols. A performance assessment table contains vital measurements, CPU utilization statistics, resistance to collisions, and successful MITM attack performance evaluation. The review should include recent OTA environment benchmark studies that analyze these algorithms, such as Gupta et al. (2016) and Kamra et al. (2019). Your research base will benefit from this inclusion which establishes links to the current scientific literature.

Response 7: Because a hash algorithm works on the same principle regardless of the OTA environment, there is no need to compare it across real-world environments. To compare the security strength of the algorithms, other studies can be consulted; we have added a section on security comparisons in Chapter 6.
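
To make this point concrete, the following minimal Python sketch (illustrative only, not taken from the paper) contrasts a plain MD5 digest with a keyed HMAC for firmware integrity: an on-path attacker who alters the package can simply recompute the MD5 value, but cannot forge the HMAC without the shared key, regardless of the OTA environment. The payload and key shown are placeholders.

    # Plain MD5 digest vs keyed HMAC-SHA256 for firmware integrity (illustrative only).
    import hashlib
    import hmac

    firmware = b"...firmware image bytes..."   # placeholder payload
    key = b"device-provisioned-secret"         # hypothetical shared key

    md5_tag = hashlib.md5(firmware).hexdigest()                     # attacker can recompute
    hmac_tag = hmac.new(key, firmware, hashlib.sha256).hexdigest()  # forging requires the key

    def verify_hmac(package: bytes, expected: str) -> bool:
        candidate = hmac.new(key, package, hashlib.sha256).hexdigest()
        return hmac.compare_digest(candidate, expected)             # constant-time compare

    tampered = firmware + b"\x00"
    assert verify_hmac(firmware, hmac_tag)
    assert not verify_hmac(tampered, hmac_tag)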

Comments 8: The manuscript displays a well-constructed threat simulation but fails to provide comprehensive details about how these findings would benefit original equipment manufacturers (OEMs), telecommunications providers and regulatory bodies at an industrial level. Including a one-paragraph subsection on practical implications would enhance the manuscript in both the conclusion and discussion sections. In this section, clearly state:

Response 8: Added a discussion in Section 6.3 about the utility of our findings in industry.

Comments 9: The paper demonstrates how industry manufacturers should apply discovered findings to improve their security procedures. The article examines the consequences that regulatory security audits conducted by NIST, ISO and IEC face regarding standards for automotive technology or IoT devices. The security measure recommendations for Android firmware providers should be developed as a result of your findings.

Response 9: Added a new section in Section 6.2 about NIST's recommendations for security measures for industry as a result of the FOTA update.

Comments 10: Considerations on using the Funambol OMA-DM testbed as a research instrument need mention since its results apply specifically to Android devices. You should discuss the built-in limitations of your testbed through a quick paragraph to resolve this question. The simulation includes conditions of unencrypted firmware package access and configurable OMA-DM setup because it works under such assumptions. This method would miss critical limitations that exist in secure systems such as the ones found in iOS platforms. The inclusion of detailed explanations will make the study findings more trustworthy and reliable when applied to your work.

Response 10: We mentioned the limitations of the Funambol OMA-DM testbed as it only covers Android-based devices in Section 6.2. We presented an alternative validation method for the iOS platform.

Comments 11: The manuscript conducts essential and practical research while using test methodologies to tackle a critical safety problem within 5G and Internet of Things security domains. The paper presents good potential for academic publication when authors make a few changes to structure and normalize citations while minimizing repetitive content.

Response 11: Overall, we have reorganized the manuscript to reduce repetition and adjust the structure.

Reviewer 2 Report

Comments and Suggestions for Authors

I thank the authors for the improvement they have shown in the paper. I have a few more comments:

  1. While the study examines six mobile manufacturers and one vehicle model, this may not be representative of the entire industry. It could be difficult to expand the sample size to be fully representative of the industry, but the authors should at least clarify how the manufacturers were selected for analysis.
  2. The title of the paper indicates 5G, however, the connection between 5G technology and specific security challenges for FOTA updates should be strengthened in the introduction and throughout the paper.
  3. There should be at least threats/attacks oriented in 5G technologies. 
  4. The methodology section should include more details about the Funambol implementation and configuration used for testing.
  5. The paper identifies vulnerabilities but provides limited specific recommendations for how manufacturers should address these issues.
  6. Every study has some limitations; the authors should discuss the limitations of their testing approach.
  7. In line 32, there's a reference to "OMA-DM" technology for firmware updates, but the corresponding reference [5] only mentions "OTA-oriented Protocol for Security Protection" without clearly explaining OMA-DM's role in firmware updates.
  8. In lines 135-137, there are multiple references [32] and [11][22][23][24][25] mentioned in a single paragraph discussing security issues, but reference [32] appears to be out of sequence (as later references start from [26]) and may be incorrectly cited.
  9. The paper mentions "fishbone diagram" multiple times (lines 69, 73, 125, 192) as a key analytical framework, but doesn't properly cite the methodological source for this approach.
  10. Some references appear to be cited in the text but have incomplete information in the reference list. For example, reference [13] and [14] about OMA-DM are cited together in line 85, but [14] in the reference list (lines 479-481) appears to be incomplete with "Draft ver 1 (2006)" without full publication details.

Author Response

Comments 1: While the study examines six mobile manufacturers and one vehicle model, this may not be representative of the entire industry. It could be difficult to expand the sample size to be fully representative of the industry, but the authors should at least clarify how the manufacturers were selected for analysis.

Response 1: The manufacturers were selected by sampling representative Android-based mobile device manufacturers, and we have added them to the limitations section in Section 6.2. 

Comments 2: The title of the paper indicates 5G, however, the connection between 5G technology and specific security challenges for FOTA updates should be strengthened in the introduction and throughout the paper.

Response 2: Added a reference to the 5G FOTA in the introduction on line 39.

Comments 3: There should be at least threats/attacks oriented in 5G technologies.

Response 3: Added a reference on 5G-oriented FOTA threats/attacks, as indicated in the paper title, to line 39 of the Introduction.

Comments 4: The methodology section should include more details about the Funambol implementation and configuration used for testing.

Response 4: Added specific configuration and Funambol implementation instructions to Section 4.1.

Comments 5: The paper identifies vulnerabilities but provides limited specific recommendations for how manufacturers should address these issues.

Response 5: The paper identifies vulnerabilities that may occur during the FOTA update process and helps manufacturers implement a secure FOTA update process.

Comments 6: Every study has some limitations; the authors should discuss the limitations of their testing approach.

Response 6: We mentioned the limitations of the Funambol OMA-DM testbed as it only covers Android-based devices in Section 6.2. We presented an alternative validation method for the iOS platform.

Comments 7: In line 32, there’s a reference to "OMA-DM" technology for firmware updates, but the corresponding reference [5] only mentions "OTA-oriented Protocol for Security Protection" without clearly explaining OMA-DM's role in firmware updates.

Response 7: On line 32, changed reference [5] to a reference on the OMA-DM-based OTA protocol technology.

Comments 8: In lines 135-137, there are multiple references [32] and [11][22][23][24][25] mentioned in a single paragraph discussing security issues, but reference [32] appears to be out of sequence (as later references start from [26]) and may be incorrectly cited.

Response 8: Fixed the out-of-order references throughout the article.

Comments 9: The paper mentions "fishbone diagram" multiple times (lines 69, 73, 125, 192) as a key analytical framework, but doesn’t properly cite the methodological source for this approach.

Response 9: The fishbone diagram approach is part of the holistic security threat categorization proposed in this paper.

Comments 10: Some references appear to be cited in the text but have incomplete information in the reference list. For example, references [13] and [14] about OMA-DM are cited together in line 85, but [14] in the reference list (lines 479-481) appears to be incomplete with "Draft ver 1 (2006)" without full publication details.

Response 10: Changed all incomplete references, such as [13] and [14], to the latest version of complete references.
