Special Issue "Cyber Resilience"
Deadline for manuscript submissions: 31 December 2020.
Interests: Computer science; Cyber security; Cyber resilience
Modern cyber systems, especially those based on the technologies of Industry 4.0 (Artificial Intelligence (AI), cloud and fog computing, 6G, IoT/IIoT, Big Data and ETL, quantum computing, Blockchain, VR/AR, and others), do not have the cyber resilience required for targeted operation under conditions of heterogeneous mass cyberattacks. This is due to the high structural and functional complexity of these systems, the potential danger of existing vulnerabilities, and dormant hardware and software implants, the so-called “digital bombs”. Moreover, modern cyber security tools, including antivirus protection, vulnerability scanners, and systems for detecting, preventing, and neutralizing computer attacks, are still not sufficiently effective. The classical methods and means of ensuring reliability, response, and recovery, which rely on structural and functional redundancy, N-fold reservation, standardization, and reconfiguration, are no longer suitable, as they neither provide the required cyber resilience nor prevent catastrophic consequences.
The above poses a problematic situation: a contradiction between the ever-increasing need to ensure the cyber resilience of critical information infrastructure under conditions of destructive software impacts and the imperfection of the methods and means for the timely detection, prevention, and neutralization of cyberattacks. Removing this contradiction requires solving an urgent scientific and technical problem: the organization of cyber resilience of information infrastructure under heterogeneous mass cyberattacks, based on new models and methods of similarity theory, big data collection and processing, stream data extract, transform, and load (ETL), deep learning, and semantic and cognitive analysis.
Problems of ensuring the reliability, response, recovery, and cyber resilience of critical information infrastructure and related information have long received the attention of leading foreign and domestic scientific researchers. However, under conditions of heterogeneous mass cyberattacks (especially previously unknown ones), it is necessary to ensure the cyber resilience of critical information infrastructure such that the process of restoring the functioning of its component systems, in the course of destructive programmatic impacts, reduces significant or catastrophic consequences. The main idea behind this solution is to use the infrastructure’s ability to develop immunity to disturbances of its computational processes under attack, similarly to the immune system protecting a living organism. This requires resolving a scientific problem: the organization of cyber resilience of information infrastructure in the context of heterogeneous mass cyberattacks, based on new models and methods of similarity theory, big data collection and processing (ETL), deep learning, and semantic and cognitive analysis. The main goal is to provide the required level of cyber resilience of the aforementioned systems under conditions of both known and previously unknown destructive program actions.
Prof. Dr. Sergei A. Petrenko
Manuscript Submission Information
Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.
Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Information is an international peer-reviewed open access monthly journal published by MDPI.
Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1000 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.
- Cyber systems of Industry 4.0
- Cyber resilience management concept
- Quantitative metrics and cyber resilience measures
- Cyber resiliency engineering framework
- Business continuity management
The list below represents only planned manuscripts. Some of these manuscripts have not yet been received by the Editorial Office. Papers submitted to MDPI journals are subject to peer review.
Title: Estimation of Efficiency of Transfer of Big Data on the Example of PostgreSQL Database, Hadoop Platform and Sqoop System
Authors: Badanina O.V.; Gindin S.I.; Khomonenko A.D.
Affiliation: Emperor Alexander I St. Petersburg State Transport University; A.F. Mozhaisky Military Space Academy
Abstract: The evaluation of the efficiency characteristics of big data transfer is considered, using the PostgreSQL demo database as the data source, the Hadoop platform as the data storage tool, and the Sqoop system as the data transfer tool. The experimental results demonstrate how the parameters characterizing the efficiency of data transfer during import change depending on data volume and type. The obtained experimental results can be used for the predictive calculation of the performance characteristics of big data processing; such a forecast can be made by using the obtained efficiency characteristics as a training sample for a neural network. In addition, it is advisable to forecast the speed of big data processing using models of multichannel queuing systems with heating and cooling, as well as queuing networks. It is also advisable to calculate the efficiency indicators for moving big data in various operating modes of the information infrastructure, for the optimal disaster recovery of data warehouses and for ensuring the required cyber resilience of digital platforms in general.
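As a minimal sketch of the kind of predictive calculation this abstract describes (not code from the paper; the measurement figures and the linear model are purely hypothetical), import-time measurements can be fitted with a simple linear model and extrapolated to larger data volumes:

```python
# Illustrative sketch: fit time = a + b * volume to hypothetical
# (volume, elapsed time) measurements from bulk imports, then
# extrapolate the expected time for a larger transfer.

def fit_linear(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Hypothetical measurements: imported volume (MB) vs elapsed time (s)
volumes = [100, 200, 400, 800]
times = [12.0, 22.0, 42.0, 82.0]

a, b = fit_linear(volumes, times)
predicted = a + b * 1600  # extrapolated time for a 1600 MB import
print(round(a, 2), round(b, 3), round(predicted, 1))
```

In practice, a sample like this would come from repeated timed imports at different volumes and data types; the abstract suggests the same measurements could instead train a neural network or parameterize a queuing model.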
Title: A Mathematical Method of Ensuring the Recovery of Big Data in the Face of Its Losses in Global Disasters
Authors: Evgenii Vorobiev
Affiliation: Saint-Petersburg State Electrotechnical University
Abstract: Problem statement: The most widely used lossless compression methods, as a rule, exploit the statistical properties of individual bytes or bits of a text or image. The most common lossless compression algorithms are based on variable-length codes: compression is achieved by assigning short codes to frequently occurring data elements and long codes to rarely occurring ones. A significant limitation of this approach is the relatively small degree of compression achievable when no losses are allowed. The compression operation is an elementary cryptographic operation. Aim: The aim of the work is to develop mathematical operations for cryptographic primitives based on a vector representation model of multi-bit binary data using pseudo-regular numbers. Novelty: The proposed approach applies the properties of binary numbers associated with their mathematical and structural dependence on pseudo-regular numbers to obtain a short record of them. Result: This increases the degree of compression of large binary data and increases the information security of its transmission and storage. In addition, a mathematical model and a description of the algorithm for converting multi-digit binary numbers to a pseudo-regular structure based on binary-decimal transformation are proposed, and the distribution of numbers with a pseudo-regular structure in ordered number fields is shown. Practical significance: The proposed solution can find practical application in backing up data to increase the information security and availability of advanced information and computing systems operating under conditions of destructive impacts.
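For contrast with the pseudo-regular-number approach the abstract proposes, the classical variable-length coding it refers to can be sketched as follows (an illustrative Huffman coder, not the authors' method): frequent symbols receive short codes and rare symbols long ones.

```python
# Illustrative sketch of classical variable-length (Huffman) coding:
# short codes for frequent symbols, long codes for rare ones.
import heapq
from collections import Counter

def huffman_codes(data):
    """Build a prefix-free code table from symbol frequencies."""
    freq = Counter(data)
    if len(freq) == 1:                    # degenerate single-symbol case
        return {next(iter(freq)): "0"}
    # Heap entries: [frequency, tie-breaker, partial code table]
    heap = [[f, i, {s: ""}] for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        fa, _, ca = heapq.heappop(heap)   # two least-frequent subtrees
        fb, _, cb = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in ca.items()}
        merged.update({s: "1" + c for s, c in cb.items()})
        heapq.heappush(heap, [fa + fb, counter, merged])
        counter += 1
    return heap[0][2]

text = "aaaabbc"
codes = huffman_codes(text)
encoded = "".join(codes[s] for s in text)
# 'a' (most frequent) gets a 1-bit code; 'b' and 'c' get 2-bit codes,
# so the 7 symbols compress to 4*1 + 2*2 + 1*2 = 10 bits.
print(len(encoded))  # 10
```

As the abstract notes, the compression ratio of such statistical codes is bounded by symbol entropy, which is the limitation the proposed pseudo-regular representation aims to improve on.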
Title: Emerging Technologies and Security Threats in South Africa
Authors: Phillip Nyoni; Mthulisi Velempini; Nehemiah Mavetera
Affiliation: Phillip Nyoni: North-West University, Mafikeng, South Africa; Mthulisi Velempini: University of Limpopo, Polokwane, South Africa; Nehemiah Mavetera: North-West University, Mafikeng, South Africa
Abstract: New technologies challenge privacy by making users more visible on the Internet, because they track users’ activities and collect sensitive data about them. Previous studies have highlighted the impact of users trading their data for new services, but there has been little research on emerging technologies in developing countries. The purpose of this study was to identify and describe the risks that emerging technologies pose to users in South Africa. An online survey was carried out using a sample of 101 participants (n=101) to collect data on the usage patterns of users of new technologies. Interviews with seven experts were also conducted to establish the types of threats users face. Findings show that there are multiple privacy risks, such as mobile devices being infected by malware and data breaches leading to the loss of personal data. Risks such as lack of control over data, data breaches, and identity theft are real and affect users on a regular basis. These findings are valuable to end-users and to developers of smart devices who are interested in protecting privacy by identifying threats and dealing with them.
Keywords: emerging technologies; cybersecurity; security threats; privacy; mobile devices; cloud computing; internet-of-things; social construction of technology