Selected Papers from MICSECS 2019

A special issue of Computers (ISSN 2073-431X).

Deadline for manuscript submissions: closed (1 May 2020) | Viewed by 26230

Special Issue Editors


Guest Editor
School of Computer Technologies and Control, ITMO University, Kronverkskiy Prospekt, 49, St. Petersburg, Russia
Interests: FPGA; VLSI; embedded systems; cyber-physical systems; SoC

Guest Editor
Faculty of Software Engineering and Computer Systems, ITMO University, 197101 St. Petersburg, Russia
Interests: intelligent systems; linked data; semantic web

Guest Editor
Faculty of Software Engineering and Computer Systems, ITMO University, 197101 St. Petersburg, Russia
Interests: FPGA; SoC; embedded systems; computer vision; neural network hardware accelerators

Special Issue Information

Dear Colleagues,

The Majorov International Conference on Software Engineering and Computer Systems (MICSECS 2019) will be held at ITMO University, Saint Petersburg, Russia, on 12–13 December 2019. MICSECS 2019 is an international event dedicated to discussing research results and directions in areas related to software and hardware engineering, networks, cyber-physical systems, computer security, multimedia technologies, and AR/VR/MR technologies. For more information, see http://micsecs.org

The authors of selected high-quality full papers will be invited after the conference to submit revised and extended versions of their originally accepted conference papers to this Special Issue of Computers, published by MDPI in open access. Papers will be selected on the basis of their ratings in the conference review process, the quality of presentation at the conference, and their expected impact on the research community. Each submission to this Special Issue should contain at least 50% new material, e.g., technical extensions, more in-depth evaluations, or additional use cases, together with a changed title, abstract, and keywords. These extended submissions will undergo peer review according to the journal's rules. At least two technical committee members will act as reviewers for each extended article submitted to this Special Issue; if needed, additional external reviewers will be invited to guarantee a high-quality reviewing process.

Dr. Pavel Kustarev
Dr. Dmitry Mouromtsev
Dr. Sergei Bykovskii
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Computers is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (7 papers)


Research

14 pages, 2499 KiB  
Article
Formation of Unique Characteristics of Hiding and Encoding of Data Blocks Based on the Fragmented Identifier of Information Processed by Cellular Automata
by Elena Kuleshova, Anatoly Marukhlenko, Vyacheslav Dobritsa and Maxim Tanygin
Computers 2020, 9(2), 51; https://doi.org/10.3390/computers9020051 - 19 Jun 2020
Cited by 5 | Viewed by 3380
Abstract
Currently, the following applications of the theory of cellular automata are known: symmetric encryption, data compression, digital image processing, and some others. There are also studies suggesting the possibility of building a public-key system based on cellular automata, but this problem has not yet been solved. The purpose of the study is to develop an algorithm for hiding and encoding data blocks based on a fragmented identifier of the information processed by cellular automata, at the scale of binary data streams, using an original method that includes a public parameter in the conversion key. A mathematical model of the formation of unique data characteristics is considered, based on the use of patterns that determine the individual neighborhood of elements in cell encryption. A multi-threaded computing scheme has been developed for processing confidential data using the single-key method with a public parameter, based on cellular automata and data segmentation. To study individual chains in data blocks, a software module has been developed that allows one to evaluate the uniformity of information distribution during encryption. A variant of estimating the distribution of bits is proposed that indirectly reflects the cryptographic strength of the method. Based on the developed theoretical principles, a software module is synthesized that implements a transformation rule taking into account the individual neighborhood of the processed element on the basis of a cellular automaton. Experimental studies have shown that this modification increased the speed of the method by up to 13 percent, owing to segmentation and the possibility of parallel processing of the original matrix, and also increased cryptographic strength through the use of a unique chain of pseudo-random neighborhoods (hereinafter referred to as PRN) defined by the transformation key. At the same time, it was possible to maintain uniformity of distribution of the output chain at the bit level and to ensure that the number of inversions fell within the confidence interval. Full article
(This article belongs to the Special Issue Selected Papers from MICSECS 2019)
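The abstract describes single-key encryption built on cellular automata with data segmentation. As a minimal, hypothetical sketch of the general idea (not the authors' method), the following derives a keystream from a key-dependent seed with an elementary cellular automaton (rule 30) and XORs it with the data; all parameter choices are illustrative assumptions:

```python
def ca_step(state, rule=30):
    """One synchronous update of an elementary cellular automaton
    with periodic boundary conditions."""
    n = len(state)
    return [(rule >> ((state[i - 1] << 2) | (state[i] << 1) | state[(i + 1) % n])) & 1
            for i in range(n)]

def keystream(seed_bits, nbytes, warmup=32):
    """Collect the automaton's centre cell over 8 generations per byte."""
    state = list(seed_bits)
    centre = len(state) // 2
    for _ in range(warmup):          # let the automaton mix the seed
        state = ca_step(state)
    out = bytearray()
    for _ in range(nbytes):
        byte = 0
        for _ in range(8):
            state = ca_step(state)
            byte = (byte << 1) | state[centre]
        out.append(byte)
    return bytes(out)

def xor_crypt(data, seed_bits):
    """XOR the data with the CA keystream; the same call decrypts."""
    return bytes(b ^ k for b, k in zip(data, keystream(seed_bits, len(data))))
```

Because the transformation is an XOR with a key-derived stream, applying `xor_crypt` twice with the same seed restores the plaintext; segmentation as described in the abstract would split the input into independently keyed, parallel-processed blocks.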

13 pages, 2231 KiB  
Article
Risk Reduction Optimization of Process Systems under Cost Constraint Applying Instrumented Safety Measures
by Aleksandr Moshnikov and Vladimir Bogatyrev
Computers 2020, 9(2), 50; https://doi.org/10.3390/computers9020050 - 19 Jun 2020
Cited by 2 | Viewed by 2974
Abstract
This article is devoted to an approach to developing a process safety system according to functional safety standards. As technologies develop and the specific energy stored in equipment increases, the issue of safety during operation becomes more urgent. Adequate decisions on safety measures during the early stages of planning facilities and processes help avoid technological incidents and the corresponding losses. A risk-based approach to safety system design is proposed. The approach rests on a methodology for identifying and assessing risks and then developing the set of safety measures needed to achieve the specified safety indicators. A classification of safety measures is given, and a model of risk reduction based on deterministic analysis of the process is considered. It is shown that the task of choosing the composition of safety measures can be represented as the knapsack discrete optimization problem, and the solution is based on the Monte Carlo method. A numerical example is provided to illustrate the approach. The example contains a description of failure conditions, an analysis of the types and consequences of failures that could lead to accidents, and a list of safety measures. Solving the optimization problem used real reliability parameters and equipment costs. Based on the simulation results, the optimal composition of safety measures minimizing cost is given. This research is relevant to engineering departments that specialize in planning and designing technological solutions. Full article
(This article belongs to the Special Issue Selected Papers from MICSECS 2019)
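The abstract casts the choice of safety measures as a knapsack problem under a cost constraint, solved by the Monte Carlo method. A minimal sketch of that formulation, with entirely hypothetical measures, costs, and risk-reduction factors:

```python
import random

# hypothetical data: (measure, cost, residual-risk multiplier)
MEASURES = [
    ("pressure relief valve", 40, 0.50),
    ("redundant sensor",      25, 0.70),
    ("emergency shutdown",    60, 0.30),
    ("operator alarm",        10, 0.85),
]
BASE_RISK = 1e-3   # accident frequency with no measures, 1/year (assumed)
BUDGET = 80        # cost constraint (assumed)

def residual_risk(subset):
    """Each applied measure multiplies the base risk by its factor."""
    risk = BASE_RISK
    for _, _, factor in subset:
        risk *= factor
    return risk

def monte_carlo_select(trials=20000, seed=1):
    """Randomly sample subsets of measures; keep the feasible one
    (total cost within budget) with the lowest residual risk."""
    rng = random.Random(seed)
    best_subset, best_risk = [], BASE_RISK
    for _ in range(trials):
        subset = [m for m in MEASURES if rng.random() < 0.5]
        if sum(cost for _, cost, _ in subset) <= BUDGET:
            risk = residual_risk(subset)
            if risk < best_risk:
                best_subset, best_risk = subset, risk
    return best_subset, best_risk
```

With four measures, the 2^4 subsets could be enumerated exactly; random sampling is shown only to mirror the abstract's method, which remains usable when the measure set is too large for enumeration.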

13 pages, 628 KiB  
Article
The Architecture of the Access Protocols of the Global Infocommunication Resources
by Natalya Verzun, Mikhail Kolbanev and Alexey Shamin
Computers 2020, 9(2), 49; https://doi.org/10.3390/computers9020049 - 9 Jun 2020
Cited by 4 | Viewed by 3153
Abstract
One of the important functions of cyberspace is to provide people and devices with access to global infocommunication resources, and as the network infrastructure develops, the number of access options increases, including those based on wireless technologies. The wide variety of access technologies leads to the formation of heterogeneous broadcast networks. Following the Always Best Connected concept and striving for rational use of access network resources, developers today use vertical handover procedures. This approach assumes a selection criterion that makes it possible to prefer one network over the others that are available and able to provide the required connection and services, and a selection procedure that computes the access characteristics of each acceptable option. When implementing a vertical handover, it should be taken into account that the rational choice depends on the moment in time and the point in space at which the terminal device issued a request to establish a connection. The corresponding procedures can be implemented with a decentralized or a centralized architecture. In the first case, the choice is made by the hardware and software of the terminal devices. The disadvantage of this implementation is the complexity, and consequently the higher price, of terminal devices, each of which must reserve performance and memory commensurate with the complexity of the selection procedure. Another negative consequence of the decentralized approach is lower utilization of the last-mile network due to the inability to make complex decisions. The article discusses a centralized architecture for the protocols of access to global infocommunication resources. Under this architecture, the access network is selected by a new centralized network device not previously used in communication networks. The protocols that this network element implements should be located between the first (physical) and second (data link) layers of the Open Systems Interconnection model. The purpose of the study is to develop an effective architectural solution for access networks and to create a mathematical model for evaluating the efficiency of last-mile resource usage and the quality of user service. The object of research is architectural solutions for last-mile networks. The subject of research is teletraffic theory models that make it possible to evaluate the qualitative characteristics of the corresponding process. To achieve this goal, the following tasks were solved in the article: analysis of known approaches to selecting one of several available access networks; development of a centralized architecture that changes the basic model of interaction between open systems; description of the metadata exchange scenario between network elements of the new architecture; development of a mathematical model of the data transmission process in the radio access network; and numerical estimation of the probabilistic and temporal characteristics of the proposed procedures. Full article
(This article belongs to the Special Issue Selected Papers from MICSECS 2019)
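The selection criterion the abstract refers to, preferring one network among those currently available, is commonly realized as a weighted score over network attributes. A minimal, hypothetical sketch (the attribute names, values, and weights below are illustrative, not taken from the paper):

```python
def score(net, w):
    """Higher is better: reward bandwidth, penalize latency and load."""
    return (w["bandwidth"] * net["bandwidth_mbps"]
            - w["latency"] * net["latency_ms"]
            - w["load"] * net["load"])

def select_network(candidates, weights):
    """Centralized choice: pick the highest-scoring reachable network."""
    return max(candidates, key=lambda net: score(net, weights))

networks = [
    {"name": "wifi", "bandwidth_mbps": 100, "latency_ms": 20, "load": 0.8},
    {"name": "lte",  "bandwidth_mbps": 50,  "latency_ms": 40, "load": 0.3},
]
weights = {"bandwidth": 1.0, "latency": 0.5, "load": 20.0}
best = select_network(networks, weights)
```

In a centralized architecture such as the one the abstract proposes, this computation moves from the terminal device to a network element, so the terminals need no per-device performance and memory reserve for it.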

26 pages, 727 KiB  
Article
Model Based Approach to Cyber–Physical Systems Status Monitoring
by Alexander Vodyaho, Saddam Abbas, Nataly Zhukova and Michael Chervoncev
Computers 2020, 9(2), 47; https://doi.org/10.3390/computers9020047 - 7 Jun 2020
Cited by 3 | Viewed by 4137
Abstract
The distinctive feature of new-generation information systems is not only their complexity in terms of the number of elements, the number of connections, and the hierarchy levels, but also their constantly changing structure and behavior. In this situation, obtaining up-to-date information about the current status of an observed complex Cyber-Physical System (CPS) becomes a rather difficult task. This information is needed by stakeholders for keeping the system operational, improving its efficiency, ensuring security, etc. Known approaches to determining the actual status of complex distributed CPSs are not sufficiently effective. The authors propose a model-based approach to monitoring the status of complex CPSs. There are a number of known model-based approaches to monitoring complex distributed CPSs, but their main difference from the suggested one is that they mostly use static models that must be built manually by experts; this takes considerable human effort and often results in errors. Our idea is to use automata models of the structure and behavior of the observed system, with both models built and kept up to date automatically on the basis of log file information. The proposed approach rests, on the one hand, on the results of the authors' research in the field of automatic synthesis of multi-level automata models of observed systems and, on the other hand, on well-known process mining algorithms. The paper describes typical monitoring tasks and presents generalized algorithms for solving them using the proposed system of models. An example of a real-life system based on the suggested approach is given. The approach can be recommended for building CPSs of medium and high complexity, characterized by high structural dynamics and cognitive behavior. Full article
(This article belongs to the Special Issue Selected Papers from MICSECS 2019)
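The abstract builds behavioral models automatically from log files using process mining. The first step of most process-mining algorithms is extracting the directly-follows relation from event traces; a minimal sketch of that step (the event names are hypothetical, and the paper's multi-level automata synthesis goes well beyond this):

```python
from collections import defaultdict

def directly_follows(traces):
    """Build a transition structure: event a -> set of events observed
    immediately after a in at least one trace."""
    graph = defaultdict(set)
    for trace in traces:
        for a, b in zip(trace, trace[1:]):
            graph[a].add(b)
    return graph

# hypothetical log traces of an observed device
logs = [
    ["start", "heat", "cool", "stop"],
    ["start", "cool", "stop"],
]
model = directly_follows(logs)
```

A full behavioral automaton would additionally merge states and count transition frequencies; the point here is only that the model is derived from logs rather than drawn by an expert.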

10 pages, 409 KiB  
Article
Evaluation of a Cyber-Physical Computing System with Migration of Virtual Machines during Continuous Computing
by Vladimir Bogatyrev and Aleksey Derkach
Computers 2020, 9(2), 42; https://doi.org/10.3390/computers9020042 - 23 May 2020
Cited by 22 | Viewed by 3729
Abstract
A Markov model of the reliability of a failover cluster performing calculations in a cyber-physical system is considered. The continuity of the cluster computing process in the event of a failure of the servers' physical resources is provided by virtualization technology and is associated with the migration of virtual machines. The distinctive feature of the proposed model is that it considers restrictions on the allowable interruption time of the computational process during cluster recovery. This limitation is due to the fact that if two physical servers fail, object management is lost, which is unacceptable. A failure occurs if the recovery time exceeds the maximum allowable interruption time of the computing process. The modes of cluster operation with and without system recovery after failures of part of the system resources that do not interrupt the computing process are considered. The results of the article make it possible to assess the probability that the cluster remains operable while supporting the continuity of computations, and its running time to a failure that interrupts the computational (control) process beyond the maximum permissible time. A calculation example for the presented models showed that the mean time to failure with recovery, under conditions of supporting the continuity of the computing process, increases by more than two orders of magnitude. Full article
(This article belongs to the Special Issue Selected Papers from MICSECS 2019)
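The qualitative result the abstract reports, mean time to failure growing by more than two orders of magnitude when recovery is allowed, can be reproduced on the simplest absorbing Markov chain for a two-server cluster. The rates below are hypothetical, and this textbook sketch omits the paper's constraint on allowable interruption time:

```python
def mttf_two_servers(lam, mu):
    """Mean time to failure of a 2-server failover cluster.
    States: both up -> one up (repair at rate mu) -> both down (absorbing).
    Closed form obtained from the absorbing-chain equations
        T2 = 1/(2*lam) + T1
        T1 = 1/(lam + mu) + (mu / (lam + mu)) * T2
    where lam is the per-server failure rate."""
    return (lam + mu) / (2 * lam * lam) + 1 / lam

lam = 1e-4   # per-server failure rate, 1/h (hypothetical)
mu = 0.5     # repair rate, 1/h (hypothetical)
gain = mttf_two_servers(lam, mu) / mttf_two_servers(lam, 0.0)
```

Setting `mu = 0` recovers the no-repair case (mean time 3/(2*lam)); with a realistic repair rate, the ratio `gain` is in the thousands, consistent with the "more than two orders of magnitude" the abstract reports for its richer model.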

18 pages, 11324 KiB  
Article
Inertial Sensor Based Solution for Finger Motion Tracking
by Stepan Lemak, Viktor Chertopolokhov, Ivan Uvarov, Anna Kruchinina, Margarita Belousova, Leonid Borodkin and Maxim Mironenko
Computers 2020, 9(2), 40; https://doi.org/10.3390/computers9020040 - 12 May 2020
Cited by 5 | Viewed by 4324
Abstract
Hand motion tracking plays an important role in virtual reality systems for immersion and interaction purposes. This paper discusses the problem of finger tracking and proposes the application of an extension of the Madgwick filter and, for comparison, a simple switching (motion recognition) algorithm. The proposed algorithms utilize a three-link finger model and provide complete information about the position and orientation of the metacarpus. The numerical experiment shows that this approach is feasible and overcomes some of the major limitations of inertial motion tracking. The proposed solution was created to track a user's pointing and grasping movements during interaction with a virtual reconstruction of the cultural heritage of historical cities. Full article
(This article belongs to the Special Issue Selected Papers from MICSECS 2019)
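The abstract relies on a three-link finger model. Orientation filtering aside, the model's geometry is plain forward kinematics: given per-joint flexion angles (as estimated from the inertial sensors), the fingertip position follows by accumulating link rotations. A planar sketch with hypothetical link lengths, not the paper's filter:

```python
import math

def fingertip(lengths, angles):
    """Planar three-link chain: each angle is the joint flexion relative
    to the previous link; returns the fingertip position (x, y)."""
    x = y = theta = 0.0
    for link, angle in zip(lengths, angles):
        theta += angle                  # absolute orientation of this link
        x += link * math.cos(theta)
        y += link * math.sin(theta)
    return x, y

# hypothetical phalanx lengths in cm: proximal, middle, distal
straight = fingertip([4.0, 2.5, 2.0], [0.0, 0.0, 0.0])
curled = fingertip([4.0, 2.5, 2.0], [math.pi / 6, math.pi / 4, math.pi / 4])
```

An inertial tracker estimates the per-link orientations (here, the accumulated `theta`) from gyroscope and accelerometer data; the kinematic chain then turns those orientations into the positional information the abstract mentions.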

16 pages, 4749 KiB  
Article
Eliminating Nonuniform Smearing and Suppressing the Gibbs Effect on Reconstructed Images
by Valery Sizikov, Aleksandra Dovgan and Aleksei Lavrov
Computers 2020, 9(2), 30; https://doi.org/10.3390/computers9020030 - 15 Apr 2020
Cited by 1 | Viewed by 3765
Abstract
In this work, the problem of eliminating nonuniform rectilinear smearing of an image is considered using a mathematical and computer-based approach. An example of such a problem is a picture of several cars moving at different speeds, taken with a fixed camera. In the case of uniform smearing, the problem is described by a set of one-dimensional Fredholm integral equations (IEs) of the first kind of convolution type with a one-dimensional point spread function (PSF); in the case of nonuniform smearing, by a set of new one-dimensional IEs of a general type (i.e., not of convolution type) with a two-dimensional PSF. The problem is also described by a two-dimensional IE of convolution type with a two-dimensional PSF for uniform smearing, and by a new two-dimensional IE of a general type with a four-dimensional PSF for nonuniform smearing. The problem of solving a Fredholm IE of the first kind is ill-posed (i.e., unstable). Therefore, IEs of convolution type are solved by the Fourier transform (FT) method with Tikhonov regularization (TR), and IEs of general type by the quadrature/cubature and TR methods. Moreover, the magnitude of the image smear, Δ, is determined by an original “spectral method”, which increases the accuracy of image restoration. It is shown that a set of one-dimensional IEs is preferable to a single two-dimensional IE in the case of nonuniform smearing. In the inverse problem (i.e., image restoration), the Gibbs effect (the effect of false waves) may occur in the image, as an edge effect or an inner effect. The edge effect is well suppressed by the proposed technique of “diffusing the edges”. The inner effect is difficult to eliminate, but the image smearing itself plays the role of diffusion and suppresses the inner Gibbs effect to a large extent. It is shown that, in the presence of impulse noise in an image, the well-known Tukey median filter can distort the image itself, and the Gonzalez adaptive filter also distorts the image, though to a lesser extent. We propose a modified adaptive filter. A software package was developed in MATLAB, and illustrative calculations are performed. Full article
(This article belongs to the Special Issue Selected Papers from MICSECS 2019)
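The regularization step the abstract relies on can be illustrated on a tiny discrete version of the uniform-smear problem: a circulant smearing matrix A, observed data b = Ax, and the Tikhonov solution x = (AᵀA + αI)⁻¹Aᵀb. This is a generic sketch of TR on a toy system under assumed sizes and α, not the paper's FT-based or quadrature solvers:

```python
def smear_matrix(n, delta):
    """Circulant model of uniform smear: each observed pixel averages
    `delta` consecutive true pixels (periodic boundary)."""
    A = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for k in range(delta):
            A[i][(i + k) % n] = 1.0 / delta
    return A

def tikhonov_solve(A, b, alpha):
    """Solve (A^T A + alpha I) x = A^T b by Gaussian elimination."""
    n = len(A)
    M = [[sum(A[k][i] * A[k][j] for k in range(n)) + (alpha if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    rhs = [sum(A[k][i] * b[k] for k in range(n)) for i in range(n)]
    for col in range(n):                       # forward elimination, partial pivoting
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n):
                M[r][c] -= f * M[col][c]
            rhs[r] -= f * rhs[col]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):             # back substitution
        x[i] = (rhs[i] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x
```

The regularization parameter α trades noise amplification against bias: a tiny α nearly inverts the smear exactly on clean data, while noisy data need a larger α, which is where the choice of α becomes the crux of TR.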
