Symmetry 2015, 7(3), 1333-1351; doi:10.3390/sym7031333 - published 20 July 2015
Abstract: The role of symmetry in computer vision has waxed and waned in importance during the evolution of the field from its earliest days. At first figuring prominently in support of bottom-up indexing, it fell out of favour as shape gave way to appearance and recognition gave way to detection. With a strong prior in the form of a target object, the role of the weaker priors offered by perceptual grouping was greatly diminished. However, as the field returns to the problem of recognition from a large database, the bottom-up recovery of the parts that make up the objects in a cluttered scene is critical for their recognition. The medial axis community has long exploited the ubiquitous regularity of symmetry as a basis for the decomposition of a closed contour into medial parts. However, today’s recognition systems are faced with cluttered scenes, and the assumption that a closed contour exists, i.e., that figure-ground segmentation has been solved, renders much of the medial axis community’s work inapplicable. In this article, we review a computational framework, previously reported in [1–3], that bridges the representational power of the medial axis and the need to recover and group an object’s parts in a cluttered scene. Our framework is rooted in the idea that a maximally-inscribed disc, the building block of a medial axis, can be modelled as a compact superpixel in the image. We evaluate the method on images of cluttered scenes.
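The building block mentioned above, the maximally-inscribed disc whose centres trace out the medial axis, can be computed directly for a binary figure. A minimal sketch, using scikit-image's medial_axis on a hypothetical binary shape (this illustrates the classical transform only, not the authors' superpixel-based framework):

import numpy as np
from skimage.morphology import medial_axis

# Hypothetical binary figure: a filled rectangle with a notch removed.
figure = np.zeros((64, 64), dtype=bool)
figure[8:56, 16:48] = True
figure[28:40, 16:28] = False

# skeleton marks the centres of maximally-inscribed discs;
# distance is the distance transform, giving each disc's radius.
skeleton, distance = medial_axis(figure, return_distance=True)
disc_radii = distance * skeleton
print("axis points:", int(skeleton.sum()),
      "largest inscribed disc radius:", disc_radii.max())

The distance transform supplies the radius of the maximal inscribed disc at every axis point; it is this quantity that the reviewed framework approximates with compact superpixels when no closed contour, and hence no binary figure, is available.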
Symmetry 2015, 7(3), 1289-1332; doi:10.3390/sym7031289 - published 20 July 2015
Abstract: We suggest a diagrammatic model of computation based on an axiom of distributivity. A diagram of a decorated colored tangle, similar to those that appear in low-dimensional topology, plays the role of a circuit diagram. Equivalent diagrams represent bisimilar computations. We prove that our model of computation is Turing complete and that, with bounded resources, it can decide any language in the complexity class IP, sometimes with better performance parameters than the corresponding classical protocols.
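The abstract does not spell out the distributivity axiom. A minimal sketch, assuming it is the right self-distributive law familiar from colourings of tangle diagrams in low-dimensional topology, checks that law for the dihedral quandle operation x ▷ y = 2y − x (mod n), an illustrative choice of colouring operation only:

from itertools import product

n = 7                                  # any modulus gives a dihedral quandle
op = lambda x, y: (2 * y - x) % n      # assumed colouring operation x ▷ y

# Right self-distributivity: (x ▷ y) ▷ z == (x ▷ z) ▷ (y ▷ z)
assert all(op(op(x, y), z) == op(op(x, z), op(y, z))
           for x, y, z in product(range(n), repeat=3))
print("self-distributive for n =", n)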
Symmetry 2015, 7(3), 1275-1288; doi:10.3390/sym7031275 - published 16 July 2015
Abstract: Big data refers to information-processing technology for extracting valuable information through the use and analysis of large-scale data and, based on that data, deriving plans for response or predicting changes. With the development of software and devices for next-generation sequencing, a vast amount of bioinformatics data has been generated recently. Big-data technology based on bioinformatics data is also emerging rapidly as a core technology for bioinformaticians, biologists and big-data scientists. The KEGG pathway database is a bioinformatics resource for understanding the high-level functions and utilities of biological systems. However, KEGG pathway analysis requires a great deal of time and effort because the pathways are numerous and very diverse. In this paper, we propose a network analysis and visualization system that crawls KEGG pathways of interest to the user, constructs a pathway network based on the hierarchical structure of the pathways and visualizes the relations and interactions of the pathways by clustering and selecting core pathways from the network. Finally, we construct a pathway network starting from an Alzheimer’s disease pathway and show the results of clustering and selecting core pathways from that network.
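The abstract does not specify the crawling or core-selection algorithms. A minimal sketch, assuming the public KEGG REST API, shared genes as the linkage criterion, the Alzheimer disease pathway hsa05010 as the seed and NetworkX weighted degree as a stand-in for "core" pathway selection:

import requests
import networkx as nx

KEGG = "https://rest.kegg.jp"

def genes_of(pathway_id):
    # Human genes linked to one KEGG pathway, e.g. "hsa05010".
    text = requests.get(f"{KEGG}/link/hsa/{pathway_id}", timeout=30).text
    return {line.split("\t")[1] for line in text.strip().splitlines() if "\t" in line}

seed = "hsa05010"                       # Alzheimer disease pathway (assumed seed)
seed_genes = genes_of(seed)

# A small set of candidate human pathways (first 30, only to keep the sketch fast).
listing = requests.get(f"{KEGG}/list/pathway/hsa", timeout=30).text
candidates = [line.split("\t")[0].replace("path:", "")
              for line in listing.strip().splitlines()][:30]

# Edge = "shares at least one gene with the seed pathway", weighted by the overlap.
G = nx.Graph()
for pid in candidates:
    overlap = len(seed_genes & genes_of(pid))
    if pid != seed and overlap:
        G.add_edge(seed, pid, weight=overlap)

# Rank pathways by weighted degree as a simple proxy for the clustering and
# core-pathway selection described in the abstract.
core = sorted(G.degree(weight="weight"), key=lambda kv: -kv[1])[:5]
print(core)

Weighted degree is only one possible notion of a core pathway; the paper's system builds its network from the pathway hierarchy and applies clustering, which this sketch does not reproduce.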
Symmetry 2015, 7(3), 1211-1260; doi:10.3390/sym7031211 - published 14 July 2015
Abstract: The similarity patterns of the genetic code result from similar codons encoding similar messages. We develop a new mathematical model to analyze these patterns. The physicochemical characteristics of amino acids objectively quantify their differences and similarities; the Hamming metric does the same for the 64 codons of the codon set. (Hamming distances equal the number of different codon positions: AAA and AAC are at 1-distance; codons are maximally at 3-distance.) The CodonPolytope, a 9-dimensional geometric object, is spanned by 64 vertices that represent the codons, and the Euclidean distances between these vertices correspond one-to-one with intercodon Hamming distances. The CodonGraph represents the vertices and edges of the polytope; each edge equals a Hamming 1-distance. The mirror reflection symmetry group of the polytope is isomorphic to the largest permutation symmetry group of the codon set that preserves Hamming distances. These groups contain 82,944 symmetries. Many polytope symmetries coincide with the degeneracy and similarity patterns of the genetic code. These code symmetries are strongly related to the face structure of the polytope, with smaller faces displaying stronger code symmetries. Splitting the polytope stepwise into smaller faces models an early evolution of the code that generates this hierarchy of code symmetries. The canonical code represents a class of 41,472 codes with equivalent symmetries; it is a single class among an astronomical number of symmetry classes comprising all possible codes.
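The Hamming-distance facts stated above are easy to verify directly. A short check (not taken from the paper) that pairwise codon distances never exceed 3 and that every codon has exactly 9 neighbours at distance 1 (3 positions × 3 alternative bases), which are the edges of the CodonGraph:

from itertools import product

codons = ["".join(c) for c in product("ACGU", repeat=3)]

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

assert len(codons) == 64
assert hamming("AAA", "AAC") == 1                # example from the abstract
assert all(hamming(a, b) <= 3 for a in codons for b in codons)
assert all(sum(hamming(c, d) == 1 for d in codons) == 9 for c in codons)
print("edges in the CodonGraph:", 64 * 9 // 2)   # 288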
Symmetry 2015, 7(3), 1176-1210; doi:10.3390/sym7031176 - published 2 July 2015
Abstract: Information technology (IT) security has become a major concern due to the growing demand for information and the massive development of client/server applications for various purposes running on modern IT infrastructure. How has security been taken into account, and which paradigms are necessary to minimize security issues while increasing efficiency, reducing the influence on transmissions, ensuring protocol independence and achieving substantial performance? We have found cryptography to be an absolute security mechanism for client/server architectures, and in this study, a new security design was developed for the MODBUS protocol, which is considered to offer strong performance for future development and enhancement of real IT infrastructure. This study is also considered a complete development because security is tested across almost all modes of MODBUS communication. The computed measurements are evaluated to validate the overall development, and the results indicate a substantial improvement in security over conventional methods.
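The abstract does not name the cryptographic primitives used. A minimal sketch, not the scheme proposed in the paper, of wrapping a simplified MODBUS request in authenticated encryption with AES-GCM from Python's cryptography package; the key handling, framing and the example request bytes are illustrative assumptions only:

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=128)   # shared key; distribution out of scope here
aesgcm = AESGCM(key)

# Simplified MODBUS request: unit id 0x01, function 0x03 (read holding registers),
# starting address 0x0000, quantity 0x0002.
request = bytes([0x01, 0x03, 0x00, 0x00, 0x00, 0x02])

nonce = os.urandom(12)                      # must be unique per message
ciphertext = aesgcm.encrypt(nonce, request, None)
wire_frame = nonce + ciphertext             # what would travel over the link

recovered = aesgcm.decrypt(wire_frame[:12], wire_frame[12:], None)
assert recovered == request

AES-GCM is chosen here only because it provides both confidentiality and integrity in one primitive; the paper's actual design and its performance claims are evaluated against the full range of MODBUS communication modes, which this sketch does not cover.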