Algorithms, Volume 2, Issue 2 (June 2009) – 12 articles, Pages 623-878

Article
Bayesian Maximum Entropy Based Algorithm for Digital X-ray Mammogram Processing
by Radu Mutihac
Algorithms 2009, 2(2), 850-878; https://doi.org/10.3390/a2020850 - 09 Jun 2009
Cited by 23 | Viewed by 8324
Abstract
Basics of Bayesian statistics in inverse problems using the maximum entropy principle are summarized in connection with the restoration of positive, additive images from various types of data such as digital X-ray mammograms. An efficient iterative algorithm for image restoration from large data sets, based on the conjugate gradient method and Lagrange multipliers in nonlinear optimization of a specific potential function, was developed. The point spread function of the imaging system was determined by numerical simulations of inhomogeneous breast-like tissue with microcalcification inclusions of various opacities. The processed digital and digitized mammograms proved superior to their raw counterparts in terms of contrast, resolution, noise, and visibility of details. Full article
(This article belongs to the Special Issue Algorithms and Molecular Sciences)
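The entropy-regularized restoration idea summarized above can be sketched in a few lines. This is a toy 1-D illustration under invented parameters (blur width, noise level, default model m), not the author's conjugate-gradient implementation: it simply descends the potential while keeping the image positive.

```python
import numpy as np

# Toy MaxEnt restoration sketch: recover a positive 1-D "image" x from
# blurred, noisy data y by descending the potential
#   Phi(x) = ||A x - y||^2 / (2 s^2) - lam * S(x),
# with entropy S(x) = -sum(x * log(x / m)), using bounded, positivity-
# preserving gradient steps.
rng = np.random.default_rng(0)

n = 64
true = np.full(n, 0.5)                      # positive background
true[20:24] += 5.0                          # bright microcalcification-like feature

idx = np.arange(n)
A = np.exp(-0.5 * ((idx[:, None] - idx[None, :]) / 2.0) ** 2)
A /= A.sum(axis=1, keepdims=True)           # Gaussian blur operator

s = 0.05                                    # noise level
y = A @ true + s * rng.normal(size=n)

m, lam = 0.5, 1e-3                          # default model and entropy weight
x = np.full(n, m)                           # start from the default model
start_misfit = np.linalg.norm(A @ x - y)
for _ in range(500):
    grad = A.T @ (A @ x - y) / s**2 + lam * np.log(x / m)
    # bounded step, clipped to keep the image strictly positive
    x = np.clip(x - 0.5 * grad / (1.0 + np.abs(grad)), 1e-8, None)

final_misfit = np.linalg.norm(A @ x - y)
```

The bounded step stands in for the paper's conjugate-gradient line search; any descent scheme that preserves positivity exhibits the same qualitative behavior.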
Review
Computer-Aided Diagnosis in Mammography Using Content-Based Image Retrieval Approaches: Current Status and Future Perspectives
by Bin Zheng
Algorithms 2009, 2(2), 828-849; https://doi.org/10.3390/a2020828 - 04 Jun 2009
Cited by 56 | Viewed by 13089
Abstract
With the rapid advance of digital imaging technologies, content-based image retrieval (CBIR) has become one of the most active research areas in computer vision. In the last several years, developing computer-aided detection and/or diagnosis (CAD) schemes that use CBIR to search for clinically relevant and visually similar medical images (or regions) depicting suspicious lesions has also been attracting research interest. CBIR-based CAD schemes have the potential to provide radiologists with a “visual aid” and to increase their confidence in accepting CAD-cued results during decision making. CAD performance and reliability depend on a number of factors, including the optimization of lesion segmentation, feature selection, reference database size, computational efficiency, and the relationship between the clinical relevance and visual similarity of the CAD results. By presenting and comparing a number of approaches commonly used in previous studies, this article identifies and discusses the optimal approaches to developing CBIR-based CAD schemes and assessing their performance. Although preliminary studies have suggested that CBIR-based CAD schemes might improve radiologists’ performance and/or increase their confidence in decision making, this technology is still at an early development stage. Much research is needed before CBIR-based CAD schemes can be accepted in clinical practice. Full article
(This article belongs to the Special Issue Machine Learning for Medical Imaging)
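The retrieval step at the core of a CBIR-based CAD scheme reduces to a nearest-neighbor query in feature space. A minimal sketch with made-up features and labels (real schemes use segmented-lesion features and large clinical reference databases):

```python
import numpy as np

# CBIR sketch: given a feature vector computed from a suspicious lesion,
# retrieve the k most visually similar reference lesions by Euclidean
# distance and report the fraction of malignant cases among them, a common
# CAD cue. Feature names and data are invented for the example.
rng = np.random.default_rng(1)

# Reference database: 200 lesions x 5 features (e.g. size, contrast, ...),
# with known labels (1 = malignant, 0 = benign).
reference_features = rng.normal(size=(200, 5))
reference_labels = rng.integers(0, 2, size=200)

def retrieve_similar(query, k=10):
    """Return indices of the k nearest reference lesions and a malignancy
    score computed from their labels."""
    d = np.linalg.norm(reference_features - query, axis=1)
    nearest = np.argsort(d)[:k]
    return nearest, float(reference_labels[nearest].mean())

query = rng.normal(size=5)
nearest, score = retrieve_similar(query, k=10)
```

As the abstract notes, the practical difficulty is not this query but ensuring that distance in feature space actually tracks clinical relevance and visual similarity.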
Article
Failure Assessment of Layered Composites Subject to Impact Loadings: A Finite Element, Sigma-Point Kalman Filter Approach
by Stefano Mariani
Algorithms 2009, 2(2), 808-827; https://doi.org/10.3390/a2020808 - 04 Jun 2009
Cited by 262 | Viewed by 9838
Abstract
We present a coupled finite element, Kalman filter approach to foresee impact-induced delamination of layered composites when mechanical properties are partially unknown. Since direct numerical simulations, which require all the constitutive parameters to be assigned, cannot be run in such cases, an inverse problem is formulated to allow for modeling as well as constitutive uncertainties. Upon space discretization through finite elements and time integration through the explicit α-method, the resulting nonlinear stochastic state model, wherein nonlinearities are due to delamination growth, is attacked with sigma-point Kalman filtering. Comparison with experimental data available in the literature, concerning inter-laminar failure of layered composites subject to low-velocity impacts, shows that the proposed procedure leads to an accurate description of the failure mode and to converged estimates of inter-laminar strength and toughness in good agreement with experimental data. Full article
(This article belongs to the Special Issue Numerical Simulation of Discontinuities in Mechanics)
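The sigma-point machinery the abstract relies on can be illustrated with the plain unscented transform: a Gaussian is represented by 2n+1 deterministically chosen points that reproduce its mean and covariance exactly and are propagated through the nonlinearity. The parameters below are standard textbook choices, not the paper's tuning:

```python
import numpy as np

# Unscented-transform sketch: 2n+1 sigma points carry a Gaussian state
# through a nonlinear map instead of linearizing it, the core operation in
# sigma-point Kalman filtering. State, covariance, and the nonlinearity f
# are invented for illustration.
n = 2
mean = np.array([1.0, 0.5])
cov = np.array([[0.1, 0.02], [0.02, 0.2]])

kappa = 1.0
S = np.linalg.cholesky((n + kappa) * cov)   # matrix square root, columns used below

# Central point plus symmetric pairs along the square-root columns
points = [mean] + [mean + S[:, i] for i in range(n)] + [mean - S[:, i] for i in range(n)]
weights = np.array([kappa / (n + kappa)] + [0.5 / (n + kappa)] * (2 * n))

def f(x):
    return np.array([x[0] ** 2, np.sin(x[1])])   # some nonlinear map

fx = np.array([f(p) for p in points])
f_mean = weights @ fx                            # transformed mean estimate
f_cov = sum(w * np.outer(y - f_mean, y - f_mean) for w, y in zip(weights, fx))
```

By construction, the weighted sigma points reproduce the input mean and covariance exactly; the payoff is that no Jacobian of the delamination model is ever needed.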
Article
Security of the Bennett-Brassard Quantum Key Distribution Protocol against Collective Attacks
by Michel Boyer, Ran Gelles and Tal Mor
Algorithms 2009, 2(2), 790-807; https://doi.org/10.3390/a2020790 - 03 Jun 2009
Cited by 8 | Viewed by 8153
Abstract
The theoretical Quantum Key-Distribution scheme of Bennett and Brassard (BB84) has been proven secure against very strong attacks, including collective attacks and joint attacks. Though the latter are the most general attacks, collective attacks are much easier to analyze, yet they are conjectured to be as informative to the eavesdropper. Thus, collective attacks are likely to be useful in the analysis of many theoretical and practical schemes that still lack a proof of security, including practical BB84 schemes. We show how powerful tools developed in previous works for proving security against the joint attack are simplified when applied to the security of BB84 against collective attacks, while providing the same bounds on leaked information and the same error threshold. Full article
(This article belongs to the Special Issue Algorithms and Molecular Sciences)
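The protocol itself is easy to simulate classically up to the sifting step, which conveys why basis agreement matters; this toy run assumes no eavesdropper and no channel noise:

```python
import random

# Toy BB84 sifting sketch: Alice sends random bits in random bases, Bob
# measures in random bases, and they keep only the positions where the
# bases agree. With matching bases and a noiseless channel, the sifted
# keys are identical; on average half the positions survive.
random.seed(42)
N = 1000

alice_bits  = [random.randint(0, 1) for _ in range(N)]
alice_bases = [random.randint(0, 1) for _ in range(N)]   # 0 = rectilinear, 1 = diagonal
bob_bases   = [random.randint(0, 1) for _ in range(N)]

# Bob's measurement: correct bit if bases match, random outcome otherwise
bob_bits = [b if ab == bb else random.randint(0, 1)
            for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

keep = [i for i in range(N) if alice_bases[i] == bob_bases[i]]
alice_key = [alice_bits[i] for i in keep]
bob_key   = [bob_bits[i] for i in keep]
```

The security analysis the paper addresses concerns what happens between these steps, i.e. how much information an eavesdropper's collective measurement can extract without raising the error rate above threshold.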
Article
SDPhound, a Mutual Information-Based Method to Investigate Specificity-Determining Positions
by Sara Bonella, Walter Rocchia, Pietro Amat, Riccardo Nifosí and Valentina Tozzini
Algorithms 2009, 2(2), 764-789; https://doi.org/10.3390/a2020764 - 26 May 2009
Cited by 124 | Viewed by 8651
Abstract
Considerable importance in molecular biophysics is attached to influencing by mutagenesis the specific properties of a protein family. The working hypothesis is that mutating residues at a few selected positions can affect specificity. Statistical analysis of homologous sequences can identify putative specificity-determining positions (SDPs) and help to shed some light on the peculiarities underlying their functional role. In this work, we present an approach to identify such positions inspired by state-of-the-art mutual information-based SDP prediction methods. The algorithm based on this approach provides a systematic procedure to point at the relevant physical characteristics of putative SDPs and can investigate the effects of correlated mutations. The method is tested on two standard benchmarks in the field and further validated in the context of a biologically interesting problem: the multimerization of the Intrinsically Fluorescent Proteins (IFP). Full article
(This article belongs to the Special Issue Algorithms and Molecular Sciences)
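The mutual-information score underlying such SDP predictors compares two alignment columns via MI(i, j) = Σ p(a,b) log[p(a,b)/(p(a)p(b))]. A minimal sketch on an invented four-sequence alignment (fully conserved columns give MI = 0; the covarying columns here give MI = log 2):

```python
import math
from collections import Counter

# Column-wise mutual information over a toy multiple sequence alignment.
# The alignment is invented: columns 1 and 3 covary (R pairs with D,
# K pairs with E), while columns 0 and 2 are fully conserved.
alignment = [
    "ARNDA",
    "ARNDG",
    "AKNEA",
    "AKNEG",
]

def column(i):
    return [seq[i] for seq in alignment]

def mutual_information(i, j):
    n = len(alignment)
    pi = Counter(column(i))            # residue counts in column i
    pj = Counter(column(j))            # residue counts in column j
    pij = Counter(zip(column(i), column(j)))   # joint counts
    mi = 0.0
    for (a, b), c in pij.items():
        p_ab = c / n
        mi += p_ab * math.log(p_ab * n * n / (pi[a] * pj[b]))
    return mi

mi_conserved = mutual_information(0, 2)   # conserved vs conserved -> 0
mi_varied = mutual_information(1, 3)      # perfectly covarying pair
```

Real SDP methods add corrections for finite sampling and phylogenetic bias on top of this raw score, but the quantity being ranked is the same.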
Article
Probabilistic Upscaling of Material Failure Using Random Field Models – A Preliminary Investigation
by Keqiang Hu and X. Frank Xu
Algorithms 2009, 2(2), 750-763; https://doi.org/10.3390/a2020750 - 30 Apr 2009
Cited by 29 | Viewed by 8494
Abstract
The complexity of failure is reflected in the sensitivity of strength to small defects and the wide scatter of macroscopic behaviors. In engineering practice, spatial information about materials at fine scales is only partially measurable. Random field (RF) models are important to address the uncertainty in spatial distribution. To transform a RF of micro-cracks into a failure probability at the full structural scale, crossing a number of length scales, the operator representing the physics laws needs to be implemented in a multiscale framework and realized in a stochastic setting. Multiscale stochastic modeling of materials is emerging as a new methodology at this research frontier, providing a new multiscale thinking by upscaling fine-scale RFs. In this study, a preliminary framework of probabilistic upscaling is presented for bottom-up hierarchical modeling of failure propagation across micro-meso-macro scales. In the micro-to-meso process, the strength of a stochastic representative volume element (SRVE) is probabilistically assessed by using a lattice model. A mixed Weibull-Gaussian distribution is proposed to characterize the statistical strength of the SRVE, which can be used as input for the subsequent meso-to-macro upscaling process using smeared-crack finite element analysis. Full article
(This article belongs to the Special Issue Numerical Simulation of Discontinuities in Mechanics)
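The mixed Weibull-Gaussian strength model can be sketched as a two-component mixture: with probability w the SRVE strength comes from a Weibull law (defect-dominated weak tail), otherwise from a Gaussian (bulk scatter). All parameter values below are illustrative, not calibrated to the paper:

```python
import numpy as np
from math import gamma

# Two-component mixture sketch of a Weibull-Gaussian SRVE strength model.
rng = np.random.default_rng(3)

w = 0.3                      # weight of the Weibull (weak-tail) component
shape, scale = 2.0, 8.0      # Weibull shape and scale
mu, sd = 20.0, 2.0           # Gaussian mean and standard deviation

n = 100_000
pick = rng.random(n) < w     # which component each sample comes from
strength = np.where(pick,
                    scale * rng.weibull(shape, n),
                    rng.normal(mu, sd, n))

# Mixture mean = w * scale * Gamma(1 + 1/shape) + (1 - w) * mu
expected_mean = w * scale * gamma(1 + 1 / shape) + (1 - w) * mu
```

Samples like these can serve as the random strength input of a meso-to-macro analysis; the Weibull component supplies the weak lower tail that dominates failure probability.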
Article
Application of an Image Tracking Algorithm in Fire Ant Motion Experiment
by Lichuan Gui and John M. Seiner
Algorithms 2009, 2(2), 735-749; https://doi.org/10.3390/a2020735 - 30 Apr 2009
Cited by 29 | Viewed by 7370
Abstract
An image tracking algorithm, originally used with particle image velocimetry (PIV) to determine velocities of buoyant solid particles in water, is modified and applied in the present work to detect the motion of fire ants on a planar surface. A group of fire ant workers is placed at the bottom of a tub and excited with vibration of selected frequency and intensity. The moving fire ants are captured with an imaging system that successively acquires frames of high digital resolution. The background noise in the image recordings is extracted by averaging hundreds of frames and removed from each frame. Individual fire ant images are identified with a recursive digital filter and then tracked between frames according to the size, brightness, shape, and orientation angle of the ant image. The speed of an individual ant is determined from the displacement of its images and the time interval between frames. The trail of each fire ant is determined from the image tracking results, and a statistical analysis is conducted for all the fire ants in the group. The purpose of the experiment is to investigate the response of fire ants to substrate vibration. Test results indicate that the fire ants move faster after being excited, but the number of active ones does not increase even after a strong excitation. Full article
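The background-removal step described above, averaging many frames so that moving spots wash out, can be sketched on synthetic frames standing in for the camera images:

```python
import numpy as np

# Background subtraction sketch: average T frames to estimate the static
# background, subtract it from one frame, and threshold the residual to
# locate the moving bright spot. The synthetic "ant" is a single bright
# pixel placed at a random position in each frame.
rng = np.random.default_rng(4)

H, W, T = 32, 32, 100
background = rng.uniform(0.2, 0.4, size=(H, W))       # static texture

frames = np.empty((T, H, W))
ant_positions = []
for t in range(T):
    frame = background + 0.02 * rng.normal(size=(H, W))   # sensor noise
    r, c = rng.integers(0, H), rng.integers(0, W)
    frame[r, c] += 1.0                                    # one bright moving "ant"
    frames[t] = frame
    ant_positions.append((int(r), int(c)))

estimated_bg = frames.mean(axis=0)        # moving spots average out
residual = frames[0] - estimated_bg
detections = [tuple(p) for p in np.argwhere(residual > 0.5)]
```

With 100 frames, each ant position contributes only 1/100 of its brightness to the background estimate, so the residual isolates the moving spot cleanly.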
Article
ALE-PSO: An Adaptive Swarm Algorithm to Solve Design Problems of Laminates
by Paolo Vannucci
Algorithms 2009, 2(2), 710-734; https://doi.org/10.3390/a2020710 - 21 Apr 2009
Cited by 13 | Viewed by 8586
Abstract
This paper presents an adaptive PSO algorithm whose numerical parameters can be updated following a scheduled protocol respecting some known criteria of convergence, in order to enhance the chances of reaching the global optimum of a hard combinatorial optimization problem, such as those encountered in global optimization problems of composite laminates. Some examples concerning hard design problems are provided, showing the effectiveness of the approach. Full article
(This article belongs to the Special Issue Numerical Simulation of Discontinuities in Mechanics)
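A minimal PSO with a scheduled inertia weight conveys the flavor of the adaptive protocol; the linear 0.9 → 0.4 schedule and the sphere test function are standard textbook choices, not the ALE-PSO rules from the paper:

```python
import numpy as np

# Basic particle swarm optimization with a linearly decreasing inertia
# weight: early iterations favor exploration (large w), later ones favor
# exploitation (small w). Test function and parameters are illustrative.
rng = np.random.default_rng(5)

def sphere(x):
    return float(np.sum(x ** 2))      # global optimum 0 at the origin

n_particles, dim, iters = 30, 4, 200
pos = rng.uniform(-5, 5, (n_particles, dim))
vel = np.zeros((n_particles, dim))
pbest = pos.copy()
pbest_val = np.array([sphere(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

c1 = c2 = 1.5                          # cognitive and social coefficients
for t in range(iters):
    w = 0.9 - 0.5 * t / iters          # inertia scheduled from 0.9 down to 0.4
    r1 = rng.random((n_particles, dim))
    r2 = rng.random((n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([sphere(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

best_value = float(pbest_val.min())
```

The paper's contribution is precisely in how such schedules are chosen against convergence criteria; the loop structure itself is the standard one shown here.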
Article
Fast Structural Alignment of Biomolecules Using a Hash Table, N-Grams and String Descriptors
by Raphael André Bauer, Kristian Rother, Peter Moor, Knut Reinert, Thomas Steinke, Janusz M. Bujnicki and Robert Preissner
Algorithms 2009, 2(2), 692-709; https://doi.org/10.3390/a2020692 - 21 Apr 2009
Cited by 61 | Viewed by 13453
Abstract
This work presents a generalized approach for the fast structural alignment of thousands of macromolecular structures. The method uses string representations of a macromolecular structure and a hash table that stores n-grams of a certain size for searching. To this end, macromolecular structure-to-string translators were implemented for protein and RNA structures. A query against the index is performed in two hierarchical steps to unite speed and precision. In the first step the query structure is translated into n-grams, and all target structures containing these n-grams are retrieved from the hash table. In the second step all corresponding n-grams of the query and each target structure are subsequently aligned, and after each alignment a score is calculated based on the matching n-grams of query and target. The extendable framework enables the user to query and structurally align thousands of protein and RNA structures on a commodity machine and is available as open source from http://lajolla.sf.net. Full article
(This article belongs to the Special Issue Algorithms and Molecular Sciences)
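The two-step query can be sketched directly: build a hash table from n-grams to the structures containing them, then rank the retrieved candidates by shared n-grams. The toy "structure strings" below are invented stand-ins for the real structure-to-string translation:

```python
from collections import defaultdict

# N-gram index sketch. Step 1: hash table mapping every n-gram to the
# structures containing it. Step 2: rank retrieved candidates by the
# number of n-grams they share with the query.
N = 3   # n-gram size

database = {
    "structA": "HHHEEELLHHH",
    "structB": "EEEELLLEEEE",
    "structC": "HHHEEELLGGG",
}

def ngrams(s, n=N):
    return {s[i:i + n] for i in range(len(s) - n + 1)}

index = defaultdict(set)
for name, string in database.items():
    for g in ngrams(string):
        index[g].add(name)

def query(q_string):
    q_grams = ngrams(q_string)
    candidates = set().union(*(index[g] for g in q_grams if g in index))
    scores = {c: len(q_grams & ngrams(database[c])) for c in candidates}
    return sorted(scores.items(), key=lambda kv: -kv[1])

ranking = query("HHHEEELLHHH")
```

The hash lookup keeps the first step near-constant time per n-gram, which is what lets the real system screen thousands of structures before the finer alignment step.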
Article
A Bayesian Algorithm for Functional Mapping of Dynamic Complex Traits
by Tian Liu and Rongling Wu
Algorithms 2009, 2(2), 667-691; https://doi.org/10.3390/a2020667 - 21 Apr 2009
Cited by 10 | Viewed by 8481
Abstract
Functional mapping of dynamic traits measured in a longitudinal study was originally derived within the maximum likelihood (ML) context and implemented with the EM algorithm. Although ML-based functional mapping possesses many favorable statistical properties in parameter estimation, it may be computationally intractable for analyzing longitudinal data with high dimensions and high measurement errors. In this article, we derive a general functional mapping framework for quantitative trait locus mapping of dynamic traits within the Bayesian paradigm. Markov chain Monte Carlo techniques were implemented for functional mapping to estimate biologically and statistically sensible parameters that model the structures of time-dependent genetic effects and covariance matrix. The Bayesian approach is useful to handle difficulties in constructing confidence intervals as well as the identifiability problem, enhancing the statistical inference of functional mapping. We have undertaken simulation studies to investigate the statistical behavior of Bayesian-based functional mapping and used a real example with F2 mice to validate the utilization and usefulness of the model. Full article
(This article belongs to the Special Issue Algorithms and Molecular Sciences)
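The MCMC machinery referred to above can be illustrated with a bare-bones random-walk Metropolis sampler for a single growth-curve parameter; the logistic curve, flat prior, and noise level are illustrative, not the paper's QTL model:

```python
import numpy as np

# Random-walk Metropolis sketch: sample the posterior of one growth-rate
# parameter r of a logistic curve from noisy longitudinal data.
rng = np.random.default_rng(6)

t = np.linspace(0, 10, 20)
true_r = 0.8
def curve(r):
    return 10.0 / (1.0 + 9.0 * np.exp(-r * t))   # logistic growth curve

y = curve(true_r) + 0.3 * rng.normal(size=t.size)

def log_post(r):
    if r <= 0:
        return -np.inf                            # flat prior on r > 0
    resid = y - curve(r)
    return -0.5 * np.sum(resid ** 2) / 0.3 ** 2   # Gaussian likelihood

samples, r_cur, lp_cur = [], 0.5, log_post(0.5)
for _ in range(5000):
    r_prop = r_cur + 0.05 * rng.normal()          # random-walk proposal
    lp_prop = log_post(r_prop)
    if np.log(rng.random()) < lp_prop - lp_cur:   # Metropolis accept/reject
        r_cur, lp_cur = r_prop, lp_prop
    samples.append(r_cur)

posterior_mean = float(np.mean(samples[1000:]))   # discard burn-in
```

The chain's draws give credible intervals directly, which is the advantage over ML-based functional mapping that the abstract highlights.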
Article
Pattern Recognition and Pathway Analysis with Genetic Algorithms in Mass Spectrometry Based Metabolomics
by Wei Zou and Vladimir V. Tolstikov
Algorithms 2009, 2(2), 638-666; https://doi.org/10.3390/a2020638 - 03 Apr 2009
Cited by 18 | Viewed by 12520
Abstract
A robust and complete workflow for metabolic profiling and data mining is described in detail. Three independent and complementary analytical techniques for metabolic profiling were applied: hydrophilic interaction chromatography (HILIC–LC–ESI–MS), reversed-phase liquid chromatography (RP–LC–ESI–MS), and gas chromatography (GC–TOF–MS), all coupled to mass spectrometry (MS). Unsupervised methods, such as principal component analysis (PCA) and clustering, and supervised methods, such as classification and PCA-DA (discriminant analysis), were used for data mining. Genetic Algorithms (GA), a multivariate approach, were probed for the selection of the smallest subsets of potentially discriminative predictors. From the thousands of peaks found in total, the small subsets selected by GA were considered highly potential predictors allowing discrimination among groups. It was found that the small groups of potential top predictors selected with PCA-DA and GA are different and unique. Annotation of the GC–TOF–MS data identified feature metabolites. Metabolites putatively detected with LC–ESI–MS profiling require further elemental composition assignment with accurate mass measurement by Fourier-transform ion cyclotron resonance mass spectrometry (FT-ICR-MS) and structure elucidation by nuclear magnetic resonance spectroscopy (NMR). GA was also used to generate correlated networks for pathway analysis. Several case studies, comprising groups of plant samples bearing different genotypes and groups of samples of human origin, namely urine samples from patients and healthy volunteers, demonstrated that such a workflow, combining comprehensive metabolic profiling and advanced data mining techniques, provides a powerful approach for pattern recognition and biomarker discovery. Full article
(This article belongs to the Special Issue Algorithms and Molecular Sciences)
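The GA-based subset selection can be sketched as evolving Boolean masks over features, with a crude between-group separation criterion standing in for the real classifier-based fitness; data and GA parameters are synthetic:

```python
import numpy as np

# GA feature-selection sketch: evolve Boolean masks over 30 "peaks", only
# three of which actually differ between the two sample groups. Fitness
# rewards between-group separation of the selected features and penalizes
# subset size.
rng = np.random.default_rng(7)

n_per, n_feat = 40, 30
informative = [2, 11, 25]                 # only these differ between groups
g1 = rng.normal(0.0, 1.0, (n_per, n_feat))
g2 = rng.normal(0.0, 1.0, (n_per, n_feat))
g2[:, informative] += 2.0

def fitness(mask):
    if not mask.any():
        return -1e9
    sep = (g1[:, mask].mean(0) - g2[:, mask].mean(0)) ** 2
    return float(sep.sum() - 0.1 * mask.sum())   # separation minus size penalty

pop = rng.random((50, n_feat)) < 0.2             # initial random masks
best_mask, best_score = pop[0].copy(), fitness(pop[0])
for _ in range(100):
    scores = np.array([fitness(m) for m in pop])
    if scores.max() > best_score:                 # track best-ever mask
        best_score = float(scores.max())
        best_mask = pop[scores.argmax()].copy()
    parents = pop[np.argsort(scores)[-20:]]       # truncation selection
    children = []
    for _ in range(len(pop)):
        a, b = parents[rng.integers(0, 20, size=2)]
        child = np.where(rng.random(n_feat) < 0.5, a, b)   # uniform crossover
        child ^= rng.random(n_feat) < 0.02                 # bit-flip mutation
        children.append(child)
    pop = np.array(children)

selected = set(np.flatnonzero(best_mask).tolist())
```

In the real workflow the fitness would be a cross-validated classification score over MS peak intensities, but the evolutionary search over subsets has this shape.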
Article
Neural Network Modeling to Predict Shelf Life of Greenhouse Lettuce
by Wei-Chin Lin and Glen S. Block
Algorithms 2009, 2(2), 623-637; https://doi.org/10.3390/a2020623 - 03 Apr 2009
Cited by 18 | Viewed by 8979
Abstract
Greenhouse-grown butter lettuce (Lactuca sativa L.) can potentially be stored for 21 days at constant 0°C. When storage temperature was increased to 5°C or 10°C, shelf life was shortened to 14 or 10 days, respectively, in our previous observations. Also, commercial shelf life of 7 to 10 days is common, due to postharvest temperature fluctuations. The objective of this study was to establish neural network (NN) models to predict the remaining shelf life (RSL) under fluctuating postharvest temperatures. A box of 12 - 24 lettuce heads constituted a sample unit. The end of the shelf life of each head was determined when it showed initial signs of decay or yellowing. Air temperatures inside a shipping box were recorded. Daily average temperatures in storage and averaged shelf life of each box were used as inputs, and the RSL was modeled as an output. An R2 of 0.57 could be observed when a simple NN structure was employed. Since the "future" (or remaining) storage temperatures were unavailable at the time of making a prediction, a second NN model was introduced to accommodate a range of future temperatures and associated shelf lives. Using such 2-stage NN models, an R2 of 0.61 could be achieved for predicting RSL. This study indicated that NN modeling has potential for cold chain quality control and shelf life prediction. Full article
(This article belongs to the Special Issue Neural Networks and Sensors)
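A one-hidden-layer network of the kind described can be trained in a few lines of plain NumPy; the temperature-to-shelf-life curve below is invented synthetic data, not the greenhouse measurements:

```python
import numpy as np

# 1-8-1 network sketch (tanh hidden layer, plain gradient descent):
# learn shelf life as a nonlinear function of average storage temperature.
rng = np.random.default_rng(8)

temp = rng.uniform(0, 10, size=(200, 1))                       # deg C
shelf = 21.0 - 1.2 * temp - 0.6 * np.sin(temp) + 0.3 * rng.normal(size=(200, 1))

W1, b1 = rng.normal(0, 0.5, (1, 8)), np.zeros(8)
W2, b2 = rng.normal(0, 0.5, (8, 1)), np.zeros(1)
lr = 0.01
x = temp / 10.0                      # scale input to [0, 1]
for _ in range(3000):
    h = np.tanh(x @ W1 + b1)         # forward pass
    pred = h @ W2 + b2
    err = pred - shelf
    # backpropagation
    gW2 = h.T @ err / len(x)
    gb2 = err.mean(0)
    gh = err @ W2.T * (1 - h ** 2)
    gW1 = x.T @ gh / len(x)
    gb1 = gh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1
    mse = float((err ** 2).mean())
```

The paper's 2-stage arrangement wraps networks like this one so that a range of possible future temperatures can be fed in at prediction time.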