Journal Description
Computation is a peer-reviewed journal of computational science and engineering published monthly online by MDPI.
- Open Access: free for readers, with article processing charges (APC) paid by authors or their institutions.
- High Visibility: indexed within Scopus, ESCI (Web of Science), CAPlus / SciFinder, Inspec, dblp, and other databases.
- Journal Rank: CiteScore - Q2 (Applied Mathematics)
- Rapid Publication: manuscripts are peer-reviewed and a first decision is provided to authors approximately 16.2 days after submission; the time from acceptance to publication is 4.8 days (median values for papers published in this journal in the second half of 2022).
- Recognition of Reviewers: reviewers who provide timely, thorough peer-review reports receive vouchers entitling them to a discount on the APC of their next publication in any MDPI journal, in appreciation of the work done.
Latest Articles
Addressing the Folding of Intermolecular Springs in Particle Simulations: Fixed Image Convention
Computation 2023, 11(6), 106; https://doi.org/10.3390/computation11060106 - 26 May 2023
Abstract
Mesoscopic simulations of long polymer chains and soft matter systems are conducted routinely in the literature in order to assess the long-lived relaxation processes manifested in these systems. Coarse-grained chains are, however, prone to unphysical intercrossing due to their inherent softness. This issue can be resolved by introducing long intermolecular bonds (the so-called slip-springs) which restore these topological constraints. The separation vector of intermolecular bonds can be determined by enforcing the commonly adopted minimum image convention (MIC). Because these bonds are soft and long (ca 3–20 nm), subjecting the samples to extreme deformations can lead to topology violations when enforcing the MIC. We propose the fixed image convention (FIC) for determining the separation vectors of overextended bonds, which is more stable than the MIC and applicable to extreme deformations. The FIC is simple to implement and, in general, more efficient than the MIC. Side-by-side comparisons between the MIC and FIC demonstrate that, when using the FIC, the topology remains intact even in situations with extreme particle displacement and nonaffine deformation. The accuracy of these conventions is the same when applying affine deformation. The article is accompanied by the corresponding code for implementing the FIC.
Full article
(This article belongs to the Special Issue 10th Anniversary of Computation - Computational Chemistry)
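The difference between the two conventions can be made concrete with a minimal Python sketch (an illustration with hypothetical box and coordinates, not the code accompanying the article): the MIC recomputes the nearest periodic image at every evaluation, whereas an FIC-style scheme records the image offset once, when the slip-spring is created, and reuses it thereafter.

```python
import numpy as np

def mic_separation(r1, r2, box):
    """Minimum image convention: wrap each component to the nearest image."""
    d = r2 - r1
    return d - box * np.round(d / box)

def fic_offset(r1, r2, box):
    """Record the integer image offset once, when the bond is created."""
    return np.round((r2 - r1) / box)

def fic_separation(r1, r2, box, offset):
    """Fixed image convention: reuse the stored offset at every later step."""
    return (r2 - r1) - box * offset

box = np.array([10.0, 10.0, 10.0])
r1 = np.array([1.0, 1.0, 1.0])
r2 = np.array([9.5, 1.0, 1.0])

offset = fic_offset(r1, r2, box)             # fixed at bond creation
print(mic_separation(r1, r2, box))           # [-1.5  0.  0.]
print(fic_separation(r1, r2, box, offset))   # identical here; the two differ
# only once the bond stretches beyond half the box, where the MIC silently
# switches images and can violate the stored topology.
```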
Open Access Article
CrohnDB: A Web Database for Expression Profiling of Protein-Coding and Long Non-Coding RNA Genes in Crohn Disease
Computation 2023, 11(6), 105; https://doi.org/10.3390/computation11060105 - 24 May 2023
Abstract
Crohn disease (CD) is a type of inflammatory bowel disease that causes inflammation in the digestive tract. Cases of CD are increasing worldwide, calling for more research to elucidate the pathogenesis of CD. For this purpose, the use of the RNA-sequencing (RNA-seq) technique is increasingly appreciated, as it captures RNA expression patterns at a particular time point in a high-throughput manner. Although many RNA-seq datasets are generated from CD patients and compared to those of healthy donors, most of these datasets are analyzed only for protein-coding genes, leaving non-coding RNAs (ncRNAs) undiscovered. Long non-coding RNAs (lncRNAs) are any ncRNAs that are longer than 200 nucleotides. Interest in studying lncRNAs is increasing rapidly, as lncRNAs bind other macromolecules (DNA, RNA, and/or proteins) to fine-tune signaling pathways. To fill the gap in knowledge about lncRNAs in CD, we performed a secondary analysis of published RNA-seq data of CD patients compared to healthy donors to identify lncRNA genes and their expression changes. To further facilitate lncRNA research in CD, we built a web database, CrohnDB, to provide a one-stop-shop for expression profiling of protein-coding and lncRNA genes in CD patients compared to healthy donors.
Full article
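The 200-nucleotide criterion mentioned above translates directly into a filtering step. A minimal sketch over a hypothetical transcript table (the column names are illustrative, not the CrohnDB schema):

```python
import pandas as pd

# Hypothetical transcript annotation table; columns are illustrative.
transcripts = pd.DataFrame({
    "gene": ["GENE_A", "GENE_B", "GENE_C"],
    "biotype": ["protein_coding", "non_coding", "non_coding"],
    "length_nt": [1500, 180, 650],
})

# lncRNAs are non-coding RNAs longer than 200 nucleotides.
lncrnas = transcripts[(transcripts["biotype"] == "non_coding")
                      & (transcripts["length_nt"] > 200)]
print(lncrnas)
```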
Open Access Article
Explainable Ensemble-Based Machine Learning Models for Detecting the Presence of Cirrhosis in Hepatitis C Patients
Computation 2023, 11(6), 104; https://doi.org/10.3390/computation11060104 - 23 May 2023
Abstract
Hepatitis C is a liver infection caused by a virus, which results in mild to severe inflammation of the liver. Over many years, hepatitis C gradually damages the liver, often leading to permanent scarring, known as cirrhosis. Patients sometimes have moderate or no symptoms of liver illness for decades before developing cirrhosis. Cirrhosis typically worsens to the point of liver failure. Patients with cirrhosis may also experience brain and nervous system damage, as well as gastrointestinal hemorrhage. Treatment for cirrhosis focuses on preventing further progression of the disease. Detecting cirrhosis earlier is therefore crucial for avoiding complications. Machine learning (ML) has been shown to be effective at providing precise and accurate information for use in diagnosing several diseases. Despite this, no studies have so far used ML to detect cirrhosis in patients with hepatitis C. This study obtained a dataset consisting of 28 attributes of 2038 Egyptian patients from the ML Repository of the University of California at Irvine. Four ML algorithms were trained on the dataset to diagnose cirrhosis in hepatitis C patients: a Random Forest model, a Gradient Boosting Machine, an Extreme Gradient Boosting model, and an Extra Trees model. The Extra Trees model outperformed the other models, achieving an accuracy of 96.92%, a recall of 94.00%, a precision of 99.81%, and an area under the receiver operating characteristic curve of 96% using only 16 of the 28 features.
Full article
(This article belongs to the Special Issue Interpretable Statistical Machine for Decision Making: Modeling, Learning and Application Perspectives)
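A minimal sketch of the winning model family, using scikit-learn's ExtraTreesClassifier on a synthetic stand-in whose shape matches the abstract (2038 patients, 28 attributes, 16 informative features); the data and any scores it prints are illustrative, not the study's results.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import (accuracy_score, precision_score,
                             recall_score, roc_auc_score)

# Synthetic stand-in for the UCI Egyptian HCV dataset.
X, y = make_classification(n_samples=2038, n_features=28, n_informative=16,
                           random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                          stratify=y, random_state=42)

model = ExtraTreesClassifier(n_estimators=200, random_state=42).fit(X_tr, y_tr)
pred = model.predict(X_te)
print("accuracy :", accuracy_score(y_te, pred))
print("precision:", precision_score(y_te, pred))
print("recall   :", recall_score(y_te, pred))
print("ROC AUC  :", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
```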
Open Access Article
Opinion Formation on Social Networks—The Effects of Recurrent and Circular Influence
Computation 2023, 11(5), 103; https://doi.org/10.3390/computation11050103 - 22 May 2023
Abstract
We present a generalised complex contagion model for describing behaviour and opinion spreading on social networks. Recurrent interactions between adjacent nodes and circular influence in loops in the network structure enable the modelling of influence spreading on the network scale. We have presented details of the model in our earlier studies. Here, we focus on the interpretation of the model and discuss its features by using conventional concepts in the literature. In addition, we discuss how the model can be extended to account for specific social phenomena in social networks. We demonstrate the differences between the results of our model and a simple contagion model. Results are provided for a small social network and a larger collaboration network. As an application of the model, we present a method for profiling individuals based on their out-centrality, in-centrality, and betweenness values in the social network structure. These measures have been defined consistently with our spreading model based on an influence spreading matrix. The influence spreading matrix captures the directed spreading probabilities between all node pairs in the network structure. Our results show that recurrent and circular influence has considerable effects on node centrality values and spreading probabilities in the network structure.
Full article
(This article belongs to the Special Issue Computational Social Science and Complex Systems)
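As a rough illustration of how centrality measures can be read off an influence spreading matrix, consider the sketch below; the matrix values are hypothetical, and the row/column-sum definitions are a generic simplification rather than the authors' exact formulas.

```python
import numpy as np

# Hypothetical 4-node influence spreading matrix: P[i, j] is the
# probability that influence starting at node i reaches node j.
P = np.array([
    [0.0, 0.8, 0.3, 0.1],
    [0.4, 0.0, 0.6, 0.2],
    [0.1, 0.5, 0.0, 0.7],
    [0.0, 0.1, 0.4, 0.0],
])

out_centrality = P.sum(axis=1)   # how strongly a node spreads influence
in_centrality = P.sum(axis=0)    # how strongly a node receives influence
print("out:", out_centrality)
print("in: ", in_centrality)
```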
Open Access Article
The Behavior of Hybrid Reinforced Concrete-Steel Buildings under Sequential Ground Excitations
Computation 2023, 11(5), 102; https://doi.org/10.3390/computation11050102 - 18 May 2023
Abstract
In common construction practice, various examples can be found involving a building type consisting of a lower, older, reinforced concrete structure and a more recent upper steel part, forming a so-called “hybrid” building. Conventional seismic design rules give full guidelines for the earthquake design of buildings constructed with the same material throughout. The current seismic codes neglect to provide specific design and detailing guidelines for vertical hybrid buildings and limited existing research is available in the literature, thus leaving a scientific gap that needs to be investigated. In the present work, an effort is made to fill this gap in the knowledge about the behavior of this hybrid building type in sequential earthquakes, which are found in the literature to burden the seismic structural response. Three-dimensional models of hybrid reinforced concrete–steel frames are exposed to sequential ground excitations in horizontal and vertical directions while considering the elastoplastic behavior of these structural elements in the time domain. The lower reinforced concrete parts of the hybrid buildings are detailed here as corresponding to a former structure by a simple approximation. In addition, two boundary connections of the structural steel part upon the r/c part are distinguished for examination in the elastoplastic analyses. Comparisons of the numerical analysis results of the hybrid frames for the examined connections are carried out. The seismic response plots of the current non-linear dynamic time-domain analyses of the 3D hybrid frames subjected to sequential ground excitations yield useful conclusions to provide guidelines for a safer seismic design of the hybrid building type, which is not covered by the current codes despite being a common practice.
Full article
(This article belongs to the Special Issue Finite Element Methods with Applications in Civil and Mechanical Engineering)
Open Access Article
Social Learning and the Exploration-Exploitation Tradeoff
Computation 2023, 11(5), 101; https://doi.org/10.3390/computation11050101 - 18 May 2023
Abstract
Cultures around the world show varying levels of conservatism. While maintaining traditional ideas prevents wrong ones from being embraced, it also slows or prevents adaptation to new times. Without exploration there can be no improvement, but often this effort is wasted as it fails to produce better results, making it better to exploit the best known option. This tension is known as the exploration/exploitation issue, and it occurs at the individual and group levels, whenever decisions are made. As such, it has been investigated across many disciplines. We extend previous work by approximating a continuum of traits under local exploration, employing the method of adaptive dynamics, and studying multiple fitness functions. In this work, we ask how nature would solve the exploration/exploitation issue, by allowing natural selection to operate on an exploration parameter in a variety of contexts, thinking of exploration as mutation in a trait space with a varying fitness function. Specifically, we study how exploration rates evolve by applying adaptive dynamics to the replicator-mutator equation, under two types of fitness functions. For the first, payoffs are accrued from playing a two-player, two-action symmetric game; we consider representatives of all games in this class, including the Prisoner’s Dilemma, Hawk-Dove, and Stag Hunt games, and find that exploration rates often evolve downwards but can also undergo neutral selection, depending on the game’s parameters or initial conditions. Second, we study time-dependent fitness with a function having a single oscillating peak. By increasing the period, we see a jump in the optimal exploration rate, which then decreases towards zero as the frequency of environmental change increases. These results establish several possible evolutionary scenarios for exploration rates, providing insight into many applications, including why we can see such diversity in rates of cultural change.
Full article
(This article belongs to the Special Issue Computational Social Science and Complex Systems)
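The replicator-mutator dynamics at the core of the analysis can be sketched for a two-action symmetric game as follows; the payoff values, mutation kernel, and Euler integration are illustrative assumptions, not the paper's adaptive-dynamics computation.

```python
import numpy as np

# Hawk-Dove-style payoff matrix (hypothetical values).
A = np.array([[0.0, 3.0],
              [1.0, 2.0]])
mu = 0.05                          # exploration (mutation) rate
Q = np.array([[1 - mu, mu],
              [mu, 1 - mu]])       # symmetric mutation kernel

x = np.array([0.9, 0.1])           # initial strategy frequencies
dt = 0.01
for _ in range(50_000):
    f = A @ x                      # fitness of each strategy
    phi = x @ f                    # mean fitness
    # Replicator-mutator: offspring of strategy j (produced at rate x_j * f_j)
    # mutate into i with probability Q[j, i]; the phi * x term keeps the
    # frequencies normalized.
    dx = Q.T @ (x * f) - phi * x
    x = x + dt * dx
print(x)  # approaches the mutation-perturbed mixed equilibrium
```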
Open Access Feature Paper Article
A Multiple Response Prediction Model for Dissimilar AA-5083 and AA-6061 Friction Stir Welding Using a Combination of AMIS and Machine Learning
Computation 2023, 11(5), 100; https://doi.org/10.3390/computation11050100 - 15 May 2023
Abstract
This study presents a methodology that combines artificial multiple intelligence systems (AMISs) and machine learning to forecast the ultimate tensile strength (UTS), maximum hardness (MH), and heat input (HI) of AA-5083 and AA-6061 friction stir welding. The machine learning model integrates two machine learning methods, Gaussian process regression (GPR) and a support vector machine (SVM), into a single model, and then uses the AMIS as the decision fusion strategy to merge SVM and GPR. The generated model was utilized to anticipate three objectives based on seven controlled/input parameters. These parameters were: tool tilt angle, rotating speed, travel speed, shoulder diameter, pin geometry, type of reinforcing particles, and tool pin movement mechanism. The effectiveness of the model was evaluated using a two-experiment framework. In the first experiment, we used two newly produced datasets, (1) the 7PI-V1 dataset and (2) the 7PI-V2 dataset, and compared the results with state-of-the-art approaches. The second experiment used existing datasets from the literature with varying base materials and parameters. The computational results revealed that the proposed method produced more accurate prediction results than the previous methods. For all datasets, the proposed strategy outperformed existing methods and state-of-the-art processes by an average of 1.35% to 6.78%.
Full article
(This article belongs to the Section Computational Engineering)
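The AMIS fusion strategy is the authors' own metaheuristic; as a hedged stand-in, the sketch below fuses GPR and SVM predictions with simple inverse-error weights on synthetic data with seven inputs.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
X = rng.uniform(size=(300, 7))    # seven controlled/input parameters
y = X @ rng.uniform(size=7) + 0.05 * rng.normal(size=300)  # synthetic target

X_tr, X_tmp, y_tr, y_tmp = train_test_split(X, y, test_size=0.4, random_state=0)
X_val, X_te, y_val, y_te = train_test_split(X_tmp, y_tmp, test_size=0.5,
                                            random_state=0)

gpr = GaussianProcessRegressor().fit(X_tr, y_tr)
svm = SVR().fit(X_tr, y_tr)

# Stand-in fusion: weight each model by its inverse validation error;
# the paper's AMIS metaheuristic searches for the fusion strategy instead.
e_gpr = mean_absolute_error(y_val, gpr.predict(X_val))
e_svm = mean_absolute_error(y_val, svm.predict(X_val))
w = (1 / e_gpr) / (1 / e_gpr + 1 / e_svm)
fused = w * gpr.predict(X_te) + (1 - w) * svm.predict(X_te)
print("fused test MAE:", mean_absolute_error(y_te, fused))
```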
Open Access Article
Development of Trading Strategies Using Time Series Based on Robust Interval Forecasts
Computation 2023, 11(5), 99; https://doi.org/10.3390/computation11050099 - 15 May 2023
Abstract
The task of time series forecasting is to estimate future values based on available observational data. Prediction interval methods aim to find not the next point but the interval into which the future value, or several values over the forecast horizon, can fall given current and historical data. This article proposes an approach for modeling a robust interval forecast for a stock portfolio, and a trading strategy was developed to profit from trading stocks in the market. The study used real trading data of real stocks: the forty securities used to calculate the IMOEX index, of which GAZP, LKOH, and SBER carry the highest weights. This definition of the strategy makes it possible to operate with large portfolios. The accuracy of the forecast was increased by estimating its interval: a range of values, rather than specific points, was treated as the forecast result, which guarantees the reliability of the forecast. Using a prediction interval approach for share prices makes it possible to increase trading profitability.
Full article
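A minimal sketch of the general idea, interval forecasts driving a trading rule, on a synthetic price series; the moving-average forecaster, residual quantiles, and signal rule are illustrative assumptions, not the article's robust interval model.

```python
import numpy as np

rng = np.random.default_rng(1)
prices = 100 + np.cumsum(rng.normal(0, 1, 500))     # synthetic share price
split = 400

# Point forecast: 5-day moving average; interval: empirical residual quantiles.
resid = np.array([prices[t] - prices[t-5:t].mean() for t in range(5, split)])
lo_q, hi_q = np.quantile(resid, [0.05, 0.95])       # 90% interval

signals = []
for t in range(split, len(prices)):
    forecast = prices[t-5:t].mean()
    lo, hi = forecast + lo_q, forecast + hi_q
    # Trade only when the whole interval lies on one side of today's price.
    if lo > prices[t-1]:
        signals.append((t, "buy"))
    elif hi < prices[t-1]:
        signals.append((t, "sell"))
print(len(signals), "signals")
```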
Open Access Article
Reconstruction of Meteorological Records by Methods Based on Dimension Reduction of the Predictor Dataset
Computation 2023, 11(5), 98; https://doi.org/10.3390/computation11050098 - 12 May 2023
Abstract
The reconstruction or prediction of meteorological records through the Analog Ensemble (AnEn) method is very efficient when the number of predictor time series is small. Thus, in order to take advantage of the richness and diversity of information contained in a large number of predictors, it is necessary to reduce their dimensions. This study presents methods to accomplish such reduction, allowing the use of a high number of predictor variables. In particular, the techniques of Principal Component Analysis (PCA) and Partial Least Squares (PLS) are used to reduce the dimension of the predictor dataset without loss of essential information. The combination of the AnEn and PLS techniques results in a very efficient hybrid method (PLSAnEn) for reconstructing or forecasting unstable meteorological variables, such as wind speed. This hybrid method is computationally demanding but its performance can be improved via parallelization or the introduction of variants in which all possible analogs are previously clustered. The multivariate linear regression methods used on the new variables resulting from the PCA or PLS techniques also proved to be efficient, especially for the prediction of meteorological variables without local oscillations, such as the pressure.
Full article
(This article belongs to the Special Issue Applications of Computational Mathematics to Simulation and Data Analysis II)
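The two reduction steps can be sketched with scikit-learn on synthetic predictors; the component counts and data are illustrative.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 50))    # 50 predictor time series (synthetic)
y = X[:, :3] @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=1000)

# PCA: unsupervised reduction of the predictor set.
X_pca = PCA(n_components=5).fit_transform(X)

# PLS: supervised reduction that also maximizes covariance with the target,
# yielding the low-dimensional features used to search for analogs.
pls = PLSRegression(n_components=5).fit(X, y)
X_pls = pls.transform(X)
print(X_pca.shape, X_pls.shape)    # (1000, 5) (1000, 5)
```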
Open Access Article
A Flexible and General-Purpose Platform for Heterogeneous Computing
Computation 2023, 11(5), 97; https://doi.org/10.3390/computation11050097 - 11 May 2023
Abstract
In the big data era, processing large amounts of data imposes several challenges, mainly in terms of performance. Complex operations in data science, such as deep learning, large-scale simulations, and visualization applications, can consume a significant amount of computing time. Heterogeneous computing is an attractive alternative for algorithm acceleration, using not one but several different kinds of computing devices (CPUs, GPUs, or FPGAs) simultaneously. Accelerating an algorithm for a specific device under a specific framework, e.g., CUDA/GPU, provides a solution with the highest possible performance at the cost of a loss in generality and requires an experienced programmer. On the contrary, heterogeneous computing allows one to hide the details pertaining to the simultaneous use of different technologies in order to accelerate computation. However, effective heterogeneous computing implementation still requires mastering the underlying design flow. Aiming to fill this gap, in this paper we present a heterogeneous computing platform (HCP). Regarding its main features, this platform allows non-experts in heterogeneous computing to deploy, run, and evaluate high-computational-demand algorithms following a semi-automatic design flow. Given the implementation of an algorithm in C with minimal format requirements, the platform automatically generates the parallel code using a code analyzer, which is adapted to target a set of available computing devices. Thus, while an experienced heterogeneous computing programmer is not required, the process can run over the available computing devices on the platform as it is not an ad hoc solution for a specific computing device. The proposed HCP relies on the OpenCL specification for interoperability and generality. The platform was validated and evaluated in terms of generality and efficiency through a set of experiments using the algorithms of the Polybench/C suite (version 3.2) as the input. Different configurations for the platform were used, considering CPUs only, GPUs only, and a combination of both. The results revealed that the proposed HCP was able to achieve accelerations of up to 270× for specific classes of algorithms, i.e., parallel-friendly algorithms, while its use required almost no expertise in either OpenCL or heterogeneous computing from the programmer/end-user.
Full article
(This article belongs to the Section Computational Engineering)
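The platform itself is not reproduced here, but the OpenCL interoperability it relies on can be illustrated with a short PyOpenCL sketch that enumerates the available devices and runs the same trivial kernel on whichever device is selected (this assumes the pyopencl package and at least one OpenCL driver are installed).

```python
import numpy as np
import pyopencl as cl

# Enumerate every available device (CPU, GPU, ...) across all platforms.
for platform in cl.get_platforms():
    for device in platform.get_devices():
        print(platform.name, "->", device.name)

# Run a trivial kernel; OpenCL keeps the host code identical across devices.
ctx = cl.create_some_context(interactive=False)
queue = cl.CommandQueue(ctx)
a = np.arange(16, dtype=np.float32)
mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=a)

prog = cl.Program(ctx, """
__kernel void double_it(__global float *a) {
    int i = get_global_id(0);
    a[i] = 2.0f * a[i];
}""").build()
prog.double_it(queue, a.shape, None, a_buf)
cl.enqueue_copy(queue, a, a_buf)
print(a)  # [0, 2, 4, ...]
```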
Open Access Article
Diabetes Classification Using Machine Learning Techniques
Computation 2023, 11(5), 96; https://doi.org/10.3390/computation11050096 - 10 May 2023
Abstract
Machine learning techniques play an increasingly prominent role in medical diagnosis. With the use of these techniques, patients’ data can be analyzed to find patterns or facts that are difficult to explain, making diagnoses more reliable and convenient. The purpose of this research was to compare the efficiency of diabetic classification models using four machine learning techniques: decision trees, random forests, support vector machines, and K-nearest neighbors. In addition, new diabetic classification models are proposed that incorporate hyperparameter tuning and the addition of some interaction terms into the models. These models were evaluated based on accuracy, precision, recall, and the F1-score. The results of this study show that the proposed models with interaction terms have better classification performance than those without interaction terms for all four machine learning techniques. Among the proposed models with interaction terms, random forest classifiers had the best performance, with 97.5% accuracy, 97.4% precision, 96.6% recall, and a 97% F1-score. The findings from this study can be further developed into a program that can effectively screen potential diabetes patients.
Full article
(This article belongs to the Special Issue Application of Machine learning and Operation Research in Healthcare Management)
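A minimal sketch of the interaction-term idea with scikit-learn on synthetic data: PolynomialFeatures with interaction_only=True appends pairwise products of the inputs (e.g., glucose × BMI) before classification. The data and any printed scores are illustrative, not the study's.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.metrics import f1_score

# Synthetic stand-in for the diabetes data; feature semantics are hypothetical.
X, y = make_classification(n_samples=1000, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# interaction_only=True adds pairwise products without squared terms,
# mirroring the paper's "interaction terms".
model = make_pipeline(
    PolynomialFeatures(degree=2, interaction_only=True, include_bias=False),
    RandomForestClassifier(n_estimators=200, random_state=0),
)
model.fit(X_tr, y_tr)
print("F1:", f1_score(y_te, model.predict(X_te)))
```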
Open Access Article
A Two-Step Machine Learning Method for Predicting the Formation Energy of Ternary Compounds
Computation 2023, 11(5), 95; https://doi.org/10.3390/computation11050095 - 09 May 2023
Abstract
Predicting the chemical stability of yet-to-be-discovered materials is an important aspect of the discovery and development of virtual materials. The conventional approach for computing the enthalpy of formation based on ab initio methods is time consuming and computationally demanding. In this regard, alternative machine learning approaches are proposed to predict the formation energies of different classes of materials with decent accuracy. In this paper, one such machine learning approach, a novel two-step method that predicts the formation energy of ternary compounds, is presented. In the first step, with a classifier, we determine the accuracy of heuristically calculated formation energies in order to increase the size of the training dataset for the second step. The second step is a regression model that predicts the formation energy of the ternary compounds. The first step leads to at least a 100% increase in the size of the dataset with respect to the data available in the Materials Project database. The results from the regression model match those from the existing state-of-the-art prediction models. In addition, we propose a slightly modified version of the Adam optimizer, namely centered Adam, and report the results from testing the centered Adam optimizer.
Full article
(This article belongs to the Section Computational Chemistry)
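A hedged sketch of the two-step scheme on synthetic data: a classifier screens heuristically labeled compounds, and only those predicted to be accurate are added to the regression training set. All names and numbers are illustrative, not the paper's models or data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic stand-ins: DFT-labeled compounds plus a larger pool of compounds
# with cheap heuristic formation energies of varying quality.
X_dft = rng.normal(size=(300, 10))
y_dft = X_dft.sum(axis=1) + 0.05 * rng.normal(size=300)
X_heur = rng.normal(size=(600, 10))
noise = rng.choice([0.05, 2.0], 600)               # some labels are poor
y_heur = X_heur.sum(axis=1) + noise * rng.normal(size=600)
accurate = np.abs(y_heur - X_heur.sum(axis=1)) < 0.3   # proxy ground truth

# Step 1: a classifier predicts which heuristic labels are accurate enough.
clf = RandomForestClassifier(random_state=0).fit(X_heur[:300], accurate[:300])
keep = clf.predict(X_heur[300:])

# Step 2: a regressor trained on DFT data plus the accepted heuristic data.
X_aug = np.vstack([X_dft, X_heur[300:][keep]])
y_aug = np.concatenate([y_dft, y_heur[300:][keep]])
reg = RandomForestRegressor(random_state=0).fit(X_aug, y_aug)
print("augmented training set size:", len(y_aug))
```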
Open Access Article
An Algebraic Approach to the Solutions of the Open Shop Scheduling Problem
Computation 2023, 11(5), 94; https://doi.org/10.3390/computation11050094 - 08 May 2023
Abstract
The open shop scheduling problem (OSSP) is one of the standard scheduling problems. It consists of scheduling jobs associated with a finite set of tasks developed by different machines. In this case, each machine processes at most one operation at a time, and the job processing order on the machines does not matter. The goal is to determine the completion times of the operations processed on the machines to minimize the largest job completion time, called Cmax. This paper proves that each OSSP has an associated path algebra, called a Brauer configuration algebra, whose representation theory (particularly its dimension and the dimension of its center) can be given using the corresponding Cmax value. It is also proved that the dimensions of the centers of Brauer configuration algebras associated with OSSPs with minimal Cmax are congruent modulo the number of machines.
Full article
(This article belongs to the Section Computational Engineering)
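The algebraic construction is the paper's contribution and is not reproduced here; for orientation, the sketch below computes the standard lower bound on Cmax for a small hypothetical open shop instance (the makespan can never beat the busiest job or the busiest machine).

```python
import numpy as np

# p[j, m] = processing time of job j on machine m (hypothetical instance).
p = np.array([
    [3, 2, 4],
    [1, 5, 2],
    [4, 1, 3],
])

job_loads = p.sum(axis=1)       # total work per job
machine_loads = p.sum(axis=0)   # total work per machine

# Classic OSSP bound: Cmax >= max(max job load, max machine load).
cmax_lower_bound = max(job_loads.max(), machine_loads.max())
print(cmax_lower_bound)  # 9
```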
Open Access Article
A Novel Finite Element Model for the Study of Harmful Vibrations on the Aging Spine
Computation 2023, 11(5), 93; https://doi.org/10.3390/computation11050093 - 05 May 2023
Abstract
The human spine is susceptible to a wide variety of adverse consequences from vibrations, including lower back discomfort. These effects are often seen in the drivers of vehicles, earth-moving equipment, and trucks, and also in those who drive for long hours in general. The human spine is composed of vertebrae, discs, and tissues that work together to provide it with a wide range of movements and significant load-carrying capability needed for daily physical exercise. However, there is a limited understanding of vibration characteristics in different age groups and the effect of vibration transmission in the spinal column, which may be harmful to the different sections. In this work, a novel finite element model (FEM) was developed to study the variation of vibration absorption capacity due to the aging effect of the different sections of the human spine. These variations were observed from the first three natural frequencies of the human spine structure, which were obtained by solving the eigenvalue problem of the novel finite element model for different ages. From the results, aging was observed to lead to an increase in the natural frequencies of all three spinal segments. As the age increased beyond 30 years, the natural frequency significantly increased for the thoracic segment, compared to the lumbar and cervical segments. A range of such novel findings indicated the harmful frequencies at which resonance may occur, causing spinal pain and possible injuries. This information would be indispensable for spinal surgeons for the prognosis of spinal column injury (SCI) patients affected by harmful vibrations from workplaces, as well as to manufacturers of automotive and aerospace equipment for designing effective dampers for better whole-body vibration mitigation.
Full article
(This article belongs to the Special Issue Application of Finite Element Methods)
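The natural frequencies referred to above come from a generalized eigenvalue problem of the form K φ = ω² M φ. A toy sketch with a three-degree-of-freedom lumped stand-in (the stiffness and mass values are purely illustrative, not the paper's spine model):

```python
import numpy as np
from scipy.linalg import eigh

# Toy 3-DOF chain as a stand-in for cervical/thoracic/lumbar segments;
# stiffness (N/m) and mass (kg) values are purely illustrative.
k1, k2, k3 = 5e4, 8e4, 6e4
K = np.array([[k1 + k2, -k2, 0.0],
              [-k2, k2 + k3, -k3],
              [0.0, -k3, k3]])
M = np.diag([6.0, 20.0, 15.0])

# Generalized eigenproblem K * phi = omega^2 * M * phi.
omega_sq, modes = eigh(K, M)
freqs_hz = np.sqrt(omega_sq) / (2 * np.pi)
print("first three natural frequencies [Hz]:", freqs_hz[:3])
```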
Open Access Article
The Cost of Understanding—XAI Algorithms towards Sustainable ML in the View of Computational Cost
Computation 2023, 11(5), 92; https://doi.org/10.3390/computation11050092 - 04 May 2023
Abstract
In response to socioeconomic development, the number of machine learning applications has increased, along with the calls for algorithmic transparency and further sustainability in terms of energy-efficient technologies. Modern computer algorithms that process large amounts of information, particularly artificial intelligence methods and their workhorse machine learning, can be used to promote and support sustainability; however, they consume a lot of energy themselves. This work focuses on and interconnects two key aspects of artificial intelligence regarding the transparency and sustainability of model development. We identify frameworks for measuring carbon emissions from Python algorithms and evaluate energy consumption during model development. Additionally, we test the impact of explainability on algorithmic energy consumption during model optimization, particularly for applications in health and, to expand the scope and achieve widespread use, civil engineering and computer vision. Specifically, we present three different models of classification, regression and object-based detection for the scenarios of cancer classification, building energy, and image detection, each integrated with explainable artificial intelligence (XAI) or feature reduction. This work can serve as a guide for selecting a tool to measure and scrutinize algorithmic energy consumption and raise awareness of emission-based model optimization by highlighting the sustainability of XAI.
Full article
(This article belongs to the Special Issue Intelligent Computing, Modeling and its Applications)
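One widely used open-source option for the measurement step is the codecarbon package; whether it is among the frameworks evaluated in the paper is an assumption here. A minimal sketch wrapping a training run:

```python
from codecarbon import EmissionsTracker
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=5000, n_features=30, random_state=0)

tracker = EmissionsTracker()   # estimates energy use of the enclosed code
tracker.start()
RandomForestClassifier(n_estimators=500).fit(X, y)   # the workload to audit
emissions_kg = tracker.stop()  # estimated kg CO2-equivalent
print(f"training emitted ~{emissions_kg:.6f} kg CO2eq")
```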
Open Access Article
Marine Predators Algorithm for Sizing Optimization of Truss Structures with Continuous Variables
Computation 2023, 11(5), 91; https://doi.org/10.3390/computation11050091 - 30 Apr 2023
Abstract
In this study, the newly developed Marine Predators Algorithm (MPA) is formulated to minimize the weight of truss structures. MPA is a swarm-based metaheuristic algorithm inspired by the efficient foraging strategies of marine predators in oceanic environments. In order to assess the robustness of the proposed method, three normal-sized structural benchmarks (10-bar, 60-bar, and 120-bar spatial dome) and three large-scale structures (272-bar, 942-bar, and 4666-bar truss tower) were selected from the literature. Results point to the inherent strength of MPA against all state-of-the-art metaheuristic optimizers implemented so far. Moreover, for the first time in the field, a quantitative evaluation and an answer to the age-old question of the proper convergence behavior (exploration vs. exploitation balance) in the context of structural optimization is conducted. Therefore, a novel dimension-wise diversity index is adopted as a methodology to investigate each of the two schemes. It was concluded that the balance that produced the best results was about 90% exploitation and 10% exploration (on average for the entire computational process).
Full article
(This article belongs to the Special Issue Applications of Evolutionary Computation: Past Success and Future Challenges)
Open Access Article
Informal Sector, ICT Dynamics, and the Sovereign Cost of Debt: A Machine Learning Approach
Computation 2023, 11(5), 90; https://doi.org/10.3390/computation11050090 - 28 Apr 2023
Abstract
We examine the main effects of ICT penetration and the shadow economy on sovereign credit ratings and the cost of debt, along with possible second-order effects between the two variables, on a dataset of 65 countries from 2001 to 2016. The paper presents a range of machine-learning approaches, including bagging, random forests, gradient-boosting machines, and recurrent neural networks. Furthermore, following recent trends in the emerging field of interpretable ML, based on model-agnostic methods such as feature importance and accumulated local effects, we attempt to explain which factors drive the predictions of the so-called ML black box models. We show that policies facilitating the penetration and use of ICT and aiming to curb the shadow economy may exert an asymmetric impact on sovereign ratings and the cost of debt depending on their present magnitudes, not only independently but also in interaction.
Full article
(This article belongs to the Special Issue Quantitative Finance and Risk Management Research)
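Of the model-agnostic methods mentioned, permutation feature importance has a direct scikit-learn implementation; the sketch below applies it to a boosted model on synthetic data (accumulated local effects require a separate package and are omitted). The data and feature indices are illustrative, not the study's panel.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the 65-country panel; features are anonymous here.
X, y = make_regression(n_samples=800, n_features=10, n_informative=4,
                       noise=5.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

model = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)

# Model-agnostic importance: shuffle one feature at a time and record
# how much the held-out score degrades.
result = permutation_importance(model, X_te, y_te, n_repeats=20,
                                random_state=0)
for i in result.importances_mean.argsort()[::-1][:4]:
    print(f"feature {i}: {result.importances_mean[i]:.3f}")
```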
Open Access Article
Preemptive Priority Markovian Queue Subject to Server Breakdown with Imperfect Coverage and Working Vacation Interruption
Computation 2023, 11(5), 89; https://doi.org/10.3390/computation11050089 - 27 Apr 2023
Abstract
This work considers a preemptive priority queueing system with vacation, where the single server may break down with imperfect coverage. Various combinations of server vacation priority queueing models have been studied by many scholars. A common assumption in these models is that the server will only resume its normal service rate after the vacation is over. However, this assumption is often too restrictive for real-world situations. Hence, in this study, the vacation is interrupted if a customer is waiting for service in the system at the moment a service is completed during the vacation. The stationary probability distribution is derived using the probability generating function approach. We also develop a variety of performance measures and provide a simple numerical example to illustrate them. Optimization analysis is finally carried out, including cost optimization and tri-objective optimization.
Full article
Open Access Communication
Understanding Antioxidant Abilities of Dihydroxybenzenes: Local and Global Electron Transfer Properties
Computation 2023, 11(5), 88; https://doi.org/10.3390/computation11050088 - 26 Apr 2023
Abstract
In the current work, the global (based on Koopmans' approximation) and local electron transfer characteristics of dihydroxybenzenes have been examined using density functional theory in order to understand their antioxidant activity. Our experimental and theoretical studies show that hydroquinone has better antioxidant activity than resorcinol and catechol. To identify the antioxidant sites of each dihydroxybenzene molecule, an average analytical Fukui analysis was used. The Fukui analysis demonstrates that the oxygen atoms of the dihydroxybenzenes serve as the antioxidant sites. The experimental and theoretical results are in good agreement with each other; therefore, our results are reliable.
Full article
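Koopmans' approximation gives the global descriptors directly from frontier orbital energies: I ≈ -E_HOMO, A ≈ -E_LUMO, chemical potential μ = -(I + A)/2, hardness η = (I - A)/2, and electrophilicity ω = μ²/(2η). A small sketch with placeholder orbital energies (not computed values for the three isomers):

```python
# Global conceptual-DFT descriptors from frontier orbital energies via
# Koopmans' approximation; the HOMO/LUMO values below are placeholders,
# not computed results for the dihydroxybenzenes.
def reactivity_descriptors(e_homo_ev, e_lumo_ev):
    I = -e_homo_ev                 # ionization potential
    A = -e_lumo_ev                 # electron affinity
    mu = -(I + A) / 2              # chemical potential
    eta = (I - A) / 2              # chemical hardness
    omega = mu**2 / (2 * eta)      # electrophilicity index
    return {"I": I, "A": A, "mu": mu, "eta": eta, "omega": omega}

print(reactivity_descriptors(e_homo_ev=-5.9, e_lumo_ev=-0.4))
```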
Open Access Article
Topological Optimization of Interconnection of Multilayer Composite Structures
Computation 2023, 11(5), 87; https://doi.org/10.3390/computation11050087 - 25 Apr 2023
Abstract
This study focuses on the topological optimization of adhesive overlap joints for structures subjected to longitudinal mechanical loads. The aim is to reduce peak stresses at the joint interface of the elements. Peak stresses in such joints can lead to failure of both the joint and the structure itself. A new approach based on Rational Approximation of Material Properties (RAMP) and the Finite Element Method (FEM) has been proposed to minimize peak stresses in multi-layer composite joints. Using this approach, the von Mises peak stresses of the optimal structural joint have been significantly reduced, by up to 50%, under mechanical loading in the longitudinal direction. The paper includes numerical examples of different types of structural element connections.
Full article
(This article belongs to the Special Issue Application of Finite Element Methods)
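The RAMP scheme interpolates stiffness as E(ρ) = E_min + ρ/(1 + q(1 - ρ)) · (E_0 - E_min), penalizing intermediate densities so the optimizer is pushed toward a crisp material/void layout. A small sketch (the q value and moduli are illustrative):

```python
def ramp_stiffness(rho, e0=1.0, e_min=1e-9, q=8.0):
    """RAMP interpolation: map a density design variable rho in [0, 1]
    to an effective stiffness, penalizing intermediate densities."""
    return e_min + rho / (1 + q * (1 - rho)) * (e0 - e_min)

for rho in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"{rho:.2f} -> {ramp_stiffness(rho):.4f}")
# Intermediate densities receive disproportionately low stiffness, favoring
# a crisp adhesive/no-adhesive layout at the joint interface.
```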
Journal Menu
- Computation Home
- Aims & Scope
- Editorial Board
- Reviewer Board
- Topical Advisory Panel
- Instructions for Authors
- Special Issues
- Topics
- Sections
- Article Processing Charge
- Indexing & Archiving
- Most Cited & Viewed
- Journal Statistics
- Journal History
- Journal Awards
- Conferences
- Editorial Office
- 10th Anniversary of Computation
Journal Browser
Highly Accessed Articles
Latest Books
E-Mail Alert
News
Topics
Topic in
Entropy, Algorithms, Computation, Fractal Fract
Computational Complex Networks
Topic Editors: Alexandre G. Evsukoff, Yilun Shang
Deadline: 30 June 2023
Topic in
Axioms, Computation, Dynamics, Mathematics, Symmetry
Structural Stability and Dynamics: Theory and Applications
Topic Editors: Harekrushna Behera, Chia-Cheng Tsai, Jen-Yi Chang
Deadline: 30 September 2023
Topic in
Entropy, Algorithms, Computation, MAKE, Energies, Materials
Artificial Intelligence and Computational Methods: Modeling, Simulations and Optimization of Complex Systems
Topic Editors: Jaroslaw Krzywanski, Yunfei Gao, Marcin Sosnowski, Karolina Grabowska, Dorian Skrobek, Ghulam Moeen Uddin, Anna Kulakowska, Anna Zylka, Bachil El Fil
Deadline: 20 October 2023
Topic in
Applied Sciences, BioMedInformatics, BioTech, Genes, Computation
Computational Intelligence and Bioinformatics (CIB)
Topic Editors: Marco Mesiti, Giorgio Valentini, Elena Casiraghi, Tiffany J. Callahan
Deadline: 31 October 2023
Conferences
Special Issues
Special Issue in
Computation
Computational Biology and High-Performance Computing
Guest Editor: Amal Khalifa
Deadline: 30 May 2023
Special Issue in
Computation
Intelligent Computing, Modeling and its Applications
Guest Editors: José Antonio Marmolejo Saucedo, Joshua Thomas, Leo Mrsic, Román Rodríguez Aguilar, Pandian Vasant
Deadline: 30 June 2023
Special Issue in
Computation
Computation to Fight SARS-CoV-2 (CoVid-19)
Guest Editors: Simone Brogi, Vincenzo Calderone
Deadline: 31 July 2023
Special Issue in
Computation
Applications of Statistics and Machine Learning in Electronics
Guest Editors: Stefan Hensel, Marin B. Marinov, Malinka Ivanova, Maya Dimitrova, Hiroaki Wagatsuma
Deadline: 31 August 2023