Peer-Review Record

Data-Driven Bayesian Network Learning: A Bi-Objective Approach to Address the Bias-Variance Decomposition

by Vicente-Josué Aguilera-Rueda *, Nicandro Cruz-Ramírez and Efrén Mezura-Montes
Reviewer 1: Anonymous
Reviewer 2: Anonymous
Math. Comput. Appl. 2020, 25(2), 37; https://doi.org/10.3390/mca25020037
Submission received: 30 May 2020 / Revised: 19 June 2020 / Accepted: 19 June 2020 / Published: 20 June 2020
(This article belongs to the Special Issue New Trends in Computational Intelligence and Applications)

Round 1

Reviewer 1 Report

This study presents a novel bi-objective approach to address the data-driven learning problem of Bayesian networks. In general, the study is interesting, and I would recommend it to be published with minor revisions.

Several detailed comments are listed below.

1. The computational expense, e.g., the computational time consumed, is also an important criterion for MOP method comparison.

2. Line 186, “The solution with the shortest Euclidean distance is referred…” This method is formally known as LINMAP in the multi-criteria decision-making research field, which may need to be mentioned. For your reference:

Jing, R.; Wang, M.; Zhang, Z.; et al. Comparative study of posteriori decision-making methods when designing building integrated energy systems with multi-objectives. Energy and Buildings 2019, 194, 123–139. doi:10.1016/j.enbuild.2019.04.023.

Author Response

For reviewer 1
Dear reviewer, thank you for your time to read our work and valuable comments.

Point 1. The computational expense, e.g., the computational time consumed, is also an important criterion for MOP method comparison.

Response 1. Concerning the computational expense, we added the computational cost of our NS2BN algorithm using big O notation, as can be seen on page 5, line 166.


Point 2. Line 186, “The solution with the shortest Euclidean distance is referred...” This method is formally known as LINMAP in the multi-criteria decision-making research field, which may need to be mentioned. For your reference:
Jing, R.; Wang, M.; Zhang, Z.; et al. Comparative study of posteriori decision-making methods when designing building integrated energy systems with multi-objectives. Energy and Buildings 2019, 194, 123–139. doi:10.1016/j.enbuild.2019.04.023.

Response 2. As you suggest, we added the missing reference on this preference-handling method, called LINMAP, as can be seen on page 6, line 189.
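
For illustration only, a minimal Python sketch of this selection rule follows (this is not the implementation from the manuscript; the function name, the NumPy representation of the front, and the min-max normalization step are our own assumptions). Each objective is normalized and the non-dominated solution closest, in Euclidean distance, to the ideal point is selected, which is the LINMAP rule mentioned above.

import numpy as np

def linmap_select(front):
    """Return the index of the compromise solution on a Pareto front.

    front: array-like of shape (n_solutions, n_objectives), all objectives
    assumed to be minimized. The objectives are min-max normalized and the
    solution with the shortest Euclidean distance to the ideal point
    (the origin after normalization) is chosen, i.e. the LINMAP rule.
    """
    front = np.asarray(front, dtype=float)
    lo, hi = front.min(axis=0), front.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)  # guard against constant objectives
    norm = (front - lo) / span
    dist = np.linalg.norm(norm, axis=1)     # distance to the ideal point
    return int(np.argmin(dist))

# Hypothetical front with two objectives to minimize (e.g. the two MDL terms):
pareto_front = [[0.10, 0.90],
                [0.40, 0.40],
                [0.90, 0.05]]
print(linmap_select(pareto_front))  # -> 1, the balanced compromise solution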

Reviewer 2 Report

The paper is extremely well written.  A few thoughts came to my mind while reading the paper.

  1. A single-objective method may work well when we are interested only in the efficiency and accuracy of our goal, but that is often not the case in practice. So, the bi-objective method is more realistic than the single-objective method in data analysis, and the multi-objective method is even better. The bias-variance trade-off originated in the statistics literature. A key reference is easy to find and should be included in the paper.
  2. Equation (3) with MDL immediately reminds me of the Bayesian Information Criterion (BIC) developed by Schwarz, which is widely used in practice. Would you please look into the difference between them? I do not find any difference, only commonality. Your finding should be included in the paper.

Author Response

For reviewer 2
Dear reviewer, thank you for your time to read our work and valuable comments.

Point 1. A single-objective method may work well when we are interested only in the efficiency and accuracy of our goal, but that is often not the case in practice. So, the bi-objective method is more realistic than the single-objective method in data analysis, and the multi-objective method is even better. The bias-variance trade-off originated in the statistics literature. A key reference is easy to find and should be included in the paper.

Response 1. As you suggest, we complemented the references for the bias-variance decomposition, as can be seen on page 2, line 34. Some of the most representative references on this topic are listed below (a standard form of the decomposition is sketched after them):

Geman, S.; Bienenstock, E.L.; Doursat, R. Neural Networks and the Bias/Variance Dilemma. Neural Computation 1992, 4, 1–58.

Friedman, J.H. On Bias, Variance, 0/1-Loss, and the Curse-of-Dimensionality. Data Min. Knowl. Discov. 1997, 1, 55–77. doi:10.1023/A:1009778005914.

Myung, I.J. The Importance of Complexity in Model Selection. Journal of Mathematical Psychology 2000, 44, 190–204. doi:10.1006/jmps.1999.1283.

Hastie, T.; Tibshirani, R.; Friedman, J., The Elements of Statistical Learning; Springer Series in Statistics, Springer New York Inc.: New York, NY, USA, 2001; pp. 37–38.
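
For completeness, a standard form of the decomposition for squared error is sketched below (following textbooks such as Hastie et al.; the notation is ours and does not come from the manuscript). Here y = f(x_0) + ε with E[ε] = 0, Var(ε) = σ², and the estimator \hat{f} is fitted on a random training set:

% Bias-variance decomposition of the expected squared prediction error at x_0
\[
  \mathbb{E}\!\left[\bigl(y - \hat{f}(x_0)\bigr)^2\right]
  = \underbrace{\bigl(\mathbb{E}[\hat{f}(x_0)] - f(x_0)\bigr)^2}_{\text{bias}^2}
  + \underbrace{\mathbb{E}\!\left[\bigl(\hat{f}(x_0) - \mathbb{E}[\hat{f}(x_0)]\bigr)^2\right]}_{\text{variance}}
  + \underbrace{\sigma^2}_{\text{irreducible error}}
\]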

Point 2. Equation (3) with MDL immediately reminds me of the Bayesian Information Criterion (BIC) developed by Schwarz, which is widely used in practice. Would you please look into the difference between them? I do not find any difference, only commonality. Your finding should be included in the paper.

Response 2. As you note, the Minimum Description Length (MDL) criterion aims to identify the model that provides the shortest description of the data set. Although the principles behind the Bayesian Information Criterion (BIC) are different, in practice some authors state that MDL is simply the additive inverse of BIC. We added this explanation to the manuscript on page 2, lines 39-42 (see also the sketch after the references below).

Neapolitan, R.E. Learning Bayesian Networks; Pearson Prentice Hall: New Jersey, USA, 2004; p. 624.

Hastie, T.; Tibshirani, R.; Friedman, J. The Elements of Statistical Learning; Springer: New York, NY, USA, 2001; p. 235.
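
For illustration, the relationship noted in Response 2 can be sketched with the common two-part forms (our notation: \hat{L} is the maximized likelihood, k the number of free parameters, n the sample size). This uses Schwarz's original formulation of BIC, which is maximized; texts that rescale BIC by -2 turn the sign relation into a proportionality instead.

% Two-part MDL score (minimized) versus Schwarz's BIC (maximized)
\[
  \mathrm{MDL}(M) = -\log \hat{L} + \frac{k}{2}\log n,
  \qquad
  \mathrm{BIC}(M) = \log \hat{L} - \frac{k}{2}\log n,
  \qquad
  \mathrm{MDL}(M) = -\,\mathrm{BIC}(M).
\]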

Thank you for reviewing our manuscript.
