# Accuracy Analysis Mechanism for Agriculture Data Using the Ensemble Neural Network Method


## Abstract


## 1. Introduction

## 2. Research Background and Related Work

### 2.1. Data Mining Methods

#### 2.1.1. Cluster Analysis

#### 2.1.2. Classification

#### 2.1.3. Statistical Analysis

### 2.2. Agricultural Production Forecasting

### 2.3. Stepwise Regression

### 2.4. Back-Propagation Neural Network

- (1) Setting the parameters (e.g., the neural network structure, learning rate, etc.) of the BPN.
- (2) Setting the weights (e.g., W_{i,j} in Figure 1) among the neurons in the BPN.
- (3)
- (4) Calculating the output value of each neuron in the hidden layer in accordance with the inputs, and then the output value of the neuron in the output layer (e.g., Y_{j} in Figure 1).
- (5) Evaluating the error rate between the predicted output and the actual output.
- (6) Evaluating the error rates among the value of the output neuron, the output values of the neurons in the hidden layer, and the values of the input neurons.
- (7) Updating the weights of the neurons in accordance with the error rates.
- (8) Repeating Steps (4)–(7) until convergence.
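As a rough illustration only (not the paper's implementation), the steps above can be sketched in Java, the language used in the experiments, for a single-hidden-layer network trained on toy XOR data. The network structure, learning rate, random seed, and training data are all illustrative assumptions:

```java
import java.util.Random;

// Minimal back-propagation sketch of Steps (1)-(8): one hidden layer,
// sigmoid activations, toy XOR data. All hyperparameters are assumptions.
public class BpnSketch {

    static double sigmoid(double s) { return 1.0 / (1.0 + Math.exp(-s)); }

    // Trains on XOR; returns {mean squared error early, after training}.
    public static double[] trainXor() {
        double[][] x = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
        double[] t = {0, 1, 1, 0};
        int in = 2, hid = 3;
        double lr = 0.5;                              // Step (1): parameters
        Random rnd = new Random(7);
        double[][] w1 = new double[in][hid];          // Step (2): weights
        double[] b1 = new double[hid], w2 = new double[hid];
        double b2 = 0;
        for (int i = 0; i < in; i++)
            for (int j = 0; j < hid; j++) w1[i][j] = rnd.nextGaussian();
        for (int j = 0; j < hid; j++) w2[j] = rnd.nextGaussian();

        double mseBefore = -1, mseAfter = 0;
        for (int epoch = 0; epoch < 20000; epoch++) { // Step (8): repeat
            double sse = 0;
            for (int n = 0; n < x.length; n++) {
                double[] h = new double[hid];         // Step (4): forward pass
                for (int j = 0; j < hid; j++) {
                    double s = b1[j];
                    for (int i = 0; i < in; i++) s += x[n][i] * w1[i][j];
                    h[j] = sigmoid(s);
                }
                double s2 = b2;
                for (int j = 0; j < hid; j++) s2 += h[j] * w2[j];
                double y = sigmoid(s2);
                sse += (y - t[n]) * (y - t[n]);       // Step (5): output error
                double dy = (y - t[n]) * y * (1 - y);
                for (int j = 0; j < hid; j++) {       // Step (6): hidden error
                    double dh = dy * w2[j] * h[j] * (1 - h[j]);
                    w2[j] -= lr * dy * h[j];          // Step (7): update
                    b1[j] -= lr * dh;
                    for (int i = 0; i < in; i++) w1[i][j] -= lr * dh * x[n][i];
                }
                b2 -= lr * dy;
            }
            if (epoch == 0) mseBefore = sse / x.length;
            mseAfter = sse / x.length;
        }
        return new double[]{mseBefore, mseAfter};
    }
}
```

The repeated forward pass, error evaluation, and weight update correspond directly to Steps (4)–(7); convergence is approximated here by a fixed epoch budget.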

## 3. Materials and Methods

### 3.1. Data Collection Mechanism

The average of the relative humidity during the j-th month is denoted as x_{1,j}, and the mean and standard deviation of the relative humidity in the historical dataset can be calculated by Equations (1) and (2), respectively. Then the normalized average of the relative humidity during the j-th month can be expressed as x̄_{1,j} by Equation (3).
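Equations (1)–(3) themselves are not reproduced in this excerpt, but the description matches standard z-score normalization; under that assumption they take the following form, writing the historical mean and standard deviation of the relative humidity as μ₁ and σ₁ and the normalized value as x̄_{1,j}:

```latex
\mu_1 = \frac{1}{m}\sum_{j=1}^{m} x_{1,j} \qquad (1)
\sigma_1 = \sqrt{\frac{1}{m}\sum_{j=1}^{m}\left(x_{1,j} - \mu_1\right)^2} \qquad (2)
\bar{x}_{1,j} = \frac{x_{1,j} - \mu_1}{\sigma_1} \qquad (3)
```

where m is the number of months in the historical dataset.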

### 3.2. Stepwise Multiple Regression Mechanism

### 3.3. Ensemble Neural Network Analysis Mechanism

#### 3.3.1. Learning Stage

#### 3.3.2. Recall Stage

#### 3.3.3. Prediction Stage

## 4. Analyses of Experimental Results

### 4.1. Experimental Environments

### 4.2. Experimental Results and Discussion

#### 4.2.1. Regression Analysis of Experimental Results

#### 4.2.2. Experimental Results of Traditional Back-Propagation Neural Network Analysis

#### 4.2.3. Ensemble Neural Network Analysis of Experimental Results

## 5. Conclusions and Future Work

## Acknowledgments

## Author Contributions

## Conflicts of Interest


**Figure 3.** Architecture of the accuracy analysis mechanism for agricultural data based on the ENN method.

| Type | Algorithm |
|---|---|
| Segmentation-based | K-means, K-medoids, K-modes, PAM (partitioning around medoids), CLARANS (clustering large applications based on randomized search), CLARA (clustering large applications), FCM (fuzzy c-means) |
| Hierarchical-based | BIRCH (balanced iterative reducing and clustering using hierarchies), CURE (clustering using representatives), ROCK (robust clustering algorithm), Chameleon, Echidna |
| Density-based | DBSCAN (density-based spatial clustering of applications with noise), OPTICS (ordering points to identify the clustering structure), DBCLASD (density-based clustering algorithms for mining large spatial databases), DENCLUE (density-based clustering) |
| Grid-based | Wave-Cluster, STING (statistical information grid), CLIQUE (clustering in quest), OptiGrid (optimal grid-clustering) |
| Model-based | EM (expectation maximization), Cobweb, CLASSIT (a robust concept formation system), SOMs (self-organizing maps) |
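As a concrete instance of the segmentation-based family in the table above, here is a minimal one-dimensional K-means sketch. The initialization strategy (first k points), the data, and the iteration count are illustrative assumptions, not tied to any algorithm configuration in the paper:

```java
import java.util.Arrays;

// Minimal 1-D K-means: alternate between assigning each point to its
// nearest centroid and recomputing each centroid as the mean of its points.
public class KMeansSketch {
    public static double[] kmeans(double[] data, int k, int iters) {
        double[] c = Arrays.copyOf(data, k);          // initial centroids
        int[] assign = new int[data.length];
        for (int it = 0; it < iters; it++) {
            // Assignment step: nearest centroid wins
            for (int i = 0; i < data.length; i++) {
                int best = 0;
                for (int j = 1; j < k; j++)
                    if (Math.abs(data[i] - c[j]) < Math.abs(data[i] - c[best]))
                        best = j;
                assign[i] = best;
            }
            // Update step: centroid = mean of assigned points
            double[] sum = new double[k];
            int[] cnt = new int[k];
            for (int i = 0; i < data.length; i++) {
                sum[assign[i]] += data[i];
                cnt[assign[i]]++;
            }
            for (int j = 0; j < k; j++)
                if (cnt[j] > 0) c[j] = sum[j] / cnt[j];
        }
        Arrays.sort(c);                               // deterministic order
        return c;
    }
}
```

For two well-separated groups of values, the two returned centroids settle on the group means within a few iterations.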

| Parameter | Description |
|---|---|
| x_{1} | Relative humidity |
| x_{2} | Precipitation |
| x_{3} | Planting area |
| x_{4} | Air temperature |
| x_{5} | Cost of production |
| x_{6} | Market trading price |
| Y | Total harvest |
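The stepwise multiple regression mechanism of Section 3.2 repeatedly fits linear models over subsets of the predictors in the table above. Its core fitting step is ordinary least squares for a model of the form Y = b₀ + b₁x₁ + … ; the sketch below solves the normal equations (XᵀX)b = Xᵀy by Gauss–Jordan elimination. The solver and any test data are illustrative assumptions, not the paper's procedure or dataset:

```java
// Ordinary least squares with an intercept term: builds the normal
// equations (X^T X) b = X^T y and solves them by Gauss-Jordan elimination
// with partial pivoting. b[0] is the intercept, b[1..] the coefficients.
public class OlsSketch {
    public static double[] fit(double[][] X, double[] y) {
        int n = X.length, p = X[0].length + 1;        // +1 for intercept
        double[][] A = new double[n][p];
        for (int i = 0; i < n; i++) {
            A[i][0] = 1.0;                            // intercept column
            System.arraycopy(X[i], 0, A[i], 1, p - 1);
        }
        double[][] M = new double[p][p];              // M = A^T A
        double[] v = new double[p];                   // v = A^T y
        for (int j = 0; j < p; j++) {
            for (int k = 0; k < p; k++)
                for (int i = 0; i < n; i++) M[j][k] += A[i][j] * A[i][k];
            for (int i = 0; i < n; i++) v[j] += A[i][j] * y[i];
        }
        // Gauss-Jordan elimination with partial pivoting
        for (int col = 0; col < p; col++) {
            int piv = col;
            for (int r = col + 1; r < p; r++)
                if (Math.abs(M[r][col]) > Math.abs(M[piv][col])) piv = r;
            double[] tmpRow = M[col]; M[col] = M[piv]; M[piv] = tmpRow;
            double tmpVal = v[col]; v[col] = v[piv]; v[piv] = tmpVal;
            for (int r = 0; r < p; r++) {
                if (r == col) continue;
                double f = M[r][col] / M[col][col];
                for (int k = col; k < p; k++) M[r][k] -= f * M[col][k];
                v[r] -= f * v[col];
            }
        }
        double[] b = new double[p];
        for (int j = 0; j < p; j++) b[j] = v[j] / M[j][j];
        return b;
    }
}
```

A stepwise procedure would wrap this fit: add (or drop) one predictor at a time and keep the change only if it significantly improves the fit.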

| Parameter | Status |
|---|---|
| Neurons of input layer | Known (learning data set) |
| Weights | Unknown (learned through constant learning and revision) |
| Neurons of output layer | Known |

| Parameter | Status |
|---|---|
| Neurons of input layer | Known (testing data set) |
| Weights | Known (learned during the learning stage) |
| Neurons of output layer | Unknown (used to verify the accuracy of the model output) |

| Item | Specification |
|---|---|
| Operating System | Windows 7 64-bit |
| Processor | Intel(R) Core(TM) i5, 1.6 GHz |
| Random Access Memory | 4 GB |
| Integrated Development Environment | Eclipse |
| Programming Language | Java SE 1.8 JDK |
| Statistics Tool | IBM SPSS Statistics 22.0 |

| Run | Model | Number of Hidden Layers | Neurons per Hidden Layer | Correct Rate |
|---|---|---|---|---|
| First | Model 1 | 5 | {1,3,1,2,1} | 90.81% |
| | Model 2 | 5 | {2,4,5,5,1} | 86.70% |
| | Model 3 | 5 | {5,1,1,4,5} | 88.10% |
| | Model 4 | 1 | {3} | 89.87% |
| | Model 5 | 3 | {1,3,2} | 93.30% |
| Second | Model 1 | 5 | {2,5,3,1,5} | 86.03% |
| | Model 2 | 5 | {4,1,3,3,4} | 90.74% |
| | Model 3 | 2 | {2,5} | 82.99% |
| | Model 4 | 1 | {2} | 83.50% |
| | Model 5 | 1 | {3} | 90.01% |
| Third | Model 1 | 4 | {5,1,5,5} | 90.70% |
| | Model 2 | 4 | {5,4,4,2} | 93.81% |
| | Model 3 | 4 | {2,5,4,5} | 87.27% |
| | Model 4 | 1 | {3} | 94.75% |
| | Model 5 | 1 | {5} | 94.29% |
| Fourth | Model 1 | 5 | {5,3,3,2,5} | 85.93% |
| | Model 2 | 1 | {1} | 87.63% |
| | Model 3 | 3 | {3,2,3} | 90.06% |
| | Model 4 | 2 | {2,3} | 86.95% |
| | Model 5 | 4 | {5,1,4,3} | 94.46% |
| Fifth | Model 1 | 2 | {4,2} | 88.16% |
| | Model 2 | 3 | {4,3,4} | 91.02% |
| | Model 3 | 5 | {5,3,1,3,5} | 90.53% |
| | Model 4 | 3 | {2,5,2} | 89.85% |
| | Model 5 | 1 | {5} | 88.67% |
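The table above reports five independently trained BPN models per run. One common way to fuse such models into a single ensemble prediction is output averaging; the aggregation rule below is a generic assumption for illustration, not necessarily the paper's exact ENN combination:

```java
// Generic ensemble combiner: averages the outputs of several trained
// models for the same input. The averaging rule is an assumption.
public class EnsembleSketch {
    public interface Model {
        double predict(double[] x);
    }

    public static double ensemble(Model[] models, double[] x) {
        double sum = 0;
        for (Model m : models) sum += m.predict(x);
        return sum / models.length;
    }
}
```

In practice each `Model` would wrap one of the trained BPNs, so the ensemble output smooths out the variance visible in the per-model correct rates above.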

© 2016 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC-BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Kung, H.-Y.; Kuo, T.-H.; Chen, C.-H.; Tsai, P.-Y.
Accuracy Analysis Mechanism for Agriculture Data Using the Ensemble Neural Network Method. *Sustainability* **2016**, *8*, 735.
https://doi.org/10.3390/su8080735
