# Cross-Voting SVM Method for Multiple Vehicle Classification in Wireless Sensor Networks


## Abstract


## 1. Introduction

## 2. Comparison of Two-Class Classifiers

#### 2.1. Datasets and Feature Extraction Method

#### 2.2. Comparison of Performance of Different Two-Class Classifiers

## 3. Comparison of Multi-Class SVM Classifiers

#### 3.1. M-RLP SVM Method

**w** is the parameter of the classification hyperplane, x_i is the classification sample point on the boundary, and y_i is the label, whose value is either −1 or +1. T is the matrix transpose symbol, and the constant c is the distance from the origin to the hyperplane. ξ_i is the slack variable introduced when the i-th sample is linearly inseparable.

**w**_i and **w**_j are the parameters of the piecewise linear margin, and K is the number of data categories to be classified; the detailed derivation is available in Reference [26]. This method classifies a multi-class dataset with a piecewise nonlinear classification function constructed by a single quadratic program.

However, the computational complexity of solving this single quadratic program grows as O(N^M), where M is the number of categories of targets to be classified. For WSN nodes with insufficient computing capacity, a multi-target classification algorithm based on multiple two-class classifiers is more appropriate.
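As an illustrative, hedged sketch of this complexity argument (the function names are ours, not the paper's), the snippet below contrasts the O(N^M) cost of the single M-RLP quadratic program with the M(M − 1)/2 small two-class classifiers that a one-against-one scheme trains instead:

```python
from math import comb

def mrlp_qp_cost(n_samples: int, n_classes: int) -> int:
    """Order-of-magnitude cost of the single M-RLP quadratic program,
    which the text gives as O(N^M) for N samples and M classes."""
    return n_samples ** n_classes

def pairwise_classifier_count(n_classes: int) -> int:
    """Number of small two-class classifiers trained by one-against-one
    schemes such as DAGSVM: M(M - 1)/2."""
    return comb(n_classes, 2)

# Even at modest sizes the gap is large: 4 vehicle classes need only
# 6 two-class classifiers, while the single-QP cost grows like N^4.
print(pairwise_classifier_count(4), mrlp_qp_cost(10, 4))
```

The comparison is only a scaling argument; actual runtimes depend on the QP solver, but it motivates why sensor nodes favor ensembles of small classifiers.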

#### 3.2. DAGSVM

P_ABCD(A) denotes the classification accuracy of A, which is also the probability of accurately determining A from among A, B, C, and D. Similarly, the classification accuracies of B, C, and D are denoted as P_ABCD(B), P_ABCD(C), and P_ABCD(D), respectively, and are calculated by Formulas (9)–(12).

P_AD(A) is the classification accuracy of A in the one-against-one classifier (A/D), and P_AC(A) is the classification accuracy of A in the one-against-one classifier (A/C). P_AD(A|B) represents the probability that B is classified as A when using the one-against-one classifier (A/D), and P_BD(D|C) represents the probability that C is classified as D when using the one-against-one classifier (B/D). Formulas (9)–(12) reveal that the target classification accuracy of the DAGSVM algorithm depends on the number of categories N and on the classification accuracy of the two-class classifiers at the nodes of the directed acyclic graph. Top-down error accumulation degrades the classification performance of this method. For example, the classification accuracy of A, P_ABCD(A), is determined by the classification accuracies of three two-class classifiers, P_AD(A), P_AC(A), and P_AB(A), and the classification error of each two-class classifier reduces the final classification accuracy P_ABCD(A). The most important factor affecting the classification accuracy of a two-class classifier is the geometric margin between the training features of the two target types: the larger the geometric margin, the smaller the upper bound on the classification error rate, i.e., the higher the accuracy that can be achieved by optimizing the classifier parameters.
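The top-down error accumulation can be made concrete with a short sketch (names are illustrative): the accuracy along one DAG path is the product of the per-node two-class accuracies, so it is always below the worst node.

```python
from math import prod

def dag_path_accuracy(node_accuracies):
    """Classification accuracy along one DAGSVM path: the product of the
    two-class accuracies at each visited node, e.g.
    P_ABCD(A) = P_AD(A) * P_AC(A) * P_AB(A)."""
    return prod(node_accuracies)

# Three individually strong nodes still compound into a lower final accuracy.
p_final = dag_path_accuracy([0.99, 0.97, 0.95])
print(p_final)  # below 0.95, the weakest node
```

Each additional level of the directed acyclic graph multiplies in another factor below 1, which is why accuracy drops as the category number grows.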

#### 3.3. Binary-Tree SVM

P_{A∪C, B∪D}(A ∪ C) represents the probability that target A or C is classified as A ∪ C when using the two-class classifier (A ∪ C/B ∪ D). P_AC(A) denotes the classification accuracy of target A when using the two-class classifier (A/C).

## 4. Voting Comparison Method

r_AB and r_CD are the results of the classifier (A ∪ B/C ∪ D). If the classification result of the new dataset is A ∪ B, then r_AB is equal to 1 and r_CD is equal to 0. P_AB is the voting weight. At the end of the algorithm, the votes for all candidate results are counted, and the class with the most votes is the final classification result of the data to be classified. When all classifiers are treated equally, the voting weights reduce to P_AB = P_CD = P_AC = P_BD = P_AD = P_BC = 1.
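A small pure-Python sketch (function and variable names are ours) shows why unit weights are fragile: whenever the three two-class classifiers disagree, several classes tie with two votes each, which is the condition where classification is unfeasible.

```python
def unit_weight_votes(r_ab: int, r_ac: int, r_ad: int) -> dict:
    """Vote tally for the three classifiers with all weights set to 1.
    r_ab = 1 if classifier (A∪B/C∪D) picks the A∪B side, etc.; the
    complementary results r_cd, r_bd, r_bc follow as 1 - r_xy."""
    return {
        "A": r_ab + r_ac + r_ad,
        "B": r_ab + (1 - r_ac) + (1 - r_ad),
        "C": (1 - r_ab) + r_ac + (1 - r_ad),
        "D": (1 - r_ab) + (1 - r_ac) + r_ad,
    }

# Classifiers 1 and 2 favor A, but classifier 3 picks the B∪C side:
# A, B, and C tie at two votes each, so unit weights cannot decide.
print(unit_weight_votes(1, 1, 0))
```

Only a unanimous outcome produces a unique winner under unit weights; distance-based voting weights are what break such ties.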

#### 4.1. Class Distance and Classification Accuracy of Two-Class SVM Classifiers

#### 4.2. Distance of Training Dataset and Voting Weight

**A** and **B**, which are N × M matrices, represent the feature matrices of the acoustic signals of two targets. The number of rows N denotes the number of segments that the target signal corresponding to **A** was divided into. When using the WCER method to extract features from a signal sequence, segmentation is performed every 1024 points before the features are extracted. M is the number of variables in each row of the matrix, which is also twice the wavelet decomposition depth in the WCER feature extraction method. The parameter w_ai represents the i-th eigenvector of the feature matrix **A**, and w_bi represents the i-th eigenvector of the feature matrix **B**. The different columns of the matrix reflect the characteristics of the target signal in different dimensions. Therefore, when calculating the distance between two feature matrices, the distance between the feature variables in the same column of the two feature matrices is first calculated using Equation (18).

a_ij is the j-th variable in the i-th eigenvector of the feature matrix **A**, and b_ij is the j-th variable in the i-th eigenvector of the feature matrix **B**. There are M eigenvectors (**w**_1, …, **w**_M) in the feature matrix of the target signal. After calculating the distances between the eigenvectors of the feature matrices, it is also necessary to quantify the distance between the feature matrices themselves, which is the distance between the target signals; this matrix distance is given by Equation (19).

N_B is the number of rows in the feature matrix **B**, and N_C is the number of rows in the feature matrix **C**, which is also the number of segments of the target signal corresponding to the feature matrix **C**. Using Equation (20) for further derivation, we can obtain the distance between the union dataset (A ∪ D) and the merged dataset (B ∪ C), where α_{A,D} and α_{D,A} are the adjustment coefficients and can be calculated by Equation (21).
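Since Equations (18)–(21) are not reproduced above, the sketch below assumes one plausible form of the column and matrix distances: the mean absolute difference over all value pairs of a column, averaged over the M columns. The function names and the exact distance form are our assumptions, not necessarily the paper's definitions.

```python
def column_distance(col_a, col_b):
    """Assumed stand-in for Equation (18): mean absolute difference
    between every pair of values drawn from the same feature column
    of the two matrices."""
    return sum(abs(a - b) for a in col_a for b in col_b) / (len(col_a) * len(col_b))

def matrix_distance(mat_a, mat_b):
    """Assumed stand-in for Equation (19): distance between two
    N_a x M and N_b x M feature matrices as the average of the
    per-column distances over the M feature dimensions."""
    cols_a, cols_b = list(zip(*mat_a)), list(zip(*mat_b))
    return sum(column_distance(ca, cb) for ca, cb in zip(cols_a, cols_b)) / len(cols_a)

# Two constant matrices one unit apart in every feature dimension:
print(matrix_distance([[0.0, 0.0], [0.0, 0.0]], [[1.0, 1.0], [1.0, 1.0]]))
```

Whatever the exact form, the key property used later is that a larger training-set distance correlates with higher two-class accuracy, so these distances can serve as voting weights.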

#### 4.3. Average Segmentation of Training Data

#### 4.4. Cross-Voting SVM Classification Algorithm

**Algorithm 1** Cross-Voting Algorithm for 4-Target Classification

Input: Training dataset: feature matrices {AFT, BFT, CFT, DFT} of 4 target signals (A, B, C, D); feature vector X of an unknown target.

Output: Vehicle type of feature vector X.

(1) Use the method shown in Figure 8 to segmentally average the training data, obtaining feature matrices of the different target signals for training the classifiers: {(AFT1, AFT2, AFT3); (BFT1, BFT2, BFT3); (CFT1, CFT2, CFT3); (DFT1, DFT2, DFT3)}.

(2) Use the obtained feature matrices to construct the training datasets and train three classifiers separately: use ((AFT1 ∪ BFT1), (CFT1 ∪ DFT1)) to train two-class classifier 1; use ((AFT2 ∪ CFT2), (BFT2 ∪ DFT2)) to train two-class classifier 2; use ((AFT3 ∪ DFT3), (BFT3 ∪ CFT3)) to train two-class classifier 3.

(3) Calculate the distance between the training datasets of the three classifiers using Equation (22): Dist(1) = D((AFT1 ∪ BFT1), (CFT1 ∪ DFT1)); Dist(2) = D((AFT2 ∪ CFT2), (BFT2 ∪ DFT2)); Dist(3) = D((AFT3 ∪ DFT3), (BFT3 ∪ CFT3)). Then calculate the voting weight of each classifier result, P(i) (i = 1, 2, 3), using Equations (23) and (24).

(4) Classify the feature vector X using the three trained classifiers (classifier 1, classifier 2, and classifier 3) and obtain the classification results (rAB, rCD), (rAC, rBD), and (rAD, rBC).

(5) Count the vote results of the 4 targets using the classification results of classifiers 1–3: Vr(A) = rAB·P(1) + rAC·P(2) + rAD·P(3); Vr(B) = rAB·P(1) + rBD·P(2) + rBC·P(3); Vr(C) = rCD·P(1) + rAC·P(2) + rBC·P(3); Vr(D) = rCD·P(1) + rBD·P(2) + rAD·P(3).

(6) Find Max{Vr(A), Vr(B), Vr(C), Vr(D)}; the corresponding vehicle type is the vehicle type of feature vector X.
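Steps (5) and (6) of Algorithm 1 can be sketched in a few lines of Python; the classifier results and weight values below are hypothetical inputs for illustration, not values from the paper.

```python
def cross_vote(r, weights):
    """Steps (5) and (6) of Algorithm 1: weighted vote tally for the four
    targets. `r` holds the binary results of the three classifiers
    (r["AB"] + r["CD"] == 1, etc.); `weights` is (P(1), P(2), P(3)),
    computed in practice from the training-set distances."""
    p1, p2, p3 = weights
    vr = {
        "A": r["AB"] * p1 + r["AC"] * p2 + r["AD"] * p3,
        "B": r["AB"] * p1 + r["BD"] * p2 + r["BC"] * p3,
        "C": r["CD"] * p1 + r["AC"] * p2 + r["BC"] * p3,
        "D": r["CD"] * p1 + r["BD"] * p2 + r["AD"] * p3,
    }
    return max(vr, key=vr.get), vr

# Classifiers disagree: classifier 3 puts X on the B∪C side. Unequal
# weights (here favoring classifiers 1 and 2) break the tie toward A.
label, vr = cross_vote(
    {"AB": 1, "CD": 0, "AC": 1, "BD": 0, "AD": 0, "BC": 1},
    weights=(0.9, 0.8, 0.4),
)
print(label, vr)
```

With unit weights this same input would produce a three-way tie; the distance-derived weights are what make step (6) yield a unique maximum.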

## 5. Performance of Cross-Voting SVM Algorithm

#### 5.1. Classification Accuracy of Cross-Voting SVM Algorithm

#### 5.2. Cross-Voting Method Using Different Two-Class Classifiers

#### 5.3. Comparison with Other Multi-Class Classification Methods

## 6. Conclusions

## Author Contributions

## Funding

## Conflicts of Interest

## References

1. Sohraby, K.; Minoli, D.; Znati, T. Wireless Sensor Networks: Technology, Protocols, and Applications, 1st ed.; John Wiley & Sons, Inc.: Hoboken, NJ, USA, 2007.
2. Mishra, D.P.; Dorale, S.S. An application of wireless sensor network in intelligent transportation system. In Proceedings of the 2013 6th International Conference on Emerging Trends in Engineering and Technology, Nagpur, India, 16–18 December 2013; pp. 90–91.
3. Balid, W.; Tafish, H.; Refai, H.H. Development of portable wireless sensor network system for real-time traffic surveillance. In Proceedings of the 2015 IEEE 18th International Conference on Intelligent Transportation Systems, Las Palmas, Spain, 15–18 September 2015; pp. 1630–1637.
4. Myles, A.J.; Feudale, R.N.; Liu, Y.; Woody, N.A.; Brown, S.D. An introduction to decision tree modeling. J. Chemom. 2004, 18, 275–285.
5. Langseth, H.; Nielsen, T.D. Classification using Hierarchical Naive Bayes models. Mach. Learn. 2006, 63, 135–159.
6. Galar, M.; Fernandez, A.; Barrenechea, E.; Bustince, H.; Herrera, F. An overview of ensemble methods for binary classifiers in multi-class problems: Experimental study on one-vs-one and one-vs-all schemes. Pattern Recognit. 2011, 44, 1761–1776.
7. Chang, C.C.; Lin, C.J. LIBSVM: A library for support vector machines. ACM Trans. Intell. Syst. Technol. 2011, 2.
8. Guo, H.S.; Wang, W.J. An active learning-based SVM multi-class classification model. Pattern Recognit. 2015, 48, 1577–1597.
9. Angulo, C.; Parra, X.; Catala, A. K-SVCR. A support vector machine for multi-class classification. Neurocomputing 2003, 55, 57–77.
10. Melin, P.; Amezcua, J.; Valdez, F.; Castillo, O. A new neural network model based on the LVQ algorithm for multi-class classification of arrhythmias. Inf. Sci. 2014, 279, 483–497.
11. Calvo-Zaragoza, J.; Valero-Mas, J.J.; Rico-Juan, J.R. Improving kNN multi-label classification in Prototype Selection scenarios using class proposals. Pattern Recognit. 2015, 48, 1608–1622.
12. Sucar, L.E.; Bielza, C.; Morales, E.F.; Hernandez-Leal, P.; Zaragoza, J.H.; Larranaga, P. Multi-label classification with Bayesian network-based chain classifiers. Pattern Recognit. Lett. 2014, 41, 14–22.
13. Guo, H.X.; Li, Y.J.; Li, Y.A.; Liu, X.; Li, J.L. BPSO-Adaboost-KNN ensemble learning algorithm for multi-class imbalanced data classification. Eng. Appl. Artif. Intell. 2016, 49, 176–193.
14. Xu, T. A new sphere-structured multi-class classifier. In Proceedings of the 2009 Pacific-Asia Conference on Circuits, Communications and Systems, Chengdu, China, 16–17 May 2009; pp. 520–525.
15. Lopez, J.; Maldonado, S. Multi-class second-order cone programming support vector machines. Inf. Sci. 2016, 330, 328–341.
16. Tomar, D.; Agarwal, S. A comparison on multi-class classification methods based on least squares twin support vector machine. Knowl.-Based Syst. 2015, 81, 131–147.
17. Hsu, C.W.; Lin, C.J. A comparison of methods for multiclass support vector machines. IEEE Trans. Neural Netw. 2002, 13, 415–425.
18. Manikandan, J.; Venkataramani, B. Design of a modified one-against-all SVM classifier. In Proceedings of the 2009 IEEE International Conference on Systems, Man and Cybernetics, San Antonio, TX, USA, 11–14 October 2009; pp. 1869–1874.
19. Fei, B.; Liu, J.B. Binary tree of SVM: A new fast multiclass training and classification algorithm. IEEE Trans. Neural Netw. 2006, 17, 696–704.
20. Cheng, L.L.; Zhang, J.P.; Yang, J.; Ma, J. An improved hierarchical multi-class support vector machine with binary tree architecture. In Proceedings of the 2008 International Conference on Internet Computing in Science and Engineering, Harbin, China, 28–29 January 2008.
21. Vens, C.; Struyf, J.; Schietgat, L.; Dzeroski, S.; Blockeel, H. Decision trees for hierarchical multi-label classification. Mach. Learn. 2008, 73, 185–214.
22. Zhang, H.; Pan, Z.; Zhang, W. Acoustic-seismic mixed feature extraction based on wavelet transform for vehicle classification in wireless sensor networks. Sensors 2018, 18, 1862.
23. Duarte, M.F.; Hu, Y.H. Vehicle classification in distributed sensor networks. J. Parallel Distrib. Comput. 2004, 64, 826–838.
24. Guo, G.D.; Wang, H.; Bell, D.; Bi, Y.X.; Greer, K. KNN model-based approach in classification. In Proceedings of the OTM Confederated International Conferences CoopIS, DOA, and ODBASE, Catania, Italy, 3–7 November 2003; pp. 986–996.
25. Wen, X.Z.; Shao, L.; Xue, Y.; Fang, W. A rapid learning algorithm for vehicle classification. Inf. Sci. 2015, 295, 395–406.
26. Bredensteiner, E.J.; Bennett, K.P. Multicategory classification by support vector machines. Comput. Optim. Appl. 1999, 12, 53–79.
27. Murtagh, F.; Legendre, P. Ward's hierarchical agglomerative clustering method: Which algorithms implement Ward's criterion? J. Classif. 2014, 31, 274–295.
28. Ahad, N.; Qadir, J.; Ahsan, N. Neural networks in wireless networks: Techniques, applications and guidelines. J. Netw. Comput. Appl. 2017, 68, 1–27.
29. Sun, N.; Han, G.; Du, K.; Liu, J.X.; Li, X.F. Person/Vehicle classification based on deep belief networks. In Proceedings of the 10th International Conference on Natural Computation, Xiamen, China, 19–21 August 2014; pp. 113–117.
30. Zhang, W.Y.; Zhang, Z.J. Belief function based decision fusion for decentralized target classification in wireless sensor networks. Sensors 2015, 15, 20524–20540.

**Figure 2.** Two-class support vector machine (SVM) classifier and three-class SVM classifier: (**a**) two-class datasets separated by a hyperplane; (**b**) three-class datasets separated by a piecewise linear function.

**Figure 6.** Two instances of multi-classification using the cross-voting algorithm: (**a**) under an ideal condition; (**b**) under a condition where classification is unfeasible.

**Figure 7.** Relationship between SVM classification accuracy and the distance between training datasets: (**a**) distance between AAV3/AAV5 and other DW datasets, with the corresponding average classification accuracy; (**b**) distance between DW3/DW5 and other AAV datasets, with the corresponding average classification accuracy.

**Figure 9.** Classification accuracy of each step when using the DAGSVM method (**a**) and the binary-tree SVM method (**b**) to classify the acoustic signal of DW9.

**Table 1.** Average classification accuracy and time required for different classification methods using acoustic features. KNN—k-nearest neighbor; DT—decision tree; NB—naïve Bayes; AdaBoost—adaptive boosting; SVM—support vector machine.

| | KNN | DT | NB | AdaBoost | SVM |
|---|---|---|---|---|---|
| Average classification accuracy | 72.22% | 52.82% | 65.71% | 69.83% | 71.03% |
| Average time required for classification | 1.4783 s | 0.0024 s | 0.0102 s | 0.0743 s | 0.1440 s |

**Table 2.** Classification accuracy of four types of target signals using the cross-voting SVM method, DAG SVM method, and binary-tree SVM method.

| Run Name | Cross-Voting SVM (A) | Cross-Voting SVM (S) | DAG SVM (A) | DAG SVM (S) | Binary-Tree SVM (A) | Binary-Tree SVM (S) |
|---|---|---|---|---|---|---|
| AAV5 | 99.77% | 41.14% | 99.32% | 44.08% | 99.66% | 41.02% |
| AAV6 | 98.44% | 47.92% | 93.07% | 49.74% | 96.45% | 40.92% |
| AAV7 | 93.67% | 45.01% | 89.90% | 46.56% | 91.24% | 42.86% |
| AAV8 | 93.50% | 42.37% | 92.34% | 48.67% | 93.67% | 43.34% |
| AAV9 | 97.64% | 45.80% | 99.78% | 51.32% | 100% | 46.24% |
| AAV10 | 82.33% | 54.17% | 90.77% | 54.79% | 94.68% | 52.23% |
| AAV11 | 89.75% | 38.10% | 90.03% | 42.04% | 90.31% | 36.03% |
| Average | 93.59% | 44.93% | 93.60% | 48.17% | 95.14% | 43.23% |
| DW5 | 68.58% | 86.13% | 80.97% | 82.00% | 73.40% | 88.40% |
| DW6 | 86.31% | 89.93% | 85.94% | 91.68% | 84.26% | 96.58% |
| DW7 | 73.71% | 88.40% | 25.37% | 86.37% | 22.96% | 90.70% |
| DW8 | 69.81% | 75.72% | 31.02% | 69.65% | 28.16% | 76.22% |
| DW9 | 72.21% | 74.88% | 7.26% | 73.10% | 5.82% | 76.70% |
| DW10 | 67.17% | 92.38% | 33.69% | 90.97% | 29.24% | 95.43% |
| DW11 | 56.28% | 98.52% | 47.11% | 96% | 45.00% | 99.21% |
| DW12 | 61.53% | 91.36% | 40.96% | 87.19% | 39.65% | 94.11% |
| Average | 70.58% | 87.16% | 44.03% | 84.62% | 41.06% | 89.67% |

**Table 3.** Comparison of average classification accuracy and time consumption using the cross-voting SVM method, DAG SVM method, and binary-tree SVM method.

| | Cross-Voting SVM | DAGSVM | Binary-Tree SVM |
|---|---|---|---|
| Overall average classification accuracy | 74.52% | 67.39% | 67.15% |
| Time consumption of classification | 0.3255 s | 1.1207 s | 0.2796 s |

**Table 4.** Classification accuracy of each step when using the cross-voting SVM method to classify the acoustic signal of DW9.

| | Classification Accuracy | Overall Classification Accuracy |
|---|---|---|
| Classifier 1 (AAV_A ∪ AAV_S/DW_A ∪ DW_S) | 73.71% (DW_A ∪ DW_S) | 72.21% (DW_A) |
| Classifier 2 (AAV_A ∪ DW_A/AAV_S ∪ DW_S) | 98.16% (AAV_A ∪ DW_A) | |
| Classifier 3 (AAV_A ∪ DW_S/AAV_S ∪ DW_A) | 19.75% (AAV_S ∪ DW_A) | |

**Table 5.** Average classification accuracy and time consumption of the cross-voting method using different two-class classifiers.

| | Cross-Voting NB | Cross-Voting DT | Cross-Voting AdaBoost | Cross-Voting SVM |
|---|---|---|---|---|
| Overall average classification accuracy | 37.48% | 75.64% | 72.55% | 74.52% |
| Time consumption of classification | 0.0253 s | 4.1748 s | 1.0750 s | 0.3255 s |

**Table 6.** Comparison of average classification accuracy and time consumption using different multi-class classification algorithms.

| | NB | AdaBoost | Cross-Voting SVM |
|---|---|---|---|
| Overall average classification accuracy | 63.85% | 53.52% | 74.52% |
| Time consumption of classification | 0.0077 s | 0.3621 s | 0.3255 s |

**Table 7.** Comparison of average classification accuracy, time consumption, and memory consumption using different multi-class classification algorithms.

| | DBN | FFNN | ELM | Cross-Voting SVM |
|---|---|---|---|---|
| Overall average classification accuracy | 78.40% | 75.76% | 71.88% | 74.52% |
| Time consumption of training | 350.84 s | 452.36 s | 0.5581 s | 15.1698 s |
| Average time consumption of classification | 0.0298 s | 0.0452 s | 0.1783 s | 0.3255 s |
| Memory consumption | 9253.94 MB | 34,660.9 MB | 369.02 MB | 123.95 MB |

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Zhang, H.; Pan, Z.
Cross-Voting SVM Method for Multiple Vehicle Classification in Wireless Sensor Networks. *Sensors* **2018**, *18*, 3108.
https://doi.org/10.3390/s18093108
