Article

Cross-Modality Interaction Network for Equine Activity Recognition Using Imbalanced Multi-Modal Data †

1 Department of Infectious Diseases and Public Health, Jockey Club College of Veterinary Medicine and Life Sciences, City University of Hong Kong, Hong Kong, China
2 Department of Computer Science, City University of Hong Kong, Hong Kong, China
3 College of Electronic Engineering, South China Agricultural University, Guangzhou 510642, China
4 Department of Veterinary Clinical Sciences, Jockey Club College of Veterinary Medicine and Life Sciences, City University of Hong Kong, Hong Kong, China
5 Centre for Companion Animal Health, Jockey Club College of Veterinary Medicine and Life Sciences, City University of Hong Kong, Hong Kong, China
6 Animal Health Research Centre, Chengdu Research Institute, City University of Hong Kong, Chengdu 610000, China
* Author to whom correspondence should be addressed.
This manuscript is an extended version of the conference paper: Mao, A.X.; Huang, E.D.; Xu, W.T.; Liu, K. Cross-modality Interaction Network for Equine Activity Recognition Using Time-Series Motion Data. In Proceedings of the 2021 International Symposium on Animal Environment and Welfare (ISAEW), Chongqing, China, 20–23 October 2021 (in press).
Academic Editors: Yongliang Qiao, Lilong Chai, Dongjian He and Daobilige Su
Sensors 2021, 21(17), 5818; https://doi.org/10.3390/s21175818
Received: 26 July 2021 / Revised: 25 August 2021 / Accepted: 27 August 2021 / Published: 29 August 2021
With recent advances in deep learning, wearable sensors have increasingly been used in automated animal activity recognition. However, two major challenges remain in improving recognition performance: multi-modal feature fusion and imbalanced data modeling. In this study, to improve classification performance for equine activities while tackling these two challenges, we developed a cross-modality interaction network (CMI-Net) comprising a dual convolutional neural network architecture and a cross-modality interaction module (CMIM). The CMIM adaptively recalibrates the temporal- and axis-wise features in each modality by leveraging multi-modal information to achieve deep intermodality interaction. A class-balanced (CB) focal loss was adopted to supervise the training of CMI-Net and alleviate the class imbalance problem. Motion data were acquired from six horses using neck-attached inertial measurement units. The CMI-Net was trained and validated with leave-one-out cross-validation. The results demonstrate that our CMI-Net outperformed existing algorithms, with high precision (79.74%), recall (79.57%), F1-score (79.02%), and accuracy (93.37%). The adoption of CB focal loss improved the performance of CMI-Net, with increases of 2.76%, 4.16%, and 3.92% in precision, recall, and F1-score, respectively. In conclusion, CMI-Net and CB focal loss effectively enhanced equine activity classification performance using imbalanced multi-modal sensor data.
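The class-balanced focal loss named in the abstract weights each class by the inverse of its "effective number" of samples and combines this with the focal modulation term. The sketch below follows the standard CB focal loss formulation (Cui et al., CVPR 2019), not necessarily the authors' exact implementation; the function name, default `beta`, and default `gamma` are illustrative assumptions.

```python
import numpy as np

def cb_focal_loss(probs, labels, samples_per_class, beta=0.999, gamma=2.0):
    """Class-balanced focal loss (standard formulation, for illustration).

    probs: (N, C) predicted class probabilities
    labels: (N,) integer class labels
    samples_per_class: (C,) training-sample counts per class
    """
    samples_per_class = np.asarray(samples_per_class, dtype=float)
    # Effective number of samples per class: (1 - beta^n) / (1 - beta)
    effective_num = (1.0 - np.power(beta, samples_per_class)) / (1.0 - beta)
    weights = 1.0 / effective_num
    # Normalize so the class weights sum to the number of classes
    weights = weights / weights.sum() * len(samples_per_class)

    p_t = probs[np.arange(len(labels)), labels]        # probability of the true class
    focal = -np.power(1.0 - p_t, gamma) * np.log(p_t)  # focal modulation of cross-entropy
    return float(np.mean(weights[labels] * focal))
```

With balanced class counts the weights all equal one and the loss reduces to plain focal loss; for a minority class (e.g. a rarely observed activity), the weight grows, so errors on rare activities contribute more to the training signal.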
Keywords: equine behavior; wearable sensor; deep learning; intermodality interaction; class-balanced focal loss
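The abstract states that the CMIM recalibrates temporal- and axis-wise features in each modality using multi-modal information, but gives no architectural detail. The following is a hypothetical squeeze-and-excitation-style sketch of such recalibration: channel statistics pooled from both modalities produce per-channel gates that rescale one modality's features. All names are assumptions, and a fixed random matrix stands in for a learned projection layer.

```python
import numpy as np

def cross_modal_recalibration(feat_a, feat_b):
    """Hypothetical cross-modal recalibration sketch.

    feat_a, feat_b: (T, C) temporal feature maps from two modality
    branches (e.g. accelerometer and gyroscope arms of a dual CNN).
    Returns feat_a rescaled by gates computed from both modalities.
    """
    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    # Axis-wise (channel) statistics pooled over time from both modalities
    pooled = np.concatenate([feat_a.mean(axis=0), feat_b.mean(axis=0)])
    num_channels = feat_a.shape[1]
    rng = np.random.default_rng(0)
    # Stand-in for a learned fully connected layer mapping joint stats to gates
    W = rng.standard_normal((pooled.size, num_channels)) * 0.1
    gates = sigmoid(pooled @ W)   # one gate in (0, 1) per channel of modality A
    return feat_a * gates         # recalibrated features for modality A
```

Because the gates lie in (0, 1), the module can only attenuate channels; applying the same pattern symmetrically to `feat_b` would give the mutual interaction the abstract describes.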
MDPI and ACS Style

Mao, A.; Huang, E.; Gan, H.; Parkes, R.S.V.; Xu, W.; Liu, K. Cross-Modality Interaction Network for Equine Activity Recognition Using Imbalanced Multi-Modal Data. Sensors 2021, 21, 5818. https://doi.org/10.3390/s21175818

