Open Access Article
Sensors 2018, 18(6), 1746; https://doi.org/10.3390/s18061746

A Kinect-Based Segmentation of Touching-Pigs for Real-Time Monitoring

Department of Computer Convergence Software, Korea University, Sejong City 30019, Korea
*
Author to whom correspondence should be addressed.
Received: 2 May 2018 / Revised: 23 May 2018 / Accepted: 27 May 2018 / Published: 29 May 2018
(This article belongs to the Special Issue Sensors in Agriculture 2018)
PDF [7553 KB, uploaded 29 May 2018]

Abstract

Segmenting touching-pigs in real time is an important issue for surveillance cameras intended for the 24-h tracking of individual pigs; however, methods to do so have not yet been reported. We focus in particular on the segmentation of touching-pigs in a crowded pig room, using low-contrast images obtained with a Kinect depth sensor. We reduce the execution time by combining object detection based on a convolutional neural network (CNN) with image processing techniques, instead of applying time-consuming operations such as optimization-based segmentation. We first apply the fastest CNN-based object detection technique (i.e., You Only Look Once, YOLO) to solve the separation problem for touching-pigs. If the quality of the YOLO output is unsatisfactory, we then try to find a possible boundary line between the touching-pigs by analyzing their shape. Our experimental results show that this method separates touching-pigs effectively in terms of both accuracy (91.96%) and execution time (real-time execution), even with low-contrast images obtained using a Kinect depth sensor.
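The two-stage pipeline the abstract describes — accept the YOLO detections when they pass a quality check, otherwise fall back to a shape analysis that splits the merged blob — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the quality criterion (exactly two boxes with limited overlap) and the fallback (cutting the binary mask at its narrowest column) are hypothetical stand-ins for the paper's actual checks and boundary-line search.

```python
import numpy as np

def box_iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def yolo_output_acceptable(boxes, max_iou=0.3):
    """Hypothetical quality check: YOLO must return exactly two pig
    boxes that do not overlap excessively (threshold is illustrative)."""
    if len(boxes) != 2:
        return False
    return box_iou(boxes[0], boxes[1]) <= max_iou

def split_by_narrowest_column(mask):
    """Illustrative fallback shape analysis: split a binary mask of two
    touching pigs at the interior column with the fewest foreground
    pixels — a crude stand-in for the paper's boundary-line search."""
    widths = mask.sum(axis=0)
    interior = np.flatnonzero(widths)        # columns containing the blob
    lo, hi = interior[0] + 1, interior[-1]   # avoid the blob's outer edges
    cut = lo + int(np.argmin(widths[lo:hi]))
    left, right = mask.copy(), mask.copy()
    left[:, cut:] = 0
    right[:, :cut] = 0
    return left, right
```

In use, a frame's depth image would be passed to the detector first; only frames failing `yolo_output_acceptable` would pay for the (still cheap, image-processing-only) fallback, which is how the pipeline avoids optimization-based segmentation on every frame.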
Keywords: agriculture IT; computer vision; depth information; touching-objects segmentation; convolutional neural network; YOLO

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Ju, M.; Choi, Y.; Seo, J.; Sa, J.; Lee, S.; Chung, Y.; Park, D. A Kinect-Based Segmentation of Touching-Pigs for Real-Time Monitoring. Sensors 2018, 18, 1746.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Sensors EISSN 1424-8220, published by MDPI AG, Basel, Switzerland.