Article

Lightweight Power-Line Visual Detection in Agricultural UAV Scenarios Based on an Improved YOLOv12n Model

Academy of Ecological Unmanned Farm, College of Agricultural Engineering and Food Science, Shandong University of Technology, Zibo 255049, China
*
Author to whom correspondence should be addressed.
Sensors 2026, 26(1), 109; https://doi.org/10.3390/s26010109
Submission received: 4 November 2025 / Revised: 13 December 2025 / Accepted: 16 December 2025 / Published: 23 December 2025
(This article belongs to the Section Remote Sensors)

Abstract

To address the problems of low detection accuracy, slow inference speed, and high computational cost in power-line detection during autonomous operations of agricultural UAVs, this study proposes an improved object detection model based on YOLOv12n. A power-line dataset was constructed from real-field images supplemented with the TTPLA dataset. The lightweight EfficientNetV2 was introduced to replace the original backbone network. In the neck, dynamic snake convolution and a multi-scale cross-axis attention mechanism were incorporated, while the region attention partitioning and residual efficient layer aggregation network of the baseline model were retained. In the head, a Mixture of Experts (MoE) layer from ParameterNet was integrated. The improved model requires only 80.07%, 43.07%, and 77.35% of the original model’s parameters, computation, and weight size, respectively. At an IoU threshold of 0.5, the mean average precision (mAP0.5) reached 75.5%, an improvement of 13.53%, 15.09%, 7.5%, and 7.54% over YOLOv8n, YOLOv11n, YOLOv5n, and Line-YOLO, respectively, second only to RF-DETR-Nano among the compared models. In mobile-device testing, the model reached an inference speed of 88.36 FPS, the highest of all models evaluated. The improved model demonstrates strong generalization, robustness, detection accuracy, target localization, and processing speed, making it well suited to power-line detection in agricultural UAV applications and providing technical support for future autonomous and intelligent agricultural operations.
Keywords: object detection; deep learning; power line; agricultural drones; YOLOv12n; dynamic snake convolution
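
The abstract does not provide implementation details for the head modification. As a rough, hypothetical illustration only, the sketch below shows a simplified convolutional Mixture-of-Experts block in PyTorch, in the spirit of the ParameterNet-style MoE layer mentioned above. The class name ConvMoE, its parameters, and the output-mixing formulation are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch (assumption): a simplified convolutional Mixture-of-Experts block.
# This mixes the OUTPUTS of several expert convolutions with an input-dependent gate;
# it is not the paper's implementation, only an illustration of the general idea.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConvMoE(nn.Module):
    """Weight the outputs of several expert convolutions with a learned gate."""

    def __init__(self, in_ch: int, out_ch: int, num_experts: int = 4, k: int = 3):
        super().__init__()
        self.experts = nn.ModuleList(
            nn.Conv2d(in_ch, out_ch, k, padding=k // 2) for _ in range(num_experts)
        )
        # Gating network: global average pooling -> linear -> per-expert weights.
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(in_ch, num_experts),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w = F.softmax(self.gate(x), dim=1)                    # (B, E) gate weights
        outs = torch.stack([e(x) for e in self.experts], 1)   # (B, E, C_out, H, W)
        return (w[:, :, None, None, None] * outs).sum(dim=1)  # weighted sum over experts


if __name__ == "__main__":
    y = ConvMoE(64, 128)(torch.randn(2, 64, 40, 40))
    print(y.shape)  # torch.Size([2, 128, 40, 40])
```

Because only the gating network grows with the number of experts while each input activates a weighted blend of them, such a layer increases parameter capacity with limited extra inference cost, which is consistent with the lightweight design goal stated in the abstract.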

Share and Cite

MDPI and ACS Style

Ge, Y.-T.; Wang, B.-J.; Sun, S.; Lan, Y.-B. Lightweight Power-Line Visual Detection in Agricultural UAV Scenarios Based on an Improved YOLOv12n Model. Sensors 2026, 26, 109. https://doi.org/10.3390/s26010109

AMA Style

Ge Y-T, Wang B-J, Sun S, Lan Y-B. Lightweight Power-Line Visual Detection in Agricultural UAV Scenarios Based on an Improved YOLOv12n Model. Sensors. 2026; 26(1):109. https://doi.org/10.3390/s26010109

Chicago/Turabian Style

Ge, Yi-Tong, Bao-Ju Wang, Shuai Sun, and Yu-Bin Lan. 2026. "Lightweight Power-Line Visual Detection in Agricultural UAV Scenarios Based on an Improved YOLOv12n Model" Sensors 26, no. 1: 109. https://doi.org/10.3390/s26010109

APA Style

Ge, Y.-T., Wang, B.-J., Sun, S., & Lan, Y.-B. (2026). Lightweight Power-Line Visual Detection in Agricultural UAV Scenarios Based on an Improved YOLOv12n Model. Sensors, 26(1), 109. https://doi.org/10.3390/s26010109

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
