Search Results (6)

Search Parameters:
Keywords = safflower filaments

20 pages, 10100 KiB  
Article
A Method for Identifying Picking Points in Safflower Point Clouds Based on an Improved PointNet++ Network
by Baojian Ma, Hao Xia, Yun Ge, He Zhang, Zhenghao Wu, Min Li and Dongyun Wang
Agronomy 2025, 15(5), 1125; https://doi.org/10.3390/agronomy15051125 - 2 May 2025
Cited by 1 | Viewed by 737
Abstract
To address the challenge of precise picking point localization in morphologically diverse safflower plants, this study proposes PointSafNet, a novel three-stage 3D point cloud analysis framework with distinct architectural and methodological innovations. In Stage I, we introduce a multi-view reconstruction pipeline integrating Structure from Motion (SfM) and Multi-View Stereo (MVS) to generate high-fidelity 3D plant point clouds. Stage II develops a dual-branch architecture employing Star modules for multi-scale hierarchical geometric feature extraction at the organ level (filaments and fruit balls), complemented by a Context-Anchored Attention (CAA) mechanism to capture long-range contextual information. This synergistic feature learning approach addresses morphological variations, achieving 86.83% segmentation accuracy (surpassing PointNet++ by 7.37%) and outperforming conventional point cloud models. Stage III proposes an optimized geometric analysis pipeline combining dual-centroid spatial vectorization with Oriented Bounding Box (OBB)-based proximity analysis, resolving picking coordinate localization across diverse plants with 90% positioning accuracy and 68.82% mean IoU (13.71% improvement). The experiments demonstrate that PointSafNet systematically integrates 3D reconstruction, hierarchical feature learning, and geometric reasoning to provide visual guidance for robotic harvesting systems in complex plant canopies. The framework’s dual emphasis on architectural innovation and geometric modeling offers a generalizable solution for precision agriculture tasks involving morphologically diverse safflowers.
(This article belongs to the Section Precision and Digital Agriculture)
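Stage III's dual-centroid spatial vectorization and OBB-based proximity analysis can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the PCA-based OBB construction, the function names, and the `offset` safety margin are illustrative assumptions. It only shows the geometric idea of placing a cut point along the vector from the fruit-ball centroid to the filament centroid, just past the fruit ball's extent.

```python
import numpy as np

def obb(points):
    """PCA-based oriented bounding box: (center, axes, half_extents)."""
    c = points.mean(axis=0)
    cov = np.cov((points - c).T)
    _, axes = np.linalg.eigh(cov)             # columns are principal axes
    local = (points - c) @ axes
    mins, maxs = local.min(axis=0), local.max(axis=0)
    center = c + axes @ ((mins + maxs) / 2)
    return center, axes, (maxs - mins) / 2

def picking_point(filament_pts, fruit_pts, offset=0.3):
    """Place a cut point on the fruit-ball-to-filament centroid vector,
    just beyond the fruit ball's OBB extent along that vector."""
    cf = filament_pts.mean(axis=0)            # filament centroid
    cb = fruit_pts.mean(axis=0)               # fruit-ball centroid
    v = cf - cb
    v = v / np.linalg.norm(v)                 # dual-centroid direction
    _, axes, half = obb(fruit_pts)
    extent_along_v = np.abs(axes.T @ v) @ half
    return cb + v * (extent_along_v + offset)
```

In practice, the segmented filament and fruit-ball point sets produced by Stage II would be the inputs here.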

17 pages, 6269 KiB  
Article
Morphogenetic Identification of a New Record Condica capensis (Lepidoptera: Noctuidae) in Yunnan, China
by Pengfan Qian, Jiayin Fan, Xiaoyuan Zhang, Minfang Zeng, Xiaolong Han, Yonghe Li and Xulu Luo
Insects 2025, 16(2), 130; https://doi.org/10.3390/insects16020130 - 29 Jan 2025
Viewed by 1100
Abstract
Condica capensis (Lepidoptera: Noctuidae), a newly identified pest in Yunnan Province, China, poses a threat to safflower crops. Discovered in Nanhua County in November 2023, the pest damages safflower at multiple life stages, especially during its larval stage, when it feeds on leaves, tender stems, and flower filaments, sometimes causing the entire plant to die. Morphological and molecular analyses, including mitochondrial cytochrome c oxidase subunit I (COI) gene sequencing, confirmed its identity as C. capensis, a new species record for Yunnan. The study also documented the pest’s life cycle, reproductive behavior, and natural enemies, highlighting the potential for biological control using parasitic wasps such as Cotesia sp. This research emphasizes the need for accurate pest identification and monitoring to develop effective, sustainable pest management strategies. As safflower cultivation grows in Yunnan, managing C. capensis is critical to safeguarding local agriculture and preventing broader agricultural threats.
(This article belongs to the Section Insect Systematics, Phylogeny and Evolution)

19 pages, 16555 KiB  
Article
WED-YOLO: A Detection Model for Safflower Under Complex Unstructured Environment
by Zhenguo Zhang, Yunze Wang, Peng Xu, Ruimeng Shi, Zhenyu Xing and Junye Li
Agriculture 2025, 15(2), 205; https://doi.org/10.3390/agriculture15020205 - 18 Jan 2025
Cited by 5 | Viewed by 1155
Abstract
Accurate safflower recognition is a critical research challenge in the field of automated safflower harvesting. The growing environment of safflowers, including factors such as variable weather conditions in unstructured environments, shooting distances, and diverse morphological characteristics, presents significant difficulties for detection. To address these challenges and enable precise safflower target recognition in complex environments, this study proposes an improved safflower detection model, WED-YOLO, based on YOLOv8n. Firstly, the original bounding box loss function is replaced with the dynamic non-monotonic focusing mechanism Wise Intersection over Union (WIoU), which enhances the model’s bounding box fitting ability and accelerates network convergence. Then, the upsampling module in the network’s neck is substituted with the more efficient and versatile dynamic upsampling module, DySample, to improve the precision of feature map upsampling. Meanwhile, the EMA attention mechanism is integrated into the C2f module of the backbone network to strengthen the model’s feature extraction capabilities. Finally, a small-target detection layer is incorporated into the detection head, enabling the model to focus on small safflower targets. The model is trained and validated using a custom-built safflower dataset. The experimental results demonstrate that the improved model achieves Precision (P), Recall (R), mean Average Precision (mAP), and F1 score values of 93.15%, 86.71%, 95.03%, and 89.64%, respectively. These results represent improvements of 2.9%, 6.69%, 4.5%, and 6.22% over the baseline model. Compared with Faster R-CNN, YOLOv5, YOLOv7, and YOLOv10, WED-YOLO achieved the highest mAP value, outperforming these models by 13.06%, 4.85%, 4.86%, and 4.82%, respectively. The enhanced model exhibits superior precision and a lower missed-detection rate in safflower recognition tasks, providing a robust algorithmic foundation for the intelligent harvesting of safflowers.
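The WIoU bounding-box loss adopted above can be sketched in its v1 form: an IoU loss scaled by a distance-attention term computed from the smallest enclosing box. The dynamic non-monotonic focusing of WIoU v3 additionally multiplies this by a gradient gain derived from the outlier degree, which is omitted here; the corner-format boxes and function name are assumptions, not the paper's code.

```python
import numpy as np

def wiou_v1_loss(pred, gt):
    """WIoU v1 loss for boxes in (x1, y1, x2, y2) format."""
    # Intersection and union areas
    ix1, iy1 = max(pred[0], gt[0]), max(pred[1], gt[1])
    ix2, iy2 = min(pred[2], gt[2]), min(pred[3], gt[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_p = (pred[2] - pred[0]) * (pred[3] - pred[1])
    area_g = (gt[2] - gt[0]) * (gt[3] - gt[1])
    iou = inter / (area_p + area_g - inter)
    # Smallest enclosing box (treated as a constant: no gradient through it)
    wg = max(pred[2], gt[2]) - min(pred[0], gt[0])
    hg = max(pred[3], gt[3]) - min(pred[1], gt[1])
    # Center distance normalised by the enclosing box size, as an exp weight
    cxp, cyp = (pred[0] + pred[2]) / 2, (pred[1] + pred[3]) / 2
    cxg, cyg = (gt[0] + gt[2]) / 2, (gt[1] + gt[3]) / 2
    r = np.exp(((cxp - cxg) ** 2 + (cyp - cyg) ** 2) / (wg ** 2 + hg ** 2))
    return r * (1.0 - iou)
```

The attention term `r` grows with center misalignment, so poorly placed boxes incur a larger penalty than plain 1 − IoU while perfectly aligned boxes still reach zero loss.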

16 pages, 8893 KiB  
Article
A Method for Real-Time Recognition of Safflower Filaments in Unstructured Environments Using the YOLO-SaFi Model
by Bangbang Chen, Feng Ding, Baojian Ma, Liqiang Wang and Shanping Ning
Sensors 2024, 24(13), 4410; https://doi.org/10.3390/s24134410 - 8 Jul 2024
Cited by 6 | Viewed by 1958
Abstract
The identification of safflower filament targets and the precise localization of picking points are fundamental prerequisites for achieving automated filament retrieval. In light of challenges such as severe occlusion of targets, low recognition accuracy, and the considerable size of models in unstructured environments, this paper introduces a novel lightweight YOLO-SaFi model. The architectural design of this model features a Backbone layer incorporating the StarNet network; a Neck layer introducing a novel ELC convolution module to refine the C2f module; and a Head layer implementing a new lightweight shared convolution detection head, Detect_EL. Furthermore, the loss function is enhanced by upgrading CIoU to PIoUv2. These enhancements significantly augment the model’s capability to perceive spatial information and facilitate multi-feature fusion, consequently improving detection performance and rendering the model more lightweight. Comparative experiments reveal that YOLO-SaFi reduced parameters, computational load, and weight file size by 50.0%, 40.7%, and 48.2%, respectively, relative to the YOLOv8 baseline model, while improving recall, mean average precision, and detection speed by 1.9%, 0.3%, and 88.4 frames per second, respectively. Finally, the deployment of the YOLO-SaFi model on the Jetson Orin Nano device corroborates the superior performance of the enhanced model, thereby establishing a robust visual detection framework for the advancement of intelligent safflower filament retrieval robots in unstructured environments.
(This article belongs to the Section Intelligent Sensors)
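The StarNet backbone adopted in YOLO-SaFi is built around the "star operation": the element-wise product of two linear branches, which implicitly introduces pairwise feature interactions at linear cost. A minimal sketch follows; shapes and names are illustrative, and real Star blocks also include depthwise convolutions, normalization, and activations.

```python
import numpy as np

def star_op(x, w1, b1, w2, b2):
    # Star operation: (W1 x + b1) * (W2 x + b2), element-wise.
    # Expanding the product yields cross terms between input features,
    # i.e. an implicit high-dimensional (quadratic) feature map.
    return (x @ w1 + b1) * (x @ w2 + b2)
```

With `w1` selecting one feature and `w2` another, the output is their product, which a single linear layer cannot express.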

17 pages, 8391 KiB  
Article
Safflower Picking Trajectory Planning Strategy Based on an Ant Colony Genetic Fusion Algorithm
by Hui Guo, Zhaoxin Qiu, Guomin Gao, Tianlun Wu, Haiyang Chen and Xiang Wang
Agriculture 2024, 14(4), 622; https://doi.org/10.3390/agriculture14040622 - 17 Apr 2024
Cited by 8 | Viewed by 1728
Abstract
To solve the problem of the low pickup efficiency of the robotic arm when harvesting safflower filaments, we established a pickup trajectory cycle and an improved velocity profile model for the harvest of safflower filaments according to the growth characteristics of safflower. Bezier curves were utilized to optimize the picking trajectory, mitigating the abrupt changes produced by the delta mechanism during operation. Furthermore, to overcome the slow convergence speed and the tendency of the ant colony algorithm to fall into local optima, a safflower harvesting trajectory planning method based on an ant colony genetic algorithm is proposed. This method includes enhancements through an adaptive adjustment mechanism, pheromone limitation, and the integration of optimized parameters from genetic algorithms. An optimization model with working time as the objective function was established in the MATLAB environment, and simulation experiments were conducted to optimize the trajectory using the designed ant colony genetic algorithm. The simulation results show that, compared to the basic ant colony algorithm, the path length with the ant colony genetic algorithm is reduced by 1.33% to 7.85%, and its convergence stability significantly surpasses that of the basic ant colony algorithm. Field tests demonstrate that, while maintaining an S-curve velocity profile, the ant colony genetic algorithm reduces the harvesting time by 28.25% to 35.18% compared to random harvesting and by 6.34% to 6.81% compared to the basic ant colony algorithm, significantly enhancing the picking efficiency of the safflower-harvesting robotic arm.
(This article belongs to the Topic Current Research on Intelligent Equipment for Agriculture)
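The search underlying the fusion algorithm can be sketched as a plain ant colony optimization over picking positions (a TSP-style ordering), with pheromone clamped to a bounded range as a stand-in for the pheromone limitation described above. The adaptive adjustment mechanism and the genetic-algorithm parameter tuning are omitted, and all hyperparameter values are illustrative assumptions.

```python
import numpy as np

def aco_tour(points, n_ants=20, n_iters=50, alpha=1.0, beta=3.0,
             rho=0.5, tau_min=0.01, tau_max=10.0, seed=0):
    """Basic ant colony search for a short picking-order tour."""
    rng = np.random.default_rng(seed)
    points = np.asarray(points, float)
    n = len(points)
    dist = np.linalg.norm(points[:, None] - points[None, :], axis=2)
    eta = 1.0 / (dist + np.eye(n))            # heuristic visibility
    tau = np.full((n, n), tau_max)            # initial pheromone at the cap
    best_tour, best_len = None, np.inf
    for _ in range(n_iters):
        tours = []
        for _ in range(n_ants):
            tour = [int(rng.integers(n))]
            unvisited = set(range(n)) - {tour[0]}
            while unvisited:
                i = tour[-1]
                cand = np.array(sorted(unvisited))
                # Transition weights: pheromone^alpha * visibility^beta
                p = (tau[i, cand] ** alpha) * (eta[i, cand] ** beta)
                nxt = int(rng.choice(cand, p=p / p.sum()))
                tour.append(nxt)
                unvisited.discard(nxt)
            length = dist[tour, np.roll(tour, -1)].sum()
            tours.append((length, tour))
            if length < best_len:
                best_len, best_tour = length, tour
        tau *= 1.0 - rho                      # evaporation
        for length, tour in tours:
            tau[tour, np.roll(tour, -1)] += 1.0 / length   # deposit
        np.clip(tau, tau_min, tau_max, out=tau)   # pheromone limitation
    return best_tour, best_len
```

Clamping `tau` keeps any one edge from dominating early, which is one standard remedy for the premature convergence the paper targets.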

15 pages, 19203 KiB  
Article
Improved Faster Region-Based Convolutional Neural Networks (R-CNN) Model Based on Split Attention for the Detection of Safflower Filaments in Natural Environments
by Zhenguo Zhang, Ruimeng Shi, Zhenyu Xing, Quanfeng Guo and Chao Zeng
Agronomy 2023, 13(10), 2596; https://doi.org/10.3390/agronomy13102596 - 11 Oct 2023
Cited by 16 | Viewed by 2288
Abstract
The accurate acquisition of safflower filament information is the prerequisite for robotic picking operations. To detect safflower filaments accurately under different illumination, branch and leaf occlusion, and weather conditions, an improved Faster R-CNN model for filaments was proposed. Because safflower filaments appear dense and small in safflower images, the model selected ResNeSt-101, a residual network structure, as the backbone feature extraction network to enhance the expressive power of the extracted features. Then, ROI Pooling was replaced with Region of Interest (ROI) Align to reduce the feature errors caused by double quantization. In addition, partitioning around medoids (PAM) clustering was employed to optimize the scale and number of the network’s initial anchors, improving the detection accuracy of small-sized safflower filaments. The test results showed that the mean Average Precision (mAP) of the improved Faster R-CNN reached 91.49%. Compared with Faster R-CNN, YOLOv3, YOLOv4, YOLOv5, and YOLOv6, the improved Faster R-CNN increased the mAP by 9.52%, 2.49%, 5.95%, 3.56%, and 1.47%, respectively. The mAP of safflower filament detection was higher than 91% on sunny, cloudy, and overcast days, in sunlight, backlight, branch and leaf occlusion, and dense occlusion. The improved Faster R-CNN can accurately detect safflower filaments in natural environments and can provide technical support for the recognition of small-sized crops.
(This article belongs to the Special Issue AI, Sensors and Robotics for Smart Agriculture)
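The PAM-based anchor optimization can be illustrated as follows: a simplified alternating k-medoids over box (width, height) pairs. This is a sketch of the general technique, not the paper's exact PAM configuration, and Euclidean distance stands in for whatever box metric the authors used. Unlike k-means centroids, each resulting anchor is an actual box size observed in the data.

```python
import numpy as np

def pam_anchors(wh, k=3, n_iters=20, seed=0):
    """Simplified alternating k-medoids over (width, height) box sizes."""
    rng = np.random.default_rng(seed)
    wh = np.asarray(wh, float)
    medoids = wh[rng.choice(len(wh), size=k, replace=False)]
    for _ in range(n_iters):
        # Assign every box to its nearest medoid
        d = np.linalg.norm(wh[:, None] - medoids[None, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            members = wh[labels == j]
            if len(members) == 0:
                continue
            # New medoid: the member with minimal total distance to its cluster
            within = np.linalg.norm(members[:, None] - members[None, :], axis=2)
            medoids[j] = members[within.sum(axis=1).argmin()]
    return medoids[np.argsort(medoids[:, 0] * medoids[:, 1])]  # sorted by area
```

Running this over the training-set box sizes yields k representative anchor shapes, which can then replace the network's default anchors.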
