Tea chrysanthemum detection under unstructured environments using the TC-YOLO model

Qi, Chao, Gao, Junfeng, Gao, Simon , Harman, Helen, Chen, Kunjie and Shu, Lei (2022) Tea chrysanthemum detection under unstructured environments using the TC-YOLO model. Expert Systems with Applications, 193 . ISSN 0957-4174

Full content URL: https://doi.org/10.1016/j.eswa.2021.116473

Warning: There is a more recent version of this item available.

Authors' Accepted Manuscript

PDF
Revised manuscript.pdf - Whole Document
Restricted to Repository staff only until 31 December 2022.
Available under License Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International.

Item Type:Article
Item Status:Live Archive


Tea chrysanthemum detection at the flowering stage is a key component in developing a selective chrysanthemum harvesting robot. However, detecting flowering chrysanthemums in unstructured field environments is challenging given variations in illumination, occlusion and object scale. In this context, we propose a highly fused, lightweight deep learning architecture based on YOLO for tea chrysanthemum detection (TC-YOLO). First, in the backbone and neck components, the method uses the Cross-Stage Partially Dense network (CSPDenseNet) and the Cross-Stage Partial ResNeXt network (CSPResNeXt) as the main networks, respectively, and embeds custom feature fusion modules to guide the gradient flow. In the final head component, the method combines the recursive feature pyramid (RFP) multiscale fusion reflow structure with the Atrous Spatial Pyramid Pooling (ASPP) module, which uses atrous (dilated) convolution, to perform the detection task. The resulting model was tested on 300 field images using a data augmentation strategy combining flipping and rotation. On an NVIDIA Tesla P100 GPU, at an inference speed of 47.23 FPS per image (416 × 416), TC-YOLO achieves an average precision (AP) of 92.49% on our own tea chrysanthemum dataset. Further validation showed that overlap had the least effect, and illumination the greatest effect, on tea chrysanthemum detection. In addition, the model (13.6 M) can be deployed on a single mobile GPU, and it could be further developed into the perception system of a selective chrysanthemum harvesting robot.
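The flip-and-rotation augmentation mentioned in the abstract can be sketched as follows. This is an illustrative NumPy sketch only; the paper does not specify its exact augmentation pipeline, so the combination of horizontal/vertical flips and 90-degree rotations here is an assumption.

```python
import numpy as np

def augment(image: np.ndarray) -> list[np.ndarray]:
    """Generate augmented variants of an image via flipping and rotation.

    Illustrative sketch of a flip-and-rotate augmentation strategy;
    the specific transforms used in the paper are not stated.
    """
    variants = [image]                  # keep the original
    variants.append(np.fliplr(image))   # horizontal flip
    variants.append(np.flipud(image))   # vertical flip
    for k in (1, 2, 3):                 # 90, 180, 270 degree rotations
        variants.append(np.rot90(image, k))
    return variants

# Example: a 416 x 416 RGB placeholder, the input size quoted in the abstract
img = np.zeros((416, 416, 3), dtype=np.uint8)
aug = augment(img)
print(len(aug))  # 6 variants: original + 2 flips + 3 rotations
```

Because the input is square, all six variants keep the 416 × 416 spatial size expected by the detector.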

Keywords:Tea chrysanthemum, Flowering stage detection, Deep convolutional neural network, Agricultural robotics
Subjects:G Mathematical and Computer Sciences > G400 Computer Science
G Mathematical and Computer Sciences > G760 Machine Learning
C Biological Sciences > C910 Applied Biological Sciences
G Mathematical and Computer Sciences > G740 Computer Vision
D Veterinary Sciences, Agriculture and related subjects > D400 Agriculture
Divisions:College of Science > Lincoln Institute for Agri-Food Technology
ID Code:47699
Deposited On:21 Mar 2022 11:48

