EAAI, 2026, Journal Article
A visual detection method for blueberries with different ripeness in a complex orchard environment
- Chengbiao Fu
- Fan Chen
- Anhong Tian
Efficiently and accurately detecting blueberries at different maturity levels in the orchard is key to smart agriculture. We propose a blueberry detection model based on the You Only Look Once version 11 nano (YOLOv11n) architecture, named YOLOv11-EMA-RCL-GD (YOLOv11-ERG for short). The model incorporates three key improvements: an Efficient Multi-scale Attention (EMA) module in the shallow backbone that enhances the distinction between fruit and background; a Region-focused Contrastive Loss (RCL) that focuses on local regions to detect occluded fruits; and a Gather-and-Distribute (GD) mechanism in the neck that strengthens multi-scale feature fusion. The model was trained and tested on our self-constructed blueberry dataset. YOLOv11-ERG achieves a mean Average Precision (mAP) of 91.8% at an Intersection over Union (IoU) threshold of 50% (mAP@0.5) and 68.6% over IoU thresholds from 50% to 95% (mAP@0.5-0.95), with 5.93 million parameters and 10.9 Giga Floating-point Operations (GFLOPs). It outperforms several existing detectors, including the Real-Time DEtection TRansformer-Large (RT-DETR-L), the Faster Region-based Convolutional Network (Faster R-CNN), and other YOLO-series models. While achieving higher detection performance, the improved model requires just 62.8% of the parameters and 51.2% of the computation of YOLOv11-small (YOLOv11s), a balance that meets the key need for efficient and accurate detection in smart agriculture. In addition, experimental results show that the RCL method also improves model performance on other public agricultural datasets. The code is publicly available at https://github.com/C-F-Chen/YOLOv11-ERG.
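To make the EMA component of the architecture concrete, the sketch below is a simplified, illustrative PyTorch version of an Efficient Multi-scale Attention block: grouped channels, two 1-D pooling branches that capture long-range context along each spatial axis, and a local 3x3 branch, with the two branches re-weighting each other. The class name `EMASketch`, the group count of 8, and the exact branch arithmetic are our assumptions for illustration, not the authors' implementation; see the linked repository for the actual code.

```python
# Minimal sketch of an Efficient Multi-scale Attention (EMA) style block.
# Simplified and illustrative only; names and details are assumptions.
import torch
import torch.nn as nn

class EMASketch(nn.Module):
    def __init__(self, channels: int, groups: int = 8):
        super().__init__()
        assert channels % groups == 0
        self.groups = groups
        c = channels // groups
        self.pool_h = nn.AdaptiveAvgPool2d((None, 1))  # aggregate along width
        self.pool_w = nn.AdaptiveAvgPool2d((1, None))  # aggregate along height
        self.conv1x1 = nn.Conv2d(c, c, kernel_size=1)
        self.conv3x3 = nn.Conv2d(c, c, kernel_size=3, padding=1)
        self.gn = nn.GroupNorm(c, c)  # per-channel normalization

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, ch, h, w = x.shape
        g = x.reshape(b * self.groups, ch // self.groups, h, w)
        # 1-D attention branch: directional pooling encodes long-range
        # context along each spatial axis, then gates the group features.
        xh = self.pool_h(g)                      # (b*G, c, h, 1)
        xw = self.pool_w(g).permute(0, 1, 3, 2)  # (b*G, c, w, 1)
        y = self.conv1x1(torch.cat([xh, xw], dim=2))
        ah, aw = torch.split(y, [h, w], dim=2)
        g1 = self.gn(g * ah.sigmoid() * aw.permute(0, 1, 3, 2).sigmoid())
        # Local 3x3 branch: complementary short-range spatial context.
        g2 = self.conv3x3(g)
        # Cross-branch weighting: each branch re-weights the other,
        # yielding a spatial attention map over the group features.
        w1 = torch.softmax(g1.mean(dim=(2, 3)), dim=1)  # (b*G, c)
        w2 = torch.softmax(g2.mean(dim=(2, 3)), dim=1)
        attn = (w1[..., None, None] * g2 + w2[..., None, None] * g1).sigmoid()
        return (g * attn).reshape(b, ch, h, w)

if __name__ == "__main__":
    x = torch.randn(2, 64, 40, 40)  # e.g. a shallow backbone feature map
    print(EMASketch(64)(x).shape)   # torch.Size([2, 64, 40, 40])
```

The block is shape-preserving, so under these assumptions it can be dropped after an early backbone stage without altering the downstream neck or head dimensions.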