Mapping Maize Planting Densities Using Unmanned Aerial Vehicles, Multispectral Remote Sensing, and Deep Learning Technology


1. Introduction

Maize is one of the most widely cultivated cereal crops worldwide [1,2] and is used extensively for human food, livestock feed, and fuel production [3]. Accurately detecting maize plants and monitoring planting densities during the growth process are crucial, especially because early-stage maize density is a major factor influencing yield [4,5]. Such information is essential for farm management, early breeding decisions, improving seedling emergence rates, and timely replanting [6].
Compared to traditional manual data collection methods, unmanned aerial vehicles (UAVs) have increased efficiency, reduced personnel costs, and reduced the complexity of data collection [7,8]. In recent years, UAV applications for agricultural monitoring have developed rapidly. UAV-based monitoring offers high flexibility in temporal, spectral, and spatial resolution, enabling more precise observation and analysis of crop growth and timely adjustments to management practices [9,10,11]. Moreover, UAVs equipped with various cameras, such as RGB digital and multispectral cameras, enable the rapid and accurate acquisition of crop growth information in the field [12,13,14]. Over the past two decades, researchers have explored UAV-based methods to replace the time-consuming manual extraction of crop trait information, with many successful outcomes [15].
Researchers use UAVs to efficiently capture high-resolution digital images of crops at the field scale and extract trait information by analyzing crop texture features. Applications include disease detection [16,17] and phenological prediction [18], among others, which are crucial for optimizing field management practices and predicting yields [19,20,21]. These methods have been applied to estimate plant counts for crops including maize [4,6,22], wheat [23], rice [24], peanuts [25], and potatoes [26]. Their success is primarily attributable to deep learning technologies, which automatically learn features from images or datasets [27,28,29] and can effectively handle the complex backgrounds typical of UAV imagery [30,31,32,33]. For maize counting in particular, Liu et al. [34] used You Only Look Once (YOLO) v3 to estimate maize plant counts with a precision of 96.99%. Gao et al. [35] fine-tuned the Mask Region-based Convolutional Neural Network (Mask R-CNN) for automatic maize identification, achieving an average precision (AP) of 0.729 and an average recall (AR) of 0.837 at 0.5 intersection over union (IoU). Xu et al. [36], Xiao et al. [22], and others used the YOLOv5 model to estimate maize plant counts from UAV digital images. Lucas et al. [37] proposed a CNN-based deep learning method for maize plant counting, with a precision of 85.6% and a recall of 90.5%. Vong et al. [38] developed an image processing workflow based on the U-Net deep learning model for maize segmentation and count estimation, with the highest counting accuracy reaching R2 = 0.95. Although these methods achieve high accuracy, deep neural network models require substantial computational resources and large training datasets [39,40,41,42]. Additionally, because maize plants are small, low flight altitudes are necessary, resulting in slow data acquisition, poor real-time performance, and low efficiency. In practical applications, acquiring high-resolution digital imagery of maize plants with UAVs is therefore challenging.
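As a hedged illustration of how such detection-based counting and its AP@0.5 IoU evaluation work, the Python sketch below counts plants from detector output and matches predictions to ground truth at the 0.5 IoU criterion cited above. The box format, field names, and thresholds are illustrative assumptions, not the cited models' actual code.

```python
# Minimal sketch of detection-based plant counting and IoU matching.
# Boxes are assumed to be (x_min, y_min, x_max, y_max) tuples; detections
# are assumed to be dicts with a "score" field (hypothetical format).

def iou(box_a, box_b):
    """Intersection over union of two axis-aligned boxes."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter) if inter > 0 else 0.0

def count_plants(detections, conf_threshold=0.5):
    """Count maize plants as detections above a confidence threshold."""
    return sum(1 for d in detections if d["score"] >= conf_threshold)

def true_positives_at_05(pred_boxes, gt_boxes):
    """Greedily match predictions to ground truth at IoU >= 0.5, the
    criterion behind the AP@0.5 IoU / AR@0.5 IoU figures cited above."""
    unmatched_gt = list(gt_boxes)
    matched = 0
    for p in pred_boxes:
        best = max(unmatched_gt, key=lambda g: iou(p, g), default=None)
        if best is not None and iou(p, best) >= 0.5:
            matched += 1
            unmatched_gt.remove(best)
    return matched
```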
The use of multispectral sensors and vegetation indices (VIs) to assess plant growth status has achieved significant success in areas such as coverage [43,44], biomass [45,46,47], leaf area index (LAI) [48], nitrogen [49], and chlorophyll content [50]. However, their application to estimating plant counts and densities remains relatively limited. The successful plant growth assessments based on VIs from multispectral sensors provide valuable insights for further exploration in this field. For instance, Sankaran et al. [51] used multispectral sensors and VIs to assess the emergence and spring survival rates of wheat from UAV-obtained data. Using machine learning regression methods, Bikram et al. [52] estimated wheat counts from spectral and morphological information extracted from multispectral images. Wilke et al. [53] evaluated wheat plant densities using UAV multispectral data and regression models. The application of multispectral data to estimating plant counts and densities is still evolving, mainly because multispectral sensors have lower spatial resolution than digital sensors and therefore cannot accurately resolve the geometric shapes of seedlings. However, multispectral imaging allows the computation of VIs, which can be used to develop multivariate regression models for estimating plant traits [54].
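To make the VI-plus-regression idea concrete, the sketch below computes two common indices from per-plot red and near-infrared reflectance and fits a random forest regressor to plant density. The feature set, band values, and training data are illustrative assumptions, not the cited studies' pipelines.

```python
# Minimal sketch: vegetation indices as regression features for density.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def vegetation_indices(red, nir):
    """Two common VIs from band reflectances (assumed in the 0-1 range)."""
    ndvi = (nir - red) / (nir + red + 1e-9)   # normalized difference VI
    rvi = nir / (red + 1e-9)                  # ratio VI
    return np.column_stack([ndvi, rvi])

# Hypothetical training data: per-plot band means and measured densities.
rng = np.random.default_rng(0)
red = rng.uniform(0.03, 0.10, 200)
nir = rng.uniform(0.30, 0.60, 200)
density = rng.uniform(4.0, 12.0, 200)         # plants/m2 (ground truth)

X = vegetation_indices(red, nir)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, density)
predicted = model.predict(vegetation_indices(red[:5], nir[:5]))
```

Real pipelines typically use many more VIs and, as in the Multi-ML method of this study, texture features alongside them; the two indices here merely illustrate the regression structure.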

At present, it remains unclear how suitable ultrahigh-definition digital images and multispectral data are for extracting maize planting density information through UAV-based remote sensing. Factors such as image resolution, spectral resolution, method selection (e.g., object detection versus statistical regression), and their impacts on results and mapping are not yet thoroughly understood. A more comprehensive comparison and evaluation of their practical application is therefore needed.

The main tasks of this study are as follows: (1) Develop two UAV remote sensing-based methods for monitoring maize planting densities: ultrahigh-definition imagery combined with object detection (UHDI-OD) and multispectral remote sensing combined with machine learning (Multi-ML). (2) Measure maize planting densities at a maize breeding trial site, collect UAV ultrahigh-definition and multispectral imagery, and use these data to experimentally test and validate the proposed monitoring methods. (3) Analyze and discuss in depth the applicability, limitations, advantages, and disadvantages of the two estimation models.
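Both methods ultimately report the same target quantity, plants per square meter over a plot grid. The sketch below shows this final mapping step under illustrative assumptions (grid layout and plot size are hypothetical): counts per cell, whether detected by UHDI-OD or predicted by Multi-ML, are divided by cell area to yield a density map.

```python
# Minimal sketch: converting per-plot plant counts into a density map.
import numpy as np

def density_map(plant_counts, cell_area_m2):
    """Convert a grid of plant counts per cell into plants/m2."""
    return np.asarray(plant_counts, dtype=float) / cell_area_m2

# Hypothetical 3 x 4 grid of breeding plots, each 2 m x 5 m (10 m2),
# with counts from object detection or regression prediction.
counts = np.array([[55, 62, 48, 70],
                   [60, 58, 66, 52],
                   [49, 71, 63, 57]])
print(density_map(counts, cell_area_m2=10.0))  # e.g., 5.5 plants/m2 top-left
```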

This study aimed to answer the following questions:

(1) How can maize planting densities be estimated by combining ultrahigh-definition RGB digital cameras, UAVs, and object detection models?

(2) How can multispectral remote sensing sensors, UAVs, and machine learning techniques be integrated to estimate maize planting densities?

(3) What are the advantages, disadvantages, and applicable scenarios of (a) UHDI-OD and (b) Multi-ML for estimating maize planting densities?

6. Conclusions

This study developed two maize planting density monitoring methods using UAV remote sensing: UHDI-OD and Multi-ML. UHDI-OD achieved highly accurate results (R2 = 0.99, RMSE = 0.09 plants/m2) and showed promising precision in recognizing maize objects. Multi-ML, which combines VIs and gray-level co-occurrence matrix (GLCM) texture features, also produced accurate results (R2 = 0.76, RMSE = 0.67 plants/m2), with the random forest (RF) model performing best. UHDI-OD is sensitive to image resolution and is not suitable for UAV images with pixel sizes exceeding 2 cm, whereas Multi-ML is less sensitive to resolution, offering greater convenience and cost-effectiveness for large-scale field monitoring. Therefore, although UHDI-OD achieves higher accuracy, Multi-ML is more cost-effective for large-scale field monitoring. While this study introduced two methods for monitoring maize planting densities, there remain areas for improvement. Future research should focus on refining model performance with real low-resolution images, incorporating temporal dynamics for comprehensive monitoring, and expanding the scope to different regions and crop types for broader applicability.
