Multispectral imagery
Sowing
Precision agriculture
Remote sensing
Agronomy
RGB color model
Mathematics
Environmental science
Artificial intelligence
Biology
Computer science
Agriculture
Geography
Ecology
Authors
Norman Wilke, Bastian Siegmann, Johannes Postma, Onno Muller, Vera Krieger, Ralf Pude, Uwe Rascher
Identifier
DOI:10.1016/j.compag.2021.106380
Abstract
Cereal plant density is a relevant agronomic trait in agriculture, and high-throughput phenotyping of plant density is important for the decision-making process in precision farming and breeding. It influences the water and fertilization requirements, the intraspecific competition, and the occurrence of weeds or pathogens. Recent studies have determined plant density using machine-learning approaches and feature extraction. This requires spatially very highly resolved images (0.02 cm), because accuracy decreased distinctly when images had lower resolution. In this study, we present an approach that uses the linear relationship between plant density manually counted in the field and fractional cover derived from an RGB and a multispectral camera mounted on an unmanned aerial vehicle (UAV). We assumed that at an early seedling stage fractional cover is closely related to the number of plants. Spring barley and spring wheat experiments, each with three genotypes and four different sowing densities, were examined. The practicability and repeatability of the methodology were evaluated with an independent experiment consisting of 42 winter wheat genotypes. This experiment differed mainly in genotypes, sowing density, and season. The empirical regression models that make use of multispectral images having a GSD of 0.69 cm were able to determine plant density with a high prediction accuracy for barley and wheat (R2 > 0.91, mean absolute error (MAE) < 28 plants). In addition, prediction accuracy declined only slightly for multispectral image data having 1.4 cm GSD or RGB image data having 0.6 cm GSD (MAE < 35 plants m−2). BBCH stage 13 was identified as the ideal growth stage, in which the plants were large enough to accurately determine fractional cover even from the lower-resolution image data. Moreover, a developed empirical regression model was transferred to an independent experimental field, verifying its robustness across different conditions.
The prediction accuracy of UAV-estimated plant density showed an R2 value of 0.83 and an MAE of less than 21 plants m−2. Furthermore, manual measurements of 11 randomly selected plots proved sufficient for a user-based training of the regression model (R2 = 0.83, MAE < 23 plants m−2) adapted to the independent experimental field. The method and the use of UAV image data enable high-throughput phenotyping of cereal plant density with uncertainties of less than 10 %. The practicability, repeatability, and robustness of the developed approach were demonstrated in this study.
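The core of the approach described in the abstract is an empirical linear regression between UAV-derived fractional cover and manually counted plant density, trained on a handful of plots and then applied field-wide. The following is a minimal sketch of that idea, not the authors' implementation: the model form is assumed to be a simple ordinary-least-squares line, and the plot data below are hypothetical illustrative values.

```python
# Sketch of the empirical-regression idea: fit plant density (plants/m^2)
# as a linear function of fractional cover, then evaluate with the MAE.
# Model form (OLS line) and all numbers are assumptions for illustration.

def fit_linear(x, y):
    """Ordinary least squares for y = slope * x + intercept."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return slope, intercept

def predict(slope, intercept, x):
    return [slope * xi + intercept for xi in x]

def mae(y_true, y_pred):
    """Mean absolute error between observed and predicted densities."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical training plots: fractional cover (0-1) from UAV imagery
# paired with manually counted plant density (plants per m^2).
cover = [0.05, 0.10, 0.15, 0.20, 0.25]
density = [100, 200, 300, 400, 500]

slope, intercept = fit_linear(cover, density)
pred = predict(slope, intercept, cover)
print(slope, intercept, mae(density, pred))
```

In practice the abstract indicates such a model can be retrained with as few as 11 manually counted plots when transferred to a new field, so the training step above would use only a small calibration subset.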