High-throughput phenotyping in cotton: a review


Abstract

Recent technological advances in cotton (Gossypium hirsutum L.) phenotyping have offered tools to improve the efficiency of data collection and analysis. High-throughput phenotyping (HTP) is a non-destructive and rapid approach for monitoring and measuring multiple phenotypic traits related to growth, yield, and adaptation to biotic or abiotic stress. Researchers have conducted extensive experiments on HTP and developed techniques including spectral, fluorescence, thermal, and three-dimensional imaging to measure the morphological, physiological, and pathological resistance traits of cotton. In addition, ground-based and aerial-based platforms have been developed to aid the implementation of these HTP systems. This review highlights the techniques and recent developments for HTP in cotton, reviews the potential applications according to morphological and physiological traits of cotton, and compares the advantages and limitations of these HTP systems when used in cotton cropping systems. Overall, the use of HTP has generated many opportunities to accurately and efficiently measure and analyze diverse traits of cotton. However, because of its relative novelty, HTP has some limitations that constrain the ability to take full advantage of what it can offer. These challenges need to be addressed to increase the accuracy and utility of HTP, which can be done by integrating analytical techniques for big data and continuous advances in imaging.

Background

Plant phenotyping measures the morphological and physiological traits of plants as a function of genetics, environment, and management (Yang et al. 2017). Phenotyping on large quantities of plants has traditionally been challenging, involving time- and resource-consuming measurements of the parameters (Qiu et al. 2018). However, the digital revolution has brought advancements in phenotyping that will be greatly beneficial to the plant sciences. In plant breeding, high throughput phenotyping (HTP) – a nondestructive and noninvasive approach of measuring complex plant traits – is a promising tool that can help to reach solutions toward the long-standing “10 Billion People Question” (Ray et al. 2013; Tester and Langridge 2010). Like the advent of high throughput production in other industries and sciences, rapid phenotyping of complex plant traits related to the growth, yield, and adaptation to biotic or abiotic stress would significantly optimize crop production.

Plant phenotyping techniques based on remote sensing technologies and reflectance data are important tools in improving agricultural management schemes (Candiago et al. 2015). Vegetation indices (VI) derived from the spectral reflectance data can be used to estimate and monitor plant growth parameters such as leaf area index, ground cover fraction, leaf water status, chlorophyll or nitrogen concentrations, amongst other variables (Cammarano et al. 2014; Haboudane et al. 2008; Tanriverdi 2006). More specifically, VI are key components of precision agriculture because of their valuable applications in estimating crop yield, in variable-rate application technologies involving chemical spraying and fertility management, and in detecting weeds and crop diseases (Grisso et al. 2011; Zerger et al. 2010).

More recently, HTP systems using imaging techniques were developed to improve the efficiency of cotton (Gossypium hirsutum L.) phenotyping. Some applications of these technologies include in-field cotton boll detection based on color and textural features using two-dimensional (2D) color images (Li et al. 2016), measurement of plant height and canopy cover (Jiang et al. 2016; Sharma and Ritchie 2015), detection of flowers based on multispectral images (Xu et al. 2019), measurement of internode lengths using an in-field machine vision system (McCarthy et al. 2010), estimation of water status from thermal images acquired with an infrared thermal camera (Cohen et al. 2005), and measurement of canopy height, temperature, and the normalized differential vegetation index (NDVI) (Andrade-Sanchez et al. 2014). With the extensive production of cotton worldwide, owing to its importance as a natural fiber-producing crop, these HTP systems offer great potential for improving the accuracy, efficiency, speed, and quality of data collection for determining in-season crop growth and development compared with traditional phenotyping. However, because of the heterogeneity of field plots and variations in environmental conditions in cotton production, challenges are inevitable when implementing these systems.

This review paper has the following objectives:

  1. Identify the techniques and recent developments of HTP in cotton;

  2. Discuss the potential applications according to the morphological and physiological traits of cotton;

  3. Compare the advantages and limitations of these HTP systems when used in cotton cropping systems.

Techniques and developments

HTP sensors, platforms, and other high-resolution applications

a. Sensors

Electromagnetic sensors are commonly used in HTP, because they offer quick and nondestructive estimation of crop growth parameters. Commonly used sensors detect radiation with frequencies that correspond with reflectance, emission, and fluorescence of electromagnetic radiation. As a result, the sensor types are categorized by wavelength or frequency, as well as by the physical parameters being measured. For the purposes of this paper, spectral, thermal, fluorescence, and three-dimensional (3D) sensors will be discussed separately, although there are overlaps in technology and wavelengths among some of these sensors. A summary of the different sensing techniques used for cotton HTP applications is presented in Table 1.

Table 1 Summary of the sensing techniques typically used in high-throughput phenotyping applications in cotton

Spectral cameras and spectrometers are usually used in high throughput sensing to measure reflected visible and near-infrared radiation (NIR), with specific wavelengths chosen for their relationship to plant structure and biochemistry (Curran 1989). Plant leaf reflectance is largely determined by chlorophyll, mesophyll structure, water, oxygen, and several other chemical and structural characteristics (Liu et al. 2016a). As a result, ratios, normalized ratios, and other more complex formulas have been used to ascertain chlorophyll density, ground cover fraction, nitrogen status, and several other broad physiological parameters (Knyazikhin et al. 2013; Ollinger et al. 2008; Ritchie et al. 2010; Xue and Su 2017).

Imaging quantifies plant structure, using measurements of reflected, absorbed, or transmitted light for quantitative phenotypic analysis of multiple traits such as ground cover fraction, leaf area, color, seedling vigor and morphology, root structures, nutrient content, disease detection and assessment, and yield (Li et al. 2014). The interactions of plants and light, particularly in relation to photosynthetic responses, are also the basic concept of the VI, which are defined as spectral transformations that emphasize the presence and state of vegetation (Bannari et al. 1995; Khan et al. 2018b). Some of the widely known VI are the NDVI, green normalized difference vegetation index (GNDVI), red edge normalized difference vegetation index or normalized difference red edge (RENDVI or NDRE), soil adjusted vegetation index (SAVI), modified soil adjusted vegetation index (MSAVI), and enhanced vegetation index (EVI) (Bannari et al. 1995; Haboudane et al. 2004; Jackson and Huete 1991; Panda et al. 2010). Imaging techniques using these VI have been widely integrated in different remote sensing applications, especially in precision agriculture. Thenkabail et al. (2000) evaluated NDVI, SAVI, and optimum multiple narrow band reflectance (OMNBR) values obtained using a spectroradiometer and reported that 12 specific narrow bands, between 350 nm and 1 050 nm, provided the optimal estimations of leaf area index, plant height, and yield in cotton, with accuracy ranging from 64% to 88%. Ritchie et al. (2008) reported a close correlation (r2 = 0.72) between cotton NDVI values obtained from a camera system (unmodified and NIR-sensitive Nikon Coolpix 4300 digital cameras) and from a ground-based spectrometer. Quantitative and qualitative assessment of vegetation using VI can be affected by several factors, such as complex canopy structure and varying soil properties (Bannari et al. 1995).
To minimize the influence of soil background on vegetation spectra, Huete (1988) developed the SAVI using the value of 0.5 as a fixed soil adjustment factor (L). The SAVI was later improved when Qi et al. (1994) developed the MSAVI based on spectral measurements of cotton over soils with different colors and moisture levels. Unlike SAVI, MSAVI has a self-adjusting L to account for the variability in soil conditions. Aside from SAVI and MSAVI, the perpendicular vegetation index (PVI) can also be used to minimize background effects (Elvidge and Chen 1995).
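These indices reduce to simple arithmetic on band reflectances. A minimal Python sketch of NDVI, SAVI, and MSAVI using the standard formulas from the literature cited above; the reflectance values in the example are hypothetical:

```python
import math

def ndvi(nir, red):
    """Normalized difference vegetation index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red)

def savi(nir, red, L=0.5):
    """Soil adjusted vegetation index (Huete 1988); L is the fixed
    soil adjustment factor, 0.5 by default."""
    return (1 + L) * (nir - red) / (nir + red + L)

def msavi(nir, red):
    """Modified SAVI (Qi et al. 1994): the soil adjustment term is
    self-adjusting, folded into the closed-form expression below."""
    return (2 * nir + 1 - math.sqrt((2 * nir + 1) ** 2 - 8 * (nir - red))) / 2

# Hypothetical reflectances for a healthy canopy
nir, red = 0.45, 0.08
print(ndvi(nir, red), savi(nir, red), msavi(nir, red))
```

A dense green canopy (high NIR, low red reflectance) drives all three indices toward 1, while bare soil pulls NDVI down more strongly than the soil-adjusted variants.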

The use of plant reflectance for HTP is useful, but there are several principles that should be considered when using the reflectance approach. First, reflectance is most likely to correlate with pigments or plant structures that are most prevalent in a plant. For example, the dominant absorption in plant leaves in the visible spectrum is due to chlorophyll, and VI which use visible reflectance largely measure chlorophyll density, either within the leaf or within the scene detected by the sensing system. As a result, the use of a VI using visible and NIR reflectance to measure nitrogen stress, water stress, or any other limiting factors only has validity in the absence of other plant stressors that affect the reflectance of the plant in a similar manner. For example, water deficit stress in cotton results in a decreased leaf area index, which corresponds with a lower VI due to less leaf area sensed within the system. However, nitrogen stress also results in a decreased leaf area index and lower chlorophyll density within individual leaves, which also corresponds with a lower VI. Therefore, researchers should be cautious in assigning changes in vegetation reflectance indices to particular causal agents without elimination of other potential confounding factors.

The advent of sensing systems with high spatial resolution provides opportunities for the discrimination of leaf color from leaf coverage. For example, a satellite image with 1 m × 1 m resolution will detect an individual pixel as a combination of plant leaves, soil, and any other features within the scene of the pixel. Conversely, pixel resolutions of 2 cm × 2 cm or smaller are common in unmanned aerial vehicle (UAV) applications, so an individual pixel may correspond with an individual leaf or adjacent leaves. These increases in resolution may be of value in HTP, because they allow the discrimination of leaf color from leaf coverage. However, spectral calibration becomes increasingly important in these cases, since imagery of a field may be composed of thousands of individual images with their own corresponding lighting and camera settings.

Fluorescence meters have also been used to detect plant metabolic or biochemical activity (Li et al. 2018). Fluorescence is the re-emission of radiation at a different wavelength by a surface that has absorbed light or similar electromagnetic radiation. The re-emitted light usually has a longer wavelength, and consequently lower energy, than the original absorbed radiation. Fluorescence therefore differs from reflectance, which measures the quantity of light reflected from the surface at the same wavelength. Fluorescence has many practical applications, but in plants it is valuable because it can be used to quantify the activity of several pigments and processes, including photosynthetic conversion efficiency (Gao et al. 2017; Massacci et al. 2008; Zhang et al. 2018). These techniques have been used extensively in recent years to determine heat tolerance in cotton (Oosterhuis et al. 2008; Snider et al. 2015; Wu et al. 2014).

As discussed by Meroni et al. (2009), remote sensing of fluorescence in plants usually focuses on solar-induced chlorophyll fluorescence (F). When the sensor is in close proximity to the plant, an active light source can be used to ascertain fluorescence more accurately, but many remote sensing applications attempt to quantify F passively. These methods are still under development, even though the first attempt at passive fluorescence measurement in plants was made in the 1970s. Because fluorescence occurs at wavelengths that are also reflected, it is not measured independently of plant reflectance and is subject to the same challenges discussed for reflectance, with the added limitation that fluorescence creates a small spectral signal beyond that of reflectance and requires a combination of high spectral resolution and minimal background noise for accurate measurement.
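One widely used passive retrieval covered in the Meroni et al. (2009) review is Fraunhofer Line Discrimination (FLD), which compares irradiance (E) and radiance (L) inside and outside a narrow absorption band (e.g., the O2-A band near 760 nm), assuming reflectance and fluorescence are constant across the band. A minimal sketch with hypothetical values:

```python
def fld_fluorescence(e_in, e_out, l_in, l_out):
    """Standard FLD: solve the two equations L = r*E + F (inside and
    outside the absorption band) for the fluorescence term F, assuming
    reflectance r and fluorescence F are equal at both wavelengths."""
    return (e_out * l_in - e_in * l_out) / (e_out - e_in)

# Hypothetical scene: true reflectance r = 0.5, true fluorescence F = 0.01,
# irradiance 0.3 inside the absorption band and 1.0 outside it.
l_in = 0.5 * 0.3 + 0.01
l_out = 0.5 * 1.0 + 0.01
print(fld_fluorescence(0.3, 1.0, l_in, l_out))  # recovers F = 0.01
```

The deep, narrow irradiance dip is what makes the small fluorescence signal separable from the much larger reflected signal, which is why high spectral resolution is essential.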

Thermal sensing is a nondestructive method of assessing crop water deficit based on the measurement of canopy temperature. As cotton becomes water-stressed, stomatal closure decreases transpiration, with an attendant increase in temperature (Blonquist Jr. et al. 2009a). As a result, thermal sensing has been used to detect temperature stress and temperature profiles within crop canopies in several studies (Blum et al. 1982; Falkenberg et al. 2007; Jones et al. 2009; Mahan et al. 2010; Sullivan et al. 2007; Wanjura et al. 2004). Thermal sensing measures temperature from a combination of emitted thermal radiation and the relative emissivity of the objects being measured. Since plant leaves are composed mostly of water, and water has a high emissivity, leaf temperature measurements can be quite accurate; in many cases, the measurement error is within 0.1 °C (Blonquist Jr. et al. 2009b). However, limitations of canopy temperature measurement include low spatial resolution (Manfreda et al. 2018) and the effects of surrounding features with relatively high radiation outputs (Jones et al. 2009). There are also challenges from the thermal drift associated with sensor temperature (Blonquist Jr. et al. 2009b; Mahan et al. 2010). In addition, thermal sensors, particularly thermal cameras, tend to be expensive and fragile. Because of these drawbacks, applications of thermal imagery for measuring plant canopy stress must balance the physical constraints of the sensing environment against the promise of sensing a biologically important abiotic stress.
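The emissivity dependence can be illustrated with a simplified sketch that inverts the Stefan–Boltzmann law over the full emission spectrum. Real thermal cameras work from band-limited radiance and calibration curves, so this is only a first-order illustration under idealized assumptions:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W·m⁻²·K⁻⁴

def surface_temperature(radiant_exitance, emissivity=0.98):
    """Estimate surface temperature (K) from total emitted thermal
    radiation (W·m⁻²) by inverting M = emissivity * SIGMA * T**4.
    Leaves, being mostly water, behave as near-blackbodies
    (emissivity ~0.98)."""
    return (radiant_exitance / (emissivity * SIGMA)) ** 0.25

# A leaf at 300 K with emissivity 0.98 emits about 450 W·m⁻²;
# inverting that radiation recovers the 300 K temperature.
m = 0.98 * SIGMA * 300 ** 4
print(surface_temperature(m))
```

The sketch also shows why emissivity errors matter: assuming 0.98 for a low-emissivity background object (e.g., dry soil) biases the retrieved temperature.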

Another remote sensing system that is growing in popularity is light detection and ranging (LiDAR). It estimates the distance between the sensor and the target object from the time-of-flight (TOF) of a laser pulse once the target object is illuminated (Deery et al. 2014; Sun et al. 2018). The output of LiDAR is a point cloud, commonly used in 3D reconstruction, which is the process of capturing the shape and appearance of real objects from a set of images (Whitaker 1998). One particular advantage offered by LiDAR remote sensing and 3D reconstruction over manual methods is their capability to characterize canopy volume and crop density even in heterogeneous field plots (Bietresato et al. 2016).
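The core TOF calculation is straightforward: the laser pulse travels to the target and back at the speed of light, so the range is half the round-trip path. A minimal sketch:

```python
C = 299_792_458.0  # speed of light in vacuum, m·s⁻¹

def tof_distance(round_trip_time_s):
    """Range to target from a time-of-flight measurement; the pulse
    covers the sensor-target distance twice, hence the division by 2."""
    return C * round_trip_time_s / 2

# A 20-nanosecond round trip corresponds to a target ~3 m away,
# a typical working range for tractor-mounted canopy scanning.
print(tof_distance(20e-9))
```

The nanosecond timescales involved are why LiDAR units need high-speed timing electronics, one driver of the system costs discussed below.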

The two most prominent studies that feature the use of LiDAR to scan cotton plants were done by French et al. (2016) and Sun et al. (2017). Both systems were accompanied by a global positioning system (GPS) and mounted on a tractor platform. High resolution and low distortion mapping of cotton heights, widths, leaf area, and boll counts were achieved by the system developed by French et al. (2016) while multiple traits including plant height, projected canopy area, and plant volume were simultaneously extracted from repeated measurements over the growing season by Sun et al. (2017).

In practice, limitations of LiDAR in HTP have been related to the cost of the sensing system, which may be in the tens to hundreds of thousands of dollars, and the lack of corresponding red, green, and blue (RGB) spectral information associated with the 3D structural measurements of the sensors. As with other 3D imaging systems, LiDAR also requires an open path to detect all of the features within a plant canopy, so features may be obscured without multiple angles of detection. In addition, LiDAR may be affected by surface reflectivity and has potential health hazards associated with the lasers which are used.

UAV with spectral imaging sensors can obtain the spectral absorption and reflectance characteristics of crops, which can be used to monitor the crop planting area and crop growth, evaluate the biological and physical characteristics of a crop, and predict crop yield (Yang et al. 2017). Ritchie and Bednarz (2005) used a photosynthetically active radiation/near infrared spectrometer to investigate the relationship of red-edge based NDVI and leaf area index and to quantify cotton defoliation. Results showed that spectral data based on red edge measurements can provide accurate defoliation estimates which could possibly improve defoliation efficiency.

b. Platforms

With the development of acquisition technologies for HTP, crop growth and development can be monitored with phenotyping systems mounted on a ground-based or aerial-based platform (Duan et al. 2017), which allows capturing high-resolution images and multiple crop traits at canopy level (Khan et al. 2018a). Ground-based HTP platforms, typically equipped with a GPS navigation device and sensors, can produce data of higher resolution because of their ability to capture images at a closer range relative to the plant (Araus and Cairns 2014; Condorelli et al. 2018). Aerial-based HTP platforms offer greater speed in capturing and measuring traits over a larger coverage area. These two platforms have their own advantages and limitations when used in cotton phenotyping (Table 2).

Table 2 Advantages and disadvantages of ground-based and aerial-based types of platforms for cotton phenotyping

Various ground-based systems have been developed and applied for a wide range of phenotypic and agronomic studies in cotton. A ground-based plant phenotyping system built on a LeeAgra 3434 DL open rider sprayer with three types of sensors was used by Andrade-Sanchez et al. (2014) to evaluate variations in canopy height, reflectance, and temperature of 25 Pima cotton genotypes grown under optimal irrigation and water-limited conditions. As expected, the system acquired data more efficiently than manual measurement, with moderate to strong agreement between the two approaches (r2 = 0.35–0.82). One advantage of this system is the stability of the structure holding the sensors and its minimal damage to cotton stands, particularly in plots with tall plants. This is an improvement, since a common concern with ground-based platforms is the damage that can be caused by the size of the platform relative to the size of the plant, as well as the space required for unrestrained movement of the system. Another advantage over manual phenotyping is that the use of multiple georeferenced sensors minimized bias in selecting representative samples within a plot. However, potential limitations of this HTP system include difficulty in maneuvering when row spacing is narrow or the soil is wet, and the relatively low clearance of one of the sensors (the ultrasonic proximity sensor). The study emphasized that the maximum clearance of the ultrasonic proximity sensor was not high enough to cover the tallest cotton plants, which highlights the consideration that should be given to sensor height, especially in areas with large variations in plant or canopy height. Meanwhile, image spatial resolution was limited by the vehicle speed through the field and by the sampling frequency of the data collection system; improved electronics and signal processing will therefore be needed for higher throughput in cotton.

Another recently developed ground-based phenotyping system is GPhenoVision, which mainly consists of RGB combined with depth (RGB-D), thermal, and hyperspectral cameras (Xu et al. 2018b). This HTP system was used to evaluate multi-dimensional morphological traits of cotton, such as leaf area and canopy volume, and showed the potential of measuring phenotypic traits for genomics and breeding studies at a small scale. A rubber cushion was applied to the sensor frames to reduce vibrations and thus the likelihood of acquiring blurry images, which has been one of the main concerns with ground-based platforms. The authors noted some limitations of the system that can be further improved, such as optimizing the illumination configuration for the three sensors, improving the data processing algorithms so that the system can capture data in a regular plot layout and extract complex traits from 3D or hyperspectral images, and increasing the speed of data processing.

The commonly used aerial-based platforms for cotton phenotyping are rotary-wing and fixed-wing UAV. Fixed-wing UAV have faster flight speed, longer flight time, and larger flight area coverage than rotary-wing UAV (Ziliani et al. 2018). However, the lack of free-hover ability and the high flight speeds and altitudes of fixed-wing UAV often result in blurry images (Herwitz et al. 2004). Rotary-wing UAV have been commonly used for crop phenotyping because they are relatively inexpensive, easy to control, and able to hover. Flight planners such as Precision Flight, Drone Deploy, DJI Go, and Litchi can build flight missions with specified flight height, speed, and image overlap, enabling pre-designed flight routes and automatic landing. However, rotary-wing UAV offer shorter flight time, lower payload, greater sensitivity to weather conditions, and weaker wind resistance than fixed-wing UAV (Shi et al. 2016; Zhang and Kovacs 2012). These disadvantages limit the application of rotary-wing UAV in crop phenotyping at a large scale. Areas of improvement for rotary-wing UAV systems include longer battery duration to ensure greater area coverage. For fixed-wing UAV, a faster frame rate, shorter exposure time, and higher spatial resolution would greatly improve performance (Shi et al. 2016).

Han et al. (2018) reported that high wind speed is a challenge when acquiring high-quality plant height data using UAV. In addition, digital terrain model (DTM) or digital surface model (DSM) errors can also contribute to biases in plant height assessment. The highest point of a cotton plant can be smoothed out because of the pixel size or the movement of the plant, resulting in a value lower than the actual maximum plant height (Wang et al. 2018; Xu et al. 2019). Wang et al. (2018) reported that when plant density is low, plant height measurements collected with UAV were lower than those collected with a ground-based platform. This may be due to the lower resolution of the images generated by the UAV platform: a lower-resolution digital elevation model (DEM) delineated from a UAV platform yields a partially complete canopy profile and lower plant height values than ground-based measurement. In addition, the movement of plant leaves can affect overlapping images, which in turn can introduce noise into the 3D points (Xu et al. 2019). The unevenness of the soil surface can also be an issue for cotton plant height measurement. Xu et al. (2019) reported that the standard deviation of the difference between the ground plane and the DEM for ground pixels was 1–12 cm. Similarly, Chu et al. (2016) reported about 5 cm of uncertainty in the DEM over bare soil areas, which introduced baseline error when calculating plant height. In addition, georeferencing errors from the ortho-mosaic and DEM constrained plot-scale and temporal data analysis. These errors can be greatly reduced by applying accurately surveyed ground control points (GCP) when georeferencing the UAV images; therefore, GCP at multiple heights are needed for the calibration of plant height (Han et al. 2018).
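The height retrieval these studies rely on reduces to differencing the elevation models pixel by pixel. A minimal pure-Python sketch, with a hypothetical percentile-based plot summary to soften the maximum-height noise discussed above (the grids and percentile choice are illustrative, not from any cited study):

```python
def plant_heights(dsm, dtm):
    """Per-pixel canopy height as digital surface model (canopy top)
    minus digital terrain model (bare soil), both as 2D grids in meters."""
    return [[s - t for s, t in zip(srow, trow)]
            for srow, trow in zip(dsm, dtm)]

def plot_height(heights, percentile=0.99):
    """Summarize a plot with a high percentile rather than the absolute
    maximum, so a few noisy 3D points do not inflate the estimate."""
    flat = sorted(h for row in heights for h in row)
    return flat[min(len(flat) - 1, int(percentile * len(flat)))]

# Tiny hypothetical 2x2 plot: DSM in meters above datum, DTM likewise.
dsm = [[1.2, 1.4], [1.1, 1.3]]
dtm = [[0.2, 0.2], [0.1, 0.2]]
print(plot_height(plant_heights(dsm, dtm)))
```

The sketch makes the error propagation explicit: any bias in the DTM baseline (the ~5 cm bare-soil uncertainty reported above) passes straight through the subtraction into the height estimate.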

c. Other high-resolution applications

High throughput phenotyping technologies, from different platforms equipped with single or multiple sensors, have generated massive and diverse sets of data for analysis (Singh et al. 2016). These datasets are important in computer vision-based plant phenotyping applications, such as pattern recognition (Mochida et al. 2018). Several studies have used pattern recognition techniques to improve the management of agronomic resources. Biradar and Shrikhande (2015) proposed a system that detects and counts flowers from image patterns captured by a digital camera. The method used Gaussian low-pass filtering and morphological operations to remove non-flower regions of the image and emphasize fine details of the flower region. This method is advantageous in a greenhouse setting, mainly for growers who rely on flower counts for revenue purposes. Similar principles of pattern recognition were also used in the systems developed by Adamsen et al. (2000) and Hsu et al. (2011).
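The segment-then-count step shared by these methods can be sketched as a connected-component count on a binary mask (pixels already classified as "flower" by color thresholding); actual systems apply the Gaussian filtering and morphological cleanup described above before this step. This is a simplified stand-in, not the cited authors' implementation:

```python
from collections import deque

def count_blobs(mask):
    """Count 4-connected regions of True pixels in a binary mask via
    breadth-first search; each region is taken to be one flower."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    blobs = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                blobs += 1                      # found a new region
                queue = deque([(r, c)])
                seen[r][c] = True
                while queue:                    # flood-fill the region
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return blobs

# Hypothetical 3x4 mask with two separate flower regions
mask = [[True, True, False, False],
        [False, False, False, True],
        [False, False, True, True]]
print(count_blobs(mask))
```

The sketch also makes the occlusion limitation concrete: two flowers whose masks touch merge into a single component and are counted once.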

In cotton, one proposed application of pattern recognition is the identification of cotton leaf diseases. Revathi and Hemalatha (2012) proposed the use of image-processing edge detection techniques and the homogeneous pixel counting technique for cotton disease detection (HPCCDD) algorithm to detect symptoms of Fusarium wilt, Verticillium wilt, and leaf blight.

A pattern recognition algorithm called convolutional neural networks (CNN) was used by Xu et al. (2018a) to identify and count opened cotton flowers in aerial color images. A CNN distinguishes objects from one another by assigning learnable weights or biases to various objects in the input image (Saha 2018). Liu et al. (2016b) reported the same class of algorithm to be effective in identifying flower species. Although the results of Xu et al. (2018a) confirmed that the system for identifying and automatically counting cotton flowers was comparable with manual counting, one disadvantage emphasized by the authors was the underestimation of bloom counts when data were collected from a single plot with multiple crop stands. This limitation was due to the inability of the system to capture hidden flowers.

Xu et al. (2018b) developed an autonomous ground robot system designed to count the number of cotton bolls. The robot is equipped with a real-time kinematic (RTK)-GPS system, an inertial measurement unit, and Waypoint. These three components ensure that the robot can navigate the field accurately, without human intervention, and without damaging the crop when it is between rows. Data processing involves constructing a 3D point cloud from raw images and then counting the cotton bolls in the point cloud. The study succeeded in showing that opened cotton bolls can be counted from a 3D point cloud with little human participation in the actual data collection. The field set-up consisted of one plant per plot, with plots 1 m apart and rows 1.6 m apart; it would be interesting to see whether this type of robot system would be effective under a more realistic field scenario with 9–13 plants per meter and narrower plot and row spacing.

A time series can be used to monitor changes in the growth characteristics of cotton over time (Hansen et al. 2014). In general, data acquired from multi-temporal high-resolution and low-resolution time series can provide relevant information about crop type, cropping patterns, and other crop growth parameters (Liu et al. 2018; Waldner et al. 2015). Wu et al. (2018) monitored the progression of cotton root rot based on NDVI time series profiles extracted from combined 250-m moderate resolution imaging spectroradiometer (MODIS) NDVI and 10-m Sentinel-2 NDVI time series. Compared with healthy cotton plants, the results showed decreases in parameters pertinent to assessing cotton root rot infection, such as growth duration and maximum NDVI. A similar concept of identifying cotton diseases using spectral and temporal signatures was also proposed by McKellip et al. (2005). Hao et al. (2016) used this technique to develop a method that can classify crops based on NDVI time series spanning multiple years. However, this system can be limited by differences in location and the nature of the cropping systems.
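Two of the time-series descriptors mentioned above, maximum NDVI and growth duration, can be computed directly from a per-plot NDVI series. A minimal sketch with hypothetical series and a hypothetical green-up threshold (the 0.3 cutoff is illustrative, not from the cited studies):

```python
def season_metrics(ndvi_series, threshold=0.3):
    """Peak NDVI and growth duration (number of observations above a
    green-up threshold) for one plot's NDVI time series. Diseased
    cotton is expected to show a lower peak and shorter duration."""
    peak = max(ndvi_series)
    duration = sum(1 for v in ndvi_series if v > threshold)
    return peak, duration

# Hypothetical seasonal NDVI trajectories, one observation per interval
healthy = [0.2, 0.4, 0.6, 0.8, 0.7, 0.5, 0.3]
diseased = [0.2, 0.35, 0.45, 0.4, 0.3, 0.25, 0.2]
print(season_metrics(healthy), season_metrics(diseased))
```

Comparing the two outputs reproduces the qualitative pattern reported for root rot: the diseased trajectory has both a lower peak and a shorter duration above the threshold.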

Conclusions

Improvement in cotton productivity depends heavily on the availability of good-quality phenotypic data. This review shows that HTP holds considerable potential for improving data collection, management, and analysis when measuring phenotypic traits in cotton, and for providing economic benefits through reduced input costs and resources (labor, time). Imaging techniques and sensor technologies using spectral, thermal, fluorescence, and 3D sensors are useful tools for assessing crop characteristics, monitoring crop growth and development, and assessing the health status of cotton. With the advent of these HTP technologies, various ground-based and aerial-based platform systems have been developed for phenotypic and agronomic studies in cotton. Even though each system has its advantages and limitations, it is clear from the studies cited in this review that both offer potential for precise plant phenotyping. In addition to these techniques, other high-resolution applications (pattern recognition and time series) have contributed significantly to understanding and monitoring the responses of cotton under different environmental conditions. Future research should focus on improving the robustness, accuracy, effectiveness, affordability, and maneuverability of HTP systems in cotton production. Improvements to HTP platforms should also address the ability of these systems to capture the variability in cotton fields.

Availability of data and materials

Not applicable. No datasets were generated or analyzed in this review paper.

Abbreviations

DEM:

Digital elevation model

DSM:

Digital surface model

DTM:

Digital terrain model

EVI:

Enhanced vegetation index

GCP:

Ground control points

GNDVI:

Green normalized difference vegetation index

GPS:

Global positioning system

HTP:

High-throughput phenotyping

LiDAR:

Light detection and ranging

MODIS:

Moderate resolution imaging spectroradiometer

MSAVI:

Modified soil adjusted vegetation index

NDRE:

Normalized difference red edge

NDVI:

Normalized differential vegetation index

NIR:

Near-infrared radiation

PVI:

Perpendicular vegetation index

RENDVI:

Red edge normalized difference vegetation index

RGB:

Red-green-blue

RGB-D:

Red-green-blue-depth

RTK:

Real time kinematics

SAVI:

Soil adjusted vegetation index

TOF:

Time-of-flight

UAV:

Unmanned aerial vehicle

VI:

Vegetation indices

References

  1. Adamsen F, Coffelt T, Nelson JM, et al. Method for using images from a color digital camera to estimate flower number. Crop Sci. 2000;40(3):704–9. https://doi.org/10.2135/cropsci2000.403704x.

  2. Andrade-Sanchez P, Gore MA, Heun JT, et al. Development and evaluation of a field-based high-throughput phenotyping platform. Funct Plant Biol. 2014;41(1):68–79. https://doi.org/10.1071/FP13126.

  3. Araus JL, Cairns JE. Field high-throughput phenotyping: the new crop breeding frontier. Trends Plant Sci. 2014;19(1):52–61. https://doi.org/10.1016/j.tplants.2013.09.008.

  4. Bannari A, Morin D, Bonn F, Huete AR. A review of vegetation indices. Remote Sens Rev. 1995;13(1–2):95–120. https://doi.org/10.1080/02757259509532298.

  5. Bietresato M, Carabin G, Vidoni R, et al. Evaluation of a LiDAR-based 3D-stereoscopic vision system for crop-monitoring applications. Comput Electron Agr. 2016;124:1–13. https://doi.org/10.1016/j.compag.2016.03.017.

  6. Biradar BV, Shrikhande SP. Flower detection and counting using morphological and segmentation technique. Int J Comput Sci Inform Technol. 2015;6:2498–501.

  7. Blonquist J Jr, Norman JM, Bugbee B. Automated measurement of canopy stomatal conductance based on infrared temperature. Agric For Meteorol. 2009a;149(11):1931–45. https://doi.org/10.1016/j.agrformet.2009.06.021.

  8. Blonquist J Jr, Tanner B, Bugbee B. Evaluation of measurement accuracy and comparison of two new and three traditional net radiometers. Agric For Meteorol. 2009b;149(10):1709–21. https://doi.org/10.1016/j.agrformet.2009.05.015.

  9. Blum A, Mayer J, Gozlan G. Infrared thermal sensing of plant canopies as a screening technique for dehydration avoidance in wheat. Field Crop Res. 1982;5:137–46. https://doi.org/10.1016/0378-4290(82)90014-4.

  10. Cammarano D, Fitzgerald G, Casa R, Basso B. Assessing the robustness of vegetation indices to estimate wheat N in Mediterranean environments. Remote Sens. 2014;6(4):2827–44. https://doi.org/10.3390/rs6042827.

  11. Candiago S, Remondino F, De Giglio M, et al. Evaluating multispectral images and vegetation indices for precision farming applications from UAV images. Remote Sens. 2015;7(4):4026–47. https://doi.org/10.3390/rs70404026.

  12. Chu T, Chen R, Landivar JA, et al. Cotton growth modeling and assessment using unmanned aircraft system visual-band imagery. J Appl Remote Sens. 2016;10(3):036018. https://doi.org/10.1117/1.JRS.10.036018.

  13. Cohen Y, Alchanatis V, Meron M, et al. Estimation of leaf water potential by thermal imagery and spatial analysis. J Exp Bot. 2005;56(417):1843–52. https://doi.org/10.1093/jxb/eri174.

  14. Condorelli GE, Maccaferri M, Newcomb M, et al. Comparative aerial and ground based high throughput phenotyping for the genetic dissection of NDVI as a proxy for drought adaptive traits in durum wheat. Front Plant Sci. 2018;9:893. https://doi.org/10.3389/fpls.2018.00893.

  15. Curran PJ. Remote sensing of foliar chemistry. Remote Sens Environ. 1989;30(3):271–8.

  16. Deery D, Jimenez-Berni J, Jones H, et al. Proximal remote sensing buggies and potential applications for field-based phenotyping. Agronomy. 2014;4(3):349–79. https://doi.org/10.3390/agronomy4030349.

  17. Duan T, Zheng B, Guo W, et al. Comparison of ground cover estimates from experiment plots in cotton, sorghum and sugarcane based on images and ortho-mosaics captured by UAV. Funct Plant Biol. 2017;44(1):169–83. https://doi.org/10.1071/FP16123.

  18. Elvidge CD, Chen Z. Comparison of broad-band and narrow-band red and near-infrared vegetation indices. Remote Sens Environ. 1995;54:38–48. https://doi.org/10.1016/0034-4257(95)00132-K.

  19. Falkenberg NR, Piccinni G, Cothren JT, et al. Remote sensing of biotic and abiotic stress for irrigation management of cotton. Agric Water Manag. 2007;87(1):23–31. https://doi.org/10.1016/j.agwat.2006.05.021.

  20. French AN, Gore MA, Thompson A. Cotton phenotyping with lidar from a track-mounted platform. Autonomous air and ground sensing systems for agricultural optimization and phenotyping. International Society for Optics and Photonics: Bellingham; 2016. https://doi.org/10.1117/12.2224423.

  21. Gao XF, Han JM, Lei CY, et al. Heterogeneity of chlorophyll fluorescence characteristics of leaves and non-foliar organs of cotton. Cotton Sci. 2017;29(2):195–203. 

  22. Grisso RD, Alley MM, Thomason W, et al. Precision farming tools: variable-rate application.2011. https://vtechworks.lib.vt.edu/bitstream/handle/10919/47448/442-505_PDF.pdf.

  23. Haboudane D, Miller JR, Pattey E, et al. Hyperspectral vegetation indices and novel algorithms for predicting green LAI of crop canopies: modeling and validation in the context of precision agriculture. Remote Sens Environ. 2004;90(3):337–52. https://doi.org/10.1016/j.rse.2003.12.013.

  24. Haboudane D, Tremblay N, Miller JR, et al. Remote estimation of crop chlorophyll content using spectral indices derived from hyperspectral data. IEEE T Geosci Remote. 2008;46(2):423–37. https://doi.org/10.1109/TGRS.2007.904836.

  25. Han X, Thomasson JA, Bagnall GC, et al. Measurement and calibration of plant-height from fixed-wing UAV images. Sensors. 2018;18(12):4092. https://doi.org/10.3390/s18124092.

  26. Hansen M, Egorov A, Potapov PV, et al. Monitoring conterminous United States (CONUS) land cover change with web-enabled Landsat data (WELD). Remote Sens Environ. 2014;140:466–84. https://doi.org/10.1016/j.rse.2013.08.014.

  27. Hao P, Wang L, Zhan Y, Niu Z. Using moderate-resolution temporal NDVI profiles for high-resolution crop mapping in years of absent ground reference data: a case study of bole and manas counties in Xinjiang. ISPRS Int J Geo-Inf. 2016;5(5):67. https://doi.org/10.3390/ijgi5050067.

  28. Herwitz SR, Johnson LF, Dunagan SE, et al. Imaging from an unmanned aerial vehicle: agricultural surveillance and decision support. Comput Electron Agr. 2004;44(1):49–61. https://doi.org/10.1016/j.compag.2004.02.006.

  29. Hsu TH, Lee CH, Chen LH. An interactive flower image recognition system. Multimed Tools Appl. 2011;53(1):53–73. https://doi.org/10.1007/s11042-010-0490-6.

  30. Huete AR. A soil-adjusted vegetation index (SAVI). Remote Sens Environ. 1988;25(3):295–309. https://doi.org/10.1016/0034-4257(88)90106-X.

  31. Jackson RD, Huete AR. Interpreting vegetation indices. Prev Vet Med. 1991;11(3–4):185–200. https://doi.org/10.1016/S0167-5877(05)80004-2.

  32. Jiang Y, Li C, Paterson AH. High throughput phenotyping of cotton plant height using depth images under field conditions. Comput Electron Agric. 2016;130:57–68. https://doi.org/10.1016/j.compag.2016.09.017.

  33. Jones HG, Serraj R, Loveys BR, et al. Thermal infrared imaging of crop canopies for the remote diagnosis and quantification of plant responses to water stress in the field. Funct Plant Biol. 2009;36(10-11):978–89. https://doi.org/10.1071/FP09123.

  34. Khan Z, Chopin J, Cai J, et al. Quantitative estimation of wheat phenotyping traits using ground and aerial imagery. Remote Sens. 2018a;10(6):950. https://doi.org/10.3390/rs10060950.

  35. Khan Z, Rahimi-Eichi V, Haefele S, et al. Estimation of vegetation indices for high-throughput phenotyping of wheat using aerial imaging. Plant Methods. 2018b;14(1):20. https://doi.org/10.1186/s13007-018-0287-6.

  36. Knyazikhin Y, Schull MA, Stenberg P, et al. Hyperspectral remote sensing of foliar nitrogen content. Proc Natl Acad Sci. 2013;110(3):E185–92. https://doi.org/10.1073/pnas.1210196109.

  37. Li DW, Zhang YJ, Liu LT, et al. Responses of canopy photosynthesis, spectral indices and solar-induced chlorophyll fluorescence in cotton under drought stress. Cotton Sci. 2018;30(3):242–51. https://doi.org/10.11963/issn.1002-7807.ldwzyj.20180503.

  38. Li L, Zhang Q, Huang D. A review of imaging techniques for plant phenotyping. Sensors. 2014;14(11):20078–111. https://doi.org/10.3390/s141120078.

  39. Li Y, Cao Z, Lu H, et al. In-field cotton detection via region-based semantic image segmentation. Comput Electron Agric. 2016;127:475–86. https://doi.org/10.1016/j.compag.2016.07.006.

  40. Liu C, Sun P, Liu S. A review of plant spectral reflectance response to water physiological changes. Chin J Plant Ecol. 2016a;40:80–91. https://doi.org/10.17521/cjpe.2015.0267.

  41. Liu J, Zhu W, Atzberger C, et al. A phenology-based method to map cropping patterns under a wheat-maize rotation using remotely sensed time-series data. Remote Sens. 2018;10(8):1203. https://doi.org/10.3390/rs10081203.

  42. Liu Y, Tang F, Zhou D, et al, editors. Flower classification via convolutional neural network. In: 2016 IEEE International Conference on Functional-Structural Plant Growth Modeling, Simulation, Visualization and Applications (FSPMA). New York: IEEE. 2016b. p. 110–16. https://doi.org/10.1109/FSPMA.2016.7818296.

  43. Mahan JR, Conaty W, Neilsen J, et al. Field performance in agricultural settings of a wireless temperature monitoring system based on a low-cost infrared sensor. Comput Electron Agric. 2010;71(2):176–81. https://doi.org/10.1016/j.compag.2010.01.005.

  44. Manfreda S, McCabe M, Miller P, et al. On the use of unmanned aerial systems for environmental monitoring. Remote Sens. 2018;10(4):641. https://doi.org/10.3390/rs10040641.

  45. Massacci A, Nabiev SM, Pietrosanti L, et al. Response of the photosynthetic apparatus of cotton (Gossypium hirsutum) to the onset of drought stress under field conditions studied by gas-exchange analysis and chlorophyll fluorescence imaging. Plant Physiol Biochem. 2008;46(2):189–95. https://doi.org/10.1016/j.plaphy.2007.10.006.

  46. McCarthy C, Hancock N, Raine S. Apparatus and infield evaluations of a prototype machine vision system for cotton plant internode length measurement. J Cotton Sci. 2010;14(4):221–32.

  47. McKellip R, Ryan RE, Blonski S, Prados D. Crop surveillance demonstration using a near-daily MODIS derived vegetation index time series. In: Proc. of the 2005 International workshop on the analysis of multi-temporal remote sensing images. New York: IEEE. 2005. https://doi.org/10.1109/AMTRSI.2005.1469839.

  48. Mochida K, Koda S, Inoue K, et al. Computer vision-based phenotyping for improvement of plant productivity: a machine learning perspective. GigaScience. 2018;8(1):giy153. https://doi.org/10.1093/gigascience/giy153.

  49. Ollinger SV, Richardson AD, Martin ME, et al. Canopy nitrogen, carbon assimilation, and albedo in temperate and boreal forests: functional relations and potential climate feedbacks. Proc Natl Acad Sci. 2008;105(49):19336–41. https://doi.org/10.1073/pnas.0810021105.

  50. Oosterhuis DM, Bourland FM, Bibi AC, et al. Screening for temperature tolerance in cotton. Summaries of Arkansas Cotton Research. 2008:37–41.

  51. Panda SS, Ames DP, Panigrahi S. Application of vegetation indices for agricultural crop yield prediction using neural network techniques. Remote Sens. 2010;2(3):673–96.

  52. Qi J, Chehbouni A, Huete A, et al. A modified soil adjusted vegetation index. Remote Sens Environ. 1994;48(2):119–26. https://doi.org/10.1016/0034-4257(94)90134-1.

  53. Qiu R, Wei S, Zhang M, et al. Sensors for measuring plant phenotyping: a review. Int J Agr Biol Eng. 2018;11(2):1–17. https://doi.org/10.25165/j.ijabe.20181102.2696.

  54. Ray DK, Mueller ND, West PC, Foley JA. Yield trends are insufficient to double global crop production by 2050. PLoS One. 2013;8(6):e66428. https://doi.org/10.1371/journal.pone.0066428.

  55. Revathi P, Hemalatha M. Advance computing enrichment evaluation of cotton leaf spot disease detection using image edge detection. In: 2012 Third International Conference on Computing, Communication and Networking Technologies (ICCCNT'12). Coimbatore: IEEE-20180. 2012. https://doi.org/10.1109/ICCCNT.2012.6395903.

  56. Ritchie G, Bednarz C. Estimating defoliation of two distinct cotton types using reflectance data. J Cotton Sci. 2005;9:182–9.

  57. Ritchie G, Sullivan D, Perry C, et al. Preparation of a low-cost digital camera system for remote sensing. Appl Eng Agric. 2008;24(6):885–94.

  58. Ritchie G, Sullivan D, Vencill W, et al. Sensitivities of normalized difference vegetation index and a green/red ratio index to cotton ground cover fraction. Crop Sci. 2010;50(3):1000–10. https://doi.org/10.2135/cropsci2009.04.0203.

  59. Saha S. A comprehensive guide to convolutional neural networks — the ELI5 way. The Medium: Towards Data Science Inc.; 2018. 

  60. Sharma B, Ritchie GL. High-throughput phenotyping of cotton in multiple irrigation environments. Crop Sci. 2015;55(2):958–69. https://doi.org/10.2135/cropsci2014.04.0310.

  61. Shi Y, Thomasson JA, Murray SC, et al. Unmanned aerial vehicles for high-throughput phenotyping and agronomic research. PLoS One. 2016;11(7):e0159781. https://doi.org/10.1371/journal.pone.0159781.

  62. Singh A, Ganapathysubramanian B, Singh AK, Sarkar S. Machine learning for high-throughput stress phenotyping in plants. Trends Plant Sci. 2016;21(2):110–24. https://doi.org/10.1016/j.tplants.2015.10.015.

  63. Snider J, Chastain D, Collins G. Field-grown cotton exhibits seasonal variation in photosynthetic heat tolerance without exposure to heat-stress or water-deficit conditions. J Agron Crop Sci. 2015;201(4):312–20. https://doi.org/10.1111/jac.12113.

  64. Sullivan DG, Fulton JP, Shaw JW, Bland G. Evaluating the sensitivity of an unmanned thermal infrared aerial system to detect water stress in a cotton canopy. Trans ASABE (Am Soc Agric Biol Eng). 2007;50(6):1963–9. https://doi.org/10.13031/2013.24091

  65. Sun S, Li C, Paterson A. In-field high-throughput phenotyping of cotton plant height using LiDAR. Remote Sens. 2017;9(4):377. https://doi.org/10.3390/rs9040377.

  66. Sun S, Li C, Paterson AH, et al. In-field high throughput phenotyping and cotton plant growth analysis using LiDAR. Front Plant Sci. 2018;9:16. https://doi.org/10.3389/fpls.2018.00016.

  67. Tanriverdi C. A review of remote sensing and vegetation indices in precision farming. J Sci Eng. 2006;9:69–76.

  68. Tester M, Langridge P. Breeding technologies to increase crop production in a changing world. Science. 2010;327(5967):818–22. https://doi.org/10.1126/science.1183700.

  69. Thenkabail PS, Smith RB, De Pauw E. Hyperspectral vegetation indices and their relationships with agricultural crop characteristics. Remote Sens Environ. 2000;71(2):158–82. https://doi.org/10.1016/S0034-4257(99)00067-X.

  70. Waldner F, Lambert M-J, Li W, et al. Land cover and crop type classification along the season based on biophysical variables retrieved from multi-sensor high-resolution time series. Remote Sens. 2015;7(8):10400–24. https://doi.org/10.3390/rs70810400.

  71. Wang X, Singh D, Marla S, et al. Field-based high-throughput phenotyping of plant height in sorghum using different sensing technologies. Plant Methods. 2018;14(1):53. https://doi.org/10.1186/s13007-018-0324-5.

  72. Wanjura D, Maas S, Winslow J, Upchurch DR. Scanned and spot measured canopy temperatures of cotton and corn. Comput Electron Agric. 2004;44(1):33–48. https://doi.org/10.1016/j.compag.2004.02.005.

  73. Whitaker RT. A level-set approach to 3D reconstruction from range data. Int J Comput Vis. 1998;29(3):203–31. https://doi.org/10.1023/A:1008036829907.

  74. Wu M, Yang C, Song X, et al. Monitoring cotton root rot by synthetic Sentinel-2 NDVI time series using improved spatial and temporal data fusion. Sci Rep. 2018;8(1):2016.

  75. Wu T, Weaver DB, Locy RD, et al. Identification of vegetative heat-tolerant upland cotton (Gossypium hirsutum L.) germplasm utilizing chlorophyll fluorescence measurement during heat stress. Plant Breed. 2014;133(2):250–5.

  76. Xu R, Li C, Paterson AH. Multispectral imaging and unmanned aerial systems for cotton plant phenotyping. PLoS One. 2019;14(2):e0205083. https://doi.org/10.1371/journal.pone.0205083.

  77. Xu R, Li C, Paterson AH, et al. Aerial images and convolutional neural network for cotton bloom detection. Front Plant Sci. 2018a;8:2235. https://doi.org/10.3389/fpls.2017.02235.

  78. Xu R, Li C, Velni JM. Development of an autonomous ground robot for field high throughput phenotyping. IFAC-PapersOnLine. 2018b;51(17):70–4. https://doi.org/10.1016/j.ifacol.2018.08.063.

  79. Xue J, Su B. Significant remote sensing vegetation indices: a review of developments and applications. J Sensors. 2017;2017:17. https://doi.org/10.1155/2017/1353691.

  80. Yang G, Liu J, Zhao C, et al. Unmanned aerial vehicle remote sensing for field-based crop phenotyping: current status and perspectives. Front Plant Sci. 2017;8:1111. https://doi.org/10.3389/fpls.2017.01111.

  81. Zerger A, Rossel RV, Swain D, et al. Environmental sensor networks for vegetation, animal and soil sciences. Int J Appl Earth Obs Geoinf. 2010;12(5):303–16. https://doi.org/10.1016/j.jag.2010.05.001.

  82. Zhang C, Kovacs JM. The application of small unmanned aerial systems for precision agriculture: a review. Precis Agric. 2012;13(6):693–712. https://doi.org/10.1007/s11119-012-9274-5.

  83. Zhang XH, Li Y, Yu K, et al. Mechanism of Verticillium wilt stress affecting photosynthetic characteristics and chlorophyll fluorescence characteristics of cotton seedlings. J Cotton Sci. 2018;30(2):136–44. https://doi.org/10.11963/1002-7807.zxhcbl.20180313.

  84. Ziliani M, Parkes S, Hoteit I, et al. Intra-season crop height variability at commercial farm scales using a fixed-wing UAV. Remote Sens. 2018;10(12):2007. https://doi.org/10.3390/rs10122007.

Acknowledgements

Not applicable.

Funding

Not applicable.

Author information

ILBP and GLR designed the paper’s approach and structure and wrote the manuscript with input from all authors. YZS and WXG contributed to the conceptualization and provided input for the manuscript. All authors read, edited, and approved the final manuscript.

Correspondence to Irish Lorraine B. PABUAYON.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article

PABUAYON, I., SUN, Y., GUO, W. et al. High-throughput phenotyping in cotton: a review. J Cotton Res 2, 18 (2019) doi:10.1186/s42397-019-0035-0

Keywords

  • Cotton
  • High-throughput phenotyping
  • Remote sensing
  • Sensors
  • Spectral
  • Fluorescence
  • Thermal
  • Platforms
  • Aerial-based
  • Ground-based