In recent years, the advancement of molecular technologies, particularly high-throughput gene sequencing, has dramatically accelerated research into plant genomes. However, the study of plant phenotypes under field conditions remains largely reliant on labor-intensive and destructive sampling methods. The lack of rapid, efficient, high-throughput technologies for data acquisition, analysis, and evaluation has become a significant bottleneck in research on crop growth, development, and phenotyping. Unmanned Aerial Vehicles (UAVs), especially quadrotor drone systems, offer a promising solution: these platforms can carry various sensors and capture field imagery at high throughput, with advantages including minimal ground interference and non-contact, non-destructive measurement of plants.

Rotary-wing UAVs, like the quadrotor drone, are less constrained by take-off and landing site requirements, making them suitable for low-altitude operations. Their adjustable flight height, controllable speed, and ability to hover on demand make them ideal for real-time monitoring of large-scale cultivation areas. The flexibility of the quadrotor drone platform allows for precise image capture tailored to specific phenotyping needs.
1. The Quadrotor Drone Imaging System and Methodology
The core of this approach is a commercially available quadrotor drone system equipped with a high-resolution digital camera. A typical system includes the multi-rotor aircraft, a gimbal-stabilized camera, a ground control station (often a tablet or laptop with specialized software), and a remote controller. The quadrotor drone’s stability is crucial for acquiring clear, geo-referenced images.
The general workflow involves pre-flight planning, mission execution, and post-flight data processing:
- Mission Planning: Defining the target field area on a satellite map, designing the flight path (waypoints), and setting key parameters.
- Parameter Configuration: Setting UAV parameters (altitude, speed) and camera parameters (shutter speed, ISO, white balance).
- Field Deployment & Flight: Executing the autonomous flight mission to capture images.
- Data Processing: Transferring images to a workstation for stitching, 3D reconstruction, and subsequent analysis like vegetation segmentation.
1.1 Flight Mission Planning and Parameter Optimization
Effective planning is critical for successful image acquisition using a quadrotor drone. Key parameters include flight altitude, image overlap, and camera settings.
Flight Altitude (H): This is a primary determinant of image resolution and coverage area. Lower altitudes yield higher spatial resolution but reduce the area covered per image, requiring more flight time and battery cycles. The Ground Sampling Distance (GSD), the ground distance spanned by one pixel in the image, is calculated (in cm/pixel) as:
$$ GSD = \frac{S \times H}{f} \times 10^{-1} $$
where \( S \) is the camera sensor pixel size (in µm), \( H \) is the flight altitude above ground (in m), and \( f \) is the camera focal length (in mm); the factor \( 10^{-1} \) converts the µm·m/mm result into centimeters. A smaller GSD indicates higher spatial resolution.
| Flight Altitude (m) | Calculated GSD (cm/pixel) | Relative Area Coverage per Image | Estimated Flight Time for a 1-ha Field | Suitability |
|---|---|---|---|---|
| 15 | 0.43 | Low | High | Ultra-high-resolution plant-level analysis |
| 20 | 0.58 | Medium-Low | Medium-High | High-resolution phenotyping (recommended balance) |
| 30 | 0.87 | Medium | Medium | Field-scale monitoring and moderate-resolution analysis |
| 50 | 1.44 | High | Low | Rapid, large-area assessment (low detail) |
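As a sanity check on the altitude/GSD relationship, the calculation can be wrapped in a short helper. The pixel size and focal length used below (2.41 µm, 8.8 mm) are illustrative assumptions typical of a 1-inch consumer sensor, not values stated in the text, so the printed numbers will differ somewhat from the table:

```python
def gsd_cm_per_px(pixel_size_um: float, altitude_m: float, focal_mm: float) -> float:
    """Ground sampling distance in cm/pixel.

    pixel size in micrometres, altitude in metres, focal length in millimetres;
    the 0.1 factor converts the um*m/mm result into centimetres.
    """
    return pixel_size_um * altitude_m / focal_mm * 0.1

# Illustrative sensor assumptions; not tied to a specific drone model.
for h in (15, 20, 30, 50):
    print(h, "m ->", round(gsd_cm_per_px(2.41, h, 8.8), 2), "cm/px")
```

Doubling the altitude doubles the GSD, which is why halving flight height quadruples the number of images needed to cover the same area at constant overlap.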
Image Overlap: Sufficient forward overlap (between consecutive images along the flight path) and side overlap (between adjacent flight lines) is essential for successful photogrammetric processing. High overlap (e.g., 80%) provides more data points for generating accurate orthomosaics and 3D models. The quadrotor drone’s flight speed and the camera’s capture interval are automatically calculated by planning software based on the desired GSD and overlap.
Camera Settings: To avoid motion blur, a fast shutter speed (e.g., 1/1000 s or faster) must be used in conjunction with the quadrotor drone’s flight speed. Aperture and ISO should be set for optimal exposure under prevailing light conditions. Shooting in RAW format is advantageous for post-processing, though JPEG is often sufficient.
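The spacing and interval arithmetic that planning software performs can be sketched as below. The image dimensions, overlaps, and flight speed are hypothetical inputs for illustration, not parameters taken from the text:

```python
def capture_plan(gsd_cm: float, img_w_px: int, img_h_px: int,
                 forward_overlap: float, side_overlap: float,
                 speed_m_s: float):
    """Derive photo spacing, flight-line spacing, and shutter interval.

    Assumes img_h_px is the sensor dimension swept along the flight direction.
    """
    gsd_m = gsd_cm / 100.0
    footprint_along = img_h_px * gsd_m             # ground footprint along track (m)
    footprint_across = img_w_px * gsd_m            # ground footprint across track (m)
    photo_spacing = footprint_along * (1.0 - forward_overlap)
    line_spacing = footprint_across * (1.0 - side_overlap)
    interval_s = photo_spacing / speed_m_s         # required capture interval
    return photo_spacing, line_spacing, interval_s

# Hypothetical 4000x3000 px camera at 1 cm/px GSD, 80%/70% overlap, 5 m/s.
spacing, lines, interval = capture_plan(1.0, 4000, 3000, 0.8, 0.7, 5.0)
```

Note how a slower flight speed relaxes the required capture interval, which in turn permits the fast shutter speeds needed to avoid motion blur.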
2. Post-Flight Data Processing Pipeline
The raw images captured by the quadrotor drone are processed using a suite of software tools to generate actionable data products.
2.1 Image Stitching and 3D Reconstruction
The set of overlapping images is processed using Structure-from-Motion (SfM) photogrammetry software. This algorithm identifies common feature points across images, calculates their 3D positions, and reconstructs the scene. The primary outputs are:
- Orthomosaic: A geometrically corrected, seamless map of the field where every pixel is directly aligned to a ground coordinate, eliminating perspective distortion.
- 3D Point Cloud & Digital Surface Model (DSM): A dense set of 3D points representing the surface geometry of the field, including crops and terrain. The DSM is a raster representation of canopy height.
The quality of these reconstructions is highly dependent on the stability of the quadrotor drone’s platform and the consistency of lighting during the flight.
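To make the DSM concrete, the sketch below rasterises a point cloud by keeping the maximum height per grid cell. The cell size and array layout are assumptions; production SfM suites additionally interpolate empty cells and filter outliers:

```python
import numpy as np

def dsm_from_points(points: np.ndarray, cell: float) -> np.ndarray:
    """Rasterise an (N, 3) point cloud [x, y, z] into a max-height DSM grid.

    Cells containing no points remain NaN.
    """
    ij = np.floor(points[:, :2] / cell).astype(int)
    ij -= ij.min(axis=0)                       # shift indices so they start at 0
    nx, ny = ij.max(axis=0) + 1
    dsm = np.full((nx, ny), np.nan)
    for (i, j), z in zip(ij, points[:, 2]):
        if np.isnan(dsm[i, j]) or z > dsm[i, j]:
            dsm[i, j] = z                      # keep the highest point per cell
    return dsm
```

Subtracting a bare-ground elevation model from such a DSM is the usual route to a canopy height map.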
2.2 Vegetation Segmentation Using a Decision Tree Algorithm
To analyze crop-specific traits, the vegetation (crop plants) must be separated from the background (soil, shadows, residues). A robust, illumination-invariant segmentation algorithm based on a decision tree model is highly effective. This method uses color indices derived from the RGB channels that maximize the separability between vegetation and non-vegetation pixels under varying light conditions.
The algorithm is trained on sample pixels manually labeled as either “vegetation” or “background” from the orthomosaic. Common features used in the decision tree include:
- Excess Green Index (ExG): \( ExG = 2g - r - b \)
- Excess Red Index (ExR): \( ExR = 1.4r - g \)
- Color Index of Vegetation Extraction (CIVE): \( CIVE = 0.441r - 0.811g + 0.385b + 18.78745 \)
- Normalized RGB values: \( r = R/(R+G+B), g = G/(R+G+B), b = B/(R+G+B) \)
The decision tree learns optimal thresholds and combinations of these indices to classify each pixel. The output is a binary mask where vegetation pixels are assigned a value of 1 and background pixels are 0. This mask can be applied to the original orthomosaic or used for subsequent calculations. The strength of this method, when applied to imagery from a quadrotor drone, is its ability to effectively remove canopy shadows, which are a major source of error in simpler thresholding techniques.
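A minimal sketch of this training step, using scikit-learn's `DecisionTreeClassifier` (an assumption; the text does not name a specific library) and a handful of hypothetical hand-labelled pixels:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def color_indices(rgb: np.ndarray) -> np.ndarray:
    """ExG, ExR, CIVE features from an (N, 3) array of 8-bit RGB pixels."""
    rgb = rgb.astype(float)
    total = rgb.sum(axis=1, keepdims=True)
    total[total == 0] = 1.0                      # avoid dividing by zero on black pixels
    r, g, b = (rgb / total).T                    # normalized r, g, b
    exg = 2 * g - r - b
    exr = 1.4 * r - g
    cive = 0.441 * r - 0.811 * g + 0.385 * b + 18.78745
    return np.column_stack([exg, exr, cive])

# Hypothetical labelled sample pixels: two vegetation, two bare soil.
samples = np.array([[40, 120, 30], [55, 140, 45],    # green canopy
                    [120, 90, 70], [140, 110, 95]])  # brown soil
labels = np.array([1, 1, 0, 0])                      # 1 = vegetation, 0 = background

tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(color_indices(samples), labels)

# Classify two unseen pixels into binary mask values.
mask = tree.predict(color_indices(np.array([[50, 130, 40], [130, 100, 80]])))
```

In practice the training set would contain thousands of pixels sampled across lighting conditions, and the fitted thresholds would then be applied to every pixel of the orthomosaic to produce the binary mask.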
3. Application: Estimating Missing Plants in Field Plots
An immediate application of the processed quadrotor drone imagery is the automated assessment of crop establishment, specifically estimating the number of missing plants (skips) in field plots. This is a valuable metric for evaluating seed germination, transplant success, or early seedling vigor.
The process, applied to the segmented binary image of a crop like tobacco, involves several computational steps:
3.1 Algorithm for Missing Plant Estimation
- Plant Contour Detection: The binary segmentation mask is used to find contours of individual plants. A minimum area threshold is applied to filter out noise and small weeds.
$$ A_{contour} \geq Threshold_{area} $$
- Stem Position Estimation: For each plant contour, a morphological skeletonization algorithm is applied to reduce it to a 1-pixel-wide skeleton. The stem position is estimated as the centroid of the skeleton’s junction points; if few junctions exist, the centroid of the contour’s bounding box is used instead.
$$ P_{stem} = \left( \frac{1}{n}\sum_{i=1}^{n} x_{J_i}, \frac{1}{n}\sum_{i=1}^{n} y_{J_i} \right) $$
where \( (x_{J_i}, y_{J_i}) \) are the coordinates of the \( n \) skeleton junction points.
- Row Detection and Alignment: Detected stem points are clustered into rows based on their spatial distribution, and a linear regression over the points in each row defines the row direction line.
$$ y = \beta_0 + \beta_1 x $$
- Projection and Spacing Calculation: Stem points are projected onto their respective row line, and the distances between consecutive projected points along the row are calculated. The median of these distances is taken as the estimated average plant spacing (\( \overline{AD} \)) for that row.
$$ D_{ij} = \| \text{Proj}(P_{j+1}) - \text{Proj}(P_{j}) \| $$
$$ \overline{AD} = \text{median}(\{ D_{ij} \}) $$
- Missing Plant Calculation: For each gap between consecutive detected plants, the number of missing plants (\( M_{ij} \)) is estimated by comparing the gap distance to the average spacing.
$$ M_{ij} = \left\lfloor \frac{D_{ij}}{\overline{AD}} + 0.5 \right\rfloor - 1 $$
The floor combined with the \( +0.5 \) term rounds \( D_{ij}/\overline{AD} \) to the nearest integer before subtracting the detected plant bounding the gap. A result of 0 indicates no missing plant, 1 indicates one missing plant, and so on.
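The spacing and gap logic of the steps above can be sketched as follows. The function assumes stem positions have already been projected onto the row line and sorted:

```python
import math
import statistics

def missing_per_gap(positions):
    """Estimate missing plants in each gap along one row.

    positions: sorted 1-D coordinates (e.g. metres) of detected stems
    projected onto the row line. Returns (median spacing AD, per-gap counts).
    """
    gaps = [b - a for a, b in zip(positions, positions[1:])]
    ad = statistics.median(gaps)                 # robust average plant spacing
    # floor(D/AD + 0.5) rounds the gap to the nearest multiple of AD;
    # subtracting 1 removes the plant that bounds the gap.
    missing = [max(0, math.floor(d / ad + 0.5) - 1) for d in gaps]
    return ad, missing

# Row planted at 0.5 m spacing, with one skip producing a 1.0 m gap.
ad, missing = missing_per_gap([0.0, 0.5, 1.0, 2.0, 2.5])
```

Using the median rather than the mean keeps \( \overline{AD} \) robust: a single large gap caused by a skip does not inflate the estimated spacing.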
3.2 Results and Validation
When this algorithm is applied to high-resolution orthomosaics generated from low-altitude (e.g., 20m) quadrotor drone flights, it provides rapid and accurate estimates of stand count. The automated counts show strong agreement with manual counts from the same imagery.
| Plot ID | Total Plant Positions | Missing Plants (Visual Count) | Missing Plants (Algorithm Count) | Agreement |
|---|---|---|---|---|
| 1 | 190 | 4 | 4 | Yes |
| 2 | 195 | 4 | 5 | Near |
| 3 | 191 | 5 | 5 | Yes |
| 4 | 192 | 5 | 6 | Near |
| 5 | 193 | 6 | 6 | Yes |
Minor discrepancies (e.g., Plot 2 & 4) can often be attributed to plants that are heavily covered by soil or are extremely small, causing their segmented area to fall below the detection threshold. This automated method, enabled by the quadrotor drone imagery, is orders of magnitude faster than traditional field walking and provides georeferenced data on the exact location of planting skips.
4. Discussion and Considerations for Quadrotor Drone Phenotyping
The integration of quadrotor drone technology with robust image processing pipelines presents a transformative tool for field-based plant phenotyping. However, several practical and methodological factors must be considered for reliable data acquisition.
4.1 Operational and Environmental Factors
Weather and Lighting: The quality of imagery from a quadrotor drone is highly sensitive to weather. Direct, bright sun can cause harsh shadows and saturation, while overcast conditions provide diffuse, even lighting ideal for color analysis but may require higher ISO settings. Consistency across multiple flights (e.g., for time-series analysis) is crucial, so flights should be conducted under similar solar elevation angles and weather conditions when possible. The decision-tree segmentation algorithm significantly mitigates, but does not fully eliminate, challenges posed by varying shadow patterns.
Flight Logistics: Battery life is a key limiting factor for the quadrotor drone platform, especially at low altitudes. Mission planning must balance resolution (low altitude) with coverage area per battery. Furthermore, pilots must be aware of field obstacles such as power lines, tall trees, and uneven terrain that pose risks to low-altitude flight.
4.2 Analytical Challenges and Advancements
Weed Discrimination: A significant challenge for segmentation algorithms is discriminating between the crop of interest and weeds that have a similar spectral signature. The area-based filtering used in the missing plant algorithm is a simple solution for early growth stages. For later stages or more complex scenarios, more advanced machine learning models (e.g., convolutional neural networks) trained on specific crop and weed features may be necessary. These models can be deployed on high-resolution imagery from a quadrotor drone.
From 2D to 3D Phenotyping: While this article focuses on 2D orthomosaic analysis, the 3D point cloud and DSM generated from quadrotor drone imagery are rich data sources. Canopy height, plant volume, and canopy architecture metrics can be extracted. These 3D traits are often more directly related to biomass and yield than 2D projected area. The formula for estimating plant volume from a 3D point cloud, for instance, can involve voxelization or convex hull algorithms applied to points segmented as belonging to an individual plant.
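A minimal voxel-counting sketch of the volume estimate mentioned above, assuming the point cloud has already been segmented down to a single plant; the voxel size is a tuning assumption that trades detail against noise sensitivity:

```python
import numpy as np

def voxel_volume(points: np.ndarray, voxel: float) -> float:
    """Approximate plant volume as (occupied voxel count) x single-voxel volume.

    points: (N, 3) array of [x, y, z] coordinates belonging to one plant.
    """
    idx = np.floor(points / voxel).astype(int)   # voxel index of every point
    occupied = np.unique(idx, axis=0)            # count each voxel once
    return occupied.shape[0] * voxel ** 3
```

A convex hull gives an alternative, typically larger, estimate because it fills concavities in the canopy that voxelisation leaves empty.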
Scalability and High-Throughput: The true power of the quadrotor drone platform is its scalability. A single flight can image hundreds of plots in a breeding trial or many hectares in a production field. The subsequent automated image processing pipeline transforms this massive image collection into quantifiable phenotypic data tables, allowing phenotypic selection based on field performance to keep pace with advances in genomic selection.
5. Conclusion
The methodology outlined demonstrates that a quadrotor drone is a highly effective platform for the rapid acquisition of high-resolution field imagery. By optimizing flight parameters—particularly using a lower altitude like 20m under stable lighting—researchers can obtain imagery with a Ground Sampling Distance fine enough for detailed plant-level analysis. The subsequent processing chain, featuring photogrammetric stitching and robust decision-tree-based vegetation segmentation, reliably extracts clean plant images from the complex field background.
The application of this pipeline to automatically estimate missing plants showcases its practical utility for agronomic assessment, providing accuracy comparable to manual counts but with vastly superior speed and spatial explicitness. Beyond stand count, this foundational imagery and derived data products (orthomosaics, DSMs, segmentation masks) serve as the basis for extracting a wide array of phenotypic traits, such as canopy cover, plant height, leaf area index (estimated from cover), and early vigor.
As the demand for high-throughput field phenotyping grows across both commercial agriculture and plant science research, the quadrotor drone platform, coupled with increasingly sophisticated and automated image analysis algorithms, will play an indispensable role. It bridges the gap between genomic potential and expressed phenotypic performance, enabling more efficient selection, improved crop management, and accelerated breeding cycles.
