Autonomous Crop Spraying Drone System for Multi-Span Greenhouses

In recent years, the application of unmanned aerial vehicles (UAVs) in agriculture has revolutionized open-field farming, offering enhanced mobility, flexibility, and integration with digital technologies. However, the use of crop spraying drones in controlled environments like multi-span greenhouses remains underexplored due to challenges such as limited space, weak GNSS signals, and dense obstacles. This article presents a comprehensive system for an autonomous spraying UAV designed specifically for film-covered multi-span greenhouses. We address critical issues including precise positioning, obstacle avoidance, and adaptive spraying through innovative sensor fusion and control algorithms. By leveraging real-time kinematic global navigation satellite systems (RTK-GNSS), extended Kalman filters (EKF), multi-sensor perception, and variable rate technology, our crop spraying drone achieves reliable inspection and plant protection tasks in these constrained environments. The system’s performance is validated through experiments demonstrating safe autonomous flight, accurate image-based inspection, and effective pest control, outperforming manual methods. This work establishes a foundational framework for deploying spraying UAVs in facility agriculture, paving the way for smarter, more efficient crop management.

The integration of crop spraying drones into greenhouse operations requires overcoming unique hurdles. In open fields, spraying UAVs rely heavily on GNSS for navigation and employ various sensors for obstacle detection and spraying control. However, in multi-span greenhouses, the semi-enclosed structure attenuates GNSS signals, leading to positioning inaccuracies that compromise flight safety. Additionally, the confined vertical and horizontal spaces, coupled with transparent or textured obstacles like films and metal supports, demand robust perception and close-range avoidance capabilities. Traditional methods used in outdoor settings, such as radar or camera-based systems, often fall short indoors due to deployment costs and susceptibility to environmental interference. Moreover, ultra-low-altitude spraying in greenhouses can result in uneven pesticide distribution, causing over- or under-application that affects crop health and food safety. Our approach tackles these challenges by designing a specialized spraying UAV system that combines RTK-GNSS with dual EKF switching for stable positioning, fuses forward vision and multi-directional ultrasonics for comprehensive obstacle sensing, and implements speed-adaptive variable spraying for uniform chemical application. This article details the system architecture, key technologies, and experimental results, highlighting the practicality of autonomous crop spraying drones in enhancing greenhouse productivity.

The overall architecture of our autonomous crop spraying drone system is modular, ensuring coordinated functionality across various components. As summarized in Table 1, the system comprises several interconnected modules: task management, inspection monitoring, spraying plant protection, perception and obstacle avoidance, precision positioning, and motion control, along with external devices such as RTK base stations and ground displays. The task management module, centered on an onboard computer, handles decision-making and complex algorithm execution, while the flight control computer (e.g., CUAV V6X) ensures precise UAV maneuvering. Motion control translates commands into actuator movements, enabling autonomous flight. For precision positioning, the spraying UAV employs RTK-GNSS (e.g., CUAV C-RTK 9Ps) fused with LiDAR (e.g., Beixing TP-02pro) via dual EKF to maintain accuracy despite signal fluctuations. Inspection monitoring uses a gimbal camera (e.g., MER2-503-23GC) with GPIO-triggered shutter control to capture high-resolution images tagged with precise location data. The spraying plant protection module features a pump-nozzle system with flow sensors and high-pressure fan-shaped nozzles to achieve uniform droplet distribution under low-altitude conditions. Perception and obstacle avoidance integrate a ZED2i stereo vision sensor (IP66) and KS114 ultrasonic sensors (IP65) for detecting obstacles across a range of distances and textures. External devices, including RTK base stations, remote controllers, and ground display systems, facilitate communication and monitoring via WiFi and 2.4 GHz radio. This modular design ensures that the crop spraying drone operates efficiently in the challenging greenhouse environment, with each component contributing to autonomous navigation, inspection, and spraying.

Table 1: Modules of the Autonomous Spraying UAV System
Module | Components | Function
Task Management | Onboard computer, flight control computer | Decision-making, algorithm execution, and flight control coordination
Motion Control | Electrical actuators | Execute flight commands for autonomous movement
Precision Positioning | RTK-GNSS, LiDAR | Centimeter-level positioning using dual EKF switching
Inspection Monitoring | Gimbal camera with GPIO trigger | Capture and transmit images with precise location data
Spraying Plant Protection | High-pressure fan-shaped nozzles, flow sensors, PID controller | Adaptive variable spraying based on UAV speed
Perception and Obstacle Avoidance | ZED2i vision sensor, KS114 ultrasonic sensors | Detect obstacles and enable close-range avoidance
External Devices | RTK base station, remote controller, ground display | Communication, monitoring, and emergency control

Precise positioning is paramount for the safe operation of a spraying UAV in multi-span greenhouses, where GNSS signals are weak and unstable. Our system employs RTK-GNSS combined with a dual EKF strategy to estimate the drone’s pose accurately. The state vector for the EKF is defined as:
$$
\mathbf{x} = \left[ \mathbf{q}^T \quad \mathbf{V}^T \quad \mathbf{P}^T \quad \Delta\boldsymbol{\theta}_b^T \quad \Delta\mathbf{V}_b^T \quad \mathbf{M}_I^T \quad \mathbf{M}_B^T \right]^T
$$
where $\mathbf{q}$ is the attitude quaternion, $\mathbf{V}$ and $\mathbf{P}$ are the velocity and position in the north-east-down (NED) frame, $\Delta\boldsymbol{\theta}_b$ and $\Delta\mathbf{V}_b$ are the incremental angle and velocity biases in the body frame, and $\mathbf{M}_I$ and $\mathbf{M}_B$ are the Earth's magnetic field vector in the NED frame and the magnetometer bias in the body frame, respectively. The observation equation for height is:
$$
z_D = H_D \mathbf{x} + v_D
$$
with $H_D$ the height observation matrix and $v_D$ zero-mean measurement noise with covariance $R_D$. The primary EKF uses RTK-GNSS height observations when in RTK Fixed mode (centimeter-level accuracy), while the secondary EKF switches to LiDAR-derived height during signal degradation. The update step follows:
$$
\begin{aligned}
\mathbf{K} &= \mathbf{P}_{k+1|k} H_D^T \left( H_D \mathbf{P}_{k+1|k} H_D^T + R_D \right)^{-1} \\
\mathbf{x}_{k+1|k+1} &= \mathbf{x}_{k+1|k} + \mathbf{K} \left( z_D - H_D \mathbf{x}_{k+1|k} \right) \\
\mathbf{P}_{k+1|k+1} &= \left( \mathbf{I} - \mathbf{K} H_D \right) \mathbf{P}_{k+1|k}
\end{aligned}
$$
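As a minimal sketch, the update step above can be written directly in NumPy. The RTK state codes and the noise variances in the source-switching helper are illustrative assumptions, not values from the article:

```python
import numpy as np

def ekf_height_update(x, P, z_D, H_D, R_D):
    """One EKF measurement update for a scalar height observation.

    x   : (n,)   predicted state x_{k+1|k}
    P   : (n, n) predicted covariance P_{k+1|k}
    z_D : float  measured height (RTK-GNSS or LiDAR)
    H_D : (1, n) observation matrix
    R_D : float  measurement noise variance of the active sensor
    """
    S = H_D @ P @ H_D.T + R_D                # innovation covariance
    K = P @ H_D.T @ np.linalg.inv(S)         # Kalman gain
    x_new = x + (K @ (z_D - H_D @ x)).ravel()
    P_new = (np.eye(len(x)) - K @ H_D) @ P
    return x_new, P_new

# Dual-EKF style source switching: trust RTK height only in Fixed mode
# (state 6); otherwise fall back to the LiDAR-derived height.
def select_height_source(rtk_state, z_rtk, z_lidar):
    if rtk_state == 6:            # RTK Fixed: centimeter-level accuracy
        return z_rtk, 0.02**2     # illustrative noise variance (m^2)
    return z_lidar, 0.05**2       # Float/DGPS: use LiDAR instead
```

In a real estimator the two filters run in parallel and only the published height source switches, which avoids re-converging the covariance on every transition.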
This dual EKF approach ensures stable altitude estimation, critical for avoiding collisions in height-restricted greenhouses. For inspection images, the spraying UAV uses GPIO-triggered camera exposure to timestamp shots accurately. To obtain the image center position in WGS-84 coordinates, $\mathbf{P}^G$, the camera position $\mathbf{C}^G$ is first computed from the drone's position $\mathbf{U}^G$ and the body-frame camera offset $\mathbf{C}^U$:
$$
\mathbf{C}^G = \mathbf{U}^G + \begin{bmatrix} K & 0 & 0 \\ 0 & \frac{K}{\cos U_x^G} & 0 \\ 0 & 0 & -1 \end{bmatrix} (\mathbf{T}_n^b)^{-1} \mathbf{C}^U
$$
where $K$ is the reciprocal of the Earth's radius (converting metric offsets into latitude and longitude increments in radians), and $\mathbf{T}_n^b$ is the rotation matrix from the NED frame to the body frame, so its inverse maps the body-frame camera offset into NED. The horizontal offset $\Delta\mathbf{C}^F$ of the camera footprint due to drone attitude $(\phi, \theta, \psi)$ at height $z_c^G$ is:
$$
\Delta C_n^F = \frac{z_c^G (\sin\theta \cos\psi + \sin\phi \sin\psi)}{\sqrt{1 - \sin^2\theta - \sin^2\phi}}, \quad \Delta C_e^F = \frac{z_c^G (\sin\theta \sin\psi - \sin\phi \cos\psi)}{\sqrt{1 - \sin^2\theta - \sin^2\phi}}
$$
Thus, the image center position is:
$$
\mathbf{P}^G = \mathbf{C}^G(x,y) + \begin{bmatrix} K & 0 \\ 0 & \frac{K}{\cos C_x^G} \end{bmatrix} \Delta\mathbf{C}^F
$$
This method achieves horizontal position errors within ±7 cm, enabling precise location tagging for subsequent agricultural tasks.
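The geotagging chain above can be sketched as follows. The mean Earth radius constant and function names are assumptions for illustration, and the footprint model mirrors the offset equations, assuming a downward-pointing gimbal:

```python
import numpy as np

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius (illustrative constant)

def footprint_offset(z, roll, pitch, yaw):
    """Horizontal offset (north, east) in meters of the camera footprint
    caused by drone attitude (phi=roll, theta=pitch, psi=yaw) at height z."""
    s_t, s_p = np.sin(pitch), np.sin(roll)
    denom = np.sqrt(1.0 - s_t**2 - s_p**2)
    dn = z * (s_t * np.cos(yaw) + s_p * np.sin(yaw)) / denom
    de = z * (s_t * np.sin(yaw) - s_p * np.cos(yaw)) / denom
    return dn, de

def image_center_wgs84(cam_lat, cam_lon, z, roll, pitch, yaw):
    """Shift the camera's WGS-84 position (degrees) by the attitude-induced
    footprint offset; K = 1/R converts meters to radians, with the longitude
    increment scaled by 1/cos(latitude)."""
    dn, de = footprint_offset(z, roll, pitch, yaw)
    K = 1.0 / EARTH_RADIUS_M
    lat = cam_lat + np.degrees(K * dn)
    lon = cam_lon + np.degrees(K * de / np.cos(np.radians(cam_lat)))
    return lat, lon
```

With level attitude the footprint offset vanishes and the image center coincides with the camera position, which matches the equations above.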

Obstacle avoidance in confined greenhouse spaces requires a robust perception system capable of detecting diverse obstacles, including transparent films and metal structures. Our spraying UAV integrates a forward-facing ZED2i stereo camera for fine-grained distance estimation and KS114 ultrasonic sensors mounted in multiple directions (e.g., horizontal and upward) for close-range detection. The ultrasonic sensors operate at 5 V with a range of 1 cm to 3 m and IP65 waterproofing, while the vision sensor offers a resolution of 2208×1242 and IP66 rating, ensuring reliability in dusty, humid conditions. Sensor fusion combines depth data from the camera with proximity readings from ultrasonics, allowing the drone to perceive obstacles at varying distances and sizes. For close-range avoidance, we apply a static safety distance to obstacles rather than the drone itself, transforming dynamic safety checks into a static constraint. As illustrated in Figure 3, the drone is treated as a point mass, and obstacles are inflated by a safety margin exceeding the drone’s collision radius. This approach minimizes conservative path planning and ensures safe navigation with minimal clearance. The minimum avoidance distance upper bound $d_{\text{min}}$ is derived from geometric principles: if $D$ is the distance from a laser emitter to an obstacle, $R$ is the drone’s collision radius, and $r$ is the distance from a laser spot on a calibration plate to its center, then:
$$
d_{\text{min}} \leq D – (R – r)
$$
Experiments show that the spraying UAV maintains an upper bound of $44 \pm 5.5$ cm, with instances as low as 38.5 cm, demonstrating effective close-range avoidance in dense environments.
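The geometric bound can be checked numerically. Plugging in the experiment's collision radius (81.5 cm) and laser emitter distance (117 cm) with an illustrative spot-to-center reading $r$ reproduces a bound in the reported range:

```python
def min_avoidance_upper_bound(D, R, r):
    """Upper bound on the minimum avoidance distance when the drone is
    treated as a point mass: d_min <= D - (R - r)."""
    return D - (R - r)

# D = 117 cm (laser emitter to obstacle), R = 81.5 cm (collision radius);
# r = 8.5 cm is an illustrative calibration-plate reading, not a measured value.
bound = min_avoidance_upper_bound(117.0, 81.5, 8.5)
print(bound)  # 44.0
```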

The plant protection module of our crop spraying drone addresses the challenges of ultra-low-altitude spraying, where downwash from the rotors can cause uneven pesticide distribution. We employ high-pressure fan-shaped nozzles that produce uniform droplets with sufficient downward momentum to minimize drift. A variable spraying system adapts the flow rate to the drone's real-time speed to ensure consistent application. The area covered by the spraying UAV in time $\Delta t$ is:
$$
S = L v \Delta t
$$
where $v$ is the drone’s velocity, $L$ is the swath width, and $\Delta t$ is the time interval. The total pesticide volume $Q$ is:
$$
Q = F L v \Delta t
$$
with $F$ as the target application rate per unit area. The desired nozzle flow rate $C$ is then:
$$
C = F L v
$$
A closed-loop control system, as depicted in Figure 4, uses a PID controller to adjust the pump speed based on the error between the desired flow rate $C$ and the actual flow rate $C_{\text{actual}}$ measured by a flow sensor. This ensures that the spraying UAV maintains uniform coverage even as its speed varies, reducing chemical waste and improving efficacy. The PID control law in the time domain is:
$$
u(t) = K_p e(t) + K_i \int_0^t e(\tau) \, d\tau + K_d \frac{de(t)}{dt}
$$
where $e(t) = C – C_{\text{actual}}$ is the error signal, and $K_p$, $K_i$, $K_d$ are proportional, integral, and derivative gains, respectively. This adaptive system allows the crop spraying drone to achieve precise chemical application, enhancing pest control while minimizing environmental impact.
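A minimal sketch of this speed-adaptive loop, assuming a fixed sample time and hypothetical gains (the article does not report tuned values):

```python
class FlowRatePID:
    """Discrete PID loop adjusting pump output so the actual nozzle flow
    tracks the speed-dependent target C = F * L * v. Gains, sample time,
    and units here are illustrative assumptions."""

    def __init__(self, F, L, kp=1.0, ki=0.5, kd=0.0, dt=0.1):
        self.F, self.L = F, L            # application rate, swath width
        self.kp, self.ki, self.kd = kp, ki, kd
        self.dt = dt                     # sample time (s)
        self._integral = 0.0
        self._prev_error = 0.0

    def target_flow(self, v):
        """Desired nozzle flow rate C = F * L * v for current speed v."""
        return self.F * self.L * v

    def step(self, v, actual_flow):
        """Return a pump command from the flow-rate error e = C - C_actual."""
        e = self.target_flow(v) - actual_flow
        self._integral += e * self.dt
        derivative = (e - self._prev_error) / self.dt
        self._prev_error = e
        return self.kp * e + self.ki * self._integral + self.kd * derivative
```

Called once per control cycle with the measured UAV speed and flow-sensor reading, the loop raises the pump command when the drone accelerates and lowers it as it slows, keeping deposition per unit area constant.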

To validate the system, we conducted experiments in an 8430-type film-covered multi-span greenhouse (8 m span, 4 m bay, 3 m column height). The spraying UAV was configured to fly at approximately 1 m/s and 1 m altitude for inspection and spraying tasks. In positioning tests, the dual EKF strategy maintained stable flight despite RTK-GNSS fluctuations, as shown in Table 2, which summarizes the RTK state during operation. The inspection module captured clear images of crops, with position accuracy verified using calibration plates. For image center positioning, we placed two calibration plates at known relative positions and computed the offset between drone-reported and actual coordinates. The error $\epsilon_i$ for the $i$-th image is:
$$
\epsilon_i = \Delta \mathbf{H}_i^* - \Delta \mathbf{H}_i
$$
where $\Delta \mathbf{H}_i^*$ is the relative offset from drone data and $\Delta \mathbf{H}_i$ is the actual offset from plate readings. Over multiple flights, the horizontal error remained within ±7 cm, confirming high precision for agricultural analytics.

Table 2: RTK-GNSS State Statistics During Greenhouse Flight
RTK State | Description | Percentage of Time (%)
4 (3D DGPS) | Meter-level accuracy | 15
5 (RTK Float) | Centimeter-to-meter accuracy | 25
6 (RTK Fixed) | Centimeter-level accuracy | 60

For obstacle avoidance, we measured the minimum distance upper bound using a calibration plate and laser emitters. Parameters included a drone collision radius of 81.5 cm and laser emitter distance of 117 cm. The results, averaged over 10 trials, yielded an upper bound of $44 \pm 5.5$ cm, with a minimum of 38.5 cm, proving the spraying UAV’s ability to navigate tightly spaced obstacles safely. In plant protection efficacy tests, we compared the drone’s performance against manual spraying for controlling striped flea beetles on leafy greens. The experimental setup divided the field into three blocks: manual spraying, blank control, and drone spraying, each with two ridges of 1.1 m width and 35 m length. Application rates were equal for manual and drone groups. We sampled five 1.1 m × 0.2 m areas per block, recording pest damage indices before and after spraying. The control efficacy was calculated as:
$$
\text{Efficacy} = \left(1 – \frac{\text{Post-treatment index}}{\text{Pre-treatment index}}\right) \times 100\%
$$
As shown in Table 3, the spraying UAV achieved 89.63% efficacy at 2 days post-application, compared to 82.13% for manual spraying, and maintained 80.93% at 7 days, outperforming manual methods by 9.26 percentage points. This demonstrates the crop spraying drone’s superiority in sustained pest control, attributed to better droplet penetration and uniform coverage.

Table 3: Pest Control Efficacy Comparison (%)
Days After Application | Spraying UAV | Manual Spraying
2 | 89.63 | 82.13
7 | 80.93 | 71.67
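The efficacy formula reduces to one line of code. The damage indices below are hypothetical values chosen only to illustrate the arithmetic behind a figure like the UAV's 2-day result:

```python
def control_efficacy(pre_index, post_index):
    """Control efficacy (%) = (1 - post/pre) * 100, per the formula above."""
    return (1.0 - post_index / pre_index) * 100.0

# Hypothetical indices: pre-treatment 100, post-treatment 10.37.
print(round(control_efficacy(100.0, 10.37), 2))  # 89.63
```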

In conclusion, our autonomous crop spraying drone system effectively addresses the challenges of film-covered multi-span greenhouses through innovative positioning, perception, and spraying technologies. The dual EKF with RTK-GNSS and LiDAR ensures stable flight and accurate image geotagging, while sensor fusion enables safe close-range obstacle avoidance. The speed-adaptive variable spraying system promotes uniform pesticide application, resulting in higher efficacy than manual methods. This spraying UAV represents a significant advancement in facility agriculture, offering a viable solution for automated inspection and plant protection. Future work could focus on integrating AI for real-time disease detection and multi-drone coordination to further enhance productivity. The success of this system underscores the potential of crop spraying drones to transform greenhouse farming into a more efficient, sustainable practice.
