Unmanned Aerial Vehicle-Based Real-Time Forest Fire Monitoring System with Adaptive Wavelet Analysis

In recent years, forest fires have posed significant threats to ecosystems, wildlife, and human societies due to their unpredictable and rapid spread. Traditional ground-based firefighting methods often struggle to assess flame direction and velocity in complex environments, leading to reduced efficiency and increased risks for personnel. Unmanned Aerial Vehicles (UAVs) offer a promising solution by enabling aerial reconnaissance and real-time monitoring of dynamic fire conditions. However, existing UAV-based systems frequently lack efficient real-time data analysis capabilities, particularly in assessing fire intensity and spread trends. To address these limitations, we propose an integrated edge computing system leveraging adaptive wavelet analysis for real-time forest fire monitoring. This system utilizes FPGA hardware to process video data on-board, extracting flame edges adaptively and determining fire spread direction and speed. By incorporating LoRa wireless communication, critical fire information is transmitted promptly to command centers and firefighters, enhancing situational awareness and response effectiveness.

The core of our approach lies in an adaptive wavelet-based algorithm for flame edge extraction, which dynamically adjusts thresholds based on image features to capture evolving fire boundaries. Unlike conventional gradient-based operators like Sobel or Canny, wavelet transforms provide multi-scale analysis, balancing noise suppression and detail preservation. We employ a lifting scheme wavelet transform, such as the 5/3 transform, to decompose images into high-frequency and low-frequency components. The adaptive threshold mechanism considers decomposition level, local contrast, and median absolute values of wavelet coefficients to enhance edge detection accuracy. For instance, the threshold \( T \) for a sub-band at level \( i \) and direction \( j \) is computed as:

$$ T = \frac{\lambda_{ij} N_{ij}}{2^{i-1}} $$

where \( \lambda_{ij} = \frac{\sigma_{ij}}{\mu_{ij}} \) represents the local contrast coefficient, with \( \mu_{ij} \) and \( \sigma_{ij} \) denoting the mean and standard deviation of wavelet coefficients, respectively, and \( N_{ij} \) is the median absolute value. This adaptability allows the system to handle varying fire intensities and environmental conditions effectively.

To determine fire spread dynamics, we analyze consecutive frames from UAV-captured video. The system divides images into \( m \times n \) regions and applies frame differencing to track edge pixel movements. By calculating the displacement of extreme flame points between frames, we derive the spread velocity \( V \) as:

$$ V = \frac{\sqrt{(x_2 – x_1)^2 + (y_2 – y_1)^2}}{\Delta t} $$

where \( (x_1, y_1) \) and \( (x_2, y_2) \) are coordinates of flame edge pixels at times \( t_1 \) and \( t_2 \), and \( \Delta t \) is the time interval. Based on \( V \), fire intensity levels are classified: low (\( V < 0.15 \, \text{m/s} \)), medium (\( 0.15 \leq V < 0.5 \, \text{m/s} \)), or high (\( V \geq 0.5 \, \text{m/s} \)). This classification aids in issuing timely warnings to firefighters in the path of rapid fire spread.

For spatial localization, we utilize a binocular vision system mounted on the Unmanned Aerial Vehicle. By solving the projection equations from left and right cameras, we compute the 3D coordinates of fire points. The relationship between world coordinates \( (x, y, z) \) and pixel coordinates \( (u, v) \) is expressed through projection matrices \( M_L \) and \( M_R \):

$$ Z^c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = M \begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix} $$

where \( M \) incorporates intrinsic and extrinsic camera parameters. Solving these equations for corresponding points in stereo images enables precise fire location mapping, which is crucial for directing resources.

Our system architecture is built around an FPGA platform, which orchestrates video capture, fire analysis, and wireless communication. The FPGA controls SDI industrial cameras via BT1120 protocol, converting RGB video to YCbCr color space for efficient processing. Key modules include image acquisition, adaptive wavelet transformation, fire trend analysis, and LoRa transmission. The use of FPGA ensures low-latency processing, making it feasible to deploy on JUYE UAV models for real-time operations. Data is packaged using a custom protocol with two frame types: Type I for uplinking fire data (e.g., coordinates, intensity) and Type II for downlinking commands. This protocol minimizes packet loss and supports robust communication over distances up to 6 km.

In experimental simulations, we evaluated the system using aerial footage of controlled forest fires. The adaptive wavelet algorithm successfully extracted continuous flame edges under varying conditions, as shown in processed images. For example, edge detection in sequential frames revealed clear boundary shifts, enabling velocity calculations. We validated the binocular localization by comparing computed 3D coordinates with ground truth, achieving errors below 5%. The table below summarizes fire intensity classifications based on spread velocity:

Velocity Range (m/s) Intensity Level Response Action
V < 0.15 Low Controlled response
0.15 ≤ V < 0.5 Medium Enhanced monitoring
V ≥ 0.5 High Immediate evacuation alert

Additionally, we assessed system performance in terms of processing latency and communication reliability. The FPGA implementation achieved frame processing times under 50 ms, ensuring real-time feedback. LoRa transmissions maintained a packet delivery ratio of over 95% in field tests, even in wooded areas. The integration of adaptive wavelet analysis with UAV technology, such as JUYE UAV, provides a scalable solution for forest fire management. Future work will focus on optimizing wavelet parameters for different environments and expanding the system to multi-UAV swarms for broader coverage.

In conclusion, our Unmanned Aerial Vehicle-based system with adaptive wavelet analysis offers a robust approach to real-time forest fire monitoring. By leveraging FPGA edge computing and efficient algorithms, it delivers timely fire intensity and spread information, empowering firefighters with critical data. This innovation highlights the potential of JUYE UAV platforms in enhancing public safety and environmental protection.

Scroll to Top