The safe and stable operation of power distribution networks is crucial for ensuring the reliability of electrical systems. Traditional inspection methods, which often rely on manual labor, are inefficient and pose safety risks. With the rapid advancement of technology, multirotor drones have emerged as a powerful tool for intelligent inspection in power distribution networks. These drones offer advantages such as vertical take-off, hovering capabilities, and the ability to carry high-resolution cameras and sensors, enabling efficient data collection and analysis. In this study, we explore the application of multirotor drones in enhancing the accuracy and efficiency of power distribution network inspections through advanced image processing and data fusion techniques.

Multirotor drones are increasingly being deployed for inspecting critical components like transformers, insulators, and transmission lines. The integration of multirotor drones into inspection workflows allows for the collection of high-quality images and real-time data, which are essential for identifying faults and potential hazards. However, the images captured by multirotor drones are often affected by environmental factors such as wind, rain, and electromagnetic interference, leading to noise and distortions. To address these challenges, we propose a comprehensive approach that includes image preprocessing, segmentation, and data fusion using Kalman filtering. This method aims to improve the reliability of inspection outcomes and support the development of smarter power grids.
Image Preprocessing for Enhanced Inspection
The quality of images captured by multirotor drones directly impacts the effectiveness of fault detection in power distribution networks. We begin by enhancing these images to reduce noise and improve contrast. Image pixel range adjustment is applied to strengthen key details, making it easier to identify narrow or hard-to-distinguish areas. The grayscale transformation for power distribution equipment images is defined as:
$$ s = T(r) = (L-1) \int_0^r P_r(w) dw, \quad 0 \leq r \leq L-1 $$
Here, $r$ is the input grayscale value, $w$ is the dummy variable of integration, $L$ is the number of grayscale levels (256 for 8-bit images, so values range from 0 to $L-1$), $P_r$ denotes the probability density of each grayscale level, and $T(r)$ is a continuously differentiable, monotonically increasing function. In the discrete case, the probability of each grayscale level is calculated as:
$$ p_k = \frac{n_k}{MN}, \quad k = 0, 1, 2, \ldots, L-1 $$
where $n_k$ is the number of pixels with grayscale value $k$, and $MN$ is the total number of pixels in an $M \times N$ image. To achieve an approximately uniform grayscale distribution, we perform histogram equalization:
$$ s_k = (L-1) \sum_{j=0}^{k} p_j, \quad k = 0, 1, 2, \ldots, L-1 $$
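The equalization above can be sketched in a few lines of NumPy (a minimal illustration assuming an 8-bit grayscale image; the names `equalize_histogram` and `img` are ours, not from the text):

```python
import numpy as np

def equalize_histogram(img: np.ndarray, L: int = 256) -> np.ndarray:
    """Map each grayscale level k to s_k = (L-1) * sum_{j<=k} p_j."""
    hist = np.bincount(img.ravel(), minlength=L)    # n_k for each level k
    p = hist / img.size                             # p_k = n_k / (M*N)
    cdf = np.cumsum(p)                              # running sum of p_j
    lut = np.round((L - 1) * cdf).astype(np.uint8)  # s_k lookup table
    return lut[img]

# Example: a dark image whose values are concentrated in the low grayscale range
img = np.clip(np.random.default_rng(0).normal(40, 10, (64, 64)), 0, 255).astype(np.uint8)
eq = equalize_histogram(img)  # the equalized image spreads over the full range
```

The lookup-table form makes the per-level mapping $s_k$ explicit and applies it to every pixel in one indexing operation.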
Due to environmental interferences, images from multirotor drones often contain noise. We use Gaussian filtering to remove this noise, with the filtered image given by:
$$ y_{ij} = \sum_{s=1}^S \sum_{t=1}^T W_{st} x_{i-s+1,j-t+1} $$
where $W_{st}$ represents the filter coefficients, and $x$ denotes the image pixels. This step ensures that the images are clean and suitable for further analysis.
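A minimal sketch of this Gaussian smoothing step, assuming a square kernel and zero padding at the borders (the kernel size and $\sigma$ below are illustrative, not values from the text):

```python
import numpy as np

def gaussian_kernel(size: int = 5, sigma: float = 1.0) -> np.ndarray:
    """Normalized S x T Gaussian filter coefficients W_st."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    w = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return w / w.sum()

def gaussian_filter(x: np.ndarray, w: np.ndarray) -> np.ndarray:
    """Sliding-window weighted sum of pixels (equivalent to convolution
    for a symmetric Gaussian kernel), with zero padding at the borders."""
    S, T = w.shape
    pad = S // 2
    xp = np.pad(x.astype(float), pad)
    y = np.zeros(x.shape, dtype=float)
    for s in range(S):
        for t in range(T):
            y += w[s, t] * xp[s:s + x.shape[0], t:t + x.shape[1]]
    return y

noisy = np.random.default_rng(1).normal(100.0, 20.0, (32, 32))
smooth = gaussian_filter(noisy, gaussian_kernel())
```

In practice a library routine such as `scipy.ndimage.gaussian_filter` would replace the explicit double loop; the loop form is kept here to mirror the summation in the equation.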
Image Segmentation and Color Modeling
After preprocessing, we segment the images to distinguish different elements, such as equipment and background. The images are initially in RGB format and are converted to HSI color space for better perceptual alignment. The conversion is as follows:
$$ I(i,j) = \frac{1}{3} \left[ R(i,j) + G(i,j) + B(i,j) \right] $$
where $I(i,j)$ is the intensity component. The mean brightness over the non-background pixels is then computed to characterize the overall brightness of the power equipment region:
$$ I_m = \frac{1}{\alpha} \sum_{I(i,j) \neq 0} I(i,j) $$
Here, $\alpha$ is the number of non-zero brightness pixels. This helps in identifying key components in the images captured by multirotor drones.
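The intensity conversion and non-zero mean can be sketched as follows (a minimal illustration; the function name and the small test patch are ours):

```python
import numpy as np

def intensity_and_mean(rgb: np.ndarray):
    """I(i,j) = (R + G + B) / 3; I_m averages only the non-zero pixels."""
    I = rgb.astype(float).mean(axis=2)      # intensity component of HSI
    nonzero = I[I != 0]                     # alpha = nonzero.size
    I_m = nonzero.mean() if nonzero.size else 0.0
    return I, I_m

rgb = np.zeros((4, 4, 3), dtype=np.uint8)
rgb[1:3, 1:3] = [90, 120, 150]             # a small "equipment" patch
I, I_m = intensity_and_mean(rgb)           # I_m averages only the patch
```

Averaging only the non-zero pixels keeps large dark backgrounds from dragging down the equipment-region brightness estimate.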
For foreground normalization, we apply Gamma transformation to adjust the brightness distribution. The Gamma correction is defined as:
$$ y = x^\gamma $$
where $x$ is the pixel value normalized to $[0, 1]$ and $\gamma$ is the transformation parameter. For an overly bright image (average brightness above 128), $\gamma$ exceeds 1 and the correction darkens the image; for a dark image, $\gamma < 1$ and the image is brightened. The parameter $\gamma$ is calculated as:
$$ \gamma = \frac{I_m}{128} $$
This normalization ensures consistency and comparability across images, facilitating accurate fault identification in power distribution networks inspected by multirotor drones.
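A sketch of this Gamma normalization, assuming 8-bit input scaled to $[0, 1]$ before applying $y = x^\gamma$ (the function name and sample image are illustrative):

```python
import numpy as np

def gamma_normalize(img: np.ndarray) -> np.ndarray:
    """Apply y = x**gamma with gamma = I_m / 128 on a [0, 1]-normalized image."""
    x = img.astype(float) / 255.0
    nonzero = img[img != 0].astype(float)
    i_m = nonzero.mean() if nonzero.size else 128.0
    gamma = i_m / 128.0            # bright image -> gamma > 1 -> darkened
    return (255.0 * x ** gamma).astype(np.uint8)

bright = np.full((8, 8), 200, dtype=np.uint8)  # uniformly bright test image
out = gamma_normalize(bright)                  # gamma > 1 pulls values down
```

Because $\gamma$ is derived from each image's own mean brightness, images captured under different lighting converge toward a comparable brightness level.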
Data Fusion with Kalman Filtering
To achieve precise positioning and state estimation during inspections with multirotor drones, we employ Kalman filtering for multi-sensor data fusion. This approach integrates data from onboard sensors and visual localization systems, addressing issues like output lag and noise. The state vector for horizontal direction estimation is updated as:
$$ \begin{bmatrix} Z_m(k) \\ V_{Hm}(k) \end{bmatrix} = \begin{bmatrix} 1 & -T_0 \\ 0 & 1 \end{bmatrix} \times \begin{bmatrix} Z_m(k-1) \\ V_{Hm}(k-1) \end{bmatrix} + w(k) $$
where $Z_m$ represents the depth of the target relative to the multirotor drone, $V_{Hm}$ is the velocity component, $T_0$ is the time interval (set to 0.1 s), and $w(k)$ is process noise. The observation equations are:
$$ Z_q(k) = \begin{bmatrix} 1 & 0 \end{bmatrix} \times \begin{bmatrix} Z_m(k) \\ V_{Hm}(k) \end{bmatrix} + v_1(k) $$
$$ V_{Hm}(k) = \begin{bmatrix} 0 & 1 \end{bmatrix} \times \begin{bmatrix} Z_m(k) \\ V_{Hm}(k) \end{bmatrix} + v_2(k) $$
Here, $v_1(k)$ and $v_2(k)$ are measurement noises. The Kalman gain is computed to balance prediction and measurement uncertainties:
$$ K_{g-v}(k) = \frac{P(k - T_{\text{delay}} / T_0 | k-1 - T_{\text{delay}} / T_0) \times \begin{bmatrix} 1 & 0 \end{bmatrix}^T}{\begin{bmatrix} 1 & 0 \end{bmatrix} \times P(k - T_{\text{delay}} / T_0 | k-1 - T_{\text{delay}} / T_0) \times \begin{bmatrix} 1 & 0 \end{bmatrix}^T + R_1} $$
where $P$ is the prediction error covariance, $T_{\text{delay}}$ is the output lag time (set to 1 s), and $R_1$ is the measurement noise covariance. The updated error covariance is:
$$ P^{-v}(k|k) = P(k - T_{\text{delay}} / T_0 | k-1 - T_{\text{delay}} / T_0) - K_{g-v}(k) \times \begin{bmatrix} 1 & 0 \end{bmatrix} \times P(k - T_{\text{delay}} / T_0 | k-1 - T_{\text{delay}} / T_0) $$
When both sensor and visual measurements are available, we fuse them using weighting factors based on error covariance. The fused distance state at time $k - T_{\text{delay}} / T_0$ is:
$$ Z_m(k - T_{\text{delay}} / T_0 | k-1 - T_{\text{delay}} / T_0) = Z_m^{-v}(k|k) \times \frac{P^{-s}(1,1,k)}{P^{-s}(1,1,k) + P^{-v}(1,1,k)} + Z_m^{-s}(k|k) \times \frac{P^{-v}(1,1,k)}{P^{-s}(1,1,k) + P^{-v}(1,1,k)} $$
This fusion process compensates for time delays and improves the accuracy of state estimates for multirotor drones during inspections.
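The per-step estimation and fusion can be sketched as a standard linear Kalman filter over the $[Z_m, V_{Hm}]$ state (for readability this omits the $T_{\text{delay}}/T_0$ index shift; the noise covariances `Q` and `R` and the simulated measurements are illustrative values, not tuned parameters from the text):

```python
import numpy as np

T0 = 0.1                                    # sample interval (s), as in the text
F = np.array([[1.0, -T0], [0.0, 1.0]])      # state transition for [Z_m, V_Hm]
H = np.array([[1.0, 0.0]])                  # observe the depth Z_m only
Q = np.diag([1e-3, 1e-3])                   # process-noise covariance (assumed)
R = np.array([[0.05]])                      # measurement-noise covariance (assumed)

def kf_step(x, P, z):
    """One predict/update cycle for the [Z_m, V_Hm] state."""
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)  # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)                   # measurement update
    P_new = (np.eye(2) - K @ H) @ P_pred                    # covariance update
    return x_new, P_new

def fuse(z_v, P_v, z_s, P_s):
    """Covariance-weighted fusion of visual (v) and sensor (s) depth estimates."""
    w_v = P_s / (P_s + P_v)   # the lower-variance estimate gets more weight
    return w_v * z_v + (1 - w_v) * z_s

x = np.array([[12.0], [0.5]])               # initial depth 12 m, 0.5 m/s
P = np.eye(2)
for z in [11.9, 11.8, 11.75, 11.7]:         # simulated depth measurements
    x, P = kf_step(x, P, np.array([[z]]))
```

With equal error covariances, `fuse` reduces to a simple average; as one source degrades, its weight shrinks in proportion to its covariance, mirroring the weighted fusion equation above.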
Experimental Analysis and Results
We conducted experiments to evaluate the performance of our method in power distribution network inspections using multirotor drones. The inspection images included normal transformers, damaged insulators, and complex wire crossings. After normalization, the grayscale distribution became uniform, enhancing the visibility of key details. For instance, the Gamma transformation adjusted the brightness effectively, as shown in the processed images.
In terms of real-time positioning, the Kalman filter successfully estimated the state variables of the multirotor drone. Table 1 presents the recognition rates for insulator identification using different methods, highlighting the superiority of our approach.
| Image ID | Our Method | Optical Flow | Particle Filter | Mean Shift |
|---|---|---|---|---|
| 1 | 92.3% | 85.9% | 78.6% | 84.6% |
| 2 | 95.2% | 85.3% | 78.9% | 80.9% |
| 3 | 90.8% | 87.1% | 83.3% | 82.3% |
| 4 | 91.5% | 83.5% | 81.5% | 78.9% |
| 5 | 90.8% | 88.9% | 82.5% | 83.3% |
| 6 | 92.1% | 86.2% | 78.4% | 81.4% |
| 7 | 91.9% | 89.8% | 84.8% | 84.5% |
| 8 | 93.1% | 84.4% | 81.2% | 79.8% |
| 9 | 94.5% | 86.3% | 82.9% | 82.7% |
| 10 | 96.2% | 85.9% | 80.4% | 80.6% |
Our method achieved recognition rates between 90.8% and 96.2% across the ten insulator images (mean 92.8%), consistently outperforming the baselines and demonstrating its effectiveness in handling occlusions and environmental variations. Additionally, the path planning for multirotor drones was optimized, with a minimum evaluation count of 1.41 and a cost value of 78.2, indicating efficient and low-cost inspection routes. In contrast, the mean shift, particle filter, and optical flow algorithms often produced higher costs and suboptimal paths due to local optima.
The real-time state information for the multirotor drone, such as horizontal distance, vertical distance, yaw angle, and velocity, was accurately estimated using Kalman filtering. For example, at time step 80, the filtered horizontal distance was 11.21 m, closely matching the corrected visual measurement of 11.23 m, while the uncorrected measurement was 11.94 m. Similarly, at time step 60, the vertical distance was 0.47 m, compared to the visual measurement of 0.46 m and the real-time measurement of 0.99 m. These results confirm that our method effectively resolves output lag and enhances the reliability of multirotor drone inspections.
Conclusion
The integration of multirotor drones into power distribution network inspections offers significant improvements in accuracy, efficiency, and safety. By applying advanced image preprocessing techniques, such as histogram equalization and Gaussian filtering, we enhance the quality of inspection images. Image segmentation in HSI color space and Gamma normalization further improve the identification of critical components. The use of Kalman filtering for multi-sensor data fusion ensures precise real-time state estimation, addressing challenges like output lag and environmental noise.
Experimental results demonstrate that our method achieves high recognition rates for insulator images and optimizes inspection paths with minimal cost. The multirotor drone-based approach not only reduces reliance on manual labor but also provides a scalable solution for large-scale power networks. Future work will focus on integrating artificial intelligence for autonomous fault detection and expanding the application of multirotor drones to other areas of power system maintenance. This research contributes to the development of intelligent inspection systems that enhance the stability and reliability of power distribution networks.
