Multirotor Drone-Based Internal Inspection of Preheater

In industrial processes, preheaters play a critical role in operations such as heating, cooling, and pretreatment. However, internal inspection of these structures has long been hampered by challenges like confined spaces, high dust concentrations, and poor lighting. Traditional manual inspection methods are not only inefficient but also pose significant safety risks to personnel. To address these issues, I propose a novel inspection scheme utilizing a multirotor drone equipped with advanced sensors and intelligent navigation algorithms. This approach aims to enhance inspection efficiency, accuracy, and safety while reducing operational costs. The integration of visual cameras, LiDAR, and IMU sensors enables comprehensive environmental perception, while grid-based mapping and optimized path planning ensure reliable autonomous operation. In this article, I will detail the system design, algorithmic foundations, and experimental validation of this multirotor drone-based inspection solution.

The current state of preheater internal inspection reveals several persistent problems. Manual inspections require personnel to enter hazardous environments, leading to low work efficiency because data collection and recording are time-consuming. Moreover, human operators are prone to errors caused by fatigue or oversight, resulting in data inaccuracies that can compromise maintenance decisions. Safety concerns are paramount, as workers are exposed to high temperatures, toxic gases, and structural risks. These limitations underscore the need for an automated alternative. The multirotor drone emerges as a promising tool, capable of navigating complex internal spaces without direct human involvement. By leveraging its mobility and sensor payload, the drone can perform detailed inspections while minimizing risks.

Central to this inspection scheme is the selection and configuration of the multirotor drone. The drone’s structure includes components such as arms, propellers, motors, and flight control systems, all integrated within a protective cage to prevent collisions. This cage, inspired by a soccer ball’s geodesic design, comprises carbon fiber tubes arranged in a three-dimensional framework to absorb impacts and safeguard internal parts. The multirotor drone’s ability to hover and maneuver in tight spaces makes it ideal for preheater environments, where obstacles and narrow passages are common. Key considerations during selection include payload capacity, flight endurance, and stability under varying atmospheric conditions within the preheater.

To achieve effective internal inspection, the multirotor drone is equipped with a fusion of sensors: a visual camera, LiDAR, and an IMU. The visual camera, specifically an RGB-D type like Kinect v1, captures both color and depth images, providing rich scene information. However, in dusty, low-light preheater conditions, visual data alone may be insufficient. LiDAR (e.g., model LS01B) complements this by emitting laser pulses to measure distances and generate precise environmental maps using triangulation. Despite challenges like signal attenuation due to dust, LiDAR offers reliable depth perception. The IMU sensor, consisting of accelerometers, gyroscopes, and magnetometers, tracks the multirotor drone’s orientation, acceleration, and angular velocity, ensuring stable flight. By fusing data from these sensors, the multirotor drone constructs a coherent representation of the preheater’s interior, overcoming individual sensor limitations. The fusion process involves independent analysis of each sensor’s output, followed by integration to enhance perception accuracy. For instance, LiDAR detects obstacles and supplies geometry, the IMU provides high-rate motion estimates between scans, and the visual camera adds contextual detail; the LiDAR and camera measurements in turn bound the drift that would accumulate from the IMU alone.
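To make the fusion step concrete, the following is a minimal sketch of how a single LiDAR beam, the corresponding depth-camera reading, and the IMU yaw estimate could be combined. The function names, the `NO_RETURN` marker, and the 4 m camera cut-off are illustrative assumptions rather than details of the actual system.

```python
import math

NO_RETURN = float("inf")    # assumed marker for a LiDAR beam lost to dust
MAX_CAMERA_RANGE = 4.0      # assumed usable depth range of the RGB-D camera (m)

def fuse_beam(lidar_range_m, camera_depth_m):
    """Pick the most trustworthy range for one bearing (hypothetical rule)."""
    if lidar_range_m != NO_RETURN:
        return lidar_range_m                 # LiDAR is the primary depth source
    if camera_depth_m is not None and camera_depth_m < MAX_CAMERA_RANGE:
        return camera_depth_m                # fall back to RGB-D depth
    return None                              # no reliable measurement on this beam

def beam_to_map(drone_x, drone_y, imu_yaw_rad, beam_angle_rad, fused_range_m):
    """Project a fused range into map coordinates using the IMU yaw estimate."""
    heading = imu_yaw_rad + beam_angle_rad
    return (drone_x + fused_range_m * math.cos(heading),
            drone_y + fused_range_m * math.sin(heading))
```

In a real system the pose itself would come from fusing IMU propagation with LiDAR or visual odometry; the sketch only shows how complementary sensors can cover for each other’s weak points.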

Environmental mapping is a crucial step for autonomous navigation of the multirotor drone. I employ the occupancy grid mapping algorithm to represent the preheater’s interior as a discrete grid of cells, where each cell indicates the presence or absence of obstacles. This method is robust to environmental complexities and allows for incremental updates as the multirotor drone explores. Let $m_i$ denote a grid cell and $m$ the map formed by all cells. The posterior probability of the grid map given sensor data $y_{1:t}$ and poses $x_{1:t}$ is factorized over cells as:

$$p(m | y_{1:t}, x_{1:t}) = \prod_i p(m_i | y_{1:t}, x_{1:t})$$

Here, each cell $m_i$ can be in one of two states: occupied ($m_i$) or free ($\bar{m}_i$). Using Bayesian inference, the probability of a cell being occupied is updated as:

$$p(m | x_{1:t}, y_{1:t}) = \frac{p(y_t | m, x_{1:t}) \cdot p(m | x_{1:t}, y_{1:t-1})}{p(y_t | x_{1:t}, y_{1:t-1})}$$

Applying Bayes’ rule, the probabilities for occupied and free states are derived as:

$$p(m_i | x_{1:t}, y_{1:t}) = \frac{p(m_i | x_t, y_t) \cdot p(y_t | x_t)}{p(m_i) \cdot p(y_t | x_{1:t}, y_{1:t-1})} \cdot p(m_i | x_{1:t-1}, y_{1:t-1})$$

and

$$p(\bar{m}_i | x_{1:t}, y_{1:t}) = \frac{p(\bar{m}_i | x_t, y_t) \cdot p(y_t | x_t)}{p(\bar{m}_i) \cdot p(y_t | x_{1:t}, y_{1:t-1})} \cdot p(\bar{m}_i | x_{1:t-1}, y_{1:t-1})$$

Dividing the first expression by the second cancels the common normalizer $p(y_t | x_{1:t}, y_{1:t-1})$ and yields an incremental odds (and, after taking logarithms, log-odds) update. These equations enable the multirotor drone to dynamically assess obstacle presence and update the environment map in real time.
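In implementation, the per-cell update is typically carried out in log-odds form, where the multiplicative Bayesian update becomes a simple addition. The sketch below assumes a fixed inverse sensor model with placeholder confidence values (`p_occ`, `p_free`); it illustrates the update rule rather than the exact code running on the drone.

```python
import numpy as np

class OccupancyGrid:
    """Per-cell Bayesian occupancy update in log-odds form."""

    def __init__(self, rows, cols, p_occ=0.7, p_free=0.35, p_prior=0.5):
        # log-odds of the prior for every cell
        self.L = np.full((rows, cols), np.log(p_prior / (1.0 - p_prior)))
        # assumed inverse sensor model: confidence assigned to a hit / a miss
        self.l_occ = np.log(p_occ / (1.0 - p_occ))
        self.l_free = np.log(p_free / (1.0 - p_free))

    def update(self, row, col, hit):
        """Fold one measurement into cell (row, col); addition in log-odds
        space replaces the products in the Bayesian update above."""
        self.L[row, col] += self.l_occ if hit else self.l_free

    def occupancy(self):
        """Recover p(m_i | y_{1:t}, x_{1:t}) for every cell."""
        return 1.0 - 1.0 / (1.0 + np.exp(self.L))
```

Cells struck by a laser return accumulate positive log-odds (occupied), while cells the beam passes through accumulate negative log-odds (free), so the map sharpens incrementally as the drone explores.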

For navigation path planning, I adopt a modified ant colony optimization (ACO) algorithm, termed the species-based ant colony algorithm, to compute optimal inspection paths for the multirotor drone. This approach divides the ant population into two groups, $group_1$ and $group_2$, with sizes $m_1$ and $m_2$ (where $m_1 > m_2$), to enhance search efficiency. Suppose there are $n$ target inspection points. The probability of an ant $k$ moving from position $i$ to $j$ at time $t$ is given by:

$$p_{ij}^k(t) = \begin{cases}
\frac{\tau_{ij}(t)^\alpha \cdot \mu_{ij}^\beta}{\sum_{l \in \text{allowed}_k} \tau_{il}(t)^\alpha \cdot \mu_{il}^\beta}, & \text{if } j \in \text{allowed}_k \\
0, & \text{otherwise}
\end{cases}$$

Here, $\tau_{ij}(t)$ is the pheromone level on path $(i,j)$, $\mu_{ij}$ is the heuristic information, and $\alpha$ and $\beta$ are influence factors. The heuristic information incorporates proximity to the goal and is defined as:

$$\mu_{ij} = \frac{1}{d(i,j)^{k_1} \cdot d(j, \text{Goal})^{k_2}}$$

where $d(i,j)$ is the distance between cells $i$ and $j$, $d(j, \text{Goal})$ is the distance from the candidate cell $j$ to the target, and $k_1$, $k_2$ are weighting exponents that balance the cost of the next step against proximity to the goal. Pheromone update is performed as:

$$\tau_{ij}(t + n) = (1 - \rho) \cdot \tau_{ij}(t) + \Delta \tau_{ij}(t, t + n)$$

with

$$\Delta \tau_{ij}(t, t + n) = \sum_{k=1}^m \Delta \tau_{ij}^k(t, t + n)$$

where $\rho \in (0, 1)$ is the evaporation coefficient and $m = m_1 + m_2$ is the total number of ants. The algorithm iterates until the maximum number of iterations is reached and returns the best path found, steering the multirotor drone along a short, collision-free route between inspection points.
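For illustration, here is a compact sketch of one ant’s step selection and the subsequent pheromone update. The dictionary-based pheromone store, the `dist` callable, and the deposit constant `Q` (the standard ant-cycle rule $\Delta\tau_{ij}^k = Q / L_k$) are assumptions made for the sketch, and the split into two species groups is omitted for brevity.

```python
import random

def heuristic(i, j, goal, dist, k1=1.0, k2=1.0):
    """mu_ij = 1 / (d(i,j)^k1 * d(j,Goal)^k2), with an epsilon at the goal."""
    return 1.0 / (dist(i, j) ** k1 * (dist(j, goal) ** k2 + 1e-9))

def choose_next(i, allowed, tau, goal, dist, alpha=1.0, beta=2.0):
    """Roulette-wheel selection according to p_ij^k(t)."""
    weights = [tau[(i, j)] ** alpha * heuristic(i, j, goal, dist) ** beta
               for j in allowed]
    return random.choices(allowed, weights=weights, k=1)[0]

def update_pheromone(tau, paths, lengths, rho=0.1, Q=1.0):
    """tau_ij <- (1 - rho) * tau_ij + sum over ants of delta tau_ij^k."""
    for edge in tau:                              # evaporation on every edge
        tau[edge] *= (1.0 - rho)
    for path, length in zip(paths, lengths):      # deposit along each ant's path
        for i, j in zip(path[:-1], path[1:]):
            tau[(i, j)] = tau.get((i, j), 0.0) + Q / length
```

Running `choose_next` for every ant until it reaches the goal, then calling `update_pheromone` with the collected paths and their lengths, constitutes one iteration of the loop described above.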

Experimental validation was conducted on a boiler rotary air preheater to evaluate the multirotor drone’s performance. The inspection imagery, captured using the sensor fusion system, documented the condition of critical internal components such as the refractory lining, the inner cylinder, material buildup on the dispersion plates, and the hot-air ducts. For example, the multirotor drone successfully identified areas with accumulated debris and potential wear, enabling proactive maintenance. To quantify navigation accuracy, I measured deviations in the X and Y directions, as well as angular errors, across varying flight distances. The results, summarized in the table below, demonstrate that the multirotor drone maintained errors within acceptable bounds, confirming the effectiveness of the path planning algorithm.

| Flight Distance (cm) | X Direction (cm): Range / Max Error / Avg Abs Error | Y Direction (cm): Range / Max Error / Avg Abs Error | Angle (°): Range / Max Error / Avg Abs Error |
| --- | --- | --- | --- |
| 115 | 2.59 / 1.37 / 0.72 | 7.11 / 4.62 / 1.55 | 3.22 / 1.71 / 0.72 |
| 230 | 2.62 / 1.35 / 0.70 | 7.14 / 4.60 / 1.53 | 3.23 / 1.63 / 0.75 |
| 345 | 2.57 / 1.40 / 0.74 | 7.12 / 4.63 / 1.53 | 3.21 / 1.67 / 0.76 |
| 460 | 2.60 / 1.39 / 0.75 | 7.11 / 4.63 / 1.52 | 3.24 / 1.65 / 0.74 |
| 575 | 2.64 / 1.38 / 0.73 | 7.14 / 4.65 / 1.56 | 3.22 / 1.62 / 0.75 |

The table shows that for all flight distances, the multirotor drone’s errors in position and orientation remained within expected thresholds (e.g., X-direction range under 3 cm, Y-direction under 8 cm, and angular errors below 4°), highlighting the robustness of the navigation system. This performance ensures that the multirotor drone can reliably inspect preheaters without colliding with obstacles or deviating from planned paths.
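For reference, the three per-axis statistics in the table can be computed from logged deviations as sketched below, assuming “Range” denotes the spread (maximum minus minimum) of the signed deviations; the sample values are made up purely to show the calculation.

```python
def error_stats(deviations):
    """Range (max - min), maximum absolute error, and mean absolute error."""
    rng = max(deviations) - min(deviations)
    max_abs = max(abs(d) for d in deviations)
    avg_abs = sum(abs(d) for d in deviations) / len(deviations)
    return rng, max_abs, avg_abs

# Illustrative input only -- not the recorded flight data.
print(error_stats([-1.2, 0.4, 1.3, -0.7, 0.9]))
```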

In conclusion, the multirotor drone-based inspection system offers a transformative solution for preheater maintenance. By integrating multi-sensor data and advanced algorithms, it addresses the inefficiencies and risks of manual methods. The occupancy grid mapping provides a reliable environment model, while the species-based ant colony optimization enables efficient path planning. The experimental results indicate that the multirotor drone improves data accuracy and operator safety, and promises reductions in inspection time and cost compared with manual methods. Future work could focus on enhancing sensor fusion under extreme conditions and expanding the multirotor drone’s capabilities to other industrial applications. As technology evolves, the multirotor drone is poised to become a standard tool for industrial inspections, driving productivity and safety to new heights.
