Design of an Autonomous Flight System for a Quadrotor UAV

In recent years, the development of unmanned aerial vehicles (UAVs) has expanded from military applications to civilian uses, such as agricultural monitoring, search and rescue operations, and industrial inspections. Among various UAV configurations, the quadrotor has gained prominence due to its ability to perform vertical take-off and landing, hover steadily, and maintain a simple structure with low development and maintenance costs. However, traditional quadrotor systems often rely on GPS for navigation, which becomes ineffective in indoor environments where signals are obstructed. To address this limitation, we have designed an autonomous flight platform for a quadrotor that operates without GPS by integrating optical flow sensors and laser rangefinders. This system enables precise path following in both indoor and outdoor settings, enhancing the quadrotor’s autonomy and reliability.

The quadrotor is a nonlinear, highly coupled, underactuated system: it has four control inputs (the rotor speeds) but six degrees of freedom, so its linear and angular motions cannot be commanded independently. We selected an “X”-shaped quadrotor configuration for its superior stability and disturbance rejection compared with the “+” type. In the “X” configuration, the forward direction bisects two arms at a 45-degree angle, so every maneuver engages all four rotors simultaneously. The basic flight motions of a quadrotor are lift, pitch, roll, and yaw, each produced by varying the rotational speeds of the motors. The motor speed control strategies for these motions are summarized in Table 1.

Table 1: Motor Speed Control for Quadrotor Flight Modes
Mode    | Lift | Pitch | Roll | Yaw
Motor 1 |  +   |   -   |  +   |  +
Motor 2 |  +   |   +   |  -   |  +
Motor 3 |  +   |   -   |  -   |  -
Motor 4 |  +   |   +   |  +   |  -

Here, “+” denotes an increase in rotor speed, and “-” denotes a decrease. The quadrotor’s motion is governed by the principles of thrust and torque. For instance, lift is achieved by uniformly increasing the speed of all motors, while pitch and roll involve differential speed changes to tilt the quadrotor. Yaw control is implemented by exploiting the reactive torques of the rotors; since adjacent rotors spin in opposite directions, adjusting the speed differences generates a net yaw moment. The dynamic equations for a quadrotor can be expressed using Newton-Euler formulations. The translational motion is described by:

$$ \ddot{x} = \frac{1}{m} \left( \sum F_x - k_f \dot{x} \right) $$

$$ \ddot{y} = \frac{1}{m} \left( \sum F_y - k_f \dot{y} \right) $$

$$ \ddot{z} = \frac{1}{m} \left( \sum F_z - mg - k_f \dot{z} \right) $$

where \( m \) is the mass of the quadrotor, \( g \) is gravitational acceleration, \( k_f \) is the drag coefficient, and \( \sum F_x, \sum F_y, \sum F_z \) are the components of the total rotor thrust resolved in the inertial frame (gravity is accounted for separately by the \( mg \) term). The rotational dynamics are given by:

$$ I \dot{\omega} + \omega \times (I \omega) = \tau $$

where \( I \) is the inertia matrix, \( \omega \) is the angular velocity vector, and \( \tau \) is the torque vector. These equations highlight the complex coupling in quadrotor systems, necessitating advanced control strategies for autonomous flight.
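To make the translational model concrete, it can be stepped forward numerically. The sketch below is a minimal Euler integration of the vertical channel, \( \ddot{z} = (\sum F_z - mg - k_f \dot{z})/m \); the mass, drag coefficient, and thrust values are illustrative assumptions, not measurements from the platform described here.

```python
def simulate_altitude(thrust, m=1.2, g=9.81, kf=0.1, dt=0.01, steps=500):
    """Euler-integrate z'' = (thrust - m*g - kf*z')/m from rest.

    All parameter values are illustrative placeholders, not
    measured properties of the quadrotor in the text.
    """
    z, vz = 0.0, 0.0
    for _ in range(steps):
        az = (thrust - m * g - kf * vz) / m  # vertical acceleration
        vz += az * dt                        # integrate velocity
        z += vz * dt                         # integrate position
    return z

# With thrust exactly m*g the vehicle stays at rest (hover);
# any excess thrust produces a climb.
```

The same pattern extends to the x and y channels once the attitude-dependent thrust projection is included, which is exactly the coupling the equations above capture.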

Our autonomous flight platform for the quadrotor consists of two main subsystems: the aerial platform and the ground terminal. The aerial platform includes the quadrotor frame, a flight control board, an onboard computer, and navigation sensors, while the ground terminal comprises a data transmission module, a PC, and a wireless network card. The overall architecture is designed to facilitate real-time communication and control, enabling the quadrotor to execute predefined paths without GPS. The flight control board serves as the core, processing sensor data and adjusting motor speeds, while the onboard computer handles high-level path planning and data fusion.

For the flight control board, we selected the Pixhawk4, an open-source hardware platform that integrates multiple sensors, including gyroscopes, accelerometers, magnetometers, and barometers. Its STM32 F7-series main processor runs at 216 MHz and is paired with a separate I/O co-processor, a redundant arrangement that reduces the risk of failure. The Pixhawk4 supports both the PX4 and ArduPilot flight stacks; we chose PX4 for its modular architecture and compatibility with Linux-based systems. The board provides two GPS module ports for automatic failover and 14 PWM output interfaces, and it communicates via the MAVLink protocol. This allows it to receive commands from the onboard computer or ground station and to transmit real-time telemetry data, such as attitude and position.

The onboard computer is a Raspberry Pi 4B, chosen for its compact size, low weight, and sufficient processing power. It features a 64-bit quad-core BCM2711 processor running at 1.5 GHz and 32 GB of SD storage, enabling efficient data handling. The Raspberry Pi connects to the Pixhawk4 via a UART serial interface and establishes a wireless local area network (WLAN) with the ground terminal for data exchange. It runs Ubuntu Mate OS and utilizes the Dronekit-Python library to implement autonomous control algorithms. This setup allows the quadrotor to process navigation data and issue velocity commands based on sensor inputs.

Navigation and positioning in GPS-denied environments are achieved through a combination of a laser rangefinder and an optical flow sensor. The laser rangefinder, a Benewake TFmini PLUS, is mounted on the bottom of the quadrotor for altitude control. It operates on the time-of-flight (TOF) principle, measuring distance by calculating the round-trip time of laser pulses. The distance \( D \) is given by:

$$ D = \frac{c \cdot T}{2} $$

where \( c \) is the speed of light and \( T \) is the round-trip time between pulse transmission and reception. The TFmini PLUS offers a range of up to 12 meters, a measurement rate of up to 1000 Hz, and an accuracy of about ±10 cm, making it well suited to maintaining a stable height above ground.
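As a quick worked example of the TOF relation, the round trip for a target 1 m away takes only about 6.7 ns, which illustrates why the sensor needs very fast timing electronics:

```python
C = 299_792_458.0  # speed of light, m/s

def tof_distance(t_round_trip):
    """Distance from a measured round-trip time: D = c * T / 2."""
    return C * t_round_trip / 2.0

# A pulse returning after roughly 6.67 ns corresponds to a 1 m target.
```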

For horizontal positioning, we employed an LC-302 optical flow sensor, which captures images and computes displacement vectors using embedded algorithms. The sensor consists of a camera and a digital signal processor that analyzes consecutive frames to estimate motion. The optical flow vector \( \mathbf{v} = (v_x, v_y) \) is derived from the brightness constancy equation:

$$ I(x, y, t) = I(x + \Delta x, y + \Delta y, t + \Delta t) $$

where \( I \) is the image intensity, and \( \Delta x, \Delta y \) are the displacements over time \( \Delta t \). By integrating these displacements, the sensor provides velocity estimates that are used for position hold and drift compensation. This allows the quadrotor to hover accurately and follow paths indoors. The specifications of key components are summarized in Table 2.

Table 2: Specifications of Quadrotor Hardware Components
Component            | Model           | Key Features                                        | Parameters
Flight Control Board | Pixhawk4        | STM32 F7 main processor plus I/O co-processor, 14 PWM outputs, MAVLink support | 216 MHz, 2 GPS modules
Onboard Computer     | Raspberry Pi 4B | 64-bit quad-core BCM2711, WLAN                      | 1.5 GHz, 32 GB SD, Ubuntu Mate OS
Laser Rangefinder    | TFmini PLUS     | TOF principle, UART interface                       | Range: 12 m, Accuracy: ±10 cm
Optical Flow Sensor  | LC-302          | Integrated DSP, onboard image processing            | Update rate: 100 Hz, Resolution: 0.1 m/s
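The brightness constancy equation above underlies gradient-based flow estimation. As a one-dimensional illustration of the idea (not the LC-302's actual embedded algorithm, which is proprietary), linearizing brightness constancy gives \( I_x \Delta x + I_t \approx 0 \) at each pixel, which can be solved for the shift in least squares:

```python
import math

def estimate_shift(frame1, frame2):
    """Least-squares sub-pixel shift between two 1-D 'frames'.

    Linearized brightness constancy gives I_x * dx + I_t = 0 at
    each pixel; summing over pixels yields
    dx = -sum(I_x * I_t) / sum(I_x ** 2).
    """
    num = den = 0.0
    for i in range(1, len(frame1) - 1):
        ix = (frame1[i + 1] - frame1[i - 1]) / 2.0  # spatial gradient
        it = frame2[i] - frame1[i]                  # temporal gradient
        num += ix * it
        den += ix * ix
    return -num / den

# A smooth pattern shifted by half a pixel between frames is
# recovered to within a few percent:
pattern = [math.sin(0.2 * i) for i in range(200)]
shifted = [math.sin(0.2 * (i - 0.5)) for i in range(200)]
# estimate_shift(pattern, shifted) is close to 0.5
```

Dividing the recovered pixel shift by the frame interval and scaling by height above ground converts it into a metric velocity, which is the quantity the sensor reports for position hold.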

The software design focuses on enabling communication between the onboard computer and the flight control board, data acquisition, position estimation, and control. We use the MAVLink protocol for real-time data exchange, which transmits messages containing position, velocity, and attitude information. The Dronekit-Python library facilitates the implementation of control algorithms on the Raspberry Pi. The software framework involves reading sensor data from the Pixhawk4, processing it to generate control commands, and sending them back to adjust the quadrotor’s flight path.

To set up the system, we first configured the Pixhawk4 flight modes, including Stabilize, AltHold (altitude hold), Loiter (position hold), RTL (return to launch), Land, and Circle. The laser rangefinder was connected via a UART port at a baud rate of 115200 bit/s, and its parameters were set to a minimum range of 10 cm, a maximum range of 1000 cm, and a blind zone of 10 cm. The optical flow sensor was configured as “upflow” with a time lag of 10 ms relative to the inertial measurements. Additionally, we adjusted the extended Kalman filter (EKF) parameters so that optical flow measurements are fused into the state estimate.
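Since the flight modes listed above (Stabilize, AltHold, Loiter, RTL) are ArduCopter mode names, the rangefinder and optical-flow settings just described would map onto ArduPilot-style parameters roughly as follows. The parameter names come from the ArduCopter parameter list, the values from the text, and the serial port number is an assumption; all of it should be checked against the firmware version actually in use:

```
RNGFND1_MIN_CM   10     # minimum range: 10 cm (from the text)
RNGFND1_MAX_CM   1000   # maximum range: 1000 cm
RNGFND1_GNDCLEAR 10     # blind zone / ground clearance: 10 cm
SERIAL4_BAUD     115    # 115 = 115200 bit/s (assumed rangefinder port)
EK2_FLOW_DELAY   10     # optical-flow lag relative to the IMU: 10 ms
```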

On the Raspberry Pi, we installed Ubuntu Mate and updated the system. Using Dronekit-Python, we developed code for autonomous path following. The control algorithm involves sending body-frame velocity commands to the Pixhawk4. For example, to move the quadrotor in a square path, we defined functions for velocity control in the x, y, and z directions. The code snippet below illustrates the velocity command function:

import time
from dronekit import connect
from pymavlink import mavutil

# 'vehicle' is the Vehicle object obtained beforehand, e.g.
# vehicle = connect('/dev/ttyAMA0', wait_ready=True)

def send_body_ned_velocity(velocity_x, velocity_y, velocity_z, duration=1):
    # Build a body-frame (MAV_FRAME_BODY_NED) velocity setpoint; the
    # type_mask enables only the three velocity fields.
    msg = vehicle.message_factory.set_position_target_local_ned_encode(
        0,  # time_boot_ms (not used)
        0, 0,  # target system, target component
        mavutil.mavlink.MAV_FRAME_BODY_NED,
        0b0000111111000111,  # type_mask: velocity fields only
        0, 0, 0,  # x, y, z positions (not used)
        velocity_x, velocity_y, velocity_z,  # velocities, m/s
        0, 0, 0,  # x, y, z accelerations (not used)
        0, 0)  # yaw, yaw rate (not used)
    # Setpoints time out if not refreshed, so resend once per second.
    for _ in range(duration):
        vehicle.send_mavlink(msg)
        time.sleep(1)

For autonomous square path flight, we sequentially sent velocity commands to traverse each side of the square. For instance, to move north at 0.2 m/s for 5 seconds, we set velocity_x = 0, velocity_y = -0.2, velocity_z = 0, and duration = 5, followed by a 2-second delay. Similarly, eastward motion was achieved with velocity_x = 0.2, velocity_y = 0, and the same duration. Note that MAV_FRAME_BODY_NED commands are interpreted in the body frame (x forward, y right, z down), so these assignments correspond to north and east when the quadrotor's nose points east and its heading is held constant. This approach ensured smooth transitions between path segments.
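The command sequence can be sanity-checked offline by dead-reckoning the commanded velocities. The sketch below is a pure simulation with no vehicle attached; it assumes the nose keeps pointing east for the whole flight, so body +x maps to east and body -y maps to north, matching the velocity assignments above:

```python
def dead_reckon(commands, dt=1.0):
    """Integrate (vx, vy, duration) body-frame commands into an
    (east, north) track, assuming a fixed east-facing heading."""
    east = north = 0.0
    for vx, vy, duration in commands:
        for _ in range(int(duration / dt)):
            east += vx * dt    # body +x -> east
            north -= vy * dt   # body +y -> south, so -y -> north
    return east, north

# The square from the text: north, east, south, west at 0.2 m/s for
# 5 s each, giving 1 m sides.
square = [
    (0.0, -0.2, 5),   # north
    (0.2,  0.0, 5),   # east
    (0.0,  0.2, 5),   # south
    (-0.2, 0.0, 5),   # west
]
# dead_reckon(square) ends back at the origin: the path closes.
```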

The control logic relies on feedback from the sensors. The laser rangefinder provides height estimates \( h \) using the TOF equation, while the optical flow sensor gives velocity corrections. The position update can be expressed as:

$$ x_{k+1} = x_k + v_x \Delta t $$

$$ y_{k+1} = y_k + v_y \Delta t $$

$$ z_{k+1} = z_k + v_z \Delta t $$

where \( v_x, v_y, v_z \) are the velocity components, and \( \Delta t \) is the sampling time. The quadrotor’s attitude is controlled by a PID controller that adjusts motor speeds based on error signals. For example, the height control loop uses the error \( e_h = h_{desired} - h_{actual} \) to compute a thrust command:

$$ T = K_p e_h + K_i \int e_h \, dt + K_d \frac{de_h}{dt} $$

where \( K_p, K_i, K_d \) are PID gains. Similarly, the optical flow data is fused with inertial measurements to reduce drift, enabling precise hovering and path following.
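As an illustration of this height loop, the sketch below runs a discrete PID around a point-mass altitude model. The PID output is treated as a correction added to the hover thrust \( mg \); the gains and vehicle parameters are invented for the sketch, not tuned values from the platform:

```python
def pid_altitude_sim(h_des=1.0, kp=8.0, ki=2.0, kd=4.0,
                     m=1.2, g=9.81, dt=0.01, steps=2000):
    """Discrete PID on altitude error e_h = h_des - h, driving the
    point-mass model z'' = (T - m*g)/m with T = m*g + PID(e_h).

    All gains and parameters are illustrative, untuned values.
    """
    h, vh = 0.0, 0.0
    integral, prev_e = 0.0, h_des  # seed prev_e to avoid a derivative kick
    for _ in range(steps):
        e = h_des - h
        integral += e * dt
        deriv = (e - prev_e) / dt
        prev_e = e
        thrust = m * g + kp * e + ki * integral + kd * deriv
        a = (thrust - m * g) / m   # net vertical acceleration
        vh += a * dt
        h += vh * dt
    return h

# After 20 simulated seconds the model has settled close to the
# 1 m setpoint.
```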

To validate our quadrotor autonomous flight platform, we conducted experiments in an indoor environment without GPS signals. Before flight, we confirmed that the battery voltage was at a safe level and that the ground station software (QGroundControl) was running. The Raspberry Pi and the ground PC were connected to the same WLAN, allowing remote control and data logging. We then executed the autonomous flight code, which commanded the quadrotor to take off to a height of 1 meter using laser-based altitude hold and then follow a 1 m × 1 m square path using optical flow for position control. After completing the path, the quadrotor landed automatically.

The results demonstrated that the quadrotor successfully followed the predefined trajectory with minimal deviation. The laser rangefinder maintained stable altitude, while the optical flow sensor provided accurate horizontal positioning, compensating for drift over time. The quadrotor’s path was logged and analyzed, showing that it achieved the desired square shape with errors of less than 10 cm in position and height. This confirms the effectiveness of our sensor fusion approach for autonomous quadrotor navigation in GPS-denied environments.

In conclusion, we have designed and implemented an autonomous flight system for a quadrotor that leverages a laser rangefinder and an optical flow sensor to enable path following without GPS. The integration of the Pixhawk4, the Raspberry Pi, and custom software allows for robust control in indoor settings. However, optical-flow-based position estimates accumulate drift over time, which limits long-term accuracy. Future work will focus on incorporating more precise sensors, such as stereo vision systems, to enhance localization and enable more complex missions. This quadrotor platform serves as a foundation for applications in logistics, inspection, and other areas where GPS is unavailable.

The development of autonomous quadrotor systems is crucial for expanding their utility in constrained environments. Our design highlights the importance of multi-sensor integration and real-time processing for achieving reliable autonomy. As quadrotor technology advances, further improvements in algorithms and hardware will continue to push the boundaries of what these systems can accomplish.
