Quadcopter-Based Embedded System Practice Teaching Design

In the field of electronic information engineering, embedded systems represent a critical and comprehensive course that integrates hardware and software design. However, traditional teaching methods often emphasize theoretical lectures with limited practical projects, leading to reduced student engagement and inadequate hands-on experience. To address this, I have developed a new practical teaching case centered on a quadcopter unmanned aerial vehicle (UAV) system. This approach leverages the popularity and technical complexity of quadcopter drones to enhance the embedded systems curriculum, fostering student interest, innovation, and participation in competitions like the RAICOM robotics developer contest. The quadcopter system, built around an STM32F4 embedded processor, demonstrates autonomous flight, obstacle avoidance, target recognition, and simulated attacks through programmed algorithms. Experimental results confirm the system’s effectiveness in completing designated tasks, thereby enriching the educational experience with real-world engineering challenges.

The quadcopter UAV serves as an ideal platform for embedded systems education due to its multifaceted requirements in sensing, processing, and actuation. A typical quadcopter consists of four rotors arranged in an X-configuration, which provides stability and maneuverability. The core hardware includes motors, propellers, an embedded processor (e.g., STM32F4), sensors (e.g., inertial measurement units, lidar, cameras), and communication modules. The flight controller, based on the STM32F4, executes control algorithms to maintain stability and navigate environments. In this design, the quadcopter utilizes a PID control scheme for attitude regulation and integrates sensor data fusion for accurate positioning. The software layer employs the Robot Operating System (ROS) for modular development, facilitating tasks such as image processing, path planning, and autonomous decision-making. By exploring the quadcopter’s architecture, students gain insights into embedded system design, from low-level driver programming to high-level algorithm implementation.

To understand the quadcopter’s operation, it is essential to delve into its structural design and flight principles. The quadcopter frame supports four brushless motors, each driving a propeller. The motor arrangement follows an X-pattern, where diagonally opposite motors rotate in the same direction to counteract torque effects—for instance, motors M1 and M3 spin clockwise, while M2 and M4 spin counterclockwise. This configuration enables basic maneuvers: vertical motion by uniformly adjusting all motor speeds, pitch and roll by differential speed changes, and yaw by varying the torque balance. The flight dynamics can be modeled using Newton-Euler equations. For a quadcopter with mass $m$ and inertia matrix $I$, the translational motion is governed by:

$$ m \ddot{\mathbf{r}} = \mathbf{F}_g + \mathbf{R} \mathbf{F}_b $$

where $\mathbf{r}$ is the position vector, $\mathbf{F}_g$ is gravity, $\mathbf{R}$ is the rotation matrix, and $\mathbf{F}_b$ is the body-frame force. The rotational dynamics are described by:

$$ I \dot{\boldsymbol{\omega}} + \boldsymbol{\omega} \times (I \boldsymbol{\omega}) = \boldsymbol{\tau} $$

Here, $\boldsymbol{\omega}$ is the angular velocity vector, and $\boldsymbol{\tau}$ is the torque vector. These equations highlight the coupling between translational and rotational motions, necessitating sophisticated control strategies in embedded systems.
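The maneuver logic described above — uniform speed changes for vertical motion, differential changes for pitch and roll, and torque imbalance for yaw — can be sketched as a simple motor mixer. The motor ordering and command signs below are illustrative assumptions, not the exact conventions of the actual flight controller:

```python
def mix(throttle, roll, pitch, yaw):
    """Map collective throttle and axis commands to the four motor
    outputs of an X-configuration quadcopter.

    Assumes M1/M3 spin clockwise and M2/M4 counterclockwise, as in the
    text; the signs below are illustrative and depend on the actual
    frame layout and sign conventions.
    """
    m1 = throttle + roll + pitch - yaw  # front-left, CW
    m2 = throttle - roll + pitch + yaw  # front-right, CCW
    m3 = throttle - roll - pitch - yaw  # rear-right, CW
    m4 = throttle + roll - pitch + yaw  # rear-left, CCW
    # Clamp to a normalized 0..1 command range before sending to the ESCs.
    return [min(max(m, 0.0), 1.0) for m in (m1, m2, m3, m4)]
```

With zero axis commands all motors receive the same throttle; a positive roll command raises the left pair and lowers the right pair, tilting the vehicle.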

The hardware components of the quadcopter form the foundation of the embedded system curriculum. Key modules include the perception system, control unit, and propulsion mechanism. The perception system gathers environmental data through sensors like IMUs (accelerometers and gyroscopes), lidar for obstacle detection, ultrasonic sensors for altitude measurement, and cameras for visual feedback. The control unit, centered on the STM32F4 processor, processes sensor inputs using algorithms such as extended Kalman filters for state estimation. The propulsion system comprises electronic speed controllers (ESCs), motors, and batteries, which translate control signals into thrust. A summary of hardware components is provided in Table 1.

Table 1: Quadcopter Hardware Components and Their Functions
| Component | Description | Role in Embedded System |
|---|---|---|
| STM32F4 processor | ARM Cortex-M4 core with FPU | Executes control algorithms and data fusion |
| IMU sensors | Accelerometer and gyroscope | Provide attitude and motion data |
| Lidar sensor | 360-degree scanning for distance | Enables obstacle avoidance and mapping |
| Brushless motors | 920 kV rating for thrust generation | Actuate flight maneuvers based on PWM signals |
| ESC module | Converts control signals to motor power | Interfaces processor with motors |
| Camera module | USB-based for image capture | Facilitates target recognition tasks |

In the perception module, sensor data fusion is critical for accurate state estimation. For example, the IMU outputs acceleration $\mathbf{a}$ and angular velocity $\boldsymbol{\omega}$, which are fused with other sensors using a Kalman filter. The state vector $\mathbf{x}$ includes position, velocity, and orientation, and the filter predicts and updates states based on measurements. The prediction step is:

$$ \hat{\mathbf{x}}_{k|k-1} = f(\hat{\mathbf{x}}_{k-1|k-1}, \mathbf{u}_k) $$
$$ \mathbf{P}_{k|k-1} = \mathbf{F}_k \mathbf{P}_{k-1|k-1} \mathbf{F}_k^T + \mathbf{Q}_k $$

where $\mathbf{F}_k$ is the Jacobian of $f$, $\mathbf{P}$ is the error covariance, and $\mathbf{Q}_k$ is the process noise. The update step incorporates measurements $\mathbf{z}_k$:

$$ \mathbf{K}_k = \mathbf{P}_{k|k-1} \mathbf{H}_k^T (\mathbf{H}_k \mathbf{P}_{k|k-1} \mathbf{H}_k^T + \mathbf{R}_k)^{-1} $$
$$ \hat{\mathbf{x}}_{k|k} = \hat{\mathbf{x}}_{k|k-1} + \mathbf{K}_k (\mathbf{z}_k - h(\hat{\mathbf{x}}_{k|k-1})) $$
$$ \mathbf{P}_{k|k} = (\mathbf{I} - \mathbf{K}_k \mathbf{H}_k) \mathbf{P}_{k|k-1} $$

This algorithm enables the quadcopter to maintain stable flight by compensating for sensor noise and uncertainties.
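The predict/update cycle above can be made concrete with a scalar stand-in. The sketch below is a 1-D linear Kalman filter, not the full extended filter running on the quadcopter: here $F$ plays the role of the Jacobian $\mathbf{F}_k$, $H$ of $\mathbf{H}_k$, and the noise values are illustrative:

```python
def kf_step(x, P, u, z, F=1.0, B=1.0, H=1.0, Q=0.01, R=0.1):
    """One predict/update cycle of a scalar Kalman filter.

    A 1-D linear stand-in for the EKF equations in the text; all
    numeric defaults are illustrative, not tuned values.
    """
    # Predict: propagate the state estimate and error covariance.
    x_pred = F * x + B * u
    P_pred = F * P * F + Q
    # Update: blend the prediction with measurement z via the Kalman gain.
    K = P_pred * H / (H * P_pred * H + R)
    x_new = x_pred + K * (z - H * x_pred)
    P_new = (1 - K * H) * P_pred
    return x_new, P_new
```

Feeding in noisy measurements around a constant true value drives the estimate toward that value while the error covariance shrinks — the same mechanism that lets the flight controller suppress IMU noise.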

The software architecture of the quadcopter is built on ROS, which promotes modularity and reusability. Key ROS nodes include “web_cam” for image acquisition, “tracker_kcf” for target tracking, “mavros” for communication with the flight controller, “target_tracking” for desired state computation, and “px4_pos_control” for position control. The control algorithms, particularly PID, are implemented for attitude regulation. The PID controller computes the output $u(t)$ as:

$$ u(t) = K_p e(t) + K_i \int_0^t e(\tau) d\tau + K_d \frac{de(t)}{dt} $$

where $e(t)$ is the error signal, and $K_p$, $K_i$, $K_d$ are tuning parameters. For the quadcopter, a cascaded PID structure is used: the outer loop handles position or angle control, and the inner loop manages angular rates. This ensures rapid response and stability. For instance, the desired roll angle $\phi_d$ from the outer loop becomes the setpoint for the inner loop’s roll rate control.
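The cascaded structure can be sketched in a few lines. The gains below are illustrative placeholders, not the tuned values of the actual controller, and the real system runs this at a fixed inner-loop rate on the STM32F4:

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Cascaded structure: the outer loop tracks a roll angle, the inner loop a
# roll rate. Gains are illustrative, not values from the text.
angle_loop = PID(kp=4.0, ki=0.0, kd=0.0)
rate_loop = PID(kp=0.1, ki=0.02, kd=0.002)

def roll_control(phi_d, phi, rate, dt):
    """The outer loop's output becomes the rate setpoint for the inner loop."""
    rate_setpoint = angle_loop.update(phi_d - phi, dt)
    return rate_loop.update(rate_setpoint - rate, dt)
```

A positive angle error produces a positive rate setpoint and hence a positive actuator command, which the mixer then distributes across the motors.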

Target recognition and autonomous navigation are integral to the quadcopter’s mission. Image processing algorithms, such as Haar cascades or YOLO-based object detection, are employed for identifying targets. However, due to computational constraints on embedded platforms, optimized models like MobileNet or SqueezeNet are preferred. The detection process involves capturing frames, preprocessing, and invoking a neural network. The output confidence scores determine the target class. For obstacle avoidance, reactive methods like vector field histograms (VFH) are used. The VFH algorithm constructs a polar histogram of obstacle densities and selects steering directions to avoid collisions. The cost function for direction $\theta$ is:

$$ C(\theta) = \mu_1 \cdot \Delta(\theta, \theta_t) + \mu_2 \cdot \Delta(\theta, \theta_c) + \mu_3 \cdot h(\theta) $$

where $\Delta$ denotes angular difference, $\theta_t$ is the target direction, $\theta_c$ is the current direction, $h(\theta)$ is the histogram value, and $\mu_i$ are weights. This allows the quadcopter to navigate cluttered environments autonomously.
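The direction-selection step of the cost function above can be sketched as follows; the candidate set, weights, and histogram representation are illustrative assumptions rather than the exact VFH implementation used on the vehicle:

```python
import math

def angle_diff(a, b):
    """Smallest absolute angular difference, wrapped to [0, pi]."""
    d = abs(a - b) % (2 * math.pi)
    return min(d, 2 * math.pi - d)

def select_direction(histogram, theta_t, theta_c, mu=(5.0, 2.0, 2.0)):
    """Pick the candidate direction minimizing
    C(theta) = mu1*d(theta, theta_t) + mu2*d(theta, theta_c) + mu3*h(theta).

    `histogram` maps candidate directions (radians) to obstacle density;
    the weights mu are illustrative defaults, not tuned values.
    """
    mu1, mu2, mu3 = mu
    def cost(theta):
        return (mu1 * angle_diff(theta, theta_t)
                + mu2 * angle_diff(theta, theta_c)
                + mu3 * histogram[theta])
    return min(histogram, key=cost)
```

Weighting the target-direction term most heavily biases the vehicle toward its goal, while the histogram term vetoes directions with high obstacle density.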

In the practical implementation, the quadcopter system was tested in an indoor scenario mimicking the RAICOM competition. The flight area measured 5m × 5m with obstacles such as walls and cylinders. The quadcopter’s tasks included autonomous takeoff, obstacle avoidance, target identification, and simulated attacks. The software pipeline initiated with sensor calibration and ROS node activation. For example, the lidar-based localization was launched using `roslaunch location location.launch`, generating a cost map for navigation. The quadcopter successfully avoided obstacles by dynamically adjusting its path based on lidar readings. In target recognition tests, the quadcopter hovered over a designated area, captured images, and identified characters or QR codes using OpenCV routines. Upon confirmation, it positioned itself in front of the target and activated a laser pointer for simulated strikes. The entire mission was completed within the time limit, demonstrating the system’s reliability.
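The mission sequencing just described — takeoff, navigation with avoidance, target search, simulated strike — can be sketched as a state machine. The state names and sensor keys below are hypothetical; the real system distributes this logic across ROS nodes rather than a single function:

```python
def mission_step(state, sensors):
    """One decision step of a simplified competition mission.

    States and sensor keys are illustrative assumptions, not the
    actual node interfaces of the deployed system.
    """
    if state == "TAKEOFF":
        return "NAVIGATE" if sensors["altitude"] >= 1.0 else "TAKEOFF"
    if state == "NAVIGATE":
        if sensors["obstacle_distance"] < 0.5:
            return "AVOID"
        return "SEARCH" if sensors["at_target_area"] else "NAVIGATE"
    if state == "AVOID":
        return "NAVIGATE" if sensors["obstacle_distance"] >= 0.5 else "AVOID"
    if state == "SEARCH":
        return "STRIKE" if sensors["target_confirmed"] else "SEARCH"
    if state == "STRIKE":
        return "LAND"  # laser pointer fired; mission complete
    return state
```

Keeping the transitions explicit makes each mission phase individually testable, which matters when debugging a timed competition run.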

To quantify the quadcopter’s performance, various metrics were evaluated, such as flight time, accuracy in target recognition, and success rate in obstacle avoidance. Table 2 summarizes the experimental results from multiple test runs.

Table 2: Quadcopter Performance Metrics in Experimental Tests
| Metric | Average Value | Notes |
|---|---|---|
| Flight time | 8.5 minutes | Within the 10-minute limit |
| Target recognition accuracy | 92% | Based on 50 trials |
| Obstacle avoidance success rate | 95% | In cluttered environments |
| Position holding error | ±0.1 m | During hover |
| CPU usage on STM32F4 | 75% | Peak during sensor fusion |

The control system’s effectiveness is further illustrated through the PID tuning process. For attitude control, the proportional gain $K_p$ affects responsiveness, while $K_i$ eliminates steady-state error and $K_d$ dampens oscillations. The Ziegler-Nichols method was applied to determine initial gains, followed by fine-tuning. Approximating the attitude dynamics as a second-order plant $1/(s(s+B))$, where $B$ is the damping coefficient, the closed-loop transfer function under PID control is:

$$ G(s) = \frac{K_d s^2 + K_p s + K_i}{s^3 + (B + K_d) s^2 + K_p s + K_i} $$

This model helps in simulating the quadcopter’s response to disturbances.
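The Ziegler-Nichols step mentioned above amounts to a simple calculation once the ultimate gain and oscillation period are measured. The sketch below applies the textbook closed-loop rules; the numeric inputs are illustrative, not measurements from this system:

```python
def ziegler_nichols_pid(Ku, Tu):
    """Classic Ziegler-Nichols closed-loop PID tuning rules.

    Ku is the ultimate gain at which the loop sustains oscillation and
    Tu the oscillation period. Returns (Kp, Ki, Kd) from the textbook
    rule Kp = 0.6*Ku, Ti = Tu/2, Td = Tu/8 -- starting values meant to
    be fine-tuned, as the text notes.
    """
    Kp = 0.6 * Ku
    Ki = Kp / (Tu / 2)   # Ki = Kp / Ti
    Kd = Kp * (Tu / 8)   # Kd = Kp * Td
    return Kp, Ki, Kd
```

For example, an ultimate gain of 10 with a 0.8 s oscillation period yields roughly Kp = 6, Ki = 15, Kd = 0.6 as a starting point.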

Moreover, the embedded software incorporates multi-threading to handle concurrent tasks. For instance, one thread manages sensor data acquisition, another processes control algorithms, and a third handles communication. This ensures real-time performance, which is crucial for stable quadcopter operation. The STM32F4 processor, with its DSP instructions and floating-point unit, efficiently executes these tasks. Power management is also critical; the battery life dictates mission duration, and efficient PWM signals to ESCs minimize energy consumption. The thrust $T$ generated by a motor is related to the PWM duty cycle $D$ by:

$$ T = k_t \cdot D^2 $$

where $k_t$ is a motor-specific constant. This quadratic relationship emphasizes the need for precise control to avoid excessive power usage.
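Inverting the quadratic relationship shows why this matters for power budgeting: a minimal sketch, assuming a normalized duty cycle and an illustrative $k_t$, is:

```python
import math

def duty_for_thrust(T, k_t):
    """Invert T = k_t * D^2 to find the duty cycle for a desired thrust.

    k_t is the motor-specific constant from the text; any values used
    here are illustrative. Returns D clamped to the valid [0, 1] range.
    """
    return min(math.sqrt(max(T, 0.0) / k_t), 1.0)

# Because thrust grows quadratically with duty cycle, producing half of
# maximum thrust requires about 71% duty, not 50% -- small duty-cycle
# errors near hover translate into disproportionate thrust errors.
```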

In educational contexts, students engage in projects like building and programming the quadcopter, which covers embedded C programming, RTOS concepts, and algorithm development. For example, they implement sensor drivers using STM32 HAL libraries, develop communication protocols for telemetry, and integrate machine learning models for advanced tasks. The hands-on experience with the quadcopter system bridges theory and practice, preparing students for industry challenges. Furthermore, the modular design allows for scalability; additional sensors or algorithms can be incorporated without major overhauls.

In conclusion, the quadcopter-based embedded system practice teaching design significantly enhances the learning experience by combining theoretical knowledge with practical application. The quadcopter serves as a comprehensive platform for exploring embedded hardware and software, from low-level microcontroller programming to high-level autonomous functions. Through projects involving quadcopter drones, students develop critical skills in system integration, problem-solving, and innovation. The success in competitions and real-world scenarios underscores the value of this approach. Future work may involve integrating more advanced AI techniques or expanding to swarm robotics, further enriching the embedded systems curriculum.
