The rapid development of the low-altitude economy demands robust solutions for low-altitude UAV operations, combining high-reliability communication with effective safety supervision. Integrated Sensing and Communications (ISAC) emerges as a pivotal 6G technology capable of simultaneously providing broadband connectivity for cooperative low-altitude drones and detecting non-cooperative targets. This integration addresses two fundamental requirements: low-latency command links and high-capacity data channels for large-scale drone operations, and comprehensive surveillance to identify rogue drones operating in restricted airspace.

Traditional approaches face significant limitations in low-altitude UAV scenarios. Radar systems offer precise motion tracking but suffer from high deployment costs and limited coverage continuity. Visual detection provides target classification but fails under poor lighting or adverse weather. Communication-centric methods enable connectivity but lack native sensing capabilities. ISAC overcomes these limitations by leveraging existing cellular infrastructure to create unified communication-perception networks optimized for low-altitude drone operations.
Research Review
Single-Station Sensing for Low-Altitude UAVs
Fundamental research in single-station ISAC focuses on dynamic target detection and parameter estimation. Millimeter-wave massive MIMO systems significantly enhance angular resolution for low altitude drone tracking through compressed sampling theory and beamforming optimization:
$$ \text{SINR}_{\text{sensing}} = \frac{\|\mathbf{w}^H\mathbf{H}_{\text{TR}}\mathbf{f}\|^2 P_t}{\sigma_n^2 + \sum_{k=1}^K \|\mathbf{w}^H\mathbf{H}_{\text{intf},k}\mathbf{f}_k\|^2 P_{t,k}} $$
where \(\mathbf{w}\) is the receive beamformer, \(\mathbf{f}\) is the transmit beamformer, \(\mathbf{H}_{\text{TR}}\) is the target response channel, \(\mathbf{H}_{\text{intf},k}\) and \(\mathbf{f}_k\) are the channel and transmit beamformer of the \(k\)-th interferer, and \(\sigma_n^2\) is the noise power. OFDM-based parameter estimation enables simultaneous distance and radial-velocity measurement for low-altitude UAVs, though resolution limitations persist. Recent innovations incorporate Reconfigurable Intelligent Surfaces (RIS) to distinguish target echoes from environmental reflections, significantly enhancing tracking accuracy for small low-altitude drones.
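As a concrete illustration, the SINR above can be evaluated numerically. The sketch below uses random Rayleigh-style channels and SVD-matched beamformers, which are illustrative choices rather than a design from this work, and compares the target SINR with and without one co-channel interferer:

```python
import numpy as np

def sensing_sinr(w, f, H_tr, P_t, interferers, noise_power):
    """Sensing SINR for receive/transmit beamformers (w, f), target response
    channel H_tr, and a list of (H_k, f_k, P_k) interferer tuples."""
    signal = np.abs(w.conj() @ H_tr @ f) ** 2 * P_t
    interference = sum(np.abs(w.conj() @ H_k @ f_k) ** 2 * P_k
                       for H_k, f_k, P_k in interferers)
    return signal / (noise_power + interference)

rng = np.random.default_rng(0)
Nr, Nt = 8, 8
H = (rng.standard_normal((Nr, Nt)) + 1j * rng.standard_normal((Nr, Nt))) / np.sqrt(2)
U, s, Vh = np.linalg.svd(H)
w, f = U[:, 0], Vh[0].conj()          # beamformers matched to the dominant channel mode
sinr_clean = sensing_sinr(w, f, H, 1.0, [], 0.1)

H_i = (rng.standard_normal((Nr, Nt)) + 1j * rng.standard_normal((Nr, Nt))) / np.sqrt(2)
f_i = np.ones(Nt) / np.sqrt(Nt)
sinr_intf = sensing_sinr(w, f, H, 1.0, [(H_i, f_i, 1.0)], 0.1)
print(sinr_clean > sinr_intf)         # interference can only lower the SINR
```

For the matched pair, the clean SINR reduces to \(s_1^2 P_t / \sigma_n^2\), with \(s_1\) the largest singular value of \(\mathbf{H}_{\text{TR}}\).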
Multi-Station Cooperative Sensing
Multi-base station cooperation dramatically extends coverage and improves sensing accuracy for low altitude UAV detection. Collaborative networks leverage spatial diversity to overcome single-station limitations through three fundamental approaches:
| Cooperation Level | Data Volume | Accuracy | Robustness |
|---|---|---|---|
| Signal-Level | High | Excellent | Low |
| Feature-Level | Medium | High | Medium |
| Decision-Level | Low | Moderate | High |
The optimal feature-level cooperative framework balances accuracy and efficiency through distributed feature extraction and centralized fusion:
$$ \mathcal{F}_{\text{fused}} = \Phi\left( \bigcup_{i=1}^N \psi(\mathbf{Y}_i) \right) $$
where \(\psi(\cdot)\) denotes feature extraction at each base station, \(\bigcup\) represents feature transmission, and \(\Phi(\cdot)\) is the fusion function at the edge computing center. This architecture enables precise tracking of low altitude drones across cell boundaries while maintaining manageable computational loads.
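A minimal sketch of this pipeline, assuming a hand-crafted range-profile energy feature for \(\psi\) and element-wise averaging for \(\Phi\) (simple stand-ins for the learned extractors and fusion network):

```python
import numpy as np

def psi(Y):
    """Per-station feature extraction: average range-profile energy."""
    profile = np.fft.fft(Y, axis=-1)               # fast-time FFT -> range bins
    return np.mean(np.abs(profile) ** 2, axis=0)   # energy per range bin

def phi(features):
    """Fusion at the edge center: element-wise mean, then peak pick."""
    fused = np.mean(features, axis=0)
    return fused, int(np.argmax(fused))

rng = np.random.default_rng(1)
N_stations, pulses, samples = 4, 16, 64
target_bin = 20
t = np.arange(samples)
echo = np.exp(2j * np.pi * target_bin * t / samples)  # echo at a common range bin
stations = [echo + 0.3 * (rng.standard_normal((pulses, samples))
                          + 1j * rng.standard_normal((pulses, samples)))
            for _ in range(N_stations)]

fused, detected_bin = phi([psi(Y) for Y in stations])
print(detected_bin)  # -> 20
```

Because every station observes an echo in the same range bin, averaging their features reinforces the target peak while uncorrelated noise averages out.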
Multi-Modal Perception
Fusing wireless sensing with visual information creates robust classification systems for low altitude UAV identification. The complementary strengths address individual modality weaknesses:
$$ \text{Decision}_{\text{final}} = \begin{cases}
\alpha \cdot \mathcal{D}_{\text{rf}} + \beta \cdot \mathcal{D}_{\text{vision}} & \text{if } \mathcal{Q}_{\text{vision}} > \tau \\
\mathcal{D}_{\text{rf}} & \text{otherwise}
\end{cases} $$
where \(\mathcal{D}\) represents detection confidence, \(\mathcal{Q}_{\text{vision}}\) denotes visual quality metrics, and \(\alpha\), \(\beta\) are fusion weights. Deep learning architectures enable sophisticated feature fusion between point clouds from RF sensing and visual embeddings:
$$ \mathbf{v}_{\text{fused}} = \text{MLP}\left( \text{MaxPool}\left( \text{Conv3D}(\mathbf{P}_{\text{pointcloud}}) \right) \oplus \text{CNN}(\mathbf{I}_{\text{visual}}) \right) $$
This multi-modal approach achieves >95% classification accuracy for low altitude drones versus birds in field tests, significantly outperforming single-modality solutions.
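The gated fusion rule above can be written directly as a small function; the weights and threshold below are illustrative values, not ones reported in the text:

```python
def fuse_decisions(d_rf, d_vision, q_vision, tau=0.5, alpha=0.6, beta=0.4):
    """Quality-gated fusion: trust vision only when its quality metric
    exceeds tau; otherwise fall back to the RF decision alone.
    (alpha, beta, tau are illustrative, not values from this work.)"""
    if q_vision > tau:
        return alpha * d_rf + beta * d_vision
    return d_rf

# Daytime: good visual quality, so a weighted combination is used
print(fuse_decisions(0.7, 0.9, q_vision=0.8))   # ~0.78
# Night/fog: poor visual quality, so the RF confidence is used alone
print(fuse_decisions(0.7, 0.2, q_vision=0.1))   # 0.7
```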
Resource Management and Interference Control
Cellular ISAC networks require sophisticated interference management between communication and sensing functions. The joint optimization problem maximizes network utility:
$$ \max_{\mathbf{P},\mathbf{W},\mathbf{B}} \sum_{u=1}^U R_u(\mathbf{P},\mathbf{W},\mathbf{B}) + \lambda \sum_{s=1}^S \text{CRB}^{-1}(\mathbf{P},\mathbf{W},\mathbf{B}) $$
$$ \text{s.t.} \quad \sum_{b=1}^B P_{b,c} \leq P_{\max}, \quad \sum_{c=1}^C B_{b,c} \leq B_{\text{total}}, \quad \text{SINR}_u \geq \gamma_{\min} $$
where \(R_u\) is communication rate, CRB is Cramér-Rao Bound for sensing, \(\mathbf{P}\) is power allocation, \(\mathbf{W}\) is beamforming weights, and \(\mathbf{B}\) is bandwidth allocation. Multi-base station beam coordination employs reinforcement learning to minimize interference:
| Parameter | State Space | Action Space | Reward Function |
|---|---|---|---|
| Beam Alignment | Channel estimates, Target positions | Beam index selection | Throughput + Sensing accuracy |
| Power Control | Interference measurements, QoS status | Power adjustment levels | Energy efficiency + SINR balance |
Optimal solutions demonstrate 40% interference reduction compared to non-cooperative approaches, significantly enhancing low altitude UAV tracking precision.
Key Technologies for Low-Altitude UAV ISAC
Single-Station Sensing Framework
Comprehensive low altitude drone monitoring requires integrated signal processing chains:
$$ \small \mathbf{Y}_{\text{processed}} = \underbrace{\mathcal{F}_{\text{Doppler}}\left( \overbrace{\mathcal{F}_{\text{range}}\left( \mathbf{Y}_{\text{raw}} \odot \mathbf{S}_{\text{comp}}^* \right)}^{\text{Matched filtering}} \right)}_{\text{Doppler processing}} \ominus \underbrace{\mathcal{F}_{\text{clutter}}\left( \mathbf{Y}_{\text{static}} \right)}_{\text{Clutter cancellation}} $$
The processing chain includes: 1) beam scanning across designated sectors, 2) static clutter suppression using spatial filtering, 3) constant false alarm rate (CFAR) target detection, 4) multi-parameter estimation (range, angle, velocity), 5) coordinate transformation to the Earth frame, and 6) multi-target tracking with interacting multiple model (IMM) filters. This enables real-time detection and continuous tracking of multiple low-altitude UAVs within a 300 m radius.
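The front end of this chain can be sketched in miniature. The snippet below uses a simplified 1-D cell-averaging CFAR and an on-grid point target (not the full production pipeline) to build a range-Doppler map and detect the target bin:

```python
import numpy as np

def range_doppler_map(Y_raw, s_comp):
    """Matched filtering (conjugate reference compensation), then range FFT
    over fast time and Doppler FFT over slow time."""
    Y = Y_raw * np.conj(s_comp)                          # S_comp^* compensation
    Y = np.fft.fft(Y, axis=1)                            # fast time -> range bins
    Y = np.fft.fftshift(np.fft.fft(Y, axis=0), axes=0)   # slow time -> Doppler bins
    return np.abs(Y) ** 2

def ca_cfar(power, guard=2, train=8, scale=8.0):
    """Cell-averaging CFAR along one range cut (simplified, 1-D)."""
    hits = []
    for i in range(train + guard, len(power) - train - guard):
        noise = np.r_[power[i - train - guard:i - guard],
                      power[i + guard + 1:i + guard + 1 + train]]
        if power[i] > scale * noise.mean():
            hits.append(i)
    return hits

rng = np.random.default_rng(5)
M, N = 32, 64                     # pulses x fast-time samples
r0, d0 = 30, 5                    # true range bin and Doppler bin
m, n = np.meshgrid(np.arange(M), np.arange(N), indexing="ij")
Y_raw = np.exp(2j * np.pi * (r0 * n / N + d0 * m / M))
Y_raw += 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))

rdm = range_doppler_map(Y_raw, np.ones((M, N)))
dop, rng_bin = np.unravel_index(np.argmax(rdm), rdm.shape)
hits = ca_cfar(rdm[dop])
print(dop - M // 2, rng_bin, hits)  # Doppler bin 5, range bin 30 detected
```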
Multi-Station Feature-Level Cooperative Sensing
Our cooperative framework employs distributed feature extraction with deep learning-based compression:
$$ \mathbf{F}_i = \text{Enc}_{\theta_i}\left( \mathcal{C}\left( \mathbf{Y}_i \right) \right) $$
where \(\mathcal{C}(\cdot)\) is clutter suppression, and \(\text{Enc}_{\theta_i}(\cdot)\) is a lightweight encoder network. The fusion center reconstructs signals using:
$$ \hat{\mathbf{Y}}_i = \text{Dec}_{\phi}\left( \mathbf{F}_i \right) $$
Key advantages include: 1) 80% reduced fronthaul requirements versus raw data transmission, 2) Graceful degradation during partial network failures, 3) Scalable architecture supporting 10+ base stations, and 4) 0.5m positioning accuracy for low altitude drones across heterogeneous networks.
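As an illustrative stand-in for the learned encoder/decoder pair, the sketch below compresses each station's clutter-suppressed signal to its k strongest DFT coefficients and reconstructs it at the fusion center; the actual networks \(\text{Enc}_{\theta_i}\) and \(\text{Dec}_{\phi}\), and the 80% fronthaul figure, correspond to a learned and tuned version of this idea:

```python
import numpy as np

def enc(y, k):
    """Stand-in encoder: keep the k largest-magnitude DFT coefficients
    (their indices plus complex values)."""
    Y = np.fft.fft(y)
    idx = np.argsort(np.abs(Y))[-k:]
    return idx, Y[idx]

def dec(idx, vals, n):
    """Stand-in decoder: rebuild the sparse spectrum and invert it."""
    Y = np.zeros(n, dtype=complex)
    Y[idx] = vals
    return np.fft.ifft(Y)

rng = np.random.default_rng(2)
n, k = 256, 16
t = np.arange(n)
y = np.exp(2j * np.pi * 40 * t / n) + 0.05 * (
    rng.standard_normal(n) + 1j * rng.standard_normal(n))
idx, vals = enc(y, k)
y_hat = dec(idx, vals, n)
err = np.linalg.norm(y - y_hat) / np.linalg.norm(y)
# k complex values (plus indices) are sent instead of n complex samples
print(f"fronthaul reduction: {1 - k / n:.0%}, rel. error: {err:.2f}")
```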
Vision-RF Fusion for Target Identification
The integrated perception framework combines complementary sensing modalities:
- Visual Processing: $$ \mathbf{b}_{\text{bbox}}, \mathbf{v}_{\text{vis}} = \text{YOLOv7}\left( \mathcal{I}_{\text{enhanced}} \right) $$
- Beam Steering: $$ \mathbf{f}_{\text{beam}} = \arg\max_{\mathbf{f} \in \mathcal{F}} \mathbf{f}^H \mathbf{a}(\theta_{\text{vis}}, \phi_{\text{vis}}) $$
- RF Feature Extraction: $$ \mathbf{v}_{\text{rf}} = \text{PointNet++}\left( \mathcal{P}_{\text{enhanced}} \right) $$
- Cross-Modal Fusion: $$ \mathbf{v}_{\text{fused}} = \text{CrossAttention}\left( \mathbf{v}_{\text{rf}}, \mathbf{v}_{\text{vis}} \right) $$
This architecture maintains >90% identification accuracy for low altitude UAVs under diverse environmental conditions, including night operations and moderate precipitation.
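The cross-modal fusion step can be sketched as single-head cross-attention with \(\mathbf{v}_{\text{rf}}\) as queries and \(\mathbf{v}_{\text{vis}}\) as keys and values; the dimensions, random projections, and concatenation-style output below are assumptions for illustration:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(v_rf, v_vis, Wq, Wk, Wv):
    """RF tokens attend to visual tokens (single head).
    v_rf: (Nr, d) RF point-cloud embeddings; v_vis: (Nv, d) visual embeddings."""
    Q, K, V = v_rf @ Wq, v_vis @ Wk, v_vis @ Wv
    A = softmax(Q @ K.T / np.sqrt(K.shape[-1]))    # (Nr, Nv) attention weights
    return np.concatenate([v_rf, A @ V], axis=-1)  # fuse by concatenation

rng = np.random.default_rng(3)
d, Nr, Nv = 32, 10, 6
W = [rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3)]
fused = cross_attention(rng.standard_normal((Nr, d)),
                        rng.standard_normal((Nv, d)), *W)
print(fused.shape)  # (10, 64): each RF token augmented with attended visual context
```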
Multi-Station Resource Management
Base Station Deployment Optimization
Optimal ISAC network planning incorporates environmental constraints:
$$ \min_{\mathbf{X}, \mathbf{z}} \sum_{i=1}^N z_i \cdot C_{\text{deploy}} + \sum_{k=1}^K \mathbb{E} \left[ \text{CRB}(\mathbf{x}_{\text{uav},k}, \mathbf{X}, \mathbf{z}) \right] $$
$$ \text{s.t.} \quad \|\mathbf{x}_i - \mathbf{x}_j\| \geq d_{\min}, \quad z_i \in \{0,1\}, \quad \mathcal{G}(\mathbf{X}, \mathbf{z}) \geq \eta $$
where \(\mathbf{X}\) contains BS positions, \(\mathbf{z}\) indicates activation status, and \(\mathcal{G}\) represents coverage requirements. Reinforcement learning enables dynamic activation control:
$$ Q(s,a) \leftarrow (1-\alpha)Q(s,a) + \alpha\left[ r + \gamma \max_{a'} Q(s',a') \right] $$
with states \(s = (\text{traffic}, \text{UAV density}, \text{interference})\), actions \(a = \text{activation set}\), and reward \(r = \eta_{\text{comm}} \cdot R_{\text{sum}} + \eta_{\text{sense}} \cdot \text{CRB}^{-1}\). This reduces energy consumption by 35% while maintaining sensing coverage for low altitude drones.
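The tabular update above can be demonstrated on a toy activation-control problem; the two-state environment and reward values below are invented purely for illustration:

```python
import random

def q_update(Q, s, a, r, s_next, actions, alpha=0.1, gamma=0.9):
    """One tabular Q-learning step, matching the update rule above."""
    best_next = max(Q.get((s_next, a2), 0.0) for a2 in actions)
    Q[(s, a)] = (1 - alpha) * Q.get((s, a), 0.0) + alpha * (r + gamma * best_next)

# Toy environment: activating all base stations pays off only under high
# traffic; a subset suffices (and saves energy) under low traffic.
actions = ["subset_on", "all_on"]
rewards = {("low", "subset_on"): 1.0, ("low", "all_on"): 0.2,
           ("high", "subset_on"): 0.1, ("high", "all_on"): 1.0}
Q = {}
random.seed(0)
for _ in range(2000):
    s = random.choice(["low", "high"])
    a = random.choice(actions)                 # pure exploration for the sketch
    q_update(Q, s, a, rewards[(s, a)], random.choice(["low", "high"]), actions)

policy = {s: max(actions, key=lambda a: Q.get((s, a), 0.0))
          for s in ["low", "high"]}
print(policy)  # low traffic -> subset_on, high traffic -> all_on
```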
Collaborative Beamforming Design
Multi-cell beam coordination minimizes interference through semi-definite relaxation:
$$ \min_{\mathbf{R}_i} \sum_{i=1}^N \text{tr}(\mathbf{R}_i) $$
$$ \text{s.t.} \quad \frac{\text{tr}(\mathbf{H}_{i,u}\mathbf{R}_i)}{\sum_{j\neq i} \text{tr}(\mathbf{H}_{j,u}\mathbf{R}_j) + \sigma^2} \geq \gamma_u $$
$$ \quad \quad \text{tr}(\mathbf{A}(\theta)\mathbf{R}_i) \geq P_{\text{sens}}, \quad \mathbf{R}_i \succeq 0 $$
After solving the convex problem, we recover rank-one solutions through randomization: \(\mathbf{R}_i = \mathbf{f}_i\mathbf{f}_i^H\). The optimized beam patterns achieve 18dB interference suppression while maintaining precise beam pointing toward low altitude UAV targets.
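The rank-one recovery step can be sketched as follows: when the SDR solution happens to be rank one, the principal eigenvector is exact, and otherwise it is the usual starting point for Gaussian randomization.

```python
import numpy as np

def rank_one_beamformer(R):
    """Recover a beamformer f from the SDR covariance solution R.
    For (near) rank-one R, the scaled principal eigenvector satisfies
    f f^H ~= R; otherwise randomization refines this starting point."""
    eigvals, eigvecs = np.linalg.eigh(R)        # eigenvalues in ascending order
    return np.sqrt(eigvals[-1]) * eigvecs[:, -1]

# Sanity check on a synthetic rank-one covariance R = f f^H
rng = np.random.default_rng(4)
f_true = rng.standard_normal(8) + 1j * rng.standard_normal(8)
R = np.outer(f_true, f_true.conj())
f = rank_one_beamformer(R)
# f is recovered up to a global phase, so f f^H reproduces R exactly
print(np.linalg.norm(np.outer(f, f.conj()) - R))  # ~0
```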
Resource Allocation Strategy
Joint bandwidth and power allocation balances communication and sensing demands:
$$ \max_{\mathbf{p},\mathbf{b}} \sum_{u=1}^U b_u \log_2\left(1 + \frac{p_u |h_u|^2}{b_u N_0}\right) + \lambda \sum_{s=1}^S \left( \frac{1}{\text{CRB}_{\text{range}}} + \frac{1}{\text{CRB}_{\text{angle}}} \right) $$
$$ \text{CRB}_{\text{range}} = \frac{c^2}{8\pi^2 \cdot \text{SNR} \cdot \beta^2}, \quad \text{CRB}_{\text{angle}} = \frac{\lambda^2}{8\pi^2 \cdot \text{SNR} \cdot N_{\text{ant}} \cdot \cos^2 \theta} $$
where \(\beta\) is effective bandwidth, \(N_{\text{ant}}\) is antenna elements, and \(\theta\) is AoA. The alternating optimization approach:
| Step | Fixed Variables | Optimized Variables | Method |
|---|---|---|---|
| 1 | Band assignment | Power allocation | Convex optimization |
| 2 | Power allocation | Band assignment | Genetic algorithm |
This strategy improves spectral efficiency by 25% while maintaining sensing precision for low altitude UAV surveillance.
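For fixed bandwidths, the communication term of step 1 reduces to classic water-filling over the effective gains \(|h_u|^2/(b_u N_0)\); a minimal sketch covering only that term (the sensing term and bandwidth step are outside this sketch):

```python
import numpy as np

def waterfill(gains, P_total):
    """Water-filling power allocation: maximize sum log2(1 + g_u p_u)
    subject to sum p_u <= P_total, giving p_u = max(0, mu - 1/g_u)."""
    g = np.asarray(gains, dtype=float)
    order = np.argsort(-g)                      # strongest channels first
    p = np.zeros_like(g)
    for k in range(len(g), 0, -1):              # shrink the active set until feasible
        active = order[:k]
        mu = (P_total + np.sum(1.0 / g[active])) / k   # common water level
        if np.all(mu - 1.0 / g[active] >= 0):
            p[active] = mu - 1.0 / g[active]
            break
    return p

g = [4.0, 1.0, 0.25]                            # effective gains |h|^2 / (b N0)
p = waterfill(g, P_total=2.0)
print(p, p.sum())  # weakest user gets zero power; the budget is fully used
```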
Millimeter-Wave ISAC Prototype Validation
We developed an FPGA-based hardware platform to validate ISAC performance for low altitude drone applications:
$$ \small \mathbf{Y}_{\text{radar}} = \text{IFFT}\left( \text{FFT}(\mathbf{y}_{\text{rx}}) \oslash \hat{\mathbf{H}}_{\text{comm}} \right) \quad \text{(Communication-aided sensing)} $$
$$ \small \hat{\mathbf{s}}_{\text{comm}} = \text{Decode}\left( \mathbf{y}_{\text{rx}} \odot \Phi(\hat{\mathbf{\Theta}}_{\text{radar}}) \right) \quad \text{(Sensing-aided communication)} $$
| Parameter | Value | Parameter | Value |
|---|---|---|---|
| Carrier Frequency | 26 GHz | Bandwidth | 820 MHz |
| Antenna Configuration | 8×8 ULA | Range Resolution | 0.183 m |
| Velocity Resolution | 0.091 m/s | Angle Resolution | 3.0° |
| Communication Rate | 1.2 Gbps | Max Detection Range | 300 m |
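The table's range and velocity resolutions are consistent with the standard pulse-compression relations \(\Delta R = c/(2B)\) and \(\Delta v = \lambda/(2T_{\text{CPI}})\); a quick cross-check (the coherent processing interval is inferred here, not a stated platform parameter):

```python
# Cross-check the prototype table against standard radar relations.
c = 3e8                      # speed of light (m/s), rounded
B = 820e6                    # bandwidth from the table
delta_R = c / (2 * B)
print(f"range resolution: {delta_R:.3f} m")   # matches the 0.183 m entry

fc = 26e9                    # carrier frequency from the table
wavelength = c / fc
delta_v = 0.091              # velocity resolution from the table (m/s)
T_cpi = wavelength / (2 * delta_v)            # implied CPI, roughly 63 ms
print(f"implied CPI: {T_cpi * 1e3:.1f} ms")
```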
Field tests demonstrated exceptional performance in challenging scenarios:
- Multi-Target Tracking: Simultaneous tracking of 3 low altitude drones with 0.5m position accuracy
- Classification Performance: 98.7% drone-bird discrimination using RF-vision fusion
- Adverse Conditions: Maintained 92% detection probability in moderate rain and fog
- Communication-Sensing Integration: <5% throughput degradation during active sensing
The prototype validates ISAC's capability to deliver high-speed connectivity to authorized low-altitude UAVs while simultaneously detecting unauthorized drones, a critical requirement for secure low-altitude airspace management.
Conclusion
ISAC technology fundamentally transforms low-altitude drone management by integrating communication and sensing capabilities within unified cellular infrastructure. Our research demonstrates: 1) multi-station cooperation extends detection range to 300 m with 0.5 m positioning accuracy, 2) vision-RF fusion achieves >98% classification accuracy between low-altitude UAVs and birds, 3) collaborative beamforming reduces interference by 18 dB while maintaining sensing precision, and 4) hardware prototypes validate real-time tracking during active communication service. These advances establish ISAC as a foundational technology for the secure, efficient low-altitude airspace management demanded by rapidly expanding drone applications.
