UAV oblique photogrammetry combines conventional photographic imaging with computer-vision processing to capture geometric and textural information about objects. Its efficiency, reliability, and adaptability make it invaluable for cultural heritage conservation, enabling rapid site surveying, structural documentation, and deterioration monitoring. Surveying drones provide high-resolution, real-time spatial data critical for developing precise conservation strategies.
Earthen sites, historical structures composed primarily of soil, are prevalent throughout China’s Yellow and Yangtze River basins. Subject to natural erosion and human activity, these sites develop complex deterioration patterns, and accurately assessing their geometry and condition remains challenging. Traditional methods rely on manual measurements with ranging poles and tape measures, followed by labor-intensive photo stitching; this approach introduces distortions and measurement errors and cannot capture three-dimensional deterioration features such as undercuts and cavities. Surveying UAVs overcome these limitations by acquiring comprehensive multi-angle imagery for high-fidelity 3D reconstruction.
Technical Methodology
System Composition
The workflow integrates hardware and software components. Hardware includes surveying UAVs equipped with ≥20MP cameras, ground control points (GCPs), RTK-GNSS receivers, and measurement poles. Software platforms such as Metashape and Geomagic Wrap process the data into measurable outputs. Camera parameters are standardized: aperture f/2.8, ISO ≤200, shutter speed 1/60–1/80s. Consistent lighting minimizes texture variation during image matching.
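For repeatability, the standardized camera settings can be kept as a small shared configuration; a minimal sketch in Python (the constant name and layout are illustrative conventions, not tied to any of the software listed above):

```python
# Standardized camera settings from the system description above,
# kept as plain constants so acquisition and QA scripts share one source of truth.
CAMERA_CONFIG = {
    "min_resolution_mp": 20,               # >=20MP sensor
    "aperture": "f/2.8",
    "iso_max": 200,
    "shutter_speed_s": (1 / 80, 1 / 60),   # 1/80s to 1/60s exposure window
}
```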
Data Acquisition
Surveying UAVs perform oblique photography at a 45° tilt angle, maintaining ≥70% overlap between consecutive images. Flight altitude is set to roughly twice the target’s height (e.g., 20m for an 8m wall). Parallel flight paths cover both interior and exterior surfaces, with increased shot density at corners. GCPs (5 per 1,000m²) are positioned around the target and georeferenced with RTK-GNSS (planar error ≤1cm, elevation error ≤2cm). Complementary ground images (2,207 photos in this study) are captured at 45° intervals with ≥70% overlap and tied to scale bars.
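The acquisition geometry above reduces to a few simple calculations. The Python sketch below encodes the roughly-2×-height altitude guideline, the GCP density of 5 per 1,000m², and the ≥70% overlap requirement (function names, rounding choices, and the example inputs are illustrative):

```python
import math

def flight_altitude(target_height_m: float, factor: float = 2.0) -> float:
    """Flight altitude as a multiple of the target's height (roughly 2x per the guideline above)."""
    return factor * target_height_m

def gcp_count(site_area_m2: float, per_1000_m2: int = 5) -> int:
    """Number of ground control points, at 5 GCPs per 1,000 m^2 of site area."""
    return math.ceil(site_area_m2 / 1000.0) * per_1000_m2

def exposure_spacing(footprint_along_track_m: float, overlap: float = 0.70) -> float:
    """Distance between exposures that preserves the required forward overlap."""
    return footprint_along_track_m * (1.0 - overlap)

print(flight_altitude(8.0))    # 16.0 m for an 8 m wall (the study flew ~20 m)
print(gcp_count(3000.0))       # 15 GCPs for a hypothetical 3,000 m^2 site
print(exposure_spacing(12.0))  # 3.6 m spacing for a hypothetical 12 m image footprint
```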

Data Processing
Metashape processes imagery through the following stages:
- Alignment: Images are grouped by source (UAV or ground). Keypoint matching uses “Generic” preselection (or “Reference” if images are GPS-tagged). Matching accuracy is set according to image count:
  - High: <100 images
  - Medium: 100–1,000 images
  - Low: >1,000 images
- Georeferencing: GCP coordinates scale and orient the sparse cloud; bundle adjustment refines camera parameters.
- Dense Cloud Generation: Depth maps reconstruct surface geometry.
- Mesh Texturing: Models are built with polygon counts adjusted to image volume:
  - High: <400 images
  - Medium: 400–1,000 images
  - Low: >1,000 images
  - Texture count follows $N_{textures} = \frac{L_{wall}}{20}$ (e.g., 13 textures for a 130m wall at 4096×4096px).
- Outputs: Orthomosaics, digital elevation models (DEMs), and textured OBJ files are exported. Geomagic Wrap extracts profiles and isobaths via plane-projection slicing.
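Metashape also exposes a Python scripting interface, so the stages above can be batched. The sketch below is a rough scripted equivalent, assuming the Metashape Professional Python module (call names follow the 1.x API; keyword arguments vary between releases, so most calls are left at their defaults, and the image and output paths are placeholders):

```python
import Metashape  # Agisoft Metashape Professional scripting module (licensed separately)

photos = ["/path/to/images/IMG_0001.JPG"]  # placeholder: UAV and ground image paths

doc = Metashape.Document()
chunk = doc.addChunk()
chunk.addPhotos(photos)

# Alignment: keypoint matching and sparse reconstruction
# ("Generic" preselection here; use reference preselection for GPS-tagged images).
chunk.matchPhotos(generic_preselection=True)
chunk.alignCameras()

# Georeferencing: GCP markers are placed and assigned their RTK coordinates
# (often interactively), after which bundle adjustment is refined.
chunk.optimizeCameras()

# Dense cloud from depth maps, then mesh and texture.
chunk.buildDepthMaps()
chunk.buildDenseCloud()   # named buildPointCloud() in Metashape 2.x
chunk.buildModel()
chunk.buildUV()
chunk.buildTexture()      # texture size/count set per the rule above (e.g., 4096x4096px)

# Export for downstream slicing in Geomagic Wrap.
chunk.exportModel(path="site_model.obj")
doc.save(path="site_project.psx")
```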
Precision Analysis
Surveying UAV accuracy is quantified using root mean square error (RMSE). For $n$ checkpoints with errors $X_1, X_2, …, X_n$:
$$RMSE = \sqrt{\frac{1}{n} \sum_{i=1}^{n} X_i^2}$$
The per-axis errors combine into the planar error ($RMSE_{xy}$), which together with the vertical error ($RMSE_z$) gives the 3D accuracy:
$$RMSE_{xy} = \sqrt{RMSE_x^2 + RMSE_y^2}$$
$$RMSE_{3D} = \sqrt{RMSE_{xy}^2 + RMSE_z^2}$$
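These formulas translate directly to code; a minimal NumPy sketch, assuming per-axis residuals are available at each checkpoint (the array values below are illustrative, not survey results):

```python
import numpy as np

def rmse(residuals: np.ndarray) -> float:
    """Root mean square error of a 1-D array of checkpoint residuals."""
    return float(np.sqrt(np.mean(np.square(residuals))))

def rmse_components(dx: np.ndarray, dy: np.ndarray, dz: np.ndarray):
    """Return (planar, vertical, 3D) RMSE from per-axis checkpoint residuals."""
    rmse_xy = float(np.hypot(rmse(dx), rmse(dy)))  # sqrt(RMSE_x^2 + RMSE_y^2)
    rmse_z = rmse(dz)
    rmse_3d = float(np.hypot(rmse_xy, rmse_z))     # sqrt(RMSE_xy^2 + RMSE_z^2)
    return rmse_xy, rmse_z, rmse_3d

# Illustrative residuals (metres) at five checkpoints.
dx = np.array([0.012, -0.018, 0.025, -0.009, 0.016])
dy = np.array([-0.011, 0.020, -0.014, 0.017, -0.008])
dz = np.array([0.008, -0.012, 0.010, -0.006, 0.009])
print(rmse_components(dx, dy, dz))
```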
Under optimal conditions—flight height ≤30m, GCP density ≥5/1,000m², uniform lighting—surveying drones achieve:

| Configuration | Planar RMSE (cm) | Vertical RMSE (cm) | 3D RMSE (cm) |
|---|---|---|---|
| Without GCPs | 4.2 | 3.4 | 5.4 |
| With GCPs | 2.3 | 1.0 | 2.5 |
Traditional methods exhibit substantially larger errors: up to 1.56m in wall length, 0.97m in height, 0.15m in undercut depth, and 0.52m² in deterioration area.
Efficiency and Application

| Metric | Traditional Methods | Surveying UAV | Improvement |
|---|---|---|---|
| Personnel | 4 | 2 | 50% reduction |
| Data Acquisition Time* | 4.18h | 1.98h | 52.6% faster |
| Processing Time* | 6h | 3h | 50% faster |
*For a 250m earthen wall site.
Surveying UAV outputs support multiple conservation applications:
- Orthomosaics: Generate distortion-free 1:200-scale 2D maps for condition assessment.
- 3D Models: Extract cross-sections revealing undercut depths and stratigraphy.
- Isobaths: Visualize surface erosion patterns through contour analysis (a generic contouring sketch follows this list).
- DEMs: Map site topography, with vegetation removal achieving 95% accuracy via fully convolutional network (FCN) algorithms.
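Isobath generation amounts to contouring the elevation grid; the sketch below uses NumPy and Matplotlib on a synthetic surface to illustrate the idea (it is not the Geomagic Wrap plane-projection workflow used in the study, and the DEM values are placeholders):

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic stand-in for an exported DEM: a 100 x 100 elevation grid in metres.
x, y = np.meshgrid(np.linspace(0, 50, 100), np.linspace(0, 50, 100))
dem = 8.0 - 0.05 * x + 0.5 * np.sin(0.3 * y)  # placeholder surface, not survey data

# Contour the grid at a fixed vertical interval (0.25 m) to obtain isobaths.
levels = np.arange(dem.min(), dem.max(), 0.25)
contours = plt.contour(x, y, dem, levels=levels, cmap="viridis")
plt.clabel(contours, inline=True, fontsize=6)  # label each contour with its elevation
plt.gca().set_aspect("equal")
plt.title("Isobaths from DEM (0.25 m interval)")
plt.savefig("isobaths.png", dpi=200)
```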
Conclusion
Surveying UAV photogrammetry delivers centimeter-level accuracy ($RMSE_{3D}$ = 2.5cm) under standardized conditions, outperforming traditional methods in precision, efficiency, and data richness. It enables comprehensive documentation of inaccessible areas, provides quantifiable deterioration metrics, and supports conservation design through integrated 2D/3D outputs. Current limitations include the inability to resolve millimeter-scale features (e.g., crack widths) and occlusion by vegetation. Future integration with terrestrial LiDAR and multispectral sensors will enhance capability, solidifying surveying drones as indispensable tools in heritage preservation.