Detection and Ranging

Sensor systems for traverse in reduced-visibility conditions

Effective navigation in post-collapse terrain requires multiple sensor modalities. No single system provides complete coverage across all environmental conditions. The following table summarizes primary sensor types deployed in traverse operations.

| Sensor Type | Operating Principle | Effective Range (m) | Resolution | Conditions Affected | Power Draw (W) |
|---|---|---|---|---|---|
| LIDAR | Laser pulse time-of-flight | 150 | ~5 cm | Dust, rain, snow, fog | 8–12 |
| Millimeter-wave radar | RF pulse reflection | 200 | ~15 cm | Dense vegetation | 3–5 |
| Acoustic ranging | Ultrasonic pulse echo | 15 | ~2 cm | Wind, atmospheric inversion | 0.5–1 |
| Thermal | Infrared emission detection | 100 | Qualitative | Heavy rain, fog | 1–3 |
| Optical | Electromagnetic spectrum collection | Varies (10–300+) | Variable | Light conditions, atmospheric clarity | 0.2–2 |

LIDAR provides the highest resolution point-cloud data but is severely degraded by particulate matter in the atmosphere. Millimeter-wave radar penetrates weather well but scatters in dense vegetation, creating false returns. Acoustic ranging operates effectively only at short distances and in stable air columns. Thermal imaging identifies heat differentials—animal biomass, geothermal features, operational structures—without requiring ambient light. Optical systems remain the primary observation method for human-readable terrain analysis but depend on atmospheric visibility.
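
Where these characteristics feed planning tools, the table can be carried as a small typed registry. The sketch below encodes only the figures above; the class and key names are illustrative, not a fielded API.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SensorSpec:
    name: str
    principle: str
    range_m: float                 # nominal effective range in metres
    power_w: tuple[float, float]   # (min, max) draw in watts

SENSORS = {
    "lidar":    SensorSpec("LIDAR", "laser pulse time-of-flight", 150.0, (8.0, 12.0)),
    "radar":    SensorSpec("Millimeter-wave radar", "RF pulse reflection", 200.0, (3.0, 5.0)),
    "acoustic": SensorSpec("Acoustic ranging", "ultrasonic pulse echo", 15.0, (0.5, 1.0)),
    "thermal":  SensorSpec("Thermal", "infrared emission detection", 100.0, (1.0, 3.0)),
    # Optical range varies (10–300+ m); the upper figure is used here.
    "optical":  SensorSpec("Optical", "EM spectrum collection", 300.0, (0.2, 2.0)),
}

peak_draw = sum(spec.power_w[1] for spec in SENSORS.values())  # 23.0 W worst case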

The following table presents normalized performance metrics (percentage of optimal function) measured under controlled conditions and verified against field observation.

| Condition | LIDAR % | Radar % | Acoustic % | Thermal % | Optical % |
|---|---|---|---|---|---|
| Clear | 100 | 100 | 100 | 100 | 100 |
| Light rain | 70 | 95 | 90 | 95 | 80 |
| Heavy rain | 30 | 85 | 60 | 80 | 40 |
| Fog | 20 | 90 | 50 | 70 | 15 |
| Dust | 15 | 90 | 40 | 90 | 20 |
| Snow | 25 | 85 | 45 | 75 | 30 |
| Dense canopy | 60 | 40 | 80 | 50 | 40 |
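
For automated use, the same figures can be held as a lookup keyed by condition. A minimal sketch; the condition keys and modality ordering are assumptions.

```python
# Normalized performance figures from the table above, as fractions of
# optimal function. Condition keys and ordering are illustrative.
MODALITIES = ("lidar", "radar", "acoustic", "thermal", "optical")

DEGRADATION = {
    "clear":        (1.00, 1.00, 1.00, 1.00, 1.00),
    "light_rain":   (0.70, 0.95, 0.90, 0.95, 0.80),
    "heavy_rain":   (0.30, 0.85, 0.60, 0.80, 0.40),
    "fog":          (0.20, 0.90, 0.50, 0.70, 0.15),
    "dust":         (0.15, 0.90, 0.40, 0.90, 0.20),
    "snow":         (0.25, 0.85, 0.45, 0.75, 0.30),
    "dense_canopy": (0.60, 0.40, 0.80, 0.50, 0.40),
}

def expected_performance(condition: str, modality: str) -> float:
    """Fraction of optimal function for one modality under one condition."""
    return DEGRADATION[condition][MODALITIES.index(modality)]

expected_performance("heavy_rain", "radar")  # 0.85
```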

Multi-Sensor Fusion

No operational traverse relies on a single sensor modality. Effective navigation combines data streams from all available systems. In heavy rain, optical and LIDAR become unreliable; radar and thermal carry the primary load. In dense canopy, LIDAR and optical degrade; acoustic and radar provide short-to-medium range confirmation. Sensor fusion algorithms weight each input based on signal confidence metrics derived from real-time environmental assessment. Redundancy is not optional—it is foundational to safe traverse.
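
The document does not specify the fusion algorithm itself; the sketch below shows one common choice, a confidence-weighted average of independent range estimates, with the weights standing in for the signal-confidence metrics described above.

```python
# Confidence-weighted average of independent range estimates. Confidence
# weighting is one standard choice, not the documented implementation.
def fuse_ranges(estimates: list) -> float:
    """estimates: list of (range_m, confidence 0..1) pairs.

    Confidence stands in for the signal-confidence metric derived from
    real-time environmental assessment; weak inputs are down-weighted,
    not discarded.
    """
    total = sum(c for _, c in estimates)
    if total == 0:
        raise ValueError("no usable sensor input")
    return sum(r * c for r, c in estimates) / total

# Heavy rain: optical and LIDAR confidence collapses; radar and thermal lead.
fuse_ranges([(41.0, 0.30),   # lidar
             (43.5, 0.85),   # radar
             (44.0, 0.80),   # thermal
             (39.0, 0.40)])  # optical  -> ~42.6 m
```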

Sensor Placement

Sensor placement determines effective coverage area and blind spots. Vehicle geometry—hull shape, protrusions, sensor tower height—creates detection shadows at specific angles and ranges.

Forward Position

Primary forward-facing sensors (LIDAR, radar, optical) are typically mounted at the leading edge of the hull, 0.5–1.2 m above ground level. This height balances early detection of ground obstacles with penetration of vegetation. At 1 m height, a forward-facing LIDAR with 150 m range provides ground coverage extending approximately 150 m ahead. Angle of incidence affects reflectivity; surfaces angled away from the sensor produce weaker returns.
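
The near-field consequence of mount height can be made concrete. Assuming the scanner's vertical field of view bottoms out at some depression angle below horizontal (a property the document does not give), the closest visible ground point is d = h / tan(theta). The numbers below are illustrative.

```python
import math

# Closest visible ground point for a forward sensor at height h whose beam
# pattern reaches a maximum depression angle theta below horizontal:
#     d_min = h / tan(theta)
# The depression angle is an assumption; the document gives height only.
def nearest_ground_return(height_m: float, depression_deg: float) -> float:
    return height_m / math.tan(math.radians(depression_deg))

nearest_ground_return(1.0, 15.0)   # ~3.7 m: ground closer than this is unseen
nearest_ground_return(0.5, 15.0)   # ~1.9 m: lower mounts shrink the dead zone
```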

Lateral Position

Side-mounted sensors (radar, thermal, optical) provide obstacle detection perpendicular to direction of travel. Mounting at 0.6–0.8 m lateral height captures terrain features and stationary hazards. Radar side-mounting is particularly effective for detecting structures and terrain edges obscured by vegetation. Blind spots form directly beside the hull; gaps between forward and lateral sensors require secondary optical confirmation.

Rear Position

Rear sensors monitor terrain degradation and approaching obstacles during retreat or repositioning. Thermal rear-mounting is effective for detecting biomass or heat signatures at distance. Optical rear cameras provide situational awareness. Acoustic rear transducers detect obstacles during backing maneuvers.

Vertical Coverage and Blind Spots

Sensor tower height directly affects overhead coverage. At 1.5 m, sensors clear low vegetation but may miss ground-level hazards within 2–3 m. At 0.8 m, ground-level coverage improves but tall vegetation occludes distant returns. All mount configurations produce dead zones: directly beneath the sensor plane, behind the vehicle hull, and at extreme angles below the mounting plane. Multi-sensor geometry ensures that no single obstacle remains undetected by all systems simultaneously.
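
A rough way to audit that claim for a specific vehicle is to reduce each mount to height, depression angle, and range, then test candidate obstacle distances against the union of coverage envelopes. All geometry below is an illustrative assumption.

```python
import math

# Reduce each mount to (height_m, depression_deg, max_range_m) and test a
# candidate obstacle distance against the union of coverage envelopes.
# All geometry below is an illustrative assumption, not fielded hardware.
MOUNTS = {
    "tower_lidar":   (1.5, 30.0, 150.0),  # clears vegetation; ~2.6 m dead zone
    "hull_acoustic": (0.4, 60.0, 15.0),   # short range; covers the near field
}

def covers(mount, distance_m):
    height_m, depression_deg, max_range_m = mount
    d_min = height_m / math.tan(math.radians(depression_deg))
    return d_min <= distance_m <= max_range_m

def blind_at(distance_m):
    """True only if every sensor misses an obstacle at this distance."""
    return not any(covers(m, distance_m) for m in MOUNTS.values())

blind_at(1.0)   # False: the hull acoustic covers what the tower LIDAR misses
```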

Calibration and Drift

Sensor accuracy degrades over time and with environmental stress. Regular calibration maintains data reliability.

Calibration Frequency

  • LIDAR: Every 50 operational hours or after temperature cycling exceeding 30°C
  • Radar: Every 100 operational hours or after mechanical shock/vibration event
  • Acoustic: Every 30 operational hours or after dust storm transit
  • Thermal: Every 75 operational hours or after temperature differential exceeding 40°C
  • Optical: Every 40 operational hours or after dust/moisture contamination
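
These intervals and event triggers reduce to a simple due-check. In the sketch below, triggering events are modeled as a single flag, and every name is illustrative.

```python
# Calibration-due check built from the intervals above. Triggering events
# (temperature cycling, shock, dust transit, contamination) are reduced to
# a single flag; all names here are illustrative.
CAL_INTERVAL_HOURS = {
    "lidar": 50, "radar": 100, "acoustic": 30, "thermal": 75, "optical": 40,
}

def calibration_due(modality: str, hours_since_cal: float,
                    trigger_event: bool = False) -> bool:
    """True once the interval elapses or a triggering event has occurred."""
    return trigger_event or hours_since_cal >= CAL_INTERVAL_HOURS[modality]

calibration_due("acoustic", 12.0)                      # False
calibration_due("acoustic", 12.0, trigger_event=True)  # True: dust storm transit
```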

Environmental Drift Factors

Temperature cycling causes optical element expansion and contraction, shifting focal planes and introducing range error. Vibration accumulated during rough-terrain traversal loosens mounting hardware and misaligns emitter/receiver pairs. Dust accumulation on optical windows and LIDAR emitter ports reduces signal strength and return clarity. Moisture condensation on cold sensors creates false reflections. Thermal sensors drift as their internal calibration references shift with temperature. Field operations in post-collapse environments expose sensors to all of these factors simultaneously.

Field Calibration Procedures

Calibration requires known reference points: measured distances to fixed landmarks or deployed reflective targets. Standard practice involves stopping at a static position, identifying terrain features at known distances (measured by surveying or GPS), and recording sensor output against ground truth. Discrepancies exceeding 5% for LIDAR, 10% for radar, or 2% for acoustic warrant recalibration or component replacement. Thermal calibration uses ambient temperature reference and observed heat signatures from known-temperature structures. Optical calibration involves target contrast assessment and focus verification.
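
The stated tolerances reduce to a one-line test against ground truth; a minimal sketch, assuming per-modality thresholds as given above.

```python
# Flag recalibration when sensor output diverges from a surveyed ground-truth
# distance by more than the stated tolerance (5% LIDAR, 10% radar, 2% acoustic).
TOLERANCE = {"lidar": 0.05, "radar": 0.10, "acoustic": 0.02}

def needs_recalibration(modality: str, measured_m: float, truth_m: float) -> bool:
    return abs(measured_m - truth_m) / truth_m > TOLERANCE[modality]

needs_recalibration("lidar", 96.0, 100.0)   # False: 4% error, within tolerance
needs_recalibration("acoustic", 9.7, 10.0)  # True: 3% error, recalibrate
```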

Interference Zones

Certain documented segments of traverse routes exhibit sensor degradation exceeding what environmental conditions alone explain. These areas are classified as interference zones.

Observed Effects

In interference zones, LIDAR scatter increases beyond atmospheric particulate levels. Radar returns become unreliable or contradictory—multiple false echoes, range uncertainty, velocity estimation errors. Acoustic systems exhibit anomalous attenuation or unexpected phase shifts. Thermal imaging shows coherent noise patterns unrelated to ambient heat sources. Optical systems may report atmospheric distortion effects. Multi-sensor fusion algorithms detect these zones automatically by comparing expected performance against actual sensor agreement metrics.
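
One way to realize that automatic detection, sketched under assumptions: compare each modality's observed fraction of optimal function against the expectation for the current conditions (the expected_performance lookup above), and flag the segment when the shortfall is broad. The margin and vote count below are invented thresholds.

```python
# Compare observed performance against the expectation for current conditions
# (expected_performance lookup defined earlier) and flag the segment when the
# shortfall is broad. Margin and vote count are invented thresholds.
def interference_suspected(condition: str, actual: dict, margin: float = 0.25) -> bool:
    """actual: modality -> observed fraction of optimal function."""
    shortfalls = [expected_performance(condition, m) - actual[m] for m in actual]
    # Weather is already priced into the expected figures, so a shortfall
    # across several modalities points at something conditions don't explain.
    return sum(1 for s in shortfalls if s > margin) >= 2

interference_suspected("clear", {
    "lidar": 0.55, "radar": 0.60, "acoustic": 0.95, "thermal": 0.70,
})  # True: three modalities fall well below clear-weather expectations
```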

Documentation and Route Planning

Interference zones are documented in route reference materials. Entries for specific segments identify the affected sensor modalities, typical degradation patterns, and documented operational adjustments (reduced speed, increased lateral spacing, heightened manual verification requirements). The cause of interference in these zones is not specified in this document. Operational logs from repeated traverses of affected segments have established consistent, repeatable patterns of sensor behavior, allowing predictive modeling for subsequent passages.
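
Because the logged patterns are consistent and repeatable, even a naive per-segment baseline has predictive value. The sketch below simply averages prior passages; the log schema is an assumption.

```python
from statistics import mean

# Naive per-segment baseline: expect on the next passage what prior passages
# averaged. The log schema (one dict per passage) is an assumption.
def predicted_performance(segment_logs: list, modality: str) -> float:
    return mean(log[modality] for log in segment_logs)

logs = [{"radar": 0.52}, {"radar": 0.48}, {"radar": 0.50}]
predicted_performance(logs, "radar")   # 0.50: plan around degraded radar here
```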

For detailed cross-reference to sensor behavior in specific routes, consult the vehicle-navigation-in-interference-zones reference document.

Sensor performance data compiled from operational logs spanning multiple traverse seasons. Degradation percentages measured under controlled conditions and verified against field observation. Interference zone data derived from repeated traverses of affected segments. No speculative interpretation; this document records observed effects only.