Project MARV

Accelerating USV Autonomy for Combat Search and Rescue

Developed in just three months in partnership with the Air Force’s Blue Horizons program, MARV is TILT’s real-world demonstrator for autonomous combat search and rescue.

Introduction

In late 2024, the Air Force’s Blue Horizons program posed a challenge: could an uncrewed vessel autonomously locate and recover a downed airman, without a human on board? TILT Autonomy took on that challenge—engineering, integrating, and deploying a complete, sensor-driven rescue platform that fused autonomy, communications, and operator control into one system.

Problem Set

Building a functional uncrewed surface vessel (USV) for CSAR missions presents deep integration challenges. These platforms must bring together a complex ecosystem of components, including cameras, radar systems, autopilots, satellite communications, and high-power compute modules, while ensuring reliable power distribution and communication in real-world maritime conditions.

Achieving seamless interoperability between these systems requires not only precise hardware engineering but also the ability to consolidate sensor and control data into a single ground control system that supports both operator awareness and autonomous decision-making in dynamic environments.

  • Reliable power distribution to diverse sensor and compute subsystems
  • Physical and data integration of radar, FLIR, EO/IR, GPS, and communication hardware
  • Interoperability between autopilot logic and real-time sensor data streams
  • High-bandwidth, low-latency connectivity over satellite and RF links
  • Coordinated control across operator interface, autonomy stack, and vehicle hardware
  • Designing a unified GCS capable of supporting both autonomous control and human oversight

From Simulation to Open Water

TILT’s development process began not on the water, but inside our autonomy development platform: ENDGAME. Using a digital version of the mission environment built in Unreal Engine, our engineers simulated full autonomy cycles in ArduPilot weeks before the vessel was physically complete. While the boat was still under construction, the software stack was already executing mission scenarios virtually: navigating to targets, identifying objects, and practicing docking maneuvers so it could transition smoothly into field operations.

By the time MARV touched the water, the software had already been stress-tested in a high-fidelity environment. The transition from simulation to physical deployment required only minimal tuning, enabling rapid progress toward field validation.

Autonomous boat approaches rescue raft in simulation with onboard camera tracking a raft and heading overlays.
ENDGAME synthetic environment running inside the TILT GCS
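The simulated mission cycle described above, navigating to a target, identifying it, then docking, can be sketched as a simple state machine. This is an illustrative model, not MARV's actual autonomy code; the `Phase` names and transition logic are assumptions for the sketch.

```python
from enum import Enum, auto

class Phase(Enum):
    """Hypothetical mission phases for a CSAR autonomy cycle."""
    TRANSIT = auto()
    SEARCH = auto()
    IDENTIFY = auto()
    DOCK = auto()
    COMPLETE = auto()

def next_phase(phase, target_found=False, target_confirmed=False, docked=False):
    """Advance the mission one step: transit to the search area, search
    until a candidate target appears, confirm it visually, then dock.
    Failed confirmation drops back to searching."""
    if phase is Phase.TRANSIT:
        return Phase.SEARCH
    if phase is Phase.SEARCH:
        return Phase.IDENTIFY if target_found else Phase.SEARCH
    if phase is Phase.IDENTIFY:
        return Phase.DOCK if target_confirmed else Phase.SEARCH
    if phase is Phase.DOCK:
        return Phase.COMPLETE if docked else Phase.DOCK
    return Phase.COMPLETE
```

Running this loop against simulated sensor inputs in ENDGAME is what lets the same transition logic carry over unchanged to the physical vessel.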

Designing the Full Autonomy Stack

TILT engineered the complete autonomy stack, including hardware, software, and system integration, to deploy MARV as a fully unified system. This included designing the autonomy and control architecture and configuring the boat to handle complex sensor payloads, networking infrastructure, and power distribution. The modularity of TILT’s autonomy hardware enabled seamless integration across the platform and provided the flexibility to adapt to different vehicle configurations.

Diagram showing MARV system components: sensors, navigation modules, and controllers integrated on a fishing boat.

ENDGAME: Ground Control and Command Hub

TILT's integrated Ground Control Station served as the central interface for mission execution and oversight. Through the TILT GCS, a single operator could coordinate with external systems and:

  • View radar returns and task the FLIR camera for inspection
  • Use EO/IR sensors to estimate position of objects and navigate
  • Run machine learning models to identify rafts, boats, or personnel
  • Add and execute search and rescue patterns
  • Receive last known position markers from ATAK and route the vessel to those locations
  • Monitor and send commands to the vessel over Starlink
  • Trigger custom behaviors like reverse-docking to a rescue raft

NAVCOM: Autonomy and Control Core

The TILT NAVCOM module houses MARV’s autopilot, sensor power distribution, and sensor interfaces. It performs real-time sensor fusion and autopilot functions while communicating across:

  • LAN: Radar, FLIR, and onboard cameras
  • WAN: Starlink, LTE, and radios
  • CANBUS: Control signals to the Boat Control Module (BCM)

BCM: Executing Commands in the Vessel

The TILT Boat Control Module translates NAVCOM’s high-level autonomy decisions into actuation: throttle, steering, and feedback sensors. It also handles telemetry like fuel state and manual override detection.
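A NAVCOM-to-BCM drive command over CAN might look like the sketch below. The frame layout, arbitration ID, and scaling are hypothetical (MARV's actual bus protocol is not public); the sketch only illustrates the general pattern of packing high-level throttle and steering values into a fixed 8-byte CAN payload.

```python
import struct

DRIVE_CMD_ID = 0x101  # hypothetical CAN arbitration ID for drive commands

def encode_drive_cmd(throttle_pct: float, steer_deg: float) -> bytes:
    """Pack a drive command into an 8-byte CAN payload.

    Illustrative big-endian layout:
      int16 throttle, tenths of a percent (-1000..1000)
      int16 steering, tenths of a degree  (-450..450)
      4 reserved pad bytes
    Values are clamped to range before packing.
    """
    t = max(-1000, min(1000, round(throttle_pct * 10)))
    s = max(-450, min(450, round(steer_deg * 10)))
    return struct.pack(">hh4x", t, s)

def decode_drive_cmd(payload: bytes) -> tuple[float, float]:
    """Unpack (throttle_pct, steer_deg) from the payload above."""
    t, s = struct.unpack(">hh4x", payload)
    return t / 10.0, s / 10.0
```

Fixed-point int16 fields keep the frame within CAN's 8-byte limit while preserving tenth-of-a-unit resolution, which is a common pattern in marine and automotive buses.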

Demonstration and Results

Over a series of live-water tests in the Chesapeake Bay, MARV transitioned from simulation to fully autonomous operation. Each trial was designed to replicate critical elements of a combat search and rescue mission, validating autonomy, sensor fusion, and remote command and control.

The mission began with target information sent via ATAK. Operators pushed last known position data directly to the vehicle, which allowed MARV to immediately plan and execute autonomous navigation to the designated coordinates.

TILT MARV vessel in open water autonomously approaches a floating target during testing, equipped with radar and comms.

ATAK Information

Higher-level command or other connected systems could send last known positions and mission data to ATAK, which were then relayed to the vehicle. This closed the loop between external intelligence sources and real-time autonomous mission execution, enabling MARV to operate as part of a broader, networked command structure.
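ATAK exchanges position data as Cursor-on-Target (CoT) XML events, which carry latitude and longitude in a `<point>` element. A minimal parser for pulling a last known position out of such an event might look like this; the sample event's coordinates and `uid` are illustrative, and a real integration would also validate the event type, timestamps, and error estimates.

```python
import xml.etree.ElementTree as ET

def parse_cot_position(cot_xml: str) -> tuple[float, float]:
    """Extract (lat, lon) from a Cursor-on-Target event.

    CoT events place position in a <point> element with lat/lon
    attributes; this sketch ignores the hae/ce/le accuracy fields.
    """
    root = ET.fromstring(cot_xml)
    point = root.find("point")
    return float(point.get("lat")), float(point.get("lon"))

# Illustrative CoT event (coordinates are arbitrary Chesapeake Bay values)
sample = (
    '<event version="2.0" uid="survivor-1" type="a-f-G">'
    '<point lat="38.9784" lon="-76.4922" hae="0" ce="50" le="25"/>'
    '</event>'
)
```

The extracted coordinates can then be handed directly to the mission planner as the datum for navigation or a search pattern.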

Search Pattern Execution

Operators could customize a variety of search and rescue patterns to cover the area, then send them to the vehicle, which drove the resulting waypoints autonomously.

Track and ID Targets

As radar tracks appeared, operators in the TILT GCS could click on those returns to automatically slew the FLIR camera to the target, providing visual confirmation and triggering onboard machine learning to classify the object.
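The core of that click-to-slew behavior is converting a radar contact's position into a camera pan angle relative to the vessel's heading. The sketch below shows one way to compute it, assuming a flat-earth bearing (adequate at radar ranges) and a bow-referenced gimbal; the function name and sign conventions are illustrative.

```python
import math

def flir_pan_angle(own_lat, own_lon, own_heading_deg, tgt_lat, tgt_lon):
    """Pan angle in degrees (-180..180) to point a bow-referenced
    camera at a contact: positive pans starboard, negative port."""
    dlat = tgt_lat - own_lat
    dlon = (tgt_lon - own_lon) * math.cos(math.radians(own_lat))
    bearing = math.degrees(math.atan2(dlon, dlat)) % 360.0  # true bearing
    # Wrap relative bearing into (-180, 180] for the shortest slew
    return (bearing - own_heading_deg + 180.0) % 360.0 - 180.0
```

Once the camera is on target, the onboard classifier can run on the FLIR frames to label the contact as a raft, boat, or person.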

Dock with Target

After identifying a standard-issue Air Force life raft, MARV transitioned to docking mode. Using onboard vision systems, the vessel approached the target in reverse and performed a controlled docking maneuver.
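A stern-first approach like this is often driven by a simple proportional controller on range and bearing error. The sketch below is a toy version of that idea: the gains, limits, and function signature are assumptions for illustration, not MARV's tuned control law.

```python
def reverse_dock_cmd(range_m, bearing_err_deg, stop_range=1.0):
    """Proportional reverse-docking command: (throttle_pct, steer_deg).

    Reverse throttle scales with range to the raft (capped for a gentle
    approach) and steering nulls the bearing error between the stern
    camera's view of the raft and the vessel axis. Inside stop_range,
    the vessel holds position.
    """
    if range_m <= stop_range:
        return 0.0, 0.0
    throttle = -min(20.0, 2.0 * range_m)            # reverse, capped at 20%
    steer = max(-30.0, min(30.0, 0.8 * bearing_err_deg))
    return throttle, steer
```

In practice the bearing error would come from the vision system tracking the raft in the stern camera frame, closing the loop at each control tick.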

Outcomes and Impact

In less than three months, MARV evolved from concept to operational prototype. It didn’t just prove that autonomy was possible; it showed that, with the right architecture, autonomy could be fielded quickly and with mission-relevant performance.

MARV confirmed the feasibility of delivering high-performance rescue capabilities on a commercial platform, with remote command over Starlink and real-time mission adaptation through TILT’s autonomy architecture. The results validated that a USV could search for, identify, and approach rescue targets without a human onboard, making it a viable tool for contested environments where risk to personnel must be minimized.

TILT would like to thank Beacon Light Marina for graciously hosting our team and demonstrations during this project. We appreciate their support and hospitality. For more information about their marina and Yamaha inventory, visit beaconlightmarina.com.

Have Questions About MARV or Autonomous Systems?

If you’d like to learn more about how MARV works, what went into its development, or how our approach to autonomy might apply to your mission, reach out to connect.

© 2025 TILT Autonomy, Inc. All rights reserved.
