Charles River Analytics Develops Monocular Unmanned Leader-Follower
- MULE-F autonomously tracks and follows a designated pedestrian operator
Charles River Analytics, a developer of intelligent systems solutions, has developed MULE-F, the Monocular Unmanned Leader-Follower: a compact system that enables an unmanned ground vehicle (UGV) to autonomously track and follow a designated operator on foot.
Unmanned ground vehicles (UGVs) offer a range of benefits in both military and civilian applications, and are particularly suited to eliminating or reducing dull, dirty, or dangerous (DDD) jobs that are currently done by humans. For example, mobile robots have successfully been applied to tasks including explosive ordnance disposal (EOD), scouting, and emergency response. However, these robots typically require continuous teleoperation by at least one human, with detailed operator oversight even for mundane tasks such as traveling from one location to another. As a result, UGVs may offer the promise of removing the human from the DDD environment, but at the expense of considerably longer task completion times (slower), higher operator workload (harder), and larger crews (more expensive).
The Charles River Analytics Solution
To address these issues, Charles River developed the Monocular Unmanned Leader-Follower (MULE-F), a compact system that autonomously tracks and follows a designated operator, or leader. MULE-F enables a UGV to follow an operator on foot autonomously, with no teleoperation required, using only a lightweight monocular video camera and no modifications to the leader's equipment, such as special patterns or infrared beacons. The single-camera requirement is intended to maximize ease of deployment on existing UGVs, and the system's core software is designed with reduced computational complexity to minimize the payload's size, weight, power, and cost (SWaP-C) impact.
MULE-F is composed of three core modules: a pedestrian detection module that determines the locations of all visible pedestrians in the UGV camera's field of view; an appearance-learning and tracking module that maintains a lock on the leader and differentiates the leader from nearby troops or team members; and a gesture recognition module that enables the leader to control the vehicle naturally. The system integrates state-of-the-art detection methods with kinematic tracking and online appearance learning techniques to reliably track a human leader walking in complex outdoor environments, operating at up to 15 Hz on 640×480 video on a small 2.5 GHz computing platform.
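To make the detection-plus-appearance-learning idea concrete, here is a minimal, hypothetical sketch of such a tracking loop. It is not Charles River's implementation; all names, weights, and the histogram-based appearance model are illustrative assumptions. Per-frame pedestrian detections are matched to the leader by combining a kinematic prediction with an online appearance model, and the winning track drives a simple proportional steering command.

```python
import math
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical sketch of a monocular leader-follower loop: detections are
# scored by appearance similarity minus a kinematic distance penalty, and
# the appearance model is updated online. Weights are illustrative only.

@dataclass
class Detection:
    cx: float                # bounding-box center x (pixels)
    cy: float                # bounding-box center y (pixels)
    appearance: List[float]  # e.g. a normalized intensity histogram

class LeaderTracker:
    def __init__(self, init: Detection, alpha: float = 0.1):
        self.pos = (init.cx, init.cy)
        self.vel = (0.0, 0.0)
        self.model = list(init.appearance)
        self.alpha = alpha   # learning rate of the online appearance model

    def _predict(self):
        # constant-velocity kinematic prediction of the leader's position
        return (self.pos[0] + self.vel[0], self.pos[1] + self.vel[1])

    def _similarity(self, hist: List[float]) -> float:
        # Bhattacharyya coefficient between histograms (1.0 = identical)
        return sum(math.sqrt(a * b) for a, b in zip(self.model, hist))

    def update(self, detections: List[Detection]) -> Optional[Detection]:
        if not detections:
            return None   # leader lost this frame; coast on prediction
        px, py = self._predict()

        def score(d: Detection) -> float:
            # appearance similarity, penalized by distance from prediction
            return self._similarity(d.appearance) - 0.01 * math.hypot(
                d.cx - px, d.cy - py)

        best = max(detections, key=score)
        self.vel = (best.cx - self.pos[0], best.cy - self.pos[1])
        self.pos = (best.cx, best.cy)
        # online appearance learning: exponential update of the model
        self.model = [(1 - self.alpha) * m + self.alpha * h
                      for m, h in zip(self.model, best.appearance)]
        return best

def steering_command(leader: Detection, frame_width: int = 640) -> float:
    # proportional steering: negative = turn left, positive = turn right
    return (leader.cx - frame_width / 2) / (frame_width / 2)
```

The appearance score is what lets the tracker keep its lock on the leader rather than drifting to a nearby team member: a distractor close in image space but dissimilar in appearance scores lower than the true leader.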
Interest in introducing autonomous and semi-autonomous capabilities to mobile robots continues to grow. The development and deployment of autonomous capabilities will free operators from performing mundane teleoperation tasks, and permit them to focus their attention on tasks requiring a high level of human engagement, leaving the truly DDD tasks to the mobile robots.
Source: Charles River Analytics