
III. Localization
The proprioceptive data from the IMU and the exteroceptive position information are fused to obtain the current system state estimate. We use a visual SLAM system based on the open-source Multi-Camera Parallel Tracking and Mapping (MCPTAM) framework. The system is designed to navigate using all sensors available in the environment: both GPS and vision outdoors, and vision alone indoors. Since sensor availability is not guaranteed, we use a modular sensor fusion approach built on a hybrid Kalman filter with fault detection to maintain a robust state estimate. An important point is that the global X, Y and yaw errors are bounded by GPS and magnetometer measurements; without them, these errors would grow without bound. We were motivated to use all available sensor information so that, even if a particular subset or module fails, the overall system performance is not compromised.
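A minimal sketch of how such a modular update with fault detection might be structured is shown below. This is not the on-board implementation: the state vector [x, y, yaw], the noise values, the chi-square gate, and the sensor names are illustrative assumptions only, but the pattern of gating each measurement before fusing it reflects the idea described above.

```python
# Sketch of a modular Kalman filter update with per-sensor fault gating.
# State, noise levels and sensors are assumed for illustration.
import numpy as np

class ModularFusion:
    def __init__(self):
        self.x = np.zeros(3)            # state: [x, y, yaw]
        self.P = np.eye(3)              # state covariance

    def predict(self, dx, Q):
        """Propagate the state with an IMU/odometry increment dx."""
        self.x += dx
        self.P += Q

    def update(self, z, H, R, gate=9.21):
        """Fuse one sensor measurement z; reject it if the normalized
        innovation exceeds a chi-square gate (simple fault detection)."""
        y = z - H @ self.x                       # innovation
        S = H @ self.P @ H.T + R                 # innovation covariance
        if y @ np.linalg.solve(S, y) > gate:     # Mahalanobis test
            return False                         # measurement rejected
        K = self.P @ H.T @ np.linalg.inv(S)      # Kalman gain
        self.x += K @ y
        self.P = (np.eye(3) - K @ H) @ self.P
        return True

fusion = ModularFusion()
fusion.predict(dx=np.array([0.1, 0.0, 0.01]), Q=0.01 * np.eye(3))

# GPS bounds global x, y drift when available; magnetometer bounds yaw.
gps_ok = fusion.update(z=np.array([0.12, 0.01]),
                       H=np.array([[1., 0., 0.], [0., 1., 0.]]),
                       R=1.0 * np.eye(2))
mag_ok = fusion.update(z=np.array([0.02]),
                       H=np.array([[0., 0., 1.]]),
                       R=np.array([[0.05]]))
```

When a sensor such as GPS drops out indoors, its update simply never runs, and the filter continues on the remaining modules, which is the behaviour that keeps the state estimate robust across indoor-outdoor transitions.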
IV. Navigation
Sense-and-avoid is a very important part of our design, since navigation in constrained environments is impossible without it. Depth images are computed at frame rate from the stereo camera system using a GPU-accelerated semi-global block matching approach; a simplified sketch of this step is given below. The Octomap library is used for creating a global map of the MAV's surroundings. Fully autonomous exploration uses a frontier exploration algorithm, while operator-controlled modes provide assisted manual control (teleoperation with obstacle avoidance) using our implementation of the 3DVFH+ local planning algorithm. Precision landing and marker detection are also implemented in the navigation framework using AprilTags, which allows the system to land on small targets under very constrained conditions (e.g., a boat at sea).
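The following sketch illustrates the depth-from-stereo step using OpenCV's CPU StereoSGBM as a stand-in for the GPU-accelerated semi-global block matching used on the vehicle. The camera parameters, file names, and matcher settings are placeholder assumptions, not the system's actual configuration.

```python
# Depth from a rectified stereo pair via semi-global block matching.
# OpenCV's StereoSGBM is used here as a CPU stand-in; focal length,
# baseline and image paths are assumed values for illustration.
import cv2
import numpy as np

fx = 425.0          # focal length in pixels (assumed)
baseline = 0.12     # stereo baseline in metres (assumed)

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

sgbm = cv2.StereoSGBM_create(
    minDisparity=0,
    numDisparities=64,       # must be divisible by 16
    blockSize=5,
    P1=8 * 5 * 5,            # smoothness penalties
    P2=32 * 5 * 5,
    uniquenessRatio=10,
    speckleWindowSize=100,
    speckleRange=2,
)

# compute() returns fixed-point disparities scaled by 16
disparity = sgbm.compute(left, right).astype(np.float32) / 16.0
valid = disparity > 0.0

# Depth image in metres; invalid pixels remain zero. Points from this
# depth image would then be inserted into the Octomap occupancy grid
# that the local and global planners operate on.
depth = np.zeros_like(disparity)
depth[valid] = fx * baseline / disparity[valid]
```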
V. Results
We demonstrate a large trajectory flown with the full sensor suite around a university complex, which clearly demonstrates the need for our multi-sensor fusion approach. The test environment has large trees, a metal shelter, indoor corridors and large outdoor open spaces. The ortho-photo (Figure 3) shows the vehicle trajectory along with the sensor availabilities throughout the flight. The flight was teleoperated by a ground operator with assistive obstacle avoidance active. This shows that multi-sensor fusion is a necessity for such indoor-outdoor flights.