
Project Artemis
Visual Navigation for Flying Robots
by Mohammed Kabir

Towards an Open and Robust Visual MAV Platform
A multi-purpose aerial platform suitable for indoor and outdoor flight with full autonomy
(Extended Abstract)
by Mohammed Kabir
St. Xavier's Collegiate School
Kolkata, India
[email protected]
http://projectartemis.github.io/

We present a robust autonomous Micro Aerial Vehicle (MAV) platform equipped with onboard sensing and computing for use in a diverse range of fields. We combine computer vision and GPS for navigation, with a modular sensor fusion system to handle varying sensor availability. The platform is equipped with stereo cameras, which allow it to perform dense 3D mapping and live reconstruction of an area. A high-bandwidth ground link allows it to instantly transmit crucial information about otherwise inaccessible areas. Our proposed system is capable of both autonomous exploratory navigation and operator-controlled flight with sense-and-avoid.

I. Introduction
Small, autonomous aerial vehicles can assist ground-based teams in various fields, from structural inspection to collaborative search-and-rescue. Often, limited infrastructure and adverse conditions mean that external communications to a ground control station, and more importantly satellite-based positioning, cannot be relied on. Full autonomy is therefore essential for such a system, whether for optimal path planning, global localization or target detection. The limited payload capacity of these systems makes it a further challenge to find the best balance of sensors and computing while maintaining a high level of agility for navigation. We target the fields of Urban Search and Rescue (USAR) and 3D mapping for deployment of our system. We chose a quadrotor as our test platform due to its cost-effectiveness and agility. The goal of the system is to be rapidly deployable, capable of operating with or without a human operator at the controls, and able to map out inaccessible areas for first responders to a situation. Our system is fully vehicle-independent and can run on any multi-rotor vehicle equipped with the necessary sensor suite.
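
To make the handling of varying sensor availability concrete, the sketch below shows one way a navigation stack could select its position source at runtime, preferring GPS when a recent fix exists and falling back to visual odometry otherwise. The class, timeouts and method names are hypothetical illustrations of the idea, not our actual estimator, which fuses these sources internally.

import time

class PositionSourceSelector:
    """Picks the freshest healthy position source (illustrative only)."""
    GPS_TIMEOUT = 0.5   # seconds without a fix before GPS is considered stale
    VIO_TIMEOUT = 0.2   # visual odometry arrives at a higher rate

    def __init__(self):
        self.last_gps = None   # (timestamp, position) of the latest GPS fix
        self.last_vio = None   # (timestamp, position) of the latest VIO update

    def update_gps(self, position):
        self.last_gps = (time.time(), position)

    def update_vio(self, position):
        self.last_vio = (time.time(), position)

    def current_source(self):
        # Prefer GPS when a recent fix exists; otherwise fall back to
        # visual odometry; with neither, the vehicle should failsafe.
        now = time.time()
        if self.last_gps and now - self.last_gps[0] < self.GPS_TIMEOUT:
            return "gps", self.last_gps[1]
        if self.last_vio and now - self.last_vio[0] < self.VIO_TIMEOUT:
            return "vio", self.last_vio[1]
        return "none", None

In practice such switching happens inside the state estimator rather than at the application level, but the timeout-based health-check pattern is the same.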

II. System Overview
The core system consists of an onboard companion computer and a flight controller: we use an Intel NUC running the Robot Operating System (ROS) middleware and a Pixhawk autopilot (with an IMU, magnetometer and pressure altimeter) running the PX4 Middleware and Flight Core. The platform also carries a multi-camera cluster, a GNSS receiver, a laser altimeter and a long-range broadband wireless link. Our hybrid software stack runs on top of ROS on the companion computer, and on top of the NuttX real-time operating system and PX4 Middleware on the flight controller. The sensor suite comprises a four-camera cluster: two cameras face forward in a stereo configuration and two face downwards. The stereo pair and one of the downward-looking cameras actively track visual cues, while the other downward-looking camera, a PX4Flow optical-flow sensor, is used to recover body velocities.
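
As a rough illustration of how the companion computer could consume this sensor suite through ROS, the sketch below subscribes to a time-synchronized stereo pair and to PX4Flow data as exposed by the MAVROS px4flow plugin. The stereo topic names are assumptions for illustration; the MAVROS topic and OpticalFlowRad message exist in mavros_extras, though exact names depend on configuration.

#!/usr/bin/env python
# Minimal sketch of how the companion computer can consume the camera
# cluster and optical flow through ROS. Stereo topic names are assumed
# for illustration; actual remappings depend on the camera drivers.
import rospy
import message_filters
from sensor_msgs.msg import Image
from mavros_msgs.msg import OpticalFlowRad

def stereo_callback(left, right):
    # A time-synchronized stereo pair; this is where dense 3D mapping
    # and live reconstruction would consume frames.
    rospy.loginfo("Stereo pair at t=%.3f", left.header.stamp.to_sec())

def flow_callback(flow):
    # Optical flow integrated over the reported interval; dividing by
    # the interval gives angular flow rates, which, scaled by altitude,
    # recover body-frame velocities.
    dt = flow.integration_time_us * 1e-6
    if dt > 0.0:
        rospy.loginfo("Flow rate: x=%.3f y=%.3f rad/s",
                      flow.integrated_x / dt, flow.integrated_y / dt)

if __name__ == "__main__":
    rospy.init_node("artemis_sensor_listener")
    left_sub = message_filters.Subscriber("/stereo/left/image_raw", Image)
    right_sub = message_filters.Subscriber("/stereo/right/image_raw", Image)
    sync = message_filters.TimeSynchronizer([left_sub, right_sub], 5)
    sync.registerCallback(stereo_callback)
    rospy.Subscriber("/mavros/px4flow/raw/optical_flow_rad",
                     OpticalFlowRad, flow_callback)
    rospy.spin()

In the real system, callbacks like these would feed the dense mapping pipeline and the velocity estimator, respectively.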
We designed a custom vehicle from scratch to test our system: the “Artemis” quadrotor platform. It was designed for cost-effectiveness, ease of repair and protection of the expensive sensing modules. We use commercially available carbon-fibre frame plates and aluminium for construction. Critical systems such as the flight controller and cameras are vibration-isolated and protected by carbon plates. In our testing sessions, the frame design has withstood high-velocity crashes with only minor damage.