Development of an Autonomous Aerial Reconnaissance System

Keith Brock, Jessica Dooley, Frank Manning
University of Arizona, Tucson, AZ

ABSTRACT

In preparation for the 2004 International Aerial Robotics Competition (IARC), the Aerial Robotics Club (ARC) at the University of Arizona has designed an autonomous aerial system. This year's aerial system consists of two small-scale airplanes with onboard autopilots and a ground station of networked computers for mission planning, user interface, and vision-related computations. This document summarizes the IARC mission requirements, team objectives, and overall system design, including guidance, navigation and control, mission planning software, computer vision techniques, and risk reduction strategies.

INTRODUCTION

The IARC is an annual, international event organized by the Association for Unmanned Vehicle Systems to promote research and development of aerial robotic systems. The competition consists of three sub-missions and an ultimate mission that combines the requirements of the previous three. Mission one of the IARC involves autonomous flight over a 3 km course designated by Global Positioning System (GPS) waypoints. The University of Arizona completed this mission during the 2003 competition. This year, the team has designed an aerial system to attempt sub-missions two and three at the IARC. The second of the required sub-missions is to autonomously locate and identify the desired target structure from a group of structures marked by the IARC symbol (see Figure 1) and illuminated by two lights [1].

Figure 1. IARC symbol used to mark the target building.

Once the target structure has been located, the aerial vehicle must further identify at least one open portal on the structure. The portals may be windows, doors, or other openings, and must measure no less than one meter by one meter. To earn maximum points, the aerial robot must identify all open portals on the structure with an accuracy of 0.25 meters. The third mission requires the main vehicle or a sub-vehicle to enter the target structure through an opening. Once inside, the vehicle must relay images to the ground station for analysis and review by the competition judges. The video or photos from the interior of the structure must enable the judges to clearly identify one of three scenarios: a hostage situation, a tapestry on the wall, or a control panel from a nuclear reactor. The fourth and final mission requires that missions one through three be performed sequentially, with no human intervention [1].

The U of A team has developed a network of aerial vehicles to attempt mission two and prepare for missions three and four of the IARC. Coordination and control of this network is initiated by the user and carried out by autopilot units located onboard each aerial vehicle and a series of ground station computers. The ground station provides support for all computer vision processing and handles the overall mission control tasks. Figure 2 presents an overview of the system hardware. The overall system

architecture from a software perspective is described in Figure 3 and will be discussed in more detail in the following sections of this document.

Figure 2. System Hardware Architecture

Figure 3. System Software Architecture

During the past year, the U of A team has made significant progress in the areas of computer vision and the high-level programs that coordinate the overall mission status of the aerial vehicles. The Mission Control software is the backbone of the entire autonomous system and provides the intelligence needed to perform the challenging IARC missions.

AERIAL VEHICLE

The airframe for our aerial robots is based on the Hangar 9 "Xtra Easy II, Super, Almost Ready to Fly (ARF)" kits. The "Super ARF" designation indicates that the kit has the engine and radio gear preinstalled. This feature is advantageous to a team with little aircraft experience or time before the IARC

because it allows valuable time to be spent on other aspects of the overall design problem, such as navigation and computer vision. Using a kit plane also requires less time to be spent on platform maintenance and repairs. In the same respect, we have learned that valuable time can be wasted preparing for this competition by trying to "reinvent the wheel" unnecessarily. Our design utilizes two of these ARF planes to collect data on certain aspects of the environment for the computer vision tasks. Using two vehicles also reduces the payload for each vehicle by distributing the sensors between the two planes.

Figure 3. SolidWorks drawing of the aerial system, including the main vehicle, PAD, and rover.

Figure 4. The arsenal of aerial robots just before flight testing.

Propulsion and Lift System

The ARF aircraft weighs about 3 kilograms (6.6 lb) out of the box, has a wing span of 1.75 meters (5.75 ft) and a wing area of approximately 0.51 square meters (5.5 square feet). See Figure 3 for a computer-aided drawing of our aerial vehicle and sub-vehicles. Propulsion was originally provided by a 6.7 cm³ (0.41 cubic inch), two-stroke glow engine. With some modifications to the airframe and the addition of an optical system, autopilot system, and safety termination device, the airframe weighs a total of 6 kg (13 lb).

Airframe Modifications

While ARF airframes are a great choice for preliminary development, they require design modifications in order to serve as an efficient platform for an autonomous aerial robot. For example, the ARF kits are built with the radio gear (servos and radio receiver) preinstalled in the fuselage. This arrangement leaves no space for an onboard autopilot or an optical system for collecting computer vision data. Furthermore, the engine supplied with the kit does not provide suitable power or endurance given the payload additions required to support intelligent, autonomous flight. For these reasons, we have modified the Xtra Easy airframes in a number of ways. First, we replaced the 6.7 cm³ (0.41 in³) two-stroke glow engine with a Saito 14.9 cm³ (0.91 in³) four-stroke motor. This motor provides considerably more power and is actually lighter.

Furthermore, the four-stroke motor has lower fuel consumption than the 6.7 cm³ two-stroke. Another modification to the airframe was the addition of two 473 mL (16 fl oz) saddle tanks to the exterior of the fuselage. This gave our UAV a total of 1300 mL (44 fl oz) of fuel, allowing a total flight time of 1 hour and 45 minutes. The increased endurance gives our team sufficient time to collect data for the identification of the IARC symbol and the open portals. Additional modifications included the relocation of servos to the exterior of the airframe to provide room for our autopilot and optical sensors. Instead of gluing the vertical and horizontal tail surfaces to the plane, we used bolts so that the tails could be removed for ease of transportation to the testing site and competition.

Results from attempting mission 2 during the 2003 events also instigated changes in our airframe design. Last year, our computer vision camera was mounted under a plane with a tail-dragger configuration so that a nose wheel would not interfere with the camera view. We found that the overall configuration, with a heavy, gimbaled, exterior-mounted camera and tail-dragger arrangement, was inappropriate and made taking off very difficult. To avoid difficulties associated with takeoffs, we designed this year's plane for "hand-launches" and "belly-in" landings. We removed the landing gear, thus saving weight, decreasing drag, and most importantly, providing a clear view for our optical system, which deploys from the fuselage during flight and retracts prior to landing. Sheets of fiberglass were added to the bottom of the fuselage for protection during landings.

Low-Level Guidance, Navigation and Control

We utilize a Cloud Cap Technology Piccolo autopilot to provide low-level guidance, navigation and control of our aerial vehicles. The Piccolo is a commercially available autopilot system that supports pre-programmed flight for small-scale, fixed-wing Unmanned Aerial Vehicles (UAVs).
Autonomous flight of multiple UAVs is supported with a single ground station hardware unit and user interface. Classical control methods utilizing Proportional, Integral, and Derivative (PID) gains are employed to meet altitude, airspeed, waypoint, and other commands sent by a user or by our Mission Control software. These commands are transmitted from the ground station to the processor of the onboard avionics unit via a 910 MHz radio link. Onboard sensors gather information about the environment in order to stabilize and control the airframe and to support navigation of a GPS waypoint course. The Piccolo does not provide knowledge of, or support for, target identification, threat avoidance, or overall mission planning [2].

Mission Control Software

The Mission Control software performs high-level control of the system and coordinates the two UAVs plus sub-vehicles. The Mission Executive module handles overall mission planning and controls the Ground Support Equipment, UAV Executive, PAD Executive, and Rover Executive modules. The UAV Executive handles high-level control of the two UAV autopilot ground stations through the UAV Navigation module. The UAV Executive also coordinates image processing in the machine vision system, which performs symbol and portal detection.

Symbol Recognition

Recognition of the IARC symbol is handled through template matching. Image processing is done with MVTec Halcon, a commercial library of image processing software. The IARC symbol is used as a template to train the vision system. The system derives a model from the template, and various features are extracted along contours of the template. After training, the extracted features are used to search through camera images for matches. Training parameters are determined with Halcon's CreateScaledShapeModel, and the actual symbol is found using FindScaledShapeModel. This method is capable of finding shapes at arbitrary angles, as well as shapes that are partially occluded.
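Halcon's shape-based matching is proprietary, but the core template-matching idea it builds on can be sketched with plain normalized cross-correlation. The following Python/NumPy toy is an illustration of that idea only, not the Halcon API or the team's actual code; unlike the scaled shape models used here, it handles neither rotation, scale, nor occlusion.

```python
import numpy as np

def match_template_ncc(image, template):
    """Slide `template` over `image` and return the (row, col) of the best
    normalized cross-correlation score, plus the score itself. A toy stand-in
    for shape-based matching; production matchers (e.g. Halcon's scaled shape
    models) also handle rotation, scale, and partial occlusion."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -1.0, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            w = image[r:r + th, c:c + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz ** 2).sum()) * t_norm
            if denom == 0:
                continue  # flat window, correlation undefined
            score = (wz * t).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score

# Toy example: plant a 3x3 "symbol" in a noise frame and find it again.
rng = np.random.default_rng(0)
frame = rng.random((20, 20))
symbol = np.array([[1., 0., 1.], [0., 1., 0.], [1., 0., 1.]])
frame[5:8, 9:12] = symbol
pos, score = match_template_ncc(frame, symbol)
print(pos, round(score, 3))  # best match at (5, 9) with score ~1.0
```

An exact copy of the template scores 1.0 because NCC is invariant to brightness offset and gain, which is also why it tolerates the illumination changes an outdoor camera sees.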
Once a match is found, the coordinates of the symbol are derived as a function of the known position and attitude of the UAV, as well as the attitude of the camera relative to the UAV.

Open Portal Detection

General Approach

Once the target building has been found, the Mission Control software takes the location of the IARC target symbol and sends it to the navigation and rerouting routine. This routine reroutes the Symbol Seeker UAV to the 3 km ingress area, where it deploys its parachute for an autonomous landing. The Portal

Hunter UAV is rerouted to look for open portals on the target structure. This new flight path makes several passes over the structure at approximately right angles to the structure walls. The vision system searches building walls for closed shapes that are either dark or change internally as a function of view angle, indicating an internal 3D cavity. The search is not limited to dark shapes: since the view angle is 30° to 60° below horizontal, sunlit objects such as floors are potentially visible through a portal. Note in particular that portals may be covered by partially reflective transparent materials (such as glass or plastic) that can reflect light from the external environment. Reflected images may mimic a 3D internal cavity and spoof the vision system. This problem is handled by detecting light polarization.

Window Detection

Light reflecting off a transparent material can be polarized, depending on the incidence angle. Maximum polarization occurs at Brewster's angle, which for light passing between two materials is a function of the indices of refraction of each material. Brewster's angles for glass and plastic are approximately 56° and 53°, respectively [5,6]. Thus the objective is to determine whether a feature consists of polarized light. If it does, the feature is rejected as an open portal.
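Brewster's angle follows directly from the two indices of refraction: θ_B = arctan(n₂/n₁). A quick check in Python; the index values are assumptions (n ≈ 1.50 for common glass, and an index of about 1.33 reproduces the ~53° figure quoted for plastic):

```python
import math

def brewster_angle_deg(n2, n1=1.0):
    """Brewster's angle for light travelling in medium n1 and reflecting off
    medium n2: theta_B = arctan(n2 / n1). At this incidence angle the
    reflected ray is fully polarized parallel to the surface."""
    return math.degrees(math.atan2(n2, n1))

print(round(brewster_angle_deg(1.50), 1))  # common glass, n ~ 1.50 -> 56.3 deg
print(round(brewster_angle_deg(1.33), 1))  # assumed plastic index reproducing ~53 deg
```

Since the UAV flies in air (n₁ ≈ 1.0), only the window material's index matters in practice.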

Figure 5. Brewster's Angle.

Detecting Polarized Reflections

The key question is how to detect polarized reflections. When the UAV is searching for open portals, it approaches the target building at an altitude well above the building, which means the flight path is such that the camera sweeps through a range of angles relative to portals on the building. If the frame rate is high enough, we can record multiple frames that are close to Brewster's angle. Since the camera is above the building, reflections are typically of the ground adjacent to the building, at least for portals that are oriented vertically.

The UAV camera is equipped with a continuously rotating polarizing filter. The rotation rate is 600°/s (see below). The vision system searches frame sequences for maximum and minimum brightness values in features that are candidates for portals. If the difference in brightness exceeds a certain threshold, the feature is deemed polarized and is rejected as an open portal.

False positives and false negatives are possible. A false negative can occur if a window reflects light from a dark surface, which makes it difficult to detect whether the reflected light is polarized. A false positive can occur if the interior of an open portal generates polarized light. This does not occur often, given that most light sources are unpolarized (sunlight, incandescent light, fluorescent light, LEDs), and given that most internal surfaces are diffuse reflectors, which also produce unpolarized light.

Polarizer Rotation Rate

Ideally we need at least two frames within ±10° of the optimum polarization angles of 0° and 90° relative to the incoming light. This constrains us to a minimum of 1 frame every 20° of filter rotation. Assuming a frame rate of 30 frame/s, the polarizer rotation rate is 600°/s. The filter needs to sweep through 180° to ensure we get at least two frames at each of the two optimum angles, so a complete sweep requires 10 frames, including endpoints.
At a cruise speed of 20 m/s, a frame rate of 30 frame/s generates a new frame every 0.67 m, or every 33 ms. During a 10-frame sweep, the camera travels 6.0 m. The worst-case portal location is assumed to be on the third floor, 8 m above the ground. The UAV altitude is 100 m AGL, which puts the UAV

camera 92 m above the portal. During a 10-frame sequence, the UAV flies 6 m and sweeps through an angle of (56.3° - 53.7°) = 2.6°, which is close enough to Brewster's angle to detect polarized light.
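The sweep arithmetic above can be reproduced in a few lines. All inputs come from the text (30 frame/s, one frame per 20° of filter rotation, 20 m/s cruise, camera 92 m above the portal, and a starting view angle at Brewster's angle for glass); the computed sweep angle of about 2.5° agrees with the quoted 2.6° to within rounding.

```python
import math

# Values stated in the text.
fps = 30.0            # camera frame rate, frame/s
deg_per_frame = 20.0  # polarizer rotation between frames, deg
cruise = 20.0         # UAV cruise speed, m/s
height = 92.0         # camera height above the portal, m

rate = fps * deg_per_frame                         # polarizer rate: 600 deg/s
frames_per_sweep = int(180.0 / deg_per_frame) + 1  # 10 frames, endpoints included
sweep_time = (frames_per_sweep - 1) / fps          # 9 intervals -> 0.3 s
travel = cruise * sweep_time                       # distance flown: 6.0 m

# View-angle change: start when the portal is 56.3 deg below horizontal
# (Brewster's angle for glass), then fly `travel` metres toward it.
d0 = height / math.tan(math.radians(56.3))         # horizontal standoff at start
a1 = math.degrees(math.atan2(height, d0 + travel)) # view angle at end of sweep
print(rate, frames_per_sweep, round(sweep_time * 1000), round(travel, 1),
      round(56.3 - a1, 1))  # -> 600.0 10 300 6.0 2.5
```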

Figure 6. Portal View Angle

To summarize, the vision system examines a 10-frame sequence that starts when the portal is at an angle of approximately 50° to 55° below horizontal and ends 300 ms later, for a sweep angle of about 3°. If the reflected light is polarized, 2 of the 10 frames will show a maximum and a minimum brightness. Since the polarizer is spinning open loop, we don't know beforehand which 2 frames to choose, but we know they are 100 ms and 4 frames apart. At one extreme, frames 1 and 5 are chosen (0° and 80°); at the other extreme, frames 6 and 10 are chosen (100° and 180°). Note that there is some redundancy at the two extremes, which produce about the same filtering.

Flight Termination System

In order to better ensure spectator safety and meet all IARC requirements, an independently operated and independently powered flight termination system was designed in the event of a system failure. In the past, our team used a fuel cutoff device and a large 4.8 m (16 ft) diameter parachute to render the aircraft safe. This year our team has implemented a similar parachute recovery system and an air cutoff device.

Parachute Recovery System

The parachute recovery system has a dual purpose in the competition. The parachute container, or shroud, is mounted with two servo release mechanisms. An independently operated and powered servo and receiver are mounted on one side of the parachute container. A separate servo, mounted on the opposite side of the container, is controlled through the autopilot and Mission Control software. If either of the servos is activated, the chute will deploy and the plane will drift safely to the ground. Thus, the parachute recovery system can serve as a safety device in the event of an emergency or malfunction, or it can provide an autonomous landing when activated by the Mission Control software.
Autonomous landings allow the ground-based computations to continue without ending the mission attempt, rather than requiring manual recovery of the aerial vehicle when fuel has been depleted or one of the vehicles has completed its data collection. In an autonomous landing sequence, the Mission Control software redirects the vehicle to the 3 km ingress area and, when it is reached, releases the chute by activating one of the servos. Only the safety pilot in charge of the safety termination device can activate the second servo, but the two servos operate independently and activating either will deploy the chute.

Figure 7. Parachute deployment sequence.

Pre-Programmed Fail-Safes

In addition to the parachute recovery system, the Piccolo autopilot has a number of preprogrammed fail-safes instructing the autopilot to perform a certain task when the communication or GPS signal is lost for an allotted period of time. The lost-communication waypoint can be set to direct the aerial vehicle away from a populated area where the termination device can be safely activated. A "deadman" switch can also be used in situations where a malfunction prevents the pilot from regaining control of the autonomous vehicle.

Air Cut-Off Device

In addition to the parachute recovery system, an air cut-off device plugs the carburetor intake, thereby "killing" the four-stroke engine. A few seconds after the air cutoff device is activated by the gear switch on a 72 MHz transmitter, one of the servo-operated pins that secures the protective parachute casing to the body of the aerial vehicle is released. After the casing is jettisoned, the parachute drops from the bottom of the plane, and a braided shock cord dampens the force as the plane slowly descends to the ground. The combination of the lost-communication waypoint feature and the separate parachute and engine-kill termination device guarantees the safety of all operators and spectators.

PAYLOAD

GNC Sensors

As previously mentioned, we utilize a Piccolo autopilot for the low-level guidance, navigation and control of our aerial vehicles. The Piccolo utilizes an MPC555 microcontroller, a Motorola M12 GPS, a MicroHard MHX 910/2400 radio modem, 3 Tokin CG-16D rate gyros, 3 Analog Devices ADXL202 accelerometers, a dual-ported MPXV5004 4 kPa dynamic pressure sensor, an absolute-ported MPX4115A barometric pressure sensor, and an air temperature sensor. Unlike some autopilot designs, a single datalink is used for command and control, telemetry, and even commands from the pilot in the loop, instead of using a standard R/C receiver onboard the plane [2].
Mission Sensors

Optical sensors are used in addition to the GNC sensors discussed above for missions two, three and four of the IARC. Specifically for mission two, the Symbol Seeker UAV houses a retractable Sony block camera to search for the IARC target symbol. The Portal Hunter UAV relies on a Sony block camera and a rotating polarizing filter to determine the locations of the portals and whether they are open or closed. The Mission Control software relies on these two vehicles and their complementary optical sensors, each providing a piece of the IARC puzzle; without data from both vehicles, missions two and four cannot be completed in their entirety.

Communications

Communication between the aerial vehicle and the low-level GNC is done through a 900 MHz wireless link. The pilot control console (a modified R/C transmitter) plugs directly into the ground station hardware unit. The 900 MHz transmission handles all command and control sent by the high-level Mission Control software, a ground station operator, or a pilot-in-the-loop. The 900 MHz link is also used to

transmit telemetry data from the avionics unit to the ground station for use by the operator and the Mission Control software. L-band video receivers and small video transmitters (1.70 GHz to 1.85 GHz, rated at 2 W) are used for computer vision data collection. The L-band receivers are plugged directly into a 4-channel video multiplexer, where frames are collected at a rate of 11 frame/s. The final communication system for mission 2 of the IARC consists of an independently powered R/C receiver and transmitter for operation of the safety termination device on each aerial vehicle. Additional communication is involved in the PAD and rover systems and will be briefly described in later sections of this document.

Power Management System

The power system onboard each of the two UAVs consists of 5 battery packs. While a single battery and a DC-to-DC converter have been used in past designs to save weight and wiring hassles, that arrangement allows the failure of one battery to interfere with the rest of the system. This year we decided to use multiple batteries. The only drawbacks of this arrangement are the need for many chargers to recharge the batteries and many more wires in the interior of the plane. The power system for the Symbol Seeker makes use of three types of batteries: standard nickel cadmium, nickel metal hydride, and lithium polymer. The first battery is a 12 V, 2.7 Ah Ni-MH battery that supplies power to the Piccolo autopilot unit. A second 4.8 V, 0.6 Ah Ni-Cd battery supplies power to the 4 servo actuators. A third 11.1 V, 3.27 Ah Li-Poly battery powers the video transmitter. The fourth battery is a 7.4 V, 3.27 Ah Li-Poly for the camera. Lastly, a 3.7 V, 1.2 Ah Li-Poly supplies power for the termination device.

As mentioned previously, charging is cumbersome with multiple batteries and battery types. A great degree of caution is used when charging batteries. Each battery is charged at less than 0.5 C (i.e., half the 1-hour discharge rate).
This prevents heating and extends the life of each battery pack. Furthermore, each battery has its own charger, made specifically for its particular chemistry. Each charger is equipped with a peak-detection feature that puts the batteries on "trickle" charge. This feature prevents overcharging, which can lead to a battery pack exploding. After a full charge, each battery is set aside for 5 minutes. This prevents stress in the packs and, if any heating is present, allows each pack to cool before use. Each battery is checked with a voltmeter that applies a load to the pack to prevent reading a surface charge. The voltage of the autopilot system is the most critical and is monitored on the ground station through the graphical user interface during flight [4].

Subvehicles

The subvehicle systems needed for missions three and four of the competition are currently under development and testing, but they are not the primary focus of our current research. The subvehicle design is based on a glider mother ship and ground rover approach. See Figures 8a and 8b for computer-aided drawings and photos of the subvehicle system components.

PAD Configuration

The Precision Air Deployment vehicle (PAD) has no propulsion system of its own. The vehicle relies on the carrier UAV for transport to the target. After release, the PAD glides to the portal target at a glide angle of approximately 30° to 45°. A relatively steep descent angle is used to avoid interference from nearby buildings. The PAD has a missile configuration, with three main wings at the rear and three canards in front. This configuration is chosen for maneuverability at low speeds, and for so-called skid-turn capability -- that is, the ability to turn without banking. The aerodynamic surfaces (wings and canards) use low-aspect-ratio delta wing planforms for the following reasons:

o High maximum lift coefficient
o Light weight
o Structural simplicity
o Overall wedge shape makes portal entry easier
o Low lift curve slope (∂CL/∂α) reduces sensitivity to wind gusts

Although a delta wing planform carries a high induced drag penalty, the additional drag is acceptable because of the steep glide angle. High drag is actually beneficial because it prevents excessive speed buildup during the steep descent.

PAD Guidance, Navigation and Control

The PAD uses a lightweight three-axis rate gyroscope to control angular rates. Just before release from the carrier UAV, the PAD receives an attitude update from the UAV through an umbilical. After release, an onboard flight computer integrates data from the rate gyros to keep track of attitude. Determining attitude solely by integrating rate gyros results in a relatively high drift rate; the drift is acceptable because the flight time is very short -- on the order of a few seconds.

The PAD uses a machine vision system to navigate to the portal target. Before the PAD is released, the ground-based vision system already knows what the building and portal look like as a result of earlier events during the mission. Since the location of the PAD is also known at release, the system can predict what the PAD camera will see at release. The machine vision system relies on this data to recognize the building and portal in images received from the PAD camera. Navigation is a four-step process that relies on internal gyro data, as well as video data transmitted to the ground station from a camera on the PAD. (1) The three-axis gyro is used initially to point the PAD in the approximate direction of the portal target. (2) Once the PAD is on its intended heading, video data from the onboard camera is received by a ground-based computer, and the machine vision system uses a template-matching algorithm to search for the target building. (3) Once the building is found, the vision system searches for the portal target. (4) When the vision system locks on to the portal, the portal is tracked in real time and the PAD is steered to it.
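The short-duration gyro dead-reckoning described above can be sketched as a simple integration of the standard Euler-angle kinematic equations. This is an illustrative sketch, not the PAD flight code; the 10°/s pitch-rate example at the bottom is invented for demonstration.

```python
import math

def integrate_gyro(att0, rate_samples, dt):
    """Dead-reckon Euler angles (roll phi, pitch theta, yaw psi, in rad)
    from body-axis rate-gyro samples (p, q, r in rad/s) using the Euler
    kinematic equations. Gyro bias makes the estimate drift, which is
    tolerable only for a short flight like the PAD's few-second glide."""
    phi, theta, psi = att0
    for p, q, r in rate_samples:
        phi   += dt * (p + math.tan(theta) * (q * math.sin(phi) + r * math.cos(phi)))
        theta += dt * (q * math.cos(phi) - r * math.sin(phi))
        psi   += dt * (q * math.sin(phi) + r * math.cos(phi)) / math.cos(theta)
    return phi, theta, psi

# Example: wings level, pure 10 deg/s pitch-up for 2 s, sampled at 100 Hz.
dt = 0.01
samples = [(0.0, math.radians(10.0), 0.0)] * 200
phi, theta, psi = integrate_gyro((0.0, 0.0, 0.0), samples, dt)
print(round(math.degrees(theta), 1))  # -> 20.0
```

A real implementation would also need the attitude update from the carrier UAV's umbilical as the initial condition `att0`, exactly as the text describes.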
After the imaging system locks on to the target, pitch and yaw are controlled externally by the PAD ground controller. The roll angle is controlled internally such that the wings are kept level. Note that all three axes of the rate gyro must be integrated in order to keep the wings level.

Ground Rover

Once the PAD has entered the structure, the rover is released from inside the glider and begins navigating the interior of the structure by means of a BX-24 microcontroller. The rover structure is based on a design by the University of Minnesota [3].

Figure 8a. The Precision Air Deployment (PAD) vehicle.

Figure 8b. The ground rover.

OPERATIONS

Flight Preparations

To prepare for each flight, the U of A team has developed a series of checklists that are executed before each flight of our aerial system. Following the checklists ensures that the equipment is safe for flight. A single person on the team is appointed checklist manager for a particular flight operation. This individual is responsible for making sure that every element of the system falls within acceptable operational parameters before each flight. If any of these elements fails its check, the entire system is grounded. Examples of elements under scrutiny are battery voltage, network communication, and airspeed sensors. To keep the preflight process in order, the checklist is broken into subsections: video system, aerial system, sub-vehicle system, ground system, termination system, and vision system. To make sure that the equipment is checked in a reasonable amount of time, the checklist manager delegates to smaller groups who specialize in these sub-systems. When the smaller groups have inspected their systems, they report to the checklist manager, who gives the final go/no-go for flight.

User Interfaces

Our system utilizes two interfaces. The graphical user interface for the low-level autopilot system provides vital information about the UAVs with Piccolo autopilot units. Visual and audio warnings assist the crew in detecting concerns before and during operations. The Display Executive of the Mission Control software also provides a user interface. This display reports the current standing of the mission and the outputs of the vision system, such as reporting that the symbol has been found or the locations of portals. Together, these systems provide operators and spectators with a comprehensive view of system health and mission status.
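The go/no-go roll-up performed by the checklist manager (Flight Preparations, above) amounts to a conjunction over subsystem reports: any missing or failed check grounds the whole system. The subsystem names follow the checklist sections listed in the text; the function itself is an illustrative sketch, not the team's actual procedure.

```python
# Checklist sections named in the text.
SUBSYSTEMS = ["video", "aerial", "sub-vehicle", "ground", "termination", "vision"]

def flight_decision(reports):
    """reports: dict mapping subsystem name -> bool (check passed).
    Returns ('GO', []) only if every subsystem reported and passed;
    otherwise ('NO-GO', <list of missing or failed subsystems>)."""
    missing = [s for s in SUBSYSTEMS if s not in reports]
    failed = [s for s in SUBSYSTEMS if not reports.get(s, False)]
    if missing or failed:
        return "NO-GO", sorted(set(missing + failed))
    return "GO", []

print(flight_decision({s: True for s in SUBSYSTEMS}))   # -> ('GO', [])
print(flight_decision({"video": True, "aerial": False}))  # NO-GO with culprits listed
```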
RISK REDUCTION

Preliminary Calculations

The modifications made to the original airframe kit and the added equipment considerably increased the system weight. To ensure that the plane would be able to fly with the increased payload and other modifications, and to determine the minimum flight speed, we made a simple model of the wing and performed a straightforward calculation. The minimum velocity of our airframe configuration is important because slow-speed flight aids the ability to capture large quantities of quality video data in a single pass of the airplane. Weight-carrying capability is also important in determining whether a larger wing must be constructed and/or a larger motor installed. To estimate the lift capabilities of the wing, we imported the shape of the airfoil into XFOIL, a 2-D computational fluid dynamics (CFD) program. XFOIL results are shown in Figure 9. Next we used the simple formula for lift,

L = ½ ρ V² CL S     (1)

where L is the total lift or weight of the vehicle, ρ is the air density, V is the relative velocity, CL is the non-dimensional lift coefficient, and S is the wing area. We determined the minimum speed required to produce the lifting force to support the 5.9 kg (13 lb) aerial system in flight to be 12.2 m/s (40 ft/s).
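Equation (1) can be inverted to back out the maximum lift coefficient implied by the stated numbers and then recover the 12.2 m/s minimum speed. The mass, wing area, and speed come from the text; sea-level density (1.225 kg/m³) and g = 9.81 m/s² are standard assumptions not stated there.

```python
import math

rho = 1.225   # kg/m^3, sea-level air density (assumed)
g = 9.81      # m/s^2, standard gravity (assumed)
m = 5.9       # kg, loaded aircraft mass (from the text)
S = 0.51      # m^2, wing area (from the text)
W = m * g     # weight = required lift in level flight, N

# Implied maximum lift coefficient from L = 1/2 rho V^2 CL S at V = 12.2 m/s.
V_min = 12.2
C_Lmax = 2 * W / (rho * V_min**2 * S)
print(round(C_Lmax, 2))  # -> 1.24, a plausible C_Lmax for this airfoil

# Inverting again recovers the minimum (stall) speed.
V = math.sqrt(2 * W / (rho * C_Lmax * S))
print(round(V, 1))  # -> 12.2
```

The measured 13 m/s stall speed corresponds to a slightly lower effective C_Lmax, consistent with the 3-D wing and fuselage effects the text mentions.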

Figure 9. XFOIL results for our airfoil.

A minimum speed of 12.2 m/s is reasonable for our airframe and engine combination, and we therefore moved forward with simulation and flight testing. Actual flight testing of the fully loaded vehicle yielded a stall speed of 13 m/s. The small difference is due to the combination of the 3-D effects of the wing and body and the approximations made in the fluid dynamics calculations, air density, and wing dimensions. Even so, the theoretical calculations provided us with general insight into the performance capabilities of our aerial vehicle prior to simulation and flight testing.

Simulation and Testing

The Piccolo system provided our team with a risk-free way of testing the autonomous flight capabilities of our aerial vehicle. Using a Hardware-In-the-Loop (HIL) simulator and a text file describing our aircraft, we adjusted the control gains to achieve stable, reliable autonomous flight. The HIL simulator is also proving helpful in the testing of rerouting programs and other high-level navigation concerns. While not all problems can be solved through simulation, the majority of concerns can be brought to the developer's attention and possibly ironed out without risking expensive equipment. Even with simulation, actual flight testing of all system components is truly the key to success.

CONCLUSION

The University of Arizona Aerial Robotics Club has designed a network of aerial vehicles to attempt mission two and prepare for missions three and four of the IARC. This document describes the components of the system, both hardware and software, and the strategies for completing the mission objectives. To complete mission 2, multiple vehicles are instrumented with imaging sensors to collect data for computer vision analysis. The two vehicles collect complementary pieces of the environment, which, when organized by the high-level Mission Control software, produce an elegant solution.
After completion of mission 2, a small glider carrying a ground rover is released from one of the main vehicles and navigates through the open portal. Once inside, the rover is ejected and begins to search the interior of the building to complete mission 3. This document also describes the safety features in each element of the design that ensure system safety as well as the safety of the spectators and judges. Overall, the system provides a straightforward approach to the IARC tasks and allows for future expansion and development.

ACKNOWLEDGEMENTS

The Aerial Robotics Club at the University of Arizona would like to acknowledge the generous financial and technical assistance of our sponsors: Raytheon, Lockheed Martin, Cloud Cap Technology,

Advanced Ceramics Research, and NetMedia. We would also like to thank the University of Arizona, specifically the Aerospace and Mechanical Engineering, Electrical and Computer Engineering, and Computer Science departments. Special thanks to our supportive group of faculty advisors: Dr. Kobus Barnard, Dr. Alon Efrat, and Dr. Hermann Fasel.

REFERENCES

1. Michelson, R. Rules for the Current International Aerial Robotics Competition Mission. http://avdil.gtri.gatech.edu/AUVS/CurrentIARC/200xCollegiateRules.html
2. Vaglienti, B. and Hoag, R. A Highly Integrated UAV Avionics System. http://www.cloudcaptech.com/piccolo/PiccoloRev1.01.pdf
3. Papanikolopoulos, N., National Science Foundation. Turning Robots into a Well-Oiled Machine: Robot Teams to Help Emergency Responders in the Trenches. http://www.nsf.gov/od/lpa/newsroom/pr.cfm?ni=74
4. WARNING: Safety Precautions for Lithium Polymer and NiCd Cells/Packs Stocked by FMA Direct. https://www.fmadirect.com/site/fma.htm?body=Products&cat=28
5. Halliday, D. and Resnick, R. Fundamentals of Physics. John Wiley & Sons, 1970, pp. 676-677.
6. Polarized Light Microscopy: Interactive Java Tutorials. http://micro.magnet.fsu.edu/primer/java/polarizedlight/brewster/index.html