

Sliding Mode Controller for Stereo Vision Based Autonomous Flight of Quad-Rotor MAV
Dwi Pebrianti, Wei Wang, Daisuke Iwakura, Yuze Song, and Kenzo Nonami
Department of Artificial System Science, Graduate School of Engineering, Chiba University, 1-33 Yayoi-cho, Inage-ku, Chiba 263-8522, Japan
E-mail: dwi pebri@graduate.chiba-u.jp
College of Information and Control Engineering, Nanjing University of Information Science And Technology, 219 Ning Liu Road, Nanjing, Jiangsu 210044, China
Department of Mechanical Engineering, Chiba University, 1-33 Yayoi-cho, Inage-ku, Chiba 263-8522, Japan
[Received July 29, 2010; accepted October 7, 2010]

We have investigated the possibility of a Sliding Mode Controller (SMC) for autonomous hovering and way-point navigation of a quad-rotor Micro Aerial Vehicle (MAV) based on an on-ground stereo vision system. The object tracking used here is running-average background subtraction. Among the background subtraction algorithms for object tracking, the running average is known to have the fastest processing speed and the lowest memory requirement. A stereo vision system is known to perform well in measuring the distance from camera to object without prior information about the object geometry. SMC is known to be insensitive to model errors, parametric uncertainties, and other disturbances. Experiments on autonomous hovering and way-point navigation using the running-average method for object tracking and SMC for flight control show reliable results.
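As a minimal sketch of the running-average background subtraction summarized above: the background is updated as a weighted average of past frames, and pixels that deviate from it by more than a threshold are marked as foreground. The update rate `alpha` and the difference threshold below are illustrative placeholders, not values from the paper.

```python
import numpy as np

def update_background(background, frame, alpha=0.05):
    """Running-average update: B <- (1 - alpha) * B + alpha * F."""
    return (1.0 - alpha) * background + alpha * frame

def foreground_mask(background, frame, threshold=25.0):
    """Pixels differing from the background by more than `threshold`
    are classified as foreground (the tracked object)."""
    return np.abs(frame - background) > threshold
```

Because each pixel needs only its current background estimate, the method stores a single background image and performs one multiply-add per pixel per frame, which is why it is fast and memory-light.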

Keywords: quad-rotor MAV, stereo vision system, background subtraction object tracking, sliding mode controller, autonomous flight

1. Introduction
Recently, research on Micro Aerial Vehicles (MAVs) has been increasing rapidly. This is due to the need for aerial vehicles able to conduct tasks in narrow spaces, urban areas, and indoor environments. Aerial vehicles are categorized into many types, such as airplane-like fixed-wing models, bird- or insect-like ornithopter (flapping-wing) models, and helicopter-like rotary-wing models. Each model has qualities that favour it for specific tasks. Fixed-wing models are more suited to outdoor flight and cope well in the presence of wind. Flapping-wing models are more suited to low-wind areas, indoor or outdoor depending on the ability to hover. Finally, rotary-wing models are more suited to stable and hovering flight, especially indoor flight.

Journal of Robotics and Mechatronics Vol.23 No.1, 2011

In this research, we use a quad-rotor, which is one type of rotary-wing vehicle. This platform was chosen for its ability to hover and to take off and land vertically. In addition, quad-rotors do not need to vary the pitch of the propeller blades; only variation in the speed of the motors is needed for complete control and stability of the air vehicle [1]. This in turn reduces the cost and maintenance of such a vehicle. The use of four rotors also allows a smaller rotor diameter than a single-rotor vehicle, so the rotors store less kinetic energy during flight. Another benefit of reducing propeller size is the reduction in damage caused by objects and to objects in the event of a crash. To further ensure propeller protection, the propeller blades can also be enclosed with a frame [2]. The quad-rotor is one type of aerial vehicle able to conduct Vertical Take-Off and Landing (VTOL). Research on VTOL platforms is not new: a full-scale four-rotor helicopter was built by De Bothezat in 1921 [3]. Research on quad-rotors ranges from quad-rotor design [4-6], modeling and control [1, 2, 7-13], vision-based autonomous flight [3, 14-21], GPS-based autonomous flight [22], and IR- and ultrasonic-based autonomous flight [23] to indoor and outdoor applications. Research on applying quad-rotors in indoor environments and narrow spaces is currently in focus. Such applications include search and rescue operations that need the MAV to be as near as possible to the ground. These areas are shielded from GPS signals; therefore, a passive sensor such as a camera is one of the candidates for conducting the task. However, vision-based flight still has many problems, ranging from hardware and software development to pure theoretical issues, which are even more complicated when applied to small flying machines operating in unstructured environments.
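The point that motor-speed variation alone suffices for complete control can be sketched with a standard quad-rotor mixing model: the four squared rotor speeds map to total thrust and three body torques. The rotor numbering ("plus" configuration: 1 front, 2 right, 3 rear, 4 left) and the thrust/drag coefficients and arm length below are hypothetical placeholders, not the parameters of the platform used in this paper.

```python
import numpy as np

# Illustrative constants: k = thrust coefficient, d = rotor drag (yaw)
# coefficient, l = arm length in meters. All values are placeholders.
k, d, l = 1e-5, 1e-7, 0.265

def forces_from_speeds(w):
    """Map the four rotor speeds to total thrust and body torques
    for a "plus"-configuration quad-rotor."""
    w2 = np.square(np.asarray(w, dtype=float))
    thrust = k * w2.sum()
    roll   = k * l * (w2[3] - w2[1])              # left rotor vs right rotor
    pitch  = k * l * (w2[2] - w2[0])              # rear rotor vs front rotor
    yaw    = d * (w2[0] - w2[1] + w2[2] - w2[3])  # counter-rotating pairs
    return thrust, roll, pitch, yaw
```

With equal rotor speeds all three torques cancel and only net thrust remains; differential speed changes then steer the vehicle, with no blade-pitch mechanism required.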
Moreover, a difficulty in using imaging sensors is the high bandwidth of the data and the resulting heavy computational burden [20]. A low-cost vision sensor for autonomous flight of an MAV is preferable. However, low cost means that the sensor performance is questionable. In order to use a low-cost vision sensor and still obtain reliable autonomous flight of an MAV, a robust controller is needed. Our previous research on autonomous flight of a quad-rotor MAV [15, 16, 20] using a vision system and a hierarchical controller shows that the accuracy of autonomous hovering is about 1-2 m. Work by Achtelik et al. [24] shows that the accuracy of a quad-rotor conducting autonomous hovering using an LQR controller is about 1 m. Altug et al. experimented with feedback linearizing and backstepping-like controllers for autonomous hovering of a quad-rotor MAV. The accuracy achieved in their experiment is less than 0.2 m. They used markers on the quad-rotor, conducting this task at a height of 1.5 m. However, if the height of the quad-rotor exceeds 1.5 m, there is no explanation of whether their controller can still hover autonomously and precisely, due to the problem of detecting the markers on the quad-rotor. Based on these recent research results, we have investigated the possibility of an on-ground stereo camera for autonomous flight of a quad-rotor MAV, and we improve on the performance of recent controllers by designing a Sliding Mode Controller (SMC) for autonomous hovering and way-point navigation of a quad-rotor MAV. SMC is chosen since it makes the system motion robust with respect to system parameter variations, unmodeled dynamics, and external disturbances. In addition, this technique provides efficient control laws for linear and nonlinear plants. Another distinguishing feature is its order-reduction capability, which enables simplification of design and system decoupling. Our system can support an embedded vision system, which often has vibration problems that make the distance measurement from the MAV to the ground inaccurate. It can also be used for an autonomous recharging system that needs precise landing performance. This paper is organized as follows.
Section 2 describes the platform and the system architecture used in the research. Image processing is explained in Section 3. Section 4 discusses the development of the MAV model and the Sliding Mode Controller (SMC). Section 5 presents the experimental results and discussion. Conclusions and future work are discussed in Section 6.
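The robustness properties of SMC mentioned above can be illustrated with a minimal sketch for a single double-integrator channel (e.g., one translational axis): a sliding surface s = x_dot + lam * x is defined, and a switching control drives the state onto it despite a bounded disturbance. The surface slope, switching gain, and boundary-layer width below are illustrative values, not the gains designed in Section 4.

```python
import numpy as np

def smc_step(x, x_dot, lam=2.0, K=5.0, phi=0.1):
    """One SMC evaluation for x_ddot = u + d.
    s = x_dot + lam * x is the sliding surface; a saturation with
    boundary layer `phi` replaces sign(s) to reduce chattering."""
    s = x_dot + lam * x
    sat = np.clip(s / phi, -1.0, 1.0)
    return -lam * x_dot - K * sat

def simulate(x0=1.0, v0=0.0, dt=0.001, steps=5000, disturbance=1.0):
    """Euler simulation with a constant matched disturbance |d| < K."""
    x, v = x0, v0
    for _ in range(steps):
        u = smc_step(x, v)
        v += (u + disturbance) * dt
        x += v * dt
    return x, v
```

As long as the switching gain K exceeds the disturbance bound, the state reaches the surface in finite time and then slides toward the origin as x_dot = -lam * x, regardless of the exact disturbance value; this is the insensitivity the paper exploits against model errors.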

Fig. 1. Quad-rotor MAV (labeled components: propeller, brushless motor, X-Bee wireless communication module, SH2 micro-controller, MNAV navigation sensor).

Table 1. Specification of the platform.

Height: 200 mm
Diameter: 530 mm (with blade)
Mass: 400 g (original platform with battery)
Max. Lift: 600 g (modified platform with battery)
Max. Payload: 80 g/f plus 300 g of its original weight
Flying Time: 23 min (original platform) and 12 min (modified platform)
Max. Speed: 10 m/s

2. MAV Platform, Stereo Vision System and Graphical User Interface

2.1. MAV Platform
The platform used in this research is an off-the-shelf quad-rotor MAV, the X-3D-BL platform from Ascending Technologies. Fig. 1 shows the platform, which has been modified with an MNAV100CA sensor from Crossbow and an AP-SH2A-0A micro-controller from Sharp. The specification of the platform is shown in Table 1. The AP-SH2A-0A micro-controller is used as the Flight Control Computer (FCC). For autopilot purposes, we use the low-cost MNAV100CA sensor from Crossbow. The sensor contains an IMU, GPS, and pressure sensors, and weighs about 35 g excluding the GPS antenna. It includes 3-axis accelerometers, 3-axis angular velocity sensors, 3-axis magnetometers, static pressure (altitude) and dynamic pressure (airspeed) sensors, and a GPS receiver module. The PPM interface allows software interpretation of R/C signals (throttle, pitching/rolling/yawing torques, switch, communication status, GPS data) sent from the MNAV100CA to the AP-SH2A-0A FCC through the RS-232 serial port. The AP-SH2A-0A FCC also receives data (image processing output) from the Ground Station.

2.2. Stereo Vision System
The stereo vision system used in this research is a Commercial-Off-The-Shelf (COTS) product, the Bumblebee from Point Grey Research. The specification of the Bumblebee camera is: 1) 12 cm baseline, 2) resolution up to 1024 × 768, 3) maximum frame rate of 15 frames per second (fps), 4) 50° horizontal field of view, 5) IEEE 1394 interface, 6) 375 g weight, and 7) dimensions of 160 mm × 50 mm × 40 mm. The Bumblebee is placed on the ground facing upward and is connected to a laptop computer for image processing. This computer sends the 3D position data to a Ground Station PC for data saving.
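The distance measurement that stereo vision provides without prior object geometry follows the standard triangulation relation Z = f * B / d, sketched below. The 0.12 m baseline matches the Bumblebee specification above, while the focal length in pixels is an illustrative placeholder rather than the camera's calibrated value.

```python
def depth_from_disparity(disparity_px, baseline_m=0.12, focal_px=400.0):
    """Depth from stereo disparity: Z = f * B / d.
    baseline_m follows the 12 cm Bumblebee baseline; focal_px is an
    assumed pixel focal length, not a calibrated parameter."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

Note that depth resolution degrades quadratically with range (a one-pixel disparity error costs more meters at small disparities), which is one reason a robust controller matters when flying on low-cost vision measurements.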
