Abstract
Wheel odometry is a common method for mobile robot relative localisation. However, this method is known to suffer from systematic errors. In this paper, a comprehensive systematic odometry error model for a synchronous drive robot is proposed. The model addresses the systematic error for both rotational and translational motions. Results on real data show that the model yields a significant reduction in the odometry error for translational motions.
Keywords:
Odometry, Error Model, Synchronous Drive
Introduction
Wheel odometry, also known as dead reckoning, is a common method for relative localisation in a mobile robot. However, odometry is known to suffer from systematic errors. Initial research on correcting systematic odometry errors was based on modelling the kinematics of differential drive (DD) robots (e.g., [1][2]). The kinematics of a synchronous drive (SD) robot differ fundamentally from those of a DD robot, so odometry error models based on DD kinematics cannot be directly ported to SD robots. Although both types of robot provide two odometry data streams from two control motors, an SD robot has one control motor exclusively for effecting translation by rotating the wheels along the ground, while a separate control motor rotates the wheels about a vertical axis and synchronously rotates the upper carousel section, thereby changing the direction of forward motion with a corresponding change in heading. SD robots therefore have two control motors effecting rotation and translation independently of each other. In comparison, the two control motors in a DD robot effect only translation of the wheels, i.e. both odometric data streams relate to the travel of the wheels along the ground, and changes in heading are effected by differing the amount of travel between the wheels. In 2001, Martinelli proposed an odometry error model for a synchronous drive robot [3]; further theoretical developments were presented in [4]. Still, the model was not validated using real data. More importantly, this and other works (e.g., [6]) failed to observe that the curvature of the robot trajectory during translational motion is a sinusoidal function of the heading. This omission is addressed in the present work.
Figure 1 - A Synchronous Drive Robot

Figure 1 shows the synchronous drive robot used, built by Surrey University on a commercial synchronous drive base. Figure 2 is a schematic view of the kinematics, showing how rotation is effected in a synchronous drive robot in comparison to a differential drive robot. The orientation of the base and the positions of the wheels in a synchronous drive robot do not change; instead, the three wheels rotate synchronously with the carousel, each around a vertical axis. This can be compared to the kinematics of a differential drive robot, where rotation is effected by translating the wheels by differing amounts, or for spot rotation by translating the wheels an equal amount in opposite directions. Rotations in a differential drive robot therefore require a translation of the wheels along the ground. Translational motion for both SD and DD robots is effected by rotating the wheels along the ground.
Proceedings of the International Conference on Robotics, Vision, Information and Signal Processing ROVISP2007
Figure 2 - Effecting Spot Rotation in Synchronous and Differential Drive Robots

The pose of the robot is expressed as (x, y) for position and h for heading. The heading is represented by the sum of two variables: (i) h, the orientation of the carousel with respect to the base, and (ii) b, the orientation of the base with respect to the world. Ordinarily, b is not expected to change unless there is a systematic error; however, this value does change during translational motion, affecting the heading. As the carousel rotates relative to the base, a datum is required where a relative orientation of the carousel to the base defines h = 0. A pointer affixed to the carousel, pointing to a fixed marked position on the base, was used.

The two inputs to the model are rotation and translation odometry, denoted odo and d_odo respectively. In addition, there are three error components for each of the two parts: the Conversion Error, which is the scaling error in the software transforming translation odometry to a translation in units of metres, or rotation odometry to a rotation of the carousel in units of degrees; the Position Error, which is the difference between the change in position estimated from odometry and the true displacement; and the Heading Error, which is the difference between the heading estimated from rotation odometry and the true change in heading.

Rotation Error Model

The model considers a scaling error (S) to correct the odometry estimate of the rotation of the carousel to the true change. This parameter was estimated by comparing rotation odometry with the ground truth of the rotation of the carousel (see Figure 3(a)). Observations of the robot during spot-rotational motion show that the displacement describes a near circle. The displacement is modelled by sinusoidal functions f_x and f_y:

ROTATION
f_x(h; A_x, φ_x, K_x) = A_x sin(h + φ_x) + K_x
f_y(h; A_y, φ_y, K_y) = A_y sin(h + φ_y) + K_y

The parameters are estimated by fitting ground-truth (x, y) displacements to f_x and f_y (see Figure 3(b)), with reference to the pose at h = 0. There is no observable change in the orientation of the base with respect to the world during rotational motion, so this aspect need not be modelled.

Translation Error Model

Modelling the translational systematic error is more complicated. The robot path describes an arc whose curvature varies with h. There is also a scaling error between the true distance travelled and the translation odometry; this scaling error is modelled by f_d. The true distance traversed considers the distance travelled along the arc:

TRANSLATION
f_d(d_odo; S_d) = S_d d_odo
f_b(h; A_b, φ_b, K_b) = A_b sin(h + φ_b) + K_b
f_c(h; A_c, φ_c, K_c) = A_c sin(h + φ_c) + K_c

The theory behind the form of the functions modelling position and heading error for translational motion is as follows. A synchronous drive robot tends to curve during translational motion, with the degree of curvature depending upon the orientation of the wheels (i.e., h). As the orientation of the wheels changes during rotational motion, the degree of curvature of any subsequent translational motion is affected. The groundwork laid down in [5], which describes a force model based on the kinematics, is expanded and explained in further detail here, supported by real data. This is used to support the error model equations derived from empirical observations of the robot during translational motion.
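All of the model functions above share the form A sin(h + φ) + K, which is linear in the reparameterisation a = A cos φ, b = A sin φ. The following sketch (not the authors' code; the parameter values are illustrative placeholders) shows how such parameters can be recovered from ground-truth samples by ordinary least squares:

```python
import numpy as np

def fit_sinusoid(h, y):
    """Least-squares fit of y ~ A*sin(h + phi) + K, with h in radians.

    Uses A*sin(h + phi) = A*cos(phi)*sin(h) + A*sin(phi)*cos(h), so the
    problem is linear in (a, b, K) with a = A*cos(phi), b = A*sin(phi).
    """
    M = np.column_stack([np.sin(h), np.cos(h), np.ones_like(h)])
    (a, b, k), *_ = np.linalg.lstsq(M, y, rcond=None)
    return np.hypot(a, b), np.arctan2(b, a), k  # A, phi, K

# Synthetic check with known (hypothetical) parameters.
h = np.linspace(0, 2 * np.pi, 50)
y = 0.03 * np.sin(h + 0.7) + 0.01           # A = 0.03 m, phi = 0.7 rad, K = 0.01 m
A, phi, K = fit_sinusoid(h, y)
print(round(A, 4), round(phi, 4), round(K, 4))   # -> 0.03 0.7 0.01
```

With real ground-truth displacements the fit would be run once per model function (f_x, f_y, f_b, f_c) over the sampled wheel orientations.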
The following assumptions are made, without loss of generality on the form of the error model equations:
1. The contact points of the wheels on the ground form an equilateral triangle.
2. The centre of the triangle is the origin and the centre of mass.

Figure 4 - Drive Wheel Torque Force

The forces causing the robot to curve are considered to be:
(i) a centripetal drag force (Fc), and
(ii) a twist of the base from a torque force (Ft).

Centripetal Force (Fc). The centripetal force is simply the sum of the components of the traction forces due to wheel misalignment, denoted α_n where n = 1...3 (see Figure 4). Assuming equal traction forces F and equal misalignments α_n, the centripetal force is

Fc = 3F sin α_n

Torque Force (Ft). Applying the lever law, the torque, denoted Ft, is

Ft = r1 F1 + r2 F2 + r3 F3

where r1 = D sin(h + α_1), r2 = D sin(2π/3 + h + α_2) and r3 = D sin(h − 2π/3 + α_3). Now, assuming that the traction forces F_i are equal, applying the addition formula to expand the terms, and then converting cosine terms to sines, Ft can be expressed as the sum of five sine terms:

Ft = FD{ sin(h + α_1) − (1/2) sin(h + α_2) + (√3/2) sin(h + α_2 + π/2) − (1/2) sin(h + α_3) − (√3/2) sin(h + α_3 + π/2) }

Now applying the superposition formula to each of the sine terms (i.e., C sin(θ + φ) = A sin θ + B cos θ, with C = √(A² + B²) and tan φ = B/A) and rearranging:

Ft = FD Σ_{i=1..5} (A_i sin h + B_i cos h)    (1)

where A_i and B_i correspond to the A and B terms in the superposition formula. The superposition formula can be applied again to convert the sum of a sine and a cosine to a single sine function, thus:

Ft = FDC sin(h + γ)    (2)

where C = √((Σ A_i)² + (Σ B_i)²) and tan γ = Σ B_i / Σ A_i.
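The collapse of the five-term expansion into a single sinusoid of h can be checked numerically. The sketch below (illustrative only; the misalignment angles are hypothetical values, not measured ones) sums sinusoids with the coefficients produced by expanding the three lever arms and compares the result against the single-sine form:

```python
import numpy as np

# Five sine terms of the form c_i * sin(h + p_i), with the coefficients
# obtained by expanding the lever arms; a1..a3 are hypothetical
# misalignment angles in radians.
a1, a2, a3 = 0.05, -0.02, 0.03
terms = [(1.0, a1),
         (-0.5, a2), (np.sqrt(3) / 2, a2 + np.pi / 2),
         (-0.5, a3), (-np.sqrt(3) / 2, a3 + np.pi / 2)]

# Superposition: c*sin(h + p) = (c*cos p)*sin h + (c*sin p)*cos h,
# so the sum is A*sin h + B*cos h = C*sin(h + gamma).
A = sum(c * np.cos(p) for c, p in terms)
B = sum(c * np.sin(p) for c, p in terms)
C, gamma = np.hypot(A, B), np.arctan2(B, A)

h = np.linspace(0, 2 * np.pi, 100)
five_sines = sum(c * np.sin(h + p) for c, p in terms)
one_sine = C * np.sin(h + gamma)
print(np.allclose(five_sines, one_sine))   # -> True
```

The torque is therefore a pure sinusoid of the wheel orientation h, which is why the empirical curvature function f_c takes a sinusoidal form.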
Localisation Algorithm

A localisation algorithm based on the proposed model equations is described here.

1. Initialise. Define the robot pose P(k) and the robot-to-world transform Tr2w(k) at time k as

P(k) = (x, y, h)^T,   Tr2w(k) = [ cos b  −sin b  0 ; sin b  cos b  0 ; 0  0  1 ]

where (x, y) is the position, h the heading, and b the heading offset (initialised to zero).

2. Input Data. Let f' ← f(odo) be the corrected rotation and d' ← f_d(d_odo) the corrected translation.

3. Compute dr. The state change vector dr = (Δx, Δy, Δh)^T, together with the base offset change Δb, is derived separately for rotation and translation. The functions f(·) refer to the model functions.

(a) Compute dr from rotation:

dr[rotation] = ( f_x(h + f') − f_x(h),  f_y(h + f') − f_y(h),  f' )^T,   Δb = 0.

(b) Compute dr from translation. The in-plane displacement is computed along the arc of curvature f_c(h) traversed over the corrected distance d', and the change in the base orientation is Δb = d' f_b(h).

4. Estimate Robot Pose and State. Update the robot pose P and heading offset b:

P(k+1) = P(k) + Tr2w(k) dr,   b(k+1) = b(k) + Δb.

Results

A total of 6 runs ranging from 1.7 m to 3.5 m at various wheel orientations (h) were conducted (see Table 2). Runs 1, 2, 3 and 4, where the odometry error is greatest, are at wheel orientations where the path curvature is greatest. The results show that the proposed model provides a significant reduction in the net position and heading error. In the case of runs 5 and 6, where the curvature is lower, the proposed model provides a marginal improvement, if any.

Table 2 - Residual Localisation Errors

Run   h (deg)   Distance (m)   Odometry Error        Model Error
                               x,y (m)   h (deg)     x,y (m)   h (deg)
1     340       1.707          0.050     4.0         0.019     0.1
2     340       3.522          0.270     9.7         0.045     1.7
3     170       1.804          0.096     5.7         0.039     1.1
4     170       3.519          0.309     9.0         0.096     0.0
5     80        2.994          0.103     3.3         0.067     3.3
6     260       3.490          0.042     3.0         0.072     2.2
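The localisation loop can be sketched as follows. This is not the authors' implementation: the scale factors and sinusoid constants are illustrative placeholders rather than the calibrated values, and the translation step uses a straight-line approximation rather than the full arc model.

```python
import numpy as np

S, S_d = 1.0, 1.0                                   # conversion scale factors (assumed)

def fx(h): return 0.02 * np.sin(h + 0.3)            # rotation x-displacement model (m)
def fy(h): return 0.02 * np.sin(h + 1.8)            # rotation y-displacement model (m)
def fb(h): return 0.01 * np.sin(h + 0.5) + 0.002    # base twist per metre (rad/m)

def Tr2w(b):
    """Robot-to-world transform for base offset b (step 1)."""
    return np.array([[np.cos(b), -np.sin(b), 0.0],
                     [np.sin(b),  np.cos(b), 0.0],
                     [0.0,        0.0,       1.0]])

def step(P, b, rot_odo=0.0, trans_odo=0.0):
    """One update of pose P = (x, y, h) and base offset b (steps 2-4)."""
    h = P[2]
    f_corr, d_corr = S * rot_odo, S_d * trans_odo   # step 2: corrected inputs
    if rot_odo != 0.0:                              # step 3(a): spot rotation
        dr = np.array([fx(h + f_corr) - fx(h), fy(h + f_corr) - fy(h), f_corr])
        db = 0.0
    else:                                           # step 3(b): translation (straight-
        dr = np.array([d_corr * np.cos(h),          # line approximation of the arc)
                       d_corr * np.sin(h), 0.0])
        db = d_corr * fb(h)                         # base twist over the distance
    return P + Tr2w(b) @ dr, b + db                 # step 4: pose and state update

P, b = np.array([0.0, 0.0, 0.0]), 0.0
P, b = step(P, b, rot_odo=np.pi / 2)                # spot rotation of 90 degrees
P, b = step(P, b, trans_odo=1.0)                    # 1 m translation
print(P.round(3), round(b, 4))
```

After the rotation the heading is π/2, so the subsequent translation moves the robot essentially along +y while accumulating a small base offset b from the modelled twist.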
Figure 6 shows bar charts of the position and heading error for raw odometry and after application of the proposed error model; error bars show the 3-sigma standard deviation.

Conclusion

A complete odometry error model addressing both rotational and translational systematic errors was presented. Results from real experiments show a significant reduction in the heading and position error, especially where the systematic error is greatest.
Discussion
The path curvature of a synchronous drive robot during translational motion is unexpected. Kinematic modelling has shown that this behaviour can be explained as a systematic error, rather than the random, non-systematic error it might at first sight be assumed to be.
Rotational Motions. The sinusoidal function for spot rotation can be explained by the vertical rotation axis of each wheel being slightly offset from the contact point of the wheel on the ground, causing a cam effect. However, this displacement is localised and does not accumulate, and it is relatively small in comparison to errors from translational motion (especially at wheel orientations of greatest path curvature).

Translational Motions. The results show that the most dramatic reduction in heading error occurs where the systematic error is greatest, with little or no improvement when the trajectory of the robot is relatively straight. This can be explained by the sinusoidal nature of the functions f_b and f_c: the gradient of each function is greatest where the path curvature is least (i.e., around the zero-crossings of the sine term). The function is sensitive to small errors around this region and is less able to correct for systematic odometry errors. The results suggest that the overall net heading and position error can be minimised by conducting translational motions at wheel orientations where the curvature is greatest.

Acknowledgments

M. Zaman gratefully acknowledges the support of the UK Engineering and Physical Sciences Research Council (EPSRC).

References

[1] Borenstein J. and Feng L. 1996. Measurement and correction of systematic odometry errors in mobile robots. IEEE Transactions on Robotics and Automation, vol. 12, pp. 869-880.
[2] Bak M., Larsen T., Andersen N. A. and Ravn O. 1999. Auto-calibration of systematic odometry errors in mobile robots. Proceedings of SPIE Conference Mobile Robotics XIV, vol. 3838, pp. 252-263.
[3] Martinelli A. 2001. A possible strategy to evaluate the odometry of a mobile robot. Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 1946-1951.
[4] Martinelli A. 2002. The odometry error of a mobile robot with a synchronous drive system. IEEE Transactions on Robotics and Automation, pp. 399-405.
[5] Doh N. L., Choset H. and Chung W. K. 2003. Accurate relative localization using odometry. Proceedings of the IEEE International Conference on Robotics and Automation, pp. 1606-1612.
[6] Kelly A. 2001. General solution for linearised systematic error propagation. Proceedings of International
[7] Zaman M. 2001. Mobile robot localisation. MPhil to PhD Transfer Report, University of Surrey, UK.