Research Article | Open Access
Geometry Based Approach to Obstacle Avoidance of Triomnidirectional Wheeled Mobile Robotic Platform
Mobile robots achieve collision-free autonomous motion by using information obtained from a suitable combination of multiple sensors of the same or different families. These sensors are often configured around the chassis of the robotic platform. However, little to no guidance is available on how these sensors should be configured on mobile robotic platforms or how many of them should be used. Instead, an empirical approach is adopted: the number of sensors of the same family or any type, as well as the combination of sensors for detecting obstacles, is determined by experiment or from information obtained by external sensors. This approach is often iterative and time consuming. In this paper, an approach for determining the minimum number of sensors and their spacing on the robotic platform is proposed so that mobile robots undergo collision-free motion. The effectiveness of the developed approach is experimentally tested by examining the obstacle avoidance capability of a triomnidirectional wheeled robotic platform based on motion triggering signals obtained from a skirt of ultrasonic sensors only. It was observed that the newly developed approach allows this robotic platform to avoid obstacles effectively.
Mobile robots are developed to perform purposeful tasks by traversing their working zone with little to no human intervention. Hence, their usefulness depends on the ability to carry out the sequence of operations entrusted to them while traversing safely in both structured and unstructured environments. However, they cannot function properly in such environments unless they have efficient sensing and predicting capabilities, and thus require a sensor fusion system to extract useful information from their surroundings. Precise planning before moving the robots often fails, since changes in the environment are rarely predictable or perfectly known. Therefore, bridging this gap and executing the set tasks require knowing the environment and avoiding obstacles by using local information and obstacle-detecting sensors. In order to perform collision-free motion or a defined task, most autonomous robotic platforms use a combination of one or more sensors of the same or different types. However, such a fused sensor system is, most of the time, either complementary or redundant.
An online obstacle avoidance method for a holonomic circular robot is presented in . This navigation system combines perception and dead reckoning by fusing 24 ultrasonic sensors with wheel encoders, and the motion planning algorithm is modified online based on the recorded positions of the obstacles. However, the basis on which the fused sensors and their spacing around the developed robotic system were determined was not specified.
In , rangefinder sensors are used to generate a collision-free path that guides the robot towards the goal and keeps it away from obstacles through the directive circle method. The position and velocity information of static or dynamic obstacles is obtained using these sensors. However, as presented in , such sensors exhibit poor directionality depending on the distance to the obstacle and the angle between the obstacle surface and the acoustic beam. These sensors are also prone to frequent misreadings caused by ultrasonic noise or cross-talk. In such scenarios, the robot may misbehave or collide with obstacles unless the misreadings and cross-talk are filtered out.
As presented in , a combination of ultrasonic sensors, contact switches, and a continuously swinging modular sonar camera is used to complement the ultrasonic sensors for avoiding obstacles in autonomous mode. The robot velocities are taken to be a function of the distance to obstacles, and the ultrasonic sensors work at their full scanning range. In this case, a safe minimum distance is imposed on the linear speed of the robot, because obstacle avoidance using ultrasonic sensors involves local scanning of the robot environment, local mapping containing information about the obstacles, and selection of the action to be taken by the robot.
A neural control system embedded in an Arduino platform is presented in  to perform obstacle avoidance in real time using information obtained from ultrasonic sensors. This decision-making strategy is implemented in MATLAB and the Processing environment. Although such an approach allows the robot to learn the nonlinear relationships between the sensor readings and the control inputs, it is computationally demanding. Furthermore, its realization may require more ultrasonic sensors as well as additional sensors such as a gyroscope, wheel encoders, and other means of extracting further position information about the robot.
A fuzzy logic controller that generates instantaneous collision-free motion is presented in [7, 8]. Obstacle avoidance is achieved through online planning, wherein the planner receives a continuous flow of information about occurring events and generates corresponding commands in response. Although this approach allows obstacle avoidance in dynamic environments as well, it is not effective if the environment changes unexpectedly. A pair of obstacle-detecting sensors is used in  to extract useful information about the surrounding environment. Although these sensors give updated information about the environment and the relative position of the obstacle, a joystick is used as a motion assistant to induce scanning. An obstacle detection technique that combines a Kinect and a sonar sensor is used in  for estimating the pose of moving obstacles. Although this technique improves the accuracy and robustness of the reactive motion planning approach in uncertain dynamic environments, the fused sensors are empirically dependent on each other.
According to the fuzzy obstacle avoidance and Elman Neural Network (ENN) algorithm proposed in , a virtual force field is used to keep the mobile robot away from obstacles by the desired distance. Since this approach is subject to uncertainties, an Elman Neural Network is proposed to compensate for the uncertainties between the moving robot and the obstacles. However, the exact distance to the obstacles is still difficult to obtain. Thus, a computationally intensive Elman fuzzy adaptive controller is used to adjust the distance to an obstacle.
Furthermore, the navigation and obstacle avoidance capabilities of mobile robotic platforms depend, in most cases, strongly on the performance of the ultrasonic sensors and the fused sensor system. As a result, a ring of sensors forming a skirt is often used. Despite all these efforts, mobile robots often lack the ability to avoid obstacles completely and fail to perform autonomously. This unsatisfactory collision-free autonomous operation is mainly due to the empirical approach used for determining the type and number of sensors placed on mobile robotic platforms.
In this paper, a new geometry based obstacle avoidance approach is presented. This approach explicitly relates a family of distance-to-obstacle measuring sensors (specifically ultrasonic sensors) and their reading outputs to the geometry of the mobile robot chassis. The safe distance to an obstacle obtained from this geometrical relationship is a function of the sensor readings in centimeters and is used as the motion triggering signal for obstacle-free motion. Furthermore, the approach allows determination of the minimum number of sensors and their spacing on the mobile robotic platform. The effectiveness of the developed approach is then examined on a triomnidirectional wheeled mobile robotic platform traversing an unstructured environment.
2. Sensor Spacing and Skirting Strategy
The base of a wheeled mobile robot can assume any geometrical shape, as shown in Figure 1(a), and the configuration of sensors on such robots varies accordingly. Despite such variations, obstacle-free motion and task execution will be compromised unless the robots are equipped with properly and optimally spaced sensors of the same type or fused sensors. Hence, the closeness of the predicted real-world state to the actual state can greatly affect the way mobile robots behave and respond to changes in their surroundings.
Although it is common practice to see mobile robots with several ultrasonic sensors placed on their chassis, there are no clear methodologies for determining the number of ultrasonic sensors and their spacing from each other. Instead, an empirical approach is adopted. This may result in interference between readings if the sensors are placed very close to each other, or in loss of information if too few sensors are used. In this paper, two approaches are proposed to determine the number and spacing of ultrasonic sensors on the triomnidirectional wheeled mobile robotic platform with the circular chassis shown in Figure 1(b). The first approach involves the following:

(i) Fixing the number of ultrasonic sensors and . Irrespective of the application dependent configuration of the mobile robotic chassis, this approach assumes that two sensors are placed on opposite sides of each wheel at an angle from the center of the wheel. The remaining ultrasonic sensors are arranged between the three wheels at an angle from each other. Hence, the spacing angle is determined such that the interference problem is avoided.
The second approach involves the following:

(i) Fixing and and determining the number of ultrasonic sensors . This approach also ensures that interference is completely avoided as long as the object is at least a distance away from the robotic platform.
In most practical scenarios, despite their ideal working range of , the reading accuracy of these sensors starts to deteriorate when the angle formed by the emitted sound and the echo wave exceeds . This shows that the dependency of the spacing angles and is affected by the number of sensors . Using too closely spaced sensors may create interference and cross-talk, while a sparse configuration results in missed sensor readings. Thus, the analysis given below assumes that the sensors are configured at a radial distance of such that cross-talk and interference are eliminated. Assuming the robotic platform is subjected to a detectable object, is determined geometrically as shown in Figure 1(b). The minimum safe distance for which the ultrasonic sensors yield noninterfering readings is therefore the magnitude of the line connecting the center of the wheelbase to the intersection point of the lines drawn at an angle of from the respective axes of adjacent ultrasonic sensors. Its value depends on the number of ultrasonic sensors and the radius of the circular chassis . Hence, the relationship between and the circumference of the circular chassis is defined as
The value is a correction factor for the approximated arc length defined by the product of and the ultrasonic sensor length . is the circumferential distance intercepted by the ultrasonic sensors configured adjacent to each wheel, and is the circumferential distance intercepted by any two consecutive ultrasonic sensors, presented as follows:
For the given configuration,
Equation (5) yields the angular spacing of the sensors on the circular chassis of the mobile robotic platform in degrees. However, the following equation must be satisfied at all times:
The values and are determined by geometrical analysis of the 3D CAD model of the mobile robotic base or platform chassis, as shown in Figure 1, or by measuring the positions of the wheels and the ultrasonic sensors with respect to the origin of the robotic platform chassis. Therefore, given the measured values of and , the minimum number of ultrasonic sensors and their relative positions with respect to each other can be determined with ease.
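The spacing idea above can be sketched numerically. The helper below finds the distance from the chassis center at which the edge beams of two adjacent, radially pointing sensors first intersect; obstacles beyond that radius are guaranteed to be seen by at least one sensor, while a non-intersecting pair signals a coverage gap. This is a planar illustration under assumed values (a 15 degree beam half-angle and equal angular spacing), not the paper's exact derivation or prototype dimensions.

```python
import math

def blind_zone_radius(n_sensors, chassis_radius, beam_half_angle_deg=15.0):
    """Distance from the chassis center at which the edge beams of two
    adjacent sensors first intersect.  Sensors are assumed to sit on the
    chassis rim, equally spaced, each pointing radially outward with the
    given beam half-angle (illustrative assumptions)."""
    phi = 2.0 * math.pi / n_sensors        # angular spacing between sensors
    beta = math.radians(beam_half_angle_deg)
    # Sensor A at angle 0, sensor B at angle phi, both on the rim.
    ax, ay = chassis_radius, 0.0
    bx, by = chassis_radius * math.cos(phi), chassis_radius * math.sin(phi)
    # Beam-edge rays tilted toward each other by beta from each radial axis.
    ua = (math.cos(beta), math.sin(beta))
    ub = (math.cos(phi - beta), math.sin(phi - beta))
    # Solve (ax, ay) + t*ua = (bx, by) + s*ub as a 2x2 linear system.
    det = ua[0] * (-ub[1]) - ua[1] * (-ub[0])
    if abs(det) < 1e-12:
        return math.inf                    # parallel edges: beams never meet
    t = ((bx - ax) * (-ub[1]) - (by - ay) * (-ub[0])) / det
    if t <= 0.0:
        return math.inf                    # spacing too wide: coverage gap
    ix, iy = ax + t * ua[0], ay + t * ua[1]
    return math.hypot(ix, iy)
```

For example, with an assumed 0.25 m chassis, 18 such sensors yield a finite blind-zone radius, whereas 9 sensors with a 15 degree half-angle never close the gap, which is consistent with the text's point that the spacing angle, the beam angle, and the sensor count must be chosen together.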
3. Obstacle Avoidance System Design
The distance measuring accuracy of ultrasonic sensors greatly affects the capability of the mobile robotic platform to avoid obstacles. Generally, Figure 3(a) shows that ultrasonic sensors cannot give an accurate reading if the obstacle is farther than a distance , as per the manufacturer's specification, or closer than a distance . The reading can also be affected if the obstacle has a reflective surface at a shallow angle greater than , as shown in Figure 3(b), because the sound wave is not reflected back to the sensor properly. The obstacle can be missed, or too little sound reflected to the receiver, if the obstacle is too small or too soft, as in Figures 3(c) and 3(d), respectively. An ultrasonic sensor may also detect sound reflecting off the floor if it is mounted too close to the bottom of the robotic platform.
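These range limits suggest a simple validity gate on each raw reading before it is used as a triggering signal. The sketch below assumes a 2 cm to 3 m usable window, which matches the typical PING))) specification, but both limits should be taken as placeholders for the actual datasheet values of the sensor in use.

```python
def valid_reading(distance_cm, min_range_cm=2.0, max_range_cm=300.0):
    """Return the reading if it lies inside the sensor's usable window,
    otherwise None.  Out-of-window echoes correspond to the failure
    modes in Figure 3: too close, too far, or a missed return."""
    if min_range_cm <= distance_cm <= max_range_cm:
        return float(distance_cm)
    return None
```

Readings rejected this way can simply be treated as "no obstacle detected" by the downstream state logic.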
By implementing the sensor spacing method explained in approach 1, a combination of and is chosen. As an example, the configuration of the nine sensors, the robotic platform chassis, and obstacles around it is shown in Figure 2(a). In this approach, a normal distance is drawn from the contact point of the emitted wave with the obstacle to the -axis. These normal distances are denoted by to and are determined by applying similarity of triangles. For example, the distances and to the obstacle are determined from the similarity of the triangles with vertices , , and , , in Figure 2(a). Given the sensor reading values, the normal distances are determined as follows:
(a) Ultrasonic sensor configuration
(b) Platform global and local coordinate frame
(a) Working principle of Ping Ultrasonic Sensor
(b) Flat obstacle at angle to Ping Ultrasonic Sensor
(c) Too small obstacle in front of Ping Ultrasonic Sensor
(d) Too soft obstacle in front of Ping Ultrasonic Sensor
Applying sensor spacing approach 1 and (3), for . Hence, . Obtaining m and m from the geometry shown in Figure 1(b) and simplifying (7), the normal distance is expressed as a function of the reading from the corresponding sensor and is given as
Similarly, the distances to obstacles to are shown as follows:
The normal distances to and , , and are obtained as follows: , , , , , , , , , and .
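The projection behind these expressions can be sketched as follows. Assuming every sensor points radially outward from a circular chassis, the obstacle contact point in the platform frame lies at the chassis radius plus the reading, and its normal distance to the platform axis is the magnitude of the perpendicular component. The chassis radius and sensor angles below are illustrative values, not the paper's exact coefficients.

```python
import math

def normal_distances(readings_m, sensor_angles_deg, chassis_radius=0.25):
    """Perpendicular distance from each detected obstacle point to the
    platform x-axis.  Assumes radially pointing sensors on a circular
    chassis of the given (illustrative) radius."""
    out = []
    for s, a in zip(readings_m, sensor_angles_deg):
        ang = math.radians(a)
        # Obstacle point measured from the chassis center.
        y = (chassis_radius + s) * math.sin(ang)
        out.append(abs(y))
    return out
```

A sensor aimed along the x-axis contributes zero normal distance, while one aimed perpendicular to it contributes the full chassis-radius-plus-reading length, mirroring how the coefficients above scale each sensor's reading.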
The readings of sensors to are gathered and stored in an array by following the flowchart shown in Figure 5. Ten states of the robotic platform are chosen for the nine ultrasonic sensors, and a value of or is assigned to each sensor to indicate the presence or absence of an obstacle, respectively. To examine the response of the mobile robotic platform to the presence of obstacles, a set of state-obstacle-sensor actions is developed, as shown in Table 1. The path for avoiding these obstacles is selected based on the digitized values  of the ultrasonic sensor signals. Thus, the robotic platform takes the corresponding actions indicated in Table 1 and follows the motion directions shown in Figure 4.
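The flag-and-lookup step can be sketched as below. The trigger distance and the action names are placeholders standing in for the paper's Table 1 entries, and the front/rear split of the nine flags is an assumed, simplified partition of the sensor skirt.

```python
def obstacle_flags(readings_cm, threshold_cm=40.0):
    """1 marks a sensor seeing an obstacle closer than the trigger
    distance; 0 marks a clear direction (threshold is illustrative)."""
    return [1 if 0 < r < threshold_cm else 0 for r in readings_cm]

def select_action(flags):
    """Tiny Table-1-style lookup: keep the default heading while the
    path is clear, otherwise detour away from the blocked sector.
    The action names are placeholders, not the paper's exact table."""
    if not any(flags):
        return "move_forward"
    front, rear = flags[:len(flags) // 2 + 1], flags[len(flags) // 2 + 1:]
    if any(front) and not any(rear):
        return "move_backward"
    if any(rear) and not any(front):
        return "move_forward"
    return "rotate_in_place"
```

In the real system each flag pattern indexes one of the ten platform states, so the lookup would be a table keyed by the full nine-bit pattern rather than this coarse sector test.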
4. Kinematic Equation of Triomnidirectional Wheeled Robotic Platform
The kinematic modeling of the robotic platform is carried out based on the design and operational assumptions presented in . The angular velocities of the wheels attached to the mobile robotic chassis are about , , and , respectively. is the radius of the circular chassis of the robotic platform, and , , and are the orientation angles of the wheels with respect to the coordinate frame of the robotic platform chassis. Taking , , and , the unit vectors attached to the wheels' geometric centers take the values given in Table 2.
This robotic platform chassis uses three symmetrically aligned omnidirectional wheels with rollers configured at to their axes. The velocity of each wheel is measured by an encoder attached to the motor that actuates it. The rollers on the wheels are self-accommodating, since they are neither sensed by encoders nor actuated by motors . The vector sum of the tangential velocities of the three wheels on the robotic platform is given as follows: where , , and are the angular velocities of the three wheels, respectively. of the robotic platform is expressed in (10) and is further simplified in (12) as a function of the rotational and the component velocities of the robotic platform about its geometric center.
The wheel angular velocities , , and are obtained from (13). Equation (14) shows the wheel’s angular velocities in terms of the robotic platform velocity and its radius , the wheel width , and the wheel radius .
Since the developed robotic platform uses Swedish wheels, there is no sliding constraint, and movement orthogonal to the wheels is free due to the rolling of the small rollers. Hence, (14) is taken as the rolling constraint  imposed by the three wheels. Therefore, only , , and of each wheel are controllable; they are obtained by substituting the robot velocity components , , the wheel radius m, the wheel width m, and the platform radius m into (14). This yields the angular velocities of the three wheels in terms of the robot velocity and its orientation angle . In addition, Algorithm 1 is used to ensure the correct direction of rotation of each wheel.
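The inverse kinematics described above can be sketched for a generic three-wheel omnidirectional platform. Each wheel's drive direction is tangential to the chassis circle, so its speed is the projection of the body-frame velocity onto that direction plus the chassis-rotation term, divided by the wheel radius. The wheel angles and dimensions below are illustrative assumptions, not the prototype's exact values.

```python
import math

def wheel_speeds(vx, vy, omega, wheel_radius=0.05, platform_radius=0.25,
                 wheel_angles_deg=(0.0, 120.0, 240.0)):
    """Angular velocity of each of three omnidirectional wheels spaced
    120 deg apart, given body-frame velocities (vx, vy) in m/s and a
    chassis rotation rate omega in rad/s (illustrative dimensions)."""
    speeds = []
    for a in wheel_angles_deg:
        ang = math.radians(a)
        # Tangential drive component plus the chassis rotation term.
        v_wheel = -math.sin(ang) * vx + math.cos(ang) * vy \
                  + platform_radius * omega
        speeds.append(v_wheel / wheel_radius)
    return speeds
```

Two sanity checks follow directly from the symmetry: pure rotation drives all three wheels at the same speed, and for pure translation the three wheel speeds sum to zero.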
5. Experimental Validation of the Proposed Method
An experimental setup consisting of the triomnidirectional wheeled robotic platform shown in Figure 6(a) was developed to validate the effectiveness of the proposed obstacle avoidance approach. The setup uses a low-level PID controller whose gains are tuned in real time using the Faulhaber Motion Manager software for the Faulhaber motor (series 3863H024CR, with a 38 A series gearhead and 24 V motor winding voltage) and the motion controller (series MCDC3006S RS). The system uses a TTL-to-RS232 converter for data transfer between the Arduino Mega 2560 microcontroller and the Faulhaber motion controller. A skirt of nine PING)))™ Ultrasonic Sensors is used for detecting and avoiding obstacles while moving according to the motion directions defined in Figure 4.
(a) Prototype of the mobile robotic platform
(b) Experimental setup for obstacle avoidance
The robustness of the developed approach is examined by varying the orientation and motion directions of the robotic platform while maintaining the three wheel orientations at , , , and . Table 3 shows the possible combinations of platform orientation and the corresponding motion directions. The triggering signal presented in this table is a combination of the individual sensor readings expressed in terms of the robotic chassis geometry and its physical dimensions.
6. Results and Discussion
A new geometry based obstacle avoidance method is proposed in this work. This approach obtains the position of an obstacle with respect to the robotic platform through scanning and mapping of the local environment using a skirt of PING))) Ultrasonic Sensors only. A method relating the geometry of the robotic platform chassis to the physical dimensions of the sensors and their distance measurement outputs is developed. Thus, the minimum number of these sensors and their configuration around the chassis of the robotic platform are determined. In this approach, the explicit formulation assumes that at least two sensors are placed on opposite sides of each wheel at an angle of from the wheel axis, and that the intermediate sensors are spaced at an angle of from each other. The developed approach is validated using the kinematic model of the triomnidirectional wheeled robotic platform with a circular chassis. The three wheels on this platform are configured at a radial distance of from its geometric center and are driven by three Faulhaber motors (series 3863H024CR, with a 38 A series gearhead and 24 V motor winding voltage). Each motor has a gearhead with a 16:1 reduction ratio and an IE2-1024 magnetic encoder and is controlled by a Faulhaber motion controller (series MCDC3006S RS). The motion of the robotic platform is controlled by a low-level PID controller tuned in real time. A laptop PC is used as the central processing unit, and the motion controller (MCDC3006S RS) is connected through a TTL-to-RS232 converter to receive the speed, orientation, and ultrasonic sensor readings from the Arduino Mega 2560.
Finally, the robustness of the developed approach is evaluated by allowing the robotic platform to traverse randomly placed obstacles, as shown in Figure 6(b). The platform continues moving along its default motion direction until it encounters an obstacle and then detours away from it according to the wheel kinematic equations and the geometry based triggering signals shown in Table 3. Table 4 shows the evaluation results collected by restarting the robotic platform motion five times and examining its obstacle avoidance capability in each test. The detouring behavior of the platform was tracked by following the direction of the arrows drawn on its top cover, as shown in Figure 6(b).
● Completely avoids obstacles.
⊗ Closely clears obstacle.
The robotic platform was seen to completely avoid obstacles in 77.5% of the test cases and to clear the obstacles in more than 90% of them. Hence, it collided with obstacles in less than 10% of the test cases. These collisions occurred mainly because the ultrasonic sensors were configured on the robotic platform above the height of the smallest obstacle along its motion directions. Although the platform can clear obstacles in most of the test cases, its robustness was not examined in terms of path correction or go-to-goal applications.
In this paper, a new geometry based obstacle avoidance technique is developed. This technique uses a skirt of ultrasonic sensors configured on the chassis of a triomnidirectional wheeled robotic platform. The effectiveness of the approach is validated by studying the obstacle-free motion capability of the platform in different maneuvering modes. The motion triggering signals are obtained by relating the geometry of the robotic platform chassis, the physical dimensions of the ultrasonic sensors, and their reading outputs. The robotic platform was observed to avoid obstacles autonomously with high mobility using instantaneous sensor readings in most of the test cases. This eliminates the need to take multiple sensor readings, which often require averaging. The new technique also eliminates the need to rely on external sensor information, as it enables the robotic platform to clear obstacles that are very close. Therefore, this approach can be implemented on mobile and service robots that require close-range obstacle avoidance.
Conflicts of Interest
The authors declare that there are no conflicts of interest regarding the publication of this paper.
Acknowledgments

This research was supported by the Ministry of Science and Technology, Taiwan, under Grant no. 103-2221-E-011-104-MY2, at the Department of Mechanical Engineering, National Taiwan University of Science and Technology.
References

- A. Song, G. Song, D. Constantinescu, L. Wang, and Q. Song, “Sensors for robotics 2015,” Journal of Sensors, vol. 2015, Article ID 412626, 2 pages, 2015.
- A. M. Zaki, O. Arafa, and S. I. Amer, “Microcontroller-based mobile robot positioning and obstacle avoidance,” Journal of Electrical Systems and Information Technology, vol. 1, no. 1, pp. 58–71, 2014.
- E. Masehian and Y. Katebi, “Sensor-based motion planning of wheeled mobile robots in unknown dynamic environments,” Journal of Intelligent and Robotic Systems: Theory and Applications, vol. 74, no. 3-4, pp. 893–914, 2014.
- J. Borenstein and Y. Koren, “Real-time obstacle avoidance for fast mobile robots,” IEEE Transactions on Systems, Man and Cybernetics, vol. 19, no. 5, pp. 1179–1187, 1989.
- I. Doroftei, V. Grosu, and V. Spinu, “Omnidirectional mobile robot—design and implementation,” in Bioinspiration and Robotics Walking and Climbing Robots, chapter 29, InTech, Rijeka, Croatia, 2007.
- A. Medina-Santiago, J. L. Camas-Anzueto, J. A. Vazquez-Feijoo, H. R. Hernández-De León, and R. Mota-Grajales, “Neural control system in obstacle avoidance in mobile robots using ultrasonic sensors,” Journal of Applied Research and Technology, vol. 12, no. 1, pp. 104–110, 2014.
- P. G. Zavlangas, S. G. Tzafestas, and K. Althoefer, “Fuzzy obstacle avoidance and navigation for omnidirectional mobile robots,” in Proceedings of the European Symposium on Intelligent Techniques, pp. 375–382, Aachen, Germany, 2000.
- H. Omrane, M. S. Masmoudi, and M. Masmoudi, “Fuzzy logic based control for autonomous mobile robot navigation,” Computational Intelligence and Neuroscience, vol. 2016, Article ID 9548482, 10 pages, 2016.
- Y. Kondo, T. Miyoshi, K. Terashima, and H. Kitagawa, “Navigation guidance control using haptic feedback for obstacle avoidance of omni-directional wheelchair,” in Proceedings of the Symposium on Haptics Interfaces for Virtual Environment and Teleoperator Systems, pp. 437–444, March 2008.
- D. Tuvshinjargal, B. Dorj, and D. J. Lee, “Hybrid motion planning method for autonomous robots using kinect based sensor fusion and virtual plane approach in dynamic environments,” Journal of Sensors, vol. 2015, Article ID 471052, 13 pages, 2015.
- S. Wen, W. Zheng, J. Zhu, X. Li, and S. Chen, “Elman fuzzy adaptive control for obstacle avoidance of mobile robots using hybrid force/position incorporation,” IEEE Transactions on Systems, Man and Cybernetics Part C: Applications and Reviews, vol. 42, no. 4, pp. 603–608, 2012.
- N.-S. Pai, H.-H. Hsieh, and Y.-C. Lai, “Implementation of obstacle-avoidance control for an autonomous omni-directional mobile robot based on extension theory,” Sensors, vol. 12, no. 10, pp. 13947–13963, 2012.
- P. F. Muir and C. P. Neuman, “Kinematic modeling of wheeled mobile robots,” Journal of Robotic Systems, vol. 4, pp. 281–340, 1987.
- Y. Leow, K. H. Low, and W. Loh, “Kinematic modelling and analysis of mobile robots with omni-directional wheels,” in Proceedings of the 7th International Conference on Control, Automation, Robotics and Vision (ICARC '02), pp. 820–825, December 2002.
- R. Siegwart, I. R. Nourbakhsh, and D. Scaramuzza, Autonomous Mobile Robots, Massachusetts Institute of Technology, 2004.
Copyright © 2017 Tesfaye Wakessa Gussu and Chyi-Yeu Lin. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.