Mathematical Problems in Engineering
Volume 2013 (2013), Article ID 902013, 14 pages
http://dx.doi.org/10.1155/2013/902013
Research Article

Remote Teleoperated and Autonomous Mobile Security Robot Development in Ship Environment

Long-Yeu Chung

Department of Applied Informatics and Multimedia, Chia Nan University of Pharmacy & Science, Tainan 717, Taiwan

Received 20 February 2013; Accepted 21 March 2013

Academic Editor: Jia-Jang Wu

Copyright © 2013 Long-Yeu Chung. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

We propose a wireless remote teleoperated and autonomous mobile security robot based on a multisensor system to monitor the ship/cabin environment. With this system, the pilot in charge of monitoring can stay away from the scene while still observing the site and responding to any potential safety problem. The robot can also serve as a supplementary device for cabin crew members who are too busy or too tired to respond properly to a crisis, which makes a single crew member on cabin duty a feasible option. When the robot detects something unusual in the cabin, it notifies the pilot, who can then teleoperate the robot to respond as needed. As a result, a cabin without any crew member on duty can be achieved with this type of robot/system.

1. Introduction

To reduce the manpower and energy consumed aboard a ship, modern ships are becoming larger while operating with fewer crew members. This is possible because shipboard technology keeps getting smarter: the equipment built into a cabin can monitor many kinds of data for safer and more efficient navigation, including temperature, pressure, and ocean currents. Consequently, a single crew member on watch becomes possible [1]. In fact, the most advanced ships, under certain conditions regulated by the adoption of amendments to the International Convention for the Safety of Life at Sea (SOLAS) [2], can be operated without any crew member in the cabin; such conditions include notifying a crew member and the pilot of any unusual problem on the ship and notifying the next available crew member when the previous one cannot respond. However, when the crew members are tired or fail to respond, the pilot cannot tackle the problem directly by himself. To solve this problem, we propose a wireless remote teleoperated and autonomous mobile security robot based on a multisensor system.

The autonomous robot has become a subject of growing interest and has been deployed widely in many areas. Traditionally, robots have been used to assist people in industrial and scientific settings such as semiconductor factories. In recent years, robots have been carefully designed to provide all kinds of services in human daily life [3–6]. A service robot of this kind is expected to offer functions such as autonomous navigation, supervision through the Internet, a remote operation utility, a vision system, human-robot interaction (HRI), and so on.

Many researchers and institutions have developed various kinds of multifunction robots. Well-known R&D groups such as MIT and iRobot in the United States and SONY and HONDA in Japan have invested in advanced robotics such as rescue robots, military robots, and service robots. More and more research organizations have published robotics developments in recent years. For example, ASIMO [7] is made by HONDA, and the University of Tokyo has proposed the humanoid robot HRP [8]. In Japan, security robots have become increasingly popular with people and companies, such as ALSOK [9], FSR [10], and SECOM [11].

Figure 1 shows the wireless remote teleoperated and autonomous mobile security robot platform used in this investigation, and Figure 2 shows the four major subsystems that were developed. Through these four subsystems, the mobile security robot can execute tasks autonomously or be teleoperated by the pilot to respond to whatever is needed. The top-left blocks form the vision system: when the robot patrols the ship environment, the CCD camera acquires images with its pan-tilt-zoom utility. The top-right blocks form the environment sensing system: smoke and fire sensors are installed on the robot, and when they detect a smoke or fire event, the robot sounds the alarm and sends a message to the pilot/console. Furthermore, if the robot is equipped with a fire extinguisher, it may directly extinguish the fire via remote teleoperation, saving valuable time. The bottom-left blocks form the user interface system, and the bottom-right blocks form the navigation system. The robot is designed with both a teleoperated mode and an autonomous mode. Because the robot is expected to patrol the ship/cabin environment, it must locate its own position and plan its path to the destination automatically.

Figure 1: The wireless remote teleoperated and autonomous mobile security robot platform.
Figure 2: Four major subsystems of the wireless remote teleoperated and autonomous mobile security robot.

2. Sensory Configuration of Mobile Security Robot

2.1. Ultrasonic Sensor

In the motion planning system, obstacle detection is a key issue. The mobile robot uses the ultrasonic sensors shown in Figure 3 to detect obstacles in the environment.

Figure 3: The ultrasonic sensor includes (a) the transducer and (b) the driver circuit.

Eight Polaroid 6500 ultrasonic range sensors are distributed around the mobile robot for near-obstacle detection and are controlled by a microprocessor. The beam width of each sensor is approximately 90 degrees, and the range to an object is determined by the time of flight of an acoustic signal generated by the transducer and reflected by the object. The sensors provide range information from 50 cm to 300 cm.
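
As a worked example of the time-of-flight relation (assuming a speed of sound of about 343 m/s in air at room temperature, a value not stated in the paper):

\[ d = \frac{v_{\text{sound}}\, t_{\text{flight}}}{2}, \qquad \text{e.g.}\quad d = \frac{343\ \text{m/s} \times 17.5\ \text{ms}}{2} \approx 3.0\ \text{m}, \]

which corresponds to the 300 cm upper limit of the detection range.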

2.2. Laser Range Finder

The mobile security robot is equipped with a SICK LMS-100 laser ranger [12] with a detection range from 50 cm to 20 m. The field of view is 270° with 0.5° resolution. Because of its wide range and high resolution, the laser ranger is used for the main environment sensing and self-localization mapping, as shown in Figure 4(a).

Figure 4: (a) The SICK LMS-100 laser ranger and (b) the Sony EVI CCD camera.

2.3. PTZ Vision Camera

The CCD camera we used, shown in Figure 4(b), is the Sony EVI-D70 communication color video camera. Its main features are: (i) 216x zoom ratio (18x optical, 12x digital); (ii) pan angle: −170° to +170° (max. pan speed: 100°/s); (iii) tilt angle: −30° to +90° (max. tilt speed: 90°/s); (iv) maximum resolution: 768 × 494.

The CCD camera is connected to an image capture interface that acquires real-time PAL color images of 768 × 576 pixels. These images are processed to extract information, which is then used to issue commands controlling the camera's pan, tilt, and zoom. For example, when the mobile robot notifies the pilot of the location of a fire event or detects something unusual in the cabin, the pilot/remote user can send commands over the ship's wireless network to pan, tilt, and zoom the camera.

2.4. Fire Detection Sensor

The fire detection system uses multiple sensors of different types and fuses their data to generate a reliable fire-detection signal. Specifically, we employ a smoke sensor, a flame sensor, and a temperature sensor. Even if one sensor is faulty, the data from the others can be used to isolate the faulty signal and still obtain the correct result through an adaptive sensory fusion method. The first sensor is the ionization smoke detector shown in Figure 5(a): a small radioactive source ionizes the air between two charged plates, and smoke entering the chamber disturbs the resulting ion current.

Figure 5: (a) Ionization smoke detector, (b) flame sensor, (c) temperature sensor.

Figure 5(b) shows the R2868 flame sensor, a UV TRON ultraviolet detector that exploits the photoelectric effect of metal and the gas multiplication effect. It has a narrow spectral sensitivity of 185 to 260 nm and is completely insensitive to visible light. The temperature sensor, shown in Figure 5(c), is the DS1821, which can work as a standalone thermostat with an operating range from −55°C to +125°C. The sensor outputs must be converted into binary digital signals by comparator circuits; an embedded system translates the analog signals into digital form and sends them to the host, as shown in Figure 6.
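
The following sketch illustrates this comparator stage in software, assuming raw readings from the three sensors; the threshold values are illustrative placeholders, not the paper's calibration:

```python
def to_binary_flags(smoke_adc, flame_pulses, temp_c,
                    smoke_thr=512, flame_thr=3, temp_thr=60.0):
    """Turn raw sensor outputs into the binary high/low flags (S1, S2, S3)
    forwarded to the host, mimicking the comparator circuits."""
    s1 = int(smoke_adc >= smoke_thr)     # ionization smoke detector level
    s2 = int(flame_pulses >= flame_thr)  # UV TRON pulse count within a window
    s3 = int(temp_c >= temp_thr)         # DS1821 temperature reading (deg C)
    return s1, s2, s3
```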

Figure 6: Fire detection signal process.

3. Adaptive Sensory Fusion on Fire Detection

The adaptive data fusion method was proposed by Ansari et al. [13–16]. The computation of the reinforcement updating rules is complex and difficult to implement as a hardware module, so a modified update rule based on the Taylor expansion [17] is used, as shown in Figure 7. S1, S2, and S3 represent the smoke sensor, the flame sensor, and the temperature sensor; when these sensors detect a fire event, each sensor signal is either high or low.

Figure 7: Adaptive sensory fusion on fire detection.

The updating rule of the fusion algorithm adjusts the weight assigned to each sensor after every observation, where \(w_{k+1}\) and \(w_k\) represent the weight values after and before each update.
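
Since the paper's exact update rule is not reproduced here, the following is only a minimal sketch of adaptive decision fusion over the three binary sensor signals: sensors that agree with the fused decision gain weight, and sensors that disagree lose weight. The learning rate and the renormalization step are assumptions of this illustration.

```python
def fuse(readings, weights, threshold=0.5):
    """Weighted vote over the binary fire indications (S1, S2, S3)."""
    score = sum(w * r for w, r in zip(weights, readings)) / sum(weights)
    return score >= threshold            # fused fire decision

def update_weights(weights, readings, decision, rate=0.1):
    """Illustrative reinforcement-style weight adaptation (not the paper's
    exact rule): reward agreement with the fused decision."""
    new = [w * (1.0 + rate) if r == int(decision) else w * (1.0 - rate)
           for w, r in zip(weights, readings)]
    total = sum(new)
    return [w / total for w in new]      # renormalize so weights sum to 1
```

Starting from equal weights (e.g., [1/3, 1/3, 1/3]), a sensor that repeatedly contradicts the other two is gradually de-emphasized, which is the behavior the adaptive fusion stage is meant to provide.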

4. Autonomous Navigation of Mobile Security Robot

4.1. Local Obstacle Avoidance Using Tangent Bug

The tangent bug algorithm [18, 19] is an obstacle avoidance algorithm. The robot moves toward the goal until it encounters an obstacle; it then follows the edge of the obstacle until it can head toward the goal again without crossing the obstacle, and it resumes moving toward the goal until the next obstacle is encountered. Figure 8 shows the representation: the discontinuity points sensed on the obstacle boundary, the circular sensing scope, and the center of the robot.

Figure 8: The discontinuity points within the robot's circular sensing scope.

The details of this algorithm are described in Algorithm 1.

Algorithm 1: Tangent bug algorithm.
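
As a complement to Algorithm 1, here is a minimal sketch of one decision step of a tangent-bug style controller; representing the scan as a list of discontinuity points and as a goal-visibility flag is an assumption made for illustration.

```python
import math

def dist(a, b):
    return math.hypot(b[0] - a[0], b[1] - a[1])

def tangent_bug_heading(robot, goal, discontinuity_points, goal_visible):
    """One decision step: head straight for the goal when the line of sight
    is clear (motion-to-goal); otherwise head for the discontinuity point
    O_i that minimizes d(robot, O_i) + d(O_i, goal) (boundary following).

    robot, goal          : (x, y) positions
    discontinuity_points : endpoints of sensed obstacle boundaries
    goal_visible         : True if the straight path to the goal is free
    """
    if goal_visible:
        return math.atan2(goal[1] - robot[1], goal[0] - robot[0])
    best = min(discontinuity_points,
               key=lambda o: dist(robot, o) + dist(o, goal))
    return math.atan2(best[1] - robot[1], best[0] - robot[0])
```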

Figures 9 and 10 show the mobile robot's behavior under different sensing ranges. Figure 9 demonstrates the edge-following behavior obtained with zero sensing range; in Figure 10 the sensing range is effectively infinite, so the tangent bug can make better path-planning choices.

Figure 9: Tangent bug path with zero sensing range.
Figure 10: Tangent bug path with infinite sensing range.

4.2. Global Path Planning Using D*

The purpose of the D* path planning algorithm is to navigate the robot to given goal coordinates in a map-based environment. It makes assumptions about the unknown part of the terrain and finds a shortest path from the current coordinates to the goal coordinates under these assumptions. Furthermore, the key feature of D* is that it supports incremental replanning (Figure 11). This is important if, while the robot is moving, it discovers that the world differs from its map: if a route turns out to have a higher cost than expected or is completely blocked, the path can be incrementally replanned.

Figure 11: D* path planning flow chart.

The original D* [20], by Stentz, is an informed incremental search algorithm. Focused D* [21], also by Stentz, is an informed incremental heuristic search algorithm that combines ideas of A* [22] and the original D*, and it resulted from a further development of the original D*. D* Lite [23] is an incremental heuristic search algorithm by Koenig and Likhachev that builds on LPA* [23], an incremental heuristic search algorithm combining ideas of A* and DynamicSWSF-FP [24].
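
The sketch below is not D* itself; it only illustrates the replanning idea on a 4-connected grid by re-running A* from the robot's current cell whenever a newly sensed obstacle blocks the planned path (D* and D* Lite achieve the same effect far more efficiently by repairing the previous search instead of restarting it). The sense_blocked callback and the unit-cost grid are assumptions.

```python
import heapq

def astar(grid, start, goal):
    """A* on a 4-connected occupancy grid; grid[r][c] == 1 means blocked."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])   # Manhattan heuristic
    frontier = [(h(start), 0, start, None)]
    parent, g_cost = {}, {start: 0}
    while frontier:
        _, g, cur, par = heapq.heappop(frontier)
        if cur in parent:
            continue
        parent[cur] = par
        if cur == goal:                       # rebuild path back to the start
            path = []
            while cur is not None:
                path.append(cur)
                cur = parent[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                ng = g + 1
                if ng < g_cost.get(nxt, float("inf")):
                    g_cost[nxt] = ng
                    heapq.heappush(frontier, (ng + h(nxt), ng, nxt, cur))
    return None                               # goal unreachable

def navigate(grid, start, goal, sense_blocked):
    """Follow the plan; replan from the current cell when the next cell
    turns out to be blocked (the map differed from the real world)."""
    pos, path = start, astar(grid, start, goal)
    while path and pos != goal:
        nxt = path[path.index(pos) + 1]
        if sense_blocked(nxt):                # newly discovered obstacle
            grid[nxt[0]][nxt[1]] = 1          # update the map
            path = astar(grid, pos, goal)     # replan (D* would repair instead)
            continue
        pos = nxt
    return pos
```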

4.3. Landmark Extraction from Laser Ranger

The data set of a 2D range scan is represented by points \(p_i = (\rho_i, \theta_i)\), acquired sequentially by the laser range sensor at a given angle and distance in polar coordinates. These points can be transformed to Cartesian coordinates as \((x_i, y_i) = (\rho_i \cos\theta_i, \rho_i \sin\theta_i)\). For line-feature extraction, the first step is to apply the Iterative End Point Fit (IEPF) [25] to the data set \(S\). IEPF recursively splits \(S\) into two subsets \(S_1\) and \(S_2\) whenever the distance from some point to the virtual line segment joining the subset's endpoints violates the validation criterion. Through this iteration, the IEPF function returns the endpoints of all extracted segments. Figure 12(a) shows the IEPF result when the laser scan covers a corner: because the vertex of the corner is beyond a distance measurement, IEPF produces three line segments, and the shortest segment is obviously not a real feature candidate. In this work, a modified IEPF with a weighting threshold is used: the threshold is the expected number of points on a segment, \(N_{\mathrm{thr}} = \theta_{se} / \Delta\theta\), where \(\theta_{se}\) is the angle between the segment endpoints and \(\Delta\theta\) is the angular resolution of the laser ranger. When the number of points measured between the endpoints exceeds 80% of this threshold, the line segment extracted by IEPF is accepted as valid; otherwise, the segment is rejected. Figure 12(b) shows the result with the weighting threshold.
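
A minimal sketch of the IEPF split step follows, assuming an ordered array of Cartesian scan points and an illustrative distance threshold; the paper's weighting criterion would then keep only segments whose point count exceeds 80% of the expected count derived from the scanner resolution.

```python
import numpy as np

def iepf(points, dist_threshold=0.05):
    """Iterative End Point Fit: recursively split the ordered scan into
    line segments. Returns a list of (start_index, end_index) pairs."""
    def point_line_dist(p, a, b):
        # perpendicular distance from point p to the line through a and b
        ab, ap = b - a, p - a
        return abs(ab[0] * ap[1] - ab[1] * ap[0]) / (np.linalg.norm(ab) + 1e-12)

    def split(i, j):
        if j - i < 2:                         # no interior points left
            return [(i, j)]
        d = np.array([point_line_dist(points[k], points[i], points[j])
                      for k in range(i + 1, j)])
        k = i + 1 + int(np.argmax(d))
        if d.max() > dist_threshold:          # split at the farthest point
            return split(i, k) + split(k, j)
        return [(i, j)]

    return split(0, len(points) - 1)
```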

Figure 12: The IEPF result (a) without weighting and (b) with weighting.

4.4. Robust Particle Filter Localization

The particle filter, also called CONDENSATION (conditional density propagation) [26], is a method based on Monte Carlo sampling and Bayesian estimation. Unlike the Kalman filter [27], the particle filter is a nonlinear filter, and because it relies on random sampling it is well suited to tracking and localization problems. Its advantage is robustness to noise, including background noise; its drawback is that the prediction step requires a large amount of processing time, that is, considerable computational cost. For mobile robots, we distinguish two types of data: perceptual data, such as laser range measurements, and odometry data or controls, which carry information about robot motion. Denoting the former by \(o\) and the latter by \(a\), we have \(d_{0 \ldots t} = \{o_t, a_{t-1}, o_{t-1}, a_{t-2}, \ldots, a_0, o_0\}\). Without loss of generality, we assume that observations and actions occur in an alternating sequence. Note that the most recent perception in \(d_{0 \ldots t}\) is \(o_t\), whereas the most recent control/odometry reading is \(a_{t-1}\).

Bayes filters estimate the belief \(Bel(x_t) = p(x_t \mid d_{0 \ldots t})\) recursively. The initial belief \(Bel(x_0)\) characterizes the initial knowledge about the system state. In the absence of such knowledge (e.g., global localization), it is typically initialized by a uniform distribution over the state space.

To derive a recursive update equation, we transform the belief by Bayes' rule:
\[ Bel(x_t) = p(x_t \mid o_t, a_{t-1}, d_{0 \ldots t-1}) = \eta\, p(o_t \mid x_t, a_{t-1}, d_{0 \ldots t-1})\, p(x_t \mid a_{t-1}, d_{0 \ldots t-1}). \]
The Markov assumption states that measurements are conditionally independent of past measurements and odometry readings given knowledge of the state \(x_t\):
\[ p(o_t \mid x_t, a_{t-1}, d_{0 \ldots t-1}) = p(o_t \mid x_t). \]
This allows us to conveniently simplify the equation to
\[ Bel(x_t) = \eta\, p(o_t \mid x_t)\, p(x_t \mid a_{t-1}, d_{0 \ldots t-1}). \]
To obtain the final recursive form, we now integrate out the pose \(x_{t-1}\) at time \(t-1\), which yields
\[ Bel(x_t) = \eta\, p(o_t \mid x_t) \int p(x_t \mid x_{t-1}, a_{t-1}, d_{0 \ldots t-1})\, p(x_{t-1} \mid a_{t-1}, d_{0 \ldots t-1})\, dx_{t-1}. \]
The Markov assumption also implies that, given knowledge of \(x_{t-1}\) and \(a_{t-1}\), the state \(x_t\) is conditionally independent of past measurements and odometry readings up to time \(t-1\):
\[ p(x_t \mid x_{t-1}, a_{t-1}, d_{0 \ldots t-1}) = p(x_t \mid x_{t-1}, a_{t-1}). \]
Using the definition of the belief \(Bel\), we obtain the recursive estimator known as the Bayes filter:
\[ Bel(x_t) = \eta\, p(o_t \mid x_t) \int p(x_t \mid x_{t-1}, a_{t-1})\, Bel(x_{t-1})\, dx_{t-1}, \]
where \(\eta\) is a normalizing constant. The sequence of the particle filter for mobile robot self-localization is shown in Algorithm 2, where \(x_t\) is the robot pose at time \(t\), \(a_t\) is the robot motion command, and \(o_t\) is the robot measurement.

Algorithm 2: Particle filter algorithm for mobile robot localization.
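
A minimal sketch of one predict/update/resample cycle corresponding to Algorithm 2 is given below; the motion_model and measurement_likelihood callbacks stand in for the motion model \(p(x_t \mid x_{t-1}, a_{t-1})\) and the measurement model \(p(o_t \mid x_t)\), and are assumptions of this illustration.

```python
import numpy as np

def particle_filter_step(particles, weights, control, measurement,
                         motion_model, measurement_likelihood):
    """One cycle of Monte Carlo localization.

    particles   : (N, 3) array of pose hypotheses (x, y, theta)
    weights     : (N,) array of importance weights
    control     : odometry reading a_{t-1}
    measurement : observation o_t (e.g., laser landmark readings)
    """
    particles = motion_model(particles, control)                        # predict
    weights = weights * measurement_likelihood(particles, measurement)  # update
    weights = weights / np.sum(weights)                                 # normalize (eta)
    n = len(particles)
    idx = np.random.choice(n, size=n, p=weights)                        # resample
    return particles[idx], np.full(n, 1.0 / n)
```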

5. Remote Telepresence Operation

The wireless remote teleoperated and autonomous mobile security robot is designed with a teleoperated mode and an autonomous mode. To implement teleoperation, the pilot/operator must be able to sense the remote environment and send commands according to the transmitted image sequence or sensor information. A joystick provides a convenient interface for motion control, and the remote view is displayed to the operator. Using the remote-site images captured by the robot, the pilot can freely control it to detect and monitor anything unusual in the cabin. Remote images are compressed with the H.263 codec, which provides low-bit-rate compressed video of acceptable quality; H.263 was originally developed by the ITU-T Video Coding Experts Group (VCEG) and is a member of the H.26x family of video coding standards in the ITU-T domain. Since time delay and packet loss are important issues, one of the main challenges in remote operation is coping with the communication delay. The Real-time Transport Protocol (RTP) is a transport protocol that meets the requirements of voice and other real-time data.

The basic features of RTP are:
(1) sequence numbers, used to reorder packets;
(2) timestamps (synchronization), which tell the receiver the corresponding display time and relate the display times of different media;
(3) source identifiers, which allow multiple media bit streams to be supported simultaneously.
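
The sketch below illustrates these three header fields with a tiny reordering buffer; it is a conceptual example, not an implementation of the RTP packet format (sequence-number wraparound, jitter estimation, and RTCP are ignored).

```python
from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class RtpPacket:
    seq: int                                   # sequence number (reordering)
    timestamp: int = field(compare=False)      # media clock ticks (synchronization)
    ssrc: int = field(compare=False)           # source identifier (multiple streams)
    payload: bytes = field(compare=False)

class JitterBuffer:
    """Tiny playout buffer: reorder packets by sequence number per source."""
    def __init__(self):
        self.queues = {}
    def push(self, pkt):
        heapq.heappush(self.queues.setdefault(pkt.ssrc, []), pkt)
    def pop_next(self, ssrc):
        q = self.queues.get(ssrc, [])
        return heapq.heappop(q) if q else None
```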

The user interface for the remote teleoperated mode is shown in Figure 13. Several studies have found the RGB color model to be sensitive to lighting and therefore unsuitable for color thresholding; from our experimental experience, the HSI color model achieves a higher detection ratio.
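
As an illustration of why HSI thresholding is convenient, the sketch below converts an RGB pixel to HSI and applies a flame-color test; the threshold values are placeholders, not the paper's calibrated settings.

```python
import math

def rgb_to_hsi(r, g, b):
    """Convert an RGB pixel (0-255 per channel) to HSI
    (hue in degrees, saturation and intensity in [0, 1])."""
    r, g, b = r / 255.0, g / 255.0, b / 255.0
    i = (r + g + b) / 3.0
    s = 0.0 if i == 0 else 1.0 - min(r, g, b) / i
    num = 0.5 * ((r - g) + (r - b))
    den = math.sqrt((r - g) ** 2 + (r - b) * (g - b)) + 1e-12
    theta = math.degrees(math.acos(max(-1.0, min(1.0, num / den))))
    h = theta if b <= g else 360.0 - theta
    return h, s, i

def flame_candidate(r, g, b, h_max=60.0, s_min=0.2, i_min=0.4):
    """Illustrative flame-color test: bright, saturated, reddish-yellow pixels."""
    h, s, i = rgb_to_hsi(r, g, b)
    return h <= h_max and s >= s_min and i >= i_min
```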

Figure 13: Remote teleoperated mode robot operation interfaces.

6. Experiments

6.1. Remote Telepresence Operation and Environment Sensing

The remote camera pan-tilt control and the laser ranger environment sensing performed by the mobile security robot platform are shown in Figures 14 and 15. The experimental scenario, shown in Figure 16, has the security robot moving in free space and detecting a fire event. When a fire event starts, the security robot detects it and sends a warning message immediately. If the fire detection is confirmed, the security robot sounds the alarm and sends a message to notify the pilot/remote security guard, who can then teleoperate the robot to respond as needed.

Figure 14: Remote camera pan-tilt control.
Figure 15: Environment mapping of the laser ranger.
Figure 16: Remote environment sensing: (a) normal state; (b) a flame is detected.

6.2. The Comparison of Mobile Robot Path Planning

Figure 17 shows the tangent bug path planning result; the start position is (40, 10) and the goal is (30, 30). For the same map, Figure 18 shows the D* path planning result. Clearly, D* yields the shortest path without any obstacle-following detours.

Figure 17: Tangent bug local path planning result.
Figure 18: D* global path planning result.

6.3. Mobile Robot Self-Localization

For the particle filter localization experiment, we apply a particle filter estimator [28] configured with 1000 particles. Figure 19(a) shows that the particles (green points) are initially distributed uniformly over the map space, in which the identifiable landmark features are marked (black diamonds). By timestamp 20, all particles are located near the true position, as shown in Figure 19(d).

Figure 19: Evolution of the particle cloud (green dots) from timestamp = 1 to timestamp = 20. The robot is shown as a red triangle.

Figure 20(a) shows the ground truth (blue) and estimated (red) trajectories of the robot. Moreover, after timestamp 20 the localization deviation is always less than 0.5 m.

Figure 20: Particle filter localization results.

6.4. Particle Filter Performance Comparison

In this experiment, the particle filter performance is compared for various particle counts, as shown in Figure 21. The ground truth (blue) and estimated (red) robot trajectories are plotted over 1000 simulation steps. With fewer than 100 particles, as in Figures 21(a) and 21(b), the localization result always contains errors. With more than 400 particles, the localization results become more accurate, but the computational time grows nearly exponentially with the particle number, as in Figures 21(d)–21(f). One interesting property can be noticed: with more particles, the initial localization converges to the actual position. Thus, one may use a larger particle number in the initial localization step for fast convergence and then reduce the particle number to save computational time, as sketched below.
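
A minimal sketch of such an adaptive particle schedule, under the assumption that the spread of the particle cloud (e.g., the standard deviation of particle positions in metres) is used as the convergence signal; all numbers are placeholders:

```python
def adaptive_particle_count(cloud_spread, n_init=1000, n_min=200, spread_thr=0.5):
    """Keep a large particle set until the cloud has converged
    (spread below the threshold), then shrink it to save computation."""
    return n_init if cloud_spread > spread_thr else n_min
```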

Figure 21: Particle filter performance comparison with different particles.

Figure 22 shows the localization results obtained with different range sensor uncertainties using the same particle number (500 particles). In Figures 22(a) and 22(b), the range sensor is configured with different distance and angle uncertainties. Clearly, with higher range sensor resolution, the particle filter localization result is closer to the actual position.

Figure 22: Particle filter performance comparison with different sensor uncertainties.

7. Conclusions and Future Works

According to past data, most shipping accidents occur because of improper responses from crew members, causing irreversible damage and/or injuries. To avoid such damage and injuries, a remote mobile security system can be installed. In this paper, we proposed a wireless teleoperated mobile security robot based on a multisensor system to monitor the ship/cabin environment. With this system, the pilot, as well as the crew members in charge of monitoring, can stay away from the scene while still observing the site and responding to any potential safety problem. The robot can also serve as a supplementary device for crew members who are too busy or too tired to respond properly to a crisis, making a single crew member on cabin duty a feasible option. When the robot detects something unusual in the cabin, it can also be teleoperated to respond as needed. As a result, with careful design, a cabin without any crew member on duty becomes possible. With this capability, the robot does not affect any current safety procedures; on the contrary, it serves as a supplementary device for a safer cabin environment.

For future research, the robot can be equipped with toxic gas detectors to detect any leakage problems.

References

  1. C.-Y. Pao, Research on ship condition monitoring and integrated alarm system [M.S. thesis], National Kaohsiung Marine University, 2011.
  2. International Convention for the Safety of Life at Sea (SOLAS), "Resolution MSC.128(75): Performance Standards for a Bridge Navigational Watch Alarm System (BNWAS)," 2002.
  3. R. C. Luo, T. Y. Hsu, P. K. Wang, and K. L. Su, “Multisensors based on dynamic navigation for an intelligent security robot,” in Proceedings of the IEEE International Conference on Automation Technology, pp. 504–509, May 2005.
  4. R. C. Luo and C. C. Lai, “Enriched indoor map construction based on multisensor fusion approach for intelligent service robot,” IEEE Transactions on Industrial Electronics, vol. 59, no. 8, pp. 3135–3145, 2012.
  5. “iRobot Corporation Home Page,” http://www.irobot.com/.
  6. C.-C. Tsai, H.-C. Huang, and C.-K. Chan, “Parallel elite genetic algorithm and its application to global path planning for autonomous robot navigation,” IEEE Transactions on Industrial Electronics, vol. 58, no. 10, pp. 4813–4821, 2011.
  7. L. Sentis, J. Park, and O. Khatib, "Compliant control of multicontact and center-of-mass behaviors in humanoid robots," IEEE Transactions on Robotics, vol. 26, no. 3, pp. 483–501, 2010.
  8. M. Sreenivasa, P. Soueres, and J.-P. Laumond, “Walking to grasp: modeling of human movements as invariants and an application to humanoid robotics,” IEEE Transactions on Systems, Man and Cybernetics A, vol. 42, no. 4, pp. 880–893, 2012.
  9. http://pc.watch.impress.co.jp/docs/2005/0624/alsok.htm.
  10. http://www.robotdiy.com/.
  11. http://www.secom.com.tw/.
  12. M. Alwan, M. B. Wagner, G. Wasson, and P. Sheth, "Characterization of infrared range-finder PBS-03JN for 2-D mapping," in Proceedings of the IEEE International Conference on Robotics and Automation, pp. 3936–3941, April 2005.
  13. J. G. Chen, N. Ansari, and Z. Siveski, "Improving multiuser detection performance by data fusion," in Proceedings of the IEEE Communications Theory Mini-Conference, vol. 3, pp. 1794–1798, November 1996.
  14. N. Ansari, J. G. Chen, and Y. Z. Zhang, “Adaptive decision fusion for unequiprobable sources,” IEEE Proceedings, vol. 144, no. 3, pp. 105–111, 1997.
  15. R. C. Luo, C. C. Yih, and K. L. Su, "Multisensor fusion and integration: approaches, applications, and future research directions," IEEE Sensors Journal, vol. 2, no. 2, pp. 107–119, 2002.
  16. R. C. Luo, C. C. Chang, and C. C. Lai, “Multisensor fusion and integration: theories, Applications, and its Perspectives,” IEEE Sensors Journal, vol. 11, no. 12, pp. 3122–3138, 2011.
  17. R. C. Luo, T. Y. Lin, H. C. Chen, and K. L. Su, "Multisensor based security robot system for intelligent building," in Proceedings of the IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI '06), pp. 408–413, September 2006.
  18. H. Choset, K. Lynch, S. Hutchinson et al., Principles of Robot Motion: Theory, Algorithms, and Implementation, 2005.
  19. R. C. Luo, Y. T. Chou, C. H. Tsai, and C. T. Liao, “Motion planning for security robot using tangent bug method,” in Proceedings of the International Conference on Automation Technology, June 2007.
  20. A. Stentz, "Optimal and efficient path planning for partially-known environments," in Proceedings of the IEEE International Conference on Robotics and Automation, vol. 4, pp. 3310–3317, May 1994.
  21. A. Stentz, “The focussed D* algorithm for real-time replanning,” in Proceedings of the International Joint Conference on Artificial Intelligence, pp. 1652–1659.
  22. P. Hart, N. Nilsson, and B. Raphael, “A formal basis for the heuristic determination of minimum cost paths,” IEEE Transactions on Systems Science and Cybernetics, vol. 4, no. 2, pp. 100–107, 1968.
  23. S. Koenig and M. Likhachev, "Fast replanning for navigation in unknown terrain," IEEE Transactions on Robotics, vol. 21, no. 3, pp. 354–363, 2005.
  24. G. Ramalingam and T. Reps, "An incremental algorithm for a generalization of the shortest-path problem," Journal of Algorithms, vol. 21, no. 2, pp. 267–305, 1996.
  25. G. A. Borges and M. J. Aldon, "Line extraction in 2D range images for mobile robotics," Journal of Intelligent and Robotic Systems, vol. 40, no. 3, pp. 267–297, 2004.
  26. P. Yang, “Efficient particle filter algorithm for ultrasonic sensor-based 2D range-only simultaneous localization and mapping application,” IET Wireless Sensor Systems, vol. 2, no. 4, pp. 394–401, 2012.
  27. S. Y. Chen, “Kalman filter for robot vision: a survey,” IEEE Transactions on Industrial Electronics, vol. 59, no. 11, pp. 4409–4420, 2012.
  28. P. I. Corke, "A robotics toolbox for MATLAB," IEEE Robotics and Automation Magazine, vol. 3, no. 1, pp. 24–32, 1996.