Journal of Robotics
Volume 2011 (2011), Article ID 691769, 9 pages
High-Speed Tactile Sensing for Array-Type Tactile Sensor and Object Manipulation Based on Tactile Information
1Department of Computer Science and Systems Engineering, Graduate School of Engineering, Kobe University, 1-1 Rokkodai-cho, Nada-ku, Kobe 657-8501, Japan
2Hyogo Prefectural Institute of Technology, Hyogo 654-0037, Japan
3Maeda Precision Mfg, Co., Ltd., Kobe 650-0017, Japan
4Hiroshima International University, Kure 737-0112, Japan
5The Advanced Materials Processing Institute Kinki Japan, Hyogo 660-0083, Japan
Received 14 July 2011; Revised 11 October 2011; Accepted 18 October 2011
Academic Editor: Yangmin Li
Copyright © 2011 Wataru Fukui et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
We have developed a universal robot hand with tactile and other sensors. An array-type tactile sensor is crucial for dexterous manipulation of objects with a robotic hand, since it can measure the pressure distribution on the finger pads. The sensor has a very high resolution, and the shape of a grasped object can be classified from its output. The more measurement points are sampled, the higher the classification accuracy, but the longer the measurement cycle. In this paper, the resulting slow response of a high-resolution array-type tactile sensor is resolved in software by emulating the human sensory system. The validity of the proposed method is demonstrated through experiments.
1. Introduction
Various humanoid robots have been developed to help people in human living environments. However, these robots cannot work like humans because they lack some of the elements needed to do so. One such missing element is a robotic hand, which humanoid robots need in order to work in, and use tools in, a human living environment. The human hand has five multijointed fingers, and the fingers have various sense organs, both tactile and kinesthetic. To develop such a robot hand, research on multifingered robot hands has been conducted. The Utah/MIT Dexterous Hand has four fingers with four joints each, driven by tendon cables operated by pneumatic actuators. A full tactile sensing suite has also been developed for this hand. The sensor is based on capacitance sensing and is suitable for contact force control. However, it is difficult to maintain high accuracy in position control, because the tendon cables tend to stretch. The Gifu Hand has 20 joints with 16 degrees of freedom (DOF). This robot hand is light and has six-axis force sensors and tactile sensors using conductive ink. The six-axis force sensors are mainly used to control the robot hand. In robot arm studies, surface tracking control using tactile sensing has been investigated [4, 5]. However, tactile sensors are not actively used to control these robot hands [6–8]. Although several robot hands have been developed [9–11], many problems remain in realizing dexterous motion using tactile sensors.
We have researched and developed a universal robot hand with five multijointed fingers, as shown in Figure 1 [12–15]. The universal robot hand has torque sensors and encoders in the joints, and array-type tactile sensors on the finger pads. These array-type tactile sensors can measure the pressure distribution on the finger pads. The sensors have a very high resolution, enabling the shape of a grasped object to be classified. The more measurement points are sampled, the higher the classification accuracy, but the longer the measurement cycle. This is not a problem in robot arm studies [4, 5], but it is a problem for a robot hand because of the large surface area covered by the array-type tactile sensors. There is no basic solution such as simply making the circuit faster; instead, the spatial resolution needs to be adapted to the intended purpose. In this study, we propose a haptic sensing system with active perception to solve this problem. This is an active system that selects the sensors carrying useful information for discriminating the target object. Such a system can be applied under fixed conditions but not directly to a multifingered robot hand, because of the dynamic situation. During development, we found a physiological clue to changing the spatial resolution. A human has tactile receptors over the entire body, with about 17,000 receptors on the palm alone. Neural transmission is slower than signal transmission in an electric circuit, and sweeping all receptors would take far too long. Thus, a person tends to attend to a confined subset of receptors according to the external stimuli.
In this paper, this responsiveness problem is resolved for our high-resolution array-type tactile sensor by using software that emulates the human sensory system. Before discussing the software system, the grasped objects are classified into two groups according to the contact condition perceived by the tactile sensors. In the first group, a palm-sized object is assumed to be grasped. In this case, the robot hand grasps the object using three or more fingers, each with one-point contact. Thus, only a small area around each contact point needs to be swept. A measurement point of the array-type tactile sensor is treated as an individual of a genetic algorithm (GA): each individual's gene is its coordinates, and its fitness is the measured pressure value. The number of individuals is smaller than the total number of measurement points, and the individuals converge on the area around the contact point. Details of the genetic-based tactile sensing are described in Section 3.1. In the second group, an object smaller than a finger pad is assumed to be grasped. In this case, there may be several objects on the finger pad, producing multiarea contact on the tactile sensor. The method described above cannot be applied to multiarea contact as it stands, so it is extended to detect it. The individuals are divided into groups, and the contact information (position and measured pressure) is shared among these groups. Information shared from other groups is deliberately undervalued so that the groups do not converge on the same area; the individuals of each group converge exclusively on their own area. Details of the multicontact tactile sensing are described in Section 3.2. With this method, high-speed tactile sensing is achieved at high resolution, and object manipulation based on the tactile information is achieved with the universal robot hand.
2. Universal Robot Hand
2.1. Universal Robot Hand System
The universal robot hand has three movable fingers, two immovable fingers, and a palm. Each finger is 333.7 mm long and has four joints. Each joint is driven by a miniaturized DC motor with a rotary encoder and reduction gears (Harmonic Drive Systems Inc.) built into each link. The two fingertip joints, joints 3 and 4, are driven at an equal ratio, as in a human finger, so each finger has three DOFs. Each finger has three joint torque sensors that measure the force normal to the finger pads, as well as tactile sensors on the finger pads. The tactile sensors are attached on the ventral side of each link.
The control system of the universal robot hand consists of three units: a motion control unit, a tactile sensor processing unit, and a user interface unit, as shown in Figure 2. The motion control unit and the tactile sensor processing unit run on RT-Linux, and the user interface unit runs on Windows. The units are connected through a local area network. The motion control unit acquires the torque information from the torque sensors and the rotation angles of the motors, and controls the finger joints according to the rotation angles, the torque information, and the tactile information. The tactile sensor processing unit specifies the measurement points of the tactile sensors and preprocesses the tactile information from them. The user interface unit displays the state of the universal robot hand, including the sensor information.
2.2. Tactile Sensors
The universal robot hand has tactile sensors as shown in Figure 3. Each sensor has a three-layered structure: urethane gel (EXSEAL Co.), pressure-sensitive rubber (INABA RUBBER Co., Ltd.), and an electrode pattern sheet. This simple arrangement provides high productivity and easy maintenance. The firmness of the urethane gel is ASKER-C 15, and its thickness is 2.5 mm. When an object comes into contact with the sensor, the force and shape of the contact surface deform the urethane gel. This deformation creates a pressure distribution in the pressure-sensitive rubber, which is about 0.5 mm thick. When the rubber receives pressure and becomes thinner, its electric resistance in that region drops, so the pressure in the region can be calculated by measuring the resistance. Measuring the resistance of the pressure-sensitive rubber requires two electrodes: one applies a voltage and the other collects the resulting current. We designed these electrodes on a multilayer flexible board of polyimide to simplify the sensor structure.
The tactile sensors are attached on the ventral side of the three links of the finger, as shown in Figure 4. There are 102 measurement points on a fingertip, and the other two parts of a finger have 70 points. Each measurement point is 3.4 mm × 1.8 mm, and the points are separated by a gap of 0.2 mm. Each measurement point can measure the vertical force (approximately 1 N). Wiring from the tactile sensor is connected to the tactile sensor unit via the sensor control circuit, which consists of an amplifier and a multiplexer circuit. The tactile sensor unit selects measurement points via the sensor control circuit and measures pressures at the measurement points. Then, the tactile sensor processing unit receives pressures through an analogue-to-digital (A/D) conversion board with a resolution of 12 bits.
3. Multicontact Recognition with Genetic Based Tactile Measurement
3.1. Genetic-Based Tactile Measurement
The array-type tactile sensor on each fingertip has 102 measurement points, from which the pressure distribution can be acquired. In the developed universal robot hand system, measuring one point requires about 0.2 ms, so measuring all points requires about 20 ms. This measurement cycle is too slow for real-time control. Thus, only some measurement points are selected, so that the measurement time is decreased and the tactile information is acquired quickly. A GA [19, 20] is applied to determine the measurement points. A measurement point is defined as an individual of the GA; the gene of the individual is its position, and its fitness is the pressure value at the measurement point. Individuals are reproduced by the genetic operations of crossover, mutation, and selection [19, 20]. The determined measurement points converge on the area with the highest pressure value (the fitness), that is, the contact area. New individuals are generated by mutation as $x_i^{t+1} = N(x_i^t, \sigma_i)$, where $x_i^{t+1}$ is the measurement point of individual $i$ at the next generation and $N(x_i^t, \sigma_i)$ is the normal distribution with mean value $x_i^t$ and dispersion $\sigma_i$. The dispersion is calculated from the pressure value at the individual as $\sigma_i = \sigma_{\max}\left(1 - p_i/P_{\max}\right)$, where $p_i$ is the pressure value at $x_i^t$, $P_{\max}$ is the upper limit of the tactile sensor, and $\sigma_{\max}$ is the maximum dispersion.
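As an illustration, the pressure-dependent mutation step can be sketched in Python. The values of P_MAX and SIGMA_MAX below are illustrative assumptions, not system parameters: the dispersion shrinks to zero as the measured pressure approaches the sensor's upper limit, so individuals near a contact stay close to it.

```python
import random

P_MAX = 4095      # upper limit of a 12-bit A/D pressure reading (assumption)
SIGMA_MAX = 5.0   # maximum dispersion in sensor-cell units (assumption)

def mutate(point, pressure):
    """Generate the next-generation measurement point by Gaussian mutation.

    A high pressure (strong contact) gives a small dispersion, so the
    point stays near the contact; a low pressure gives a wide scatter.
    """
    sigma = SIGMA_MAX * (1.0 - pressure / P_MAX)
    x, y = point
    return (random.gauss(x, sigma), random.gauss(y, sigma))
```

With pressure at the sensor limit the dispersion is zero and the point does not move; with zero pressure the point scatters with the maximum dispersion.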
The outline of tactile sensing based on the GA is shown in Figure 5. Figure 5(a) shows the initial individuals, which are generated randomly; the selected points move randomly until an object touches the tactile sensor (i.e., until a measured value exceeds the threshold). When contact is made, the individuals converge around the area with the highest fitness (the measured pressure), as shown in Figure 5(b). If contact with the object is lost, the selected points move randomly again, as shown in Figure 5(c). As described above, the GA-based tactile sensing system can measure the pressure distribution around the contact position. The number of selected points is smaller than the total number of points, so the measurement cycle of the tactile sensor is shortened.
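The search-converge-rescatter behaviour of Figure 5 can be sketched as a simple generational loop. This is a minimal sketch under our own assumptions: the grid layout, the contact threshold, the number of individuals, and the selection/mutation details (keep the better half, scatter children around parents) are illustrative choices, not the paper's exact parameters.

```python
import random

N_INDIVIDUALS = 10        # measurement points swept per cycle (assumption)
CONTACT_THRESHOLD = 100   # A/D counts regarded as contact (assumption)
GRID_W, GRID_H = 17, 6    # 102 fingertip points laid out as 17 x 6 (assumption)

def random_point():
    return (random.randrange(GRID_W), random.randrange(GRID_H))

def clip(v, hi):
    return min(hi, max(0, v))

def sensing_cycle(points, read_pressure):
    """One GA generation: measure the selected points, then either
    rescatter them (no contact, Figures 5(a)/(c)) or converge around
    the highest-pressure area (contact, Figure 5(b))."""
    fitness = [read_pressure(p) for p in points]
    if max(fitness) < CONTACT_THRESHOLD:
        return [random_point() for _ in points]   # keep searching randomly
    # Selection: keep the better half as parents (the best point survives).
    ranked = [p for _, p in sorted(zip(fitness, points), reverse=True)]
    parents = ranked[: len(points) // 2]
    # Mutation: children scatter around randomly chosen parents.
    children = []
    while len(parents) + len(children) < len(points):
        px, py = random.choice(parents)
        children.append((clip(round(random.gauss(px, 1.5)), GRID_W - 1),
                         clip(round(random.gauss(py, 1.5)), GRID_H - 1)))
    return parents + children
```

Because the best point is always carried over, the swept subset drifts toward the pressure peak over successive cycles while only N_INDIVIDUALS points are measured per cycle instead of all 102.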
In the case of one-point contact, the tactile sensor measures the pressure spread around the contact position through the urethane gel. In the case of line contact, the selected points line up in a row, and the sensor can measure the edge contour. In the case of plane contact, the selected points spread throughout the contact plane; because the urethane gel spreads the pressure and the distribution is smooth, the pressures between the measured points are easily interpolated and the plane contact is measured.
3.2. Multicontact Recognition
In the case of a palm-sized object, the robot hand grasps the object using three or more fingers, each with one-point contact, and the pressure distribution around each contact area is measured by the tactile sensing system described in Section 3.1. In the case of a fingertip-sized object, as shown in Figure 6, there may be multiple objects on the finger pad, and these multiple contacts must be recognized. In this section, a multicontact recognition method based on the GA is proposed. This method is an extension of the method described in Section 3.1.
Before the proposed method is described, the number of recognized contacts is discussed. In a task such as picking up screws, the state of picking up three or more screws is recognized only as picking up multiple screws; the exact number is not recognized. Considering the two-point discrimination threshold (4 mm) and the area of the finger pad, it is clear that a human being does not attach importance to distinguishing three or more objects. In this paper, multicontact recognition therefore assumes two contact points.
In multicontact recognition, the measurement points determined by genetic-based tactile measurement are divided into groups according to their positions. Figure 7 shows the procedure for multicontact recognition. Figure 7(a) shows the measurement values of all the measurement points when two objects are in contact with the universal robot hand. Here, the number of individuals (measurement points) is set to 10, and each group has five measurement points. All points are measured at the same time, and overlap between points is not permitted. The fitness value is calculated as $f_i = p_i$ if $i \in G_k$ and $f_i = \alpha p_i$ otherwise, where $f_i$ is the evaluation value of measurement point (individual) $i$ for group $G_k$, $p_i$ is the measured pressure at individual $i$, $G_k$ is the set of individuals belonging to group $k$, and $\alpha$ is a constant ($0 < \alpha < 1$). This undervalued fitness is shared among all individuals, so the evaluation map seen by each individual differs according to the group it belongs to (Figures 7(b) and 7(c)).
Individuals are reproduced by the genetic operations (crossover, mutation, and selection), as in Section 3.1. In one-point crossover, an individual may be recombined with any other individual, including those of another group. Although every measured point is evaluated higher than a nonmeasured point, a point measured by another group is evaluated lower than its original fitness. Thus, the groups settle at different locations, without converging on a position already measured by another group.
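The group-dependent fitness sharing above can be sketched as follows. The value of ALPHA and the dictionary layout are illustrative assumptions; the point is only that each group sees full pressure values for its own measurements and α-discounted values for measurements shared from the other group.

```python
ALPHA = 0.3  # discount for pressures measured by another group (assumption)

def shared_fitness(pressures, groups, group_of_interest):
    """Fitness map seen by one group: pressures measured by its own
    members keep their full value, while pressures shared from other
    groups are multiplied by ALPHA (0 < ALPHA < 1) so the group does
    not converge on an area another group already covers."""
    return {
        point: (p if groups[point] == group_of_interest else ALPHA * p)
        for point, p in pressures.items()
    }
```

A contact already claimed by the other group then always scores lower than the group's own best contact of comparable pressure, which drives the two groups to exclusive areas.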
3.3. Response Experiments in Multicontact Recognition
To confirm the effectiveness of multicontact recognition, we conducted experiments. Figures 8 and 9 show the first experiment. In this experiment, the universal robot hand makes no contact with an object until about 1500 ms; after that, it makes contact with two objects. The measurement points and the contact positions along one axis of the tactile sensor are shown in Figure 9. As the figure shows, the measurement points are determined randomly until about 1500 ms and then concentrate at the two contact points. Figures 10 and 11 show the second experiment. In this experiment, the universal robot hand makes contact with two objects until 1500 ms and with one object thereafter. The results are shown in Figure 11: the measurement points concentrate at the two contact points and then at the single remaining one. Consequently, the tactile sensor processing unit can detect multiple contact points by means of multicontact recognition.
Next, to evaluate the response of multicontact recognition, a simulation experiment was conducted. In this experiment, tactile data for all measurement points were collected beforehand, the proposed technique was applied to these data, and the number of steps required to detect contact was determined. The given data alternate between the noncontact condition and the two-contact condition. Each condition is maintained for 100 steps, the cycle is repeated 50 times, and the delay (delay steps: DS) is measured from the contact step. Figure 12 shows excerpts for five cycles of two contacts. The upper graph shows the contact position of group 1 along one axis of the tactile sensor, and the lower graph shows that of group 2; each depicts the average position of the group along that axis. The given data are indicated by a dashed line. In the results, the delay is 1.3 steps on average, that is, about 6 ms given the 5 ms measurement cycle. With the method of measuring all points, the delay is 20 ms. Two-contact recognition is thus clearly acquired at a higher speed than with the method of using all the measurement points.
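The delay-step measurement used in this simulation can be sketched as follows, assuming the recorded ground-truth contact signal and the detector's output are available as step-indexed 0/1 sequences (this representation is our assumption):

```python
def delay_steps(contact_flags, detected_flags):
    """For each 0 -> 1 transition in the ground-truth contact signal,
    count how many steps pass before the detector first reports contact."""
    delays = []
    for t in range(1, len(contact_flags)):
        if contact_flags[t] and not contact_flags[t - 1]:  # contact onset
            d = 0
            while t + d < len(detected_flags) and not detected_flags[t + d]:
                d += 1
            delays.append(d)
    return delays
```

Averaging the returned delays over the 50 repeated cycles and multiplying by the 5 ms measurement cycle gives the delay time reported above.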
4. Object Manipulation Based on Multicontact Recognition with the Universal Robot Hand
4.1. Outline of Object Manipulation
The motion control unit controls the universal robot hand using the torque information, the joint angles, and the tactile information from the tactile sensor processing unit. For the universal robot hand to manipulate multiple objects, the unit has to control the hand according to the contact information for those objects. Therefore, object manipulation using multicontact recognition is conducted. This paper deals with rotation manipulation by the universal robot hand. As an example of this operation, the universal robot hand rotates a washer and a bolt in the same manner as a human hand, as shown in Figure 13.
The universal robot hand can recognize the contact positions and contact pressures on its tactile sensors by multicontact recognition. According to the contact positions of the multiple objects, the system controls the individual fingers during object manipulation. In general, the radius of the washer is larger than that of the bolt. If the universal robot hand rotates the washer and the bolt simultaneously, it cannot rotate the bolt effectively. That is, the robot hand needs to recognize the contacts of the washer and the bolt simultaneously and then rotate only the bolt.
The universal robot hand recognizes the contact positions and pressures of the washer and the bolt on its tactile sensors by multicontact recognition. Here, the pressures from the washer are higher than those from the bolt because the radius of the washer is larger. By comparing the pressures, the robot hand distinguishes between the washer and the bolt. The motion control unit then controls the universal robot hand so as to avoid contact with the washer, by adducting or abducting the index finger as shown in Figure 14. Finally, the universal robot hand rotates the bolt alone. Here, the motion control unit moves the fingertips of the index finger and the thumb along trajectories generated in advance. In this experiment, the multicontact recognition method is used from the start until the two disks are distinguished; after that, the robot hand rotates the bolt disk without using the tactile information. This experiment verifies that the proposed method can be used in real-time control.
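The pressure-based discrimination described above reduces to labeling the higher-pressure contact as the washer. A minimal sketch, assuming the two recognized contacts arrive as (position, pressure) pairs (a representation of our choosing):

```python
def identify_washer(contacts):
    """Given the two recognized contacts as (position, pressure) pairs,
    label the higher-pressure one as the washer (its larger radius makes
    it press harder against the finger pad) and the other as the bolt."""
    washer = max(contacts, key=lambda c: c[1])
    bolt = min(contacts, key=lambda c: c[1])
    return washer, bolt
```

The motion control unit would then adduct or abduct the index finger away from the washer position and rotate only the bolt.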
4.2. Experiment with Universal Robot Hand
The proposed multicontact recognition and multiobject manipulation were applied to a rotation experiment with the universal robot hand system. In this experiment, two disks with different radii are used: one (the red disk) imitates a washer, with a larger radius, and the other (the blue disk) imitates a bolt, with a smaller radius (Figure 15). Here, the radii of the two disks are unknown parameters. Figure 15 shows the contact situation of the two disks with the universal robot hand. In this experiment, the index finger and the thumb are used to rotate the two disks.
Photographs of the rotation manipulation of the two disks by the universal robot hand are shown in Figure 16. First, the thumb was brought into contact with the disks, while the index finger was not, as shown in the upper left figure. Second, the motion control unit brought the index finger into contact with the two disks, as shown in the upper middle figure. After contact was made, the system detected the contact conditions of the two disks by multicontact recognition. Third, according to the contact conditions, the motion control unit adducted or abducted the index finger so as to avoid contact with the larger disk (the washer), as shown in the upper right figure. The motion control unit then moved the index finger again and confirmed that it was in contact with only one disk (the bolt), as shown in the lower left figure. Finally, the universal robot hand rotated that disk (the bolt), as shown in the lower middle and right figures.
5. Conclusion
To control a universal robot hand dexterously, it is very important to measure the tactile information provided by the tactile sensor at high resolution. However, high-resolution tactile information requires long input and processing times. We proposed a tactile sensing method based on a genetic algorithm, in which the number of measurement points is reduced so that both the input time and the processing time are shortened. Thus, we achieved high-speed tactile sensing with an array-type tactile sensor at high resolution. In this paper, the effectiveness of the proposed method was experimentally confirmed.
In future work, the proposed high-speed tactile sensing method will be applied to dexterous object manipulation.
References
- K. Kaneko, F. Kanehiro, S. Kajita et al., “Design of prototype humanoid robotics platform for HRP,” in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2431–2436, October 2002.
- J. M. Hollerbach and S. C. Jacobsen, “Anthropomorphic robots and human interactions,” in Proceedings of the 1st International Symposium on Humanoid Robots, pp. 83–91, 1996.
- D. Johnston, P. Zhang, J. Hollerbach, and S. Jacobsen, “A full tactile sensing suite for dextrous robot hands and use in contact force control,” in Proceedings of the 13th IEEE International Conference on Robotics and Automation. Part 1 (of 4), pp. 3222–3227, April 1996.
- J. G. Wang and Y. Li, “Tracking control of a redundant manipulator with the assistance of tactile sensing,” Intelligent Automation & Soft Computing, vol. 17, no. 7, pp. 833–845, 2011.
- J. G. Wang and Y. Li, “Surface-tracking of a 5-DOF manipulator equipped with tactile sensors,” in Proceedings of the 11th International Conference on Control, Automation, Robotics and Vision, pp. 2448–2453, 2010.
- T. Mouri, H. Kawasaki, K. Yoshikawa, J. Takai, and S. Ito, “Anthropomorphic robot hand: gifu hand III,” in Proceedings of the International Conference on Control Automation and Systems, pp. 1288–1293, 2002.
- T. Mouri, H. Kawasaki, and K. Umebayashi, “Developments of new anthropomorphic robot hand and its master slave system,” in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 3474–3479, 2005.
- S. Ueki, H. Kawasaki, and T. Mouri, “Adaptive coordinated control of multi-fingered hands with rolling contact,” in Proceedings of the SICE Annual Conference, pp. 852–857, August 2005.
- H. Iwawata, T. Hayashi, Y. Shiozawa, et al., “Mechanism design of human mimetic robotic hands with passivity in skins and joints,” in Proceedings of the JSME Conference on Robotics and Mechatronics, 2008.
- D. Kikuchi, T. Kasai, H. Chyon, I. H. Kim, and S. Sugano, “Stabilizing handling controller using machine learning for anthropomorphic hands with passivity,” in Proceedings of the 27th Annual Conference of the Robotic Society of Japan, 2009.
- M. Higashimori, H. Jeong, I. Ishii, A. Namiki, M. Ishikawa, and M. Kaneko, “Development of four-fingered robot hand with dual turning mechanism,” Journal of the Robotics Society of Japan, vol. 24, no. 7, pp. 813–881, 2006.
- H. Nakamoto, F. Kobayashi, N. Imamura, and H. Shirasawa, “Universal robot hand equipped with tactile and joint torque sensors (development and experiments on stiffness control and object recognition),” in Proceedings of the 10th World Multi-Conference on Systemics Cybernetics and Informatics, vol. 2, pp. 347–352, 2006.
- H. Nakamoto, F. Kobayashi, N. Imamura, H. Shirasawa, and F. Kojima, “Shape classification in continuous rotation manipulation by universal robot hand,” Journal of Advanced Computational Intelligence and Intelligent Informatics, vol. 13, no. 3, pp. 178–184, 2009.
- Y. Saitou, F. Kobayashi, F. Kojima, et al., “Haptic feedback in universal robot hand tele-operation,” in Proceedings of the Joint 4th International Conference on Soft Computing and Intelligent Systems and 9th International Symposium on Advanced Intelligent Systems, pp. 1123–1128, 2008.
- N. Imamura, M. Kaneko, K. Yokoi, and K. Tanie, “Development of a two-fingered robot hand with compliance adjustment capability,” in Proceedings of the Symposium of Flexible Automation, pp. 997–1004, 1990.
- H. Nakamoto, F. Kobayashi, N. Imamura, H. Shirasawa, and F. Kojima, “Outer shape classification of object in rotation operation by universal robot hand using continuous dynamic programming (Mechanical Systems),” Transactions of the Japan Society of Mechanical Engineers C, vol. 74, no. 746, pp. 2521–2527, 2008.
- Y. Sakaguchi, “Haptic sensing system with active perception,” Advanced Robotics, vol. 8, no. 3, pp. 263–283, 1994.
- B. Mishra, J. T. Schwartz, and M. Sharir, “On the existence and synthesis of multifinger positive grips,” Algorithmica, vol. 2, no. 1, pp. 541–558, 1987.
- D. E. Goldberg, Genetic Algorithms in Search, Optimization, and Machine Learning, Addison Wesley, 1989.
- J. H. Holland, Adaptation in Natural and Artificial Systems, University of Michigan Press, 1975.