Abstract

The loss of an eye is a tragedy for a person, who may suffer both psychologically and physically. This paper is concerned with the design, sensing, and control of a robotic prosthetic eye that moves horizontally in synchronization with the movement of the natural eye. Two generations of robotic prosthetic eye models have been developed. The first-generation model uses an external infrared sensor array, mounted on the frame of a pair of eyeglasses, to detect the movement of the natural eye and to feed the control system that drives the artificial eye to move with it. The second-generation model eliminates the impractical eyeglass frame and instead uses the EOG (electro-oculography) signal, picked up by electrodes placed at the person's temples, to carry out the same eye movement detection and control tasks. Theoretical issues in sensor failure detection and recovery, and in the signal processing techniques used for sensor data fusion, are studied using statistical methods and artificial neural network based techniques. In addition, practical control system design and implementation using microcontrollers are studied and realized to carry out the natural eye movement detection and robotic artificial eye control tasks. Simulation and experimental studies are performed, and the results are included to demonstrate the effectiveness of the research project reported in this paper.