Computational Intelligence and Neuroscience
Volume 2016 (2016), Article ID 9354760, 18 pages
Research Article

Designs and Algorithms to Map Eye Tracking Data with Dynamic Multielement Moving Objects

1School of Industrial and Systems Engineering, University of Oklahoma, 202 West Boyd Street, Norman, OK 73019, USA
2Aerospace Human Factors Research Division, Civil Aerospace Medical Institute AAM-520, Federal Aviation Administration, P.O. Box 25082, Oklahoma City, OK 73125, USA
3School of Electrical and Computer Engineering, University of Oklahoma, 110 W. Boyd Street, Devon Energy Hall 150, Norman, OK 73019-1102, USA

Received 28 November 2015; Revised 15 March 2016; Accepted 16 May 2016

Academic Editor: Hong Fu

Copyright © 2016 Ziho Kang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


Design concepts and algorithms were developed to address the eye tracking analysis issues that arise when (1) participants interrogate dynamic multielement objects that can overlap on the display and (2) the visual angle error of the eye tracker prevents it from providing exact eye fixation coordinates. These issues were addressed by (1) developing dynamic areas of interest (AOIs) in the form of either convex or rectangular shapes to represent the moving, shape-changing multielement objects, (2) introducing the concept of AOI gap tolerance (AGT), which controls the size of the AOIs to address the overlapping and visual angle error issues, and (3) finding a near-optimal AGT value. The approach was tested in the context of air traffic control (ATC) operations, in which air traffic control specialists (ATCSs) interrogated multiple moving aircraft on a radar display to detect and control the aircraft in order to maintain safe and expeditious air transportation. In addition, we show how eye tracking analysis results can differ depending on how dynamic AOIs are defined to determine eye fixations on moving objects. The results serve as a framework for more accurately analyzing eye tracking data and better supporting the analysis of human performance.
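To make the rectangular-AOI idea concrete, the following is a minimal sketch (not the authors' implementation) of a dynamic rectangular AOI built at one time frame from the screen positions of an object's elements (e.g., an aircraft symbol and its data block) and expanded on every side by an AGT margin before testing whether a fixation falls inside it. All names (`RectAOI`, `from_elements`, `contains`) and the coordinate values are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]  # (x, y) screen coordinates in pixels

@dataclass
class RectAOI:
    """Axis-aligned rectangular AOI around a multielement object."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    @classmethod
    def from_elements(cls, elements: List[Point], agt: float) -> "RectAOI":
        """Bounding box of the object's element positions at one frame,
        expanded by the AOI gap tolerance (agt) on all four sides."""
        xs = [p[0] for p in elements]
        ys = [p[1] for p in elements]
        return cls(min(xs) - agt, min(ys) - agt, max(xs) + agt, max(ys) + agt)

    def contains(self, fixation: Point) -> bool:
        """True if the eye fixation lands inside the (expanded) AOI."""
        x, y = fixation
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

# Hypothetical element positions for one aircraft at one radar frame
elements = [(100.0, 200.0), (120.0, 215.0), (140.0, 230.0)]
aoi = RectAOI.from_elements(elements, agt=10.0)
print(aoi.contains((95.0, 195.0)))   # inside only because of the AGT margin
print(aoi.contains((80.0, 195.0)))   # outside even with the margin
```

Rebuilding the AOI from the elements' positions at each frame captures the "dynamic" aspect; a larger `agt` absorbs more visual angle error but increases the chance that AOIs of nearby aircraft overlap, which is the trade-off the near-optimal AGT value balances.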