Journal of Applied Mathematics
Volume 2014, Article ID 423876, 6 pages
Research Article

Video Object Tracking in Neural Axons with Fluorescence Microscopy Images

1School of Mechanical Engineering, Xinjiang University, Urumqi, Xinjiang 830047, China
2Department of Electrical and Computer Engineering, University of Macau, Macau

Received 25 April 2014; Revised 2 July 2014; Accepted 3 July 2014; Published 21 July 2014

Academic Editor: Hesheng Wang

Copyright © 2014 Liang Yuan and Junda Zhu. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


Neurofilaments are an important type of intracellular cargo transported along neural axons. Given fluorescence microscopy images, existing methods extract neurofilament movement patterns by manual tracking. In this paper, we describe two automated tracking methods for analyzing neurofilament movement, based on two different techniques: constrained particle filtering and tracking-by-detection. First, we introduce the constrained particle filtering approach, in which the orientation and position of each particle are constrained by the axon’s shape, so that fewer particles are needed to track neurofilament movement than in generic particle filtering. Second, we present a tracking-by-detection approach, in which the axon is decomposed into blocks and the blocks containing moving neurofilaments are detected by graph labeling with a Markov random field. Finally, we compare the two methods in tracking experiments on real time-lapse image sequences of neurofilament movement. The experimental results show that both methods perform well in comparison with existing approaches, with the tracking-by-detection approach achieving slightly higher tracking accuracy.
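
To illustrate the key idea behind the constrained particle filtering approach, the following is a minimal sketch (not the authors' implementation) that assumes particles are parameterized by arc length along a precomputed axon centerline, so that position and orientation are implicitly constrained to the axon's shape, and that the observation likelihood is simply the image intensity at each particle's location. All names and parameters are illustrative.

```python
import numpy as np

def constrained_particle_filter(frames, centerline, n_particles=100,
                                step_sigma=2.0, rng=None):
    """Track a bright neurofilament through grayscale `frames`.

    `centerline` is an (M, 2) array of (x, y) pixel coordinates tracing the
    axon; particles live in the 1-D arc-length (index) space along it, which
    is what constrains them to the axon's shape.
    """
    rng = rng or np.random.default_rng()
    m = len(centerline)
    s = rng.uniform(0, m, size=n_particles)  # initial arc-length positions
    track = []
    for frame in frames:
        # Predict: random walk along the axon, clipped to the centerline.
        s = np.clip(s + rng.normal(0.0, step_sigma, size=n_particles), 0, m - 1)
        # Weight: image intensity at each particle's constrained 2-D position.
        xy = centerline[s.astype(int)]
        w = frame[xy[:, 1], xy[:, 0]].astype(float) + 1e-9
        w /= w.sum()
        # Estimate the filament position and resample the particles.
        track.append(centerline[int(np.round(np.sum(w * s)))])
        s = rng.choice(s, size=n_particles, p=w)
    return np.array(track)
```

Because the state space is one-dimensional along the axon rather than the full image plane, far fewer particles are required than in a generic 2-D particle filter, which is the efficiency argument made in the abstract.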