Special Issue: Vision-Based Control and Its Applications
Research Article | Open Access
Video Object Tracking in Neural Axons with Fluorescence Microscopy Images
Neurofilaments are an important type of intracellular cargo transported in neural axons. Given fluorescence microscopy images, existing methods extract neurofilament movement patterns by manual tracking. In this paper, we describe two automated tracking methods for analyzing neurofilament movement based on two different techniques: constrained particle filtering and tracking-by-detection. First, we introduce the constrained particle filtering approach. In this approach, the orientation and position of a particle are constrained by the axon's shape, so fewer particles are necessary for tracking neurofilament movement than in object tracking techniques based on generic particle filtering. Second, a tracking-by-detection approach to neurofilament tracking is presented. In this approach, the axon is decomposed into blocks, and the blocks containing moving neurofilaments are detected by graph labeling using a Markov random field. Finally, we compare the two tracking methods in experiments on real time-lapse image sequences of neurofilament movement. The results show that both methods perform well in comparison with existing approaches, and that the tracking-by-detection approach is slightly more accurate of the two.
Neurofilaments are long flexible protein polymers that are transported along the axonal processes of neurons. Neurofilaments are major components of the axonal cytoskeleton, and their movement patterns are therefore important indicators for the growth and repair of neurons. The diameter of a neurofilament is about 10 nm, while its length can reach many micrometers. A neurofilament aligns with the long axis of the axon and exhibits intermittent, bidirectional motion [2–5]. According to neurological studies [4, 6], the movement patterns of neurofilaments are largely unpredictable and stochastic in nature. To analyze the movement patterns, neurofilaments need to be tracked in a time-lapse image sequence. To date, this tracking has typically been done manually, which is labor-intensive, and its accuracy is undermined by human error. It is therefore highly desirable to develop an automated tracking method that improves tracking efficiency and is free of the subjectivity and variability associated with manual labeling.
Neurofilament tracking falls under the study of intracellular movement tracking and is related to visual object tracking in the computer vision literature. Jaqaman et al. [8, 9] solved several problems in the correspondence step, such as disappearing, merging, and splitting objects. Yang et al. proposed an approach based on the Kalman filter for tracking particle motion. However, these approaches perform poorly on images with low SNR [11–13]. For object tracking in noisy image sequences, Smal et al. presented a particle filtering based approach. They devised a point-spread function (PSF) to account for the imaging blur due to the diffraction limit, which is used to compute a more sensible likelihood value in the observation model. However, the inherent computational complexity of the generic particle filtering approach was not addressed in their work.
In this paper, we present two technical solutions to the axonal neurofilament tracking problem, based on particle filtering and tracking-by-detection, respectively. For the particle filtering approach, we take advantage of the fact that axons have an elongated shape within which neurofilaments move. As a result, the dynamics of neurofilament movement are spatially constrained, and the search space for localizing the neurofilament in the following frame is greatly reduced. Consequently, the number of particles can be reduced in comparison to the generic particle filtering approach without compromising tracking accuracy. In the tracking-by-detection approach, an axon is first divided into multiple image blocks, and block-level neurofilament detection is performed using a Markov random field (MRF). The corresponding detections of a neurofilament are then associated across successive frames to form a continuous track.
The paper is organized as follows. In Section 2, we describe how the axon's shape is modeled. In Section 3, the axon-constrained particle filtering algorithm is presented. In Section 4, we present the tracking-by-detection approach. In Section 5, experimental results from the two approaches are shown. In Section 6, we discuss and conclude the paper.
2. Modeling the Axon Path
Common to the two approaches discussed in this paper, we consider it necessary to model the axon path because the neurofilament movement is constrained by the path. Neural axons in fluorescence microscopy images appear as smooth curved paths. Smooth curves such as quadratic/cubic splines, Bézier curves, and nonuniform rational B-splines (NURBS) can be used to fit the axon shape. In this paper, we choose the cubic spline for axon shape description because it fits the axon with small error, yet has easy-to-compute derivatives. Each segment of the axon curve is described by $f_i(x) = a_i + b_i (x - x_i) + c_i (x - x_i)^2 + d_i (x - x_i)^3$, where $x$ is any pixel position and $a_i$, $b_i$, $c_i$, and $d_i$ represent the polynomial coefficients of the segment. Because the axon is fixed during the imaging process, a maximum intensity projection of the whole video sequence can be computed to delineate the path of the filament and suppress random imaging noise. The knot number of the spline is chosen by experiment to minimize the curve fitting error. Figure 1 shows models of the axon obtained by this method.
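As a sketch, the cubic spline fit and its derivative (used later for the tangent orientation) can be computed with SciPy's `CubicSpline`; the centerline points below are hypothetical, standing in for points traced on the maximum intensity projection:

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical axon centerline points (x strictly increasing),
# standing in for points traced on the maximum intensity projection.
xs = np.array([0.0, 20.0, 40.0, 60.0, 80.0])
ys = np.array([5.0, 8.0, 6.0, 9.0, 7.0])

spline = CubicSpline(xs, ys)   # piecewise cubic f(x) through the knots
dspline = spline.derivative()  # f'(x), easy to compute for a spline

x0 = 30.0
y0 = float(spline(x0))                 # centerline position at x0
theta = float(np.arctan(dspline(x0)))  # tangent orientation (Section 3)
```

The derivative is what makes the cubic spline attractive here: it gives the orientation constraint of Section 3 essentially for free.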
3. The Constrained Particle Filtering
In this section, we introduce the first tracking approach: the constrained particle filtering. The key idea is as follows: because a neurofilament always moves within the axon path, we can limit its position and orientation according to the shape of the path. Such a reduction in search space requires fewer samples to cover it in the particle filtering approach and therefore yields an efficient solution without compromising tracking accuracy. Consider a neurofilament under tracking and model its dynamic state at time $t$ as $\mathbf{s}_t = (x_t, y_t, \theta_t, \dot{x}_t, \dot{y}_t, \dot{\theta}_t)$, where $(x_t, y_t)$ is the position, $\theta_t$ is the orientation, and $\dot{x}_t$, $\dot{y}_t$, and $\dot{\theta}_t$ represent the corresponding velocities. The dynamic model can thus be expressed as $\mathbf{s}_t = A \mathbf{s}_{t-1} + \mathbf{v}_t$, where $A$ is the constant-velocity state transition matrix and $\mathbf{v}_t \sim \mathcal{N}(0, \Sigma)$, with $\mathcal{N}$ representing a normal distribution with $\sigma_x$, $\sigma_y$, and $\sigma_\theta$ being its deviations for the location and orientation components.
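A minimal sketch of particle propagation under a constant-velocity dynamic model with additive Gaussian noise; the deviation values below are illustrative, not the tuned parameters used in the experiments:

```python
import numpy as np

rng = np.random.default_rng(0)

# State s = (x, y, theta, vx, vy, vtheta); per-component noise
# deviations (sigma_x, sigma_y, sigma_theta, and velocity terms)
# are illustrative values only.
sigma = np.array([5.0, 5.0, 0.1, 1.0, 1.0, 0.02])

def propagate(s, dt=1.0):
    """One step of the constant-velocity dynamic model."""
    s_new = s.copy()
    s_new[:3] += dt * s[3:]          # position/orientation advance
    s_new += rng.normal(0.0, sigma)  # additive Gaussian process noise
    return s_new

particles = np.zeros((50, 6))  # 50 particles, all starting at the origin
particles = np.array([propagate(p) for p in particles])
```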
Considering that a particle's orientation is constrained by the axon path, we include the orientation constraint in the dynamic state model as in (2), where the orientation corresponds to the tangent direction of the curve at the point $(x_t, y_t)$: $\theta_t = \arctan f'(x_t)$.
In addition, the axon also limits the position of the neurofilament along the extent of the axon path. We model the axon as a narrow strip of width $2w$ along the cubic spline curve. The strip width $2w$ is determined by the maximum width of the axon in the image, which is 10 pixels given the magnification level and physical size of the samples in our dataset. A rejection sampling scheme is employed to generate samples that distribute according to the position constraint: if the distance $d$ of a particle from the axis is less than $w$, the particle is accepted; otherwise, it is rejected. Thus, we can devise a constrained particle filtering approach that simultaneously accounts for the orientation and position constraints.
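The rejection step can be sketched as follows; the linear centerline is a stand-in for the fitted cubic spline, and the vertical offset from it is used as a simple proxy for the perpendicular distance $d$:

```python
import numpy as np

rng = np.random.default_rng(1)

def axon_centerline(x):
    # Stand-in for the fitted cubic spline of Section 2.
    return 6.0 + 0.02 * x

w = 5.0  # half of the 10-pixel strip width

def sample_constrained(x0, y0, n=50, sigma=5.0):
    """Rejection sampling: keep only proposals inside the axon strip.

    The vertical offset from the centerline approximates the
    perpendicular distance d used in the position constraint.
    """
    accepted = []
    while len(accepted) < n:
        x = x0 + rng.normal(0.0, sigma)
        y = y0 + rng.normal(0.0, sigma)
        if abs(y - axon_centerline(x)) < w:
            accepted.append((x, y))
    return np.array(accepted)

samples = sample_constrained(30.0, 6.5)
```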
4. Tracking-by-Detection Approach
The tracking-by-detection approach works by first decomposing the axon into multiple equally sized blocks. The axon path is modeled by a cubic spline curve as discussed in Section 2. Rectangular blocks with fixed width and height are then generated along the axon, as shown in Figure 2. Similar to the strip width in the particle filtering approach, the block size is determined by the maximum width of the axon in the image, which is related to the magnification level and physical size of the axon.
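One way to sketch the block placement is to space block centers at equal arc-length intervals along a polyline sampling of the fitted curve; the near-straight axon and 10-pixel block size below are illustrative:

```python
import numpy as np

def block_centers(xs, ys, block_size):
    """Place block centers at equal arc-length steps along a polyline
    sampling of the axon curve."""
    pts = np.column_stack([xs, ys])
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    arc = np.concatenate([[0.0], np.cumsum(seg)])  # cumulative length
    targets = np.arange(0.0, arc[-1], block_size)
    cx = np.interp(targets, arc, xs)
    cy = np.interp(targets, arc, ys)
    return np.column_stack([cx, cy])

# Illustrative near-straight axon sampled densely; 10-pixel blocks.
xs = np.linspace(0.0, 100.0, 200)
ys = 6.0 + 0.02 * xs
centers = block_centers(xs, ys, block_size=10.0)
```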
After block decomposition of an axon, we perform neurofilament detection to find the blocks containing neurofilaments. We cast this as a labeling problem: label 1 is assigned to a neurofilament block and label 0 otherwise. Denote the label configuration of all blocks along the axon by $L = (l_1, \ldots, l_n)$. Given the image observation $Y$, the maximum a posteriori (MAP) optimal label configuration is inferred by $L^{*} = \arg\max_{L} P(L \mid Y)$, where $P(L \mid Y)$ denotes the posterior probability.
We consider that the labeling of a block depends on its local image observation and the labels of its immediate neighbors; that is, the probabilistic dependencies among the variables satisfy the Markov property, so the corresponding undirected probabilistic graph is a Markov random field (MRF). The posterior probability can then be written as $P(L \mid Y) = \frac{1}{Z} \exp(-E(L, Y))$, where $Z$ is the normalization factor called the partition function and $E(L, Y)$ is the energy function. The energy function comprises the clique potentials, that is, $E(L, Y) = \sum_{c} \psi_c(L_c)$, where $c$ is a clique within the MRF. For our case, only unary and pairwise potentials are considered: $E(L, Y) = \sum_{i} \phi(l_i, y_i) + \sum_{(i, j)} \psi(l_i, l_j)$. The local observation and the neighboring consistency are handled by the unary and pairwise potentials, respectively.
Considering each block independently, the unary potential is defined as the negative log-likelihood of a block label: $\phi(l_i, y_i) = -\log P(l_i \mid y_i)$. A logistic classifier determines the probability $P(l_i = 1 \mid y_i) = 1 / (1 + \exp(-\mathbf{w}^{\top} \mathbf{y}_i))$, where $\mathbf{w}$ is the parameter vector obtained by training. To train the classifier, positive and negative sample blocks are chosen from selected images with manual labeling.
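A minimal sketch of the unary costs from the logistic model; the weight and feature vectors are illustrative placeholders, not trained values:

```python
import numpy as np

def unary_potential(y_feat, w_vec):
    """Negative log-likelihood unary costs for labels 0 and 1 under a
    logistic model; w_vec is a hypothetical trained weight vector."""
    p1 = 1.0 / (1.0 + np.exp(-np.dot(w_vec, y_feat)))  # P(l = 1 | y)
    return {0: -np.log(1.0 - p1), 1: -np.log(p1)}

w_vec = np.array([0.8, -0.3, 0.1])   # illustrative weights
y_feat = np.array([2.0, 1.0, 0.5])   # e.g. block intensity features
costs = unary_potential(y_feat, w_vec)
```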
The pairwise potential tends to assign identical labels to neighboring blocks unless their image observations strongly support such a change. We define the prior depending on feature similarity as $\psi(l_i, l_j) = \lambda \, B(h_i, h_j) \, [l_i \neq l_j]$, where $h_i$ and $h_j$ are the intensity histograms obtained from the image observations $y_i$ and $y_j$, $B(h_i, h_j)$ is the Bhattacharyya coefficient between the two histograms, $\lambda$ is the parameter for controlling the smoothness, and $[l_i \neq l_j]$ equals 1 when the labels differ and 0 otherwise.
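The Bhattacharyya coefficient and one pairwise cost consistent with the behavior described (differing labels are penalized in proportion to histogram similarity) can be sketched as:

```python
import numpy as np

def bhattacharyya(h1, h2):
    """Bhattacharyya coefficient between two normalized histograms;
    equals 1 for identical histograms and 0 for disjoint ones."""
    return float(np.sum(np.sqrt(h1 * h2)))

def pairwise_potential(li, lj, h_i, h_j, lam=2.0):
    """Differing labels cost lam * B(h_i, h_j): similar observations
    make a label change expensive, dissimilar ones make it cheap."""
    return lam * bhattacharyya(h_i, h_j) if li != lj else 0.0

h_a = np.array([0.2, 0.5, 0.3])
h_b = np.array([0.2, 0.5, 0.3])
```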
The formulation above satisfies the submodularity requirement for the pairwise potentials, that is, $\psi(0, 0) + \psi(1, 1) \leq \psi(0, 1) + \psi(1, 0)$. Minimizing the energy as defined in (6) can therefore be solved efficiently by graph cut algorithms. Due to the space limit, we do not discuss the details of constructing the $s$-$t$ graph and solving for the minimum $s$-$t$ cut, and refer the reader to the literature.
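The paper minimizes the energy with graph cuts; since the blocks along a single axon form a one-dimensional chain, exact MAP labeling can also be obtained with simple dynamic programming, sketched below with illustrative unary costs and a pairwise cost of $\lambda B$ for differing labels:

```python
import numpy as np

def chain_map(unary, lam, B):
    """Exact MAP labels for a chain MRF via dynamic programming.

    unary: list of [cost_label0, cost_label1] per block.
    B[i]:  Bhattacharyya coefficient between blocks i and i+1; a label
           change across that edge costs lam * B[i].
    """
    n = len(unary)
    cost = np.array(unary[0], dtype=float)
    back = np.zeros((n, 2), dtype=int)
    for i in range(1, n):
        new_cost = np.empty(2)
        for l in (0, 1):
            cand = [cost[lp] + (lam * B[i - 1] if lp != l else 0.0)
                    for lp in (0, 1)]
            back[i, l] = int(np.argmin(cand))
            new_cost[l] = min(cand) + unary[i][l]
        cost = new_cost
    labels = [int(np.argmin(cost))]
    for i in range(n - 1, 0, -1):
        labels.append(int(back[i, labels[-1]]))
    return labels[::-1]

# Illustrative example: block 2 weakly prefers label 1 on its own.
unary = [[0.0, 5.0], [0.0, 5.0], [5.0, 0.0], [0.0, 5.0]]
no_smooth = chain_map(unary, lam=0.0, B=[1.0, 1.0, 1.0])
smooth = chain_map(unary, lam=20.0, B=[1.0, 1.0, 1.0])
```

With strong smoothing the isolated detection is suppressed; graph cuts extend this exact inference to neighborhoods that are not simple chains.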
5. Experimental Results
In this section, the performance of the two tracking methods, particle filtering and tracking-by-detection, is evaluated using videos of neurofilament movement.
All image sequences of neurofilament movement were acquired from neurons cultured and recorded by the Brown Lab at The Ohio State University. The raw output images from the microscope measured 512 × 512 pixels with a pixel size of 0.131 μm/pixel, and were cropped to remove irrelevant regions. In the constrained particle filtering approach, the distance $w$ was set to 5 pixels. The positional deviations ($\sigma_x$ and $\sigma_y$) are 25 pixels and the angular deviation ($\sigma_\theta$) is 0.5 rad. In the tracking-by-detection approach, the minimum $s$-$t$ cut algorithm is based on the implementation of the Boykov-Kolmogorov algorithm in graph-tool. The block size is 10 pixels. The smoothness prior factor $\lambda$ is 2. Both approaches are implemented in Python/NumPy. We used the MetaMorph software to manually track the neurofilament motion to obtain ground-truth velocity values. For the tracking-by-detection approach, the MRF model must be trained by manual labeling of training data: we selected 20 images for training and used all the remaining images for testing. The labeling was done using the MetaMorph software.
Figure 3(a) shows the tracking results of generic particle filtering using 100 particles on Movie 1; large tracking errors are visible. Figures 3(b) and 3(c) show the spatially constrained particle filtering with only 50 particles on Movie 1 and Movie 2, which yields good tracking results. Figure 4 shows the corresponding neurofilament velocities estimated by the constrained particle filtering approach with 50 particles, together with the actual neurofilament velocities in Movie 1. Thus, generic particle filtering produces many tracking errors, whereas the constrained particle filtering achieves more accurate tracking with fewer particles.
Figure 5 shows the image frames and tracking results; only the neurofilament in the middle is moving, and for clarity we plot the results for this one only. The leading and trailing ends are marked by crosshairs in the figure. The qualitative tracking results on both sequences are quite accurate.
Figure 6 compares the neurofilament velocities obtained by manual labeling and by our method for Movie 1. The velocities calculated by our method are very close to the manually labeled results. Moreover, comparing the tracking results of the two methods on Movie 1 in Figures 4 and 6, we find that tracking-by-detection is more accurate than the particle filtering approach.
The runtime performance is important for the throughput of neurology analysis. For our current prototype implementation in Python running on a laptop with a 2.6 GHz i7 processor and 8 GB RAM, the average durations for processing one frame are around 3 seconds and 0.2 seconds for the particle filtering and tracking-by-detection approaches, respectively. The latter is more computationally efficient. We expect that a reimplementation using a low-level programming language and GPGPU techniques can dramatically improve the runtime performance of the tracking-by-detection approach and make it practically useful for high-throughput neurofilament movement analysis.
6. Conclusion
In this paper, we propose two tracking approaches for neurofilament movement analysis based on particle filtering and tracking-by-detection. For the constrained particle filtering approach, we incorporate the orientation and position constraints into the generic particle filtering algorithm, which largely improves both tracking accuracy and efficiency. For the tracking-by-detection approach, we model the detection problem as a Markov random field and use the graph cut algorithm to solve it efficiently. Experimental results on fluorescence microscopy images of neurofilament movement demonstrate that both approaches achieve satisfactory performance, with the latter having better tracking accuracy.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
The authors thank Professor Anthony Brown at The Ohio State University for providing the fluorescence microscopy data used in this paper. Liang Yuan's work is supported by grants from the National Science Foundation of China (Grant no. 61262059) and the Scientific Research Foundation for Returned Scholars, Ministry of Education of China. Junda Zhu's work is partially supported by a Start-Up Grant and a Multi-Year Research Grant (MYRG2014-00072-FST) from the Research Council, University of Macau.
References
- R. Perrot, R. Berges, A. Bocquet, and J. Eyer, “Review of the multiple aspects of neurofilament functions, and their possible contribution to neurodegeneration,” Molecular Neurobiology, vol. 38, no. 1, pp. 27–65, 2008.
- A. Brown, “Slow axonal transport: Stop and go traffic in the axon,” Nature Reviews Molecular Cell Biology, vol. 1, no. 2, pp. 153–156, 2000.
- L. Wang, C. Ho, D. Sun, R. K. H. Liem, and A. Brown, “Rapid movement of axonal neurofilaments interrupted by prolonged pauses,” Nature Cell Biology, vol. 2, no. 3, pp. 137–141, 2000.
- A. Brown, L. Wang, and P. Jung, “Stochastic simulation of neurofilament transport in axons: the “stop-and-go” hypothesis,” Molecular Biology of the Cell, vol. 16, no. 9, pp. 4243–4255, 2005.
- N. Trivedi, P. Jung, and A. Brown, “Neurofilaments switch between distinct mobile and stationary states during their transport along axons,” Journal of Neuroscience, vol. 27, no. 3, pp. 507–516, 2007.
- P. Jung and A. Brown, “Modeling the slowing of neurofilament transport along the mouse sciatic nerve,” Physical Biology, vol. 6, no. 4, Article ID 046002, 2009.
- A. Brown, “Live-cell imaging of slow axonal transport in cultured neurons,” Methods in Cell Biology, vol. 71, pp. 305–323, 2003.
- K. Jaqaman, D. Loerke, M. Mettlen et al., “Robust single-particle tracking in live-cell time-lapse sequences,” Nature Methods, vol. 5, no. 8, pp. 695–702, 2008.
- K. Jaqaman and G. Danuser, “Computational image analysis of cellular dynamics: a case study based on particle tracking,” Cold Spring Harbor Protocols, 2009.
- G. Yang, A. Matov, and G. Danuser, “Reliable tracking of large scale dense antiparallel particle motion for fluorescence live cell imaging,” in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPR Workshops '05), p. 138, San Diego, Calif, USA, June 2005.
- M. K. Cheezum, W. F. Walker, and W. H. Guilford, “Quantitative comparison of algorithms for tracking single fluorescent particles,” Biophysical Journal, vol. 81, no. 4, pp. 2378–2388, 2001.
- B. C. Carter, G. T. Shubeita, and S. P. Gross, “Tracking single particles: a user-friendly quantitative evaluation,” Physical Biology, vol. 2, no. 1, pp. 60–72, 2005.
- I. Smal, K. Draegestein, N. Galjart, W. Niessen, and E. Meijering, “Particle filtering for multiple object tracking in dynamic fluorescence microscopy images: application to microtubule growth analysis,” IEEE Transactions on Medical Imaging, vol. 27, no. 6, pp. 789–804, 2008.
- L. Yuan, Y. F. Zheng, J. Zhu, L. Wang, and A. Brown, “Object tracking with particle filtering in fluorescence microscopy images: application to the motion of neurofilaments in axons,” IEEE Transactions on Medical Imaging, vol. 31, no. 1, pp. 117–130, 2012.
- R. Burden and J. Faires, Numerical Analysis, Brooks/Cole, Pacific Grove, Calif, USA, 1997.
- L. Yuan and J. Zhu, “Neurofilament tracking by detection in fluorescence microscopy images,” in Proceedings of the International Conference on Image Processing, pp. 3123–3127, Melbourne, Australia, 2013.
- V. Kolmogorov and R. Zabih, “What energy functions can be minimized via graph cuts?” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 26, no. 2, pp. 147–159, 2004.
- Y. Boykov and V. Kolmogorov, “An experimental comparison of min-cut/max-flow algorithms for energy minimization in vision,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 26, no. 9, pp. 1124–1137, 2004.
Copyright © 2014 Liang Yuan and Junda Zhu. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.